Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and conventional frequentist analyses of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to draw random samples from the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but allowed a decision to be made with fewer pulses at relatively high radiation levels. In addition, when the source was present only briefly (less than the count time), time-interval information was more sensitive for detecting a change than count information, since the source counts are averaged with the background counts over the entire count time. The relationships among source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
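The core of the interval-based approach is that, for a Poisson source, inter-pulse times are exponentially distributed, so each pulse can update the odds of "background only" versus "background plus source". The abstract notes the original algorithms were written in R; below is a minimal Python sketch of the idea only. The two-hypothesis posterior, the rates, and the seed are illustrative assumptions, not the authors' method.

```python
import math
import random

def interval_loglik(intervals, rate):
    # Poisson arrivals imply exponentially distributed inter-pulse intervals
    return sum(math.log(rate) - rate * t for t in intervals)

def posterior_source(intervals, bkg_rate, src_rate, prior=0.5):
    # posterior probability that the pulses came from the elevated rate
    ls = interval_loglik(intervals, src_rate)
    lb = interval_loglik(intervals, bkg_rate)
    m = max(ls, lb)                      # normalize in log space for stability
    num = prior * math.exp(ls - m)
    den = num + (1.0 - prior) * math.exp(lb - m)
    return num / den

# simulate pulses at an elevated rate and compute the posterior
rng = random.Random(1)
bkg_rate, src_rate = 1.0, 5.0            # counts per second, illustrative
pulses = [rng.expovariate(src_rate) for _ in range(20)]
p = posterior_source(pulses, bkg_rate, src_rate)
```

With only 20 pulses the posterior already favors the source strongly, which mirrors the abstract's point that interval data support a decision with fewer pulses at higher rates.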
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Robust and accurate estimates of rainfall frequencies are difficult to make from short, arid-climate rainfall records; here, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. The analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to sea and elevation. Regional statistical algorithms found that the sites passed discordancy tests of the coefficient of variation, skewness, and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated 500 times in the regional analysis method, and goodness-of-fit tests then identified the best candidate distribution as the general extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate the location, shape, and scale parameters. In the global analysis, the distribution was a priori prescribed as GEV Type II, the shape parameter was a priori set to 0.15, and a time interval term was constructed so that one set of parameters could be used for all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE had a parabolic-shaped time interval trend. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time interval trend was found for the global method. The global method relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
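The regional step fits a GEV distribution by the method of L-moments. As a hedged illustration (not the authors' code), the sketch below computes sample L-moments from probability-weighted moments and applies Hosking's standard approximation for the GEV parameters; the synthetic Gumbel data (a GEV with shape near zero) and all parameter values are hypothetical.

```python
import math
import random

def sample_lmoments(x):
    # first three sample L-moments via probability-weighted moments
    x = sorted(x)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2               # mean, L-scale, L-skewness

def gev_from_lmoments(l1, l2, t3):
    # Hosking's approximation relating L-moments to GEV parameters
    c = 2.0 / (3.0 + t3) - math.log(2.0) / math.log(3.0)
    k = 7.8590 * c + 2.9554 * c * c                    # shape
    g = math.gamma(1.0 + k)
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * g)         # scale
    xi = l1 - alpha * (1.0 - g) / k                    # location
    return xi, alpha, k

# demo on synthetic Gumbel data: location 10, scale 2, shape ~ 0
rng = random.Random(0)
xs = [10.0 - 2.0 * math.log(-math.log(rng.random())) for _ in range(2000)]
xi, alpha, k = gev_from_lmoments(*sample_lmoments(xs))
```

The fitted location, scale, and shape should recover the generating values to within sampling error, which is the property the regional procedure relies on.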
Estimating short-run and long-run interaction mechanisms in interictal state.
Ozkaya, Ata; Korürek, Mehmet
2010-04-01
We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model, and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study, we first employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; second, for intervals deemed non-stationary, we suggest Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. Finally, we address questions of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both methods are well known in econometrics. We find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
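Granger causality asks whether past values of one channel improve prediction of another beyond its own past. A minimal Python sketch, not the study's EEG pipeline: lag-1 least-squares models, with the fractional drop in residual variance when the other channel's lag is added serving as a crude causality score. The simulated series and coefficients are illustrative assumptions.

```python
import random

def ols(X, y):
    # least squares via normal equations and Gaussian elimination
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for p in range(k):
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for j in range(p, k):
                A[q][j] -= f * A[p][j]
            c[q] -= f * c[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):
        beta[p] = (c[p] - sum(A[p][j] * beta[j] for j in range(p + 1, k))) / A[p][p]
    return beta

def rss(X, y, beta):
    return sum((yi - sum(b * xj for b, xj in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def granger_gain(x, y):
    # fractional drop in residual variance of y when one lag of x is added
    Xr = [[1.0, y[t - 1]] for t in range(1, len(y))]
    Xu = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    yt = y[1:]
    r_restricted = rss(Xr, yt, ols(Xr, yt))
    r_full = rss(Xu, yt, ols(Xu, yt))
    return (r_restricted - r_full) / r_restricted

# x drives y with one lag; the reverse direction carries no information
rng = random.Random(3)
x = [rng.gauss(0.0, 1.0) for _ in range(500)]
y = [0.0]
for t in range(1, 500):
    y.append(0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0.0, 0.5))
```

The gain is large for the true direction x to y and near zero in reverse, which is the asymmetry Granger analysis exploits.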
The method of trend analysis of parameters time series of gas-turbine engine state
NASA Astrophysics Data System (ADS)
Hvozdeva, I.; Myrhorod, V.; Derenh, Y.
2017-10-01
This research substantiates an approach to interval estimation of the trend component of a time series. Well-known methods of spectral and trend analysis are applied to multidimensional data arrays. Interval estimation of the trend component is proposed for time series whose autocorrelation matrix possesses a prevailing eigenvalue, and the properties of the time series autocorrelation matrix are identified.
Measuring Land Change in Coastal Zone around a Rapidly Urbanized Bay.
Huang, Faming; Huang, Boqiang; Huang, Jinliang; Li, Shenghui
2018-05-23
Urban development is a major cause of eco-degradation in many coastal regions. Understanding urbanization dynamics and the underlying driving factors is crucial for urban planning and management. Land-use dynamic degree indices and intensity analysis were used to measure land changes that occurred in 1990, 2002, 2009, and 2017 in the coastal zone around Quanzhou bay, a rapidly urbanized bay in Southeast China. The comprehensive land-use dynamic degree and interval-level intensity analysis both revealed that land change was accelerating across the three time intervals in a three-kilometer-wide zone along the coastline (zone A), while land change was fastest during the second time interval, 2002-2009, in a separate terrestrial area within the coastal zone (zone B). Driven by urbanization, built-up gains and cropland losses were active for all time intervals in both zones. Mudflat losses were active except in the first time interval in zone A due to intensive sea reclamation. The gain of mangrove was active while the loss of mangrove was dormant for all three intervals in zone A. Transition-level analysis further revealed the similarities and differences in the processes within the patterns of land change in both zones. The transition from cropland to built-up was systematically targeted and stationary, while the transition from woodland to built-up was systematically avoided in both zones. Built-up tended to target aquaculture during the second and third time intervals in zone A but to avoid aquaculture during all intervals in zone B. Land change in zone A was more significant than that in zone B during the second and third time intervals at all three levels of intensity. The application of intensity analysis can enhance our understanding of the patterns and processes of land change and inform suitable land development plans in the Quanzhou bay area.
This type of investigation is useful to provide information for developing sound land use policy to achieve urban sustainability in similar coastal areas.
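Interval-level intensity analysis compares each interval's annualized change rate against the uniform rate over the whole study period; an interval is "fast" when it exceeds the uniform line. A small sketch with hypothetical areas and durations (the paper's actual figures are not reproduced here):

```python
def interval_intensities(changes, durations, extent):
    # annualized percent of the study area changing in each time interval,
    # compared with the uniform rate over the whole period
    st = [100.0 * c / (d * extent) for c, d in zip(changes, durations)]
    uniform = 100.0 * sum(changes) / (sum(durations) * extent)
    return st, uniform

# hypothetical areas (km^2) changed in 1990-2002, 2002-2009, 2009-2017
st, uniform = interval_intensities([10.0, 30.0, 20.0], [12.0, 7.0, 8.0], 100.0)
fast = [i for i, s in enumerate(st) if s > uniform]   # faster-than-uniform intervals
```

Dividing by the interval duration is what lets intervals of unequal length (12, 7, and 8 years here) be compared on one scale.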
Ratio-based lengths of intervals to improve fuzzy time series forecasting.
Huarng, Kunhuang; Yu, Tiffany Hui-Kuang
2006-04-01
The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.
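The idea of ratio-based lengths is that interval width should scale with the magnitude of the data rather than being constant. A minimal sketch assuming a single fixed ratio (the paper tunes the ratio, e.g. via percentile analysis of observed growth, which this sketch omits):

```python
def ratio_based_bounds(lo, hi, ratio=0.1):
    # each interval spans a fixed fraction of its lower bound, so interval
    # length grows geometrically with the magnitude of the observations
    assert lo > 0 and ratio > 0
    bounds = [lo]
    while bounds[-1] < hi:
        bounds.append(bounds[-1] * (1.0 + ratio))
    return bounds

# cover an observation range of 1000 to 2000 with 10% intervals
bounds = ratio_based_bounds(1000.0, 2000.0, ratio=0.1)
```

With equal-length intervals a span near 2000 would be fuzzified as finely as one near 1000; here the widths at the top of the range are roughly twice those at the bottom.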
Goff, M L; Win, B H
1997-11-01
The postmortem interval for a set of human remains discovered inside a metal tool box was estimated using the development time required for a stratiomyid fly (Diptera: Stratiomyidae), Hermetia illucens, in combination with the time required to establish a colony of the ant Anoplolepis longipes (Hymenoptera: Formicidae) capable of producing alate (winged) reproductives. This analysis resulted in a postmortem interval estimate of 14+ months, with a period of 14-18 months being the most probable time interval. The victim had been missing for approximately 18 months.
Sun, Jianguo; Feng, Yanqin; Zhao, Hui
2015-01-01
Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
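The defining feature of interval-censored data is that each failure time is known only to lie in an interval (L, R], so the likelihood is built from survival-function differences rather than densities. As a toy illustration only: the paper's proposals are semiparametric and avoid the baseline hazard, whereas this sketch assumes a one-parameter exponential model and fits it by golden-section search.

```python
import math
import random

def neg_loglik(rate, data):
    # each observation contributes P(L < T <= R) = S(L) - S(R);
    # R = None marks a right-censored observation
    ll = 0.0
    for lo, hi in data:
        s_lo = math.exp(-rate * lo)
        s_hi = math.exp(-rate * hi) if hi is not None else 0.0
        ll += math.log(s_lo - s_hi)
    return -ll

def fit_rate(data, lo=1e-6, hi=50.0, iters=100):
    # golden-section search for the one-parameter MLE
    g = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if neg_loglik(c, data) < neg_loglik(d, data):
            b = d
        else:
            a = c
    return 0.5 * (a + b)

# failure times from Exp(rate=2), observed only up to a 0.2-wide grid
rng = random.Random(5)
data = []
for _ in range(2000):
    t = rng.expovariate(2.0)
    k = int(t / 0.2)
    data.append((0.2 * k, 0.2 * (k + 1)))
rate_hat = fit_rate(data)
```

Even though no exact failure time is ever observed, the interval likelihood recovers the generating rate; the semiparametric procedures in the paper generalize this idea beyond a parametric survival function.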
ERIC Educational Resources Information Center
Alpermann, Anke; Huber, Walter; Natke, Ulrich; Willmes, Klaus
2010-01-01
Improved fluency after stuttering therapy is usually measured by the percentage of stuttered syllables. However, outcome studies rarely evaluate the use of trained speech patterns that speakers use to manage stuttering. This study investigated whether the modified time interval analysis can distinguish between trained speech patterns, fluent…
Interval sampling methods and measurement error: a computer simulation.
Wirth, Oliver; Slaven, James; Taylor, Matthew A
2014-01-01
A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
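The three sampling rules differ only in how an interval is scored. A sketch under simplifying assumptions (non-overlapping behavior episodes, intervals that tile the observation period exactly); it reproduces the well-known biases that partial-interval recording overestimates and whole-interval recording underestimates true duration.

```python
def occupied(events, t):
    return any(s <= t < e for s, e in events)

def score(method, events, period, width):
    # fraction of observation intervals scored as containing the behavior
    n = int(period / width)
    hits = 0
    for i in range(n):
        a, b = i * width, (i + 1) * width
        if method == "momentary":      # sample only at the interval's end
            hits += occupied(events, b)
        elif method == "partial":      # any occurrence within the interval
            hits += any(s < b and e > a for s, e in events)
        elif method == "whole":        # behavior must span the whole interval
            hits += any(s <= a and e >= b for s, e in events)
    return hits / n

# behavior episodes as (start, end) seconds: 105 s of behavior in 600 s
events = [(5.0, 20.0), (100.0, 130.0), (300.0, 360.0)]
true_prop = sum(e - s for s, e in events) / 600.0
pir = score("partial", events, 600.0, 10.0)
wir = score("whole", events, 600.0, 10.0)
mts = score("momentary", events, 600.0, 10.0)
```

Sweeping interval width, event duration, and period over many random event layouts, as the simulation study did, turns these single-case biases into error tables.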
Wang, Peijie; Zhao, Hui; Sun, Jianguo
2016-12-01
Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures for them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for data arising from the proportional hazards frailty model, and a two-step procedure is presented for estimation. In addition, the asymptotic properties of the proposed estimators of the regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.
Statistical physics approaches to financial fluctuations
NASA Astrophysics Data System (ADS)
Wang, Fengzhong
2009-12-01
Complex systems attract many researchers from various scientific fields. Financial markets are one of these widely studied complex systems. Statistical physics, which was originally developed to study large systems, provides novel ideas and powerful methods to analyze financial markets. The study of financial fluctuations characterizes market behavior and helps to better understand the underlying market mechanism. Our study focuses on volatility, a fundamental quantity characterizing financial fluctuations. We examine equity data for the entire U.S. stock market during 2001 and 2002. To analyze the volatility time series, we develop a new approach, called return interval analysis, which examines the time intervals between two successive volatilities exceeding a given threshold value. We find that the return interval distribution displays scaling over a wide range of thresholds. This scaling is valid for a range of time windows, from one minute up to one day. Moreover, our results are similar for commodities, interest rates, currencies, and for stocks of different countries. Further analysis shows some systematic deviations from a scaling law, which we can attribute to nonlinear correlations in the volatility time series. We also find a memory effect in return intervals for different time scales, which is related to the long-term correlations in the volatility. To further characterize the mechanism of price movement, we simulate the volatility time series using two different models, fractionally integrated generalized autoregressive conditional heteroscedasticity (FIGARCH) and fractional Brownian motion (fBm), and test these models with return interval analysis. We find that both models can mimic time memory but only fBm shows scaling in the return interval distribution. In addition, we examine the volatility of daily opening to closing and of closing to opening. We find that each volatility distribution has a power law tail.
Using the detrended fluctuation analysis (DFA) method, we show long-term auto-correlations in these volatility time series. We also analyze return, the actual price changes of stocks, and find that the returns over the two sessions are often anti-correlated.
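Return interval analysis reduces to extracting the waiting times between threshold exceedances; the scaling result means the interval distribution collapses across thresholds once intervals are divided by their mean. A minimal sketch with made-up numbers:

```python
def return_intervals(volatility, q):
    # waiting times between successive volatilities exceeding threshold q
    hits = [i for i, v in enumerate(volatility) if v > q]
    return [b - a for a, b in zip(hits, hits[1:])]

# toy volatility series; exceedances of q = 2.0 occur at indices 1, 4, 5, 8
series = [0.1, 2.5, 0.3, 0.2, 3.1, 2.9, 0.4, 0.1, 2.2]
tau = return_intervals(series, 2.0)
```

For an uncorrelated series the mean interval is simply the reciprocal of the exceedance probability; deviations from that, and from an exponential interval distribution, are what reveal the volatility correlations discussed above.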
Monitoring molecular interactions using photon arrival-time interval distribution analysis
Laurence, Ted A. (Livermore, CA); Weiss, Shimon (Los Angeles, CA)
2009-10-06
A method for analyzing/monitoring the properties of species that are labeled with fluorophores. A detector is used to detect photons emitted from species that are labeled with one or more fluorophores and located in a confocal detection volume. The arrival time of each of the photons is determined. The interval of time between various photon pairs is then determined to provide photon pair intervals. The number of photons that have arrival times within the photon pair intervals is also determined. The photon pair intervals are then used in combination with the corresponding counts of intervening photons to analyze properties and interactions of the molecules including brightness, concentration, coincidence and transit time. The method can be used for analyzing single photon streams and multiple photon streams.
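The quantity described, the interval between a photon and the m-th photon after it (so that m-1 photons intervene), is straightforward to extract from an arrival-time list. A sketch with hypothetical arrival times:

```python
def photon_pair_intervals(arrival_times, m=1):
    # interval between each photon and the m-th photon after it,
    # i.e. with m - 1 intervening photons
    return [arrival_times[i + m] - arrival_times[i]
            for i in range(len(arrival_times) - m)]

# hypothetical detector arrival times (arbitrary units)
times = [0.0, 1.0, 3.0, 6.0, 10.0]
adjacent = photon_pair_intervals(times, m=1)
skip_one = photon_pair_intervals(times, m=2)
```

Histogramming these intervals for each m, and jointly with the count of intervening photons, yields the distributions from which brightness, concentration, coincidence, and transit time are inferred.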
Analysis of single ion channel data incorporating time-interval omission and sampling
The, Yu-Kai; Timmer, Jens
2005-01-01
Hidden Markov models are widely used to describe single-channel currents from patch-clamp experiments. The inevitable anti-aliasing filter limits the time resolution of the measurements, and therefore the standard hidden Markov model is no longer adequate. The notion of time-interval omission has been introduced, whereby brief events are not detected. The exact solutions developed for this problem do not take into account that the measured intervals are limited by the sampling time. In this case, the dead-time that specifies the minimal detectable interval length is not defined unambiguously. We show that a wrong choice of the dead-time leads to considerably biased estimates and present the appropriate equations to describe sampled data. PMID:16849220
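The dead-time idea can be pictured by post-processing an ideal dwell-time sequence: events shorter than the dead-time tau go undetected, their duration is absorbed into the apparent dwell, and same-state neighbors then merge. This is a simplified illustration of the omission model, not the paper's exact equations.

```python
def omit_brief_events(dwells, tau):
    # dwells: list of (state, duration) pairs in temporal order; events
    # shorter than the dead-time tau are missed, their duration absorbed
    # into the previous apparent dwell, and same-state neighbors merged
    out = []
    for state, dur in dwells:
        if out and dur < tau:
            out[-1] = (out[-1][0], out[-1][1] + dur)
        elif out and out[-1][0] == state:
            out[-1] = (state, out[-1][1] + dur)
        else:
            out.append((state, dur))
    return out

# a 0.2 ms closing is missed, so the two openings appear as one long dwell
observed = omit_brief_events(
    [("open", 5.0), ("closed", 0.2), ("open", 3.0), ("closed", 4.0)], tau=0.5)
```

The example shows why omission biases estimates: two openings and a brief closure collapse into a single apparent 8.2 ms opening, so naive dwell-time fits overestimate mean open times.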
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random-interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with respect to the interval inputs. The comprehensive analysis framework combines the strengths of both methods such that the computational cost is dramatically reduced. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under concrete intrinsic creep effects with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples with a progressive relationship in terms of both structure type and uncertainty variables are presented to demonstrate the applicability, accuracy, and efficiency of the proposed method.
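One way to picture the hybrid scheme: sample the probabilistic variables to obtain the mean and standard deviation of the natural frequency, then sweep the interval-valued variables for the extreme bounds of those statistics. The single-degree-of-freedom oscillator, all parameter values, and the brute-force grid sweep below are illustrative assumptions; the paper uses a Chebyshev surrogate and an embedded optimizer instead of a grid.

```python
import math
import random
import statistics

def freq_stats(k_mean, k_std, mass, n=5000, seed=7):
    # random-perturbation-style statistics of the natural frequency
    # w = sqrt(k / m) of a 1-DOF oscillator with random stiffness k
    rng = random.Random(seed)
    ws = [math.sqrt(max(rng.gauss(k_mean, k_std), 1e-9) / mass)
          for _ in range(n)]
    return statistics.fmean(ws), statistics.stdev(ws)

def bounds_of_mean(k_mean, k_std, m_lo, m_hi, grid=20):
    # sweep the interval-valued mass to bound the mean frequency
    means = [freq_stats(k_mean, k_std, m_lo + (m_hi - m_lo) * i / grid)[0]
             for i in range(grid + 1)]
    return min(means), max(means)

# stiffness ~ N(100, 5^2), mass anywhere in the interval [1, 4]
lo, hi = bounds_of_mean(100.0, 5.0, 1.0, 4.0)
```

For this monotone toy problem the bounds land at the interval endpoints (mean frequency near 5 and 10); the surrogate-plus-optimizer machinery in the paper exists precisely because real structures are not this cheap to evaluate.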
Quantitative analysis of ground penetrating radar data in the Mu Us Sandland
NASA Astrophysics Data System (ADS)
Fu, Tianyang; Tan, Lihua; Wu, Yongqiu; Wen, Yanglei; Li, Dawei; Duan, Jinlong
2018-06-01
Ground penetrating radar (GPR), which can reveal the sedimentary structure and development process of dunes, is widely used to evaluate aeolian landforms. Interpretations of GPR profiles are mostly based on qualitative descriptions of the geometric features of the radar reflections. This research quantitatively analyzed the waveform parameter characteristics of different radar units by extracting the amplitude and time interval parameters of GPR data from the Mu Us Sandland in China, and then identified and interpreted different sedimentary structures. The results showed that different types of radar units had specific waveform parameter characteristics. The main waveform parameter characteristics of sand dune radar facies and sandstone radar facies included low amplitudes and wide ranges of time intervals, ranging from 0 to 0.25 and 4 to 33 ns respectively, with mean amplitudes that changed gradually with time interval. The amplitude distribution curves of the various sand dune radar facies were similarly unimodal. The radar surfaces showed high amplitudes with time intervals concentrated in high-value areas, ranging from 0.08 to 0.61 and 9 to 34 ns respectively, with mean amplitudes that changed drastically with time interval. The amplitude and time interval values of lacustrine radar facies were between those of sand dune radar facies and radar surfaces, ranging from 0.08 to 0.29 and 11 to 30 ns respectively, and the mean amplitude versus time interval curve was approximately trapezoidal. The quantitative extraction and analysis of GPR reflections can help distinguish various radar units and provide evidence for identifying sedimentary structures in aeolian landforms.
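Amplitude and time-interval parameters of the kind described can be pulled from a trace by locating local maxima of the absolute amplitude and measuring peak-to-peak spacing. A sketch on a synthetic sinusoidal trace; the authors' actual extraction procedure is not specified in the abstract, so this is a generic stand-in.

```python
import math

def waveform_parameters(trace, dt):
    # normalized peak amplitudes (local maxima of |amplitude|) and the
    # time intervals between successive peaks
    peaks = [i for i in range(1, len(trace) - 1)
             if abs(trace[i]) >= abs(trace[i - 1])
             and abs(trace[i]) > abs(trace[i + 1])]
    a_max = max(abs(trace[i]) for i in peaks)
    amplitudes = [abs(trace[i]) / a_max for i in peaks]
    gaps = [dt * (b - a) for a, b in zip(peaks, peaks[1:])]
    return amplitudes, gaps

# synthetic trace: a pure 20-sample-period oscillation, dt = 1 ns
trace = [math.sin(2.0 * math.pi * i / 20.0) for i in range(60)]
amps, gaps = waveform_parameters(trace, dt=1.0)
```

Histogramming (amplitude, time interval) pairs per radar unit is then what allows the unimodal, trapezoidal, and high-value-concentrated signatures reported above to be compared quantitatively.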
Kim, Tae Kyung; Kim, Hyung Wook; Kim, Su Jin; Ha, Jong Kun; Jang, Hyung Ha; Hong, Young Mi; Park, Su Bum; Choi, Cheol Woong; Kang, Dae Hwan
2014-01-01
Background/Aims: The quality of bowel preparation (QBP) is an important factor in performing a successful colonoscopy. Several factors influencing QBP have been reported; however, some, such as the optimal preparation-to-colonoscopy time interval, remain controversial. This study aimed to determine the factors influencing QBP and the optimal time interval for full-dose polyethylene glycol (PEG) preparation. Methods: A total of 165 patients who underwent colonoscopy from June 2012 to August 2012 were prospectively evaluated. The QBP was assessed using the Ottawa Bowel Preparation Scale (Ottawa) score, and several factors influencing the QBP were analyzed. Results: Colonoscopies with a time interval of 5 to 6 hours had the best Ottawa scores in all parts of the colon. Patients with time intervals of 6 hours or less had better QBP than those with time intervals of more than 6 hours (p=0.046). In the multivariate analysis, the time interval (odds ratio, 1.897; 95% confidence interval, 1.006 to 3.577; p=0.048) was the only significant contributor to a satisfactory bowel preparation. Conclusions: The optimal time interval was 5 to 6 hours for the full-dose PEG method, and the time interval was the only significant contributor to a satisfactory bowel preparation. PMID:25368750
Panek, Petr; Prochazka, Ivan
2007-09-01
This article deals with a time interval measurement device based on a surface acoustic wave (SAW) filter as a time interpolator. The operating principle rests on the fact that a transversal SAW filter excited by a short pulse can generate a finite signal with highly suppressed spectra outside a narrow frequency band. If the responses to two excitations are sampled at clock ticks, they can be precisely reconstructed from a finite number of samples and then compared so as to determine the time interval between the two excitations. We have designed and constructed a two-channel time interval measurement device that allows independent timing of two events and evaluation of the time interval between them. The device was constructed using commercially available components, and the experimental results proved the concept. We have assessed a single-shot time interval measurement precision of 1.3 ps rms, which corresponds to a time-of-arrival precision of 0.9 ps rms in each channel. The temperature drift of the measured time interval is lower than 0.5 ps/K, and the long-term stability is better than +/-0.2 ps/h. These are, to our knowledge, the best values reported for a time interval measurement device. The results are in good agreement with the error budget based on the theoretical analysis.
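The underlying principle, reconstructing a band-limited response from its samples so that an event can be timed more finely than the clock period, can be illustrated with a generic sub-sample delay estimator: cross-correlate the two sampled responses and refine the peak with a parabolic fit. This is a stand-in technique for illustration, not the SAW device's actual reconstruction.

```python
import math

def subsample_delay(a, b):
    # circular cross-correlation peak, refined by a parabolic fit through
    # the three points around the maximum, gives the delay of b relative
    # to a in fractional samples
    n = len(a)
    cc = [sum(a[i] * b[(i + lag) % n] for i in range(n)) for lag in range(n)]
    k = max(range(n), key=cc.__getitem__)
    y0, y1, y2 = cc[(k - 1) % n], cc[k], cc[(k + 1) % n]
    denom = y0 - 2.0 * y1 + y2
    frac = 0.5 * (y0 - y2) / denom if denom else 0.0
    return (k + frac) % n

def pulse(center):
    # smooth band-limited-like pulse sampled on a 64-point grid
    return [math.exp(-((i - center) / 4.0) ** 2) for i in range(64)]

# two identical pulses, the second delayed by 3.3 samples
delay = subsample_delay(pulse(20.0), pulse(23.3))
```

Resolving a delay of 3.3 samples to a small fraction of a sample from coarse samples is the same game the SAW interpolator plays at picosecond scale.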
NASA Technical Reports Server (NTRS)
Bergman, S. A., Jr.; Johnson, R. L.; Hoffler, G. W.
1977-01-01
Devices and techniques for measuring and analyzing systolic time intervals and quantitative phonocardiograms were initiated during Apollo 17. The data show that the systolic time intervals of the Apollo 17 crewmen remained elevated longer postflight than did heart rate, blood pressure, and percent change in leg volume, all of which had returned to preflight levels by the second day postflight. Although the systolic time interval values were only slightly outside the preflight fiducial limits, this finding suggested that the analysis of systolic time intervals may help to identify the mechanisms of postflight orthostatic intolerance, by virtue of measuring ventricular function more directly, and that this noninvasive technique may prove useful in determining the extent and duration of cardiovascular instability after long-duration space flight. The systolic time intervals obtained from the Apollo 17 crewmen during lower body negative pressure were similar to those noted in patients with significant heart disease.
Stochastic simulation and analysis of biomolecular reaction networks
Frazier, John M; Chushak, Yaroslav; Foy, Brent
2009-01-01
Background: In recent years, several stochastic simulation algorithms have been developed to generate Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks. However, the effects of various stochastic simulation and data analysis conditions on the observed dynamics of complex biomolecular reaction networks have not received much attention. To investigate these issues, we employed a software package developed in our group, called Biomolecular Network Simulator (BNS), to simulate and analyze the behavior of such systems. The behavior of a hypothetical two-gene in vitro transcription-translation reaction network is investigated using the Gillespie exact stochastic algorithm to illustrate some of the factors that influence the analysis and interpretation of these data. Results: Specific issues affecting the analysis and interpretation of simulation data are investigated, including: (1) the effect of time interval on data presentation and time-weighted averaging of molecule numbers, (2) the effect of the time-averaging interval on reaction rate analysis, (3) the effect of the number of simulations on the precision of model predictions, and (4) the implications of stochastic simulations for optimization procedures. Conclusion: The two main factors affecting the analysis of stochastic simulations are (1) the selection of time intervals to compute or average state variables and (2) the number of simulations generated to evaluate the system behavior. PMID:19534796
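Issue (1), time-weighted averaging, matters because Gillespie trajectories change state at irregular event times, so a plain average over events is biased toward periods of fast firing. A minimal sketch of the Gillespie algorithm for a single decay reaction plus a duration-weighted average; the reaction, rate constant, and seed are illustrative, not from BNS.

```python
import random

def gillespie_decay(n0, k, t_end, rng):
    # Gillespie exact stochastic simulation of the decay reaction A -> B
    t, n = 0.0, n0
    path = [(0.0, n0)]                 # (event time, molecule count)
    while n > 0:
        t += rng.expovariate(k * n)    # exponential waiting time, propensity k*n
        if t >= t_end:
            break
        n -= 1
        path.append((t, n))
    return path

def time_weighted_mean(path, t_end):
    # average molecule number, weighting each state by how long it persisted
    total = 0.0
    for (t0, n), (t1, _) in zip(path, path[1:] + [(t_end, None)]):
        total += n * (t1 - t0)
    return total / t_end

path = gillespie_decay(1000, 1.0, 1.0, random.Random(42))
avg = time_weighted_mean(path, 1.0)
```

For this reaction the analytic time average over [0, 1] is n0(1 - e^-1), about 632 molecules, so the weighted estimator can be checked directly; averaging the counts at event times instead would systematically undershoot it.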
Influence of the time scale on the construction of financial networks.
Emmert-Streib, Frank; Dehmer, Matthias
2010-09-30
In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks in which nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients; that is, an edge is included in the network only if the correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Numerical analysis of four different measures as a function of the time scale used to construct the networks allows us to gain insight into the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis.
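Edge inclusion by statistical significance can be sketched with Pearson correlation plus Fisher's z-test: an edge appears only when the correlation differs significantly from zero. The three toy series below are assumptions for illustration; the study itself uses DJIA closing-price data.

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def correlation_network(series, z_crit=1.96):
    # unweighted, undirected network: keep an edge only when Fisher's
    # z-test rejects zero correlation at the given critical value
    names = sorted(series)
    n = len(next(iter(series.values())))
    edges = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(series[a], series[b])
            z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)
            if abs(z) > z_crit:
                edges.add((a, b))
    return edges

# toy data: B tracks A closely, C is unrelated to both
series = {
    "A": [float(i) for i in range(30)],
    "B": [i + (0.1 if i % 2 else -0.1) for i in range(30)],
    "C": [1.0 if i % 2 else -1.0 for i in range(30)],
}
net = correlation_network(series)
```

Because the z statistic scales with the square root of the interval length, the same correlation can pass the test at one time scale and fail at another, which is one concrete way the time scale shapes the resulting network.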
Is walking a random walk? Evidence for long-range correlations in stride interval of human gait
NASA Technical Reports Server (NTRS)
Hausdorff, Jeffrey M.; Peng, C.-K.; Ladin, Zvi; Wei, Jeanne Y.; Goldberger, Ary L.
1995-01-01
Complex fluctuations of unknown origin appear in the normal gait pattern. These fluctuations might be described as (1) uncorrelated white noise, (2) short-range correlations, or (3) long-range correlations with power-law scaling. To test these possibilities, the stride interval of 10 healthy young men was measured as they walked for 9 min at their usual rate. From these time series we calculated scaling indexes using a modified random walk analysis and power spectral analysis. Both indexes indicated the presence of long-range, self-similar correlations extending over hundreds of steps; the stride interval at any time depended on the stride interval at remote previous times, and this dependence decayed in a scale-free (fractal-like) power-law fashion. These scaling indexes were significantly different from those obtained after random shuffling of the original time series, indicating the importance of the sequential ordering of the stride intervals. We demonstrate that conventional models of gait generation fail to reproduce the observed scaling behavior and introduce a new type of central pattern generator model that successfully accounts for the experimentally observed long-range correlations.
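The "modified random walk analysis" family of methods estimates a scaling exponent from how detrended fluctuations grow with window size: white noise gives an exponent near 0.5, while an integrated, random-walk-like signal gives a much larger one. A minimal detrended fluctuation analysis sketch, with arbitrary scales and seed, demonstrating that separation:

```python
import math
import random

def dfa_alpha(x, scales):
    # detrended fluctuation analysis: slope of log F(s) versus log s
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:                          # integrated, mean-subtracted profile
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        nbox = len(profile) // s
        f2 = 0.0
        for b in range(nbox):            # linear detrend within each box
            seg = profile[b * s:(b + 1) * s]
            tm, sm = (s - 1) / 2.0, sum(seg) / s
            denom = sum((t - tm) ** 2 for t in range(s))
            slope = sum((t - tm) * (v - sm) for t, v in enumerate(seg)) / denom
            inter = sm - slope * tm
            f2 += sum((v - (inter + slope * t)) ** 2 for t, v in enumerate(seg))
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(f2 / (nbox * s)))
    lm, fm = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
    return (sum((a - lm) * (b - fm) for a, b in zip(log_s, log_f))
            / sum((a - lm) ** 2 for a in log_s))

rng = random.Random(11)
noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]   # expect alpha ~ 0.5
walk, acc = [], 0.0
for v in noise:                                      # expect alpha ~ 1.5
    acc += v
    walk.append(acc)
scales = [8, 16, 32, 64, 128]
```

Shuffling a stride-interval series destroys its ordering and drives the exponent back toward the white-noise value, which is exactly the control the study describes.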
Lo, Po-Han; Tsou, Mei-Yung; Chang, Kuang-Yi
2015-09-01
Patient-controlled epidural analgesia (PCEA) is commonly used for pain relief after total knee arthroplasty (TKA). This study aimed to model the trajectory of analgesic demand over time after TKA and explore its influential factors using latent curve analysis. Data were retrospectively collected from 916 patients receiving unilateral or bilateral TKA and postoperative PCEA. PCEA demands during 12-hour intervals for 48 hours were directly retrieved from infusion pumps. Potentially influential factors of PCEA demand, including age, height, weight, body mass index, sex, and infusion pump settings, were also collected. A latent curve analysis with 2 latent variables, the intercept (baseline) and slope (trend), was applied to model the changes in PCEA demand over time. The effects of influential factors on these 2 latent variables were estimated to examine how these factors interacted with time to alter the trajectory of PCEA demand over time. On average, the difference in analgesic demand between the first and second 12-hour intervals was only 15% of that between the first and third 12-hour intervals. No significant difference in PCEA demand was noted between the third and fourth 12-hour intervals. Aging tended to decrease the baseline PCEA demand but body mass index and infusion rate were positively correlated with the baseline. Only sex significantly affected the trend parameter and male individuals tended to have a smoother decreasing trend of analgesic demands over time. Patients receiving bilateral procedures did not consume more analgesics than their unilateral counterparts. Goodness of fit analysis indicated acceptable model fit to the observed data. Latent curve analysis provided valuable information about how analgesic demand after TKA changed over time and how patient characteristics affected its trajectory.
Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin
2013-01-01
Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, described by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: Our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin
2014-01-30
The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, following a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
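Both records above concern estimating the spectral index β in S(f) = 1/f^β. As a minimal, self-contained illustration of what β measures (a periodogram-slope sketch, not the authors' averaged wavelet coefficient method), the following estimates β for synthetic white noise (β ≈ 0) and Brownian noise (β ≈ 2):

```python
import numpy as np

def spectral_index(x):
    """Estimate beta in S(f) ~ 1/f^beta from the log-log periodogram slope."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    f = np.fft.rfftfreq(n, d=1.0)[1:]                 # drop f = 0
    power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2
    slope, _ = np.polyfit(np.log(f), np.log(power), 1)
    return -slope                                     # S(f) ~ f^(-beta)

rng = np.random.default_rng(0)
white = rng.standard_normal(8192)   # theoretical beta ~ 0
brown = np.cumsum(white)            # theoretical beta ~ 2
```

A periodogram fit is the crudest member of this family; the papers' point is precisely that such estimators differ in bias and variance on short, biased physiological signals.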
VizieR Online Data Catalog: Fermi/GBM GRB time-resolved spectral catalog (Yu+, 2016)
NASA Astrophysics Data System (ADS)
Yu, H.-F.; Preece, R. D.; Greiner, J.; Bhat, P. N.; Bissaldi, E.; Briggs, M. S.; Cleveland, W. H.; Connaughton, V.; Goldstein, A.; von Kienlin, A.; Kouveliotou, C.; Mailyan, B.; Meegan, C. A.; Paciesas, W. S.; Rau, A.; Roberts, O. J.; Veres, P.; Wilson-Hodge, C.; Zhang, B.-B.; van Eerten, H. J.
2016-01-01
Time-resolved spectral analysis results of BEST models: for each spectrum, the GRB name using the Fermi GBM trigger designation, spectrum number within the individual burst, start time Tstart and end time Tstop of the time bin, BEST model, best-fit parameters of the BEST model, value of CSTAT per degree of freedom, and 10 keV-1 MeV photon and energy flux are given. Ep evolutionary trends: for each burst, the GRB name, number of spectra with Ep, Spearman's rank correlation coefficients between Ep and photon flux with 90%, 95%, and 99% confidence intervals, Spearman's rank correlation coefficients between Ep and energy flux with 90%, 95%, and 99% confidence intervals, Spearman's rank correlation coefficient between Ep and time with 90%, 95%, and 99% confidence intervals, trends as determined by computer for the 90%, 95%, and 99% confidence intervals, and trends as determined by human eyes are given. (2 data files).
Using operations research to plan improvement of the transport of critically ill patients.
Chen, Jing; Awasthi, Anjali; Shechter, Steven; Atkins, Derek; Lemke, Linda; Fisher, Les; Dodek, Peter
2013-01-01
Operations research is the application of mathematical modeling, statistical analysis, and mathematical optimization to understand and improve processes in organizations. The objective of this study was to illustrate how the methods of operations research can be used to identify opportunities to reduce the absolute value and variability of interfacility transport intervals for critically ill patients. After linking data from two patient transport organizations in British Columbia, Canada, for all critical care transports during the calendar year 2006, the steps for transfer of critically ill patients were tabulated into a series of time intervals. Statistical modeling, root-cause analysis, Monte Carlo simulation, and sensitivity analysis were used to test the effect of changes in component intervals on overall duration and variation of transport times. Based on quality improvement principles, we focused on reducing the 75th percentile and standard deviation of these intervals. We analyzed a total of 3808 ground and air transports. Constraining time spent by transport personnel at sending and receiving hospitals was projected to reduce the total time taken by 33 minutes with as much as a 20% reduction in standard deviation of these transport intervals in 75% of ground transfers. Enforcing a policy of requiring acceptance of patients who have life- or limb-threatening conditions or organ failure was projected to reduce the standard deviation of air transport time by 63 minutes and the standard deviation of ground transport time by 68 minutes. Based on findings from our analyses, we developed recommendations for technology renovation, personnel training, system improvement, and policy enforcement. Use of the tools of operations research identifies opportunities for improvement in a complex system of critical care transport.
Zhang, Zhenwei; VanSwearingen, Jessie; Brach, Jennifer S.; Perera, Subashan
2016-01-01
Human gait is a complex interaction of many nonlinear systems; stride intervals exhibit self-similarity over long time scales that can be modeled as a fractal process. The scaling exponent represents the fractal degree and can be interpreted as a biomarker of relative health and disease. A previous study showed that the average wavelet method provides the most accurate estimates of this scaling exponent when applied to stride interval time series. The purpose of this paper is to determine the most suitable mother wavelet for the average wavelet method. This paper presents a comparative numerical analysis of sixteen mother wavelets using simulated and real fractal signals. Simulated fractal signals were generated under varying signal lengths and scaling exponents that indicate a range of physiologically conceivable fractal signals. Five candidate mother wavelets were chosen for their good performance on the mean square error test for both short and long signals. Next, we comparatively analyzed these five mother wavelets for physiologically relevant stride time series lengths. Our analysis showed that the symlet 2 mother wavelet provides a low mean square error and low variance for long time intervals and relatively low errors for short signal lengths. It can be considered the most suitable mother wavelet without the burden of considering signal length. PMID:27960102
Influence of the Time Scale on the Construction of Financial Networks
Emmert-Streib, Frank; Dehmer, Matthias
2010-01-01
Background In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. Methodology/Principal Findings For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That is, we include an edge in the network only if a correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Conclusions/Significance Numerical analysis of four different measures in dependence on the time scale for the construction of networks allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis. PMID:20949124
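The construction described above, an edge wherever a correlation coefficient differs significantly from zero, can be sketched as follows. The t-based significance rule and the synthetic three-stock price data are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def correlation_network(prices, t_crit=1.96):
    """Boolean adjacency matrix: edge iff the return correlation is
    significantly nonzero, using t = r * sqrt((n-2) / (1-r^2))."""
    returns = np.diff(np.log(prices), axis=0)       # log returns per stock
    n = returns.shape[0]
    r = np.corrcoef(returns, rowvar=False)
    with np.errstate(divide="ignore"):
        t = r * np.sqrt((n - 2) / (1 - r ** 2))     # diagonal -> inf, masked below
    adj = np.abs(t) > t_crit
    np.fill_diagonal(adj, False)                    # no self-loops
    return adj

# hypothetical data: stocks 0 and 1 share a common factor, stock 2 is independent
rng = np.random.default_rng(1)
common = rng.standard_normal(251).cumsum()
p1 = np.exp(0.01 * (common + 0.1 * rng.standard_normal(251).cumsum()))
p2 = np.exp(0.01 * (common + 0.1 * rng.standard_normal(251).cumsum()))
p3 = np.exp(0.01 * rng.standard_normal(251).cumsum())
adj = correlation_network(np.column_stack([p1, p2, p3]))
```

One network like this per non-overlapping window gives the time-scale-dependent sequence of graphs the paper analyzes.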
Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang
2013-01-01
Frequent rise of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods treating intervals as fixed points, which are generally practiced by pharmaceutical industry, sometimes yield inferior or even flawed analysis results in extreme cases for interval-censored data. In this article, we examine the limitation of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, methods that are more sophisticated but robust. Trial design issues involved with interval-censored data comprise another topic to be explored in this article. Unlike right-censored survival data, expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments. There can be substantial power loss in trials with interval-censored data if the assessments are very infrequent. Such an additional dependency controverts many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana
2007-04-01
Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). This article emphasizes the clinical and prognostic significance of dynamic changes in short-time series from patients with coronary heart disease (CHD) during the exercise electrocardiograph (ECG) test. Subjects were included in the series after complete cardiovascular diagnostic workup. Series of R-R and ST-T intervals were obtained from digitally sampled exercise ECG data. The rescaled range analysis method was used to determine the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of fluctuations in the time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test program separately, as well as in the exercise program overall. ApEn was significantly lower in the CHD group in both R-R and ST-T ECG intervals (P < 0.001). Nonlinear dynamic methods could thus have clinical and prognostic applicability in short-time ECG series as well. Dynamic analysis based on chaos theory during the exercise ECG test reveals multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
A new variable interval schedule with constant hazard rate and finite time range.
Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco
2018-05-27
We propose a new variable interval (VI) schedule that achieves a constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
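A minimal simulation of such a schedule follows, assuming the linear reinforcement rule p(d) = d/(2T). That rule is one plausible reading of "depends linearly on trial duration"; the paper's exact rule may differ:

```python
import random

def vi_trial(T, rng=random):
    """One trial (sketch): sample duration d ~ Uniform(0, 2T), then
    reinforce with probability d / (2T) -- an assumed linear rule."""
    d = rng.uniform(0.0, 2.0 * T)
    reinforced = rng.random() < d / (2.0 * T)
    return d, reinforced

random.seed(5)
T = 30.0  # seconds; hypothetical schedule parameter
trials = [vi_trial(T) for _ in range(10_000)]
frac_reinforced = sum(r for _, r in trials) / len(trials)  # E[d/(2T)] = 1/2
```

Under this rule every trial is shorter than 2T and, on average, half of the trials end in reinforcement, consistent with the bounded-range property the abstract emphasizes.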
Moulki, Naeem; Kealhofer, Jessica V; Benditt, David G; Gravely, Amy; Vakil, Kairav; Garcia, Santiago; Adabag, Selcuk
2018-06-16
Bifascicular block and prolonged PR interval on the electrocardiogram (ECG) have been associated with complete heart block and sudden cardiac death. We sought to determine if cardiac implantable electronic devices (CIED) improve survival in these patients. We assessed survival in relation to CIED status among 636 consecutive patients with bifascicular block and prolonged PR interval on the ECG. In survival analyses, CIED was considered as a time-varying covariate. Average age was 76 ± 9 years, and 99% of the patients were men. A total of 167 (26%) underwent CIED (127 pacemaker only) implantation at baseline (n = 23) or during follow-up (n = 144). During 5.4 ± 3.8 years of follow-up, 83 (13%) patients developed complete or high-degree atrioventricular block and 375 (59%) died. Patients with a CIED had a longer survival compared to those without a CIED in the traditional, static analysis (log-rank p < 0.0001) but not when CIED was considered as a time-varying covariate (log-rank p = 0.76). In the multivariable model, patients with a CIED had a 34% lower risk of death (hazard ratio 0.66, 95% confidence interval 0.52-0.83; p = 0.001) than those without CIED in the traditional analysis but not in the time-varying covariate analysis (hazard ratio 1.05, 95% confidence interval 0.79-1.38; p = 0.76). Results did not change in the subgroup with a pacemaker only. Bifascicular block and prolonged PR interval on ECG are associated with a high incidence of complete atrioventricular block and mortality. However, CIED implantation does not have a significant influence on survival when the time-varying nature of CIED implantation is considered.
Method of high precision interval measurement in pulse laser ranging system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong
2013-09-01
Laser ranging offers high measurement precision, fast measurement speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time-interval measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of its time-interval measurement. This paper introduces the principal structure of a laser ranging system and establishes a method for high-precision time-interval measurement in a pulsed laser ranging system. Based on an analysis of the factors affecting range-measurement precision, a pulse rising-edge discriminator was adopted to produce the timing mark for start-stop time discrimination, and a TDC-GP2 high-precision interval measurement system based on a TMS320F2812 DSP was designed to improve measurement precision. Experimental results indicate that the time-interval measurement method presented here achieves higher range accuracy. Compared with traditional time-interval measurement systems, the method simplifies the system design, reduces the influence of bad weather conditions, and satisfies requirements for low cost and miniaturization.
Waynforth, David
2015-10-01
Human birth interval length is indicative of the level of parental investment that a child will receive: a short interval following birth means that parental resources must be split with a younger sibling during a period when the older sibling remains highly dependent on their parents. From a life-history theoretical perspective, it is likely that there are evolved mechanisms that serve to maximize fitness depending on context. One context that would be expected to result in short birth intervals, and lowered parental investment, is after a child with low expected fitness is born. Here, data drawn from a longitudinal British birth cohort study were used to test whether birth intervals were shorter following the birth of a child with a long-term health problem. Data on the timing of 4543 births were analysed using discrete-time event history analysis. The results were consistent with the hypothesis: birth intervals were shorter following the birth of a child diagnosed by a medical professional with a severe but non-fatal medical condition. Covariates in the analysis were also significantly associated with birth interval length: births of twins or multiple births, and relationship break-up were associated with significantly longer birth intervals. © 2015 The Author(s).
Graphic analysis and multifractal on percolation-based return interval series
NASA Astrophysics Data System (ADS)
Pei, A. Q.; Wang, J.
2015-05-01
A financial time series model is developed and investigated using the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return interval time series are studied for the proposed model and for the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of return intervals of the model for different parameter settings, and also comparatively study these fluctuation patterns with those of real financial data for different threshold values. The empirical research in this work exhibits multifractal features for the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data show small-world behavior, hierarchy, high clustering, and power-law tails in their degree distributions.
The dose delivery effect of the different Beam ON interval in FFF SBRT: TrueBEAM
NASA Astrophysics Data System (ADS)
Tawonwong, T.; Suriyapee, S.; Oonsiri, S.; Sanghangthum, T.; Oonsiri, P.
2016-03-01
The purpose of this study is to determine the dose delivery effect of different Beam ON intervals in Flattening Filter Free Stereotactic Body Radiation Therapy (FFF-SBRT). Three 10MV-FFF SBRT plans (2 half-rotating Rapid Arc, 9 to 10 Gray/fraction) were selected and irradiated at three different intervals (100%, 50% and 25%) using the RPM gating system. Plan verification was performed with the ArcCHECK for gamma analysis and an ionization chamber for point dose measurement. The dose delivery time for each interval was recorded. For gamma analysis (2% & 2 mm criteria), the average percent pass of all plans for the 100%, 50% and 25% intervals was 86.1±3.3%, 86.0±3.0% and 86.1±3.3%, respectively. For point dose measurement, the average ratios of each interval to the treatment planning values were 1.012±0.015, 1.011±0.014 and 1.011±0.013 for the 100%, 50% and 25% intervals, respectively. The average dose delivery time increased from 74.3±5.0 seconds for the 100% interval to 154.3±12.6 and 347.9±20.3 seconds for the 50% and 25% intervals, respectively. Dose delivery quality was thus the same across the different Beam ON intervals in FFF-SBRT on the TrueBEAM. While the 100% interval represents the breath-hold treatment technique, free-breathing treatment using the RPM gating system can be delivered with confidence.
ERIC Educational Resources Information Center
Radley, Keith C.; O'Handley, Roderick D.; Labrot, Zachary C.
2015-01-01
Assessment in social skills training often utilizes procedures such as partial-interval recording (PIR) and momentary time sampling (MTS) to estimate changes in duration in social engagements due to intervention. Although previous research suggests PIR to be more inaccurate than MTS in estimating levels of behavior, treatment analysis decisions…
Horr, Ninja K.; Di Luca, Massimiliano
2015-01-01
In this work we investigate how judgments of perceived duration are influenced by the properties of the signals that define the intervals. Participants compared two auditory intervals that could be any combination of the following four types: intervals filled with continuous tones (filled intervals), intervals filled with regularly-timed short tones (isochronous intervals), intervals filled with irregularly-timed short tones (anisochronous intervals), and intervals demarcated by two short tones (empty intervals). Results indicate that the type of intervals to be compared affects discrimination performance and induces distortions in perceived duration. In particular, we find that duration judgments are most precise when comparing two isochronous and two continuous intervals, while the comparison of two anisochronous intervals leads to the worst performance. Moreover, we determined that the magnitude of the distortions in perceived duration (an effect akin to the filled duration illusion) is higher for tone sequences (no matter whether isochronous or anisochronous) than for continuous tones. Further analysis of how duration distortions depend on the type of filling suggests that distortions are not only due to the perceived duration of the two individual intervals, but they may also be due to the comparison of two different filling types. PMID:25717310
Interval timing in genetically modified mice: a simple paradigm
Balci, F.; Papachristos, E. B.; Gallistel, C. R.; Brunner, D.; Gibson, J.; Shumyatsky, G. P.
2009-01-01
We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout of the receptor for gastrin-releasing peptide that show enhanced (i.e. prolonged) freezing in fear conditioning. We have tested the hypothesis that the mutants freeze longer because they are more uncertain than wild types about when to expect the electric shock. The knockouts however show normal accuracy and precision in timing, so we have rejected this alternative hypothesis. Last, we conduct the pharmacological validation of our behavioral screen using D-amphetamine and methamphetamine. We suggest including the analysis of interval timing and temporal memory in tests of genetically modified mice for learning and memory and argue that our paradigm allows this to be done simply and efficiently. PMID:17696995
Interval timing in genetically modified mice: a simple paradigm.
Balci, F; Papachristos, E B; Gallistel, C R; Brunner, D; Gibson, J; Shumyatsky, G P
2008-04-01
We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout of the receptor for gastrin-releasing peptide that show enhanced (i.e. prolonged) freezing in fear conditioning. We have tested the hypothesis that the mutants freeze longer because they are more uncertain than wild types about when to expect the electric shock. The knockouts however show normal accuracy and precision in timing, so we have rejected this alternative hypothesis. Last, we conduct the pharmacological validation of our behavioral screen using d-amphetamine and methamphetamine. We suggest including the analysis of interval timing and temporal memory in tests of genetically modified mice for learning and memory and argue that our paradigm allows this to be done simply and efficiently.
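The paradigm's two summary statistics, the median of the switch-latency distribution for timing accuracy and its interquartile interval for timing precision, can be computed directly from a sample of latencies. The lognormal latencies below are hypothetical stand-ins, not the study's data:

```python
import random
import statistics

random.seed(3)
# hypothetical switch latencies (s) from the short- to the long-latency station
latencies = [random.lognormvariate(1.0, 0.3) for _ in range(500)]

accuracy = statistics.median(latencies)          # timing accuracy
q1, _, q3 = statistics.quantiles(latencies, n=4) # quartile cut points
precision = q3 - q1                              # interquartile interval
```

Comparing these two numbers between knockout and wild-type groups is exactly the accuracy/precision comparison the screen is built around.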
Kuiper, Gerhardus J A J M; Houben, Rik; Wetzels, Rick J H; Verhezen, Paul W M; Oerle, Rene van; Ten Cate, Hugo; Henskens, Yvonne M C; Lancé, Marcus D
2017-11-01
Low platelet counts and hematocrit levels hinder whole blood point-of-care testing of platelet function. Thus far, no reference ranges for MEA (multiple electrode aggregometry) and PFA-100 (platelet function analyzer 100) devices exist for low ranges. Through dilution methods of volunteer whole blood, platelet function at low ranges of platelet count and hematocrit levels was assessed on MEA for four agonists and for PFA-100 in two cartridges. Using (multiple) regression analysis, 95% reference intervals were computed for these low ranges. Low platelet counts affected MEA in a positive correlation (all agonists showed r² ≥ 0.75) and PFA-100 in an inverse correlation (closure times were prolonged with lower platelet counts). Lowered hematocrit did not affect MEA testing, except for arachidonic acid activation (ASPI), which showed a weak positive correlation (r² = 0.14). Closure time on PFA-100 testing was inversely correlated with hematocrit for both cartridges. Regression analysis revealed different 95% reference intervals in comparison with originally established intervals for both MEA and PFA-100 in low platelet or hematocrit conditions. Multiple regression analysis of ASPI and both tests on the PFA-100 for combined low platelet and hematocrit conditions revealed that only PFA-100 testing should be adjusted for both thrombocytopenia and anemia. 95% reference intervals were calculated using multiple regression analysis. However, coefficients of determination of PFA-100 were poor, and some variance remained unexplained. Thus, in this pilot study using (multiple) regression analysis, we could establish reference intervals of platelet function in anemia and thrombocytopenia conditions on PFA-100 and in thrombocytopenia conditions on MEA.
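A generic sketch of deriving a 95% reference interval from a regression fit follows. The platelet counts, closure times, and the simple homoscedastic-residual rule are all illustrative assumptions, not the study's data or exact method:

```python
import numpy as np

# hypothetical calibration data: platelet count (x10^9/L) vs closure time (s),
# with closure time prolonged at lower counts (inverse correlation)
rng = np.random.default_rng(7)
platelets = rng.uniform(10, 150, 200)
closure = 250 - 1.2 * platelets + rng.normal(0, 15, 200)

slope, intercept = np.polyfit(platelets, closure, 1)
residual_sd = np.std(closure - (slope * platelets + intercept), ddof=2)

def reference_interval(count):
    """Approximate 95% reference interval for closure time at a given count,
    assuming normally distributed residuals of constant spread."""
    center = slope * count + intercept
    return center - 1.96 * residual_sd, center + 1.96 * residual_sd

lo, hi = reference_interval(50)
```

The count-dependent interval is the point of the regression approach: a single fixed reference range would mislabel results from thrombocytopenic samples.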
Temporal Structure of Volatility Fluctuations
NASA Astrophysics Data System (ADS)
Wang, Fengzhong; Yamasaki, Kazuko; Stanley, H. Eugene; Havlin, Shlomo
Volatility fluctuations are of great importance for the study of financial markets, and the temporal structure is an essential feature of fluctuations. To explore the temporal structure, we employ a new approach based on the return interval, which is defined as the time interval between two successive volatility values that are above a given threshold. We find that the distribution of the return intervals follows a scaling law over a wide range of thresholds, and over a broad range of sampling intervals. Moreover, this scaling law is universal for stocks of different countries, for commodities, for interest rates, and for currencies. However, further and more detailed analysis of the return intervals shows some systematic deviations from the scaling law. We also demonstrate a significant memory effect in the time organization of the return intervals. We find that the distribution of return intervals is strongly related to the correlations in the volatility.
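The return-interval construction is easy to sketch. The i.i.d. |N(0,1)| "volatility" below is a stand-in (real volatility has memory, which is part of the paper's point), and rescaling each interval series by its mean is what makes distributions at different thresholds comparable:

```python
import numpy as np

def return_intervals(volatility, q):
    """Time gaps between successive exceedances of threshold q."""
    exceedance_times = np.flatnonzero(volatility > q)
    return np.diff(exceedance_times)

rng = np.random.default_rng(2)
vol = np.abs(rng.standard_normal(100_000))   # memoryless stand-in series

tau_lo = return_intervals(vol, 1.0)          # frequent exceedances
tau_hi = return_intervals(vol, 2.0)          # rare exceedances
scaled_lo = tau_lo / tau_lo.mean()           # rescaled intervals for
scaled_hi = tau_hi / tau_hi.mean()           # cross-threshold comparison
```

For this memoryless series the scaled distributions are simply exponential; deviations from that shape, and clustering of short intervals, are the signatures of volatility memory the paper reports.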
Krall, Scott P; Cornelius, Angela P; Addison, J Bruce
2014-03-01
To analyze the correlation between the many different emergency department (ED) treatment metric intervals and determine whether the metrics directly impacted by the physician correlate with the "door to room" interval in an ED (an interval determined by ED bed availability). Our null hypothesis was that the cause of the variation in delay to receiving a room was multifactorial and would not correlate with any one metric interval. We collected daily interval averages from the ED information system, Meditech©. Patient flow metrics were collected on a 24-hour basis. We analyzed the relationship between the time intervals that make up an ED visit and the "arrival to room" interval using simple correlation (Pearson correlation coefficients). Summary statistics of industry-standard metrics were also computed by dividing the intervals into 2 groups, based on the average ED length of stay (LOS) from the National Hospital Ambulatory Medical Care Survey: 2008 Emergency Department Summary. Simple correlation analysis showed that the doctor-to-discharge interval had no correlation with the "door to room" (waiting room time) interval, correlation coefficient (CC) = 0.000 (p=0.96). "Room to doctor" had a low correlation with "door to room" (CC=0.143), while "decision to admitted patients departing the ED" had a moderate correlation of 0.29 (p<0.001). "New arrivals" (daily patient census) had a strong correlation with longer "door to room" times (CC=0.657, p<0.001). "Door to discharge" times had a very strong correlation (CC=0.804, p<0.001) with extended "door to room" time. Physician-dependent intervals had minimal correlation with the variation in arrival-to-room time. The "door to room" interval was a significant component of the variation in "door to discharge", i.e. LOS. The hospital-influenced "admit decision to hospital bed" interval, i.e. hospital inpatient capacity, correlated with delayed "door to room" time.
The other major factor affecting department bed availability was the "total patients per day." The correlation to the increasing "door to room" time also reflects the effect of availability of ED resources (beds) on the patient evaluation time. The time that it took for a patient to receive a room appeared more dependent on the system resources, for example, beds in the ED, as well as in the hospital, than on the physician.
Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza
2016-01-01
Time to donation plays a major role in a first-time blood donor becoming a regular one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, in the capital city of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, residence and job were recorded as independent variables. Data analysis was performed with a log-normal hazard model with gamma correlated frailty, in which the frailties are the sum of two independent components, each assumed to follow a gamma distribution. The analysis was carried out via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criterion using the BOA program in R. Age, job and education had significant effects on the chance of donating blood (P<0.05). Older donors, clerical workers, manual workers, the self-employed, students and more educated donors had higher chances of donating and, correspondingly, shorter time intervals between their blood donations. Given the significant effects of these variables in the log-normal correlated frailty model, it is necessary to plan educational and cultural programs to encourage people with longer inter-donation intervals to donate more frequently.
Emotion processing in the visual brain: a MEG analysis.
Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus
2008-06-01
Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole head magneto-encephalogram (MEG) was measured while participants viewed in separate experimental blocks a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating the selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed the amplified processing of emotional pictures in visual processing areas with more pronounced occipito-parieto-temporal activation in the early time interval, and a stronger engagement of more anterior, temporal, regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.
Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.
Kis, Maria
2005-01-01
In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed with autoregressive integrated moving average (ARIMA) models, and seasonal time series models were applied to data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as ARIMA modelling; this is demonstrated with two examples: analysis of mortality rates for ischemic heart disease and for cancers of the digestive system. Mathematical expressions are given for the results of the analysis, and the relationships between the time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: the standard normal approximation, estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we conclude that the intervals obtained with the continuous-time estimation model were much narrower than those from the other estimations. We also present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia, decomposing the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using a seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons, supporting a seasonal occurrence of childhood leukaemia in Hungary.
Factors influencing pre-hospital care time intervals in Iran: a qualitative study.
Khorasani-Zavareh, Davoud; Mohammadi, Reza; Bohm, Katarina
2018-06-23
Pre-hospital time management provides better access to victims of road traffic crashes (RTCs) and can help minimize preventable deaths, injuries and disabilities. While most studies have focused on measuring various time intervals in the pre-hospital phase, to the best of our knowledge there is no study qualitatively exploring the barriers and facilitators that affect these intervals. The present study aimed to explore factors affecting the various time intervals relating to road traffic incidents in the pre-hospital phase and to provide suggestions for improvements in Iran. The study was conducted during 2013-2014 at both the national and local level in Iran. Overall, 18 face-to-face interviews with emergency medical services (EMS) personnel were used for data collection. Qualitative content analysis was employed to analyze the data. The most important barriers in relation to pre-hospital intervals were related to the manner of cooperation by members of the public with the EMS and their involvement at the crash scene, as well as to pre-hospital system factors, including the number and location of EMS facilities, the type and number of ambulances, and manpower. These factors usually affect how rapidly the EMS can arrive at the scene of the crash and how quickly victims can be transferred to hospital. These two categories comprise six main themes: notification interval; activation interval; response interval; on-scene interval; transport interval; and delivery interval. Despite more focus on physical resources, cooperation from members of the public needs to be taken into account in order to achieve better pre-hospital management of the various intervals, possibly through the use of public education campaigns.
Alpermann, Anke; Huber, Walter; Natke, Ulrich; Willmes, Klaus
2010-09-01
Improved fluency after stuttering therapy is usually measured by the percentage of stuttered syllables. However, outcome studies rarely evaluate the use of trained speech patterns that speakers use to manage stuttering. This study investigated whether the modified time interval analysis can distinguish between trained speech patterns, fluent speech, and stuttered speech. Seventeen German experts on stuttering judged a speech sample on two occasions. Speakers of the sample were stuttering adults, who were not undergoing therapy, as well as participants in a fluency shaping and a stuttering modification therapy. Results showed satisfactory inter-judge and intra-judge agreement above 80%. Intervals with trained speech patterns were identified as consistently as stuttered and fluent intervals. We discuss limitations of the study, as well as implications of our findings for the development of training for identification of trained speech patterns and future outcome studies. The reader will be able to (a) explain different methods to measure the use of trained speech patterns, (b) evaluate whether German experts are able to discriminate intervals with trained speech patterns reliably from fluent and stuttered intervals and (c) describe how the measurement of trained speech patterns can contribute to outcome studies.
Cardiopulmonary resuscitation quality: Widespread variation in data intervals used for analysis.
Talikowska, Milena; Tohira, Hideo; Bailey, Paul; Finn, Judith
2016-05-01
There is a growing body of evidence for the relationship between CPR quality and survival in cardiac arrest patients. We sought to describe the characteristics of the analysis intervals used across studies. Relevant papers were selected as described in our recent systematic review. From these papers we collected information about (1) the time interval used for analysis; (2) the event that marked the beginning of the analysis interval; and (3) the minimum amount of CPR quality data required for a case to be included in the analysed cohort. We then compared these data across papers. Twenty-one studies reported on the association between CPR quality and cardiac arrest patient survival. In two thirds of the studies, data from the start of the resuscitation episode were analysed, in particular the first 5 min. Commencement of the analysis interval was marked by various events, including ECG pad placement and first chest compression. Nine studies specified a minimum amount of data that had to have been collected for an individual case to be included in the analysis, most commonly 1 min of data. The use of shorter intervals allowed more cases to be included, since cases without a complete dataset could still contribute. To facilitate comparisons across studies, a standardised definition of the data analysis interval should be developed, one that maximises the number of cases available without compromising the data's representativeness of the resuscitation effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Feurstein, Markus C.; Teich, Malvin C.
1998-02-01
We applied multiresolution wavelet analysis to the sequence of times between human heartbeats (R-R intervals) and have found a scale window, between 16 and 32 heartbeat intervals, over which the widths of the R-R wavelet coefficients fall into disjoint sets for normal and heart-failure patients. This has enabled us to correctly classify every patient in a standard data set as belonging either to the heart-failure or normal group with 100% accuracy, thereby providing a clinically significant measure of the presence of heart failure from the R-R intervals alone. Comparison is made with previous approaches, which have provided only statistically significant measures.
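The general idea can be illustrated with a minimal sketch (not the authors' pipeline): compute discrete Haar wavelet coefficients of an R-R series at a chosen scale and take their standard deviation as the "width". The series, modulation, and noise levels below are synthetic assumptions chosen only so that a slowly modulated series shows a wider coefficient spread at the 16-beat scale than an uncorrelated one.

```python
import math
import random
import statistics

def haar_wavelet_std(rr, scale):
    """Std of Haar wavelet coefficients of an R-R series at a given scale.

    Each coefficient compares the mean interval over two adjacent windows
    of `scale` beats (a discrete Haar detail, up to normalization)."""
    coeffs = []
    for i in range(0, len(rr) - 2 * scale + 1, scale):
        left = sum(rr[i:i + scale]) / scale
        right = sum(rr[i + scale:i + 2 * scale]) / scale
        coeffs.append((left - right) / math.sqrt(2.0))
    return statistics.pstdev(coeffs)

# Synthetic example: an R-R series with slow sinusoidal modulation vs. a
# flat uncorrelated series, evaluated at a scale inside the 16-32 beat
# window discussed in the abstract.
random.seed(1)
healthy = [0.8 + 0.05 * math.sin(2 * math.pi * k / 60) + random.gauss(0, 0.02)
           for k in range(4096)]
flat = [0.8 + random.gauss(0, 0.02) for k in range(4096)]
print(haar_wavelet_std(healthy, 16), haar_wavelet_std(flat, 16))
```

The modulated series yields a clearly larger coefficient width; the paper's classification rests on the analogous separation observed in real patient data.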
Multivariable nonlinear analysis of foreign exchange rates
NASA Astrophysics Data System (ADS)
Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo
2003-05-01
We analyze multivariable time series of foreign exchange rates: price movements, which have often been analyzed, together with dealing time intervals and the spreads between bid and ask prices. Treating dealing time intervals as event timings, analogous to neuronal firings, we use raster plots (RPs) and peri-stimulus time histograms (PSTHs), which are popular methods in neurophysiology. Introducing special processing to obtain RPs and PSTHs for exchange-rate time series, we discover that a dynamical interaction exists among the three variables. We also find that adopting multiple variables improves prediction accuracy.
Dynamic response analysis of structure under time-variant interval process model
NASA Astrophysics Data System (ADS)
Xia, Baizhan; Qin, Yuan; Yu, Dejie; Jiang, Chao
2016-10-01
Owing to aggressive environmental factors, variation of the dynamic load, degradation of material properties and wear of machine surfaces, the parameters of a structure are distinctly time-variant. The typical model for time-variant uncertainties is the random process model, which must be constructed from a large number of samples. In this work, we propose a time-variant interval process model which can effectively deal with time-variant uncertainties when only limited information is available. Two methods are then presented for the dynamic response analysis of a structure under the time-variant interval process model. The first is the direct Monte Carlo method (DMCM), whose computational burden is relatively high. The second is the Monte Carlo method based on Chebyshev polynomial expansion (MCM-CPE), whose computational efficiency is high: the dynamic response of the structure is approximated by Chebyshev polynomials, which can be evaluated efficiently, and the variational range of the dynamic response is then estimated from the samples yielded by the Monte Carlo method. To address the dependency phenomenon of interval arithmetic, affine arithmetic is integrated into the Chebyshev polynomial expansion. The computational effectiveness and efficiency of MCM-CPE are verified by two numerical examples: a spring-mass-damper system and a shell structure.
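The surrogate-plus-sampling idea behind MCM-CPE can be sketched under strong simplifying assumptions: here the "dynamic response" is replaced by a toy static spring deflection u(k) = F/k with the stiffness k an interval parameter, the response is fitted once with a Chebyshev interpolant, and Monte Carlo samples of the cheap interpolant estimate the response range. This is an illustration of the technique's shape, not the paper's formulation (no time variation, no affine arithmetic).

```python
import math
import random

def cheb_fit(f, lo, hi, order):
    """Chebyshev interpolation coefficients of f on [lo, hi]
    (Chebyshev-Gauss nodes, discrete cosine projection)."""
    n = order + 1
    nodes = [math.cos(math.pi * (k + 0.5) / n) for k in range(n)]
    vals = [f(0.5 * (hi - lo) * t + 0.5 * (hi + lo)) for t in nodes]
    coeffs = []
    for j in range(n):
        c = 2.0 / n * sum(v * math.cos(math.pi * j * (k + 0.5) / n)
                          for k, v in enumerate(vals))
        coeffs.append(c)
    coeffs[0] /= 2.0
    return coeffs

def cheb_eval(coeffs, lo, hi, x):
    """Evaluate the fitted Chebyshev series at x in [lo, hi]."""
    t = (2.0 * x - lo - hi) / (hi - lo)
    return sum(c * math.cos(j * math.acos(t)) for j, c in enumerate(coeffs))

# Toy "response": static deflection u(k) = F/k, stiffness interval [1.8, 2.2].
F = 10.0
resp = lambda k: F / k
coeffs = cheb_fit(resp, 1.8, 2.2, 4)

# Monte Carlo over the interval parameter, using the cheap surrogate.
random.seed(3)
samples = [cheb_eval(coeffs, 1.8, 2.2, random.uniform(1.8, 2.2))
           for _ in range(20_000)]
# Exact bounds are F/2.2 ≈ 4.55 and F/1.8 ≈ 5.56.
print(round(min(samples), 2), round(max(samples), 2))
```

In the paper the surrogate is built over the time-variant interval process and affine arithmetic controls the dependency effect; the division of labor (expensive model evaluated few times, sampling done on the polynomial) is the same.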
Age-related alterations in the fractal scaling of cardiac interbeat interval dynamics
NASA Technical Reports Server (NTRS)
Iyengar, N.; Peng, C. K.; Morin, R.; Goldberger, A. L.; Lipsitz, L. A.
1996-01-01
We postulated that aging is associated with disruption in the fractallike long-range correlations that characterize healthy sinus rhythm cardiac interval dynamics. Ten young (21-34 yr) and 10 elderly (68-81 yr) rigorously screened healthy subjects underwent 120 min of continuous supine resting electrocardiographic recording. We analyzed the interbeat interval time series using standard time and frequency domain statistics and using a fractal measure, detrended fluctuation analysis, to quantify long-range correlation properties. In healthy young subjects, interbeat intervals demonstrated fractal scaling, with scaling exponents (α) from the fluctuation analysis close to a value of 1.0. In the group of healthy elderly subjects, the interbeat interval time series had two scaling regions. Over the short range, interbeat interval fluctuations resembled a random walk process (Brownian noise, α = 1.5), whereas over the longer range they resembled white noise (α = 0.5). Short-range (αs) and long-range (αl) scaling exponents were significantly different in the elderly subjects compared with the young (αs = 1.12 ± 0.19 vs. 0.90 ± 0.14, respectively, P = 0.009; αl = 0.75 ± 0.17 vs. 0.99 ± 0.10, respectively, P = 0.002). The crossover behavior from one scaling region to another could be modeled as a first-order autoregressive process, which closely fit the data from four elderly subjects. This implies that a single characteristic time scale may be dominating heartbeat control in these subjects. The age-related loss of fractal organization in heartbeat dynamics may reflect the degradation of integrated physiological regulatory systems and may impair an individual's ability to adapt to stress.
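Detrended fluctuation analysis itself is compact enough to sketch. The minimal implementation below (non-overlapping windows, first-order detrending, slope of log F(n) vs. log n) recovers the expected α ≈ 0.5 for white noise; the input signal is synthetic, and this is not the authors' code.

```python
import math
import random

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis: scaling exponent alpha as the slope
    of log F(n) versus log n over the given window sizes."""
    # Integrated (cumulative-sum, mean-removed) profile.
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    logs_n, logs_f = [], []
    for n in scales:
        rms_sum, count = 0.0, 0
        for start in range(0, len(y) - n + 1, n):
            seg = y[start:start + n]
            # Least-squares linear detrend within the window.
            t = list(range(n))
            tb, sb = (n - 1) / 2.0, sum(seg) / n
            beta = sum((ti - tb) * (si - sb) for ti, si in zip(t, seg)) / \
                   sum((ti - tb) ** 2 for ti in t)
            rms_sum += sum((si - (sb + beta * (ti - tb))) ** 2
                           for ti, si in zip(t, seg))
            count += n
        logs_n.append(math.log(n))
        logs_f.append(0.5 * math.log(rms_sum / count))
    # Slope of the log-log fit is the exponent alpha.
    lb, fb = sum(logs_n) / len(logs_n), sum(logs_f) / len(logs_f)
    return sum((l - lb) * (f - fb) for l, f in zip(logs_n, logs_f)) / \
           sum((l - lb) ** 2 for l in logs_n)

random.seed(0)
white = [random.gauss(0, 1) for _ in range(8192)]
print(round(dfa_exponent(white, [8, 16, 32, 64, 128]), 2))  # ~0.5
```

Integrating the same noise (a random walk) pushes α toward 1.5, the short-range behavior the paper reports for elderly subjects.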
Luo, Yuan; Szolovits, Peter
2016-01-01
In natural language processing, stand-off annotation uses the starting and ending positions of an annotation to anchor it to the text and stores the annotation content separately from the text. We address the fundamental problem of efficiently storing stand-off annotations when applying natural language processing on narrative clinical notes in electronic medical records (EMRs) and efficiently retrieving such annotations that satisfy position constraints. Efficient storage and retrieval of stand-off annotations can facilitate tasks such as mapping unstructured text to electronic medical record ontologies. We first formulate this problem into the interval query problem, for which optimal query/update time is in general logarithm. We next perform a tight time complexity analysis on the basic interval tree query algorithm and show its nonoptimality when being applied to a collection of 13 query types from Allen's interval algebra. We then study two closely related state-of-the-art interval query algorithms, proposed query reformulations, and augmentations to the second algorithm. Our proposed algorithm achieves logarithmic time stabbing-max query time complexity and solves the stabbing-interval query tasks on all of Allen's relations in logarithmic time, attaining the theoretic lower bound. Updating time is kept logarithmic and the space requirement is kept linear at the same time. We also discuss interval management in external memory models and higher dimensions.
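The stabbing query at the core of this problem can be illustrated with a textbook centered interval tree (not the augmented stabbing-max structure the paper develops). The annotation spans below are invented character offsets.

```python
class IntervalTree:
    """Minimal centered interval tree supporting stabbing queries.

    A classic construction: intervals overlapping the median endpoint are
    stored at the node, sorted by start and by end; the rest recurse left
    or right. Stabbing then costs O(log n + k) for k reported intervals."""

    def __init__(self, intervals):
        self.center = None
        if not intervals:
            return
        pts = sorted(p for iv in intervals for p in iv[:2])
        self.center = pts[len(pts) // 2]
        left, right, here = [], [], []
        for iv in intervals:
            if iv[1] < self.center:
                left.append(iv)
            elif iv[0] > self.center:
                right.append(iv)
            else:
                here.append(iv)
        self.by_start = sorted(here)                       # ascending start
        self.by_end = sorted(here, key=lambda iv: -iv[1])  # descending end
        self.left = IntervalTree(left) if left else None
        self.right = IntervalTree(right) if right else None

    def stab(self, q):
        """Return all stored intervals (s, e) with s <= q <= e."""
        if self.center is None:
            return []
        out = []
        if q < self.center:
            for iv in self.by_start:
                if iv[0] > q:
                    break
                out.append(iv)
            if self.left:
                out += self.left.stab(q)
        else:
            for iv in self.by_end:
                if iv[1] < q:
                    break
                out.append(iv)
            if self.right:
                out += self.right.stab(q)
        return out

# Hypothetical annotation spans (start, end) over a note's character offsets.
tree = IntervalTree([(0, 5), (3, 9), (8, 12), (10, 20)])
print(sorted(tree.stab(4)))  # spans covering offset 4 → [(0, 5), (3, 9)]
```

The paper's contribution is reaching the logarithmic lower bound across all of Allen's relations and for stabbing-max; this sketch only shows the plain stabbing primitive those queries build on.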
NASA Astrophysics Data System (ADS)
Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong
2016-08-01
A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) reducing computational time through the introduction of a Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered in each duration on the basis of four criteria. Results indicated that the most desirable management strategy lay in action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.
Time variations of solar UV irradiance as measured by the SOLSTICE (UARS) instrument
NASA Technical Reports Server (NTRS)
London, Julius; Rottman, Gary J.; Woods, Thomas N.; Wu, Fie
1993-01-01
An analysis is presented of solar ultraviolet irradiance measurements made by the SOLSTICE spectrometers on the Upper Atmosphere Research Satellite (UARS). Reported observations cover the wavelength interval 119-420 nm, and the analysis discussed here is for the time period 26 Nov 1991 to 31 Dec 1992, during which time solar activity decreased in intensity. At the time of peak activity, the average 27-day variation had a relative amplitude of about 8 percent at Ly-alpha, tailing off to about 0.6 percent at 260 nm. It is shown that over the spectral interval 119-260 nm, the relative 27-day harmonic was about a factor of two larger during the strongly disturbed as compared with the moderately disturbed period.
Statistical regularities in the return intervals of volatility
NASA Astrophysics Data System (ADS)
Wang, F.; Weber, P.; Yamasaki, K.; Havlin, S.; Stanley, H. E.
2007-01-01
We discuss recent results concerning statistical regularities in the return intervals of volatility in financial markets. In particular, we show how the analysis of volatility return intervals, defined as the time between two volatilities larger than a given threshold, can help to get a better understanding of the behavior of financial time series. We find scaling in the distribution of return intervals for thresholds ranging over a factor of 25, from 0.6 to 15 standard deviations, and also for various time windows from one minute up to 390 min (an entire trading day). Moreover, these results are universal for different stocks, commodities, interest rates as well as currencies. We also analyze the memory in the return intervals which relates to the memory in the volatility and find two scaling regimes, ℓ<ℓ* with α1=0.64±0.02 and ℓ> ℓ* with α2=0.92±0.04; these exponent values are similar to results of Liu et al. for the volatility. As an application, we use the scaling and memory properties of the return intervals to suggest a possibly useful method for estimating risk.
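Extracting return intervals from a volatility series is simple to sketch. Note that for the iid noise used here (an assumption for illustration only) the intervals are memoryless, with mean approximately 1/p for exceedance probability p; the scaling and memory reported in the paper arise from the long-range correlations of real volatility.

```python
import random
import statistics

def return_intervals(series, threshold):
    """Times between successive values exceeding `threshold`."""
    hits = [t for t, v in enumerate(series) if v > threshold]
    return [b - a for a, b in zip(hits, hits[1:])]

# Synthetic stand-in for volatility: absolute values of iid Gaussian noise.
random.seed(42)
vol = [abs(random.gauss(0, 1)) for _ in range(100_000)]

for q in (1.0, 2.0):
    ivals = return_intervals(vol, q)
    # For iid N(0,1), P(|x| > 1) ≈ 0.317 and P(|x| > 2) ≈ 0.046,
    # so mean intervals should be near 3.2 and 22 respectively.
    print(q, round(statistics.mean(ivals), 1))
```

Testing the interval distribution of real data against this memoryless baseline is one way to expose the memory effects the authors analyze.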
Return Intervals Approach to Financial Fluctuations
NASA Astrophysics Data System (ADS)
Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H. Eugene
Financial fluctuations play a key role for financial markets studies. A new approach focusing on properties of return intervals can help to get better understanding of the fluctuations. A return interval is defined as the time between two successive volatilities above a given threshold. We review recent studies and analyze the 1000 most traded stocks in the US stock markets. We find that the distribution of the return intervals has a well approximated scaling over a wide range of thresholds. The scaling is also valid for various time windows from one minute up to one trading day. Moreover, these results are universal for stocks of different countries, commodities, interest rates as well as currencies. Further analysis shows some systematic deviations from a scaling law, which are due to the nonlinear correlations in the volatility sequence. We also examine the memory in return intervals for different time scales, which are related to the long-term correlations in the volatility. Furthermore, we test two popular models, FIGARCH and fractional Brownian motion (fBm). Both models can catch the memory effect but only fBm shows a good scaling in the return interval distribution.
Voter model with non-Poissonian interevent intervals
NASA Astrophysics Data System (ADS)
Takaguchi, Taro; Masuda, Naoki
2011-09-01
Recent analysis of social communications among humans has revealed that the interval between interactions for a pair of individuals and for an individual often follows a long-tail distribution. We investigate the effect of such a non-Poissonian nature of human behavior on dynamics of opinion formation. We use a variant of the voter model and numerically compare the time to consensus of all the voters with different distributions of interevent intervals and different networks. Compared with the exponential distribution of interevent intervals (i.e., the standard voter model), the power-law distribution of interevent intervals slows down consensus on the ring. This is because of the memory effect; in the power-law case, the expected time until the next update event on a link is large if the link has not had an update event for a long time. On the complete graph, the consensus time in the power-law case is close to that in the exponential case. Regular graphs bridge these two results such that the slowing down of the consensus in the power-law case as compared to the exponential case is less pronounced as the degree increases.
Estimating average annual per cent change in trend analysis
Clegg, Limin X; Hankey, Benjamin F; Tiwari, Ram; Feuer, Eric J; Edwards, Brenda K
2009-01-01
Trends in incidence or mortality rates over a specified time interval are usually described by the conventional annual per cent change (cAPC), under the assumption of a constant rate of change. When this assumption does not hold over the entire time interval, the trend may be characterized using the annual per cent changes from segmented analysis (sAPCs). This approach assumes that the change in rates is constant over each time partition defined by the transition points, but varies among different time partitions. Different groups (e.g. racial subgroups), however, may have different transition points and thus different time partitions over which they have constant rates of change, making comparison of sAPCs problematic across groups over a common time interval of interest (e.g. the past 10 years). We propose a new measure, the average annual per cent change (AAPC), which uses sAPCs to summarize and compare trends for a specific time period. The advantage of the proposed AAPC is that it takes into account the trend transitions, whereas cAPC does not and can lead to erroneous conclusions. In addition, when the trend is constant over the entire time interval of interest, the AAPC has the advantage of reducing to both cAPC and sAPC. Moreover, because the estimated AAPC is based on the segmented analysis over the entire data series, any selected subinterval within a single time partition will yield the same AAPC estimate; that is, it will be equal to the estimated sAPC for that time partition. The cAPC, however, is re-estimated using data only from that selected subinterval; thus, its estimate may be sensitive to the subinterval selected. The AAPC estimation has been incorporated into the segmented regression (free) software Joinpoint, which is used by many registries throughout the world for characterizing trends in cancer rates. Copyright © 2009 John Wiley & Sons, Ltd. PMID:19856324
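Consistent with the definition described above, the AAPC can be computed as a segment-length-weighted average of the joinpoint slopes on the log scale, transformed back to a per cent change. The segment slopes below are illustrative inputs, not estimates from data; in practice they come from joinpoint regression of log rates on year.

```python
import math

def aapc(segments):
    """Average annual per cent change over joinpoint segments.

    `segments` is a list of (years_in_segment, slope) pairs, where the
    slope b is the regression coefficient of log(rate) on year, so the
    segment's own annual per cent change is 100*(exp(b) - 1). AAPC is the
    length-weighted mean of the slopes, transformed back."""
    total = sum(w for w, _ in segments)
    wmean = sum(w * b for w, b in segments) / total
    return 100.0 * (math.exp(wmean) - 1.0)

# Two segments: 6 years rising 3%/yr, then 4 years falling 1%/yr.
segs = [(6, math.log(1.03)), (4, math.log(0.99))]
print(round(aapc(segs), 2))  # ≈ 1.38
```

The reduction property from the abstract holds by construction: with a single segment, the weighted mean is just that segment's slope, so AAPC equals the sAPC (and the cAPC).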
Dehkordi, Parastoo; Garde, Ainara; Karlen, Walter; Wensley, David; Ansermino, J Mark; Dumont, Guy A
2013-01-01
Heart rate variability (HRV), the variation of time intervals between heartbeats, is one of the most promising and widely used quantitative markers of autonomic activity. Traditionally, HRV is measured as the series of instantaneous cycle intervals obtained from the electrocardiogram (ECG). In this study, we investigated the estimation of variation in heart rate from a photoplethysmography (PPG) signal, called pulse rate variability (PRV), and assessed its accuracy as an estimate of HRV in children with and without sleep disordered breathing (SDB). We recorded raw PPG from 72 children using the Phone Oximeter, an oximeter connected to a mobile phone. Full polysomnography, including ECG, was simultaneously recorded for each subject. We used correlation and Bland-Altman analysis to compare the parameters of HRV and PRV between the two groups of children. Significant correlation (r > 0.90, p < 0.05) and close agreement were found between HRV and PRV for the mean intervals, the standard deviation of intervals (SDNN) and the root-mean-square of successive interval differences (RMSSD). However, Bland-Altman analysis showed a large divergence for the LF/HF ratio. In addition, children with SDB had depressed SDNN and RMSSD and elevated LF/HF in comparison to children without SDB. In conclusion, PRV provides an accurate estimate of HRV for time-domain parameters but not a precise estimate of frequency-domain parameters.
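The two time-domain parameters compared here have standard definitions that are easy to state in code; the R-R values below are invented for illustration.

```python
import math
import statistics

def sdnn(rr):
    """Standard deviation of normal-to-normal intervals (same units as rr)."""
    return statistics.pstdev(rr)

def rmssd(rr):
    """Root mean square of successive interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical beat-to-beat intervals in milliseconds.
rr_ms = [812, 798, 805, 840, 833, 821, 809, 826]
print(round(sdnn(rr_ms), 1), round(rmssd(rr_ms), 1))  # → 13.5 17.3
```

Because both statistics depend only on the interval series, they can be computed identically from ECG-derived R-R intervals and PPG-derived pulse intervals, which is what makes the HRV/PRV comparison in the study direct.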
Time Transfer from Combined Analysis of GPS and TWSTFT Data
2008-12-01
40th Annual Precise Time and Time Interval (PTTI) Meeting. Abstract: This paper presents the time transfer results obtained from the combination of GPS data and TWSTFT data. Two different methods ... view, constrained by TWSTFT data. Using the Vondrak-Cepek algorithm, the second approach (named PPP+TW) combines the TWSTFT time transfer data with ...
Persistence analysis of extreme CO, NO2 and O3 concentrations in ambient air of Delhi
NASA Astrophysics Data System (ADS)
Chelani, Asha B.
2012-05-01
Persistence analysis of air pollutant concentrations and the corresponding exceedance time series is carried out to examine their temporal evolution. For this purpose, concentrations of CO, NO2 and O3 observed during 2000-2009 at a traffic site in Delhi are analyzed using detrended fluctuation analysis. Two types of extreme-value series are analyzed: concentrations exceeding the threshold set by the national pollution control agency, and the time intervals between two exceedances. The time series of all three pollutants are observed to possess the persistence property, whereas the extreme-value series are persistent only for the primary pollutant concentrations. Two scaling regions are observed to be significant in the extreme-value series of CO and NO2, attributed mainly to the introduction of CNG in vehicles. The presence of persistence in the three pollutant concentration time series is linked to the property of self-organized criticality. The observed persistence in the time intervals between exceedances is a matter of concern, as persistently high concentrations can trigger health problems.
Si, Tianmei; Li, Nan; Lu, Huafei; Cai, Shangli; Zhuo, Jianmin; Correll, Christoph U; Zhang, Lili; Feng, Yu
2018-06-01
Limited data are available to help identify patients with schizophrenia who are most likely to benefit from long-acting injectable antipsychotics. To investigate the efficacy of long-acting injectable antipsychotic paliperidone palmitate one-month formulation for preventing relapses, factors influencing time to first relapse, and the effect of different antipsychotic adherence levels on time to first relapse in Chinese patients with schizophrenia. This was a post-hoc analysis from an open-label, single-arm study of stable patients (Positive and Negative Syndrome Scale total score <70; n=367) receiving paliperidone palmitate one-month formulation at the end of an acute 13-week treatment phase, who entered a naturalistic one-year follow-up period, either continuing with flexibly dosed paliperidone palmitate one-month formulation (75-150 mg eq.) or switching to another antipsychotic(s). There were 362/367 patients (age=31.4±10.75 years) included in the analysis of time to first relapse (primary outcome) and 327/362 patients (39/327, poor antipsychotic adherence (<80%)) willing to receive antipsychotics were included in the exposure/adherence analysis. Overall, 84.6% (95% confidence interval=79.2-88.7) patients remained relapse-free. Poor adherence during follow-up (hazard ratio=2.97, 95% confidence interval=1.48-5.98, p=0.002) and frequent hospitalizations in the previous year (hazard ratio=1.29, 95% confidence interval=1.02-1.62, p=0.03) were associated with a significant risk of shorter time to first relapse in the univariate analysis. In patients with poor adherence, 'no use' (hazard ratio=13.13, 95% confidence interval=1.33-129.96, p=0.03) and 'interrupted use' (hazard ratio=11.04, 95% confidence interval=1.03-118.60, p=0.047) of paliperidone palmitate one-month formulation (vs continued use) showed a significantly higher risk of relapse; this was not observed in patients with good (≥80%) antipsychotic adherence. No new safety concerns were identified. 
Continued use of paliperidone palmitate one-month formulation/long-acting injectable antipsychotic was effective in preventing schizophrenia relapses, especially in patients with suboptimal antipsychotic adherence.
NASA Astrophysics Data System (ADS)
Glazner, Allen F.; Sadler, Peter M.
2016-12-01
The duration of a geologic interval, such as the time over which a given volume of magma accumulated to form a pluton, or the lifespan of a large igneous province, is commonly determined from a relatively small number of geochronologic determinations (e.g., 4-10) within that interval. Such sample sets can underestimate the true length of the interval by a significant amount. For example, the average interval determined from a sample of size n = 5, drawn from a uniform random distribution, will underestimate the true interval by 50%. Even for n = 10, the average sample only captures ˜80% of the interval. If the underlying distribution is known then a correction factor can be determined from theory or Monte Carlo analysis; for a uniform random distribution, this factor is
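The underestimate quoted above can be checked by simulation. For dates drawn uniformly at random, the expected range of n samples is (n-1)/(n+1) of the true interval, about 0.67 for n = 5 (so the true interval is roughly 50% longer than the sample range, matching the figure in the abstract) and about 0.82 for n = 10; the implied correction factor is (n+1)/(n-1). A quick Monte Carlo sketch, with all parameters chosen for illustration:

```python
import random

def mean_coverage(n, trials=200_000, rng=random.Random(7)):
    """Average fraction of a unit interval spanned by the range
    (max - min) of n dates drawn uniformly at random."""
    acc = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        acc += max(xs) - min(xs)
    return acc / trials

# Both values should be close to (n-1)/(n+1): ~0.67 and ~0.82.
print(round(mean_coverage(5), 2), round(mean_coverage(10), 2))
```

As the abstract notes, this correction is distribution-dependent; the uniform case here is the one the quoted numbers assume.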
Fricke, T; Teachman, J D
1993-05-01
Using data from a Nepali population, this analysis argues that marriage style and postmarital living arrangements affect coital frequency to produce variations in the timing of first birth after marriage. Event history analysis of the first birth interval for 149 women suggests that women's autonomy in marriage decisions and marriage to cross-cousins accelerate the pace of entry into first birth. Extended-household residence with reduced natal kin contact, on the other hand, significantly lengthens the first birth interval. These findings are consistent with previous arguments in the literature while offering new evidence for the impact of extended-family residence on fertility.
Timing of Radiotherapy and Outcome in Patients Receiving Adjuvant Endocrine Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karlsson, Per, E-mail: per.karlsson@oncology.gu.s; Cole, Bernard F.; International Breast Cancer Study Group Statistical Center, Department of Biostatistics and Computational Biology, Dana-Farber Cancer Institute, Boston, MA
2011-06-01
Purpose: To evaluate the association between the interval from breast-conserving surgery (BCS) to radiotherapy (RT) and the clinical outcome among patients treated with adjuvant endocrine therapy. Patients and Methods: Patient information was obtained from three International Breast Cancer Study Group trials. The analysis was restricted to 964 patients treated with BCS and adjuvant endocrine therapy. The patients were divided into two groups according to the median number of days between BCS and RT and into four groups according to the quartile of time between BCS and RT. The endpoints were the interval to local recurrence, disease-free survival, and overall survival. Proportional hazards regression analysis was used to perform comparisons after adjustment for baseline factors. Results: The median interval between BCS and RT was 77 days. RT timing was significantly associated with age, menopausal status, and estrogen receptor status. After adjustment for these factors, no significant effect of a RT delay ≤20 weeks was found. The adjusted hazard ratio for RT within 77 days vs. after 77 days was 0.94 (95% confidence interval [CI], 0.47-1.87) for the interval to local recurrence, 1.05 (95% CI, 0.82-1.34) for disease-free survival, and 1.07 (95% CI, 0.77-1.49) for overall survival. For the interval to local recurrence the adjusted hazard ratio for ≤48, 49-77, and 78-112 days was 0.90 (95% CI, 0.34-2.37), 0.86 (95% CI, 0.33-2.25), and 0.89 (95% CI, 0.33-2.41), respectively, relative to ≥113 days. Conclusion: A RT delay of ≤20 weeks was significantly associated with baseline factors such as age, menopausal status, and estrogen-receptor status. After adjustment for these factors, the timing of RT was not significantly associated with the interval to local recurrence, disease-free survival, or overall survival.
An analysis of first-time blood donors return behaviour using regression models.
Kheiri, S; Alibeigi, Z
2015-08-01
Blood products have a vital role in saving many patients' lives. The aim of this study was to analyse blood donor return behaviour. Using a cross-sectional follow-up design of 5-year duration, 864 first-time donors who had donated blood were selected using systematic sampling. The behaviours of donors via three response variables, return to donation, frequency of return to donation and the time interval between donations, were analysed based on logistic regression, negative binomial regression and Cox's shared frailty model for recurrent events respectively. The successful return-to-donation rate was 49.1% and the deferral rate was 13.3%. There was a significant inverse relationship between the frequency of return to donation and the time interval between donations. Sex, body weight and job had an effect on return to donation; weight and frequency of donation during the first year had a direct effect on the total frequency of donations. Age, weight and job had a significant effect on the time intervals between donations. Aging decreases the chances of return to donation and increases the time interval between donations. Body weight affects all three response variables, i.e. the higher the weight, the greater the chances of return to donation and the shorter the time interval between donations. There is a positive correlation between the frequency of donations in the first year and the total number of return donations. Also, the shorter the time interval between donations is, the higher the frequency of donations. © 2015 British Blood Transfusion Society.
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components.
Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
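SPA itself is written in FORTRAN IV, but the quantities it reports are standard and can be sketched compactly. The Python below is an illustrative re-implementation of the listed time-domain statistics plus a naive DFT periodogram standing in for the power-spectrum routine (it omits SPA's windowing, smoothing, and detrending; all function names are ours, not SPA's).

```python
import math

def time_domain_stats(x):
    """Basic time-domain statistics of a sampled series,
    analogous to the quantities SPA reports."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    mean_sq = sum(v * v for v in x) / n
    return {
        "mean": mean,
        "variance": var,
        "std": math.sqrt(var),
        "mean_square": mean_sq,
        "rms": math.sqrt(mean_sq),
        "min": min(x),
        "max": max(x),
        "n": n,
    }

def periodogram(x):
    """Naive DFT power spectrum (frequency-domain analysis).
    O(n^2); fine for illustration, use an FFT in practice."""
    n = len(x)
    spectrum = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spectrum.append((re * re + im * im) / n)
    return spectrum
```

A pure sinusoid at cycle k produces a single peak at spectrum index k, which is how a periodicity in a time series shows up in the frequency domain.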
Baek, Hyun Jae; Shin, JaeWook
2017-08-15
Most of the wrist-worn devices on the market provide a continuous heart rate measurement function using photoplethysmography, but have not yet provided a function to measure continuous heart rate variability (HRV) using the beat-to-beat pulse interval. The reason is that measuring a continuous pulse interval during movement with a wearable device is difficult because of the nature of photoplethysmography, which is susceptible to motion noise. This study investigated the effect of missing heart beat interval data on HRV analysis in cases where the pulse interval cannot be measured because of movement noise. First, we performed simulations by randomly removing data from the RR intervals of the electrocardiogram measured from 39 subjects and observed the changes in the relative and normalized errors of the HRV parameters according to the total length of the missing heart beat interval data. Second, we measured the pulse interval from 20 subjects using a wrist-worn device for 24 h and observed the error values for the missing pulse interval data caused by movement during actual daily life. The experimental results showed that mean NN and RMSSD were the most robust to missing heart beat interval data among all the parameters in the time and frequency domains. Most of the pulse interval data could not be obtained during daily life; in other words, the sample number was too small for spectral analysis because of the long missing durations. Therefore, the frequency domain parameters often could not be calculated, except during the sleep state with little motion. The errors of the HRV parameters were proportional to the missing data duration. Based on the results of this study, the maximum missing duration yielding acceptable errors for each parameter is recommended for use when HRV analysis is performed on a wrist-worn device.
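The first (simulation) experiment can be illustrated with a toy version in Python: generate a synthetic RR series, delete a fraction of intervals at random, and compare mean NN and RMSSD before and after. All numbers here are hypothetical, and deleting intervals is a simplification of real dropout, which merges adjacent intervals rather than removing them.

```python
import math
import random

def mean_nn(nn):
    """Mean of the normal-to-normal (NN) intervals."""
    return sum(nn) / len(nn)

def rmssd(nn):
    """Root mean square of successive NN-interval differences."""
    diffs = [b - a for a, b in zip(nn, nn[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rng = random.Random(1)
# Hypothetical RR series (ms): ~800 ms baseline, a slow oscillation, noise.
rr = [800 + 20 * math.sin(t / 5) + rng.gauss(0, 5) for t in range(600)]

# Randomly discard 30% of intervals, mimicking beats lost to motion noise.
degraded = [v for v in rr if rng.random() > 0.3]

rel_err_mean = abs(mean_nn(degraded) - mean_nn(rr)) / mean_nn(rr)
rel_err_rmssd = abs(rmssd(degraded) - rmssd(rr)) / rmssd(rr)
```

In this toy setup the mean NN barely moves under 30% dropout; how strongly RMSSD moves depends on how the gaps are handled, which is why the distinction between merging and deleting intervals matters when reproducing such analyses.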
NASA Astrophysics Data System (ADS)
Deng, Wei; Wang, Jun
2015-06-01
We investigate and quantify the multifractal detrended cross-correlation of return interval series for Chinese stock markets and a proposed price model; the price model is established by oriented percolation. The return interval describes the waiting time between two successive price volatilities that are above some threshold, and the present work is an attempt to quantify the level of multifractal detrended cross-correlation for the return intervals. Further, the concept of the MF-DCCA coefficient of return intervals is introduced, and the corresponding empirical research is performed. The empirical results show that the return intervals of SSE and SZSE are weakly positively multifractally power-law cross-correlated and exhibit the fluctuation patterns of MF-DCCA coefficients. Similar behavior of the return intervals for the price model is also demonstrated.
Longitudinal study of fingerprint recognition.
Yoon, Soweon; Jain, Anil K
2015-07-14
Human identification by fingerprints is based on the fundamental premise that ridge patterns from distinct fingers are different (uniqueness) and a fingerprint pattern does not change over time (persistence). Although the uniqueness of fingerprints has been investigated by developing statistical models to estimate the probability of error in comparing two random samples of fingerprints, the persistence of fingerprints has remained a general belief based on only a few case studies. In this study, fingerprint match (similarity) scores are analyzed by multilevel statistical models with covariates such as time interval between two fingerprints in comparison, subject's age, and fingerprint image quality. Longitudinal fingerprint records of 15,597 subjects are sampled from an operational fingerprint database such that each individual has at least five 10-print records over a minimum time span of 5 y. In regard to the persistence of fingerprints, the longitudinal analysis on a single (right index) finger demonstrates that (i) genuine match scores tend to significantly decrease when time interval between two fingerprints in comparison increases, whereas the change in impostor match scores is negligible; and (ii) fingerprint recognition accuracy at operational settings, nevertheless, tends to be stable as the time interval increases up to 12 y, the maximum time span in the dataset. However, the uncertainty of temporal stability of fingerprint recognition accuracy becomes substantially large if either of the two fingerprints being compared is of poor quality. The conclusions drawn from 10-finger fusion analysis coincide with the conclusions from single-finger analysis.
Single photon detection and timing in the Lunar Laser Ranging Experiment.
NASA Technical Reports Server (NTRS)
Poultney, S. K.
1972-01-01
The goals of the Lunar Laser Ranging Experiment lead to the need for the measurement of a 2.5 sec time interval to an accuracy of a nanosecond or better. The systems analysis which included practical retroreflector arrays, available laser systems, and large telescopes led to the necessity of single photon detection. Operation under all background illumination conditions required auxiliary range gates and extremely narrow spectral and spatial filters in addition to the effective gate provided by the time resolution. Nanosecond timing precision at relatively high detection efficiency was obtained using the RCA C31000F photomultiplier and Ortec 270 constant fraction of pulse-height timing discriminator. The timing accuracy over the 2.5 sec interval was obtained using a digital interval with analog vernier ends. Both precision and accuracy are currently checked internally using a triggerable, nanosecond light pulser. Future measurements using sub-nanosecond laser pulses will be limited by the time resolution of single photon detectors.
It's time to fear! Interval timing in odor fear conditioning in rats
Shionoya, Kiseko; Hegoburu, Chloé; Brown, Bruce L.; Sullivan, Regina M.; Doyère, Valérie; Mouly, Anne-Marie
2013-01-01
Time perception is crucial to goal attainment in humans and other animals, and interval timing also guides fundamental animal behaviors. Accumulating evidence has made it clear that in associative learning, temporal relations between events are encoded, and a few studies suggest this temporal learning occurs very rapidly. Most of these studies, however, have used methodologies that do not permit investigating the emergence of this temporal learning. In the present study we monitored respiration, ultrasonic vocalization (USV) and freezing behavior in rats in order to perform fine-grained analysis of fear responses during odor fear conditioning. In this paradigm an initially neutral odor (the conditioned stimulus, CS) predicted the arrival of an aversive unconditioned stimulus (US, footshock) at a fixed 20-s time interval. We first investigated the development of a temporal pattern of responding related to CS-US interval duration. The data showed that during acquisition with odor-shock pairings, a temporal response pattern of respiration rate was observed. Changing the CS-US interval duration from 20-s to 30-s resulted in a shift of the temporal response pattern appropriate to the new duration, thus demonstrating that the pattern reflected the learning of the CS-US interval. A temporal pattern was also observed during a retention test 24 h later for both respiration and freezing measures, suggesting that the animals had stored the interval duration in long-term memory. We then investigated the role of intra-amygdalar dopaminergic transmission in interval timing. For this purpose, the D1 dopaminergic receptor antagonist SCH23390 was infused in the basolateral amygdala before conditioning. This resulted in an alteration of timing behavior, as reflected in differential temporal patterns between groups observed in a 24 h retention test off drug. The present data suggest that D1 receptor dopaminergic transmission within the amygdala is involved in temporal processing.
PMID:24098277
Neuropsychology of Timing and Time Perception
ERIC Educational Resources Information Center
Meck, W.H.
2005-01-01
Interval timing in the range of milliseconds to minutes is affected in a variety of neurological and psychiatric populations involving disruption of the frontal cortex, hippocampus, basal ganglia, and cerebellum. Our understanding of these distortions in timing and time perception are aided by the analysis of the sources of variance attributable…
Analysis of noise-induced temporal correlations in neuronal spike sequences
NASA Astrophysics Data System (ADS)
Reinoso, José A.; Torrent, M. C.; Masoller, Cristina
2016-11-01
We investigate temporal correlations in sequences of noise-induced neuronal spikes, using a symbolic method of time-series analysis. We focus on the sequence of time-intervals between consecutive spikes (inter-spike-intervals, ISIs). The analysis method, known as ordinal analysis, transforms the ISI sequence into a sequence of ordinal patterns (OPs), which are defined in terms of the relative ordering of consecutive ISIs. The ISI sequences are obtained from extensive simulations of two neuron models (FitzHugh-Nagumo, FHN, and integrate-and-fire, IF), with correlated noise. We find that, as the noise strength increases, temporal order gradually emerges, revealed by the existence of more frequent ordinal patterns in the ISI sequence. While in the FHN model the most frequent OP depends on the noise strength, in the IF model it is independent of the noise strength. In both models, the correlation time of the noise affects the OP probabilities but does not modify the most probable pattern.
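Ordinal analysis of an ISI sequence is straightforward to sketch: each window of D consecutive intervals is mapped to the permutation that sorts it, and pattern frequencies are tallied over the whole sequence. The following minimal Python version is illustrative, not the authors' code:

```python
from collections import Counter
from itertools import permutations

def ordinal_pattern(window):
    """Ordinal pattern of a window: the index permutation that sorts it.
    E.g. (0.3, 0.1, 0.5) -> (1, 0, 2): position 1 holds the smallest
    value, then position 0, then position 2."""
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def ordinal_distribution(isis, dim=3):
    """Relative frequencies of ordinal patterns of length `dim`
    over a sequence of inter-spike intervals (ISIs)."""
    counts = Counter(
        ordinal_pattern(isis[i:i + dim]) for i in range(len(isis) - dim + 1)
    )
    total = sum(counts.values())
    return {p: counts.get(p, 0) / total for p in permutations(range(dim))}
```

For dim = 3 there are 3! = 6 possible patterns; the noise-induced temporal order described in the abstract shows up as some patterns becoming markedly more frequent than the uniform value 1/6.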
Park, Bumsoo; Choo, Seol Ho; Jeon, Hwang Gyun; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han Yong
2014-12-01
Traditionally, urologists recommend an interval of at least 4 weeks after prostate biopsy before radical prostatectomy. The aim of our study was to evaluate whether the interval from prostate biopsy to radical prostatectomy affects immediate operative outcomes, with a focus on differences in surgical approach. The study population of 1,848 radical prostatectomy patients was divided into two groups according to the surgical approach: open or minimally invasive. Open group included perineal and retropubic approach, and minimally invasive group included laparoscopic and robotic approach. The cut-off of the biopsy-to-surgery interval was 4 weeks. Positive surgical margin status, operative time and estimated blood loss were evaluated as endpoint parameters. In the open group, there were significant differences in operative time and estimated blood loss between the <4-week and ≥4-week interval subgroups, but there was no difference in positive margin rate. In the minimally invasive group, there were no differences in the three outcome parameters between the two subgroups. Multivariate analysis revealed that the biopsy-to-surgery interval was not a significant factor affecting immediate operative outcomes in both open and minimally invasive groups, with the exception of the interval ≥4 weeks as a significant factor decreasing operative time in the minimally invasive group. In conclusion, performing open or minimally invasive radical prostatectomy within 4 weeks of prostate biopsy is feasible for both approaches, and is even beneficial for minimally invasive radical prostatectomy to reduce operative time.
Immortal time bias in observational studies of drug effects in pregnancy.
Matok, Ilan; Azoulay, Laurent; Yin, Hui; Suissa, Samy
2014-09-01
The use of decongestants during the second or third trimesters of pregnancy has been associated with a decreased risk of preterm delivery in two observational studies. This effect may have been subject to immortal time bias, a bias arising from the improper classification of exposure during follow-up. We illustrate this bias by repeating the studies using a different data source. The United Kingdom Hospital Episodes Statistics and the Clinical Practice Research Datalink databases were linked to identify all live singleton pregnancies among women aged 15 to 45 years between 1997 and 2012. Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals of preterm delivery (before 37 weeks of gestation) by considering the use of decongestants during the third trimester as a time-fixed (biased analysis which misclassifies unexposed person-time as exposed person-time) and time-varying exposure (unbiased analysis with proper classification of unexposed person-time). All models were adjusted for maternal age, smoking status, maternal diabetes, maternal hypertension, preeclampsia, and parity. Of the 195,582 singleton deliveries, 10,248 (5.2%) were born preterm. In the time-fixed analysis, the HR of preterm delivery for the use of decongestants was below the null and suggestive of a 46% decreased risk (adjusted HR = 0.54; 95% confidence interval, 0.24-1.20). In contrast, the HR was closer to null (adjusted HR = 0.93 95% confidence interval, 0.42-2.06) when the use of decongestants was treated as a time-varying variable. Studies of drug safety in pregnancy should use the appropriate statistical techniques to avoid immortal time bias, particularly when the exposure occurs at later stages of pregnancy. © 2014 Wiley Periodicals, Inc.
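The bias mechanism is easy to reproduce with a toy simulation: give every pregnancy the same constant daily hazard of delivery (so the drug truly does nothing), start the drug on a random day, and compare a time-fixed with a time-varying classification of person-time. This is a hedged illustration with made-up parameters, not the study's Cox model with covariates:

```python
import random

rng = random.Random(7)
LAM = 0.01       # constant daily hazard of the event (null drug effect)
HORIZON = 280.0  # days of follow-up

def simulate(n=20000):
    rows = []
    for _ in range(n):
        t = rng.expovariate(LAM)            # event time
        event_time = min(t, HORIZON)
        had_event = t <= HORIZON
        s = rng.uniform(0, HORIZON)         # day the drug would be started
        exposed_ever = s < event_time       # only reachable if still at risk
        rows.append((event_time, had_event, s, exposed_ever))
    return rows

def rate_ratio_time_fixed(rows):
    """Biased: all follow-up of ever-exposed subjects counted as exposed,
    including the 'immortal' person-time before the start day."""
    e_ev = sum(h for t, h, s, x in rows if x)
    e_pt = sum(t for t, h, s, x in rows if x)
    u_ev = sum(h for t, h, s, x in rows if not x)
    u_pt = sum(t for t, h, s, x in rows if not x)
    return (e_ev / e_pt) / (u_ev / u_pt)

def rate_ratio_time_varying(rows):
    """Unbiased: person-time before the start day is unexposed."""
    e_ev = sum(h for t, h, s, x in rows if x)
    e_pt = sum(t - s for t, h, s, x in rows if x)
    u_ev = sum(h for t, h, s, x in rows if not x)
    u_pt = sum(min(t, s) for t, h, s, x in rows)  # everyone contributes
    return (e_ev / e_pt) / (u_ev / u_pt)
```

The time-fixed rate ratio comes out well below 1 purely because pre-exposure person-time is credited to the exposed group, while the time-varying classification recovers a ratio near the true null value of 1, mirroring the 0.54 vs. 0.93 contrast reported above.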
Ballal, Nidambur Vasudev; Yegneswaran, Prakash Peralam; Mala, Kundabala; Bhat, Kadengodlu Seetharama
2011-11-01
The aim of this study was to evaluate the antimicrobial efficacy of 7% maleic acid (MA) and 17% ethylenediaminetetraacetic acid (EDTA) in the elimination of Enterococcus faecalis, Candida albicans, and Staphylococcus aureus at different time intervals. Transfer cultures of the microbial strains were used for inoculum preparation and determination of the time-kill assay. The viability counts of the 7% MA and 17% EDTA suspensions were performed at 0, 2, 4, 6, 12, and 24 hours. Assay results were analyzed by determining the number of strains that yielded a log(10) CFU/mL change of -1 compared with counts at 0 hours, for the test medicaments at each time interval. Medicaments were considered to be microbicidal at a minimum inhibitory concentration that reduced the original inoculum by >3 log(10) CFU/mL (99.9%) and microbiostatic if the inoculum was reduced by <3 log(10) CFU/mL. Statistical analysis was performed using chi-square and Fisher exact tests as well as the Friedman test for comparison of the time intervals within the MA and EDTA groups. At all time intervals, there was no significant difference between MA and EDTA for any of the organisms (P > .05). However, within the MA and EDTA groups at various time intervals, there were significant differences (P < .001). Equivalent antimicrobial activity was observed for MA and EDTA against all of the organisms tested at the various time periods. Copyright © 2011 Mosby, Inc. All rights reserved.
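The microbicidal/microbiostatic criterion quoted above is a simple log10 comparison against the 0-hour inoculum. A minimal sketch (function names are ours, not from the paper):

```python
import math

def log10_reduction(cfu_t0, cfu_t):
    """log10 drop in viable count relative to the 0-hour inoculum."""
    return math.log10(cfu_t0) - math.log10(cfu_t)

def classify(cfu_t0, cfu_t):
    """Microbicidal if the inoculum fell by >3 log10 CFU/mL (99.9%),
    otherwise microbiostatic, per the criterion in the abstract."""
    if log10_reduction(cfu_t0, cfu_t) > 3:
        return "microbicidal"
    return "microbiostatic"
```

For example, a fall from 10^6 to 5x10^2 CFU/mL is a 3.3-log reduction and counts as microbicidal, whereas a fall to 10^4 CFU/mL (2 logs) is only microbiostatic.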
Sharifi, Maryam; Ghassemi, Amirreza; Bayani, Shahin
2015-01-01
Success of orthodontic miniscrews in providing stable anchorage is dependent on their stability. The purpose of this study was to assess the effect of insertion method and postinsertion time interval on the removal torque of miniscrews as an indicator of their stability. Seventy-two miniscrews (Jeil Medical) were inserted into the femoral bones of three male German Shepherd dogs and assigned to nine groups of eight miniscrews. Three insertion methods, including hand-driven, motor-driven with 5.0-Ncm insertion torque, and motor-driven with 20.0-Ncm insertion torque, were tested. Three time intervals of 0, 2, and 6 weeks between miniscrew insertion and removal were tested as well. Removal torque values were measured in newton centimeters by a removal torque tester (IMADA). Data were analyzed by one-way analysis of variance (ANOVA) followed by the Bonferroni post hoc test at a .05 level of significance. A miniscrew survival rate of 93% was observed in this study. The highest mean value of removal torque among the three postinsertion intervals (2.4 ± 0.59 Ncm) was obtained immediately after miniscrew insertion, with a statistically significant difference from the other two time intervals (P < .001). No significant differences among the insertion methods were observed in this regard (P = .46). The stability of miniscrews was not affected by the insertion method. However, of the postinsertion time intervals, the highest removal torque values were obtained immediately after insertion.
Craig, David Philip Arthur; Varnon, Christopher A.; Sokolowski, Michel B. C.; Wells, Harrington; Abramson, Charles I.
2014-01-01
Interval timing is a key element of foraging theory, models of predator avoidance, and competitive interactions. Although interval timing is well documented in vertebrate species, it is virtually unstudied in invertebrates. In the present experiment, we used free-flying honey bees (Apis mellifera ligustica) as a model for timing behaviors. Subjects were trained to enter a hole in an automated artificial flower to receive a nectar reinforcer (i.e. reward). Responses were continuously reinforced prior to exposure to either a fixed interval (FI) 15-sec, FI 30-sec, FI 60-sec, or FI 120-sec reinforcement schedule. We measured response rate and post-reinforcement pause within each fixed interval trial between reinforcers. Honey bees responded at higher frequencies earlier in the fixed interval suggesting subject responding did not come under traditional forms of temporal control. Response rates were lower during FI conditions compared to performance on continuous reinforcement schedules, and responding was more resistant to extinction when previously reinforced on FI schedules. However, no “scalloped” or “break-and-run” patterns of group or individual responses reinforced on FI schedules were observed; no traditional evidence of temporal control was found. Finally, longer FI schedules eventually caused all subjects to cease returning to the operant chamber indicating subjects did not tolerate the longer FI schedules. PMID:24983960
Cyclostratigraphy, sequence stratigraphy and organic matter accumulation mechanism
NASA Astrophysics Data System (ADS)
Cong, F.; Li, J.
2016-12-01
The first member of the Maokou Formation of the Sichuan basin is composed of well-preserved carbonate ramp couplets of limestone and marlstone/shale. It acts as one of the potential shale gas source rocks and is suitable for time-series analysis. We conducted time-series analysis to identify high-frequency sequences, reconstruct high-resolution sedimentation rates, estimate detailed primary productivity for the first time in the study intervals, and discuss the organic matter accumulation mechanism of the source rock under a sequence stratigraphic framework. Using the theory of cyclostratigraphy and sequence stratigraphy, the high-frequency sequences of one outcrop profile and one drilling well are identified. Two third-order sequences and eight fourth-order sequences are distinguished on the outcrop profile based on cycle stacking patterns. For the drilling well, the sequence boundaries and four system tracts are distinguished by "integrated prediction error filter analysis" (INPEFA) of gamma-ray logging data, and eight fourth-order sequences are identified by the 405-ka long-eccentricity curve in the depth domain, which is quantified and filtered by integrated analysis of MTM spectral analysis, evolutive harmonic analysis (EHA), evolutive average spectral misfit (eASM) and band-pass filtering. This suggests that high-frequency sequences correlate well with Milankovitch orbital signals recorded in the sediments, and that it is applicable to use cyclostratigraphy theory in dividing high-frequency (4th-6th order) sequence stratigraphy. A high-resolution sedimentation rate is reconstructed through the study interval by tracking the highly statistically significant short-eccentricity component (123 ka) revealed by EHA. Based on the sedimentation rate and measured TOC and density data, the burial flux, delivery flux and primary productivity of organic carbon were estimated.
By integrating redox proxies, we can discuss the controls on organic matter accumulation by primary production and preservation under the high-resolution sequence stratigraphic framework. Results show that high average organic carbon contents in the study interval are mainly attributed to high primary production. The results also show a good correlation between high organic carbon accumulation and intervals of transgression.
Inconsistencies in Numerical Simulations of Dynamical Systems Using Interval Arithmetic
NASA Astrophysics Data System (ADS)
Nepomuceno, Erivelton G.; Peixoto, Márcia L. C.; Martins, Samir A. M.; Rodrigues, Heitor M.; Perc, Matjaž
Over the past few decades, interval arithmetic has been attracting widespread interest from the scientific community. With the expansion of computing power, scientific computing is encountering a noteworthy shift from floating-point arithmetic toward increased use of interval arithmetic. Notwithstanding the significant reliability of interval arithmetic, this paper presents a theoretical inconsistency in a simulation of dynamical systems using a well-known implementation of interval arithmetic. We have observed that two natural interval extensions present an empty intersection during a finite time range, which is contrary to the fundamental theorem of interval analysis. We have proposed a procedure to at least partially overcome this problem, based on the union of the two generated pseudo-orbits. This paper also shows a successful case of interval arithmetic application in the reduction of interval width in the simulation of a discrete map. The implications of our findings for the reliability of scientific computing using interval arithmetic have been properly addressed using two numerical examples.
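The core observation can be reproduced with a toy interval type and the logistic map. The sketch below is ours, not the authors' implementation: it uses degenerate point intervals and no directed rounding, iterates two algebraically equivalent natural extensions, r·x·(1−x) and r·x − r·x², and finds the first step at which floating-point rounding makes the two results disjoint. The `hull` method is the union-based remedy in miniature.

```python
class Interval:
    """Toy closed-interval type. There is no outward rounding, so these
    are not rigorous enclosures -- which is exactly what lets the
    empty-intersection inconsistency appear."""
    def __init__(self, lo, hi=None):
        hi = lo if hi is None else hi
        self.lo, self.hi = min(lo, hi), max(lo, hi)
    def __mul__(self, o):
        o = o if isinstance(o, Interval) else Interval(o)
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    __rmul__ = __mul__
    def __sub__(self, o):
        o = o if isinstance(o, Interval) else Interval(o)
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __rsub__(self, k):                  # k - interval
        return Interval(k - self.hi, k - self.lo)
    def intersects(self, o):
        return self.lo <= o.hi and o.lo <= self.hi
    def hull(self, o):
        """Union (hull) of two intervals -- the proposed remedy."""
        return Interval(min(self.lo, o.lo), max(self.hi, o.hi))

R = 4.0  # logistic parameter (hypothetical choice, chaotic regime)

def ext1(x):  # natural extension 1: r*x*(1-x)
    return R * x * (1 - x)

def ext2(x):  # natural extension 2: r*x - r*x*x
    return R * x - R * (x * x)

def first_disagreement(x0=0.1, n=100):
    """Iterate both extensions from the same point interval and return
    the first step at which their results are disjoint."""
    a = b = Interval(x0)
    for step in range(1, n + 1):
        a, b = ext1(a), ext2(b)
        if not a.intersects(b):
            return step
    return None
```

Chaos amplifies the tiny rounding differences between the two evaluation orders until the two pseudo-orbits no longer overlap; taking the hull of the two at each step keeps a nonempty (if wider) enclosure.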
Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats
Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.
2012-01-01
This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCP from moving boats from three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
Fractal Dynamics of Heartbeat Interval Fluctuations in Health and Disease
NASA Astrophysics Data System (ADS)
Meyer, M.; Marconi, C.; Rahmel, A.; Grassi, B.; Ferretti, G.; Skinner, J. E.; Cerretelli, P.
The dynamics of heartbeat interval time series were studied by a modified random walk analysis recently introduced as Detrended Fluctuation Analysis. In this analysis, the intrinsic fractal long-range power-law correlation properties of beat-to-beat fluctuations generated by the dynamical system (i.e. cardiac rhythm generator), after decomposition from extrinsic uncorrelated sources, can be quantified by the scaling exponent which, in healthy subjects, is about 1.0. The finding of a scaling exponent of 1.0, indicating scale-invariant long-range power-law correlations (1/ƒ noise) of heartbeat fluctuations, would reflect a genuinely self-similar fractal process that typically generates fluctuations on a wide range of time scales. Lack of a characteristic time scale suggests that the neuroautonomic system underlying the control of heart rate dynamics helps prevent excessive mode-locking (error tolerance) that would restrict its functional responsiveness (plasticity) to environmental stimuli. The 1/ƒ dynamics of heartbeat interval fluctuations are unaffected by exposure to chronic hypoxia, suggesting that the neuroautonomic cardiac control system is preadapted to hypoxia. Functional (hypothermia, cardiac disease) and/or structural (cardiac transplantation, early cardiac development) inactivation of neuroautonomic control is associated with the breakdown or absence of fractal complexity reflected by anticorrelated random walk-like dynamics, indicating that in these conditions the heart is unadapted to its environment.
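First-order DFA can be sketched compactly: integrate the mean-centered series, detrend it linearly within boxes of several sizes, and read the scaling exponent off the log-log slope of the RMS fluctuation versus box size. The Python below is a simplified illustration of the standard algorithm, not the authors' modified variant:

```python
import math
import random

def _linfit_residual_ss(y):
    """Sum of squared residuals after removing the best straight line."""
    n = len(y)
    xs = range(n)
    sx, sy = sum(xs), sum(y)
    sxx = sum(i * i for i in xs)
    sxy = sum(i * v for i, v in zip(xs, y))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return sum((v - (a + b * i)) ** 2 for i, v in zip(xs, y))

def dfa_alpha(x, box_sizes=(4, 8, 16, 32, 64)):
    """Detrended Fluctuation Analysis: scaling exponent alpha of the
    RMS fluctuation of the integrated, per-box-detrended series."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)                       # integrated profile
    logn, logf = [], []
    for n in box_sizes:
        nboxes = len(y) // n
        ss = sum(_linfit_residual_ss(y[i * n:(i + 1) * n])
                 for i in range(nboxes))
        logn.append(math.log(n))
        logf.append(math.log(math.sqrt(ss / (nboxes * n))))
    m = len(logn)                         # slope of log F(n) vs log n
    sx, sy = sum(logn), sum(logf)
    sxx = sum(v * v for v in logn)
    sxy = sum(a * b for a, b in zip(logn, logf))
    return (m * sxy - sx * sy) / (m * sxx - sx * sx)

rng = random.Random(0)
white = [rng.gauss(0, 1) for _ in range(4096)]
alpha_white = dfa_alpha(white)            # uncorrelated noise: ~0.5
walk, s = [], 0.0
for v in white:
    s += v
    walk.append(s)
alpha_walk = dfa_alpha(walk)              # Brownian-like walk: ~1.5
```

White noise gives alpha near 0.5 and its running sum near 1.5; the healthy-heart value of about 1.0 sits between these, the signature of 1/ƒ noise described in the abstract.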
Zhou, Zhenghua; Diao, Qinqin; Shao, Nan; Liang, Youke; Lin, Li; Lei, Yan; Zheng, Lingmei
2015-01-01
To conduct an analysis of the frequency of unhealthy food advertising on mainland Chinese television (TV) and children and adolescents' risk of exposure to them. The frequencies of all types of advertisements (ads) on forty TV channels in mainland China, the exact ad broadcast times, and the name and brand of all snacks and western fast foods advertised were recorded from 0800 hours to 2400 hours on both a weekday and a weekend day in a week. The difference in the frequencies of the diverse types of ads over eight time intervals (each time interval was 2 hours) were compared, and the trends in ad frequencies during the time intervals were described. The TV channels broadcast 155 (91-183) (expressed as median [P25-P75]) food ads, 87 (38-123) snack ads, 49 (11-85) beverage ads, and 58 (25-76) ads of snacks suitable for limited consumption (SSLCs) in a day. The proportion of snack ads among food ads (SPF%) was 55.5% (40.3%-71.0%), and the proportion of SSLC ads among snack ads (LPS%) was 67.4% (55.4%-79.3%). The ad frequencies for food, snacks, SSLCs, and beverages demonstrated significant differences among the eight time intervals (all P=0.000). TV channels broadcast the most frequent ads for food, snacks, SSLCs, and beverages during the time interval from 2000 hours to 2200 hours among the eight time intervals. Chinese children and adolescents may be at a high risk of exposure to unhealthy food advertising on TV. Reducing the exposure risk strongly requires multisectoral cooperation.
Zhou, Zhenghua; Diao, Qinqin; Shao, Nan; Liang, Youke; Lin, Li; Lei, Yan; Zheng, Lingmei
2015-01-01
Objective To conduct an analysis of the frequency of unhealthy food advertising on mainland Chinese television (TV) and children and adolescents’ risk of exposure to such advertising. Methods The frequencies of all types of advertisements (ads) on forty TV channels in mainland China, the exact ad broadcast times, and the name and brand of all snacks and western fast foods advertised were recorded from 0800 hours to 2400 hours on both a weekday and a weekend day in a week. The differences in the frequencies of the diverse types of ads over eight time intervals (each time interval was 2 hours) were compared, and the trends in ad frequencies during the time intervals were described. Results The TV channels broadcast 155 (91-183) (expressed as median [P25-P75]) food ads, 87 (38-123) snack ads, 49 (11-85) beverage ads, and 58 (25-76) ads of snacks suitable for limited consumption (SSLCs) in a day. The proportion of snack ads among food ads (SPF%) was 55.5% (40.3%-71.0%), and the proportion of SSLC ads among snack ads (LPS%) was 67.4% (55.4%-79.3%). The ad frequencies for food, snacks, SSLCs, and beverages demonstrated significant differences among the eight time intervals (all P<0.001). TV channels broadcast the most frequent ads for food, snacks, SSLCs, and beverages during the time interval from 2000 hours to 2200 hours. Conclusions Chinese children and adolescents may be at a high risk of exposure to unhealthy food advertising on TV. Reducing the exposure risk strongly requires multisectoral cooperation. PMID:26133984
Newgard, Craig D.; Schmicker, Robert H.; Hedges, Jerris R.; Trickett, John P.; Davis, Daniel P.; Bulger, Eileen M.; Aufderheide, Tom P.; Minei, Joseph P.; Hata, J. Steven; Gubler, K. Dean; Brown, Todd B.; Yelle, Jean-Denis; Bardarson, Berit; Nichol, Graham
2010-01-01
Study objective The first hour after the onset of out-of-hospital traumatic injury is referred to as the “golden hour,” yet the relationship between time and outcome remains unclear. We evaluate the association between emergency medical services (EMS) intervals and mortality among trauma patients with field-based physiologic abnormality. Methods This was a secondary analysis of an out-of-hospital, prospective cohort registry of adult (aged ≥15 years) trauma patients transported by 146 EMS agencies to 51 Level I and II trauma hospitals in 10 sites across North America from December 1, 2005, through March 31, 2007. Inclusion criteria were systolic blood pressure less than or equal to 90 mm Hg, respiratory rate less than 10 or greater than 29 breaths/min, Glasgow Coma Scale score less than or equal to 12, or advanced airway intervention. The outcome was in-hospital mortality. We evaluated EMS intervals (activation, response, on-scene, transport, and total time) with logistic regression and 2-step instrumental variable models, adjusted for field-based confounders. Results There were 3,656 trauma patients available for analysis, of whom 806 (22.0%) died. In multivariable analyses, there was no significant association between time and mortality for any EMS interval: activation (odds ratio [OR] 1.00; 95% confidence interval [CI] 0.95 to 1.05), response (OR 1.00; 95% CI 0.97 to 1.04), on-scene (OR 1.00; 95% CI 0.99 to 1.01), transport (OR 1.00; 95% CI 0.98 to 1.01), or total EMS time (OR 1.00; 95% CI 0.99 to 1.01). Subgroup and instrumental variable analyses did not qualitatively change these findings. Conclusion In this North American sample, there was no association between EMS intervals and mortality among injured patients with physiologic abnormality in the field. PMID:19783323
Sex Differences in the Age of Peak Marathon Race Time.
Nikolaidis, Pantelis T.; Rosemann, Thomas; Knechtle, Beat
2018-04-30
Recent studies showed that women were older than men when achieving their fastest marathon race time. These studies, however, investigated a limited sample of athletes. We investigated the age of peak marathon performance in a large sample of female and male marathon finishers by using data from all finishers. We analyzed the age of peak marathon performance in 1-year and 5-year age intervals of 451,637 runners (i.e. 168,702 women and 282,935 men) who finished the ‘New York City Marathon’ between 2006 and 2016, using analysis of variance and non-linear regression analysis. During these 11 years, men were faster and older than women, the participation of women increased disproportionately to that of men resulting in a decrease of the male-to-female ratio, and relatively more women participated in the younger age groups. Most women were in the age group 30-34 years and most men in the age group 40-44 years. The fastest race time was shown at 29.7 years in women and 34.8 years in men in the 1-year age intervals, and in age group 30-34 years in women and 35-39 years in men in the 5-year age intervals. In contrast to existing findings reporting a higher age of peak marathon performance in women compared to men, we found that women achieved their best marathon race time ~5 years earlier in life than men in both 1-year and 5-year age intervals. Female athletes and their coaches should plan to achieve their fastest marathon race time at the age of ~30 years.
Funke, K; Wörgötter, F
1995-01-01
1. The spike interval pattern during the light responses of 155 on- and 81 off-centre cells of the dorsal lateral geniculate nucleus (LGN) was studied in anaesthetized and paralysed cats by the use of a novel analysis. Temporally localized interval distributions were computed from a 100 ms time window, which was shifted along the time axis in 10 ms steps, resulting in a 90% overlap between two adjacent windows. For each step the interval distribution was computed inside the time window with 1 ms resolution, and plotted as a greyscale-coded pixel line orthogonal to the time axis. For visual stimulation, light or dark spots of different size and contrast were presented with different background illumination levels. 2. Two characteristic interval patterns were observed during the sustained response component of the cells. Mainly on-cells (77%) responded with multimodal interval distributions, resulting in elongated 'bands' in the 2-dimensional time window plots. In similar situations, the interval distributions for most (71%) off-cells were rather wide and featureless. In those cases where interval bands (i.e. multimodal interval distributions) were observed for off-cells (14%), they were always much wider than for the on-cells. This difference between the on- and off-cell population was independent of the background illumination and the contrast of the stimulus. Y on-cells also tended to produce wider interval bands than X on-cells. 3. For most stimulation situations the first interval band was centred around 6-9 ms, which has been called the fundamental interval; higher order bands are multiples thereof. The fundamental interval shifted towards larger sizes with decreasing stimulus contrast. Increasing stimulus size, on the other hand, resulted in a redistribution of the intervals into higher order bands, while at the same time the location of the fundamental interval remained largely unaffected. 
This was interpreted as an effect of the increasing surround inhibition at the geniculate level, by which individual retinal EPSPs were cancelled. A changing level of adaptation can result in a mixed shift/redistribution effect because of the changing stimulus contrast and changing level of tonic inhibition. 4. The occurrence of interval bands is not directly related to the shape of the autocorrelation function, which can be flat, weakly oscillatory or strongly oscillatory, regardless of the interval band pattern. 5. A simple computer model was devised to account for the observed cell behaviour. The model is highly robust against parameter variations.(ABSTRACT TRUNCATED AT 400 WORDS) PMID:7562612
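The sliding-window interval analysis described in this record (a 100 ms window shifted in 10 ms steps, with 1 ms interval resolution) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function and parameter names are invented for the example.

```python
import numpy as np

def windowed_interval_histograms(spike_times, win=0.1, step=0.01,
                                 bin_w=0.001, max_iv=0.05):
    """Temporally localized inter-spike-interval distributions.

    A `win`-second window slides along the spike train in `step`-second
    increments (90% overlap for the defaults above); the intervals
    between consecutive spikes inside each window are histogrammed at
    `bin_w`-second resolution, up to a maximum interval `max_iv`.
    """
    t = np.sort(np.asarray(spike_times, float))
    edges = np.linspace(0.0, max_iv, int(round(max_iv / bin_w)) + 1)
    rows, start = [], 0.0
    while start + win <= t[-1] + 1e-9:
        in_win = t[(t >= start) & (t < start + win)]
        rows.append(np.histogram(np.diff(in_win), bins=edges)[0])
        start += step
    return np.array(rows)   # one interval histogram per window position
```

Plotting each row as a greyscale-coded pixel line orthogonal to the time axis reproduces the 2-dimensional time window plots in which the "interval bands" appear.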
Cho, Han-Jin; Lee, Kyung Yul; Nam, Hyo Suk; Kim, Young Dae; Song, Tae-Jin; Jung, Yo Han; Choi, Hye-Yeon; Heo, Ji Hoe
2014-10-01
Process improvement (PI) is an approach for enhancing the existing quality improvement process by making changes while keeping the existing process. We have shown that implementation of a stroke code program using a computerized physician order entry system is effective in reducing the in-hospital time delay to thrombolysis in acute stroke patients. We investigated whether implementation of this PI could further reduce the time delays by continuous improvement of the existing process. After determining a key indicator [time interval from emergency department (ED) arrival to intravenous (IV) thrombolysis] and conducting data analysis, the target time from ED arrival to IV thrombolysis in acute stroke patients was set at 40 min. The key indicator was monitored continuously at a weekly stroke conference. The possible reasons for the delay were determined in cases for which IV thrombolysis was not administered within the target time and, where possible, the problems were corrected. The time intervals from ED arrival to the various evaluation steps and treatment before and after implementation of the PI were compared. The median time interval from ED arrival to IV thrombolysis in acute stroke patients was significantly reduced after implementation of the PI (from 63.5 to 45 min, p=0.001). The variation in the time interval was also reduced. A reduction in the evaluation time intervals was achieved after the PI [from 23 to 17 min for computed tomography scanning (p=0.003) and from 35 to 29 min for complete blood counts (p=0.006)]. PI is effective for continuous improvement of the existing process by reducing the time delays between ED arrival and IV thrombolysis in acute stroke patients.
Kheifets, Aaron; Freestone, David; Gallistel, C R
2017-07-01
In three experiments with mice (Mus musculus) and rats (Rattus norvegicus), we used a switch paradigm to measure quantitative properties of the interval-timing mechanism. We found that: 1) Rodents adjusted the precision of their timed switches in response to changes in the interval between the short and long feed latencies (the temporal goalposts). 2) The variability in the timing of the switch response was reduced or unchanged in the face of large trial-to-trial random variability in the short and long feed latencies. 3) The adjustment in the distribution of switch latencies in response to changes in the relative frequency of short and long trials was sensitive to the asymmetry in the Kullback-Leibler divergence. The three results suggest that durations are represented with adjustable precision, that they are timed by multiple timers, and that there is a trial-by-trial (episodic) record of feed latencies in memory. © 2017 Society for the Experimental Analysis of Behavior.
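The asymmetry of the Kullback-Leibler divergence mentioned in point 3 is easy to demonstrate numerically. The sketch below is generic, not tied to the paper's data; the two distributions stand in for hypothetical relative frequencies of short and long trials.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) between two discrete
    probability distributions with full support (no zero entries)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))
```

Because D(p||q) and D(q||p) generally differ, a change from mostly-short to mostly-long trial frequencies is not informationally equivalent to the reverse change, which is the asymmetry the authors' analysis is sensitive to.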
Accelerometer Data Analysis and Presentation Techniques
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy
1997-01-01
The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
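Two of the time-domain quantities listed above, interval average acceleration and interval root-mean-square acceleration, can be computed as follows. This is a minimal sketch under assumed inputs (a uniformly sampled 1-D acceleration series), not the PIMS project's code; the function and parameter names are illustrative.

```python
import numpy as np

def interval_stats(accel, fs, interval_s):
    """Interval-average and interval-RMS of an acceleration series.

    accel: 1-D acceleration samples; fs: sampling rate in Hz;
    interval_s: averaging interval in seconds.  Trailing samples that
    do not fill a whole interval are dropped.
    """
    n = int(fs * interval_s)                 # samples per interval
    m = len(accel) // n                      # number of whole intervals
    blocks = np.asarray(accel[:m * n], dtype=float).reshape(m, n)
    avg = blocks.mean(axis=1)                # interval average
    rms = np.sqrt((blocks ** 2).mean(axis=1))  # interval RMS
    return avg, rms
```

Plotting `avg` or `rms` against the interval start times gives the "interval average acceleration versus time" and "interval root-mean-square acceleration versus time" displays described in the abstract.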
Evaluating the influential priority of the factors on insurance loss of public transit
Su, Yongmin; Chen, Xinqiang
2018-01-01
Understanding the correlation between influential factors and insurance losses is beneficial for insurers to accurately price and modify the bonus-malus system. Although there have been a number of achievements in modeling insurance losses and claims, limited efforts focus on exploring the relative role of accident characteristics in insurance losses. The primary objective of this study is to evaluate the influential priority of transit accident attributes, such as the time, location, and type of accidents. Based on the dataset from the Washington State Transit Insurance Pool (WSTIP) in the USA, we implement several key algorithms to achieve the objectives. First, the K-means algorithm clusters the insurance loss data into 6 intervals; second, a Grey Relational Analysis (GRA) model is applied to calculate grey relational grades of the influential factors in each interval; in addition, we implement a Naive Bayes model to compute the posterior probability of factor values falling in each interval. The results show that the time, location, and type of accidents significantly influence the insurance loss in the first five intervals, but their grey relational grades show no significant difference. In the last interval, which represents the highest insurance loss, the grey relational grade of the time is significantly higher than that of the location and type of accidents. For each value of the time and location, the insurance loss most likely falls in the first and second intervals, which represent lower losses. However, for accidents between buses and non-motorized road users, the probability of insurance loss falling in interval 6 tends to be highest. PMID:29298337
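The grey relational grade at the core of this study can be computed with the classic Deng formulation. The sketch below is a generic implementation under assumed min-max normalization and a distinguishing coefficient of 0.5, not the authors' exact procedure.

```python
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """Grey relational grade of each factor sequence versus a reference.

    reference: shape (n,); factors: iterable of m sequences of length n.
    Each sequence is min-max normalized, then the classic coefficient
    (d_min + rho*d_max) / (d + rho*d_max) is averaged per factor, where
    d is the pointwise absolute difference from the reference and rho
    is the distinguishing coefficient.
    """
    def norm(v):
        v = np.asarray(v, float)
        return (v - v.min()) / (v.max() - v.min())
    ref = norm(reference)
    delta = np.abs(np.vstack([norm(f) for f in factors]) - ref)
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)   # one grade per factor, in (0, 1]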
Development of a New Paradigm for Analysis of Disdrometric Data
NASA Astrophysics Data System (ADS)
Larsen, Michael L.; Kostinski, Alexander B.
2017-04-01
A number of disdrometers currently on the market are able to characterize hydrometeors on a drop-by-drop basis with arrival timestamps associated with each arriving hydrometeor. This allows an investigator to parse a time series into disjoint intervals that have equal numbers of drops, instead of the traditional subdivision into equal time intervals. Such a "fixed-N" partitioning of the data can provide several advantages over the traditional equal time binning method, especially within the context of quantifying measurement uncertainty (which typically scales with the number of hydrometeors in each sample). An added bonus is the natural elimination of measurements that are devoid of all drops. This analysis method is investigated by utilizing data from a dense array of disdrometers located near Charleston, South Carolina, USA. Implications for the usefulness of this method in future studies are explored.
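The "fixed-N" partitioning described above, equal drop counts per sample rather than equal durations, is simple to implement from a list of arrival timestamps. This is an illustrative sketch, not the authors' analysis code; names are invented for the example.

```python
import numpy as np

def fixed_n_partition(arrival_times, n_per_bin):
    """Split sorted drop arrival times into disjoint samples with an
    equal number of drops (rather than equal time spans).

    Returns a list of (start_time, end_time, times) tuples; a trailing
    partial sample is discarded.  Empty samples cannot occur by
    construction, unlike with fixed-duration binning.
    """
    t = np.sort(np.asarray(arrival_times, float))
    m = len(t) // n_per_bin
    out = []
    for i in range(m):
        seg = t[i * n_per_bin:(i + 1) * n_per_bin]
        out.append((seg[0], seg[-1], seg))
    return out
```

Because every sample holds exactly `n_per_bin` drops, the counting uncertainty is uniform across samples, which is the advantage over equal-time binning noted in the abstract.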
Dugué, Audrey Emmanuelle; Pulido, Marina; Chabaud, Sylvie; Belin, Lisa; Gal, Jocelyn
2016-12-01
We describe how to estimate progression-free survival while dealing with interval-censored data in the setting of clinical trials in oncology. Three procedures with SAS and R statistical software are described: first, a nonparametric maximum likelihood estimation of the survival curve using the EM-ICM (Expectation and Maximization-Iterative Convex Minorant) algorithm as described by Wellner and Zhan in 1997; second, a sensitivity analysis procedure in which the progression time is assigned (i) at the midpoint, (ii) at the upper limit (reflecting the standard analysis, in which the progression time is assigned at the first radiologic exam showing progressive disease), or (iii) at the lower limit of the censoring interval; and finally, two multiple imputation approaches considering a uniform or the nonparametric maximum likelihood estimation (NPMLE) distribution. Clin Cancer Res; 22(23); 5629-35. ©2016 AACR.
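The single-point assignment rules of the sensitivity analysis (midpoint, upper limit, lower limit of the censoring interval) can be sketched as below. This is a schematic of the assignment step only, not the SAS/R procedures the paper describes, and the function name is invented.

```python
def assign_progression_times(intervals, rule="midpoint"):
    """Single-point assignment of progression times from censoring
    intervals (L, R).

    rule: 'midpoint', 'upper' (the standard analysis: progression
    assigned to the first radiologic exam showing progressive
    disease), or 'lower'.
    """
    pick = {"midpoint": lambda L, R: (L + R) / 2.0,
            "upper": lambda L, R: R,
            "lower": lambda L, R: L}[rule]
    return [pick(L, R) for L, R in intervals]
```

Re-running a standard survival analysis on each of the three assigned datasets brackets the estimate and shows how sensitive the result is to the unknown true progression time within the interval.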
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2013-04-01
The monitoring of real-time systems is a challenging and complicated process. There is thus a continuous need to improve the monitoring process through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during workflow monitoring of such systems. Interval-based (or period-based) theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the number of relationships between intervals grows exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states, as well as to infer them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states.
For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The CLIPS expert system shell has been used as the main rule engine for implementing the algorithm rules. The Python programming language and the module "PyCLIPS" are used for building the necessary code for algorithm implementation. More than 1.7 million intervals constituting the Concise List of Frames (CLF) from 20 different seismic stations have been used for evaluating the proposed algorithm and for evaluating station behaviour and performance. The initial results showed that the proposed algorithm can help in better understanding the operation and performance of those stations. Important information, such as alerts and some station performance parameters, can be derived from the proposed algorithm. For IMS interval-based data, at any period of time it is possible to analyse station behaviour, determine the missing data, generate necessary alerts, and measure some station performance attributes. The details of the proposed algorithm, methodology, implementation, experimental results, advantages, and limitations of this research are presented. Finally, future directions and recommendations are discussed.
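Allen's 13 interval relations, which RISMA builds on, can be classified with straightforward endpoint comparisons. The sketch below is a generic Python illustration (the record's own implementation uses CLIPS rules via PyCLIPS); the relation names for inverses are an arbitrary labeling choice.

```python
def allen_relation(a, b):
    """Classify the Allen relation between intervals a=(a0,a1) and
    b=(b0,b1), with a0 < a1 and b0 < b1.  Returns one of 13 relation
    names; inverse relations are prefixed with 'inv_'."""
    a0, a1 = a
    b0, b1 = b
    if a1 < b0: return "before"
    if b1 < a0: return "inv_before"
    if a1 == b0: return "meets"
    if b1 == a0: return "inv_meets"
    if (a0, a1) == (b0, b1): return "equals"
    if a0 == b0: return "starts" if a1 < b1 else "inv_starts"
    if a1 == b1: return "finishes" if a0 > b0 else "inv_finishes"
    if b0 < a0 and a1 < b1: return "during"
    if a0 < b0 and b1 < a1: return "inv_during"
    # remaining cases: proper overlap in one of two orders
    return "overlaps" if a0 < b0 else "inv_overlaps"
```

Checking each incoming interval against its predecessor with a classifier like this, instead of pairing all intervals against each other, is the kind of reduction that keeps the state-machine approach tractable as the interval count grows.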
On computations of variance, covariance and correlation for interval data
NASA Astrophysics Data System (ADS)
Kishida, Masako
2017-02-01
In many practical situations, the data on which statistical analysis is to be performed is only known with interval uncertainty. Different combinations of values from the interval data usually lead to different values of variance, covariance, and correlation. Hence, it is desirable to compute the endpoints of possible values of these statistics. This problem is, however, NP-hard in general. This paper shows that the problem of computing the endpoints of possible values of these statistics can be rewritten as the problem of computing skewed structured singular values ν, for which there exist feasible (polynomial-time) algorithms that compute reasonably tight bounds in most practical cases. This allows one to find tight intervals of the aforementioned statistics for interval data.
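For intuition about why these endpoint problems are combinatorial, the maximum of the sample variance over interval data can be found exactly by enumerating box vertices, because the variance is convex in each coordinate. This brute-force sketch is illustrative only; it is exponential in n (the hardness the paper's ν-based bounds avoid), and the minimum generally is not attained at a vertex, so it needs different handling.

```python
import itertools
import numpy as np

def max_variance_interval_data(intervals):
    """Exact maximum of the (population) sample variance when each
    data point is only known to lie in a closed interval [lo, hi].

    The variance is convex in each coordinate, so its maximum over the
    box is attained at a vertex; enumerate all 2**n vertices.
    """
    best = -np.inf
    for vertex in itertools.product(*intervals):
        best = max(best, np.var(vertex))
    return best
```

For two points each known only to lie in [0, 1], the extreme configuration is one point at each end, giving a maximum variance of 0.25.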
Timing of Occurrence Is the Most Important Characteristic of Spot Sign.
Wang, Binli; Yan, Shenqiang; Xu, Mengjun; Zhang, Sheng; Liu, Keqin; Hu, Haitao; Selim, Magdy; Lou, Min
2016-05-01
Most previous studies have used single-phase computed tomographic angiography to detect the spot sign, a marker for hematoma expansion (HE) in spontaneous intracerebral hemorrhage. We investigated whether defining the spot sign based on timing on perfusion computed tomography (CTP) would improve its specificity for predicting HE. We prospectively enrolled supratentorial spontaneous intracerebral hemorrhage patients who underwent CTP within 6 hours of onset. Logistic regression was performed to assess the risk factors for HE and poor outcome. Predictive performance of individual CTP spot sign characteristics was examined with receiver operating characteristic analysis. Sixty-two men and 21 women with spontaneous intracerebral hemorrhage were included in this analysis. The spot sign was detected in 46% (38/83) of patients. Receiver operating characteristic analysis indicated that the timing of spot sign occurrence on CTP had the greatest area under the receiver operating characteristic curve for HE (0.794; 95% confidence interval, 0.630-0.958; P=0.007); the cutoff time was 23.13 seconds. On multivariable analysis, the presence of an early-occurring spot sign (ie, spot sign before 23.13 seconds) was an independent predictor not only of HE (odds ratio=28.835; 95% confidence interval, 6.960-119.458; P<0.001), but also of mortality at 3 months (odds ratio=22.377; 95% confidence interval, 1.773-282.334; P=0.016). Moreover, the predictive performance showed that the redefined early-occurring spot sign maintained a higher specificity for HE compared with the spot sign (91% versus 74%). Redefining the spot sign based on the timing of contrast leakage on CTP to determine the early-occurring spot sign improves the specificity for predicting HE and 3-month mortality. The use of the early-occurring spot sign could improve the selection of ICH patients for potential hemostatic therapy. © 2016 American Heart Association, Inc.
Perri, Amanda M.; O’Sullivan, Terri L.; Harding, John C.S.; Wood, R. Darren; Friendship, Robert M.
2017-01-01
The evaluation of pig hematology and biochemistry parameters is rarely done largely due to the costs associated with laboratory testing and labor, and the limited availability of reference intervals needed for interpretation. Within-herd and between-herd biological variation of these values also make it difficult to establish reference intervals. Regardless, baseline reference intervals are important to aid veterinarians in the interpretation of blood parameters for the diagnosis and treatment of diseased swine. The objective of this research was to provide reference intervals for hematology and biochemistry parameters of 3-week-old commercial nursing piglets in Ontario. A total of 1032 pigs lacking clinical signs of disease from 20 swine farms were sampled for hematology and iron panel evaluation, with biochemistry analysis performed on a subset of 189 randomly selected pigs. The 95% reference interval, mean, median, range, and 90% confidence intervals were calculated for each parameter. PMID:28373729
Tanaka, Tomohiro; Nishida, Satoshi
2015-01-01
The neuronal processes that underlie visual searches can be divided into two stages: target discrimination and saccade preparation/generation. This predicts that the length of time of the prediscrimination stage varies according to the search difficulty across different stimulus conditions, whereas the length of the latter postdiscrimination stage is stimulus invariant. However, recent studies have suggested that the length of the postdiscrimination interval changes with different stimulus conditions. To address whether and how the visual stimulus affects determination of the postdiscrimination interval, we recorded single-neuron activity in the lateral intraparietal area (LIP) when monkeys (Macaca fuscata) performed a color-singleton search involving four stimulus conditions that differed regarding luminance (Bright vs. Dim) and target-distractor color similarity (Easy vs. Difficult). We specifically focused on comparing activities between the Bright-Difficult and Dim-Easy conditions, in which the visual stimuli were considerably different, but the mean reaction times were indistinguishable. This allowed us to examine the neuronal activity when the difference in the degree of search speed between different stimulus conditions was minimal. We found that not only prediscrimination but also postdiscrimination intervals varied across stimulus conditions: the postdiscrimination interval was longer in the Dim-Easy condition than in the Bright-Difficult condition. Further analysis revealed that the postdiscrimination interval might vary with stimulus luminance. A computer simulation using an accumulation-to-threshold model suggested that the luminance-related difference in visual response strength at discrimination time could be the cause of different postdiscrimination intervals. PMID:25995344
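The accumulation-to-threshold idea invoked in the simulation can be caricatured in a few lines: a stronger (brighter) visual response accumulates faster and therefore crosses the decision threshold sooner, shortening the interval. This deterministic sketch is purely illustrative and omits the noise a realistic accumulator model would include; the names and parameter values are invented.

```python
def time_to_threshold(rate, threshold=100.0, dt=0.001):
    """Deterministic accumulation to threshold.

    Evidence grows at `rate` units per second in time steps of `dt`
    seconds; returns the time at which the accumulator first reaches
    `threshold`.  A higher rate (stronger visual response) yields a
    shorter crossing time.
    """
    t, acc = 0.0, 0.0
    while acc < threshold:
        acc += rate * dt
        t += dt
    return t
```

Under this caricature, doubling the input rate halves the crossing time, mirroring the suggestion that luminance-related differences in visual response strength produce different postdiscrimination intervals.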
A model of return intervals between earthquake events
NASA Astrophysics Data System (ADS)
Zhou, Yu; Chechkin, Aleksei; Sokolov, Igor M.; Kantz, Holger
2016-06-01
Application of the diffusion entropy analysis and the standard deviation analysis to the time sequence of southern California earthquake events from 1976 to 2002 uncovered scaling behavior typical of anomalous diffusion. However, the origin of such behavior is still under debate. Some studies attribute the scaling behavior to correlations in the return intervals, or waiting times, between aftershocks or mainshocks. To elucidate the nature of the scaling, we applied specific reshuffling techniques to eliminate correlations between different types of events and then examined how this affects the scaling behavior. We demonstrate that the origin of the observed scaling behavior is the interplay between the mainshock waiting time distribution and the structure of clusters of aftershocks, not correlations in waiting times between the mainshocks and aftershocks themselves. Our findings are corroborated by numerical simulations of a simple model showing very similar behavior. The mainshocks are modeled by a renewal process with a power-law waiting time distribution between events, and aftershocks follow a nonhomogeneous Poisson process with the rate governed by Omori's law.
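A toy catalog of the kind the simple model describes can be generated as follows: mainshocks from a renewal process with Pareto (power-law) waiting times, and aftershocks from a nonhomogeneous Poisson process with an Omori-type rate K/(c+t), sampled by inverse transform. This is an illustrative sketch, not the paper's model; all parameter values are arbitrary and not fitted to the southern California data.

```python
import numpy as np

def simulate_catalog(n_main, alpha=1.5, K=5.0, c=0.01, T=1.0, seed=0):
    """Toy earthquake catalog combining a power-law renewal process of
    mainshocks with Omori-law aftershock sequences on (0, T]."""
    rng = np.random.default_rng(seed)
    waits = rng.pareto(alpha, n_main) + 1.0      # power-law waiting times
    mains = np.cumsum(waits)
    events = []
    norm = np.log1p(T / c)                       # integral of K/(c+t) over (0,T], per K
    for m in mains:
        n_aft = rng.poisson(K * norm)            # aftershock count
        u = rng.random(n_aft)
        t_aft = c * ((1.0 + T / c) ** u - 1.0)   # inverse-CDF Omori times
        events.extend(m + np.sort(t_aft))
        events.append(m)
    return np.sort(np.array(events))
```

Applying the reshuffling tests described above to such a synthetic catalog, with known ingredients, is a direct way to check which ingredient produces the anomalous-diffusion scaling.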
Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure
NASA Astrophysics Data System (ADS)
Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak
2017-09-01
The study of RR interval time series in congestive heart failure has been an active area, pursued with different methods including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased subjects and the normal subjects, as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained using visibility graph analysis can therefore be used as a potential bio-marker, as well as in a subsequent alarm generation mechanism for predicting the onset of congestive heart failure.
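The natural visibility graph that underlies this kind of analysis maps a time series to a network: two samples are linked if the straight line between them clears every intermediate sample. A minimal O(n^2)-per-check sketch, not the authors' implementation, is:

```python
import numpy as np

def visibility_edges(series):
    """Natural visibility graph of a time series.

    Nodes are sample indices; i and j (i < j) are linked when every
    intermediate sample lies strictly below the straight line joining
    (i, y_i) and (j, y_j).
    """
    y = np.asarray(series, float)
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            # all intermediate points must be below the connecting line
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges
```

Adjacent samples are always linked (there are no intermediate points), and network statistics of the resulting graph, such as degree-distribution parameters, are the kind of quantitative measure used above to separate diseased from normal RR series.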
Reaction time in pilots during intervals of high sustained g.
Truszczynski, Olaf; Lewkowicz, Rafal; Wojtkowiak, Mieczyslaw; Biernacki, Marcin P
2014-11-01
An important problem for pilots is visual disturbances occurring under +Gz acceleration. Assessment of the degree of intensification of these disturbances is generally accepted as the acceleration tolerance level (ATL) criterion determined in human centrifuges. The aim of this research was to evaluate the visual-motor responses of pilots during rapidly increasing acceleration contained in cyclic intervals of +6 Gz to the maximum ATL. The study involved 40 male pilots ages 32-41 yr. The task was a quick and faultless response to the light stimuli presented on a light bar during exposure to acceleration until reaching the ATL. Simple response time (SRT) measurements were performed using a visual-motor analysis system throughout the exposures which allowed assessment of a pilot's ATL. There were 29 pilots who tolerated the initial phase of interval acceleration and achieved +6 Gz, completing the test at ATL. Relative to the control measurements, the obtained results indicate a significant effect of the applied acceleration on response time. SRT during +6 Gz exposure was not significantly longer compared with the reaction time between each of the intervals. SRT and erroneous reactions indicated no statistically significant differences between the "lower" and "higher" ATL groups. SRT measurements over the +6-Gz exposure intervals did not vary between "lower" and "higher" ATL groups and, therefore, are not useful in predicting pilot performance. The gradual exposure to the maximum value of +6 Gz with exposure to the first three intervals on the +6-Gz plateau effectively differentiated pilots.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaneo, Richard; Hanna, Rabbie K.; Jacobsen, Gordon
Purpose: Adjuvant radiation therapy (RT) has been shown to improve local control in patients with endometrial carcinoma. We analyzed the impact of the time interval between hysterectomy and RT initiation in patients with endometrial carcinoma. Methods and Materials: In this institutional review board-approved study, we identified 308 patients with endometrial carcinoma who received adjuvant RT after hysterectomy. All patients had undergone hysterectomy, oophorectomy, and pelvic and para-aortic lymph node evaluation from 1988 to 2010. Patients' demographics, pathologic features, and treatments were compared. The time interval between hysterectomy and the start of RT was calculated. The effects of time interval on recurrence-free (RFS), disease-specific (DSS), and overall survival (OS) were calculated. Following univariate analysis, multivariate modeling was performed. Results: The median age and follow-up for the study cohort were 65 years and 72 months, respectively. Eighty-five percent of the patients had endometrioid carcinoma. RT was delivered with high-dose-rate brachytherapy alone (29%), pelvic RT alone (20%), or both (51%). Median time interval to start RT was 42 days (range, 21-130 days). A total of 269 patients (74%) started their RT <9 weeks after undergoing hysterectomy (group 1) and 26% started ≥9 weeks after surgery (group 2). There were a total of 43 recurrences. Tumor recurrence was significantly associated with treatment delay of ≥9 weeks, with 5-year RFS of 90% for group 1 compared to only 39% for group 2 (P<.001). On multivariate analysis, RT delay of ≥9 weeks (P<.001), presence of lymphovascular space involvement (P=.001), and higher International Federation of Gynecology and Obstetrics grade (P=.012) were independent predictors of recurrence. In addition, RT delay of ≥9 weeks was an independent significant predictor for worse DSS and OS (P=.001 and P=.01, respectively).
Conclusions: Delay in administering adjuvant RT after hysterectomy was associated with worse survival endpoints. Our data suggest that a shorter time interval between hysterectomy and the start of RT may be beneficial.
Microcomputer Applications in Interaction Analysis.
ERIC Educational Resources Information Center
Wadham, Rex A.
The Timed Interval Categorical Observation Recorder (TICOR), a portable, battery powered microcomputer designed to automate the collection of sequential and simultaneous behavioral observations and their associated durations, was developed to overcome problems in gathering subtle interaction analysis data characterized by sequential flow of…
The Contribution of Human Factors in Military System Development: Methodological Considerations
1980-07-01
Only fragmentary index terms of this report survive in the record, among them: risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), computer simulation, mathematical models for R&D project selection (Souder, Management Science, April 1973), proficiency test scores, radiation effects on aircrew performance, and reaction time.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... 500 Index option series in the pilot: (1) A time series analysis of open interest; and (2) an analysis... issue's total market share value, which is the share price times the number of shares outstanding. These... other series. Strike price intervals would be set no less than 5 points apart. Consistent with existing...
Multiscale multifractal DCCA and complexity behaviors of return intervals for Potts price model
NASA Astrophysics Data System (ADS)
Wang, Jie; Wang, Jun; Stanley, H. Eugene
2018-02-01
To investigate the characteristics of extreme events in financial markets and the corresponding return intervals among these events, we use a Potts dynamic system to construct a random financial time series model of the attitudes of market traders. We use multiscale multifractal detrended cross-correlation analysis (MM-DCCA) and Lempel-Ziv complexity (LZC) to perform numerical research on the return intervals for two major Chinese stock market indices and for the proposed model. The new MM-DCCA method is based on the Hurst surface and provides more interpretable cross-correlations of the dynamic mechanism between different return interval series. We scale the LZC method with different exponents to illustrate the complexity of return intervals at different scales. Empirical studies indicate that the proposed return intervals from the Potts system and the real stock market indices hold similar statistical properties.
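The MM-DCCA machinery is beyond a short sketch, but the Lempel-Ziv complexity side can be illustrated compactly. The sketch below uses the classic Kaspar-Schuster counting of new patterns after binarizing a series at its median; the binarization rule is an assumption for illustration, not the paper's procedure.

```python
import random

def lz76_complexity(s):
    """Kaspar-Schuster counting of new patterns in a symbol string."""
    n = len(s)
    if n < 2:
        return n
    i, k, l = 0, 1, 1
    k_max, c = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:            # no earlier match: a new pattern begins
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def binarize(xs):
    """Symbolize a series by its median (an illustrative choice)."""
    m = sorted(xs)[len(xs) // 2]
    return ''.join('1' if x > m else '0' for x in xs)

random.seed(1)
periodic = '01' * 100
noisy = ''.join(random.choice('01') for _ in range(200))
print(lz76_complexity(periodic), lz76_complexity(noisy))
```

A periodic sequence yields a far lower count than an irregular one of the same length, which is the property the LZC analysis of return intervals exploits.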
Seo, Eun Hee; Kim, Tae Oh; Park, Min Jae; Joo, Hee Rin; Heo, Nae Yun; Park, Jongha; Park, Seung Ha; Yang, Sung Yeon; Moon, Young Soo
2012-03-01
Several factors influence bowel preparation quality. Recent studies have indicated that the time interval between bowel preparation and the start of colonoscopy is also important in determining bowel preparation quality. To evaluate the influence of the preparation-to-colonoscopy (PC) interval (the interval of time between the last polyethylene glycol dose ingestion and the start of the colonoscopy) on bowel preparation quality in the split-dose method for colonoscopy. Prospective observational study. University medical center. A total of 366 consecutive outpatients undergoing colonoscopy. Split-dose bowel preparation and colonoscopy. The quality of bowel preparation was assessed by using the Ottawa Bowel Preparation Scale according to the PC interval, and other factors that might influence bowel preparation quality were analyzed. Colonoscopies with a PC interval of 3 to 5 hours had the best bowel preparation quality score in the whole, right, mid, and rectosigmoid colon according to the Ottawa Bowel Preparation Scale. In multivariate analysis, the PC interval (odds ratio [OR] 1.85; 95% CI, 1.18-2.86), the amount of PEG ingested (OR 4.34; 95% CI, 1.08-16.66), and compliance with diet instructions (OR 2.22; 95% CI, 1.33-3.70) were significant contributors to satisfactory bowel preparation. Nonrandomized controlled, single-center trial. The optimal time interval between the last dose of the agent and the start of colonoscopy is one of the important factors to determine satisfactory bowel preparation quality in split-dose polyethylene glycol bowel preparation. Copyright © 2012 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.
Lambert, Ronald J W; Mytilinaios, Ioannis; Maitland, Luke; Brown, Angus M
2012-08-01
This study describes a method to obtain parameter confidence intervals from the fitting of non-linear functions to experimental data, using the SOLVER and Analysis ToolPak Add-In of the Microsoft Excel spreadsheet. Previously we have shown that Excel can fit complex multiple functions to biological data, obtaining values equivalent to those returned by more specialized statistical or mathematical software. However, a disadvantage of using the Excel method was the inability to return confidence intervals for the computed parameters or the correlations between them. Using a simple Monte-Carlo procedure within the Excel spreadsheet (without recourse to programming), SOLVER can provide parameter estimates (up to 200 at a time) for multiple 'virtual' data sets, from which the required confidence intervals and correlation coefficients can be obtained. The general utility of the method is exemplified by applying it to the analysis of the growth of Listeria monocytogenes, the growth inhibition of Pseudomonas aeruginosa by chlorhexidine and the further analysis of the electrophysiological data from the compound action potential of the rodent optic nerve. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
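The paper's procedure lives in Excel, but the same Monte-Carlo idea can be sketched in Python: fit once, generate 'virtual' data sets from the fitted curve plus residual-scale noise, refit each, and read confidence intervals off the percentiles. The linear model and noise level here are invented placeholders for the growth curves analyzed in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * x + b   # invented stand-in for the paper's growth functions

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)
y = model(x, 2.0, 1.0) + rng.normal(0, 0.2, x.size)

# Fit the "real" data once, and estimate the residual scale.
popt, _ = curve_fit(model, x, y)
resid_sd = np.std(y - model(x, *popt), ddof=2)

# Refit many virtual data sets: fitted curve plus noise at the residual scale.
draws = np.array([curve_fit(model, x,
                            model(x, *popt) + rng.normal(0, resid_sd, x.size))[0]
                  for _ in range(200)])

# Percentile confidence intervals and the parameter correlation.
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
corr = np.corrcoef(draws.T)[0, 1]
print(lo, hi, corr)
```

The 200 refits mirror the "up to 200 at a time" SOLVER estimates described in the abstract; the percentile step replaces reading the spread off the spreadsheet.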
Coupling between perception and action timing during sensorimotor synchronization.
Serrien, Deborah J; Spapé, Michiel M
2010-12-17
Time is an important parameter in behaviour, especially when synchronization with external events is required. To evaluate the nature of the association between perception and action timing, this study introduced pitch accented tones during performance of a sensorimotor tapping task. Furthermore, regularity of the pacing cues was modified by small (subliminal) or large (conscious) timing perturbations. A global analysis across the intervals showed that repeated accented tones increased the tap-tone asynchrony in the regular (control) and irregular (subliminal) trials but not in the irregular trials with awareness of the perturbations. Asynchrony variability demonstrated no effect of accentuation in the regular and subliminal irregular trials, whereas it increased in the conscious irregular trials. A local analysis of the intervals showed that pitch accentuation lengthened the duration of the tapping responses, but only in the irregular trials with large timing perturbations. These data underline that common timing processes are automatically engaged for perception and action, although this arrangement can be overturned by cognitive intervention. Overall, the findings highlight a flexible association between perception and action timing within a functional information processing framework. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals
NASA Astrophysics Data System (ADS)
Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.
2018-02-01
Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance and runs based types of descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to the set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features in both magnitude, with α+ <α-, where α+ is related to heart rate decelerations and α- to heart rate accelerations, and the proportion of the signal in which the above inequality holds. A very similar effect is observed if asymmetric noise is added to a symmetric self-affine function. No such phenomena are observed in the same physiological data after shuffling or with a group of symmetric synthetic time series.
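A compact sketch of one common asymmetric-DFA variant: windows are classified by the sign of the local linear trend of the original series, and separate fluctuation functions (hence separate exponents α+ and α-) are fitted for rising and falling windows. The window sizes, first-order detrending, and white-noise input are illustrative choices only, not the paper's exact configuration.

```python
import numpy as np

def adfa_exponents(x, scales):
    """Asymmetric DFA sketch: separate scaling exponents for windows whose
    local linear trend of the original series rises (+) or falls (-)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    t = np.arange(len(x))
    f_plus, f_minus = [], []
    for s in scales:
        up, down = [], []
        for w in range(len(x) // s):
            sl = slice(w * s, (w + 1) * s)
            trend = np.polyfit(t[sl], x[sl], 1)[0]   # direction from original series
            coef = np.polyfit(t[sl], y[sl], 1)       # first-order detrending of profile
            resid = y[sl] - np.polyval(coef, t[sl])
            (up if trend >= 0 else down).append(np.mean(resid ** 2))
        f_plus.append(np.sqrt(np.mean(up)))
        f_minus.append(np.sqrt(np.mean(down)))
    logs = np.log(scales)
    return (np.polyfit(logs, np.log(f_plus), 1)[0],
            np.polyfit(logs, np.log(f_minus), 1)[0])

rng = np.random.default_rng(42)
white = rng.normal(size=4096)
a_plus, a_minus = adfa_exponents(white, [8, 16, 32, 64, 128])
print(round(a_plus, 2), round(a_minus, 2))
```

For uncorrelated noise both exponents come out near 0.5; in the RR data described above, the decelerations-related exponent is reported to sit below the accelerations-related one.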
MESOSCOPIC MODELING OF STOCHASTIC REACTION-DIFFUSION KINETICS IN THE SUBDIFFUSIVE REGIME
BLANC, EMILIE; ENGBLOM, STEFAN; HELLANDER, ANDREAS; LÖTSTEDT, PER
2017-01-01
Subdiffusion has been proposed as an explanation of various kinetic phenomena inside living cells. In order to facilitate large-scale computational studies of subdiffusive chemical processes, we extend a recently suggested mesoscopic model of subdiffusion into an accurate and consistent reaction-subdiffusion computational framework. Two different possible models of chemical reaction are revealed and some basic dynamic properties are derived. In certain cases those mesoscopic models have a direct interpretation at the macroscopic level as fractional partial differential equations in a bounded time interval. Through analysis and numerical experiments we estimate the macroscopic effects of reactions under subdiffusive mixing. The models display properties observed also in experiments: for a short time interval the behavior of the diffusion and the reaction is ordinary, in an intermediate interval the behavior is anomalous, and at long times the behavior is ordinary again. PMID:29046618
The orbital evolution of NEA 30825 1990 TG1
NASA Astrophysics Data System (ADS)
Timoshkova, E. I.
2008-02-01
The orbital evolution of the near-Earth asteroid (NEA) 30825 1990 TG1 has been studied by numerical integration of the equations of its motion over a 100 000-year time interval, with allowance for perturbations from the eight major planets and Pluto, and the variations in its osculating orbit over this time interval were determined. The numerical integrations were performed using two methods: the Bulirsch-Stoer method and the Everhart method. A comparative analysis of the two resulting orbital evolutions is presented for the time interval examined. The evolution of the asteroid motion is qualitatively the same for both variants, but the rate of evolution of the orbital elements differs. Our research confirms the known fact that applying different integrators to the study of the long-term evolution of an NEA orbit may lead to different evolution tracks.
NASA Technical Reports Server (NTRS)
Holdaway, Daniel; Yang, Yuekui
2016-01-01
This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series is further and further away from the "truth" as the sampling interval becomes larger and larger. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data is subsampled at these frequencies, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
Campbell, J P; Gratton, M C; Salomone, J A; Lindholm, D J; Watson, W A
1994-01-01
In some emergency medical services (EMS) system designs, response time intervals are mandated with monetary penalties for noncompliance. These times are set with the goal of providing rapid, definitive patient care. The time interval of vehicle at scene-to-patient access (VSPA) has been measured, but its effect on response time interval compliance has not been determined. To determine the effect of the VSPA interval on the mandated code 1 (< 9 min) and code 2 (< 13 min) response time interval compliance in an urban, public-utility model system. A prospective, observational study used independent third-party riders to collect the VSPA interval for emergency life-threatening (code 1) and emergency nonlife-threatening (code 2) calls. The VSPA interval was added to the 9-1-1 call-to-dispatch and vehicle dispatch-to-scene intervals to determine the total time interval from call received until paramedic access to the patient (9-1-1 call-to-patient access). Compliance with the mandated response time intervals was determined using the traditional time intervals (9-1-1 call-to-scene) plus the VSPA time intervals (9-1-1 call-to-patient access). Chi-square was used to determine statistical significance. Of the 216 observed calls, 198 were matched to the traditional time intervals. Sixty-three were code 1, and 135 were code 2. Of the code 1 calls, 90.5% were compliant using 9-1-1 call-to-scene intervals dropping to 63.5% using 9-1-1 call-to-patient access intervals (p < 0.0005). Of the code 2 calls, 94.1% were compliant using 9-1-1 call-to-scene intervals. Compliance decreased to 83.7% using 9-1-1 call-to-patient access intervals (p = 0.012). The addition of the VSPA interval to the traditional time intervals impacts system response time compliance. Using 9-1-1 call-to-scene compliance as a basis for measuring system performance underestimates the time for the delivery of definitive care. This must be considered when response time interval compliances are defined.
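The code-1 comparison can be reconstructed approximately from the quoted percentages (90.5% and 63.5% of 63 calls, i.e. about 57 and 40 compliant calls); a chi-square sketch under that assumption:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Counts inferred from the reported code-1 percentages (63 calls):
# 90.5% ≈ 57 compliant on call-to-scene, 63.5% ≈ 40 on call-to-patient-access.
table = np.array([[57, 63 - 57],
                  [40, 63 - 23 - 40 + 23]])  # i.e. [[57, 6], [40, 23]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.5f}")
```

The resulting p-value falls well below 0.05, consistent with the p < 0.0005 the study reports for the code-1 compliance drop.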
Carbon financial markets: A time-frequency analysis of CO2 prices
NASA Astrophysics Data System (ADS)
Sousa, Rita; Aguiar-Conraria, Luís; Soares, Maria Joana
2014-11-01
We characterize the interrelation of CO2 prices with energy prices (electricity, gas and coal), and with economic activity. Previous studies have relied on time-domain techniques, such as Vector Auto-Regressions. In this study, we use multivariate wavelet analysis, which operates in the time-frequency domain. Wavelet analysis provides convenient tools to distinguish relations at particular frequencies and at particular time horizons. Our empirical approach has the potential to identify relations getting stronger and then disappearing over specific time intervals and frequencies. We are able to examine the coherency of these variables and lead-lag relations at different frequencies for the time periods in focus.
Effects of aging on control of timing and force of finger tapping.
Sasaki, Hirokazu; Masumoto, Junya; Inui, Nobuyuki
2011-04-01
The present study examined whether the elderly produced a hastened or delayed tap with a negative or positive constant intertap interval error more frequently in self-paced tapping than in the stimulus-synchronized tapping for the 2 N target force at 2 or 4 Hz frequency. The analysis showed that, at both frequencies, the percentage of the delayed tap was larger in the self-paced tapping than in the stimulus-synchronized tapping, whereas the hastened tap showed the opposite result. At the 4 Hz frequency, all age groups had more variable intertap intervals during the self-paced tapping than during the stimulus-synchronized tapping, and the variability of the intertap intervals increased with age. Thus, although the increase in the frequency of delayed taps and variable intertap intervals in the self-paced tapping perhaps resulted from a dysfunction of movement timing in the basal ganglia with age, the decline in timing accuracy was somewhat improved by an auditory cue. The force variability of tapping at 4 Hz further increased with age, indicating an effect of aging on the control of force.
Evaluation of SAMe-TT2R2 Score on Predicting Success With Extended-Interval Warfarin Monitoring.
Hwang, Andrew Y; Carris, Nicholas W; Dietrich, Eric A; Gums, John G; Smith, Steven M
2018-06-01
In patients with stable international normalized ratios, 12-week extended-interval warfarin monitoring can be considered; however, predictors of success with this strategy are unknown. The previously validated SAMe-TT2R2 score (considering sex, age, medical history, treatment, tobacco, and race) predicts anticoagulation control during standard follow-up (every 4 weeks), with lower scores associated with greater time in therapeutic range. To evaluate the ability of the SAMe-TT2R2 score in predicting success with extended-interval warfarin follow-up in patients with previously stable warfarin doses. In this post hoc analysis of a single-arm feasibility study, baseline SAMe-TT2R2 scores were calculated for patients with ≥1 extended-interval follow-up visit. The primary analysis assessed achieved weeks of extended-interval follow-up according to baseline SAMe-TT2R2 scores. A total of 47 patients receiving chronic anticoagulation completed a median of 36 weeks of extended-interval follow-up. The median baseline SAMe-TT2R2 score was 1 (range 0-5). Lower SAMe-TT2R2 scores appeared to be associated with greater duration of extended-interval follow-up achieved, though the differences between scores were not statistically significant. No individual variable of the SAMe-TT2R2 score was associated with achieved weeks of extended-interval follow-up. Analysis of additional patient factors found that a longer duration (≥24 weeks) of prior stable treatment was significantly associated with more weeks of extended-interval follow-up completed (P = 0.04). Conclusion and Relevance: This pilot study provides limited evidence that the SAMe-TT2R2 score predicts success with extended-interval warfarin follow-up but requires confirmation in a larger study. Further research is also necessary to establish additional predictors of successful extended-interval warfarin follow-up.
A validation of ground ambulance pre-hospital times modeled using geographic information systems.
Patel, Alka B; Waters, Nigel M; Blanchard, Ian E; Doig, Christopher J; Ghali, William A
2012-10-03
Evaluating geographic access to health services often requires determining the patient travel time to a specified service. For urgent care, many research studies have modeled patient pre-hospital time by ground emergency medical services (EMS) using geographic information systems (GIS). The purpose of this study was to determine if the modeling assumptions proposed through prior United States (US) studies are valid in a non-US context, and to use the resulting information to provide revised recommendations for modeling travel time using GIS in the absence of actual EMS trip data. The study sample contained all emergency adult patient trips within the Calgary area for 2006. Each record included four components of pre-hospital time (activation, response, on-scene and transport interval). The actual activation and on-scene intervals were compared with those used in published models. The transport interval was calculated within GIS using the Network Analyst extension of Esri ArcGIS 10.0 and the response interval was derived using previously established methods. These GIS derived transport and response intervals were compared with the actual times using descriptive methods. We used the information acquired through the analysis of the EMS trip data to create an updated model that could be used to estimate travel time in the absence of actual EMS trip records. There were 29,765 complete EMS records for scene locations inside the city and 529 outside. The actual median on-scene intervals were longer than the average previously reported by 7-8 minutes. Actual EMS pre-hospital times across our study area were significantly higher than the estimated times modeled using GIS and the original travel time assumptions. Our updated model, although still underestimating the total pre-hospital time, more accurately represents the true pre-hospital time in our study area. 
The widespread use of generalized EMS pre-hospital time assumptions based on US data may not be appropriate in a non-US context. The preference for researchers should be to use actual EMS trip records from the proposed research study area. In the absence of EMS trip data researchers should determine which modeling assumptions more accurately reflect the EMS protocols across their study area.
Serial recall and presentation schedule: a micro-analysis of local distinctiveness.
Lewandowsky, Stephan; Brown, Gordon D A
2005-01-01
According to temporal distinctiveness theories, items that are temporally isolated from their neighbours during presentation are more distinct and thus are recalled better. Event-based theories, which deny that elapsed time plays a role at encoding, explain isolation effects by assuming that temporal isolation provides extra time for rehearsal or consolidation of encoding. The two classes of theories can be differentiated by examining the symmetry of isolation effects: Event-based accounts predict that performance should be affected only by pauses following item presentation (because they allow time for rehearsal or consolidation), whereas distinctiveness predicts that items should also benefit from preceding pauses. The first experiment manipulated inter-item intervals and showed an effect of intervals following but not preceding presentation, in line with event-based accounts. The second experiment showed that the effect of following interval was abolished by articulatory suppression. The data are consistent with event-based theories but can be handled by time-based distinctiveness models if they allow for additional encoding during inter-item pauses.
Melkonian, D; Korner, A; Meares, R; Bahramali, H
2012-10-01
A novel method of time-frequency analysis of non-stationary heart rate variability (HRV) is developed which introduces the fragmentary spectrum as a measure that brings together the frequency content, timing and duration of HRV segments. The fragmentary spectrum is calculated by the similar basis function algorithm. This numerical tool for time-to-frequency and frequency-to-time Fourier transformations accepts both uniform and non-uniform sampling intervals, and is applicable to signal segments of arbitrary length. Once the fragmentary spectrum is calculated, the inverse transform recovers the original signal and reveals the accuracy of the spectral estimates. Numerical experiments show that discontinuities at the boundaries of the succession of inter-beat intervals can cause unacceptable distortions of the spectral estimates. We have developed a measure that we call the "RR deltagram" as a form of the HRV data that minimises spectral errors. The analysis of experimental HRV data from real-life and controlled breathing conditions suggests transient oscillatory components as functionally meaningful elements of highly complex and irregular patterns of HRV. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
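The similar basis function algorithm itself is not reproduced here. As an illustrative stand-in for spectral estimation on non-uniformly sampled interbeat data, the Lomb-Scargle periodogram (scipy) recovers a hypothetical 0.1 Hz respiratory-like component from irregular sample times; all values below are invented.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
# Non-uniform sample times (like beat times) over ~100 s
t = np.sort(rng.uniform(0, 100, 400))
# Hypothetical 0.1 Hz oscillation plus noise on the interbeat signal
y = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.normal(size=t.size)

# NOTE: scipy's lombscargle expects ANGULAR frequencies (rad/s)
omega = np.linspace(0.05, 2.0, 500)
pgram = lombscargle(t, y - y.mean(), omega)
peak_hz = omega[np.argmax(pgram)] / (2 * np.pi)
print(f"dominant frequency ~ {peak_hz:.3f} Hz")
```

Like the fragmentary spectrum, this approach needs no uniform resampling of the RR sequence, which is what makes it a reasonable point of comparison.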
Caputo, Ronald P; Kosinski, Robert; Walford, Gary; Giambartolomei, Alex; Grant, William; Reger, Mark J; Simons, Alan; Esente, Paolo
2005-04-01
As time to reperfusion correlates with outcomes, a door-to-balloon time of 90 +/- 30 min for primary percutaneous coronary revascularization (PCI) for the treatment of acute myocardial infarction has been recently established as a guideline by the ACC/AHA. The purpose of this study is to assess the effects of a continuous quality assurance program designed to expedite primary angioplasty at a community hospital. A database of all primary PCI procedures was created in 1998. Two groups of consecutive patients treated with primary PCI were studied. Group 1 represented patients in the time period between 1 June 1998 to 1 November 1998 and group 2 represented patients in the period between 1 January 2000 and 16 June 2000. Continuous quality assurance analysis was performed. Modifications to the primary angioplasty program were initiated in the latter group. Time intervals to certain treatment landmarks were compared between the groups. Significant decreases in the time intervals from emergency room registration to initial electrocardiogram (8.4 +/- 8.2 vs. 3.7 +/- 19.5 min; P < 0.001), presentation to the catheterization laboratory to arterial access (13.5 +/- 12.9 vs. 11.6 +/- 5.8 min; P < 0.001), and emergency room registration to initial angioplasty balloon inflation (132.0 +/- 69.2 vs. 112 +/- 72.0 min; P < 0.001) were achieved. For the subgroup of patients presenting with diagnostic ST elevation myocardial infarction, a large decrease in the door-to-balloon time interval between group 1 and group 2 was demonstrated (114.15 +/- 9.67 vs. 87.92 +/- 10.93 min; P = NS), resulting in compliance with ACC/AHA guidelines. Continuous quality improvement analysis can expedite care for patients treated by primary PCI in the community hospital setting. Copyright 2005 Wiley-Liss, Inc.
Temporal Variability in the Deglutition Literature
Molfenter, Sonja M.; Steele, Catriona M.
2013-01-01
A literature review was conducted on temporal measures of swallowing in healthy individuals with the purpose of determining the degree of variability present in such measures within the literature. A total of 46 studies that met inclusion criteria were reviewed. The definitions and descriptive statistics for all reported temporal parameters were compiled for meta-analysis. In total, 119 different temporal parameters were found in the literature. The three most-frequently occurring durational measures were: UES opening, laryngeal closure and hyoid movement. The three most-frequently occurring interval measures were: stage transition duration, pharyngeal transit time and duration from laryngeal closure to UES opening. Subtle variations in operational definitions across studies were noted, making the comparison of data challenging. Analysis of forest plots compiling descriptive statistical data (means and 95% confidence intervals) across studies revealed differing degrees of variability across durations and intervals. Two parameters (UES opening duration and the laryngeal-closure-to-UES-opening interval) demonstrated the least variability, reflected by small ranges for mean values and tight confidence intervals. Trends emerged for factors of bolus size and participant age for some variables. Other potential sources of variability are discussed. PMID:22366761
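The descriptive statistics behind such a forest plot reduce to a per-study mean with a normal-approximation 95% confidence interval; a minimal sketch with invented per-study durations:

```python
import math

def mean_ci95(values):
    """Sample mean and normal-approximation 95% CI."""
    n = len(values)
    m = sum(values) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / (n - 1))
    half = 1.96 * sd / math.sqrt(n)
    return m, (m - half, m + half)

# Hypothetical per-study mean UES opening durations (ms) -- invented numbers
studies = {"A": [320, 355, 340, 330], "B": [310, 325, 345], "C": [350, 360, 335, 342, 338]}
for name, vals in studies.items():
    m, (lo, hi) = mean_ci95(vals)
    print(f"study {name}: mean = {m:.1f} ms, 95% CI ({lo:.1f}, {hi:.1f})")
```

Narrow, overlapping intervals across studies correspond to the "least variability" finding for UES opening duration; wide or non-overlapping ones signal the heterogeneity the review discusses.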
Analysis of an inventory model for both linearly decreasing demand and holding cost
NASA Astrophysics Data System (ADS)
Malik, A. K.; Singh, Parth Raj; Tomar, Ajay; Kumar, Satish; Yadav, S. K.
2016-03-01
This study proposes the analysis of an inventory model with linearly decreasing demand and holding cost for non-instantaneous deteriorating items. The model focuses on commodities with linearly decreasing demand and no shortages. The holding cost does not remain uniform over time, owing to variation in the time value of money; here we consider a holding cost that decreases with time. The optimal time interval for the total profit and the optimal order quantity are determined. The developed inventory model is illustrated with a numerical example, and a sensitivity analysis is included.
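A hedged numerical sketch of this kind of model: demand D(t) = a - bt, holding cost rate h(t) = h0 - h1*t, no shortages, deterioration omitted for brevity, and all parameter values invented. The cycle length T maximizing average profit per unit time is found by grid search rather than by the paper's closed-form analysis.

```python
import numpy as np

# Illustrative parameters (all invented)
a, b = 100.0, 2.0           # demand rate D(t) = a - b*t
h0, h1 = 0.5, 0.01          # holding cost rate h(t) = h0 - h1*t
p, c, A = 12.0, 7.0, 150.0  # unit price, unit cost, fixed ordering cost

def trap(y, x):
    """Trapezoid integration (avoids numpy version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2))

def profit_rate(T, n=400):
    """Average profit per unit time for a replenishment cycle of length T."""
    t = np.linspace(0, T, n)
    demand = a - b * t
    Q = trap(demand, t)                      # order quantity covers cycle demand
    # inventory on hand: starts at Q, drawn down by cumulative demand
    drawn = np.concatenate(([0.0], np.cumsum((demand[1:] + demand[:-1]) * np.diff(t) / 2)))
    inv = Q - drawn
    holding = trap((h0 - h1 * t) * inv, t)
    return ((p - c) * Q - A - holding) / T

Ts = np.linspace(0.5, 20.0, 200)
best = max(Ts, key=profit_rate)
print(f"optimal cycle length ~ {best:.2f}, profit rate ~ {profit_rate(best):.2f}")
```

Short cycles are penalized by the fixed ordering cost and long cycles by accumulated holding cost, so the profit rate has an interior maximum, mirroring the optimal time interval the study derives.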
Correlated and uncorrelated heart rate fluctuations during relaxing visualization
NASA Astrophysics Data System (ADS)
Papasimakis, N.; Pallikari, F.
2010-05-01
The heart rate variability (HRV) of healthy subjects practicing relaxing visualization is studied by use of three multiscale analysis techniques: detrended fluctuation analysis (DFA), entropy in natural time (ENT) and the average wavelet coefficient (AWC). The scaling exponent of normal interbeat interval increments exhibits characteristics of the presence of long-range correlations. During relaxing visualization the HRV dynamics change in the sense that two new features emerge, independent of each other: a respiration-induced periodicity that often dominates the HRV at short scales (<40 interbeat intervals) and a decrease of the scaling exponent at longer scales (40-512 interbeat intervals). In certain cases, the scaling exponent during relaxing visualization indicates the breakdown of long-range correlations. These characteristics have previously been seen in HRV dynamics during non-REM sleep.
Talving, Peep; Pålstedt, Joakim; Riddez, Louis
2005-01-01
Few previous studies have been conducted on the prehospital management of hypotensive trauma patients in Stockholm County. The aim of this study was to describe the prehospital management of hypotensive trauma patients admitted to the largest trauma center in Sweden, and to assess whether prehospital trauma life support (PHTLS) guidelines have been implemented regarding prehospital time intervals and fluid therapy. In addition, the effects of the age, type of injury, injury severity, prehospital time interval, blood pressure, and fluid therapy on outcome were investigated. This is a retrospective, descriptive study on consecutive, hypotensive trauma patients (systolic blood pressure < or = 90 mmHg on the scene of injury) admitted to Karolinska University Hospital in Stockholm, Sweden, during 2001-2003. The reported values are medians with interquartile ranges. Basic demographics, prehospital time intervals and interventions, injury severity scores (ISS), type and volumes of prehospital fluid resuscitation, and 30-day mortality were abstracted. The effects of the patient's age, gender, prehospital time interval, type of injury, injury severity, on-scene and emergency department blood pressure, and resuscitation fluid volumes on mortality were analyzed using the exact logistic regression model. In 102 (71 male) adult patients (age > or = 15 years) recruited, the median age was 35.5 years (range: 27-55 years) and 77 patients (75%) had suffered blunt injury. The predominant trauma mechanisms were falls between levels (24%) and motor vehicle crashes (22%) with an ISS of 28.5 (range: 16-50). The on-scene time interval was 19 minutes (range: 12-24 minutes). Fluid therapy was initiated at the scene of injury in the majority of patients (73%) regardless of the type of injury (77 blunt [75%] / 25 penetrating [25%]) or injury severity (ISS: 0-20; 21-40; 41-75). 
Age (odds ratio (OR) = 1.04), male gender (OR = 3.2), ISS 21-40 (OR = 13.6), and ISS >40 (OR = 43.6) were the significant factors affecting outcome in the exact logistic regression analysis. The on-scene time interval exceeded PHTLS guidelines. The vast majority of the hypotensive trauma patients were fluid-resuscitated on-scene regardless of the type, mechanism, or severity of injury. A predefined fluid resuscitation regimen was not employed in hypotensive trauma victims with different types of injuries. Outcome was worsened by male gender, advancing age, and ISS > 20 in the exact multiple regression analysis.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... series in the pilot: (1) A time series analysis of open interest; and (2) An analysis of the distribution... times the number of shares outstanding. These are summed for all 500 stocks and divided by a... below $3.00 and $0.10 for all other series. Strike price intervals would be set no less than 5 points...
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1978-01-01
A statistical analysis is presented of the temporal variability of wind vectors at 1 km altitude intervals from 0 to 27 km altitude after applying a digital filter to the original wind profile data sample.
Pullin, A N; Pairis-Garcia, M D; Campbell, B J; Campler, M R; Proudfoot, K L
2017-11-01
When considering methodologies for collecting behavioral data, continuous sampling provides the most complete and accurate data set, whereas instantaneous sampling can provide similar results while increasing the efficiency of data collection. However, instantaneous time intervals require validation to ensure accurate estimation of the data. Therefore, the objective of this study was to validate scan sampling intervals for lambs housed in a feedlot environment. Feeding, lying, standing, drinking, locomotion, and oral manipulation were measured on 18 crossbred lambs housed in an indoor feedlot facility for 14 h (0600-2000 h). Data from continuous sampling were compared with data from instantaneous scan sampling intervals of 5, 10, 15, and 20 min using a linear regression analysis. Three criteria determined whether a time interval accurately estimated behaviors: 1) R2 ≥ 0.90, 2) slope not statistically different from 1 (P > 0.05), and 3) intercept not statistically different from 0 (P > 0.05). Estimations for lying behavior were accurate up to 20-min intervals, whereas feeding and standing behaviors were accurate only at 5-min intervals (i.e., met all 3 regression criteria). Drinking, locomotion, and oral manipulation demonstrated poor associations (low R2) for all tested intervals. The results from this study suggest that a 5-min instantaneous sampling interval will accurately estimate lying, feeding, and standing behaviors for lambs housed in a feedlot, whereas continuous sampling is recommended for the remaining behaviors. This methodology will contribute toward the efficiency, accuracy, and transparency of future behavioral data collection in lamb behavior research.
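The three acceptance criteria above amount to a small regression check. A minimal sketch follows; the lamb data are simulated stand-ins and `interval_is_valid` is a hypothetical helper, not the authors' code.

```python
# Sketch: validate a scan-sampling interval against continuous observation
# using the study's three criteria (R^2 >= 0.90, slope ~ 1, intercept ~ 0).
import numpy as np
from scipy import stats

def interval_is_valid(continuous, scanned, alpha=0.05):
    """Regress scan estimates on continuous totals; apply the 3 criteria."""
    res = stats.linregress(continuous, scanned)
    n = len(continuous)
    r2 = res.rvalue ** 2
    # t-tests for slope != 1 and intercept != 0
    p_slope = 2 * stats.t.sf(abs((res.slope - 1.0) / res.stderr), df=n - 2)
    p_icpt = 2 * stats.t.sf(abs(res.intercept / res.intercept_stderr), df=n - 2)
    ok = bool(r2 >= 0.90 and p_slope > alpha and p_icpt > alpha)
    return ok, r2

rng = np.random.default_rng(0)
lying_cont = rng.uniform(200, 600, size=18)           # minutes lying, per lamb
lying_scan = lying_cont + rng.normal(0, 10, size=18)  # close agreement
ok, r2 = interval_is_valid(lying_cont, lying_scan)
print(ok, round(r2, 2))
```

A behavior passes only when all three criteria hold, which is why lying (strongly autocorrelated in time) tolerates coarser intervals than brief behaviors like drinking.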
4D seismic monitoring of the miscible CO2 flood of Hall-Gurney Field, Kansas, U.S
Raef, A.E.; Miller, R.D.; Byrnes, A.P.; Harrison, W.E.
2004-01-01
A cost-effective, highly repeatable, 4D-optimized, single-pattern/patch seismic data-acquisition approach with several 3D data sets was used to evaluate the feasibility of imaging changes associated with the water-alternating-gas (WAG) stage. By incorporating noninversion-based seismic-attribute analysis, the time and cost of processing and interpreting the data were reduced. A 24-ms-thick EOR-CO2 injection interval was targeted using an average instantaneous frequency (AIF) attribute. Changes in amplitude response related to the decrease in velocity from pore-fluid replacement within this time interval were found to be lower relative to background values than in the AIF analysis. Carefully color-balanced AIF-attribute maps established the overall area affected by the injected EOR-CO2.
NASA Astrophysics Data System (ADS)
Tang, Tie-Qiao; Wang, Tao; Chen, Liang; Huang, Hai-Jun
2018-01-01
In this paper, we introduce the fuel cost into each commuter's trip cost, define a new trip cost without late arrival and its corresponding equilibrium state, and use a car-following model to explore the impacts of the fuel cost on each commuter's departure time, departure interval, arrival time, arrival interval, traveling time, early arrival time and trip cost at the above equilibrium state. The numerical results show that accounting for the fuel cost in each commuter's trip cost has positive impacts on his trip cost, his fuel cost, and the traffic situation in the system without late arrival; that is, each commuter should explicitly consider the fuel cost in his trip cost.
Pöchmüller, Martin; Schwingshackl, Lukas; Colombani, Paolo C; Hoffmann, Georg
2016-01-01
Carbohydrate supplements are widely used by athletes as an ergogenic aid before and during sports events. The present systematic review and meta-analysis aimed at synthesizing all available data from randomized controlled trials performed under real-life conditions. MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials were searched systematically up to February 2015. Study groups were categorized according to test mode and type of performance measurement. Subgroup analyses were done with reference to exercise duration and range of carbohydrate concentration. Random effects and fixed effect meta-analyses were performed using the Cochrane Collaboration's Review Manager 5.3 software. Twenty-four randomized controlled trials met the objectives and were included in the present systematic review, 16 of which provided data for meta-analyses. Carbohydrate supplementation was associated with a significantly shorter exercise time in groups performing submaximal exercise followed by a time trial [mean difference -0.9 min (95% confidence interval -1.7, -0.2), p = 0.02] as compared to controls. Subgroup analysis showed that improvements were specific to studies administering a carbohydrate concentration between 6 and 8% [mean difference -1.0 min (95% confidence interval -1.9, -0.0), p = 0.04]. Concerning groups with submaximal exercise followed by a time trial measuring power accomplished within a fixed time or distance, mean power output was significantly higher following carbohydrate load [mean difference 20.2 W (95% confidence interval 9.0, 31.5), p = 0.0004]. Likewise, mean power output was significantly increased following carbohydrate intervention in groups with a time trial measuring power within a fixed time or distance [mean difference 8.1 W (95% confidence interval 0.5, 15.7), p = 0.04]. Due to the limitations of this systematic review, results can only be applied to a subset of athletes (trained male cyclists).
For those, we could observe a potential ergogenic benefit of carbohydrate supplementation especially in a concentration range between 6 and 8 % when exercising longer than 90 min.
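The pooled mean differences quoted above come from inverse-variance weighting. A minimal sketch of the fixed-effect version of that calculation, using invented study effects rather than the review's trials:

```python
# Hedged sketch of an inverse-variance fixed-effect pooled mean difference;
# study effects below are illustrative, not data from the review.
import math

def pooled_mean_difference(effects):
    """effects: list of (mean_difference, ci_low, ci_high) at the 95% level."""
    z = 1.96
    weights, weighted = [], []
    for md, lo, hi in effects:
        se = (hi - lo) / (2 * z)        # back-calculate SE from the CI width
        w = 1.0 / se ** 2               # inverse-variance weight
        weights.append(w)
        weighted.append(w * md)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - z * se_pooled, pooled + z * se_pooled)

md, ci = pooled_mean_difference([(-1.2, -2.4, 0.0), (-0.8, -1.6, 0.0)])
print(round(md, 2), [round(x, 2) for x in ci])
```

A random-effects analysis, as also used in the review, would additionally inflate each study's variance by a between-study heterogeneity term before weighting.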
Marzban, Caren; Viswanathan, Raju; Yurtsever, Ulvi
2014-01-09
A recent study argued, based on data on functional genome size of major phyla, that there is evidence life may have originated significantly prior to the formation of the Earth. Here a more refined regression analysis is performed in which 1) measurement error is systematically taken into account, and 2) interval estimates (e.g., confidence or prediction intervals) are produced. It is shown that such models for which the interval estimate for the time origin of the genome includes the age of the Earth are consistent with observed data. The appearance of life after the formation of the Earth is consistent with the data set under examination.
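The interval estimates the authors add are standard regression prediction intervals. An illustrative sketch (not the paper's genome-size data) of a 95% prediction interval for a new x under ordinary least squares:

```python
# Sketch: OLS fit with a 95% prediction interval for a new x value.
import numpy as np
from scipy import stats

def prediction_interval(x, y, x_new, level=0.95):
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s2 = resid @ resid / (n - 2)                      # residual variance
    xbar = x.mean()
    sxx = ((x - xbar) ** 2).sum()
    se = np.sqrt(s2 * (1 + 1 / n + (x_new - xbar) ** 2 / sxx))
    t = stats.t.ppf((1 + level) / 2, df=n - 2)
    y_hat = slope * x_new + intercept
    return y_hat - t * se, y_hat + t * se

x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0 + np.array([0.1, -0.1] * 5)         # near-linear data
lo, hi = prediction_interval(x, y, x_new=12.0)
print(lo < 2 * 12 + 1 < hi)                           # true value inside
```

The paper's point is exactly this kind of check: whether the interval around the extrapolated time origin is wide enough to include the age of the Earth.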
A real-time approach for heart rate monitoring using a Hilbert transform in seismocardiograms.
Jafari Tadi, Mojtaba; Lehtonen, Eero; Hurnanen, Tero; Koskinen, Juho; Eriksson, Jonas; Pänkäälä, Mikko; Teräs, Mika; Koivisto, Tero
2016-11-01
Heart rate monitoring helps in assessing the functionality and condition of the cardiovascular system. We present a new real-time applicable approach for estimating beat-to-beat time intervals and heart rate in seismocardiograms acquired from a tri-axial microelectromechanical accelerometer. Seismocardiography (SCG) is a non-invasive method for heart monitoring which measures the mechanical activity of the heart. Measuring true beat-to-beat time intervals from SCG could be used for monitoring of the heart rhythm, for heart rate variability analysis and for many other clinical applications. In this paper we present the Hilbert adaptive beat identification technique for the detection of heartbeat timings and inter-beat time intervals in SCG from healthy volunteers in three different positions, i.e. supine, left and right recumbent. Our method is electrocardiogram (ECG) independent, as it does not require any ECG fiducial points to estimate the beat-to-beat intervals. The performance of the algorithm was tested against standard ECG measurements. The average true positive rate, positive prediction value and detection error rate for the different positions were, respectively, supine (95.8%, 96.0% and ≃0.6%), left (99.3%, 98.8% and ≃0.001%) and right (99.53%, 99.3% and ≃0.01%). High correlation and agreement was observed between SCG and ECG inter-beat intervals (r > 0.99) for all positions, which highlights the capability of the algorithm for SCG heart monitoring from different positions. Additionally, we demonstrate the applicability of the proposed method in smartphone based SCG. In conclusion, the proposed algorithm can be used for real-time continuous unobtrusive cardiac monitoring, smartphone cardiography, and in wearable devices aimed at health and well-being applications.
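The core of a Hilbert-based beat detector is the analytic-signal envelope, which peaks once per heartbeat. A minimal sketch on a synthetic SCG-like trace (the signal below stands in for real accelerometer data; the authors' full adaptive algorithm is not reproduced):

```python
# Sketch: envelope-based beat detection via the Hilbert transform.
import numpy as np
from scipy.signal import hilbert, find_peaks

fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
beat_times = np.arange(0.5, 10, 0.8)         # 75 bpm equivalent
sig = np.zeros_like(t)
for bt in beat_times:                        # short 20 Hz burst per "beat"
    mask = (t >= bt) & (t < bt + 0.1)
    sig[mask] += np.sin(2 * np.pi * 20 * (t[mask] - bt)) * np.hanning(mask.sum())

envelope = np.abs(hilbert(sig))              # analytic-signal magnitude
peaks, _ = find_peaks(envelope, height=0.5, distance=int(0.4 * fs))
ibi = np.diff(t[peaks])                      # inter-beat intervals, seconds
print(len(peaks), round(float(np.median(ibi)), 2))
```

In practice the SCG would first be band-pass filtered around the heart-sound band so the envelope reflects cardiac vibration rather than respiration or motion artifacts.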
Hellström, Åke; Rammsayer, Thomas H
2015-10-01
Studies have shown that the discriminability of successive time intervals depends on the presentation order of the standard (St) and the comparison (Co) stimuli. Also, this order affects the point of subjective equality. The first effect is here called the standard-position effect (SPE); the latter is known as the time-order error. In the present study, we investigated how these two effects vary across interval types and standard durations, using Hellström's sensation-weighting model to describe the results and relate them to stimulus comparison mechanisms. In Experiment 1, four modes of interval presentation were used, factorially combining interval type (filled, empty) and sensory modality (auditory, visual). For each mode, two presentation orders (St-Co, Co-St) and two standard durations (100 ms, 1,000 ms) were used; half of the participants received correctness feedback, and half of them did not. The interstimulus interval was 900 ms. The SPEs were negative (i.e., a smaller difference limen for St-Co than for Co-St), except for the filled-auditory and empty-visual 100-ms standards, for which a positive effect was obtained. In Experiment 2, duration discrimination was investigated for filled auditory intervals with four standards between 100 and 1,000 ms, an interstimulus interval of 900 ms, and no feedback. Standard duration interacted with presentation order, here yielding SPEs that were negative for standards of 100 and 1,000 ms, but positive for 215 and 464 ms. Our findings indicate that the SPE can be positive as well as negative, depending on the interval type and standard duration, reflecting the relative weighting of the stimulus information, as is described by the sensation-weighting model.
Hanaki, Nao; Yamashita, Kazuto; Kunisawa, Susumu; Imanaka, Yuichi
2016-12-09
In Japan, ambulance staff sometimes must make request calls to find hospitals that can accept patients because of an inadequate information sharing system. This study aimed to quantify effects of the number of request calls on the time interval between an emergency call and hospital arrival. A cross-sectional study of an ambulance records database in Nara prefecture, Japan. A total of 43 663 patients (50% women; 31.2% aged 80 years and over): (1) transported by ambulance from April 2013 to March 2014, (2) aged 15 years and over, and (3) with suspected major illness. The time from call to hospital arrival, defined as the time interval from receipt of an emergency call to ambulance arrival at a hospital. The mean time interval from emergency call to hospital arrival was 44.5 min, and the mean number of requests was 1.8. Multilevel linear regression analysis showed that ∼43.8% of variations in transportation times were explained by patient age, sex, season, day of the week, time, category of suspected illness, person calling for the ambulance, emergency status at request call, area and number of request calls. A higher number of request calls was associated with longer time intervals to hospital arrival (addition of 6.3 min per request call; p<0.001). In an analysis dividing areas into three groups, there were differences in transportation time for diseases needing cardiologists, neurologists, neurosurgeons and orthopaedists. The study revealed 6.3 additional minutes needed in transportation time for every refusal of a request call, and also revealed disease-specific delays among specific areas. An effective system should be collaboratively established by policymakers and physicians to ensure the rapid identification of an available hospital for patient transportation in order to reduce the time from the initial emergency call to hospital arrival.
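The per-call delay is the slope of a regression of transport time on the number of request calls. A back-of-envelope sketch with synthetic data (not the Nara records, and without the study's multilevel structure or covariates):

```python
# Sketch: OLS recovers a per-request-call delay from simulated records.
import numpy as np

rng = np.random.default_rng(42)
n_calls = rng.integers(1, 6, size=500)               # request calls per case
minutes = 38.0 + 6.3 * n_calls + rng.normal(0, 5, size=500)

X = np.column_stack([np.ones_like(n_calls, dtype=float), n_calls])
beta, *_ = np.linalg.lstsq(X, minutes, rcond=None)
print(round(float(beta[1]), 1))                      # near the simulated 6.3
```

The study's multilevel model additionally partitions variance across areas and adjusts for age, sex, season and illness category, which plain OLS here omits.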
Varnes, D.J.; Bufe, C.G.
1996-01-01
Seismic activity in the 10 months preceding the 1980 February 14, mb 4.8 earthquake in the Virgin Islands, reported on by Frankel in 1982, consisted of four principal cycles. Each cycle began with a relatively large event or series of closely spaced events, and the duration of the cycles progressively shortened by a factor of about 3/4. Had this regular shortening of the cycles been recognized prior to the earthquake, the time of the next episode of seismicity (the main shock) might have been closely estimated 41 days in advance. That this event could be much larger than the previous events is indicated by time-to-failure analysis of the accelerating rise in released seismic energy, using a non-linear time- and slip-predictable foreshock model. Examination of the timing of all events in the sequence shows an even higher degree of order. Rates of seismicity, measured by consecutive interevent times, when plotted on an iteration diagram of one rate versus the succeeding rate, form a triangular circulating trajectory. The trajectory becomes an ascending helix if extended in a third dimension, time. This construction reveals additional and precise relations among the time intervals between times of relatively high or relatively low rates of seismic activity, including period halving and doubling. The set of 666 time intervals between all possible pairs of the 37 recorded events appears to be a fractal; the set of time points that define the intervals has a finite, non-integer correlation dimension of 0.70. In contrast, the average correlation dimension of 50 random sequences of 37 events is significantly higher, close to 1.0. In a similar analysis, the set of distances between pairs of epicentres has a fractal correlation dimension of 1.52. Well-defined cycles, numerous precise ratios among time intervals, and a non-random temporal fractal dimension suggest that the seismic series is not a random process, but rather the product of a deterministic dynamic system.
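The correlation dimension cited above can be estimated Grassberger-Procaccia style: count pairs of event times closer than r and read the dimension off the log-log slope. A sketch on synthetic times (not the Virgin Islands catalogue), where a random sequence should give a dimension near 1.0, as the authors' control comparison found:

```python
# Sketch: correlation sum and dimension for a set of event times.
import numpy as np

def correlation_sum(times, r):
    d = np.abs(times[:, None] - times[None, :])
    n = len(times)
    return (np.sum(d < r) - n) / (n * (n - 1))       # exclude self-pairs

rng = np.random.default_rng(1)
times = np.sort(rng.uniform(0, 300, size=37))        # 37 events, as in the study
radii = np.array([5.0, 10.0, 20.0, 40.0])
c = np.array([correlation_sum(times, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(c), 1)[0]   # ~1 for a random sequence
print(round(float(slope), 1))
```

A clustered, deterministic sequence concentrates pairs at preferred separations, pulling this slope below 1 toward values like the reported 0.70.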
NASA Technical Reports Server (NTRS)
Makikallio, T. H.; Koistinen, J.; Jordaens, L.; Tulppo, M. P.; Wood, N.; Golosarsky, B.; Peng, C. K.; Goldberger, A. L.; Huikuri, H. V.
1999-01-01
The traditional methods of analyzing heart rate (HR) variability have failed to predict imminent ventricular fibrillation (VF). We sought to determine whether new methods of analyzing RR interval variability based on nonlinear dynamics and fractal analysis may help to detect subtle abnormalities in RR interval behavior before the onset of life-threatening arrhythmias. RR interval dynamics were analyzed from 24-hour Holter recordings of 15 patients who experienced VF during electrocardiographic recording. Thirty patients without spontaneous or inducible arrhythmia events served as a control group in this retrospective case control study. Conventional time- and frequency-domain measurements, the short-term fractal scaling exponent (alpha) obtained by detrended fluctuation analysis, and the slope (beta) of the power-law regression line (log power - log frequency, 10(-4)-10(-2) Hz) of RR interval dynamics were determined. The short-term correlation exponent alpha of RR intervals (0.64 +/- 0.19 vs 1.05 +/- 0.12; p <0.001) and the power-law slope beta (-1.63 +/- 0.28 vs -1.31 +/- 0.20, p <0.001) were lower in the patients before the onset of VF than in the control patients, but the SD and the low-frequency spectral components of RR intervals did not differ between the groups. The short-term scaling exponent performed better than any other measurement of HR variability in differentiating between the patients with VF and controls. Altered fractal correlation properties of HR behavior precede the spontaneous onset of VF. Dynamic analysis methods of analyzing RR intervals may help to identify abnormalities in HR behavior before VF.
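The short-term scaling exponent alpha comes from detrended fluctuation analysis. A minimal DFA sketch on white noise, which should give alpha near 0.5 (uncorrelated increments), the baseline against which the reduced pre-VF exponents are read:

```python
# Minimal detrended fluctuation analysis (DFA) sketch.
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    y = np.cumsum(x - np.mean(x))                    # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    # scaling exponent = log-log slope of fluctuation vs scale
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(7)
alpha = dfa_alpha(rng.normal(size=4096))
print(round(float(alpha), 1))                        # ~0.5 for white noise
```

Applied to RR-interval series, healthy dynamics typically yield alpha near 1, so the pre-VF drop toward 0.64 reported above marks a loss of fractal correlation.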
NASA Astrophysics Data System (ADS)
Mignanelli, L.; Bauer, G.; Klarmann, M.; Wang, H.; Rembe, C.
2017-07-01
Velocity signals acquired with a laser Doppler vibrometer on the thorax (optical vibrocardiography) contain important information related to cardiovascular parameters and cardiovascular diseases. The acquired signal is a superposition of vibrations originating from different sources in the human body. Since we study the vibration generated by the heart to reliably detect a characteristic time interval corresponding to the PR interval in the ECG, these disturbances have to be removed by filtering. Moreover, the laser Doppler vibrometer measures only in the direction of the laser beam, and thus the velocity signal is only a projection of the three-dimensional movement of the thorax. This work presents an analysis of the influences of the filters and of the measurement direction on the characteristic time interval in vibrocardiographic signals. Our analysis results in recommended filter settings, and we demonstrate that reliable detection of vibrocardiographic parameters is possible within an angle deviation of 30° with respect to perpendicular irradiation on the front side of the subject.
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs; the time to failure of a specific experimental unit may be censored, whether right, left, interval, or partly interval censored (PIC). In this paper, analysis of this model was conducted based on a parametric Cox model via PIC data. Moreover, several imputation techniques were used: midpoint, left & right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, based on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results for the data set indicated that the parametric Cox model proved superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median showed better results with respect to the estimation of the survival function.
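The midpoint technique above is the simplest of the imputations: each interval-censored observation (L, R] is replaced by its midpoint, after which any standard survival fit can be applied. A minimal sketch with illustrative intervals:

```python
# Sketch: midpoint imputation for interval-censored failure times.
def impute_midpoint(intervals):
    """intervals: (left, right) pairs; right=None marks right-censoring,
    which midpoint imputation leaves at the last observed time."""
    times = []
    for left, right in intervals:
        times.append(left if right is None else (left + right) / 2.0)
    return times

observed = [(2.0, 4.0), (5.0, 9.0), (7.0, None)]     # last one right-censored
print(impute_midpoint(observed))
```

Random, mean, and median imputation differ only in which point inside (L, R] is chosen; the paper compares how each choice propagates into the fitted survival function.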
Advanced analysis of finger-tapping performance: a preliminary study.
Barut, Cağatay; Kızıltan, Erhan; Gelir, Ethem; Köktürk, Fürüzan
2013-06-01
The finger-tapping test is a commonly employed quantitative assessment tool used to measure motor performance in the upper extremities. This task is a complex motion that is affected by external stimuli, mood and health status. The complexity of this task is difficult to explain with a single average intertap-interval value (the time difference between successive taps), which only provides general information and neglects the temporal effects of the aforementioned factors. This study evaluated the time course of average intertap-interval values and the patterns of variation in both the right and left hands of right-handed subjects using a computer-based finger-tapping system. Cross-sectional study. Thirty-eight male individuals aged between 20 and 28 years (mean±SD = 22.24±1.65) participated in the study. Participants were asked to perform a single-finger tapping test over a 10-second test period. Only the results of the 35 right-handed (RH) participants were considered in this study. The test records the time of each tap and saves the data as the time difference between successive taps for further analysis. The average number of taps and the temporal fluctuation patterns of the intertap-intervals were calculated and compared. The variations in the intertap-interval were evaluated with the best curve fit method. An average tapping speed or tapping rate can reliably be defined for a single-finger tapping test by analysing the graphically presented data on the number of taps within the test period. However, a different presentation of the same data, namely the intertap-interval values, shows temporal variation as the number of taps increases. Curve fitting applications indicate that the variation has a biphasic nature. The measures obtained in this study reflect the complex nature of the finger-tapping task and are suggested to provide reliable information regarding hand performance.
Moreover, the equation reflects both the variations in and the general patterns associated with the task.
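The curve-fitting step can be sketched with a simple two-phase model (a fast exponential settling plus a slow linear drift); the functional form and the synthetic intervals below are assumptions for illustration, not the authors' exact equation or data.

```python
# Hedged sketch: fitting a biphasic curve to intertap intervals.
import numpy as np
from scipy.optimize import curve_fit

def biphasic(n, a, tau, b, c):
    # fast settling term + slow drift + baseline interval
    return a * np.exp(-n / tau) + b * n + c

n = np.arange(1, 61, dtype=float)                    # tap index
iti = 180 + 25 * np.exp(-n / 8) + 0.3 * n            # synthetic intervals, ms
popt, _ = curve_fit(biphasic, n, iti, p0=(20, 5, 0.1, 170))
print(round(popt[2], 1), round(popt[3], 0))          # drift and baseline
```

With real data the fitted drift term would capture fatigue-related slowing late in the trial, while the exponential captures the initial speed-up.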
Improving laboratory results turnaround time by reducing the pre-analytical phase.
Khalifa, Mohamed; Khalid, Parwaiz
2014-01-01
Laboratory turnaround time is considered one of the most important indicators of work efficiency in hospitals; physicians always need timely results to make effective clinical decisions, especially in the emergency department, where results can guide physicians on whether to admit patients to the hospital, discharge them home, or perform further investigations. A retrospective data analysis study was performed to identify the effects of training ER and lab staff on new routines for sample collection and transportation on the pre-analytical phase of turnaround time. Renal profile tests requested by the ER and performed in 2013 were selected as a sample, and data on 7,519 tests were retrieved and analyzed to compare turnaround time intervals before and after implementing the new routines. Results showed significant time reduction in the "Request to Sample Collection" and "Collection to In-Lab Delivery" time intervals, with less significant improvement in the analytical phase of the turnaround time.
Perin, Jamie; Walker, Neff
2015-01-01
Background: Recent steep declines in child mortality have been attributed in part to increased use of contraceptives and the resulting change in fertility behaviour, including an increase in the time between births. Previous observational studies have documented strong associations between short birth spacing and an increase in the risk of neonatal, infant, and under-five mortality, compared to births with longer preceding birth intervals. In this analysis, we compare two methods to estimate the association between short birth intervals and mortality risk to better inform modelling efforts linking family planning and mortality in children. Objectives: Our goal was to estimate the mortality risk for neonates, infants, and young children by preceding birth space using household survey data, controlling for mother-level factors, and to compare the results to those from previous analyses with survey data. Design: We assessed the potential for confounding when estimating the relative mortality risk by preceding birth interval and estimated mortality risk by birth interval in four categories: less than 18 months, 18–23 months, 24–35 months, and 36 months or longer. We estimated the relative risks among women who were 35 and older at the time of the survey with two methods: in a Cox proportional hazards regression adjusting for potential confounders, and also by stratifying the Cox regression by mother to control for all factors that remain constant over a woman's childbearing years. We estimated the overall effects for birth spacing in a meta-analysis with random survey effects. Results: We identified several factors known for their associations with neonatal, infant, and child mortality that are also associated with preceding birth interval. When estimating the effect of birth spacing on mortality, we found that regression adjustment for these factors does not substantially change the risk ratio for short birth intervals compared to an unadjusted mortality ratio.
For birth intervals less than 18 months, standard regression adjustment for confounding factors estimated a risk ratio for neonatal mortality of 2.28 (95% confidence interval: 2.18–2.37). This same effect estimated within mother is 1.57 (95% confidence interval: 1.52–1.63), a decline of almost one-third in the effect on neonatal mortality. Conclusions: Neonatal, infant, and child mortality are strongly and significantly related to preceding birth interval, where births within a short interval of time after the previous birth have increased mortality. Previous analyses have demonstrated this relationship on average across all births; however, women who have short spaces between births are different from women with long spaces. Among women 35 years and older, where a comparison of birth spaces within mother is possible, we find a much reduced although still significant effect of short birth spaces on child mortality. PMID:26562139
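The basic quantity being compared above is a risk ratio with a 95% confidence interval. A sketch of that calculation from 2x2 counts using the usual log-RR normal approximation; the counts are invented for illustration, not the survey data.

```python
# Sketch: risk ratio and 95% CI from event counts (log-RR approximation).
import math

def risk_ratio(ev_exposed, n_exposed, ev_ref, n_ref, z=1.96):
    rr = (ev_exposed / n_exposed) / (ev_ref / n_ref)
    # standard error of log(RR)
    se = math.sqrt(1 / ev_exposed - 1 / n_exposed + 1 / ev_ref - 1 / n_ref)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# e.g. neonatal deaths after <18-month vs >=36-month preceding intervals
rr, lo, hi = risk_ratio(90, 2000, 40, 2000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The paper's within-mother comparison changes which births enter the exposed and reference cells, which is why the adjusted and stratified ratios differ even though the arithmetic is the same.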
An empirical comparison of SPM preprocessing parameters to the analysis of fMRI data.
Della-Maggiore, Valeria; Chau, Wilkin; Peres-Neto, Pedro R; McIntosh, Anthony R
2002-09-01
We present the results from two sets of Monte Carlo simulations aimed at evaluating the robustness of some preprocessing parameters of SPM99 for the analysis of functional magnetic resonance imaging (fMRI). Statistical robustness was estimated by implementing parametric and nonparametric simulation approaches based on the images obtained from an event-related fMRI experiment. Simulated datasets were tested for combinations of the following parameters: basis function, global scaling, low-pass filter, high-pass filter and autoregressive modeling of serial autocorrelation. Based on single-subject SPM analysis, we derived the following conclusions that may serve as a guide for initial analysis of fMRI data using SPM99: (1) The canonical hemodynamic response function is a more reliable basis function to model the fMRI time series than HRF with time derivative. (2) Global scaling should be avoided since it may significantly decrease the power depending on the experimental design. (3) The use of a high-pass filter may be beneficial for event-related designs with fixed interstimulus intervals. (4) When dealing with fMRI time series with short interstimulus intervals (<8 s), the use of first-order autoregressive model is recommended over a low-pass filter (HRF) because it reduces the risk of inferential bias while providing a relatively good power. For datasets with interstimulus intervals longer than 8 seconds, temporal smoothing is not recommended since it decreases power. While the generalizability of our results may be limited, the methods we employed can be easily implemented by other scientists to determine the best parameter combination to analyze their data.
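The canonical HRF recommended above is conventionally modeled as a difference of two gamma densities. A sketch using commonly cited parameter values (response peak near 5 s, undershoot near 15 s); the exact SPM99 defaults may differ slightly.

```python
# Sketch: canonical double-gamma hemodynamic response function.
import numpy as np
from scipy.stats import gamma

def canonical_hrf(t, peak=6.0, under=16.0, ratio=6.0):
    # positive response minus a scaled-down undershoot
    h = gamma.pdf(t, peak) - gamma.pdf(t, under) / ratio
    return h / h.max()                       # normalize to unit peak

t = np.arange(0, 32, 0.5)                    # seconds
hrf = canonical_hrf(t)
print(round(float(t[np.argmax(hrf)]), 1))    # peak latency, ~5 s
```

In an SPM-style analysis this kernel is convolved with the stimulus onset train to build the design-matrix regressor that the time series is fit against.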
Prevalence, Risk Factors and In-hospital Outcomes of QTc Interval Prolongation in Liver Cirrhosis.
Zhao, Jiancheng; Qi, Xingshun; Hou, Feifei; Ning, Zheng; Zhang, Xintong; Deng, Han; Peng, Ying; Li, Jing; Wang, Xiaoxi; Li, Hongyu; Guo, Xiaozhong
2016-09-01
QTc interval prolongation is an electrocardiographic abnormality in liver cirrhosis. The objective of this study was to evaluate the prevalence, risk factors and in-hospital outcomes of QTc interval prolongation in Chinese patients with liver cirrhosis. This was a retrospective analysis of 1,268 patients with liver cirrhosis who were consecutively admitted to our hospital between January 2011 and June 2014. QTc interval data were collected from the medical records. QTc interval prolongation was defined as a QTc interval > 440 milliseconds. The prevalence of QTc interval prolongation was 38.2% (485 of 1,268). In the entire cohort, the risk factors for QTc interval prolongation included an older age; a higher proportion of alcohol abuse and ascites; higher bilirubin, blood urea nitrogen, creatinine, prothrombin time, international normalized ratio, Child-Pugh score and model for end-stage liver disease score; and lower red blood cell (RBC) count, hemoglobin (Hb), albumin (ALB), alanine aminotransferase and calcium. The in-hospital mortality was not significantly different between patients with and without QTc interval prolongation (2.1% versus 1.3%, P = 0.276). In the subgroup analyses of patients with hepatitis B virus or alcohol alone-related liver cirrhosis, the risk factors included higher bilirubin, creatinine, prothrombin time, international normalized ratio, Child-Pugh score and model for end-stage liver disease score, and lower RBC, Hb and ALB. In the subgroup analyses of patients with acute upper gastrointestinal bleeding or ascites, the risk factors included lower RBC, Hb and ALB. QTc interval prolongation was frequent in liver cirrhosis. Although QTc interval prolongation was positively associated with alcohol-related liver cirrhosis and more severe liver dysfunction, it did not significantly influence in-hospital mortality. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
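The study defines prolongation as QTc > 440 ms but the abstract does not state which rate-correction formula was used. A minimal sketch, assuming the common Bazett correction (QTc = QT / sqrt(RR), with RR in seconds):

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Rate-corrected QT interval in ms. Bazett's formula is an assumption
    here; the abstract does not specify the correction used."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def is_prolonged(qtc_ms, threshold_ms=440.0):
    """Prolongation as defined in the study: QTc > 440 ms."""
    return qtc_ms > threshold_ms
```

For example, a measured QT of 400 ms at an RR of 640 ms (heart rate ~94 bpm) corrects to 500 ms and would count as prolonged.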
Multifractal analysis and the NYHA index
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Ramírez-Hernández, L.; Aguilar-Molina, A. M.; Zamora-Justo, J. A.; Gutiérrez-Calleja, R. A.; Virgilio-González, C. D.
2014-11-01
We performed multifractal analysis of heartbeat interval time series from healthy persons and patients with congestive heart failure (CHF). To analyze circadian rhythm variations, we analyzed 24-hour records as well as six-hour segments recorded while the subject was asleep and six-hour segments recorded while awake. A decrease in the degree of multifractality occurs in the heartbeat interval time series of CHF patients. This loss of multifractality is associated with a narrowing of the spectrum and a loss of complexity of the signal. Multifractal spectra of healthy persons are right skewed, but the spectra of CHF patients tend to be symmetrical and in some cases are left skewed. To determine the therapy for CHF patients, cardiologists use an index proposed by the NYHA (New York Heart Association). There is a correlation between this index and the multifractal analysis parameters: the higher the NYHA index, the narrower and more symmetrical the multifractal spectrum. In contrast, patients with an NYHA index equal to 1 have multifractal parameters similar to those of healthy subjects.
Mengarelli, Alessandro; Cardarelli, Stefano; Verdini, Federica; Burattini, Laura; Fioretti, Sandro; Di Nardo, Francesco
2016-08-01
In this paper a graphical user interface (GUI) built in the MATLAB® environment is presented. This interactive tool has been developed for the analysis of surface electromyography (sEMG) signals and, in particular, for the assessment of muscle activation time intervals. After the signal import, the tool performs a first analysis in a fully user-independent way, providing a reliable computation of the muscular activation sequences. Furthermore, the user has the opportunity to modify each parameter of the on/off identification algorithm implemented in the tool. The user-friendly GUI allows immediate evaluation of the effect that modifying every single parameter has on the recognition of activation intervals, through real-time updating and visualization of the muscular activation/deactivation sequences. The possibility to accept the initial signal analysis or to modify the on/off identification for each considered signal, with real-time visual feedback, makes this GUI-based tool a valuable instrument in clinical and research applications, as well as from an educational perspective.
Cardiorespiratory dynamic response to mental stress: a multivariate time-frequency analysis.
Widjaja, Devy; Orini, Michele; Vlemincx, Elke; Van Huffel, Sabine
2013-01-01
Mental stress is a growing problem in our society. In order to deal with this, it is important to understand the underlying stress mechanisms. In this study, we aim to determine how the cardiorespiratory interactions are affected by mental arithmetic stress and attention. We conduct cross time-frequency (TF) analyses to assess the cardiorespiratory coupling. In addition, we introduce partial TF spectra to separate variations in the RR interval series that are linearly related to respiration from RR interval variations (RRV) that are not related to respiration. The performance of partial spectra is evaluated in two simulation studies. Time-varying parameters, such as instantaneous powers and frequencies, are derived from the computed spectra. Statistical analysis is carried out continuously in time to evaluate the dynamic response to mental stress and attention. The results show an increased heart and respiratory rate during stress and attention, compared to a resting condition. Also a fast reduction in vagal activity is noted. The partial TF analysis reveals a faster reduction of RRV power related to (3 s) than unrelated to (30 s) respiration, demonstrating that the autonomic response to mental stress is driven by mechanisms characterized by different temporal scales.
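A simplified, purely time-domain analogue of the partial-spectrum idea (separating RR-interval variations linearly related to respiration from the residual) is a least-squares projection of the RR series onto the respiration signal. This is only a stationary sketch of the concept, not the paper's time-frequency method:

```python
import numpy as np

def partition_rrv(rr, resp):
    """Split an RR-interval series into the component linearly related to the
    respiration signal (least-squares projection on [1, resp]) and the
    residual, respiration-unrelated component."""
    X = np.column_stack([np.ones(len(resp)), resp])
    coef, *_ = np.linalg.lstsq(X, rr, rcond=None)
    related = X @ coef            # best linear fit of RR from respiration
    return related, rr - related  # residual is orthogonal to respiration
```

By construction, the residual component carries no linear correlation with the respiration signal, mirroring the "unrelated to respiration" RRV component described above.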
LADABAUM, URI; FIORITTO, ANN; MITANI, AYA; DESAI, MANISHA; KIM, JANE P.; REX, DOUGLAS K.; IMPERIALE, THOMAS; GUNARATNAM, NARESH
2017-01-01
BACKGROUND & AIMS Accurate optical analysis of colorectal polyps (optical biopsy) could prevent unnecessary polypectomies or allow a “resect and discard” strategy with surveillance intervals determined based on the results of the optical biopsy; this could be less expensive than histopathologic analysis of polyps. We prospectively evaluated real-time optical biopsy analysis of polyps with narrow band imaging (NBI) by community-based gastroenterologists. METHODS We first analyzed a computerized module to train gastroenterologists (N = 13) in optical biopsy skills using photographs of polyps. Then we evaluated a practice-based learning program for these gastroenterologists (n = 12) that included real-time optical analysis of polyps in vivo, comparison of optical biopsy predictions to histopathologic analysis, and ongoing feedback on performance. RESULTS Twelve of 13 subjects identified adenomas with >90% accuracy at the end of the computer study, and 3 of 12 subjects did so with accuracy ≥90% in the in vivo study. Learning curves showed considerable variation among batches of polyps. For diminutive rectosigmoid polyps assessed with high confidence at the end of the study, adenomas were identified with mean (95% confidence interval [CI]) accuracy, sensitivity, specificity, and negative predictive values of 81% (73%–89%), 85% (74%–96%), 78% (66%–92%), and 91% (86%–97%), respectively. The adjusted odds ratio for high confidence as a predictor of accuracy was 1.8 (95% CI, 1.3–2.5). The agreement between surveillance recommendations informed by high-confidence NBI analysis of diminutive polyps and results from histopathologic analysis of all polyps was 80% (95% CI, 77%–82%). CONCLUSIONS In an evaluation of real-time optical biopsy analysis of polyps with NBI, only 25% of gastroenterologists assessed polyps with ≥90% accuracy. 
The negative predictive value for identification of adenomas, but not the surveillance interval agreement, met the American Society for Gastrointestinal Endoscopy–recommended thresholds for optical biopsy. Better results in community practice must be achieved before NBI-based optical biopsy methods can be used routinely to evaluate polyps; ClinicalTrials.gov number, NCT01638091. PMID:23041328
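The accuracy, sensitivity, specificity, and negative predictive values reported above all derive from a 2x2 confusion table comparing optical-biopsy predictions against histopathology. A small sketch of those definitions (the counts in the example are illustrative, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-accuracy metrics from 2x2 confusion counts
    (prediction vs. histopathology reference)."""
    return {
        "sensitivity": tp / (tp + fn),              # true positive rate
        "specificity": tn / (tn + fp),              # true negative rate
        "npv": tn / (tn + fn),                      # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```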
Analysis of vector wind change with respect to time for Vandenberg Air Force Base, California
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1978-01-01
A statistical analysis of the temporal variability of wind vectors at 1 km intervals from 0 to 27 km altitude is presented, based on a 10-year sample of twice-daily rawinsonde wind measurements over Vandenberg Air Force Base, California.
Angelova, Silvija; Ribagin, Simeon; Raikova, Rositsa; Veneva, Ivanka
2018-02-01
After a stroke, motor units stop working properly, and large, fast-twitch units are more frequently affected. Their impaired function can be investigated during dynamic tasks using electromyographic (EMG) signal analysis. The aim of this paper is to investigate changes in the parameters of the power/frequency function during elbow flexion between affected, non-affected, and healthy muscles. Fifteen healthy subjects and ten stroke survivors participated in the experiments. Electromyographic data from 6 muscles of the upper limbs during elbow flexion were filtered and normalized to the amplitudes of EMG signals during maximal isometric tasks. The moments when motion started and when the flexion angle reached its maximal value were identified. Equal intervals of 0.3407 s were defined between these two moments, and one additional interval before the start of the flexion (the first one) was added. For each of these intervals the power/frequency function of the EMG signals was calculated. The mean (MNF) and median (MDF) frequencies, the maximal power (MPw) and the area under the power function (APw) were calculated. MNF was always higher than MDF. A significant decrease in these frequencies was found in only three post-stroke survivors. The frequencies in the first time interval were nearly always the highest among all intervals. The maximal power was nearly zero during the first time interval and increased during the subsequent ones. The largest values of MPw and APw were found for the flexor muscles, and they were larger for the muscles of the affected arm than for the non-affected one of stroke survivors. Copyright © 2017 Elsevier Ltd. All rights reserved.
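MNF and MDF are standard spectral descriptors of the EMG power/frequency function: the power-weighted mean frequency and the frequency that splits spectral power in half. A minimal sketch of these definitions, assuming a plain FFT periodogram (the abstract does not specify the spectral estimator used):

```python
import numpy as np

def mnf_mdf(signal, fs):
    """Mean (MNF) and median (MDF) frequency of a signal's power spectrum,
    estimated from a simple FFT periodogram (an assumption)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power, freqs = power[1:], freqs[1:]               # drop the DC term
    mnf = np.sum(freqs * power) / np.sum(power)       # power-weighted mean
    cum = np.cumsum(power)
    mdf = freqs[np.searchsorted(cum, cum[-1] / 2.0)]  # splits power in half
    return mnf, mdf
```

For a narrowband signal the two coincide; for the right-skewed spectra typical of EMG, MNF exceeds MDF, consistent with the observation above.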
Lianhui, Yang; Meifei, Lian; Zhongyue, Hu; Yunzhi, Feng
2017-08-01
Objective The aim of this study is to evaluate the relationship between periodontitis and the risk of hyperlipidemia through meta-analysis. Methods Two researchers conducted an electronic search of the PubMed, Cochrane Library, Embase, CBM, CNKI, Wanfang and VIP databases through July 2016 for observational studies on the association between periodontitis and hyperlipidemia. The language was limited to Chinese and English. After data extraction and quality evaluation of the included trials, meta-analysis was conducted using the RevMan 5.3 software. The GRADE 3.6 software was used to evaluate the quality of the evidence. Results Six case-control studies and one cohort study were included. The meta-analysis showed that serum triglyceride (TG) in patients with periodontitis was significantly higher than in the periodontally healthy group (MD=50.50, 95% confidence interval=39.57-61.42, P<0.000 01), as was serum total cholesterol (TC) (MD=17.54, 95% confidence interval=10.91-24.18, P<0.000 01). Furthermore, the risks of elevated serum TG and TC in patients with chronic periodontitis were 4.73 times (OR=4.73, 95% confidence interval=2.74-8.17, P<0.000 01) and 3.62 times (OR=3.62, 95% confidence interval=2.18-6.03, P<0.000 01) those of periodontally healthy patients. No significant between-group difference was observed for high-density lipoprotein cholesterol (HDL-C) or low-density lipoprotein cholesterol (LDL-C). Conclusion Current evidence indicates that a correlation exists between chronic periodontitis and hyperlipidemia, and that chronic periodontitis is an independent risk factor for hyperlipidemia, especially for serum TC and TG.
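Pooled mean differences with 95% confidence intervals, as reported above, are commonly computed by inverse-variance weighting. A sketch of the fixed-effect version (RevMan also offers random-effects models; the numbers in the example are illustrative, not the included studies' data):

```python
import math

def pooled_md(effects, ses):
    """Fixed-effect inverse-variance pooling of per-study mean differences.
    Returns the pooled MD and its 95% CI (normal approximation)."""
    weights = [1.0 / se ** 2 for se in ses]      # weight = 1 / variance
    total = sum(weights)
    md = sum(w * e for w, e in zip(weights, effects)) / total
    se_pooled = math.sqrt(1.0 / total)
    return md, (md - 1.96 * se_pooled, md + 1.96 * se_pooled)
```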
Paleointensity Behavior and Intervals Between Geomagnetic Reversals in the Last 167 Ma
NASA Astrophysics Data System (ADS)
Kurazhkovskii, A. Yu.; Kurazhkovskaya, N. A.; Klain, B. I.
2018-01-01
The results of comparative analysis of the behavior of paleointensity and polarity (intervals between reversals) of the geomagnetic field for the last 167 Ma are presented. Similarities and differences in the behavior of these characteristics of the geomagnetic field are discussed. It is shown that bursts of paleointensity and long intervals between reversals occurred at high mean values of paleointensity in the Cretaceous and Paleogene. However, there are differences between the paleointensity behavior and the reversal regime: (1) the characteristic times of paleointensity variations are less than the characteristic times of the frequency of geomagnetic reversals, (2) the achievement of maximum values of paleointensity at the Cretaceous-Paleogene boundary and the termination of paleointensity bursts after the boundary of 45-40 Ma are not marked by explicit features in the geomagnetic polarity behavior.
NASA Technical Reports Server (NTRS)
Hausdorff, J. M.; Mitchell, S. L.; Firtion, R.; Peng, C. K.; Cudkowicz, M. E.; Wei, J. Y.; Goldberger, A. L.
1997-01-01
Fluctuations in the duration of the gait cycle (the stride interval) display fractal dynamics and long-range correlations in healthy young adults. We hypothesized that these stride-interval correlations would be altered by changes in neurological function associated with aging and certain disease states. To test this hypothesis, we compared the stride-interval time series of 1) healthy elderly subjects and young controls and of 2) subjects with Huntington's disease and healthy controls. Using detrended fluctuation analysis we computed alpha, a measure of the degree to which one stride interval is correlated with previous and subsequent intervals over different time scales. The scaling exponent alpha was significantly lower in elderly subjects compared with young subjects (elderly: 0.68 +/- 0.14; young: 0.87 +/- 0.15; P < 0.003). The scaling exponent alpha was also smaller in the subjects with Huntington's disease compared with disease-free controls (Huntington's disease: 0.60 +/- 0.24; controls: 0.88 +/- 0.17; P < 0.005). Moreover, alpha was linearly related to the degree of functional impairment in subjects with Huntington's disease (r = 0.78, P < 0.0005). These findings demonstrate that stride-interval fluctuations are more random (i.e., less correlated) in elderly subjects and in subjects with Huntington's disease. Abnormal alterations in the fractal properties of gait dynamics are apparently associated with changes in central nervous system control.
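The scaling exponent alpha above comes from detrended fluctuation analysis: integrate the series, detrend it in windows of size n, and fit log F(n) against log n. A compact generic sketch of the standard algorithm (not the authors' code; window sizes are illustrative):

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent alpha,
    the slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for n in scales:
        f2 = []
        for i in range(len(y) // n):           # non-overlapping windows of size n
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend within the window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))    # RMS fluctuation F(n)
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha
```

White noise gives alpha near 0.5 (uncorrelated), and a random walk gives alpha near 1.5, bracketing the long-range-correlated stride series (alpha between 0.5 and 1) reported above.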
Swords, Douglas S; Zhang, Chong; Presson, Angela P; Firpo, Matthew A; Mulvihill, Sean J; Scaife, Courtney L
2018-04-01
Time-to-surgery from cancer diagnosis has increased in the United States. We aimed to determine the association between time-to-surgery and oncologic outcomes in patients with resectable pancreatic ductal adenocarcinoma undergoing upfront surgery. The 2004-2012 National Cancer Database was reviewed for patients undergoing curative-intent surgery without neoadjuvant therapy for clinical stage I-II pancreatic ductal adenocarcinoma. A multivariable Cox model with restricted cubic splines was used to define time-to-surgery as short (1-14 days), medium (15-42), and long (43-120). Overall survival was examined using Cox shared frailty models. Secondary outcomes were examined using mixed-effects logistic regression models. Of 16,763 patients, time-to-surgery was short in 34.4%, medium in 51.6%, and long in 14.0%. More short time-to-surgery patients were young, privately insured, healthy, and treated at low-volume hospitals. Adjusted hazards of mortality were lower for medium (hazard ratio 0.94, 95% confidence interval, 0.90, 0.97) and long time-to-surgery (hazard ratio 0.91, 95% confidence interval, 0.86, 0.96) than short. There were no differences in adjusted odds of node positivity, clinical to pathologic upstaging, being unresectable or stage IV at exploration, and positive margins. Medium time-to-surgery patients had higher adjusted odds (odds ratio 1.11, 95% confidence interval, 1.03, 1.20) of receiving an adequate lymphadenectomy than short. Ninety-day mortality was lower in medium (odds ratio 0.75, 95% confidence interval, 0.65, 0.85) and long time-to-surgery (odds ratio 0.72, 95% confidence interval, 0.60, 0.88) than short. In this observational analysis, short time-to-surgery was associated with slightly shorter overall survival and higher perioperative mortality. These results may suggest that delays for medical optimization and referral to high-volume surgeons are safe. Published by Elsevier Inc.
Automated Interval velocity picking for Atlantic Multi-Channel Seismic Data
NASA Astrophysics Data System (ADS)
Singh, Vishwajit
2016-04-01
This paper describes the challenges in developing and testing a fully automated routine for measuring interval velocities from multi-channel seismic data. Several approaches are employed to build an interactive algorithm that picks interval velocities for 1000-5000 contiguous normal moveout (NMO) corrected gathers, replacing the interpreter's manual picking of coherent reflections. The detailed steps and pitfalls of picking interval velocities from seismic reflection time measurements are described for each approach. The key ingredients these approaches use at the velocity-analysis stage are the semblance grid and a starting model of interval velocity. Basin-hopping optimization is employed to drive the misfit function toward local minima. A SLiding-Overlapping Window (SLOW) algorithm is designed to mitigate the non-linearity and ill-posedness of the root-mean-square velocity problem. Synthetic case studies address the performance of the velocity picker, generating models that closely fit the semblance peaks. A similar linear relationship between average depth and reflection time for the synthetic model and the estimated models supports using the picked interval velocities as the starting model for full waveform inversion, to project a more accurate velocity structure of the subsurface. The remaining challenges are (1) building an accurate starting model for projecting a more accurate velocity structure of the subsurface, and (2) reducing the computational cost of the algorithm by pre-calculating the semblance grid to make automatic picking more feasible.
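A standard way to convert the RMS (stacking) velocities picked from semblance into layer interval velocities is the Dix equation; the abstract does not state that this exact conversion is used, so the sketch below is generic:

```python
import math

def dix_interval_velocities(t, v_rms):
    """Dix conversion: interval velocity of layer i from RMS velocities v_rms
    picked at zero-offset two-way times t,
    v_int[i] = sqrt((t[i]*v_rms[i]^2 - t[i-1]*v_rms[i-1]^2) / (t[i] - t[i-1]))."""
    v_int = [v_rms[0]]                # first layer: interval == RMS velocity
    for i in range(1, len(t)):
        num = t[i] * v_rms[i] ** 2 - t[i - 1] * v_rms[i - 1] ** 2
        v_int.append(math.sqrt(num / (t[i] - t[i - 1])))
    return v_int
```

The amplification of picking errors by the differencing in this formula is one source of the non-linearity and ill-posedness that the SLOW algorithm is designed to mitigate.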
Petersen, Christian C; Mistlberger, Ralph E
2017-08-01
The mechanisms that enable mammals to time events that recur at 24-h intervals (circadian timing) and at arbitrary intervals in the seconds-to-minutes range (interval timing) are thought to be distinct at the computational and neurobiological levels. Recent evidence that disruption of circadian rhythmicity by constant light (LL) abolishes interval timing in mice challenges this assumption and suggests a critical role for circadian clocks in short interval timing. We sought to confirm and extend this finding by examining interval timing in rats in which circadian rhythmicity was disrupted by long-term exposure to LL or by chronic intake of 25% D2O. Adult, male Sprague-Dawley rats were housed in a light-dark (LD) cycle or in LL until free-running circadian rhythmicity was markedly disrupted or abolished. The rats were then trained and tested on 15- and 30-sec peak-interval procedures, with water restriction used to motivate task performance. Interval timing was found to be unimpaired in LL rats, but a weak circadian activity rhythm was apparently rescued by the training procedure, possibly due to binge feeding that occurred during the 15-min water access period that followed training each day. A second group of rats in LL were therefore restricted to 6 daily meals scheduled at 4-h intervals. Despite a complete absence of circadian rhythmicity in this group, interval timing was again unaffected. To eliminate all possible temporal cues, we tested a third group of rats in LL by using a pseudo-randomized schedule. Again, interval timing remained accurate. Finally, rats tested in LD received 25% D2O in place of drinking water. This markedly lengthened the circadian period and caused a failure of LD entrainment but did not disrupt interval timing. These results indicate that interval timing in rats is resistant to disruption by manipulations of circadian timekeeping previously shown to impair interval timing in mice.
Trading Speed and Accuracy by Coding Time: A Coupled-circuit Cortical Model
Standage, Dominic; You, Hongzhi; Wang, Da-Hui; Dorris, Michael C.
2013-01-01
Our actions take place in space and time, but despite the role of time in decision theory and the growing acknowledgement that the encoding of time is crucial to behaviour, few studies have considered the interactions between neural codes for objects in space and for elapsed time during perceptual decisions. The speed-accuracy trade-off (SAT) provides a window into spatiotemporal interactions. Our hypothesis is that temporal coding determines the rate at which spatial evidence is integrated, controlling the SAT by gain modulation. Here, we propose that local cortical circuits are inherently suited to the relevant spatial and temporal coding. In simulations of an interval estimation task, we use a generic local-circuit model to encode time by ‘climbing’ activity, seen in cortex during tasks with a timing requirement. The model is a network of simulated pyramidal cells and inhibitory interneurons, connected by conductance synapses. A simple learning rule enables the network to quickly produce new interval estimates, which show signature characteristics of estimates by experimental subjects. Analysis of network dynamics formally characterizes this generic, local-circuit timing mechanism. In simulations of a perceptual decision task, we couple two such networks. Network function is determined only by spatial selectivity and NMDA receptor conductance strength; all other parameters are identical. To trade speed and accuracy, the timing network simply learns longer or shorter intervals, driving the rate of downstream decision processing by spatially non-selective input, an established form of gain modulation. Like the timing network's interval estimates, decision times show signature characteristics of those by experimental subjects. Overall, we propose, demonstrate and analyse a generic mechanism for timing, a generic mechanism for modulation of decision processing by temporal codes, and we make predictions for experimental verification. PMID:23592967
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis workflow.
The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Kozak, Oksana V; Sukach, Georgiy G; Korchinskaya, Oksana I; Trembach, Alexander M; Turicina, Viktoria L; Voit, Natalia U
2005-06-01
To assess the correlations between the first 131I activity value, the time interval between courses of radioiodine treatment, and the overall number of courses required for total destruction of lung metastases in patients with differentiated thyroid cancer with metastatic lesions in the lungs. 27 patients with differentiated thyroid cancer with lung metastases were treated with radioiodine after surgical intervention. Administered activities ranged from 1600 to 7980 MBq. The number of radioiodine courses before total ablation of all metastatic lesions ranged from 1 to 10. The time interval between the 1st and 2nd courses ranged from 3.5 to 11.5 months (6 months on average). A regression analysis of the data was performed. An exponential model fits the actual number of courses as a function of the first-second activity value and the time interval between courses. The first activity has a decisive influence on the number of courses required for total metastases ablation: the greater the first activity value, the smaller the overall number of courses. Increasing the time interval between the 1st and 2nd courses to 10 months appears to reduce the number of courses. Nevertheless, even with high activities the probability of requiring fewer than 3 courses is low. According to the proposed model, in thyroid cancer patients with lung metastases the first activity should be no less than 6000 MBq, and the time interval between treatments approximately 10 months. The results of our study suggest that individual factors such as histology and the number and size of metastases in lymph nodes contribute no more to the final outcome than the treatment variables, namely the first-second activity and the time interval, nor do they affect the hierarchy of the effects revealed for the treatment variables.
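An exponential regression of this kind can be sketched as a log-linear least-squares fit. The functional form and the coefficients in the example below are hypothetical illustrations, not the authors' fitted model:

```python
import numpy as np

def fit_exponential(courses, activity_mbq, interval_mo):
    """Log-linear least-squares fit of N = exp(b0 + b1*A + b2*T), an
    illustrative stand-in for the abstract's exponential model."""
    X = np.column_stack([np.ones(len(courses)), activity_mbq, interval_mo])
    coef, *_ = np.linalg.lstsq(X, np.log(courses), rcond=None)
    return coef

def predict_courses(coef, activity_mbq, interval_mo):
    """Predicted number of courses for a given first activity and interval."""
    return float(np.exp(coef[0] + coef[1] * activity_mbq + coef[2] * interval_mo))
```

With negative coefficients on activity and interval, the model reproduces the qualitative findings above: a larger first activity or a longer inter-course interval predicts fewer courses.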
Changes in the Hurst exponent of heartbeat intervals during physical activity
NASA Astrophysics Data System (ADS)
Martinis, M.; Knežević, A.; Krstačić, G.; Vargović, E.
2004-07-01
The fractal scaling properties of the heartbeat time series are studied in different controlled ergometric regimes using both the improved Hurst rescaled range (R/S) analysis and the detrended fluctuation analysis (DFA). The long-time “memory effect” quantified by the value of the Hurst exponent H>0.5 is found to increase during progressive physical activity in healthy subjects, in contrast to those having stable angina pectoris, where it decreases. The results are also supported by the detrended fluctuation analysis. We argue that this finding may be used as a useful new diagnostic parameter for short heartbeat time series.
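The Hurst exponent H is the slope of log(R/S) versus log(window length). A minimal sketch of the classical (un-improved) R/S estimator, as a rough illustration of what the paper's improved version refines:

```python
import numpy as np

def rescaled_range(w):
    """R/S statistic of a single window."""
    z = np.cumsum(w - np.mean(w))                    # cumulative deviation
    return (np.max(z) - np.min(z)) / np.std(w)       # range over std

def hurst_rs(x, scales=(16, 32, 64, 128, 256)):
    """Hurst exponent H: slope of log mean(R/S) versus log window length."""
    rs = []
    for n in scales:
        windows = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs.append(np.mean([rescaled_range(w) for w in windows]))
    h, _ = np.polyfit(np.log(scales), np.log(rs), 1)
    return h
```

Uncorrelated noise gives H near 0.5, while persistent ("memory") series give H > 0.5, the regime whose increase or decrease during exercise is the diagnostic signal discussed above.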
Parameter Transient Behavior Analysis on Fault Tolerant Control System
NASA Technical Reports Server (NTRS)
Belcastro, Christine (Technical Monitor); Shin, Jong-Yeob
2003-01-01
In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. This paper illustrates analysis of a FTC system based on estimated fault parameter transient behavior which may include false fault detections during a short time interval. Using Lyapunov function analysis, the upper bound of an induced-L2 norm of the FTC system performance is calculated as a function of a fault detection time and the exponential decay rate of the Lyapunov function.
A novel implementation of homodyne time interval analysis method for primary vibration calibration
NASA Astrophysics Data System (ADS)
Sun, Qiao; Zhou, Ling; Cai, Chenguang; Hu, Hongbo
2011-12-01
In this paper, the shortcomings of the conventional homodyne time interval analysis (TIA) method, and their causes, are described with respect to its software algorithm and hardware implementation; on this basis, a simplified TIA method is proposed with the help of virtual instrument technology. Equipped with an ordinary Michelson interferometer and a dual-channel synchronous data acquisition card, a primary vibration calibration system using the simplified method can accurately measure the complex sensitivity of accelerometers, meeting the uncertainty requirements laid down in the pertinent ISO standard. The validity and accuracy of the simplified TIA method are verified by simulation and comparison experiments, and its performance is analyzed. This simplified method is recommended for national metrology institutes of developing countries and for industrial primary vibration calibration laboratories because of its simplified algorithm and low hardware requirements.
A subharmonic dynamical bifurcation during in vitro epileptiform activity
NASA Astrophysics Data System (ADS)
Perez Velazquez, Jose L.; Khosravani, Houman
2004-06-01
Epileptic seizures are considered to result from a sudden change in the synchronization of firing neurons in brain neural networks. We have used an in vitro model of status epilepticus (SE) to characterize dynamical regimes underlying the observed seizure-like activity. Time intervals between spikes or bursts were used as the variable to construct first-return interpeak or interburst interval plots, for studying neuronal population activity during the transition to seizure, as well as within seizures. Return maps constructed for a brief epoch before seizures were used for approximating the local system dynamics during that time window. Analysis of the first-return maps suggests that intermittency is a dynamical regime underlying the observed epileptic activity. This type of analysis may be useful for understanding the collective dynamics of neuronal populations in the normal and pathological brain.
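The first-return plots described above pair each interval with the one that follows it. A tiny sketch of constructing interspike (or interburst) intervals from event times and forming their return map:

```python
def intervals_from_events(times):
    """Inter-event (interspike or interburst) intervals from ordered event times."""
    return [b - a for a, b in zip(times[:-1], times[1:])]

def first_return_map(intervals):
    """Pairs (I_n, I_n+1) for a first-return (interpeak/interburst) plot."""
    return list(zip(intervals[:-1], intervals[1:]))
```

Plotting these pairs for a brief epoch before a seizure approximates the local dynamics in that time window, as done in the analysis above.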
Robust stability of interval bidirectional associative memory neural network with time delays.
Liao, Xiaofeng; Wong, Kwok-wo
2004-04-01
In this paper, the conventional bidirectional associative memory (BAM) neural network with signal transmission delay is intervalized in order to study the bounded effect of deviations in network parameters and external perturbations. The resulting model is referred to as a novel interval dynamic BAM (IDBAM) model. By combining a number of different Lyapunov functionals with the Razumikhin technique, some sufficient conditions for the existence of a unique equilibrium and for robust stability are derived. These results are fairly general and can be verified easily. Going further, we extend our investigation to the time-varying delay case and derive robust stability criteria for BAM with perturbations of time-varying delays. Moreover, our approach to the analysis allows us to consider several different types of activation functions, including piecewise linear sigmoids with bounded activations as well as the usual C1-smooth sigmoids. We believe that the results obtained are of significance in the design and application of BAM neural networks.
ERIC Educational Resources Information Center
Hidalgo, Antonio J.; Otero, Jose
2004-01-01
This paper addresses the concept of geological time as used by students who face tasks that demand three types of skills: to locate events in time, to order them according to time calendar, and to manage time intervals. The empirical study consisted of asking high school students as well as technical school students to carry out tasks that…
A validation of ground ambulance pre-hospital times modeled using geographic information systems
2012-01-01
Background Evaluating geographic access to health services often requires determining the patient travel time to a specified service. For urgent care, many research studies have modeled patient pre-hospital time by ground emergency medical services (EMS) using geographic information systems (GIS). The purpose of this study was to determine if the modeling assumptions proposed through prior United States (US) studies are valid in a non-US context, and to use the resulting information to provide revised recommendations for modeling travel time using GIS in the absence of actual EMS trip data. Methods The study sample contained all emergency adult patient trips within the Calgary area for 2006. Each record included four components of pre-hospital time (activation, response, on-scene and transport interval). The actual activation and on-scene intervals were compared with those used in published models. The transport interval was calculated within GIS using the Network Analyst extension of Esri ArcGIS 10.0 and the response interval was derived using previously established methods. These GIS derived transport and response intervals were compared with the actual times using descriptive methods. We used the information acquired through the analysis of the EMS trip data to create an updated model that could be used to estimate travel time in the absence of actual EMS trip records. Results There were 29,765 complete EMS records for scene locations inside the city and 529 outside. The actual median on-scene intervals were longer than the average previously reported by 7–8 minutes. Actual EMS pre-hospital times across our study area were significantly higher than the estimated times modeled using GIS and the original travel time assumptions. Our updated model, although still underestimating the total pre-hospital time, more accurately represents the true pre-hospital time in our study area. 
Conclusions The widespread use of generalized EMS pre-hospital time assumptions based on US data may not be appropriate in a non-US context. The preference for researchers should be to use actual EMS trip records from the proposed research study area. In the absence of EMS trip data researchers should determine which modeling assumptions more accurately reflect the EMS protocols across their study area. PMID:23033894
When Human Walking is a Random Walk
NASA Astrophysics Data System (ADS)
Hausdorff, J. M.
1998-03-01
The complex, hierarchical locomotor system normally does a remarkable job of controlling an inherently unstable, multi-joint system. Nevertheless, the stride interval --- the duration of a gait cycle --- fluctuates from one stride to the next, even under stationary conditions. We used random walk analysis to study the dynamical properties of these fluctuations under normal conditions and how they change with disease and aging. Random walk analysis of the stride-to-stride fluctuations of healthy, young adult men surprisingly reveals a self-similar pattern: fluctuations at one time scale are statistically similar to those at multiple other time scales (Hausdorff et al, J Appl Physiol, 1995). To study the stability of this fractal property, we analyzed data obtained from healthy subjects who walked for 1 hour at their usual pace, as well as at slower and faster speeds. The stride interval fluctuations exhibited long-range correlations with power-law decay for up to a thousand strides at all three walking rates. In contrast, during metronomically-paced walking, these long-range correlations disappeared; variations in the stride interval were uncorrelated and non-fractal (Hausdorff et al, J Appl Physiol, 1996). To gain insight into the mechanism(s) responsible for this fractal property, we examined the effects of aging and neurological impairment. Using detrended fluctuation analysis (DFA), we computed α, a measure of the degree to which one stride interval is correlated with previous and subsequent intervals over different time scales. α was significantly lower in healthy elderly subjects compared to young adults (p < .003) and in subjects with Huntington's disease, a neuro-degenerative disorder of the central nervous system, compared to disease-free controls (p < 0.005) (Hausdorff et al, J Appl Physiol, 1997). α was also significantly related to degree of functional impairment in subjects with Huntington's disease (r=0.78).
Recently, we have observed that just as there are changes in α with aging, there are also changes with development. Apparently, the fractal scaling of walking does not become mature until children are eleven years old. Conclusions: The fractal dynamics of spontaneous stride interval fluctuations are normally quite robust and are apparently intrinsic to the healthy adult locomotor system. However, alterations in this fractal scaling property are associated with impairment in central nervous system control, aging and neural development.
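The DFA computation of α referred to above can be sketched compactly: integrate the mean-centred series, detrend it in boxes of each scale, and fit the log-log slope of the fluctuation function. This is a minimal illustration on synthetic data with assumed scale choices, not the authors' implementation.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: return the scaling exponent alpha.

    Integrate the mean-centred series, split the profile into
    non-overlapping boxes of each scale, remove a linear trend in
    each box, and fit log F(n) vs log n.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    F = []
    for n in scales:
        nseg = len(y) // n
        msq = []
        for i in range(nseg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)        # local linear trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
white = rng.normal(size=4000)                   # uncorrelated stand-in series
scales = [16, 32, 64, 128, 256]
a_white = dfa_alpha(white, scales)              # expected near 0.5
a_walk = dfa_alpha(np.cumsum(white), scales)    # integrated noise, near 1.5
```

Uncorrelated series give α near 0.5 and strongly persistent (random-walk-like) series give values near 1.5, which is the sense in which lower α in the patient groups indicates reduced long-range correlation.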
Bester, Rachelle; Jooste, Anna E C; Maree, Hans J; Burger, Johan T
2012-09-27
Grapevine leafroll-associated virus 3 (GLRaV-3) is the main contributing agent of leafroll disease worldwide. Four of the six known GLRaV-3 variant groups have been found in South Africa, but their individual contributions to leafroll disease are unknown. In order to study the pathogenesis of leafroll disease, a sensitive and accurate diagnostic assay is required that can detect the different variant groups of GLRaV-3. In this study, a one-step real-time RT-PCR followed by high-resolution melting (HRM) curve analysis was developed for the simultaneous detection and identification of GLRaV-3 variants of groups I, II, III and VI. A melting point confidence interval for each variant group was calculated to include at least 90% of all melting points observed. A multiplex RT-PCR protocol was developed to detect these four variant groups in order to assess the efficacy of the real-time RT-PCR HRM assay. A universal primer set for GLRaV-3, targeting the heat shock protein 70 homologue (Hsp70h) gene, was designed that is able to detect GLRaV-3 variant groups I, II, III and VI and differentiate between them with high-resolution melting curve analysis. The real-time RT-PCR HRM and the multiplex RT-PCR were optimized using 121 GLRaV-3-positive samples. Due to considerable variation in the melting profiles observed within each GLRaV-3 group, a confidence interval above 90% was calculated for each variant group, based on the range and distribution of melting points. The intervals of groups I and II could not be distinguished, and a 95% joint confidence interval was calculated for the simultaneous detection of group I and II variants. An additional primer pair targeting GLRaV-3 ORF1a was developed that can be used in a subsequent real-time RT-PCR HRM to differentiate between variants of groups I and II. Additionally, the multiplex RT-PCR successfully validated 94.64% of the infections detected with the real-time RT-PCR HRM.
The real-time RT-PCR HRM provides a sensitive, automated and rapid tool to detect and differentiate different variant groups in order to study the epidemiology of leafroll disease.
Semiparametric regression analysis of failure time data with dependent interval censoring.
Chen, Chyong-Mei; Shen, Pao-Sheng
2017-09-20
Interval-censored failure-time data arise when subjects are examined or observed periodically, such that the failure time of interest is not observed exactly but is known only to be bracketed between two adjacent observation times. The commonly used approaches assume that the examination times and the failure time are independent or conditionally independent given covariates. In many practical applications, patients who are already in poor health or have a weak immune system before treatment tend to visit physicians more often after treatment than those with better health or immune systems. In this situation, the visiting rate is positively correlated with the risk of failure due to health status, resulting in dependent interval-censored data. While some measurable factors affecting health status, such as age, gender, and physical symptoms, can be included in the covariates, some health-related latent variables cannot be observed or measured. To deal with dependent interval censoring involving an unobserved latent variable, we characterize the visiting/examination process as a recurrent event process and propose a joint frailty model to account for the association between the failure time and the visiting process. A shared gamma frailty is incorporated multiplicatively into the Cox model for the failure time and the proportional intensity model for the visiting process. We propose a semiparametric maximum likelihood approach for estimating model parameters and show the asymptotic properties, including consistency and weak convergence. Extensive simulation studies are conducted, and a bladder cancer data set is analyzed for illustrative purposes. Copyright © 2017 John Wiley & Sons, Ltd.
Oscillatory dynamics of an intravenous glucose tolerance test model with delay interval
NASA Astrophysics Data System (ADS)
Shi, Xiangyun; Kuang, Yang; Makroglou, Athena; Mokshagundam, Sriprakash; Li, Jiaxu
2017-11-01
Type 2 diabetes mellitus (T2DM) has become a prevalent pandemic disease in view of the modern lifestyle. Both the diabetic population and the associated health expenses grow rapidly, according to the American Diabetes Association. Detecting the potential onset of T2DM is an essential focal point in diabetes research. The intravenous glucose tolerance test (IVGTT) is an effective protocol to determine insulin sensitivity, glucose effectiveness, and pancreatic β-cell functionality through the analysis and parameter estimation of a proper differential equation model. Delay differential equations have been used to study complex physiological phenomena, including glucose and insulin regulation. In this paper, we propose a novel approach to modeling the time delay in the IVGTT. This approach uses two parameters to simulate not only a discrete time delay and a delay distributed over the whole past interval, but also a time delay distributed over a past sub-interval. Normally, a larger time delay, whether discrete or distributed, destabilizes the system; however, we find that a time delay over a sub-interval might not. We present analytically some basic model properties that are desirable both biologically and mathematically. We show that this relatively simple model provides a good fit to fluctuating patient data sets and reveals some intriguing dynamics. Moreover, our numerical simulation results indicate that our model may remove a defect of the well-known Minimal Model, which often overestimates the glucose effectiveness index.
Asthma and school commuting time.
McConnell, Rob; Liu, Feifei; Wu, Jun; Lurmann, Fred; Peters, John; Berhane, Kiros
2010-08-01
This study examined associations of asthma with school commuting time. Time on likely school commute route was used as a proxy for on-road air pollution exposure among 4741 elementary school children at enrollment into the Children's Health Study. Lifetime asthma and severe wheeze (including multiple attacks, nocturnal, or with shortness of breath) were reported by parents. In asthmatic children, severe wheeze was associated with commuting time (odds ratio, 1.54 across the 9-minute 5% to 95% exposure distribution; 95% confidence interval, 1.01 to 2.36). The association was stronger in analysis restricted to asthmatic children with commuting times 5 minutes or longer (odds ratio, 1.97; 95% confidence interval, 1.02 to 3.77). No significant associations were observed with asthma prevalence. Among asthmatics, severe wheeze was associated with relatively short school commuting times. Further investigation of effects of on-road pollutant exposure is warranted.
Stephen, Julia M; Ranken, Doug F; Aine, Cheryl J
2006-01-01
The sensitivity of visual areas to different temporal frequencies, as well as the functional connections between these areas, was examined using magnetoencephalography (MEG). Alternating circular sinusoids (0, 3.1, 8.7 and 14 Hz) were presented to foveal and peripheral locations in the visual field to target ventral and dorsal stream structures, respectively. It was hypothesized that higher temporal frequencies would preferentially activate dorsal stream structures. To determine the effect of frequency on the cortical response we analyzed the late time interval (220-770 ms) using a multi-dipole spatio-temporal analysis approach to provide source locations and timecourses for each condition. As an exploratory aspect, we performed cross-correlation analysis on the source timecourses to determine which sources responded similarly within conditions. Contrary to predictions, dorsal stream areas were not activated more frequently during high temporal frequency stimulation. However, across cortical sources the frequency-following response showed a difference, with significantly higher power at the second harmonic for the 3.1 and 8.7 Hz stimulation and at the first and second harmonics for the 14 Hz stimulation with this pattern seen robustly in area V1. Cross-correlations of the source timecourses showed that both low- and high-order visual areas, including dorsal and ventral stream areas, were significantly correlated in the late time interval. The results imply that frequency information is transferred to higher-order visual areas without translation. Despite the less complex waveforms seen in the late interval of time, the cross-correlation results show that visual, temporal and parietal cortical areas are intricately involved in late-interval visual processing.
Definition of Readmission in 3,041 Patients Undergoing Hepatectomy
Brudvik, Kristoffer W; Mise, Yoshihiro; Conrad, Claudius; Zimmitti, Giuseppe; Aloia, Thomas A; Vauthey, Jean-Nicolas
2015-01-01
Background Readmission rates of 9.7%–15.5% after hepatectomy have been reported. These rates are difficult to interpret due to variability in the time interval used to monitor readmission. The aim of this study was to refine the definition of readmission after hepatectomy. Study Design A prospectively maintained database of 3041 patients who underwent hepatectomy from 1998 through 2013 was merged with the hospital registry to identify readmissions. Area under the curve (AUC) analysis was used to determine the time interval that best captured unplanned readmission. Results Readmission rates at 30 days, 90 days, and 1 year after discharge were 10.7% (n = 326), 17.3% (n = 526), and 31.9% (n = 971) respectively. The time interval that best accounted for unplanned readmissions was 45 days after discharge (AUC, 0.956; p < 0.001), during which 389 patients (12.8%) were readmitted (unplanned: n = 312 [10.3%]; planned: n = 77 [2.5%]). In comparison, the 30 days after surgery interval (used in the ACS-NSQIP database) omitted 65 (26.3%) unplanned readmissions. Multivariate analysis revealed the following risk factors for unplanned readmission: diabetes (odds ratio [OR], 1.6; p = 0.024), right hepatectomy (OR, 2.1; p = 0.034), bile duct resection (OR, 1.9; p = 0.034), abdominal complication (OR, 1.8; p = 0.010), and a major postoperative complication (OR, 2.4; p < 0.001). Neither index hospitalization > 7 days nor postoperative hepatobiliary complications were independently associated with readmission. Conclusions To accurately assess readmission after hepatectomy, patients should be monitored 45 days after discharge. PMID:26047760
Definition of Readmission in 3,041 Patients Undergoing Hepatectomy.
Brudvik, Kristoffer W; Mise, Yoshihiro; Conrad, Claudius; Zimmitti, Giuseppe; Aloia, Thomas A; Vauthey, Jean-Nicolas
2015-07-01
Readmission rates of 9.7% to 15.5% after hepatectomy have been reported. These rates are difficult to interpret due to variability in the time interval used to monitor readmission. The aim of this study was to refine the definition of readmission after hepatectomy. A prospectively maintained database of 3,041 patients who underwent hepatectomy from 1998 through 2013 was merged with the hospital registry to identify readmissions. Area under the curve (AUC) analysis was used to determine the time interval that best captured unplanned readmission. Readmission rates at 30 days, 90 days, and 1 year after discharge were 10.7% (n = 326), 17.3% (n = 526), and 31.9% (n = 971) respectively. The time interval that best accounted for unplanned readmissions was 45 days after discharge (AUC, 0.956; p < 0.001), during which 389 patients (12.8%) were readmitted (unplanned: n = 312 [10.3%]; planned: n = 77 [2.5%]). In comparison, the 30 days after surgery interval (used in the ACS-NSQIP database) omitted 65 (26.3%) unplanned readmissions. Multivariate analysis revealed the following risk factors for unplanned readmission: diabetes (odds ratio [OR] 1.6; p = 0.024), right hepatectomy (OR 2.1; p = 0.034), bile duct resection (OR 1.9; p = 0.034), abdominal complication (OR 1.8; p = 0.010), and a major postoperative complication (OR 2.4; p < 0.001). Neither index hospitalization > 7 days nor postoperative hepatobiliary complications were independently associated with readmission. To accurately assess readmission after hepatectomy, patients should be monitored 45 days after discharge. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Moore, Hannah E; Adam, Craig D; Drijfhout, Falko P
2013-03-01
Previous studies on Diptera have shown the potential of cuticular hydrocarbon analysis for determining larval age and hence the postmortem interval (PMI) for an associated cadaver. In this work, hydrocarbon compounds, extracted daily until pupation from the cuticle of the blowfly Lucilia sericata (Diptera: Calliphoridae), were analyzed using gas chromatography-mass spectrometry (GC-MS). The results show distinguishing features within the hydrocarbon profile over the larval life cycle, with significant chemical changes occurring from the younger larvae to the postfeeding larvae. Further interpretation of the chromatograms using principal component analysis revealed a strong correlation between the magnitudes of particular principal components and time. This outcome suggests that, under the conditions of this study, the cuticular hydrocarbons evolve in a systematic fashion with time, supporting the potential of GC-MS analysis as a tool for establishing PMI where this species is present. © 2012 American Academy of Forensic Sciences.
Study of temperature distributions in wafer exposure process
NASA Astrophysics Data System (ADS)
Lin, Zone-Ching; Wu, Wen-Jang
During the exposure process of photolithography, the wafer absorbs the exposure energy, which results in a rising temperature and thermal expansion. This phenomenon was often neglected due to its limited effect in previous process generations; in the new generation of processes, however, it may very likely become a factor to be considered. In this paper, a finite element model for analyzing the transient behavior of the wafer temperature distribution during exposure was established under the assumption that the wafer is clamped by a vacuum chuck without warpage. The model is capable of simulating the wafer temperature distribution under different exposure conditions. The analysis begins with the simulation of transient behavior in a single exposure region, and then investigates how the wafer temperature distribution responds under continuous exposure to variations in exposure energy, in the interval between exposure locations, and in the interval of exposure time. The simulation results indicate that widening the interval between exposure locations has a greater impact on improving the wafer temperature distribution than extending the interval of exposure time between neighboring image fields. Moreover, as long as the distance between the field centers of two neighboring exposure regions exceeds a straight-line distance of three image-field widths, the interacting thermal effect during wafer exposure can be ignored. The analysis flow proposed in this paper can serve as a supporting reference for engineers planning exposure paths.
Contraction frequency after administration of misoprostol in obese versus nonobese women.
Stefely, Erin; Warshak, Carri R
2018-04-30
To examine the impact of obesity on contraction frequency following misoprostol administration. Our hypothesis is that an increased volume of distribution reduces the bioavailability of misoprostol and may explain its reduced efficacy in obese women. We examined contraction frequency as a surrogate marker for the bioavailability of misoprostol. We compared the rate of contractions at five time intervals in 313 subjects, prior to administration and at four intervals post administration, comparing the number of contractions in obese versus nonobese women. As a planned secondary analysis, we used a repeated-measures analysis to compare the rate of change in contractions per hour over the 5-hour window at the four post-administration intervals, controlling for race (White versus non-White) and parity (primiparous versus multiparous). General linear models and repeated-measures analyses were conducted to report the parameter estimates, least square means, differences of least square means, and p values. Nonobese women presented with more contractions at baseline, 7 ± 5 versus 4 ± 5 c/h, p < .001. At all four time intervals after misoprostol administration, obese women had fewer contractions per hour, and their contraction frequency increased at a lower rate over the course of all four hours. The least squares means estimates (c/h) were: first hour (-0.87), p = .08; second hour (-2.43), p = .01; third hour (-1.80), p = .96; and fourth hour (-2.98), p = .007. Obese women have a lower rate of contractions per hour at baseline and at four intervals after misoprostol administration. In addition, the rate of increase in contractions per hour was also reduced in obese versus nonobese women.
This suggests a lower bioavailability of misoprostol in women with a larger volume of distribution which would likely impact the efficacy of misoprostol in obese women when given the same dose of misoprostol. It is unknown if higher misoprostol dosing would increase efficacy of misoprostol in obese women.
Detrended fluctuation analysis of non-stationary cardiac beat-to-beat interval of sick infants
NASA Astrophysics Data System (ADS)
Govindan, Rathinaswamy B.; Massaro, An N.; Al-Shargabi, Tareq; Niforatos Andescavage, Nickie; Chang, Taeun; Glass, Penny; du Plessis, Adre J.
2014-11-01
We performed detrended fluctuation analysis (DFA) of cardiac beat-to-beat intervals (RRis) collected from sick newborn infants over 1-4 day periods. We calculated four different metrics from the DFA fluctuation function: the DFA exponents αL (>40 beats, up to one-fourth of the record length) and αS (15-30 beats), the root-mean-square (RMS) fluctuation on a short time scale (20-50 beats), and the RMS fluctuation on a long time scale (110-150 beats). Except for αL, all metrics clearly distinguished two groups of newborn infants (favourable vs. adverse) with well-characterized outcomes. However, the RMS fluctuations distinguished the two groups more consistently over time than αS, and distinguished the RRi of the two groups earlier than the DFA exponent did. In all three measures, the favourable outcome group displayed higher values, indicating a higher magnitude of (auto-)correlation and variability, and thus normal physiology, compared to the adverse outcome group.
Allan deviation analysis of financial return series
NASA Astrophysics Data System (ADS)
Hernández-Pérez, R.
2012-05-01
We perform a scaling analysis of the return series of different financial assets by applying the Allan deviation (ADEV), a quantity used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been demonstrated to be robust for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening price series for assets from different markets spanning around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease following an approximate scaling relation up to a point that differs for almost each asset, after which the ADEV deviates from scaling; this suggests that clustering, long-range dependence, and non-stationarity signatures in the series drive the results at large observation intervals.
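The non-overlapping Allan deviation used above has a short closed form: average the series in consecutive blocks of length m and take half the mean squared difference of adjacent block means. A minimal sketch follows; the synthetic i.i.d. "returns" stand in for real price data and are an assumption of this illustration.

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapping Allan deviation at averaging window m samples.

    sigma^2(m) = 0.5 * mean( (ybar_{k+1} - ybar_k)^2 ),
    where ybar_k are means of consecutive length-m blocks.
    """
    x = np.asarray(x, dtype=float)
    nblk = len(x) // m
    ybar = x[:nblk * m].reshape(nblk, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

rng = np.random.default_rng(1)
returns = rng.normal(size=8192)     # i.i.d. stand-in for daily returns
adev = [allan_deviation(returns, m) for m in (1, 4, 16, 64)]
# For an uncorrelated series, ADEV falls off roughly as m**-0.5
```

For a truly uncorrelated series the ADEV decays as m^(-1/2); departures from that slope at larger m are the signatures of clustering and long-range dependence discussed above.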
Nandini, Suresh; Ballal, Suma; Kandaswamy, Deivanayagam
2007-02-01
The prolonged setting time of mineral trioxide aggregate (MTA) is the main disadvantage of this material. This study analyzes the influence of glass-ionomer cement on the setting of MTA using laser Raman spectroscopy (LRS). MTA was placed in forty hollow glass molds. In Group I specimens, the MTA was layered with glass-ionomer cement after 45 minutes; the same procedure was applied to Groups II and III at 4 hours and 3 days, respectively. No glass ionomer was added in Group IV, which served as the control. Each sample was scanned at various time intervals, and at each time interval the interface between the MTA and the glass-ionomer cement was also scanned (excluding Group IV). The spectral analysis showed that placement of glass-ionomer cement over MTA after 45 minutes did not affect its setting reaction, and that calcium salts may be formed at the interface of the two materials.
Time-dependent limited penetrable visibility graph analysis of nonstationary time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong
2017-06-01
Recent years have witnessed the development of visibility graph theory, which allows a time series to be analyzed from the perspective of a complex network. In this paper, we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows us to characterize time-varying behaviors and to classify the heart states of healthy, congestive heart failure, and atrial fibrillation subjects from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of the node degree of TDLPVGs effectively uncovers the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render the TDLPVG method particularly powerful for characterizing the time-varying features underlying realistic complex systems from time series.
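The "limited penetrable" variant of the visibility graph relaxes the usual visibility criterion: two samples are linked if at most L intermediate samples rise above the straight line joining them (L = 0 recovers the ordinary visibility graph). The brute-force sketch below illustrates that criterion on a toy series; the time-dependent windowing of the TDLPVG is omitted, and the series values are invented for illustration.

```python
import numpy as np

def lpvg_edges(y, L=1):
    """Edge set of a limited penetrable visibility graph.

    Samples (i, y_i) and (j, y_j) are linked if at most L
    intermediate samples lie on or above the straight line
    joining them. O(n^3) brute force, fine for short series.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            blocked = 0
            for k in range(i + 1, j):
                line = y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                if y[k] >= line:          # this sample obstructs the view
                    blocked += 1
            if blocked <= L:
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0, 1.5, 4.0]
e0 = lpvg_edges(series, L=0)   # ordinary visibility graph
e1 = lpvg_edges(series, L=1)   # one penetrable obstruction allowed
assert e0 <= e1                # relaxing L can only add edges
```

Node-degree statistics of such graphs, computed over sliding windows, are the quantities the TDLPVG approach tracks over time.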
Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich
2009-02-10
Multivariate analysis of interval-censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models that additionally include random effects is not available at all. Existing algorithms pose practical problems: costly matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used here to implement hierarchical models for interval-censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval-censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available on CRAN. The proposed software supports complex analyses in many fields of clinical epidemiology as well as health services research.
The Smoking Habits of Three U. S. Newsmagazines: Surgeon General Be Damned?
ERIC Educational Resources Information Center
Tsien, Ay-Ling; Ostman, Ronald E.
A trend analysis was conducted to determine the characteristics of news articles, editorials, and advertisements about tobacco that appeared in the magazines "Newsweek,""Time," and "U. S. News and World Report." Nine time periods in three intervals were studied: 1959-1961-1963, 1965-1967-1969, and 1973-1975-1977. An…
Effect of Variations in IRU Integration Time Interval On Accuracy of Aqua Attitude Estimation
NASA Technical Reports Server (NTRS)
Natanson, G. A.; Tracewell, Dave
2003-01-01
During Aqua launch support, attitude analysts noticed several anomalies in Onboard Computer (OBC) rates and in rates computed by the ground Attitude Determination System (ADS). These included: 1) periodic jumps in the OBC pitch rate every 2 minutes; 2) spikes in the ADS pitch rate every 4 minutes; 3) close agreement between pitch rates computed by the ADS and those derived from telemetered OBC quaternions (in contrast to the step-wise pattern observed for telemetered OBC rates); 4) spikes of +/- 10 milliseconds in telemetered IRU integration time every 4 minutes (despite the fact that the telemetered time tags of any two sequential IRU measurements were always 1 second apart). An analysis presented in the paper explains this anomalous behavior by a small average offset of about 0.5 +/- 0.05 microseconds in the time interval between two sequential accumulated angle measurements. It is shown that errors in the estimated pitch angle due to the OBC neglecting the aforementioned variations in the integration time interval are within +/- 2 arcseconds. Ground attitude solutions are found to be accurate enough to see the effect of the variations on the accuracy of the estimated pitch angle.
Daniels, Carter W; Sanabria, Federico
2017-03-01
The distribution of latencies and interresponse times (IRTs) of rats was compared between two fixed-interval (FI) schedules of food reinforcement (FI 30 s and FI 90 s), and between two levels of food deprivation. Computational modeling revealed that latencies and IRTs were well described by mixture probability distributions embodying two-state Markov chains. Analysis of these models revealed that only a subset of latencies is sensitive to the periodicity of reinforcement, and prefeeding only reduces the size of this subset. The distribution of IRTs suggests that behavior in FI schedules is organized in bouts that lengthen and ramp up in frequency with proximity to reinforcement. Prefeeding slowed down the lengthening of bouts and increased the time between bouts. When concatenated, latency and IRT models adequately reproduced sigmoidal FI response functions. These findings suggest that behavior in FI schedules fluctuates in and out of schedule control; an account of such fluctuation suggests that timing and motivation are dissociable components of FI performance. These mixture-distribution models also provide novel insights on the motivational, associative, and timing processes expressed in FI performance. These processes may be obscured, however, when performance in timing tasks is analyzed in terms of mean response rates.
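The bout-organized account of IRTs described above can be illustrated with a simple generative sketch: each inter-response time is drawn from either a fast within-bout exponential or a slow between-bout exponential, according to a mixing weight. The rates and weight below are invented for illustration, not the fitted values from the paper.

```python
import numpy as np

def sample_irts(n, p_bout=0.7, within_rate=5.0, between_rate=0.2, seed=0):
    """Sample inter-response times from a two-state mixture model.

    With probability p_bout an IRT comes from a fast within-bout
    exponential (rate within_rate); otherwise from a slow
    between-bout exponential (rate between_rate). All parameter
    values here are illustrative, not fitted.
    """
    rng = np.random.default_rng(seed)
    state = rng.random(n) < p_bout
    fast = rng.exponential(1.0 / within_rate, size=n)
    slow = rng.exponential(1.0 / between_rate, size=n)
    return np.where(state, fast, slow)

irts = sample_irts(10000)
# Mixture mean = 0.7 * (1/5.0) + 0.3 * (1/0.2) = 1.64 s
```

Making `p_bout` and the rates depend on elapsed time in the interval, as a two-state Markov chain, would reproduce the ramping-up of bouts with proximity to reinforcement described above.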
Evaluating the efficiency of environmental monitoring programs
Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina
2014-01-01
Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.
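The bootstrapping approach used for the forest-inventory question (how the confidence interval for a mean tightens as sampling intensity grows) can be sketched as follows. This is a percentile bootstrap on hypothetical data, which may differ in detail from the authors' procedure:

```python
import random
from statistics import mean

def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap confidence interval for the mean.
    Resamples the data with replacement n_boot times and takes the
    alpha/2 and 1-alpha/2 quantiles of the resampled means."""
    rng = random.Random(seed)
    boots = sorted(
        mean(rng.choices(data, k=len(data))) for _ in range(n_boot)
    )
    lo = boots[int(n_boot * alpha / 2)]
    hi = boots[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

Repeating the computation on subsamples of different sizes shows directly how much sampling effort is needed to reach a target interval width.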
Parameter identification for structural dynamics based on interval analysis algorithm
NASA Astrophysics Data System (ADS)
Yang, Chen; Lu, Zixing; Yang, Zhenyu; Liang, Ke
2018-04-01
A parameter identification method for structural dynamics based on an interval analysis algorithm is presented in this paper. The proposed uncertain identification method is investigated using the central difference method and an ARMA system. With the help of the fixed-memory least squares method and the matrix inversion lemma, a set-membership identification technique is applied to obtain the best estimate of the identified parameters within a tight, accurate region. To compensate for the lack of sufficient statistical description of the uncertain parameters, this paper treats uncertainties as non-probabilistic intervals. As long as the bounds of the uncertainties are known, the algorithm yields not only the central estimates of the parameters but also the bounds of the errors. To improve the efficiency of the proposed method, a time-saving recursive formulation is presented. Finally, to verify the accuracy of the proposed method, two numerical examples are presented and evaluated using three identification criteria.
Sun, J
1995-09-01
In this paper we discuss the non-parametric estimation of a distribution function based on incomplete data for which the measurement origin of a survival time, or the date of enrollment in a study, is known only to belong to an interval. The survival time of interest is itself observed from a truncated distribution and is likewise known only to lie in an interval. To estimate the distribution function, a simple self-consistency algorithm is proposed, generalizing Turnbull's (1976, Journal of the Royal Statistical Society, Series B 38, 290-295) self-consistency algorithm. The method is then used to analyze two AIDS cohort studies, for which direct use of the EM algorithm (Dempster, Laird and Rubin, 1977, Journal of the Royal Statistical Society, Series B 39, 1-38), which is computationally complicated, has previously been the usual method of analysis.
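A minimal sketch of the self-consistency iteration for interval-censored observations is below. It covers only the censoring part (each event time known to lie in a closed interval over a fixed grid of candidate support points); the paper's generalization to truncated data is not handled here:

```python
def turnbull(intervals, support, n_iter=200):
    """Self-consistency (EM-type) estimate of the probability mass on
    each candidate support point, given interval-censored observations.
    Each observation is a (left, right) pair meaning the event time is
    only known to lie in that closed interval. Illustrative sketch of
    the basic iteration, not the paper's full algorithm."""
    n, m = len(intervals), len(support)
    # member[i][j] = 1 if support[j] lies inside observation i's interval
    member = [[1 if l <= s <= r else 0 for s in support]
              for l, r in intervals]
    p = [1.0 / m] * m                      # start from the uniform estimate
    for _ in range(n_iter):
        new = [0.0] * m
        for row in member:
            # distribute each observation's unit mass over its interval,
            # proportionally to the current estimate
            denom = sum(pj * a for pj, a in zip(p, row))
            for j, a in enumerate(row):
                if a:
                    new[j] += p[j] / denom
        p = [v / n for v in new]           # renormalize to a distribution
    return p
```

With exact (degenerate) intervals the iteration reduces to the empirical distribution, which is a useful sanity check.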
Kersten, Paula; White, Peter J; Tennant, Alan
2014-01-01
Pain visual analogue scales (VAS) are commonly used in clinical trials and are often treated as an interval level scale without evidence that this is appropriate. This paper examines the internal construct validity and responsiveness of the pain VAS using Rasch analysis. Patients (n = 221, mean age 67, 58% female) with chronic stable joint pain (hip 40% or knee 60%) of mechanical origin waiting for joint replacement were included. Pain was scored on seven daily VASs. Rasch analysis was used to examine fit to the Rasch model. Responsiveness (Standardized Response Means, SRM) was examined on the raw ordinal data and the interval data generated from the Rasch analysis. Baseline pain VAS scores fitted the Rasch model, although 15 aberrant cases impacted on unidimensionality. There was some local dependency between items but this did not significantly affect the person estimates of pain. Daily pain (item difficulty) was stable, suggesting that single measures can be used. Overall, the SRMs derived from ordinal data overestimated the true responsiveness by 59%. Changes over time at the lower and higher end of the scale were represented by large jumps in interval equivalent data points; in the middle of the scale the reverse was seen. The pain VAS is a valid tool for measuring pain at one point in time. However, the pain VAS does not behave linearly and SRMs vary along the trait of pain. Consequently, Minimum Clinically Important Differences using raw data, or change scores in general, are invalid as these will either under- or overestimate true change; raw pain VAS data should not be used as a primary outcome measure or to inform parametric-based Randomised Controlled Trial power calculations in research studies; and Rasch analysis should be used to convert ordinal data to interval data prior to data interpretation.
NASA Astrophysics Data System (ADS)
Abidi, Oussama; Inoubli, Mohamed Hédi; Sebei, Kawthar; Amiri, Adnen; Boussiga, Haifa; Nasr, Imen Hamdi; Salem, Abdelhamid Ben; Elabed, Mahmoud
2017-05-01
The Maastrichtian-Paleocene El Haria formation was studied and defined in Tunisia on the basis of outcrops and borehole data; few studies, however, have addressed its three-dimensional extent. In this paper, the El Haria formation is reviewed in the context of a tectono-stratigraphic interval using an integrated seismic stratigraphic analysis based on borehole lithology logs, electrical well logging, well shots, vertical seismic profiles and post-stack surface data. The seismic analysis benefits from appropriate calibration with borehole data, conventional interpretation, velocity mapping, seismic attributes and post-stack model-based inversion. The applied methodology proved to be powerful for characterizing the marly Maastrichtian-Paleocene interval of the El Haria formation. Migrated seismic sections together with borehole measurements are used to detail the three-dimensional changes in thickness, facies and depositional environment in the Cap Bon and Gulf of Hammamet regions during Maastrichtian-Paleocene time. Furthermore, dating based on microfossil content reveals local and multiple internal hiatuses within the El Haria formation that are related to the geodynamic evolution of the depositional floor since the Campanian stage. Interpreted seismic sections display concordance, unconformities, pinchouts, sedimentary gaps, incised valleys and syn-sedimentary normal faulting. Based on the seismic reflection geometry and terminations, seven sequences are delineated. These sequences are related to base-level changes resulting from the combination of depositional floor paleo-topography, tectonic forces, subsidence and the available accommodation space. These factors controlled the occurrence of the various parts of the Maastrichtian-Paleocene interval.
Detailed examinations of these deposits together with the analysis of the structural deformation at different time periods allowed us to obtain a better understanding of the sediment architecture in depth and the delineation of the geodynamic evolution of the region.
Albumin treatment regimen for type 1 hepatorenal syndrome: a dose-response meta-analysis.
Salerno, Francesco; Navickis, Roberta J; Wilkes, Mahlon M
2015-11-25
Recommended treatment for type 1 hepatorenal syndrome consists of albumin and vasoconstrictor. The optimal albumin dose remains poorly characterized. This meta-analysis aimed to determine the impact of albumin dose on treatment outcomes. Clinical studies of type 1 hepatorenal syndrome treatment with albumin and vasoconstrictor were sought. Search terms included: hepatorenal syndrome; albumin; vasoconstrictor; terlipressin; midodrine; octreotide; noradrenaline; and norepinephrine. A meta-analysis was performed of hepatorenal syndrome reversal and survival in relation to albumin dose. Nineteen clinical studies with 574 total patients were included, comprising 8 randomized controlled trials, 8 prospective studies and 3 retrospective studies. The pooled percentage of patients achieving hepatorenal syndrome reversal was 49.5% (95% confidence interval, 40.0-59.1%). Increments of 100 g in cumulative albumin dose were accompanied by significantly increased survival (hazard ratio, 1.15; 95% confidence interval, 1.02-1.31; p = 0.023). A non-significant increase of similar magnitude in hepatorenal syndrome reversal was also observed (odds ratio, 1.15; 95% confidence interval, 0.97-1.37; p = 0.10). Expected survival rates at 30 days among patients receiving cumulative albumin doses of 200, 400 and 600 g were 43.2% (95% confidence interval, 36.4-51.3%), 51.4% (95% confidence interval, 46.3-57.1%) and 59.0% (95% confidence interval, 51.9-67.2%), respectively. Neither survival nor hepatorenal syndrome reversal was significantly affected by vasoconstrictor dose or type, treatment duration, age, baseline serum creatinine, bilirubin or albumin, baseline mean arterial pressure, or study design, size or time period. This meta-analysis suggests a dose-response relationship between infused albumin and survival in patients with type 1 hepatorenal syndrome.
The meta-analysis provides the best current evidence on the potential role of albumin dose selection in improving outcomes of treatment for type 1 HRS and furnishes guidance for the design of future dose-ranging studies.
Triatomine Infestation in Guatemala: Spatial Assessment after Two Rounds of Vector Control
Manne, Jennifer; Nakagawa, Jun; Yamagata, Yoichi; Goehler, Alexander; Brownstein, John S.; Castro, Marcia C.
2012-01-01
In 2000, the Guatemalan Ministry of Health initiated a Chagas disease program to control Rhodnius prolixus and Triatoma dimidiata by periodic house spraying with pyrethroid insecticides. We characterized infestation patterns and analyzed the contribution of programmatic practices to these patterns. Spatial infestation patterns at three time points were identified using the Getis-Ord Gi*(d) test. Logistic regression was used to assess predictors of reinfestation after pyrethroid insecticide administration. Spatial analysis showed high and low clusters of infestation at all three time points. After two rounds of spraying, 178 communities persistently fell within high-infestation clusters. A time lapse between rounds of vector control greater than 6 months was associated with 1.54 (95% confidence interval = 1.07–2.23) times increased odds of reinfestation after first spray, whereas a time lapse of greater than 1 year was associated with 2.66 (95% confidence interval = 1.85–3.83) times increased odds of reinfestation after first spray, compared with localities where the time lapse was less than 180 days. The time lapse between rounds of vector control should remain under 1 year. Spatial analysis can guide targeted vector control efforts by enabling tracking of reinfestation hotspots and improved targeting of resources. PMID:22403315
Schiffman, Jeffrey M; Chelidze, David; Adams, Albert; Segala, David B; Hasselquist, Leif
2009-09-18
Linking human mechanical work to physiological work for the purpose of developing a model of physical fatigue is a complex problem that cannot be solved easily by conventional biomechanical analysis. The purpose of the study was to determine whether two nonlinear analysis methods can address the fundamental issue of utilizing kinematic data to track oxygen consumption from a prolonged walking trial; we evaluated the effectiveness of dynamical systems analysis and fractal analysis. Further, we selected oxygen consumption as the measure representing the underlying physiology of fatigue. Three male US Army Soldier volunteers (means: 23.3 yr; 1.80 m; 77.3 kg) walked for 120 min at 1.34 m/s with a 40-kg load on a level treadmill. Gait kinematic data and oxygen consumption (VO(2)) data were collected over the 120-min period. For the fractal analysis, we calculated the fractal dimension from stride-interval data. For the dynamical systems analysis, kinematic angle time series were used to estimate phase-space-warping-based features at uniform time intervals; smooth orthogonal decomposition (SOD) was used to extract slowly time-varying trends from these features. Estimated fractal dimensions showed no apparent trend or correlation with independently measured VO(2). While inter-individual differences did exist in the VO(2) data, dominant SOD time trends tracked and correlated with VO(2) for all volunteers. Thus, dynamical systems analysis using gait kinematics may be suitable for developing a model to predict physiological fatigue based on biomechanical work.
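The abstract does not name the fractal estimator applied to the stride-interval series; Higuchi's method is one common choice for such data and is sketched here for illustration:

```python
import math

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D series. For each scale k the
    mean normalized curve length L(k) is computed over k offset
    subseries; the FD is the slope of log L(k) versus log(1/k).
    A straight line gives FD = 1, white noise gives FD close to 2."""
    n = len(x)
    logk, logl = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                       # offset of the subseries
            n_seg = (n - 1 - m) // k             # number of k-steps available
            if n_seg < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_seg + 1))
            # Higuchi's normalization of the curve length at scale k
            lengths.append(dist * (n - 1) / (n_seg * k * k))
        logk.append(math.log(1.0 / k))
        logl.append(math.log(sum(lengths) / len(lengths)))
    # least-squares slope of log L(k) against log(1/k)
    mk = sum(logk) / len(logk)
    ml = sum(logl) / len(logl)
    num = sum((a - mk) * (b - ml) for a, b in zip(logk, logl))
    den = sum((a - mk) ** 2 for a in logk)
    return num / den
```

Applied to a window of stride intervals, the estimate can then be tracked over the walking trial, as the fractal branch of the study did.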
Tawfik, Ahmed M; Razek, Ahmed A; Elhawary, Galal; Batouty, Nihal M
2014-01-01
To evaluate the effect of increasing the sampling interval from 1 second (1 image per second) to 2 seconds (1 image every 2 seconds) on computed tomographic (CT) perfusion (CTP) of head and neck tumors. Twenty patients underwent CTP studies of head and neck tumors with images acquired in cine mode for 50 seconds using a sampling interval of 1 second. Using deconvolution-based software, CTP analysis was done with a sampling interval of 1 second and then 2 seconds. Perfusion maps representing blood flow, blood volume, mean transit time, and permeability surface area product (PS) were obtained. Quantitative tumor CTP values were compared between the 2 sampling intervals. Two blinded radiologists compared the subjective quality of CTP maps between the 2 sampling intervals using a 3-point scale. Radiation dose parameters were recorded for the 2 sampling intervals. No significant differences were observed between the means of the 4 perfusion parameters generated using the 2 sampling intervals; all P > 0.05. The 95% limits of agreement between the 2 sampling intervals were -65.9 to 48.1 mL/min per 100 g for blood flow, -3.6 to 3.1 mL/100 g for blood volume, -2.9 to 3.8 seconds for mean transit time, and -10.0 to 12.5 mL/min per 100 g for PS. There was no significant difference between the subjective quality scores of CTP maps obtained using the 2 sampling intervals; all P > 0.05. Radiation dose was halved when the sampling interval increased from 1 to 2 seconds. Increasing the sampling interval to 1 image every 2 seconds does not compromise image quality and has no significant effect on quantitative perfusion parameters of head and neck tumors, while halving the radiation dose.
Brasme, Jean-Francois; Grill, Jacques; Doz, Francois; Lacour, Brigitte; Valteau-Couanet, Dominique; Gaillard, Stephan; Delalande, Olivier; Aghakhani, Nozar; Puget, Stéphanie; Chalumeau, Martin
2012-01-01
Background: The long time to diagnosis of medulloblastoma, one of the most frequent brain tumors in children, is the source of painful remorse and sometimes lawsuits. We analyzed its consequences for tumor stage, survival, and sequelae. Patients and Methods: This retrospective population-based cohort study included all cases of pediatric medulloblastoma from a region of France between 1990 and 2005. We collected the demographic, clinical, and tumor data and analyzed the relations between the interval from symptom onset until diagnosis, initial disease stage, survival, and neuropsychological and neurological outcome. Results: The median interval from symptom onset until diagnosis for the 166 cases was 65 days (interquartile range 31–121, range 3–457). A long interval (defined as longer than the median) was associated with a lower frequency of metastasis in the univariate and multivariate analyses, and with a larger tumor volume, desmoplastic histology, and longer survival in the univariate analysis but not after adjustment for confounding factors. The time to diagnosis was significantly associated with IQ score among survivors. No significant relation was found between the time to diagnosis and neurological disability. In the 62 patients with metastases, a long prediagnosis interval was associated with a higher T stage, infiltration of the fourth ventricle floor, and incomplete surgical resection; it nonetheless did not influence survival significantly in this subgroup. Conclusions: We found complex and often inverse relations between the time to diagnosis of medulloblastoma in children and initial severity factors, survival, and neuropsychological and neurological outcome. This interval appears due more to the nature of the tumor and its progression than to parental or medical factors. These conclusions should be taken into account in the information provided to parents and in expert assessments produced for malpractice claims. PMID:22485143
Richards, Kyle A; Ham, Sandra; Cohn, Joshua A; Steinberg, Gary D
2016-01-01
To determine the time to bladder cancer diagnosis from initial infection-like symptoms and its impact on cancer outcomes. Using Surveillance, Epidemiology and End Results-Medicare, we designed a retrospective cohort study identifying beneficiaries aged ≥ 66 years diagnosed with bladder cancer from 2007 to 2009. Patients were required to have a hematuria or urinary tract infection claim within 1 year of bladder cancer diagnosis (n = 21 216), and have 2 years of prior Medicare data (n = 18 956) without any precedent hematuria, bladder cancer or urinary tract infection claims (n = 12 195). The number of days to bladder cancer diagnosis was measured, as well as the impact of sex and presenting symptom on time to diagnosis, pathology, and oncological outcomes. The mean time to bladder cancer diagnosis was 72.2 days in women versus 58.9 days in men (P < 0.001). A logistic regression model identified the greatest predictors of ≥ pT2 pathology were both women (odds ratio 2.08, 95% confidence interval 1.70-2.55) and men (odds ratio 1.71, 95% confidence interval 1.49-1.97) presenting with urinary tract infection. Cox proportional hazards analysis identified an increased risk of mortality from bladder cancer and all causes in women presenting with urinary tract infection (hazard ratio 1.37, 95% confidence interval 1.10-1.71, and hazard ratio 1.47, 95% confidence interval 1.28-1.69) compared with women with hematuria. Women have a longer interval from urinary tract infection to diagnosis of bladder cancer. Urinary tract infection presentation can adversely affect time to diagnosis, pathology and survival. Time to diagnosis seems not to be an independent predictor of bladder cancer outcomes. © 2015 The Japanese Urological Association.
Bashir, Muhammad Mustehsan; Qayyum, Rehan; Saleem, Muhammad Hammad; Siddique, Kashif; Khan, Farid Ahmad
2015-08-01
To determine the optimal time interval between tumescent local anesthesia infiltration and the start of hand surgery without a tourniquet for improved operative field visibility. Patients aged 16 to 60 years who needed contracture release and tendon repair in the hand were enrolled from the outpatient clinic. Patients were randomized to 10-, 15-, or 25-minute intervals between tumescent anesthetic solution infiltration (0.18% lidocaine and 1:221,000 epinephrine) and the start of surgery. The end point of tumescent anesthetic infiltration was pale and firm skin. The surgical team was blinded to the time of anesthetic infiltration. At the completion of the procedure, the surgeon and the first assistant rated the operative field visibility as excellent, fair, or poor. We used logistic regression models without and with adjustment for confounding variables. Of the 75 patients enrolled in the study, 59 (79%) were males; 7 were randomized to the 10-minute interval (further randomization to this group was stopped after an interim analysis found consistently poor operative field visibility), and 34 were randomized to each of the 15- and 25-minute groups. Patients randomized to the 25-minute delay group had 29 times higher odds of having an excellent operative visual field than those randomized to the 15-minute delay group. After adjusting for age, sex, amount of tumescent solution infiltrated, and duration of operation, the odds ratio remained highly significant. We found that an interval of 25 minutes provides vastly superior operative field visibility; the 10-minute delay had the poorest results. Therapeutic I. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Tang, Zhongwen
2015-01-01
An analytical method for computing the predictive probability of success (PPOS) together with a credible interval at interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes account of the data fixed up to the IA, the amount of uncertainty in future data, and uncertainty about the parameters. Predictive power is a special case of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
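A Monte-Carlo sketch of PPOS at interim is given below, under a standard Brownian-motion approximation for the group-sequential test statistic and a flat prior on the drift. This illustrates the concept only; it is not the paper's analytical formula:

```python
import math, random

def ppos(z_interim, info_frac, z_crit=1.96, n_sim=20000, seed=3):
    """Monte-Carlo predictive probability of success: B(t) is the
    score process at information fraction t, the drift is drawn from
    its flat-prior posterior N(B(t)/t, 1/t), the remaining increment
    is N(drift*(1-t), 1-t), and success means the final statistic
    B(1) exceeds z_crit. Illustrative sketch only."""
    rng = random.Random(seed)
    b_t = z_interim * math.sqrt(info_frac)      # interim value of B(t)
    wins = 0
    for _ in range(n_sim):
        drift = rng.gauss(b_t / info_frac, math.sqrt(1.0 / info_frac))
        incr = rng.gauss(drift * (1 - info_frac),
                         math.sqrt(1 - info_frac))
        if b_t + incr > z_crit:                 # final z-statistic = B(1)
            wins += 1
    return wins / n_sim
```

Scanning such a function over candidate analysis times and futility cutoffs is one way to explore the optimal-design question the abstract raises.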
NASA Astrophysics Data System (ADS)
Cherednichenko, A. V.; Cherednichenko, A. V.; Cherednichenko, V. S.
2018-01-01
A significant connection is shown to exist between the most important harmonics, extracted by harmonic analysis of precipitation time series over river catchment areas, and the amount of runoff. This allowed us to forecast runoff for periods of up to 20 years, assuming that the main parameters of the harmonics persist over the forecast interval. Results of such forecasts for three river basins of Kazakhstan are presented.
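The forecasting idea, fitting the dominant harmonics of the record and assuming they persist over the forecast interval, can be sketched as a least-squares harmonic fit. Here the dominant periods are assumed to have been identified already by the harmonic analysis:

```python
import numpy as np

def harmonic_forecast(series, periods, horizon):
    """Fit a mean plus sin/cos terms at the given periods by least
    squares, then extrapolate `horizon` steps ahead under the
    assumption that the fitted harmonics persist. Illustrative
    sketch, not the authors' code."""
    n = len(series)
    t = np.arange(n)
    cols = [np.ones(n)]
    for p in periods:
        w = 2 * np.pi * t / p
        cols += [np.cos(w), np.sin(w)]          # one cos/sin pair per period
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, np.asarray(series, float), rcond=None)
    tf = np.arange(n, n + horizon)              # future time steps
    colsf = [np.ones(horizon)]
    for p in periods:
        w = 2 * np.pi * tf / p
        colsf += [np.cos(w), np.sin(w)]
    return np.column_stack(colsf) @ coef
```

With a noise-free periodic input the extrapolation reproduces the true continuation exactly, which is a convenient correctness check.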
Qualitative Research in the CJA/RCV: An 18-Year Analysis (1995-2012).
Humble, Áine M; Green, Maureen
2016-03-01
Some researchers have suggested that qualitative research is increasing in the gerontology field, but little systematic analysis has tested this assertion. Using the Canadian Journal on Aging/La Revue canadienne du vieillissement as a case study, we analysed articles reporting on original research from 1995 to 2012. One in four articles was qualitative, and results grouped in three-year intervals show a clear increase in qualitative research findings over this 18-year time frame: (a) 1995-1997: 10 per cent; (b) 1998-2000: 19 per cent; (c) 2001-2003: 25 per cent; (d) 2004-2006: 25 per cent; (e) 2007-2009: 29 per cent; and (f) 2010-2012: 43 per cent. In all time intervals (with the exception of 2004-2006), French-language articles were more likely to use a qualitative research design than English-language articles. Topics, methodologies, and data collection strategies are also discussed.
Timescale- and Sensory Modality-Dependency of the Central Tendency of Time Perception.
Murai, Yuki; Yotsumoto, Yuko
2016-01-01
When individuals are asked to reproduce intervals of stimuli that are intermixedly presented at various times, longer intervals are often underestimated and shorter intervals overestimated. This phenomenon may be attributed to the central tendency of time perception, and suggests that our brain optimally encodes a stimulus interval based on current stimulus input and prior knowledge of the distribution of stimulus intervals. Two distinct systems are thought to be recruited in the perception of sub- and supra-second intervals. Sub-second timing is subject to local sensory processing, whereas supra-second timing depends on more centralized mechanisms. To clarify the factors that influence time perception, the present study investigated how both sensory modality and timescale affect the central tendency. In Experiment 1, participants were asked to reproduce sub- or supra-second intervals, defined by visual or auditory stimuli. In the sub-second range, the magnitude of the central tendency was significantly larger for visual intervals compared to auditory intervals, while visual and auditory intervals exhibited a correlated and comparable central tendency in the supra-second range. In Experiment 2, the ability to discriminate sub-second intervals in the reproduction task was controlled across modalities by using an interval discrimination task. Even when the ability to discriminate intervals was controlled, visual intervals exhibited a larger central tendency than auditory intervals in the sub-second range. In addition, the magnitude of the central tendency for visual and auditory sub-second intervals was significantly correlated. These results suggest that a common modality-independent mechanism is responsible for the supra-second central tendency, and that both the modality-dependent and modality-independent components of the timing system contribute to the central tendency in the sub-second range.
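The optimal-encoding account of the central tendency reduces, for a Gaussian prior and Gaussian sensory noise, to shrinkage of the noisy measurement toward the prior mean; larger sensory noise (as for sub-second visual intervals) means more shrinkage. A minimal sketch with illustrative values:

```python
def reproduce(interval, prior_mean, prior_var, noise_var):
    """Least-squares Bayesian estimate of a timed interval: the noisy
    measurement is weighted against the prior mean. The weight on the
    measurement falls as sensory noise grows, producing a stronger
    central tendency. All numeric values used are illustrative."""
    w = prior_var / (prior_var + noise_var)   # weight on the measurement
    return w * interval + (1 - w) * prior_mean
```

Comparing the output for two noise levels reproduces the qualitative result above: the noisier modality's reproductions sit closer to the prior mean.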
Mattos, A Z; Mattos, A A
Many different non-invasive methods have been studied with the purpose of staging liver fibrosis. The objective of this study was to verify whether transient elastography is superior to the aspartate aminotransferase to platelet ratio index for staging fibrosis in patients with chronic hepatitis C. A systematic review with meta-analysis of studies which evaluated both non-invasive tests and used biopsy as the reference standard was performed. A random-effects model was used, anticipating heterogeneity among studies. The diagnostic odds ratio was the main effect measure, and summary receiver operating characteristic curves were created. A sensitivity analysis was planned, in which the meta-analysis would be repeated excluding one study at a time. Eight studies were included in the meta-analysis. Regarding the prediction of significant fibrosis, transient elastography and the aspartate aminotransferase to platelet ratio index had diagnostic odds ratios of 11.70 (95% confidence interval = 7.13-19.21) and 8.56 (95% confidence interval = 4.90-14.94), respectively. Concerning the prediction of cirrhosis, transient elastography and the aspartate aminotransferase to platelet ratio index had diagnostic odds ratios of 66.49 (95% confidence interval = 23.71-186.48) and 7.47 (95% confidence interval = 4.88-11.43), respectively. In conclusion, there was no evidence of significant superiority of transient elastography over the aspartate aminotransferase to platelet ratio index regarding the prediction of significant fibrosis, but the former proved to be better than the latter concerning the prediction of cirrhosis.
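Random-effects pooling of per-study log diagnostic odds ratios can be sketched with the DerSimonian-Laird estimator, a common choice for this kind of meta-analysis (the inputs below are hypothetical, not the study's data):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effects
    (e.g., log diagnostic odds ratios) with known within-study
    variances. Returns the pooled effect and its standard error.
    Illustrative sketch of the standard estimator."""
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    ws = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * e for wi, e in zip(ws, effects)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    return pooled, se
```

Exponentiating `pooled` and `pooled ± 1.96*se` recovers the pooled diagnostic odds ratio and its 95% confidence interval on the original scale.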
Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju
2017-01-01
To quantitate the meditation experience is a subjective and complex issue because it is confounded by many factors such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: artificial neural network and support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and the shifting interval of the window (the time interval by which the analyzed data are shifted), the effect of immediate analysis for the 2 methods is compared. This analysis system is performed on 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior to senior. After exhaustive calculation and cross-validation across all variables, an accuracy rate >98% is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In short, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds in the decision of meditative classification. Our proposed classifier of the meditation experience promotes a rapid evaluation system to distinguish meditation experience and a beneficial utilization of artificial intelligence techniques for big-data analysis. PMID:28422856
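The sliding-window / shifting-interval scheme can be sketched as a plain segmentation step; `fs`, `win_s`, and `step_s` are illustrative names, with the 30-s window and 2-s shift mirroring the reported criterion:

```python
def sliding_windows(samples, fs, win_s=30.0, step_s=2.0):
    """Split a sampled signal into fixed-length analysis windows.
    win_s is the sliding-window length (here 0.5 min) and step_s the
    shifting interval (here 2 s); fs is the sampling rate in Hz.
    Each returned window is a contiguous slice of the input."""
    win = int(win_s * fs)
    step = int(step_s * fs)
    return [samples[i:i + win]
            for i in range(0, len(samples) - win + 1, step)]
```

Each window would then be reduced to the alpha-spectrum features and fed to the classifier; shrinking `win_s` or `step_s` trades classification accuracy against temporal resolution, which is the trade-off the study quantifies.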
Jithesh, C; Venkataramana, V; Penumatsa, Narendravarma; Reddy, S N; Poornima, K Y; Rajasigamani, K
2015-08-01
To determine and compare the potential difference in nickel release from three different orthodontic brackets, at different artificial salivary pH values and different time intervals. Twenty-seven samples of three different orthodontic brackets were selected and grouped as 1, 2, and 3. Each group was divided into three subgroups depending on the type of orthodontic bracket, salivary pH and the time interval. The nickel release from each subgroup was analyzed using an inductively coupled plasma-atomic emission spectrophotometer (Perkin Elmer, Optima 2100 DV, USA). Quantitative analysis of nickel was performed three times, and the mean value was used as the result. ANOVA (F-test) was used to test for significant differences among the groups at the 0.05 level of significance (P < 0.05). Descriptive statistics were used to calculate the mean, standard deviation, minimum and maximum. SPSS 18 software (SPSS Ltd., Quarry Bay, Hong Kong, PASW Statistics 18) was used to analyze the study. The analysis shows a significant difference between the three groups. Nickel release from the recycled stainless steel brackets was highest at pH 4.2 at all time intervals except 120 h. The study results show that nickel release from the recycled stainless steel brackets is the highest, while metal-slot ceramic brackets release significantly less nickel. Therefore, recycled stainless steel brackets should not be used for nickel-allergic patients; metal-slot ceramic brackets are advisable.
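The one-way ANOVA used to compare nickel release across bracket groups can be sketched in pure Python, computing the F statistic directly from between- and within-group sums of squares (illustrative data only, not the study's measurements):

```python
def one_way_anova(groups):
    """One-way ANOVA F statistic for k independent groups, as used to
    compare nickel release across bracket types. F is the ratio of
    between-group to within-group mean squares; large F indicates the
    group means differ more than chance would explain."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w)
```

Comparing the resulting F against the critical value for (k-1, n-k) degrees of freedom at the 0.05 level gives the significance decision reported above.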
Jithesh, C.; Venkataramana, V.; Penumatsa, Narendravarma; Reddy, S. N.; Poornima, K. Y.; Rajasigamani, K.
2015-01-01
Objectives: To determine and compare nickel release from three different orthodontic brackets at different artificial salivary pH values and time intervals. Materials and Methods: Twenty-seven samples of three different orthodontic brackets were selected and grouped as 1, 2, and 3. Each group was divided into three subgroups depending on the type of orthodontic bracket, salivary pH, and time interval. Nickel release from each subgroup was analyzed using an inductively coupled plasma atomic emission spectrophotometer (PerkinElmer Optima 2100 DV, USA). Quantitative analysis of nickel was performed three times, and the mean value was used as the result. ANOVA (F-test) was used to test for significant differences among the groups at the 0.05 level of significance (P < 0.05). Descriptive statistics were used to calculate the mean, standard deviation, minimum, and maximum. SPSS 18 software (SPSS Ltd., Quarry Bay, Hong Kong; PASW Statistics 18) was used to analyze the data. Result: The analysis shows a significant difference between the three groups. Nickel release from the recycled stainless steel brackets was highest at pH 4.2 at all time intervals except 120 h. Conclusion: The study results show that nickel release from recycled stainless steel brackets is highest, while metal-slot ceramic brackets release significantly less nickel. Therefore, recycled stainless steel brackets should not be used for nickel-allergic patients; metal-slot ceramic brackets are advisable. PMID:26538924
Multichannel Spectrometer of Time Distribution
NASA Astrophysics Data System (ADS)
Akindinova, E. V.; Babenko, A. G.; Vakhtel, V. M.; Evseev, N. A.; Rabotkin, V. A.; Kharitonova, D. D.
2015-06-01
For the research and control of the characteristics of radiation fluxes, in particular from radioactive sources (see, for example, [1]), a spectrometer and methods of data measurement and processing were developed based on the MC-2A multichannel counter (SPC "ASPECT") of time intervals between the arrivals of random events (pulses from a particle detector). The spectrometer has four independent channels for registering the time intervals between pulse arrivals, with corresponding amplitude-spectrometric channels for monitoring, via the energy spectra, the stationarity of operation of each channel's path from detector to amplifier. Alpha radiation is registered by semiconductor detectors with an energy resolution of 16-30 keV. Using the spectrometer, measurements were made of oscillations of the 239Pu alpha-radiation flux intensity, followed by an autocorrelation statistical analysis of the time series of readings.
Araújo, Célio U; Basting, Roberta T
2018-03-01
To perform an in situ evaluation of the surface roughness and micromorphology of two soft liner materials for dentures at different time intervals. The surface roughness of these materials may influence the adhesion of micro-organisms and inflammation of the mucosal tissues. An in situ evaluation of the surface roughness and micromorphology of soft liner materials over the course of time may produce results different from those of in vitro studies, given the constant presence of saliva and food and the changes in temperature and pH level in the oral cavity. Forty-eight rectangular specimens of each of the two soft liner materials were fabricated: a silicone-based material (Mucopren Soft) and an acrylic resin-based material (Trusoft). The specimens were placed in the dentures of 12 participants (n = 12), and the materials were evaluated for surface roughness and micromorphology at different time intervals: 0, 7, 30 and 60 days. Roughness (Ra) was evaluated by means of a roughness tester. Surface micromorphology was evaluated by scanning electron microscopy. Analysis of variance for a randomised block design and Tukey's test showed that surface roughness values were lower in the groups using the silicone-based material at all time intervals (P < .0001). The average surface roughness was higher at time interval 0 than at the other intervals, for both materials (P < .0001). The surface micromorphology showed that the silicone material presented a more regular and smoother surface than the acrylic resin-based material. The surface roughness of acrylic resin-based and silicone-based denture soft liner materials decreased after 7 days of evaluation, leading to a smoother surface over time. The silicone-based material showed lower roughness values and a smoother surface than the acrylic resin-based material, making it the preferred choice when selecting the more appropriate material, due to its tendency to promote less biofilm build-up.
© 2017 John Wiley & Sons A/S and The Gerodontology Association. Published by John Wiley & Sons Ltd.
High resolution data acquisition
Thornton, G.W.; Fuller, K.R.
1993-04-06
A high resolution event interval timing system measures short time intervals such as those that occur in high energy physics or laser ranging. Timing is provided by a clock pulse train and analog circuitry that generates a triangular wave synchronously with the pulse train (see the diagram in the patent). The triangular wave has an amplitude and slope functionally related to the time elapsed during each clock pulse in the train. A converter forms a first digital value of the amplitude and slope of the triangle wave at the start of the event interval and a second digital value of the amplitude and slope of the triangle wave at the end of the event interval. A counter counts the clock pulse train during the interval to form a gross event interval time. A computer then combines the gross event interval time and the first and second digital values to output a high resolution value for the event interval.
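The final combination step (gross clock count plus triangle-wave interpolation at the start and end of the interval) can be sketched roughly as follows; the clock period, the amplitude encoding and all function names are assumptions for illustration, not the patent's actual circuitry:

```python
CLOCK_PERIOD_NS = 10.0  # assumed 100 MHz clock, for illustration only

def fine_fraction(amplitude, slope_sign, vmax=1.0):
    """Map a sampled triangle-wave amplitude plus its slope sign to the
    fraction of a clock period elapsed. Assumed encoding: the wave rises
    0 -> vmax in the first half-period and falls back in the second."""
    frac = amplitude / vmax
    return 0.5 * frac if slope_sign >= 0 else 0.5 * (2.0 - frac)

def event_interval_ns(gross_counts, start_amp, start_slope, end_amp, end_slope):
    """Combine the coarse counter value with the two fine interpolations."""
    start_frac = fine_fraction(start_amp, start_slope)
    end_frac = fine_fraction(end_amp, end_slope)
    return (gross_counts + end_frac - start_frac) * CLOCK_PERIOD_NS

# event spanning 5 whole clock periods plus half a period at the end
dt = event_interval_ns(5, 0.0, +1, 1.0, +1)
```

The coarse counter sets the dynamic range while the analog interpolation supplies sub-clock resolution, which is the essence of the scheme described.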
High resolution data acquisition
Thornton, Glenn W.; Fuller, Kenneth R.
1993-01-01
A high resolution event interval timing system measures short time intervals such as occur in high energy physics or laser ranging. Timing is provided from a clock (38) pulse train (37) and analog circuitry (44) for generating a triangular wave (46) synchronously with the pulse train (37). The triangular wave (46) has an amplitude and slope functionally related to the time elapsed during each clock pulse in the train. A converter (18, 32) forms a first digital value of the amplitude and slope of the triangle wave at the start of the event interval and a second digital value of the amplitude and slope of the triangle wave at the end of the event interval. A counter (26) counts the clock pulse train (37) during the interval to form a gross event interval time. A computer (52) then combines the gross event interval time and the first and second digital values to output a high resolution value for the event interval.
Aortic stiffness and the balance between cardiac oxygen supply and demand: the Rotterdam Study.
Guelen, Ilja; Mattace-Raso, Francesco Us; van Popele, Nicole M; Westerhof, Berend E; Hofman, Albert; Witteman, Jacqueline Cm; Bos, Willem Jan W
2008-06-01
Aortic stiffness is an independent predictor of cardiovascular morbidity and mortality. We investigated whether aortic stiffness, estimated as aortic pulse wave velocity, is associated with decreased perfusion pressure estimated as the cardiac oxygen supply potential. Aortic stiffness and aortic pressure waves, reconstructed from finger blood pressure waves, were obtained in 2490 older adults within the framework of the Rotterdam Study, a large population-based study. Cardiac oxygen supply and demand were estimated using pulse wave analysis techniques, and related to aortic stiffness by linear regression analyses after adjustment for age, sex, mean arterial pressure and heart rate. Cardiac oxygen demand, estimated as the Systolic Pressure Time Index and the Rate Pressure Product, increased with increasing aortic stiffness [0.27 mmHg s (95% confidence interval: 0.21; 0.34)] and [42.2 mmHg/min (95% confidence interval: 34.1; 50.3)], respectively. Cardiac oxygen supply potential estimated as the Diastolic Pressure Time Index decreased [-0.70 mmHg s (95% confidence interval: -0.86; -0.54)] with aortic stiffening. Accordingly, the supply/demand ratio (Diastolic Pressure Time Index/Systolic Pressure Time Index) decreased [-1.11 (95% confidence interval: -0.14; -0.009)] with increasing aortic stiffness. Aortic stiffness is associated with estimates of increased cardiac oxygen demand and a decreased cardiac oxygen supply potential. These results may offer additional explanation for the relation between aortic stiffness and cardiovascular morbidity and mortality.
Anderton, D L; Bean, L L
1985-05-01
Our analysis of changing birth interval distributions over the course of a fertility transition from natural to controlled fertility has examined three closely related propositions. First, within both natural fertility populations (identified at the aggregate level) and cohorts following the onset of fertility limitation, we hypothesized that substantial groups of women with long birth intervals across the individually specified childbearing careers could be identified. That is, even during periods when fertility behavior at the aggregate level is consistent with a natural fertility regime, birth intervals at all parities are inversely related to completed family size. Our tabular analysis enables us to conclude that birth spacing patterns are parity dependent; there is stability in CEB-parity specific mean and birth interval variance over the entire transition. Our evidence does not suggest that the early group of women limiting and spacing births was marked by infecundity. Secondly, the transition appears to be associated with an increasingly larger proportion of women shifting to the same spacing schedules associated with smaller families in earlier cohorts. Thirdly, variations in birth spacing by age of marriage indicate that changes in birth intervals over time are at least indirectly associated with age of marriage, indicating an additional compositional effect. The evidence we have presented on spacing behavior does not negate the argument that parity-dependent stopping behavior was a powerful factor in the fertility transition. Our data also provide evidence of attempts to truncate childbearing. Specifically, the smaller the completed family size, the longer the ultimate birth interval; and ultimate birth intervals increase across cohorts controlling CEB and parity. But spacing appears to represent an additional strategy of fertility limitation. 
Thus, it may be necessary to distinguish spacing and stopping behavior if one wishes to clarify behavioral patterns within a population (Edlefsen, 1981; Friedlander et al., 1980; Rodriguez and Hobcraft, 1980). Because fertility transition theories imply increased attempts to limit family sizes, it is important to examine differential behavior within subgroups achieving different family sizes. It is this level of analysis which we have attempted to achieve in utilizing parity-specific birth intervals controlled by children ever born.(ABSTRACT TRUNCATED AT 400 WORDS)
NASA Technical Reports Server (NTRS)
Huikuri, H. V.; Makikallio, T. H.; Peng, C. K.; Goldberger, A. L.; Hintze, U.; Moller, M.
2000-01-01
BACKGROUND: Preliminary data suggest that the analysis of R-R interval variability by fractal analysis methods may provide clinically useful information on patients with heart failure. The purpose of this study was to compare the prognostic power of new fractal and traditional measures of R-R interval variability as predictors of death after acute myocardial infarction. METHODS AND RESULTS: Time and frequency domain heart rate (HR) variability measures, along with short- and long-term correlation (fractal) properties of R-R intervals (exponents alpha(1) and alpha(2)) and power-law scaling of the power spectra (exponent beta), were assessed from 24-hour Holter recordings in 446 survivors of acute myocardial infarction with depressed left ventricular function (ejection fraction ≤35%). During a mean ± SD follow-up period of 685 ± 360 days, 114 patients died (25.6%), with 75 deaths classified as arrhythmic (17.0%) and 28 as nonarrhythmic (6.3%) cardiac deaths. Several traditional and fractal measures of R-R interval variability were significant univariate predictors of all-cause mortality. A reduced short-term scaling exponent alpha(1) was the most powerful R-R interval variability measure as a predictor of all-cause mortality (alpha(1) <0.75, relative risk 3.0, 95% confidence interval 2.5 to 4.2, P<0.001). It remained an independent predictor of death (P<0.001) after adjustment for other postinfarction risk markers, such as age, ejection fraction, NYHA class, and medication. Reduced alpha(1) predicted both arrhythmic death (P<0.001) and nonarrhythmic cardiac death (P<0.001). CONCLUSIONS: Analysis of the fractal characteristics of short-term R-R interval dynamics yields more powerful prognostic information than the traditional measures of HR variability among patients with depressed left ventricular function after an acute myocardial infarction.
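The short-term scaling exponent alpha(1) above comes from detrended fluctuation analysis (DFA). A minimal pure-Python sketch, assuming non-overlapping segments and a first-order (linear) detrend, with a synthetic R-R series in place of real Holter data:

```python
import math
import random

def dfa_alpha(series, scales):
    """Detrended fluctuation analysis: returns the scaling exponent, i.e.
    the slope of log F(n) versus log n over the given window sizes n."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for x in series:                    # integrated, mean-centred profile
        s += x - mean
        profile.append(s)
    logn, logf = [], []
    for n in scales:
        f2, npts = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            tm = (n - 1) / 2.0                      # mean of indices 0..n-1
            sm = sum(seg) / n
            cov = sum((t - tm) * (y - sm) for t, y in enumerate(seg))
            var = sum((t - tm) ** 2 for t in range(n))
            b = cov / var                           # least-squares detrend
            a = sm - b * tm
            f2 += sum((y - (a + b * t)) ** 2 for t, y in enumerate(seg))
            npts += n
        logn.append(math.log(n))
        logf.append(0.5 * math.log(f2 / npts))      # log of RMS fluctuation
    nm = sum(logn) / len(logn)
    fm = sum(logf) / len(logf)
    num = sum((a - nm) * (b - fm) for a, b in zip(logn, logf))
    den = sum((a - nm) ** 2 for a in logn)
    return num / den                                # regression slope = alpha

random.seed(1)
rr = [0.8 + random.gauss(0.0, 0.05) for _ in range(2000)]  # synthetic R-R data
alpha1 = dfa_alpha(rr, scales=[4, 6, 8, 11, 16])
```

For uncorrelated (white-noise-like) R-R fluctuations the exponent is near 0.5; values toward 1.0 indicate the correlated, fractal-like dynamics whose loss (alpha(1) < 0.75) carried prognostic weight in the study.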
Cardiovascular response to acute stress in freely moving rats: time-frequency analysis.
Loncar-Turukalo, Tatjana; Bajic, Dragana; Japundzic-Zigon, Nina
2008-01-01
Spectral analysis of cardiovascular series is an important tool for assessing the features of the autonomic control of the cardiovascular system. In this experiment, Wistar rats equipped with an intraarterial catheter for blood pressure (BP) recording were exposed to stress induced by blowing air. The problem of non-stationary data was overcome by applying the Smoothed Pseudo Wigner-Ville (SPWV) time-frequency distribution. Spectral analysis was done before stress, during stress, immediately after stress and later in recovery. The spectral indices were calculated for both systolic blood pressure (SBP) and pulse interval (PI) series. The time evolution of the spectral indices showed a perturbed sympathovagal balance.
Ewy, Gordon A; Bobrow, Bentley J; Chikani, Vatsal; Sanders, Arthur B; Otto, Charles W; Spaite, Daniel W; Kern, Karl B
2015-11-01
Recommended for decades, the therapeutic value of adrenaline (epinephrine) in the resuscitation of patients with out-of-hospital cardiac arrest (OHCA) is controversial. To investigate the possible time-dependent outcomes associated with adrenaline administration by Emergency Medical Services (EMS) personnel. A retrospective analysis of prospectively collected data from a near-statewide cardiac resuscitation database between 1 January 2005 and 30 November 2013. Multivariable logistic regression was used to analyze the effect of the time interval between EMS dispatch and the initial dose of adrenaline on survival. The primary endpoints were survival to hospital discharge and favourable neurologic outcome. Data from 3469 patients with witnessed OHCA were analyzed. Their mean age was 66.3 years and 69% were male. An initially shockable rhythm was present in 41.8% of patients. Based on a multivariable logistic regression model with the initial adrenaline administration time interval (AATI) from EMS dispatch as the covariate, survival was greatest when adrenaline was administered very early but decreased rapidly with increasing AATI; odds ratio 0.94 (95% Confidence Interval (CI) 0.92-0.97). The AATI had no significant effect on good neurological outcome (OR=0.96, 95% CI=0.90-1.02). In patients with OHCA, survival to hospital discharge was greater in those treated early with adrenaline by EMS, especially in the subset of patients with a shockable rhythm. However, survival rapidly decreased with increasing AATI. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Data series embedding and scale invariant statistics.
Michieli, I; Medved, B; Ristov, S
2010-06-01
Data sequences acquired from bio-systems, such as human gait data, heart rate interbeat data, or DNA sequences, exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing these dynamics is through scale invariant statistics or "fractal-like" behavior. Several methods have been proposed for quantifying the scale invariant parameters of physiological signals. Among the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and, more recently in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed, and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility of the method falsely detecting long-range dependence in artificially generated short-range dependence series was also investigated. (c) 2009 Elsevier B.V. All rights reserved.
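The pseudo-phase-space embedding mentioned above is commonly realized as time-delay embedding. A minimal sketch, with the dimension and lag chosen purely for illustration:

```python
def delay_embed(series, dim, tau):
    """Map a scalar time series into dim-dimensional pseudo-phase space
    using time-delay coordinates with lag tau: each point is
    (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n_points = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim))
            for i in range(n_points)]

pts = delay_embed([0, 1, 2, 3, 4, 5], dim=3, tau=2)
```

Scale invariant statistics would then be read off from how the cloud of embedded points fills the pseudo-phase space as the scale of observation varies.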
Fabris, Enrico; van 't Hof, Arnoud; Hamm, Christian W; Lapostolle, Frédéric; Lassen, Jens F; Goodman, Shaun G; Ten Berg, Jurriën M; Bolognese, Leonardo; Cequier, Angel; Chettibi, Mohamed; Hammett, Christopher J; Huber, Kurt; Janzon, Magnus; Merkely, Béla; Storey, Robert F; Zeymer, Uwe; Cantor, Warren J; Tsatsaris, Anne; Kerneis, Mathieu; Diallo, Abdourahmane; Vicaut, Eric; Montalescot, Gilles
2017-08-01
In the ATLANTIC (Administration of Ticagrelor in the catheterization laboratory or in the Ambulance for New ST elevation myocardial Infarction to open the Coronary artery) trial the early use of aspirin, anticoagulation, and ticagrelor coupled with very short medical contact-to-balloon times represent good indicators of optimal treatment of ST-elevation myocardial infarction and an ideal setting to explore which factors may influence coronary reperfusion beyond a well-established pre-hospital system. This study sought to evaluate predictors of complete ST-segment resolution after percutaneous coronary intervention in ST-elevation myocardial infarction patients enrolled in the ATLANTIC trial. ST-segment analysis was performed on electrocardiograms recorded at the time of inclusion (pre-hospital electrocardiogram), and one hour after percutaneous coronary intervention (post-percutaneous coronary intervention electrocardiogram) by an independent core laboratory. Complete ST-segment resolution was defined as ≥70% ST-segment resolution. Complete ST-segment resolution occurred post-percutaneous coronary intervention in 54.9% ( n=800/1456) of patients and predicted lower 30-day composite major adverse cardiovascular and cerebrovascular events (odds ratio 0.35, 95% confidence interval 0.19-0.65; p<0.01), definite stent thrombosis (odds ratio 0.18, 95% confidence interval 0.02-0.88; p=0.03), and total mortality (odds ratio 0.43, 95% confidence interval 0.19-0.97; p=0.04). In multivariate analysis, independent negative predictors of complete ST-segment resolution were the time from symptoms to pre-hospital electrocardiogram (odds ratio 0.91, 95% confidence interval 0.85-0.98; p<0.01) and diabetes mellitus (odds ratio 0.6, 95% confidence interval 0.44-0.83; p<0.01); pre-hospital ticagrelor treatment showed a favorable trend for complete ST-segment resolution (odds ratio 1.22, 95% confidence interval 0.99-1.51; p=0.06). 
This study confirmed that post-percutaneous coronary intervention complete ST-segment resolution is a valid surrogate marker for cardiovascular clinical outcomes. In the current era of ST-elevation myocardial infarction reperfusion, patients' delay and diabetes mellitus are independent predictors of poor reperfusion and need specific attention in the future.
Measuring the EMS patient access time interval and the impact of responding to high-rise buildings.
Morrison, Laurie J; Angelini, Mark P; Vermeulen, Marian J; Schwartz, Brian
2005-01-01
To measure the patient access time interval and characterize its contribution to the total emergency medical services (EMS) response time interval; to compare the patient access time intervals of patients located three or more floors above ground with those of patients less than three floors above or below ground, specifically in the apartment subgroup; and to identify barriers that significantly impede EMS access to patients in high-rise apartments. An observational study of all patients treated by an emergency medical technician-paramedic (EMT-P) crew was conducted using a trained independent observer to collect time intervals and identify potential barriers to access. Of 118 observed calls, 25 (21%) originated from patients three or more floors above ground. The overall median and 90th percentile (95% confidence interval) patient access time intervals were 1.61 (1.27, 1.91) and 3.47 (3.08, 4.05) minutes, respectively. The median interval was 2.73 (2.22, 3.03) minutes among calls from patients located three or more stories above ground, compared with 1.25 (1.07, 1.55) minutes among those at lower levels. The patient access time interval represented 23.5% of the total EMS response time interval among calls originating less than three floors above or below ground and 32.2% of those located three or more stories above ground. The most frequently encountered barriers to access included security code entry requirements, lack of directional signs, and inability to fit the stretcher into the elevator. The patient access time interval is significantly long and represents a substantial component of the total EMS response time interval, especially among ambulance calls originating three or more floors above ground. A number of barriers appear to contribute to delayed paramedic access.
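The median and 90th-percentile summaries reported above can be computed with a linear-interpolation percentile; the access times below are made up for illustration, not the study's data:

```python
def percentile(xs, q):
    """Percentile with linear interpolation between order statistics."""
    s = sorted(xs)
    pos = (len(s) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

# hypothetical patient access times in minutes (illustrative only)
access = [1.2, 1.5, 1.6, 2.7, 3.5, 1.3, 1.9, 2.2, 1.1, 3.0]
median_access = percentile(access, 50)
p90_access = percentile(access, 90)
```

In practice the study's confidence intervals around these percentiles would be obtained with an order-statistic or bootstrap method, which is omitted here.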
An operational definition of a statistically meaningful trend.
Bryhn, Andreas C; Dimberg, Peter H
2011-04-28
Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if the data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends, referred to as statistical meaningfulness, which is a stricter quality criterion than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r(2) and p values are calculated from regressions of the interval mean values on time. If r(2) ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may make the test more operational. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
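The interval-mean regression at the heart of the test can be sketched as follows; for brevity this sketch returns only r(2), whereas the full criterion also requires p ≤ 0.05 from the same regression:

```python
def meaningful_trend_r2(values, n_intervals):
    """Split the series into n_intervals equal blocks, regress the block
    means on block index, and return r^2 for that regression."""
    size = len(values) // n_intervals
    means = [sum(values[i * size:(i + 1) * size]) / size
             for i in range(n_intervals)]
    xs = range(n_intervals)
    xm = sum(xs) / n_intervals
    ym = sum(means) / n_intervals
    sxx = sum((x - xm) ** 2 for x in xs)
    sxy = sum((x - xm) * (y - ym) for x, y in zip(xs, means))
    syy = sum((y - ym) ** 2 for y in means)
    if syy == 0.0:
        return 0.0          # flat series: no trend to speak of
    return (sxy * sxy) / (sxx * syy)

# a clean linear trend should score near 1; r2 >= 0.65 would pass the test
r2 = meaningful_trend_r2([0.1 * i for i in range(120)], n_intervals=6)
```

Averaging within intervals suppresses scatter, so a trend passes only when the interval means themselves line up, which is exactly the stricter criterion the abstract describes.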
Identification of speech transients using variable frame rate analysis and wavelet packets.
Rasetshwane, Daniel M; Boston, J Robert; Li, Ching-Chung
2006-01-01
Speech transients are important cues for identifying and discriminating speech sounds. Yoo et al. and Tantibundhit et al. were successful in identifying speech transients and, by emphasizing them, in improving the intelligibility of speech in noise. However, their methods are computationally intensive and unsuitable for real-time applications. This paper presents a method to identify and emphasize speech transients that combines subband decomposition by the wavelet packet transform with variable frame rate (VFR) analysis and unvoiced consonant detection. The VFR analysis is applied to each wavelet packet to define a transitivity function that describes the extent to which the wavelet coefficients of that packet are changing. Unvoiced consonant detection is used to identify unvoiced consonant intervals, and the transitivity function is amplified during these intervals. The wavelet coefficients are multiplied by the transitivity function for that packet, amplifying the coefficients localized at times when they are changing and attenuating coefficients at times when they are steady. The inverse transform of the modified wavelet packet coefficients produces a signal corresponding to speech transients similar to the transients identified by Yoo et al. and Tantibundhit et al. A preliminary implementation of the algorithm runs more efficiently.
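A transitivity function of the kind described (the extent to which coefficients change from frame to frame) might be sketched as below; the frame layout and normalisation are assumptions for illustration, not the authors' exact definition:

```python
def transitivity(coeffs, frame):
    """Frame-to-frame mean absolute change of (wavelet) coefficients,
    normalised to [0, 1]; large values flag transient regions."""
    n_frames = len(coeffs) // frame
    deltas = []
    for k in range(1, n_frames):
        prev = coeffs[(k - 1) * frame:k * frame]
        cur = coeffs[k * frame:(k + 1) * frame]
        deltas.append(sum(abs(a - b) for a, b in zip(cur, prev)) / frame)
    if not deltas:
        return []
    top = max(deltas) or 1.0   # avoid 0/0 for an all-steady signal
    return [d / top for d in deltas]

# steady -> bursty -> steady toy coefficient stream, frame length 4
tf = transitivity([0] * 8 + [5, -5, 5, -5] * 2, frame=4)
```

Multiplying the coefficients of each packet by such a function boosts the rapidly changing (transient) segments and attenuates the steady ones, as the abstract describes.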
Modulation of human time processing by subthalamic deep brain stimulation.
Wojtecki, Lars; Elben, Saskia; Timmermann, Lars; Reck, Christiane; Maarouf, Mohammad; Jörgens, Silke; Ploner, Markus; Südmeyer, Martin; Groiss, Stefan Jun; Sturm, Volker; Niedeggen, Michael; Schnitzler, Alfons
2011-01-01
Timing in the range of seconds, referred to as interval timing, is crucial for cognitive operations and conscious time processing. According to recent models of interval timing, basal ganglia (BG) oscillatory loops are involved in time interval recognition. Parkinson's disease (PD) is a typical disease of the basal ganglia that shows distortions in interval timing. Deep brain stimulation (DBS) of the subthalamic nucleus (STN) is a powerful treatment for PD which modulates motor and cognitive functions depending on stimulation frequency by affecting subcortical-cortical oscillatory loops. Thus, for the understanding of BG involvement in interval timing, it is of interest whether STN-DBS can modulate timing in a frequency-dependent manner by interfering with oscillatory time recognition processes. We examined production and reproduction of 5- and 15-second intervals and millisecond timing in a double-blind, randomised, within-subject repeated-measures design in 12 PD patients applying no, 10-Hz and ≥130-Hz STN-DBS, compared with healthy controls. We found under(re-)production of the 15-second interval and a significant enhancement of this under(re-)production by 10-Hz stimulation compared to no stimulation, ≥130-Hz STN-DBS and controls. Millisecond timing was not affected. We provide first evidence for a frequency-specific modulatory effect of STN-DBS on interval timing. Our results corroborate the involvement of the BG in general and of the STN in particular in the cognitive representation of time intervals in the range of multiple seconds.
Modulation of Human Time Processing by Subthalamic Deep Brain Stimulation
Timmermann, Lars; Reck, Christiane; Maarouf, Mohammad; Jörgens, Silke; Ploner, Markus; Südmeyer, Martin; Groiss, Stefan Jun; Sturm, Volker; Niedeggen, Michael; Schnitzler, Alfons
2011-01-01
Timing in the range of seconds, referred to as interval timing, is crucial for cognitive operations and conscious time processing. According to recent models of interval timing, basal ganglia (BG) oscillatory loops are involved in time interval recognition. Parkinson's disease (PD) is a typical disease of the basal ganglia that shows distortions in interval timing. Deep brain stimulation (DBS) of the subthalamic nucleus (STN) is a powerful treatment for PD which modulates motor and cognitive functions depending on stimulation frequency by affecting subcortical-cortical oscillatory loops. Thus, for the understanding of BG involvement in interval timing, it is of interest whether STN-DBS can modulate timing in a frequency-dependent manner by interfering with oscillatory time recognition processes. We examined production and reproduction of 5- and 15-second intervals and millisecond timing in a double-blind, randomised, within-subject repeated-measures design in 12 PD patients applying no, 10-Hz and ≥130-Hz STN-DBS, compared with healthy controls. We found under(re-)production of the 15-second interval and a significant enhancement of this under(re-)production by 10-Hz stimulation compared to no stimulation, ≥130-Hz STN-DBS and controls. Millisecond timing was not affected. We provide first evidence for a frequency-specific modulatory effect of STN-DBS on interval timing. Our results corroborate the involvement of the BG in general and of the STN in particular in the cognitive representation of time intervals in the range of multiple seconds. PMID:21931767
NASA Astrophysics Data System (ADS)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.
1996-02-01
Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste and for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions than the calibration samples, which often biases the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique that includes an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow easy discrimination between real and accidental coincidences and are intended to be measured with a PC-based fast time-interval analyser. The Rossi-alpha distributions can be expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu eq mass. The presented theory, referred to as Time Interval Analysis (TIA), is complementary to the Time Correlation Analysis (TCA) theories developed in the past, but is theoretically much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
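The one-dimensional Rossi-alpha distribution and the Monte Carlo pulse-train simulation mentioned above can be sketched as follows; the rate, window and bin count are illustrative. For a purely Poisson (accidentals-only) train the histogram is flat, whereas correlated fission chains would add a decaying component on top of that floor:

```python
import random

def poisson_pulse_train(rate, t_total, seed=0):
    """Monte Carlo pulse train with exponential inter-arrival times."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > t_total:
            return times
        times.append(t)

def rossi_alpha(times, window, nbins):
    """One-dimensional Rossi-alpha distribution: histogram of the time
    differences from each trigger pulse to every later pulse that falls
    within `window` seconds of it."""
    width = window / nbins
    hist = [0] * nbins
    for i, t0 in enumerate(times):
        for t1 in times[i + 1:]:
            dt = t1 - t0
            if dt >= window:
                break          # times are sorted, nothing closer follows
            hist[int(dt / width)] += 1
    return hist

train = poisson_pulse_train(rate=1000.0, t_total=5.0)   # ~5000 pulses
hist = rossi_alpha(train, window=0.01, nbins=10)
```

Real-versus-accidental discrimination then amounts to separating the correlated excess near zero time difference from this flat accidental baseline.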
Mallinis, Giorgos; Koutsias, Nikos; Arianoutsou, Margarita
2014-08-15
The aims of this study were to map and analyze land use/land cover (LULC) transitions and landscape changes over a period of 62 years in the Parnitha and Penteli mountains, which surround the Athens metropolitan area of Attica, Greece. In order to quantify the changes between land categories through time, we computed transition matrices for three distinct periods (1945-1960, 1960-1996, and 1996-2007) on the basis of available aerial photographs used to create multi-temporal maps. We identified systematic and stationary transitions with multi-level intensity analysis. Forest areas in Parnitha remained the dominant land cover class throughout the 62 years studied, while transitional woodlands and shrublands were the main classes involved in LULC transitions. Conversely, in Penteli, transitional woodlands, along with shrublands, dominated the study site. The annual rate of change was faster in the first and third time intervals than in the second (1960-1996) time interval, in both study areas. The category-level analysis indicated that in both sites annual crops systematically avoided gaining area, while discontinuous urban fabric avoided losing area. At the transition level of analysis, similarities as well as distinct differences existed between the two areas. In both sites, the gaining pattern of permanent crops with respect to annual crops and the gain of forest with respect to transitional woodlands/shrublands were stationary across the three time intervals. Overall, we identified more systematic transitions and stationary processes in Penteli. We discussed these LULC changes and associated them with human activity and other major socio-economic developments that were simultaneously occurring in the area.
The different patterns of change in the two areas throughout the period of analysis, despite their geographical proximity, imply that site-specific studies are needed to comprehensively assess the driving forces and develop models of landscape transformation in Mediterranean areas. Copyright © 2014 Elsevier B.V. All rights reserved.
Personalized glucose-insulin model based on signal analysis.
Goede, Simon L; de Galan, Bastiaan E; Leow, Melvin Khee Shing
2017-04-21
Glucose plasma measurements for diabetes patients are generally presented as a glucose concentration-time profile with 15-60 min time intervals. This limited resolution obscures detailed dynamic events of glucose appearance and metabolism, and measurement intervals of 15 min or more could contribute to imperfections in present diabetes treatment. High-resolution data from mixed meal tolerance tests (MMTT) for 24 type 1 and type 2 diabetes patients were used in our present modeling. We introduce a model based on the physiological properties of transport, storage and utilization. This logistic approach follows the principles of electrical network analysis and signal processing theory. The method mimics the physiological equivalent of glucose homeostasis, comprising meal ingestion and absorption via the gastrointestinal tract (GIT), through to the endocrine nexus between the liver and the pancreatic alpha and beta cells. This model demystifies the metabolic 'black box' by enabling in silico simulations and fitting of individual responses to clinical data. Five-minute-interval MMTT data measured from diabetic subjects yield two independent model parameters that characterize the complete glucose system response at a personalized level. From the individual measurements, we obtain a model which can be analyzed with a standard electrical network simulator for diagnostics and treatment optimization. The insulin dosing time scale can then be accurately adjusted to match the individual requirements of characterized diabetic patients without the physical burden of treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
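The transport-storage-utilization view of glucose handling resembles a cascade of first-order (RC-like) network stages driven by the meal input. The toy two-compartment sketch below is only an illustration of that analogy, not the authors' personalized model; the function name, rate constants, and units are all assumptions:

```python
def simulate_response(meal_load, k_abs, k_clear, dt=5.0, steps=60):
    """Toy two-compartment sketch: gut glucose is absorbed into plasma
    at rate k_abs and cleared from plasma at rate k_clear, the
    discrete-time analogue of cascaded RC stages driven by a meal.
    Returns plasma values sampled every dt minutes."""
    gut, plasma, out = meal_load, 0.0, []
    for _ in range(steps):
        absorbed = k_abs * gut * dt      # flow out of the gut compartment
        gut -= absorbed
        plasma += absorbed - k_clear * plasma * dt  # inflow minus clearance
        out.append(plasma)
    return out
```

With any positive rate constants the response rises to a peak and decays, qualitatively matching a post-meal glucose excursion.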
Falcaro, Milena; Pickles, Andrew
2007-02-10
We focus on the analysis of multivariate survival times with highly structured interdependency and subject to interval censoring. Such data are common in developmental genetics and genetic epidemiology. We propose a flexible mixed probit model that deals naturally with complex but uninformative censoring. The recorded ages of onset are treated as possibly censored ordinal outcomes with the interval censoring mechanism seen as arising from a coarsened measurement of a continuous variable observed as falling between subject-specific thresholds. This bypasses the requirement for the failure times to be observed as falling into non-overlapping intervals. The assumption of a normal age-of-onset distribution of the standard probit model is relaxed by embedding within it a multivariate Box-Cox transformation whose parameters are jointly estimated with the other parameters of the model. Complex decompositions of the underlying multivariate normal covariance matrix of the transformed ages of onset become possible. The new methodology is here applied to a multivariate study of the ages of first use of tobacco and first consumption of alcohol without parental permission in twins. The proposed model allows estimation of the genetic and environmental effects that are shared by both of these risk behaviours as well as those that are specific. 2006 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Baxley, Brian T.; Murdoch, Jennifer L.; Swieringa, Kurt A.; Barmore, Bryan E.; Capron, William R.; Hubbs, Clay E.; Shay, Richard F.; Abbott, Terence S.
2013-01-01
The predicted increase in the number of commercial aircraft operations creates a need for improved operational efficiency. Two areas believed to offer increases in aircraft efficiency are optimized profile descents and dependent parallel runway operations. Using Flight deck Interval Management (FIM) software and procedures during these operations, flight crews can achieve, by the runway threshold, an interval assigned by air traffic control (ATC) behind the preceding aircraft that maximizes runway throughput while minimizing additional fuel consumption and pilot workload. This document describes an experiment in which 24 pilots flew arrivals into the Dallas/Fort Worth terminal environment using one of three simulators at NASA's Langley Research Center. Results indicate that pilots delivered their aircraft to the runway threshold within +/- 3.5 seconds of their assigned time interval and reported low workload levels. In general, pilots found the FIM concept, procedures, speeds, and interface acceptable. Analysis of the time error and FIM speed changes as a function of arrival stream position suggests the spacing algorithm generates stable behavior in the presence of continuous (wind) or impulse (offset) error. Concerns reported included multiple speed changes within a short time period, and an airspeed increase followed shortly by an airspeed decrease.
Brenner, Amy; Shakur-Still, Haleema; Chaudhri, Rizwana; Fawole, Bukola; Arulkumaran, Sabaratnam; Roberts, Ian
2018-06-07
In severe post-partum haemorrhage, death can occur within hours of bleeding onset, so interventions to control the bleeding must be given immediately. In clinical trials of treatments for life-threatening bleeding, established treatments are given priority and the trial treatment is usually given last. However, enrolling patients in whom severe maternal morbidity or death is imminent or inevitable at the time of randomisation may dilute the effects of a trial treatment. We conducted an exploratory analysis of data from the WOMAN trial, an international, randomised placebo-controlled trial of the effects of tranexamic acid on death and surgical intervention in 20,060 women with post-partum haemorrhage. We assessed the impact of early maternal death or hysterectomy due to exsanguination on the effect of tranexamic acid on each of these respective outcomes. We conducted repeated analyses excluding patients with these outcomes at increasing intervals from the time of randomisation. We quantified treatment effects using risk ratios (RR) and 99% confidence intervals (CI) and prepared cumulative failure plots. Among 14,923 women randomised within 3 h of delivery (7518 tranexamic acid and 7405 placebo), there were 216 bleeding deaths (1.5%) and 383 hysterectomies due to bleeding (2.8%). After excluding deaths from exsanguination at increasing time intervals following randomisation, there was a significant reduction in the risk of death due to bleeding with tranexamic acid (RR = 0.41; 99% CI 0.19-0.89). However, after excluding hysterectomies at increasing time intervals post-randomisation, there was no reduction in the risk of hysterectomy due to bleeding with tranexamic acid (RR = 0.79; 99% CI 0.33-1.86). Findings from this analysis provide further evidence that tranexamic acid reduces the risk of death from exsanguination in women who experience post-partum haemorrhage.
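The risk ratios and 99% confidence intervals quoted above are standard two-group comparisons of event proportions. A minimal sketch of that calculation with illustrative counts (not the WOMAN trial data), using the usual Wald-type interval on the log scale:

```python
import math

def risk_ratio(events_treat, n_treat, events_ctrl, n_ctrl, conf=0.99):
    """Risk ratio of two event proportions with a Wald-type confidence
    interval computed on the log scale."""
    p_treat, p_ctrl = events_treat / n_treat, events_ctrl / n_ctrl
    rr = p_treat / p_ctrl
    # standard error of log(RR)
    se = math.sqrt(1 / events_treat - 1 / n_treat
                   + 1 / events_ctrl - 1 / n_ctrl)
    z = 2.576 if conf == 0.99 else 1.96  # standard normal quantile
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, (lo, hi)
```

An RR whose interval excludes 1 (as for the bleeding-death outcome above) indicates a statistically significant treatment effect at the chosen level.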
It is uncertain whether tranexamic acid reduces the risk of hysterectomy for bleeding after excluding early hysterectomies. ISRCTN trial registration number ISRCTN76912190, 8 Dec 2008; ClinicalTrials.gov number NCT00872469, 30 March 2009; PACTR number PACTR201007000192283, 9 Feb 2010; EudraCT number 2008-008441-38, 8 Dec 2010 (retrospectively registered).
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.
2016-03-01
A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
Zero-crossing statistics for non-Markovian time series.
Nyberg, Markus; Lizana, Ludvig; Ambjörnsson, Tobias
2018-03-01
In applications spanning from image analysis and speech recognition to energy dissipation in turbulence and time-to-failure of fatigued materials, researchers and engineers want to calculate how often a stochastic observable crosses a specific level, such as zero. At first glance this problem looks simple, but it is in fact theoretically very challenging, and therefore few exact results exist. One exception is the celebrated Rice formula, which gives the mean number of zero crossings in a fixed time interval of a zero-mean Gaussian stationary process. In this study we use the so-called independent interval approximation to go beyond Rice's result and derive analytic expressions for all higher-order zero-crossing cumulants and moments. Our results agree well with simulations for the non-Markovian autoregressive model.
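Zero-crossing counts of the kind studied analytically above are straightforward to estimate by simulation. A minimal sketch (illustrative only; the paper's cumulant expressions are not reproduced here) that counts sign changes of a series and generates an AR(2) process, whose crossing sequence is non-Markovian:

```python
import random

def zero_crossings(series):
    """Count sign changes (zero crossings) in a discrete time series."""
    signs = [1 if x >= 0 else -1 for x in series]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def ar2(n, phi1, phi2, seed=0):
    """Generate an AR(2) series x[t] = phi1*x[t-1] + phi2*x[t-2] + noise;
    observed through its sign sequence, its crossing statistics are
    non-Markovian."""
    rng = random.Random(seed)
    x_prev, x_prev2, out = 0.0, 0.0, []
    for _ in range(n):
        x = phi1 * x_prev + phi2 * x_prev2 + rng.gauss(0.0, 1.0)
        x_prev2, x_prev = x_prev, x
        out.append(x)
    return out
```

Averaging `zero_crossings(ar2(...))` over many seeds gives Monte Carlo estimates of the crossing moments against which analytic results can be checked.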
Computer analysis of Holter electrocardiogram.
Yanaga, T; Adachi, M; Sato, Y; Ichimaru, Y; Otsuka, K
1994-10-01
Computer analysis is indispensable for the interpretation of Holter ECG because of the large quantity of data involved. Computer analysis of Holter ECG is similar to that of the conventional ECG; however, it poses additional difficulties such as abundant noise, limited analysis time, and voluminous data. The main topics in computer analysis of Holter ECG are arrhythmias, ST-T changes, heart rate variability, QT interval, late potentials, and the construction of databases. Although many papers have been published on the computer analysis of Holter ECG, only some of them are reviewed briefly in the present paper. We have studied computer analysis of VPCs, ST-T changes, heart rate variability, QT interval, and Cheyne-Stokes respiration during 24-hour ambulatory ECG monitoring. We have also studied ambulatory palmar sweating for the evaluation of mental stress during the day. In the future, the development of "the integrated Holter system", which enables the evaluation of ventricular vulnerability and modulating factors such as psychoneural hypersensitivity, may be important.
Evaluation of Operational Procedures for Using a Time-Based Airborne Inter-arrival Spacing Tool
NASA Technical Reports Server (NTRS)
Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Abbott, Terence S.; Eischeid, Todd M.
2002-01-01
An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. The objectives of this study were to validate the results of a prior Monte Carlo analysis of the ATAAS algorithm and to evaluate the concept from the standpoint of pilot acceptability and workload. Results showed that the aircraft was able to consistently achieve the target spacing interval within one second (the equivalent of approximately 220 ft at a final approach speed of 130 kt) when the ATAAS speed guidance was autothrottle-coupled, and a slightly greater (4-5 seconds), but consistent interval with the pilot-controlled speed modes. The subject pilots generally rated the workload level with the ATAAS procedure as similar to that with standard procedures, and also rated most aspects of the procedure high in terms of acceptability. Although pilots indicated that the head-down time was higher with ATAAS, the acceptability of head-down time was rated high. Oculometer data indicated slight changes in instrument scan patterns, but no significant change in the amount of time spent looking out the window between the ATAAS procedure versus standard procedures.
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, each with a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of fiducial points, such as the P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis of real ECG data, these ECG feature values exhibit the property of randomness and can thus be utilized to generate random BSes. Compared with schemes that rely solely on IPIs to generate BSes, the MFBSG algorithm uses five feature values from one heartbeat cycle and can be up to five times faster than the solely IPI-based methods, achieving the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
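As a rough illustration of how beat-derived time intervals become key bits, the sketch below quantises each interval and keeps its least-significant bits, which carry most of the beat-to-beat variability. The encoding is an assumption for illustration; the paper's actual MFBSG quantisation is not specified here:

```python
def intervals_to_bits(intervals_ms, bits_per_interval=4):
    """Quantise each inter-fiducial time interval (in milliseconds) and
    emit its least-significant bits, LSB first, as key material.
    Illustrative encoding only, not the MFBSG specification."""
    out = []
    for interval in intervals_ms:
        q = int(round(interval))          # quantise to 1 ms resolution
        for k in range(bits_per_interval):
            out.append((q >> k) & 1)      # extract bit k
    return out
```

With five intervals per beat (RR, RQ, RS, RP, RT) instead of one IPI, each heartbeat contributes roughly five times as many candidate bits, which is the source of the speed-up claimed above.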
The solar wind effect on cosmic rays and solar activity
NASA Technical Reports Server (NTRS)
Fujimoto, K.; Kojima, H.; Murakami, K.
1985-01-01
The relation of cosmic ray intensity to solar wind velocity is investigated using neutron monitor data from Kiel and Deep River. The analysis shows that the regression coefficient of the average intensity over a time interval against the corresponding average velocity is negative and that its absolute value increases monotonically with the averaging interval tau, from -0.5% per 100 km/s for tau = 1 day to -1.1% per 100 km/s for tau = 27 days. For tau > 27 days the coefficient becomes almost constant, independent of the value of tau. The analysis also shows that this tau-dependence of the regression coefficient varies with solar activity.
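The tau-dependence described above can be reproduced in outline by block-averaging both series over tau days and regressing one on the other. A sketch with illustrative helpers (not the authors' analysis code):

```python
def block_average(xs, tau):
    """Average consecutive non-overlapping blocks of length tau (days)."""
    return [sum(xs[i:i + tau]) / tau
            for i in range(0, len(xs) - tau + 1, tau)]

def regression_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx
```

Computing `regression_slope(block_average(velocity, tau), block_average(intensity, tau))` for a range of tau values traces out the averaging-interval dependence of the coefficient.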
Interval Analysis Approach to Prototype the Robust Control of the Laboratory Overhead Crane
NASA Astrophysics Data System (ADS)
Smoczek, J.; Szpytko, J.; Hyla, P.
2014-07-01
The paper describes the software-hardware equipment and control-measurement solutions elaborated to prototype the laboratory-scale overhead crane control system. A novel approach to crane dynamic system modelling and fuzzy robust control scheme design is presented. An iterative procedure for designing a fuzzy scheduling control scheme is developed based on interval analysis of the discrete-time closed-loop characteristic polynomial coefficients, in the presence of variation in rope length and payload mass, to select the minimum set of operating points, corresponding to the midpoints of membership functions, at which the linear controllers are determined through desired pole assignment. Experimental results obtained on the laboratory stand are presented.
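Interval analysis of characteristic-polynomial coefficients can be screened by checking stability at the vertices of the coefficient box. The sketch below is only such a screening step and not the authors' procedure; note that for discrete-time (Schur) stability, vertex stability is necessary but not sufficient for the whole interval family:

```python
import itertools
import numpy as np

def schur_stable(coeffs):
    """True if all roots of the polynomial (highest degree first) lie
    strictly inside the unit circle (discrete-time stability)."""
    return bool(np.all(np.abs(np.roots(coeffs)) < 1.0))

def vertex_check(intervals):
    """Check Schur stability at every vertex of the coefficient box,
    where `intervals` is a list of (low, high) bounds per coefficient.
    Failing any vertex rules the interval family out; passing all
    vertices is only a necessary condition."""
    for vertex in itertools.product(*intervals):
        if not schur_stable(list(vertex)):
            return False
    return True
```

In a gain-scheduling design, such a check can be run at each candidate operating point as the payload-dependent coefficient intervals change.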
Tourism climate and thermal comfort in Sun Moon Lake, Taiwan.
Lin, Tzu-Ping; Matzarakis, Andreas
2008-03-01
Bioclimate conditions at Sun Moon Lake, one of Taiwan's most popular tourist destinations, are presented. Existing tourism-related climate information is typically based on mean monthly conditions of air temperature and precipitation and excludes the thermal perception of tourists. This study presents a more detailed analysis of tourism climate by using a modified thermal comfort range for both Taiwanese and Western/Middle European conditions, presented through frequency analysis of 10-day intervals. Furthermore, an integrated approach (climate tourism information scheme) is applied to present the frequencies of each facet under particular criteria for each 10-day interval, generating a time series of climate data with temporal resolution suitable for tourists and tourism authorities.
Faydasicok, Ozlem; Arik, Sabri
2013-08-01
The main problem in the analysis of robust stability of neural networks is to find an upper bound norm for the intervalized interconnection matrices of the network. In the previous literature, three major upper bound norms for intervalized interconnection matrices have been reported and successfully applied to derive new sufficient conditions for robust stability of delayed neural networks. One of the main contributions of this paper is the derivation of a new upper bound for the norm of the intervalized interconnection matrices of neural networks. By exploiting this new upper bound norm of interval matrices, and using the stability theory of Lyapunov functionals and the theory of homeomorphic mapping, we obtain new sufficient conditions for the existence, uniqueness and global asymptotic stability of the equilibrium point for the class of neural networks with discrete time delays under parameter uncertainties and with respect to continuous and slope-bounded activation functions. The results obtained in this paper are shown to be new and can be considered alternatives to previously published corresponding results. We also give some illustrative and comparative numerical examples to demonstrate the effectiveness and applicability of the proposed robust stability condition. Copyright © 2013 Elsevier Ltd. All rights reserved.
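For an interval matrix family A in [A_min, A_max], one classical upper bound on the spectral norm splits A into a centre matrix and a radius matrix. The sketch below shows that textbook bound, not necessarily the new bound derived in the paper:

```python
import numpy as np

def interval_matrix_norm_bound(a_min, a_max):
    """Classical upper bound on the spectral norm of any A with
    a_min <= A <= a_max (elementwise): ||A|| <= ||A_c|| + ||A_d||,
    where A_c = (a_min + a_max)/2 is the centre matrix and
    A_d = (a_max - a_min)/2 is the (nonnegative) radius matrix."""
    a_c = (np.asarray(a_min, dtype=float) + np.asarray(a_max, dtype=float)) / 2.0
    a_d = (np.asarray(a_max, dtype=float) - np.asarray(a_min, dtype=float)) / 2.0
    return np.linalg.norm(a_c, 2) + np.linalg.norm(a_d, 2)
```

Tighter bounds of this kind directly sharpen the sufficient stability conditions, since the bound enters the Lyapunov-based inequalities in place of the unknown worst-case interconnection norm.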
A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications
Austin, Peter C.
2017-01-01
Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack, and using three statistical programming languages (R, SAS and Stata).
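The piecewise exponential approach rests on expanding each subject's follow-up into interval-specific records with exposure offsets, which is what makes it fittable as a Poisson regression. A sketch of that data expansion (illustrative, not the tutorial's code):

```python
def person_period(time, event, cut_points):
    """Split one subject's follow-up into interval records for a
    piecewise exponential model. Each row is (interval index,
    exposure time within the interval, event indicator), suitable
    for a Poisson GLM with a log(exposure) offset."""
    rows, start = [], 0.0
    for j, end in enumerate(cut_points):
        if time <= start:           # follow-up ended before this interval
            break
        exposure = min(time, end) - start
        died = 1 if (event and time <= end) else 0  # event in this interval
        rows.append((j, exposure, died))
        start = end
    return rows
```

Regressing the event indicator on interval dummies (plus covariates and cluster random effects) with the log of exposure as an offset then reproduces the piecewise constant hazard model described above.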
Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks.
Song, Caixia
2017-12-12
Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmissions, while the non-safety applications require efficiency and high throughput. In the IEEE 1609.4 protocol, the operating interval is divided into alternating Control Channel (CCH) and Service Channel (SCH) intervals of identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and the Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station, with different priorities according to their criticality for the vehicle's safety. During the SCH interval, the non-safety messages are transmitted. An analytical model is proposed in this paper to evaluate the performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves on existing work by taking several aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation validate the accuracy of the proposed model, analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and lead to enhancement suggestions.
Emergency Department Overcrowding and Ambulance Turnaround Time
Lee, Yu Jin; Shin, Sang Do; Lee, Eui Jung; Cho, Jin Seong; Cha, Won Chul
2015-01-01
Objective The aims of this study were to describe overcrowding in regional emergency departments in Seoul, Korea and evaluate the effect of crowding on ambulance turnaround time. Methods This study was conducted between January 2010 and December 2010. Patients who were transported by 119-responding ambulances to 28 emergency centers within Seoul were eligible for enrollment. Overcrowding was defined as the average occupancy rate, which was equal to the average number of patients staying in an emergency department (ED) for 4 hours divided by the number of beds in the ED. After selecting groups for final analysis, multi-level regression modeling (MLM) was performed with random effects for EDs to evaluate associations between occupancy rate and turnaround time. Results Between January 2010 and December 2010, 163,659 patients transported to 28 EDs were enrolled. The median occupancy rate was 0.42 (range: 0.10-1.94; interquartile range (IQR): 0.20-0.76). Overcrowded EDs were more likely to have older patients, patients with normal mentality, and non-trauma patients. Overcrowded EDs were also more likely to have longer turnaround intervals and traveling distances. The MLM analysis showed that an increase of 1% in occupancy rate was associated with a 0.02-minute decrease in turnaround interval (95% CI: 0.01 to 0.03). In subgroup analyses limited to EDs with occupancy rates over 100%, we also observed a 0.03-minute decrease in turnaround interval per 1% increase in occupancy rate (95% CI: 0.01 to 0.05). Conclusions In this study, we found wide variation in emergency department crowding in a metropolitan Korean city. Our data indicate that ED overcrowding is negatively associated with turnaround interval, with very small practical significance.
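The occupancy-rate definition above (average number of patients present divided by bed count) can be sketched directly; the function name and the sampling scheme over discrete time points are illustrative assumptions:

```python
def occupancy_rate(stays, sample_times, n_beds):
    """ED occupancy rate: average census (patients present at each
    sampled time) divided by the number of ED beds. `stays` is a list
    of (arrival, departure) time pairs in the same units as
    `sample_times`."""
    census = [sum(1 for arrival, departure in stays
                  if arrival <= t < departure)
              for t in sample_times]
    return sum(census) / len(census) / n_beds
```

A rate above 1.0, as seen at the upper end of the reported range (1.94), means more patients are present on average than the ED has beds.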
Department of Defense Precise Time and Time Interval program improvement plan
NASA Technical Reports Server (NTRS)
Bowser, J. R.
1981-01-01
The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval (PTTI) operations, including measurements, for establishing overall DOD requirements for time and time interval, and for accomplishing objectives requiring precise time and time interval at minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report are presented, including significant findings on organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users.
Recurrence Interval and Event Age Data for Type A Faults
Dawson, Timothy E.; Weldon, Ray J.; Biasi, Glenn P.
2008-01-01
This appendix summarizes available recurrence interval, event age, and timing of most recent event data for Type A faults considered in the Earthquake Rate Model 2 (ERM 2) and used in the ERM 2 Appendix C analysis as well as Appendix N (time-dependent probabilities). These data have been compiled into an Excel workbook named Appendix B A-fault event ages_recurrence_V5.0 (herein referred to as the Appendix B workbook). For convenience, the Appendix B workbook is attached to the end of this document as a series of tables. The tables within the Appendix B workbook include site locations, event ages, and recurrence data, and in some cases the interval of time between earthquakes is also reported. The Appendix B workbook is organized as individual worksheets, with each worksheet named by fault and paleoseismic site. Each worksheet contains the site location in latitude and longitude, information on event ages, and a summary of recurrence data. Because the data have been compiled from different sources with different presentation styles, the contents of each worksheet within the Appendix B workbook are described.
NASA Technical Reports Server (NTRS)
Stano, Geoffrey T.; Fuelberg, Henry E.; Roeder, William P.
2010-01-01
This research addresses the 45th Weather Squadron's (45WS) need for improved guidance regarding lightning cessation at Cape Canaveral Air Force Station and Kennedy Space Center (KSC). KSC's Lightning Detection and Ranging (LDAR) network was the primary observational tool to investigate both cloud-to-ground and intracloud lightning. Five statistical and empirical schemes were created from LDAR, sounding, and radar parameters derived from 116 storms. Four of the five schemes were unsuitable for operational use since lightning advisories would be canceled prematurely, leading to safety risks to personnel. These include a correlation and regression tree analysis, three variants of multiple linear regression, event time trending, and the time delay between the greatest height of the maximum dBZ value to the last flash. These schemes failed to adequately forecast the maximum interval, the greatest time between any two flashes in the storm. The majority of storms had a maximum interval less than 10 min, which biased the schemes toward small values. Success was achieved with the percentile method (PM) by separating the maximum interval into percentiles for the 100 dependent storms.
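The maximum interval (greatest time between any two consecutive flashes in a storm) and the percentile thresholds underlying the percentile method can be sketched as follows; these are illustrative helpers, not the 45WS operational code:

```python
def max_interval(flash_times):
    """Greatest time between any two consecutive flashes in a storm."""
    ts = sorted(flash_times)
    return max(b - a for a, b in zip(ts, ts[1:]))

def percentile(values, p):
    """Simple nearest-rank percentile of a list of values."""
    vs = sorted(values)
    k = max(0, min(len(vs) - 1, int(round(p / 100.0 * len(vs))) - 1))
    return vs[k]
```

Computing `percentile([max_interval(s) for s in storms], 95)`, for example, gives a wait time that would have safely outlasted 95% of the dependent storms, which is the kind of guidance the percentile method provides.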
Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete
2015-01-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits, a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst ≥ 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having −Dst ≥ 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT.
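The log-normal maximum-likelihood fit amounts to taking the mean and standard deviation of the log-transformed maxima, and the bootstrap resamples the data to put confidence limits on an exceedance probability. A hedged sketch with made-up storm maxima (the actual 1957-2012 −Dst record and the paper's weighting are not reproduced here):

```python
import math
import random

def fit_lognormal(xs):
    """Maximum-likelihood log-normal fit: mean and std of the logs."""
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)

def exceedance(x, mu, sigma):
    """P(X >= x) for a log-normal with parameters mu, sigma."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative storm-time maxima in nT; NOT the real -Dst series.
maxima = [150, 170, 200, 220, 250, 280, 320, 360, 420, 589]
mu, sigma = fit_lognormal(maxima)
p850 = exceedance(850, mu, sigma)  # chance a storm maximum exceeds 850 nT

# Percentile bootstrap for a 95% confidence interval on p850.
random.seed(1)
boots = []
for _ in range(2000):
    sample = [random.choice(maxima) for _ in maxima]
    m, s = fit_lognormal(sample)
    if s > 0:  # guard against a degenerate resample
        boots.append(exceedance(850, m, s))
boots.sort()
ci = (boots[int(0.025 * len(boots))], boots[int(0.975 * len(boots))])
```

Converting `p850` to an occurrence rate per century then only requires multiplying by the number of storms observed per century.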
Surrogate Analysis and Index Developer (SAID) tool
Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.
2015-10-01
The regression models created in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water quality and associated engineering and ecological management decisions.
Kwon, Yong Hyun; Kwon, Jung Won; Lee, Myoung Hee
2015-01-01
[Purpose] The purpose of the current study was to compare the effectiveness of motor sequential learning under two different practice schedules, a distributed practice schedule (two 12-hour inter-trial intervals) and a massed practice schedule (two 10-minute inter-trial intervals), using a serial reaction time (SRT) task. [Subjects and Methods] Thirty healthy subjects were recruited and then randomly and evenly assigned to either the distributed practice group or the massed practice group. All subjects performed three consecutive sessions of the SRT task following one of the two practice schedules. Distributed practice was scheduled with two 12-hour inter-session intervals including sleeping time, whereas massed practice was administered with two 10-minute inter-session intervals. Response time (RT) and response accuracy (RA) were measured at pre-test, mid-test, and post-test. [Results] For RT, univariate analysis demonstrated significant main effects in the within-group comparison of the three tests as well as the interaction effect of two groups × three tests, whereas the between-group comparison showed no significant effect. The results for RA showed no significant differences in either the between-group comparison or the interaction effect of two groups × three tests, whereas the within-group comparison of the three tests showed a significant main effect. [Conclusion] Distributed practice led to greater enhancement of motor skill acquisition at the first inter-session interval as well as at the second inter-session interval the following day, compared with massed practice. Consequently, the results of this study suggest that a distributed practice schedule can enhance the effectiveness of motor sequential learning in one-day as well as two-day learning formats compared with massed practice. PMID:25931727
van Zaane, Bas; van Klei, Wilton A; Buhre, Wolfgang F; Bauer, Peter; Boerma, E Christiaan; Hoeft, Andreas; Metnitz, Philipp; Moreno, Rui P; Pearse, Rupert; Pelosi, Paolo; Sander, Michael; Vallet, Benoit; Pettilä, Ville; Vincent, Jean-Louis; Rhodes, Andrew
2015-07-01
Evidence suggests that sleep deprivation associated with night-time working may adversely affect performance, resulting in a reduction in the safety of surgery and anaesthesia. Our primary objective was to evaluate an association between nonelective night-time surgery and in-hospital mortality. We hypothesised that urgent surgery performed during the night was associated with higher in-hospital mortality and also with an increase in the duration of hospital stay and the number of admissions to critical care. A prospective cohort study. This is a secondary analysis of a large database related to perioperative care and outcome (European Surgical Outcome Study). Four hundred and ninety-eight hospitals in 28 European countries. Men and women older than 16 years who underwent nonelective, noncardiac surgery were included according to the time of the procedure. None. The primary outcome was in-hospital mortality; the secondary outcomes were the duration of hospital stay and critical care admission. Eleven thousand two hundred and ninety patients undergoing urgent surgery were included in the analysis, with 636 in-hospital deaths (5.6%). Crude mortality odds ratios (ORs) increased sequentially from daytime [426 deaths (5.3%)] to evening [150 deaths (6.0%), OR 1.14; 95% confidence interval 0.94 to 1.38] to night-time [60 deaths (8.3%), OR 1.62; 95% confidence interval 1.22 to 2.14]. Following adjustment for confounding factors, surgery during the evening (OR 1.09; 95% confidence interval 0.91 to 1.31) and night (OR 1.20; 95% confidence interval 0.9 to 1.6) was not associated with an increased risk of postoperative death. Admission rates to an ICU increased sequentially from daytime [891 (11.1%)] to evening [347 (13.8%)] to night-time [149 (20.6%)].
In patients undergoing nonelective urgent noncardiac surgery, in-hospital mortality was associated with well known risk factors related to patients and surgery, but we did not identify any relationship with the time of day at which the procedure was performed. Clinicaltrials.gov identifier: NCT01203605.
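The crude night-time odds ratio and its Wald confidence interval can be reproduced from the reported figures: 60/8.3% ≈ 723 night-time patients and 426/5.3% ≈ 8038 daytime patients (the denominators are back-calculated here from the percentages, not stated in the abstract). A sketch:

```python
import math

def odds_ratio_ci(a, n1, b, n2, z=1.96):
    """Crude odds ratio of group 1 vs group 2 with a Wald 95% CI.
    a deaths out of n1 in group 1; b deaths out of n2 in group 2."""
    c, d = n1 - a, n2 - b
    or_ = (a / c) / (b / d)
    se = math.sqrt(1 / a + 1 / c + 1 / b + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Night-time vs daytime urgent surgery; denominators back-calculated
# from the reported death percentages (60/0.083 ~ 723, 426/0.053 ~ 8038).
or_, lo, hi = odds_ratio_ci(60, 723, 426, 8038)
# round(or_, 2), round(lo, 2), round(hi, 2) -> 1.62, 1.22, 2.14 as reported
```

This recovers the published crude OR of 1.62 (95% CI 1.22 to 2.14) for night-time surgery.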
Stochastic modeling of a serial killer
Simkin, M.V.; Roychowdhury, V.P.
2014-01-01
We analyze the time pattern of the activity of a serial killer, who during twelve years had murdered 53 people. The plot of the cumulative number of murders as a function of time is of “Devil’s staircase” type. The distribution of the intervals between murders (step length) follows a power law with the exponent of 1.4. We propose a model according to which the serial killer commits murders when neuronal excitation in his brain exceeds certain threshold. We model this neural activity as a branching process, which in turn is approximated by a random walk. As the distribution of the random walk return times is a power law with the exponent 1.5, the distribution of the inter-murder intervals is thus explained. We illustrate analytical results by numerical simulation. Time pattern activity data from two other serial killers further substantiate our analysis. PMID:24721476
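The 3/2 power law for random-walk return times cited above can be checked exactly: for a symmetric ±1 walk, the probability of a first return to the origin at step 2n is C(2n, n)/((2n−1)·4^n), and the local log-log slope of this distribution tends to −3/2. A quick numerical check (independent of the paper's own simulations):

```python
import math

def log_first_return(n):
    """log of P(first return to origin at step 2n) for a symmetric walk:
    P = C(2n, n) / ((2n - 1) * 4**n), computed via log-gamma."""
    return (math.lgamma(2 * n + 1) - 2 * math.lgamma(n + 1)
            - math.log(2 * n - 1) - n * math.log(4))

# Exact small cases: P(return at step 2) = 1/2, P(return at step 4) = 1/8.
p2 = math.exp(log_first_return(1))
p4 = math.exp(log_first_return(2))

# Local slope of log P vs log t at large t approaches -3/2, i.e. the
# power-law exponent 1.5 quoted for the random-walk return times.
t1, t2 = 2 * 999, 2 * 1000
slope = (log_first_return(1000) - log_first_return(999)) / (math.log(t2) - math.log(t1))
```

The finite-n slope carries an O(1/n) correction, so at n = 1000 it sits within about 0.001 of −1.5.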
Are EUR and GBP different words for the same currency?
NASA Astrophysics Data System (ADS)
Ivanova, K.; Ausloos, M.
2002-05-01
The British Pound (GBP) is not part of the Euro (EUR) monetary system. To provide arguments on whether the GBP should join the EUR or not, correlations are calculated between GBP exchange rates with respect to various currencies: USD, JPY, CHF, DKK, the currencies forming the EUR, and a reconstructed EUR, for the time interval from 1993 until June 30, 2000. The distribution of fluctuations of the exchange rates is Gaussian in the central part of the distribution but has fat tails for large fluctuations. Using the detrended fluctuation analysis (DFA) statistical method, the power-law behavior describing the root-mean-square deviation of the exchange rate fluctuations from a linear trend is obtained as a function of time for the interval of interest. The time-dependent evolution of the exponent of the exchange rate fluctuations is given. Statistical considerations imply that the GBP is already behaving as a true EUR.
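DFA, as used above, integrates the demeaned series, detrends it in boxes of size s, and extracts a scaling exponent from the growth of the root-mean-square fluctuation F(s). A self-contained sketch in plain Python, applied to synthetic white noise rather than exchange-rate returns (for white noise the exponent should be near 0.5):

```python
import math
import random

def _linear_residuals(seg):
    """Residual sum of squares after least-squares fitting a line to seg."""
    n = len(seg)
    ts = range(n)
    st, stt = sum(ts), sum(t * t for t in ts)
    sy = sum(seg)
    sty = sum(t * y for t, y in zip(ts, seg))
    b = (n * sty - st * sy) / (n * stt - st * st)
    a = (sy - b * st) / n
    return sum((y - (a + b * t)) ** 2 for t, y in zip(ts, seg))

def dfa_exponent(x, scales):
    """Slope of log F(s) vs log s, where F(s) is the RMS fluctuation
    of the integrated, box-wise detrended profile."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:
        acc += v - mean
        profile.append(acc)
    pts = []
    for s in scales:
        nseg = len(profile) // s
        rss = sum(_linear_residuals(profile[i * s:(i + 1) * s]) for i in range(nseg))
        pts.append((math.log(s), math.log(math.sqrt(rss / (nseg * s)))))
    # least-squares slope of log F vs log s gives the DFA exponent
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    return (n * sxy - sx * sy) / (n * sxx - sx ** 2)

random.seed(42)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_exponent(noise, [8, 16, 32, 64, 128, 256])  # expect ~0.5
```

An exponent near 0.5 indicates uncorrelated increments; persistent long-range correlations push it above 0.5.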
The incidence of total hip arthroplasty after hip arthroscopy in osteoarthritic patients
2010-01-01
Objective: To assess the incidence of total hip arthroplasty (THA) in osteoarthritic patients who were treated by arthroscopic debridement and to evaluate factors that might influence the time interval from the first hip arthroscopy to THA. Design: Retrospective clinical series. Methods: Follow-up data and surgical reports were retrieved from 564 records of osteoarthritic patients who underwent hip arthroscopy between 2002 and 2009, with a mean follow-up time of 3.2 years (range, 1-6.4 years). The time interval between the first hip arthroscopy and THA was modelled as a function of patient age, level of cartilage damage, procedures performed, and repeated arthroscopies with the use of multivariate regression analysis. Results: Ninety (16%) of all participants eventually required THA. The waiting time from the first arthroscopy to a hip replacement was longer in patients younger than 55 years and in those at a milder osteoarthritic stage. Patients who underwent repeated hip arthroscopies had a longer time to THA than those with only a single procedure. Procedures performed concomitant with debridement and lavage did not affect the time interval to THA. Conclusions: In our series of arthroscopic treatment of hip osteoarthritis, 16% required THA over a period of 7 years. Factors that influenced the time to arthroplasty were age, degree of osteoarthritis, and recurrent procedures. PMID:20670440
Chatzidionysiou, Katerina; Lie, Elisabeth; Lukina, Galina; Hetland, Merete L; Hauge, Ellen-Margrethe; Pavelka, Karel; Gabay, Cem; Scherer, Almut; Nordström, Dan; Canhao, Helena; Santos, Maria José; Tomsic, Matija; Rotar, Ziga; Hernández, M Victoria; Gomez-Reino, Juan; Ancuta, Ioan; Kvien, Tore K; van Vollenhoven, Ronald
2017-02-01
Several aspects of rituximab (RTX) retreatment in rheumatoid arthritis (RA) need to be further elucidated. The aim of this study was to describe the effect of repeated courses of RTX on disease activity and to compare 2 retreatment strategies, fixed-interval versus on-flare retreatment, in a large international, observational, collaborative study. In the first analysis, patients with RA who received at least 4 cycles of RTX were included. In the second analysis, patients who received at least 1 RTX retreatment and for whom information about the retreatment strategy was available were identified. The two retreatment strategies (fixed-interval vs on-flare) were compared by fitting adjusted mixed-effects models of the 28-joint Disease Activity Score (DAS28) over time for the first and second retreatments. A total of 1530 patients met the eligibility criteria for the first analysis. Significant reductions of mean DAS28 between the starts of subsequent treatment cycles were observed (at start of first treatment cycle: 5.5; second: 4.3; third: 3.8; and fourth: 3.5), suggesting improved response after each additional cycle (p < 0.0001 for all pairwise comparisons). A total of 800 patients qualified for the second analysis: 616 were retreated on flare and 184 at a fixed interval. For the first retreatment, the fixed-interval retreatment group yielded significantly better results than the on-flare group (estimated marginal mean DAS28 = 3.8, 95% CI 3.6-4.1 vs 4.6, 95% CI 4.5-4.7, p < 0.0001). Similar results were found for the second retreatment. Repeated treatment with RTX leads to further clinical improvement after the first course of RTX. A fixed-interval retreatment strategy appears to be more effective than on-flare retreatment.
Interresponse Time Structures in Variable-Ratio and Variable-Interval Schedules
ERIC Educational Resources Information Center
Bowers, Matthew T.; Hill, Jade; Palya, William L.
2008-01-01
The interresponse-time structures of pigeon key pecking were examined under variable-ratio, variable-interval, and variable-interval plus linear feedback schedules. Whereas the variable-ratio and variable-interval plus linear feedback schedules generally resulted in a distinct group of short interresponse times and a broad distribution of longer…
Does controlling for biological maturity improve physical activity tracking?
Erlandson, Marta C; Sherar, Lauren B; Mosewich, Amber D; Kowalski, Kent C; Bailey, Donald A; Baxter-Jones, Adam D G
2011-05-01
Tracking of physical activity through childhood and adolescence tends to be low. Variation in the timing of biological maturation among youth of the same chronological age (CA) might affect participation in physical activity and may partially explain the low tracking. The aim was to examine the stability of physical activity over time from childhood to late adolescence when aligned on CA and biological age (BA). A total of 91 males and 96 females aged 8-15 yr from the Saskatchewan Pediatric Bone Mineral Accrual Study (PBMAS) were assessed annually for 8 yr. BA was calculated as years from age at peak height velocity. Physical activity was assessed using the Physical Activity Questionnaire for Children/Adolescents. Tracking was analyzed using intraclass correlations for both CA and BA (2-yr groupings). To be included in the analysis, an individual required a measure at both time points within an interval; however, not all individuals were present at all tracking intervals. Physical activity tracking by CA 2-yr intervals was, in general, moderate in males (r=0.42-0.59) and females (r=0.43-0.44). However, the 9- to 11-yr CA interval was low and nonsignificant (r=0.23-0.30). Likewise, tracking of physical activity by BA 2-yr intervals was moderate to high in males (r=0.44-0.60) and females (r=0.39-0.62). Accounting for differences in the timing of biological maturity had little effect on the tracking of physical activity. However, point estimates for tracking were higher in early adolescence in males, and to a greater extent in females, when aligned by BA versus CA. This suggests that maturity may be more important to physical activity participation in females than in males. © 2011 by the American College of Sports Medicine
NASA Technical Reports Server (NTRS)
Orwig, L. E.
1971-01-01
Data from a balloon flight experiment using an NaI scintillation spectrometer were analyzed for gamma ray pulsation. The payload was carried to a nominal atmospheric depth of 3.5 g/cm2. A superposed epoch analysis was performed on a 12,000 second portion of the data spanning a total time interval of 16,700 seconds at float altitude. A positive pulsed contribution was observed at the expected apparent frequency of NP 0532, having the typical double pulse structure. The results indicate a time-averaged pulse photon flux of 0.00144 ± 0.00057 photons/sq cm/sec in the energy interval from 250 keV to 2.3 MeV. This represents a time-averaged pulsed power of 0.000649 ± 0.000257 keV/sq cm/sec/keV. The ratio of interpulse to main pulse was found to be 2.4 ± 1.9. The analysis indicates a positive photon flux from 250 keV to 725 keV of 0.00120 ± 0.00052 photons/sq cm/sec.
Not All Prehospital Time is Equal: Influence of Scene Time on Mortality
Brown, Joshua B.; Rosengart, Matthew R.; Forsythe, Raquel M.; Reynolds, Benjamin R.; Gestring, Mark L.; Hallinan, William M.; Peitzman, Andrew B.; Billiar, Timothy R.; Sperry, Jason L.
2016-01-01
Background Trauma is time-sensitive, and minimizing prehospital (PH) time is appealing. However, most studies have not linked increasing PH time with worse outcomes, as raw PH times are highly variable. It is unclear whether specific PH time patterns affect outcomes. Our objective was to evaluate the association of PH time interval distribution with mortality. Methods Patients transported by EMS in the Pennsylvania trauma registry 2000-2013 with total prehospital time (TPT) ≥20 min were included. TPT was divided into three PH time intervals: response, scene, and transport time. The number of minutes in each PH time interval was divided by TPT to determine the relative proportion each interval contributed to TPT. A prolonged interval was defined as any one PH interval contributing ≥50% of TPT. Patients were classified by prolonged PH interval or no prolonged PH interval (all intervals <50% of TPT). Patients were matched for TPT, and conditional logistic regression determined the association of mortality with PH time pattern, controlling for confounders. PH interventions were explored as potential mediators, and prehospital triage criteria were used to identify patients with time-sensitive injuries. Results There were 164,471 patients included. Patients with prolonged scene time had increased odds of mortality (OR 1.21; 95%CI 1.02–1.44, p=0.03). Prolonged response, prolonged transport, and no prolonged interval were not associated with mortality. When adjusting for mediators including extrication and PH intubation, prolonged scene time was no longer associated with mortality (OR 1.06; 0.90–1.25, p=0.50). Together these factors mediated 61% of the effect between prolonged scene time and mortality. Mortality remained associated with prolonged scene time in patients with hypotension, penetrating injury, and flail chest. Conclusions Prolonged scene time is associated with increased mortality. PH interventions partially mediate this association.
Further study should evaluate whether these interventions drive increased mortality because they prolong scene time or by another mechanism, as reducing scene time may be a target for intervention. Level of Evidence IV, prognostic study PMID:26886000
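The time-pattern classification used in the study above is simple to state: divide each interval's minutes by total prehospital time and flag any interval contributing at least half. A sketch of that bookkeeping (thresholds from the abstract; this is illustrative, not the registry's analysis code):

```python
def classify_ph_pattern(response, scene, transport, threshold=0.5):
    """Return which prehospital interval (if any) is 'prolonged',
    i.e. contributes >= threshold of total prehospital time (TPT).
    Arguments are interval durations in minutes."""
    tpt = response + scene + transport
    shares = {"response": response / tpt,
              "scene": scene / tpt,
              "transport": transport / tpt}
    for name, share in shares.items():
        if share >= threshold:
            return name
    return "none"

pattern_a = classify_ph_pattern(10, 30, 15)  # scene is 30/55 of TPT
pattern_b = classify_ph_pattern(10, 10, 10)  # no interval reaches 50%
```

At most one interval can reach 50% of TPT, so the first match is the only possible one.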
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
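The percentile interval discussed above simply takes the 2.5th and 97.5th percentiles of a statistic recomputed over bootstrap resamples. A minimal sketch using a Pearson correlation in place of a rotated factor loading (same resampling mechanics, simpler statistic; data are synthetic):

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

random.seed(7)
x = [float(i) for i in range(30)]
y = [xi + random.gauss(0.0, 4.0) for xi in x]  # linear signal plus noise
r_hat = pearson_r(x, y)

# Percentile bootstrap: resample (x, y) pairs, recompute r, take quantiles.
boots = []
for _ in range(2000):
    idx = [random.randrange(30) for _ in range(30)]
    boots.append(pearson_r([x[i] for i in idx], [y[i] for i in idx]))
boots.sort()
ci = (boots[int(0.025 * 2000)], boots[int(0.975 * 2000)])
```

The SE-based and bias-corrected variants compared in the article differ only in how these resampled statistics are turned into interval endpoints.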
Fluctuations of hi-hat timing and dynamics in a virtuoso drum track of a popular music recording.
Räsänen, Esa; Pulkkinen, Otto; Virtanen, Tuomas; Zollner, Manfred; Hennig, Holger
2015-01-01
Long-range correlated temporal fluctuations in the beats of musical rhythms are an inevitable consequence of human action. According to recent studies, such fluctuations also lead to a favored listening experience. The scaling laws of amplitude variations in rhythms, however, are widely unknown. Here we use highly sensitive onset detection and time series analysis to study the amplitude and temporal fluctuations of Jeff Porcaro's one-handed hi-hat pattern in "I Keep Forgettin'", one of the most renowned 16th-note patterns in modern drumming. We show that fluctuations of hi-hat amplitudes and interbeat intervals (times between hits) have clear long-range correlations and short-range anticorrelations separated by a characteristic time scale. In addition, we detect subtle features in Porcaro's drumming, such as small drifts in the 16th-note pulse and non-trivial periodic two-bar patterns in both hi-hat amplitudes and intervals. Through this investigation we introduce a step towards statistical studies of 20th and 21st century music recordings in the framework of complex systems. Our analysis has direct applications to the development of drum machines and to drumming pedagogy.
Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro
2004-05-01
A delayed coincidence method, time-interval analysis (TIA), has been applied to successive alpha-alpha decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn→²¹⁶Po (T1/2 = 145 ms) (Th series) and ²¹⁹Rn→²¹⁵Po (T1/2 = 1.78 ms) (Ac series) decays. By using TIA in addition to measurement of ²²⁶Ra (U series) from alpha-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject beta pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U and Th series together with the Ac series were determined, respectively, from alpha spectra and TIA carried out immediately after Ra extraction. Using the ²²¹Fr→²¹⁷At (T1/2 = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to ²²⁵Ra (Np decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples.
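In list mode, the core of TIA amounts to scanning consecutive event pairs for intervals short enough to be a correlated parent-daughter pair such as ²¹⁹Rn→²¹⁵Po (T1/2 ≈ 1.78 ms). A toy sketch of that pairing on synthetic timestamps (the real analysis must also correct for chance coincidences and detection efficiency):

```python
def count_fast_pairs(timestamps, window):
    """Count consecutive-event intervals shorter than `window` seconds:
    the candidate correlated (parent-daughter) decay pairs."""
    pairs = 0
    for t0, t1 in zip(timestamps, timestamps[1:]):
        if t1 - t0 < window:
            pairs += 1
    return pairs

# Synthetic list-mode data: two fast alpha-alpha pairs buried among
# uncorrelated background events (times in seconds).
events = [0.0, 0.5, 0.5002, 3.0, 3.004, 7.0]
n_pairs = count_fast_pairs(events, window=0.010)  # 10 ms acceptance window
```

Choosing the window as a few half-lives of the daughter trades collection efficiency against chance-coincidence background.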
NASA Astrophysics Data System (ADS)
Winkelstern, I. Z.; Surge, D. M.
2010-12-01
Pliocene sea surface temperature (SST) data from the US Atlantic coastal plain is currently insufficient for a detailed understanding of the climatic shifts that occurred during the period. Previous studies, based on oxygen isotope proxy data from marine shells and bryozoan zooid size analysis, have provided constraints on possible annual-scale SST ranges for the region. However, more data are required to fully understand the forcing mechanisms affecting regional Pliocene climate and evaluate modeled temperature projections. Bivalve sclerochronology (growth increment analysis) is an alternative proxy for SST that can provide annually resolved multi-year time series. The method has been validated in previous studies using modern Arctica, Chione, and Mercenaria. We analyzed Pliocene Mercenaria carolinensis shells using sclerochronologic methods and tested the hypothesis that higher SST ranges are reflected in shells selected from the warmest climate interval (3.5-3.3 Ma, upper Yorktown Formation, Virginia) and lower SST ranges are observable in shells selected from the subsequent cooling interval (2.4-1.8 Ma, Chowan River Formation, North Carolina). These results further establish the validity of growth increment analysis using fossil shells and provide the first large dataset (from the region) of reconstructed annual SST from floating time series during these intervals. These data will enhance our knowledge about a warm climate state that has been identified in the 2007 IPCC report as an analogue for expected global warming. Future work will expand this study to include sampling in Florida to gain detailed information about Pliocene SST along a latitudinal gradient.
Choi, Hyang-Ki; Jung, Jin Ah; Fujita, Tomoe; Amano, Hideki; Ghim, Jong-Lyul; Lee, Dong-Hwan; Tabata, Kenichi; Song, Il-Dae; Maeda, Mika; Kumagai, Yuji; Mendzelevski, Boaz; Shin, Jae-Gook
2016-12-01
The goal of this study was to evaluate moxifloxacin-induced QT interval prolongation in healthy male and female Korean and Japanese volunteers to investigate interethnic differences. This multicenter, randomized, double-blind, placebo-controlled, 2-way crossover study was conducted in healthy male and female Korean and Japanese volunteers. In each period, a single 400 mg dose of moxifloxacin or placebo was administered orally under fasting conditions. Triplicate 12-lead ECGs were recorded at defined time points before and up to 24 hours after dosing, and at corresponding time points during baseline. Serial blood sampling was conducted for pharmacokinetic analysis of moxifloxacin. The pharmacokinetic-pharmacodynamic data of the 2 ethnic groups were compared by using a typical analysis based on the intersection-union test and a nonlinear mixed-effects method. A total of 39 healthy subjects (Korean, male: 10, female: 10; Japanese, male: 10, female: 9) were included in the analysis. The concentration-effect analysis revealed that there was no change in slope (and confirmed that the difference was caused by a change in the pharmacokinetic model of moxifloxacin). A 2-compartment model with first-order absorption provided the best description of moxifloxacin's pharmacokinetic parameters. Weight and sex were selected as significant covariates for central volume of distribution and intercompartmental clearance, respectively. An Emax model (E(C) = Emax·C/(EC50 + C)) described the QT interval data of this study well. However, ethnicity was not found to be a significant factor in the pharmacokinetic-pharmacodynamic link model. The drug-induced QTc prolongations evaluated using moxifloxacin as the probe did not seem to be significantly different between these Korean and Japanese subjects. ClinicalTrials.gov identifier: NCT01876316. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
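The Emax model quoted in the abstract, E(C) = Emax·C/(EC50 + C), gives half the maximal effect at C = EC50 and saturates at Emax. A one-function sketch (parameter values illustrative, not the study's fitted estimates):

```python
def emax_model(c, emax, ec50):
    """Hyperbolic Emax concentration-effect model: E = Emax*C/(EC50 + C)."""
    return emax * c / (ec50 + c)

# At C = EC50 the model predicts exactly half the maximal QT effect,
# and the effect approaches Emax at high concentration.
half = emax_model(10.0, emax=12.0, ec50=10.0)      # half-maximal effect
near_max = emax_model(1e6, emax=12.0, ec50=10.0)   # approaches emax
```

In the pharmacokinetic-pharmacodynamic link, C would be the model-predicted moxifloxacin concentration at each ECG time point.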
Fransson, Eleonor I; Heikkilä, Katriina; Nyberg, Solja T; Zins, Marie; Westerlund, Hugo; Westerholm, Peter; Väänänen, Ari; Virtanen, Marianna; Vahtera, Jussi; Theorell, Töres; Suominen, Sakari; Singh-Manoux, Archana; Siegrist, Johannes; Sabia, Séverine; Rugulies, Reiner; Pentti, Jaana; Oksanen, Tuula; Nordin, Maria; Nielsen, Martin L; Marmot, Michael G; Magnusson Hanson, Linda L; Madsen, Ida E H; Lunau, Thorsten; Leineweber, Constanze; Kumari, Meena; Kouvonen, Anne; Koskinen, Aki; Koskenvuo, Markku; Knutsson, Anders; Kittel, France; Jöckel, Karl-Heinz; Joensuu, Matti; Houtman, Irene L; Hooftman, Wendela E; Goldberg, Marcel; Geuskens, Goedele A; Ferrie, Jane E; Erbel, Raimund; Dragano, Nico; De Bacquer, Dirk; Clays, Els; Casini, Annalisa; Burr, Hermann; Borritz, Marianne; Bonenfant, Sébastien; Bjorner, Jakob B; Alfredsson, Lars; Hamer, Mark; Batty, G David; Kivimäki, Mika
2012-12-15
Unfavorable work characteristics, such as low job control and too high or too low job demands, have been suggested to increase the likelihood of physical inactivity during leisure time, but this has not been verified in large-scale studies. The authors combined individual-level data from 14 European cohort studies (baseline years from 1985-1988 to 2006-2008) to examine the association between unfavorable work characteristics and leisure-time physical inactivity in a total of 170,162 employees (50% women; mean age, 43.5 years). Of these employees, 56,735 were reexamined after 2-9 years. In cross-sectional analyses, the odds for physical inactivity were 26% higher (odds ratio = 1.26, 95% confidence interval: 1.15, 1.38) for employees with high-strain jobs (low control/high demands) and 21% higher (odds ratio = 1.21, 95% confidence interval: 1.11, 1.31) for those with passive jobs (low control/low demands) compared with employees in low-strain jobs (high control/low demands). In prospective analyses restricted to physically active participants, the odds of becoming physically inactive during follow-up were 21% and 20% higher for those with high-strain (odds ratio = 1.21, 95% confidence interval: 1.11, 1.32) and passive (odds ratio = 1.20, 95% confidence interval: 1.11, 1.30) jobs at baseline. These data suggest that unfavorable work characteristics may have a spillover effect on leisure-time physical activity.
The String Stability of a Trajectory-Based Interval Management Algorithm in the Midterm Airspace
NASA Technical Reports Server (NTRS)
Swieringa, Kurt A.
2015-01-01
NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides terminal controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain a precise spacing interval behind a target aircraft. As the percentage of IM-equipped aircraft increases, controllers may provide IM clearances to sequences, or strings, of IM-equipped aircraft. It is important for these strings to maintain stable performance. This paper describes an analytical assessment of the string stability of the latest version of NASA's IM algorithm and a fast-time simulation designed to characterize the string performance of the IM algorithm. The analytical assessment showed that the spacing algorithm has stable poles, indicating that a spacing error perturbation will be reduced as a function of string position. The fast-time simulation investigated IM operations at two airports using constraints associated with the midterm airspace, including limited information about the target aircraft's intended speed profile and limited information about the wind forecast on the target aircraft's route. The results of the fast-time simulation demonstrated that the performance of the spacing algorithm is acceptable for strings of moderate length; however, there is some degradation in IM performance as a function of string position.
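String stability means a spacing-error perturbation entering at the front of a string is attenuated, not amplified, at each position behind it. A toy illustration with a first-order spacing controller (an assumed stand-in, not NASA's actual IM spacing law):

```python
def peak_errors(n_aircraft, gain, steps, perturbation):
    """Peak spacing error at each string position when the lead aircraft
    holds a constant deviation and each follower closes its own spacing
    error with a first-order law e[t+1] = e[t] + gain*(target_e - e[t])."""
    errors = [perturbation] + [0.0] * (n_aircraft - 1)
    peaks = list(errors)
    for _ in range(steps):
        new = [perturbation]  # lead aircraft holds its deviation
        for i in range(1, n_aircraft):
            new.append(errors[i] + gain * (errors[i - 1] - errors[i]))
        errors = new
        peaks = [max(p, abs(e)) for p, e in zip(peaks, errors)]
    return peaks

# With a stable controller (0 < gain < 1) the peak error shrinks with
# string position: the perturbation is attenuated down the string.
peaks = peak_errors(n_aircraft=5, gain=0.2, steps=30, perturbation=10.0)
```

This monotone decay of peaks along string position is the time-domain counterpart of the paper's stable-pole result.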
NASA Astrophysics Data System (ADS)
Reinemann, Deborah Jean
2000-10-01
Measures of time are essential to human life, especially in the Western world. Human understanding of time develops from the preschool stages of using "before" and "after" to an adult understanding and appreciation of time. Previous researchers (for example, Piaget, Friedman) have investigated and described stages of time development. Time, as it was investigated here, can be classified as conventional, logical, or experiential. Conventional time is the ordered representation of time: the days of the week, the months of the year, or clock time in seconds and hours. Logical time is the deduction of duration based on regular events; for example, calculating the passage of time based on two separate events. Experiential time involves the duration of events and estimating intervals. With the recent production of the National Science Education Standards (NSES), many schools are aligning their science curriculum with the NSES. Time appears both implicitly and explicitly in the NSES. Do middle school students possess the understanding of time necessary to meet the recommendations of the NSES? An interview protocol of four sessions was developed to investigate middle school students' understanding of time. The four sessions included: building and testing water clocks; an interview about water clocks and time intervals; a laserdisc presentation about relative time spans; and a mind-mapping session. Students were also given the GALT test of Logical Thinking. Twenty-four subjects were interviewed: eleven eighth-grade students and thirteen sixth-grade students. The data were transcribed and coded, and a rubric was developed to evaluate students based on their responses to the four sessions. The Time Analysis Rubric is a grid of the types of time (conventional, logical, and experiential) versus the degree of understanding of time. Student results were assigned to levels of understanding based on the Time Analysis Rubric.
There was a relationship (although not statistically significant) between the students' GALT scores and the Time Analysis Rubric results. There was no difference in Time Analysis levels between sixth- and eighth-grade students. On the basis of this study, middle school students' level of understanding of time appears to be sufficient to master the requirements of the NSES.
NASA Astrophysics Data System (ADS)
Barton, C. C.; Smigelski, J. R.; Tebbens, S. F.
2008-12-01
Most coastal regions are subject to inundation due to many periodic and non-periodic inputs, including diurnal and semidiurnal tides, storms, tsunamis, and global sea level change. Tide gauge data provide a frequently sampled, long-term record of fluctuations in water level. A power-spectral-density analysis of tide gauge data is used to quantify persistence (degree of internal correlation over various time intervals) in terms of the scaling exponent β and to identify temporal changes in persistence. The stations are located at varying proximities to the open ocean, including bays, harbors, and channels. The datasets are from the NOAA CO-OPS Verified Hourly Station Datum. The length of the data sets ranges from 3 years to 101 years. The hourly data sets are decimated to one record every four hours. All data sets analyzed show three distinct regions of persistence with two inflection points at approximately one day and five days. For times less than one day, the scaling exponent ranges between 0 < β < 2.6. For the time interval 1 to 5 days, the scaling exponent ranges between 1.1 < β < 2.1. For times greater than 5 days, the scaling exponent ranges between 0.4 < β < 0.9. Persistence generally decreases as period increases but is stable between the inflection points. At Duck, NC, long-term persistence in the tide gauge signal is 0.6 as compared to 0.9 for the biweekly shoreline position signal over twenty years, suggesting a strong correlation between the two and the possibility of using tide gauge data to quantify nearby shoreline mobility over similar time intervals.
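The scaling exponent β described above is conventionally estimated as the negative slope of the power spectral density in log-log coordinates. The following is a minimal sketch (not the authors' processing chain), using a simple periodogram and a least-squares fit; the 4-hour sampling interval mirrors the decimation mentioned in the abstract, and the input is a synthetic random walk rather than tide gauge data:

```python
import numpy as np

def spectral_exponent(x, dt_hours=4.0):
    """Estimate beta where PSD(f) ~ f^(-beta), by a least-squares
    fit of log periodogram power versus log frequency."""
    x = np.asarray(x, float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x))**2
    f = np.fft.rfftfreq(len(x), d=dt_hours)
    mask = f > 0                      # drop the zero-frequency bin
    slope, _ = np.polyfit(np.log(f[mask]), np.log(psd[mask]), 1)
    return -slope

# Synthetic persistent signal: cumulative sum of white noise (beta near 2)
rng = np.random.default_rng(0)
beta = spectral_exponent(np.cumsum(rng.standard_normal(4096)))
```

In practice, segment averaging (e.g. Welch's method) and separate fits on either side of the one-day and five-day inflection points would be needed to reproduce the three persistence regimes reported.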
Rankin, Nicole M; York, Sarah; Stone, Emily; Barnes, David; McGregor, Deborah; Lai, Michelle; Shaw, Tim; Butow, Phyllis N
2017-05-01
Pathways to lung cancer diagnosis and treatment are complex. International evidence shows significant variations in pathways. Qualitative research investigating pathways to lung cancer diagnosis rarely considers both patient and general practitioner views simultaneously. To describe the lung cancer diagnostic pathway, focusing on the perspective of patients and general practitioners about diagnostic and pretreatment intervals. This qualitative study of patients with lung cancer and general practitioners in Australia used qualitative interviews or a focus group in which participants responded to a semistructured questionnaire designed to explore experiences of the diagnostic pathway. The Model of Pathways to Treatment (the Model) was used as a framework for analysis, with data organized into (1) events, (2) processes, and (3) contributing factors for variations in diagnostic and pretreatment intervals. Thirty participants (19 patients with lung cancer and 11 general practitioners) took part. Nine themes were identified during analysis. For the diagnostic interval, these were: (1) taking patient concerns seriously, (2) a sense of urgency, (3) advocacy that is doctor-driven or self-motivated, and (4) referral: "knowing who to refer to." For the pretreatment interval, themes were: (5) uncertainty, (6) psychosocial support for the patient and family before treatment, and (7) communication among the multidisciplinary team and general practitioners. Two cross-cutting themes were: (8) coordination of care and "handing over" the patient, and (9) general practitioner knowledge about lung cancer. Events were perceived as complex, with diagnosis often being revealed over time, rather than as a single event. Contributing factors at patient, system, and disease levels are described for both intervals. Patients and general practitioners expressed similar themes across the diagnostic and pretreatment intervals. 
Significant improvements could be made to health systems to facilitate better patient and general practitioner experiences of the diagnostic pathway. This novel presentation of patient and general practitioner perspectives indicates that systemic interventions have a role in timely and appropriate referrals to specialist care and coordination of investigations. Systemic interventions may alleviate concerns about urgency of diagnostic workup, communication, and coordination of care as patients transition from primary to specialist care.
Interval Timing Accuracy and Scalar Timing in C57BL/6 Mice
Buhusi, Catalin V.; Aziz, Dyana; Winslow, David; Carter, Rickey E.; Swearingen, Joshua E.; Buhusi, Mona C.
2010-01-01
In many species, interval timing behavior is accurate (estimated durations are appropriate) and scalar (errors vary linearly with estimated durations). While accuracy has been previously examined, scalar timing has not yet been clearly demonstrated in house mice (Mus musculus), raising concerns about mouse models of human disease. We estimated timing accuracy and precision in C57BL/6 mice, the most widely used background strain for genetic models of human disease, in a peak-interval procedure with multiple intervals. Whether timing two intervals (Experiment 1) or three intervals (Experiment 2), C57BL/6 mice demonstrated varying degrees of timing accuracy. Importantly, at both the individual and group levels, their precision varied linearly with the subjective estimated duration. Further evidence for scalar timing was obtained using an intraclass correlation statistic. This is the first report of consistent, reliable scalar timing in a sizable sample of house mice, thus validating the PI procedure as a valuable technique, the intraclass correlation statistic as a powerful test of the scalar property, and the C57BL/6 strain as a suitable background for behavioral investigations of genetically engineered mice modeling disorders of interval timing. PMID:19824777
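One simple check of the scalar property is that the coefficient of variation (Weber fraction) of timed responses stays roughly constant across target durations. This sketch uses simulated peak times with an assumed common Weber fraction of 0.15; it is not the authors' data, nor their intraclass-correlation test:

```python
import numpy as np

def weber_fraction(peak_times):
    """Coefficient of variation of peak-time estimates; under scalar
    timing this is roughly constant across target durations."""
    t = np.asarray(peak_times, float)
    return t.std(ddof=1) / t.mean()

rng = np.random.default_rng(4)
# Simulated peak times for 10-s and 30-s targets, both with CV = 0.15
cv10 = weber_fraction(rng.normal(10.0, 1.5, 500))
cv30 = weber_fraction(rng.normal(30.0, 4.5, 500))
```

If timing were not scalar (e.g. constant absolute error), the two coefficients of variation would diverge as the target duration grows.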
Kowalik, Grzegorz T; Knight, Daniel S; Steeden, Jennifer A; Tann, Oliver; Odille, Freddy; Atkinson, David; Taylor, Andrew; Muthurangu, Vivek
2015-02-01
To develop a real-time phase contrast MR sequence with sufficiently high temporal resolution to assess cardiac time intervals. The sequence utilized spiral trajectories with an acquisition strategy that allowed a combination of temporal encoding (Unaliasing by Fourier-encoding the overlaps using the temporal dimension; UNFOLD) and parallel imaging (Sensitivity encoding; SENSE) to be used (UNFOLDed-SENSE). An in silico experiment was performed to determine the optimum UNFOLD filter. In vitro experiments were carried out to validate the accuracy of time intervals calculation and peak mean velocity quantification. In addition, 15 healthy volunteers were imaged with the new sequence, and cardiac time intervals were compared to reference standard Doppler echocardiography measures. For comparison, in silico, in vitro, and in vivo experiments were also carried out using sliding window reconstructions. The in vitro experiments demonstrated good agreement between real-time spiral UNFOLDed-SENSE phase contrast MR and the reference standard measurements of velocity and time intervals. The protocol was successfully performed in all volunteers. Subsequent measurement of time intervals produced values in keeping with literature values and good agreement with the gold standard echocardiography. Importantly, the proposed UNFOLDed-SENSE sequence outperformed the sliding window reconstructions. Cardiac time intervals can be successfully assessed with UNFOLDed-SENSE real-time spiral phase contrast. Real-time MR assessment of cardiac time intervals may be beneficial in assessment of patients with cardiac conditions such as diastolic dysfunction. © 2014 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Reiff, P. H.; Spiro, R. W.; Wolf, R. A.; Kamide, Y.; King, J. H.
1985-01-01
It is pointed out that the maximum electrostatic potential difference across the polar cap, Phi, is a fundamental measure of the coupling between the solar wind and the earth's magnetosphere/ionosphere system. During the Coordinated Data Analysis Workshop (CDAW) 6 intervals, no suitably instrumented spacecraft was in an appropriate orbit to determine the polar-cap potential drop directly. However, two recently developed independent techniques make it possible to estimate the polar-cap potential drop for times when direct spacecraft data are not available. The present investigation is concerned with a comparison of cross-polar-cap potential drop estimates calculated for the two CDAW 6 intervals on the basis of these two techniques. In the case of one interval, the agreement between the potential drops and Joule heating rates is relatively good. In the second interval, however, the agreement is not very good. Explanations for this discrepancy are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azunre, P.
In this paper, two novel techniques for bounding the solutions of parametric weakly coupled second-order semilinear parabolic partial differential equations are developed. The first provides a theorem to construct interval bounds, while the second provides a theorem to construct lower bounds convex and upper bounds concave in the parameter. The convex/concave bounds can be significantly tighter than the interval bounds because of the wrapping effect suffered by interval analysis in dynamical systems. Both types of bounds are computationally cheap to construct, requiring the solution of auxiliary systems twice and four times larger than the original system, respectively. An illustrative numerical example of bound construction and use for deterministic global optimization within a simple serial branch-and-bound algorithm, implemented numerically using interval arithmetic and a generalization of McCormick's relaxation technique, is presented. Finally, problems within the important class of reaction-diffusion systems may be optimized with these tools.
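The wrapping (dependency) effect mentioned above is easy to see with naive interval arithmetic: squaring the interval [-1, 1] by interval multiplication yields [-1, 1], even though the true range of x·x is [0, 1]. A minimal sketch, not the paper's bounding scheme:

```python
class Interval:
    """Minimal interval arithmetic: addition and multiplication only."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product range is spanned by the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

x = Interval(-1.0, 1.0)
y = x * x   # yields [-1, 1], overestimating the true range [0, 1] of x*x
```

This overestimation is exactly why the paper's convex/concave relaxations can be much tighter than plain interval bounds.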
Wang, Yuan; Bao, Shan; Du, Wenjun; Ye, Zhirui; Sayer, James R
2017-11-17
This article investigated and compared frequency domain and time domain characteristics of drivers' behaviors before and after the start of distracted driving. Data from an existing naturalistic driving study were used. Fast Fourier transform (FFT) was applied for the frequency domain analysis to explore drivers' behavior pattern changes between nondistracted (prestarting of visual-manual task) and distracted (poststarting of visual-manual task) driving periods. Average relative spectral power in a low frequency range (0-0.5 Hz) and the standard deviation in a 10-s time window of vehicle control variables (i.e., lane offset, yaw rate, and acceleration) were calculated and further compared. Sensitivity analyses were also applied to examine the reliability of the time and frequency domain analyses. Results of the mixed model analyses from the time and frequency domain analyses all showed significant degradation in lateral control performance after engaging in visual-manual tasks while driving. Results of the sensitivity analyses suggested that the frequency domain analysis was less sensitive to the frequency bandwidth, whereas the time domain analysis was more sensitive to the time intervals selected for variation calculations. Different time interval selections can result in significantly different standard deviation values, whereas average spectral power analysis on yaw rate in both low and high frequency bandwidths showed consistent results: higher variation values were observed during distracted driving when compared to nondistracted driving. This study suggests that driver state detection needs to consider the behavior changes during the prestarting periods, instead of only focusing on periods with physical presence of distraction, such as cell phone use. Lateral control measures can be a better indicator of distraction detection than longitudinal controls.
In addition, frequency domain analyses proved to be a more robust and consistent method in assessing driving performance compared to time domain analyses.
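The relative spectral power in a low-frequency band, as used in the frequency domain analysis above, can be sketched as the fraction of FFT power below 0.5 Hz. The signal and the 10-Hz sampling rate below are synthetic stand-ins, not the naturalistic driving data:

```python
import numpy as np

def relative_band_power(x, fs, f_lo=0.0, f_hi=0.5):
    """Fraction of total spectral power in the [f_lo, f_hi] Hz band."""
    x = np.asarray(x, float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x))**2
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (f >= f_lo) & (f <= f_hi)
    return psd[band].sum() / psd.sum()

# Synthetic yaw-rate-like signal: slow 0.1 Hz drift plus fast jitter
fs = 10.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(t.size)
p = relative_band_power(x, fs)
```

Because the slow drift dominates, most of the power falls in the 0-0.5 Hz band; comparing this fraction between nondistracted and distracted windows mirrors the article's frequency domain comparison.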
Effects of High Intensity Interval Training on Increasing Explosive Power, Speed, and Agility
NASA Astrophysics Data System (ADS)
Fajrin, F.; Kusnanik, N. W.; Wijono
2018-01-01
High Intensity Interval Training (HIIT) is a type of exercise that combines high-intensity exercise and low-intensity exercise in a certain time interval. This type of training is very effective and efficient for improving physical components. The process of improving athletes' achievement is related to the process of improving the physical components, so the selection of a good training method is very helpful. This study aims to analyze the effects of HIIT on explosive power, speed, and agility. This research is quantitative with quasi-experimental methods. The design of this study used the Matching-Only Design, with data analysis using the t-test (paired sample t-test). After the treatment was given for six weeks, the results showed significant increases in explosive power, speed, and agility. HIIT in this study used plyometric exercise as the high-intensity component and jogging as the low-to-moderate-intensity component. The increase was due to improvements in neuromuscular characteristics that enhance muscle strength and performance. From the data analysis, the researchers concluded that High Intensity Interval Training significantly increases leg explosive power, speed, and agility.
VARIABLE TIME-INTERVAL GENERATOR
Gross, J.E.
1959-10-31
This patent relates to a pulse generator and more particularly to a time interval generator wherein the time interval between pulses is precisely determined. The variable time generator comprises two oscillators with one having a variable frequency output and the other a fixed frequency output. A frequency divider is connected to the variable oscillator for dividing its frequency by a selected factor and a counter is used for counting the periods of the fixed oscillator occurring during a cycle of the divided frequency of the variable oscillator. This defines the period of the variable oscillator in terms of that of the fixed oscillator. A circuit is provided for selecting as a time interval a predetermined number of periods of the variable oscillator. The output of the generator consists of a first pulse produced by a trigger circuit at the start of the time interval and a second pulse marking the end of the time interval produced by the same trigger circuit.
Brackney, Ryan J; Cheung, Timothy H. C; Neisewander, Janet L; Sanabria, Federico
2011-01-01
Dissociating motoric and motivational effects of pharmacological manipulations on operant behavior is a substantial challenge. To address this problem, we applied a response-bout analysis to data from rats trained to lever press for sucrose on variable-interval (VI) schedules of reinforcement. Motoric, motivational, and schedule factors (effort requirement, deprivation level, and schedule requirements, respectively) were manipulated. Bout analysis found that interresponse times (IRTs) were described by a mixture of two exponential distributions, one characterizing IRTs within response bouts, another characterizing intervals between bouts. Increasing effort requirement lengthened the shortest IRT (the refractory period between responses). Adding a ratio requirement increased the length and density of response bouts. Both manipulations also decreased the bout-initiation rate. In contrast, food deprivation only increased the bout-initiation rate. Changes in the distribution of IRTs over time showed that responses during extinction were also emitted in bouts, and that the decrease in response rate was primarily due to progressively longer intervals between bouts. Taken together, these results suggest that changes in the refractory period indicate motoric effects, whereas selective alterations in bout initiation rate indicate incentive-motivational effects. These findings support the use of response-bout analyses to identify the influence of pharmacological manipulations on processes underlying operant performance. PMID:21765544
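A two-exponential mixture of the kind fitted to interresponse times above can be estimated with a few lines of expectation-maximization. The parameters below (within-bout mean 0.5 s, between-bout mean 10 s, 80% within-bout responses) are assumptions for the simulation, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated IRTs: fast within-bout and slow between-bout components
irts = np.concatenate([rng.exponential(0.5, 800),    # within-bout
                       rng.exponential(10.0, 200)])  # between-bout

# EM for the mixture density  p*w*exp(-w t) + (1-p)*b*exp(-b t)
p, w, b = 0.5, 1.0, 0.1   # initial guesses: weight, within-, between-bout rates
for _ in range(200):
    f1 = p * w * np.exp(-w * irts)
    f2 = (1 - p) * b * np.exp(-b * irts)
    r = f1 / (f1 + f2)                       # within-bout responsibility
    p = r.mean()                             # update mixing weight
    w = r.sum() / (r * irts).sum()           # weighted MLE of within-bout rate
    b = (1 - r).sum() / ((1 - r) * irts).sum()
```

The recovered within-bout rate, between-bout (bout-initiation) rate, and mixing weight correspond to the quantities the bout analysis tracks across motoric and motivational manipulations.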
Exercise training improves autonomic profiles in patients with Charcot-Marie-Tooth disease.
El Mhandi, Lhassan; Pichot, Vincent; Calmels, Paul; Gautheron, Vincent; Roche, Frédéric; Féasson, Léonard
2011-11-01
The effect of an interval exercise training (ITE) program on heart rate variability (HRV) was studied in 8 patients with Charcot-Marie-Tooth (CMT) disease and 8 healthy controls. At baseline, all subjects underwent ambulatory 24-hour Holter electrocardiographic (ECG) monitoring to evaluate HRV. HRV analysis was repeated on CMT patients after they completed a 24-week ITE program on a cycle ergometer. Before exercise, all HRV indices were lower in patients compared with controls, and the difference reached statistical significance for pNN50 (percent of differences between adjacent R-R intervals exceeding 50 ms). After ITE, time- and frequency-domain indices were significantly improved, particularly at night (+8% mean R-R interval, +95% pNN50, 52% reduction in low/high-frequency ratio). We observed significant increases in some of the time and frequency parameters, and values sometimes exceeded those of controls at baseline. Our results suggest that ITE improves HRV modulation in CMT patients by enhancing parasympathetic activity. Copyright © 2011 Wiley Periodicals, Inc.
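pNN50, the index that reached significance at baseline, is straightforward to compute from an R-R interval series. The series below is illustrative, not patient data:

```python
import numpy as np

def pnn50(rr_ms):
    """pNN50: percent of successive R-R interval differences > 50 ms."""
    diffs = np.abs(np.diff(np.asarray(rr_ms, float)))
    return 100.0 * np.mean(diffs > 50.0)

# Illustrative R-R series in milliseconds (2 of 6 successive
# differences exceed 50 ms, so pNN50 is about 33.3%)
rr = [800, 810, 900, 870, 860, 940, 935]
val = pnn50(rr)
```

Higher pNN50 reflects greater beat-to-beat variability and hence stronger parasympathetic modulation, which is why its increase after training supports the authors' conclusion.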
Intact interval timing in circadian CLOCK mutants.
Cordes, Sara; Gallistel, C R
2008-08-28
While progress has been made in determining the molecular basis for the circadian clock, the mechanism by which mammalian brains time intervals measured in seconds to minutes remains a mystery. An obvious question is whether the interval-timing mechanism shares molecular machinery with the circadian timing mechanism. In the current study, we trained circadian CLOCK +/- and -/- mutant male mice in a peak-interval procedure with 10- and 20-s criteria. The mutant mice were more active than their wild-type littermates, but there were no reliable deficits in the accuracy or precision of their timing as compared with wild-type littermates. This suggests that expression of the CLOCK protein is not necessary for normal interval timing.
Meza, James M; Hickey, Edward J; Blackstone, Eugene H; Jaquiss, Robert D B; Anderson, Brett R; Williams, William G; Cai, Sally; Van Arsdell, Glen S; Karamlou, Tara; McCrindle, Brian W
2017-10-31
In infants requiring 3-stage single-ventricle palliation for hypoplastic left heart syndrome, attrition after the Norwood procedure remains significant. The effect of the timing of stage 2 palliation (S2P), a physician-modifiable factor, on long-term survival is not well understood. We hypothesized that an optimal interval between the Norwood and S2P that both minimizes pre-S2P attrition and maximizes post-S2P survival exists and is associated with individual patient characteristics. The National Institutes of Health/National Heart, Lung, and Blood Institute Pediatric Heart Network Single Ventricle Reconstruction Trial public data set was used. Transplant-free survival (TFS) was modeled from (1) Norwood to S2P and (2) S2P to 3 years by using parametric hazard analysis. Factors associated with death or heart transplantation were determined for each interval. To account for staged procedures, risk-adjusted, 3-year, post-Norwood TFS (the probability of TFS at 3 years given survival to S2P) was calculated using parametric conditional survival analysis. TFS from the Norwood to S2P was first predicted. TFS after S2P to 3 years was then predicted and adjusted for attrition before S2P by multiplying by the estimate of TFS to S2P. The optimal timing of S2P was determined by generating nomograms of risk-adjusted, 3-year, post-Norwood, TFS versus the interval from the Norwood to S2P. Of 547 included patients, 399 survived to S2P (73%). Of the survivors to S2P, 349 (87%) survived to 3-year follow-up. The median interval from the Norwood to S2P was 5.1 (interquartile range, 4.1-6.0) months. The risk-adjusted, 3-year, TFS was 68±7%. A Norwood-S2P interval of 3 to 6 months was associated with greatest 3-year TFS overall and in patients with few risk factors. In patients with multiple risk factors, TFS was severely compromised, regardless of the timing of S2P and most severely when S2P was performed early. 
No difference in the optimal timing of S2P existed when stratified by shunt type. In infants with few risk factors, progressing to S2P at 3 to 6 months after the Norwood procedure was associated with maximal TFS. Early S2P did not rescue patients with greater risk factor burdens. Instead, referral for heart transplantation may offer their best chance at long-term survival. URL: https://www.clinicaltrials.gov. Unique identifier: NCT00115934. © 2017 American Heart Association, Inc.
Dynamical analysis of the avian-human influenza epidemic model using the semi-analytical method
NASA Astrophysics Data System (ADS)
Jabbari, Azizeh; Kheiri, Hossein; Bekir, Ahmet
2015-03-01
In this work, we present the dynamic behavior of the avian-human influenza epidemic model by using an efficient computational algorithm, namely the multistage differential transform method (MsDTM). The MsDTM is used here as an algorithm for approximating the solutions of the avian-human influenza epidemic model in a sequence of time intervals. In order to show the efficiency of the method, the obtained numerical results are compared with fourth-order Runge-Kutta method (RK4M) and differential transform method (DTM) solutions. It is shown that the MsDTM has the advantage of giving an analytical form of the solution within each time interval, which is not possible in purely numerical techniques like the RK4M.
Braun, Benedikt J; Bushuven, Eva; Hell, Rebecca; Veith, Nils T; Buschbaum, Jan; Holstein, Joerg H; Pohlemann, Tim
2016-02-01
Weight bearing after lower extremity fractures remains a highly controversial issue. Even for ankle fractures, the most common lower extremity injury, no standard aftercare protocol has been established. Average non-weight-bearing times range from 0 to 7 weeks, with standardised radiological healing controls at fixed time intervals. Recent literature calls for patient-adapted aftercare protocols based on individual fracture and load scenarios. We show the clinical feasibility and first results of a new, insole-embedded gait analysis tool for continuous monitoring of gait, load, and activity. Ten patients were monitored with a new, independent gait analysis insole for up to 3 months postoperatively. Strict 20 kg partial weight bearing was ordered for 6 weeks. Overall activity, load spectrum, ground reaction forces, clinical scoring, and general health data were recorded and correlated. Statistical analysis with power analysis, t-test, and Spearman correlation was performed. Only one patient completely adhered to the set weight-bearing limit. The average time over the limit was 374 min. Based on the parameters load, activity, gait time over the 20 kg weight-bearing limit, and maximum ground reaction force, high and low performers were defined after 3 weeks. A significant difference in time to painless full weight bearing was shown between high and low performers. Correlation analysis revealed a significant correlation between weight bearing and clinical scoring as well as pain (American Orthopaedic Foot and Ankle Society (AOFAS) Score rs=0.74; Olerud-Molander Score rs=0.93; VAS pain rs=-0.95). Early, continuous gait analysis is able to define aftercare performers with significant differences in time to full painless weight bearing where clinical or radiographic controls could not. Patient compliance with standardised weight-bearing limits and protocols is low. Highly individual rehabilitation patterns were seen in all patients.
Aftercare protocols should be adjusted to real-time patient conditions rather than fixed intervals and limits. With a real-time measuring device, high performers could be identified and influenced towards optimal healing conditions early, while low performers could be recognised and missing healing influences corrected according to patient condition. Copyright © 2015 Elsevier Ltd. All rights reserved.
Kishore, Amit; Vail, Andy; Majid, Arshad; Dawson, Jesse; Lees, Kennedy R; Tyrrell, Pippa J; Smith, Craig J
2014-02-01
Atrial fibrillation (AF) confers a high risk of recurrent stroke, although detection methods and definitions of paroxysmal AF during screening vary. We therefore undertook a systematic review and meta-analysis to determine the frequency of newly detected AF using noninvasive or invasive cardiac monitoring after ischemic stroke or transient ischemic attack. Prospective observational studies or randomized controlled trials of patients with ischemic stroke, transient ischemic attack, or both, who underwent any cardiac monitoring for a minimum of 12 hours, were included after electronic searches of multiple databases. The primary outcome was detection of any new AF during the monitoring period. We prespecified subgroup analysis of selected (prescreened or cryptogenic) versus unselected patients and according to duration of monitoring. A total of 32 studies were analyzed. The overall detection rate of any AF was 11.5% (95% confidence interval, 8.9%-14.3%), although the timing, duration, method of monitoring, and reporting of diagnostic criteria used for paroxysmal AF varied. Detection rates were higher in selected (13.4%; 95% confidence interval, 9.0%-18.4%) than in unselected patients (6.2%; 95% confidence interval, 4.4%-8.3%). There was substantial heterogeneity even within specified subgroups. Detection of AF was highly variable, and the review was limited by small sample sizes and marked heterogeneity. Further studies are required to inform patient selection, optimal timing, methods, and duration of monitoring for detection of AF/paroxysmal AF.
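A single-study detection proportion and its confidence interval can be sketched with a Wilson score interval (the meta-analysis itself pooled proportions across 32 heterogeneous studies, which this does not reproduce). The counts below are illustrative, chosen to land near the reported 11.5% overall rate:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a detection proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# Illustrative: 115 new AF detections among 1000 monitored patients
p, lo, hi = wilson_ci(115, 1000)
```

Pooling across studies would additionally require a random-effects model to absorb the substantial heterogeneity the review reports.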
NASA Technical Reports Server (NTRS)
Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley
2004-01-01
This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environment effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated beginning with the initial reference pressure standard, to the calibrated instrument as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.
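Combining separately estimated bias and precision uncertainties into an expanded 95% interval is commonly done by root-sum-square with a coverage factor k of about 2. This is a generic sketch of that step, not LaRC's exact procedure, and the magnitudes are invented:

```python
import math

def expanded_uncertainty(bias_limits, precision_sd, n, k=2.0):
    """Root-sum-square combination of bias and precision uncertainties
    into an expanded uncertainty (generic sketch, coverage factor k)."""
    u_bias = math.sqrt(sum(b**2 for b in bias_limits))   # combined bias
    u_prec = precision_sd / math.sqrt(n)                 # standard error
    return k * math.sqrt(u_bias**2 + u_prec**2)

# Illustrative: two bias sources (0.01 and 0.02 psi) and a precision
# standard deviation of 0.05 psi over 25 replicated readings
U = expanded_uncertainty([0.01, 0.02], 0.05, 25)
```

Expressing U as a function of the applied input, rather than a fixed percentage of full scale, is the refinement the paper emphasizes.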
The Anaesthetic-ECT Time Interval in Electroconvulsive Therapy Practice--Is It Time to Time?
Gálvez, Verònica; Hadzi-Pavlovic, Dusan; Wark, Harry; Harper, Simon; Leyden, John; Loo, Colleen K
2016-01-01
Because most common intravenous anaesthetics used in ECT have anticonvulsant properties, their plasma-brain concentration at the time of seizure induction might affect seizure expression. The quality of ECT seizure expression has been repeatedly associated with efficacy outcomes. The time interval between the anaesthetic bolus injection and the ECT stimulus (anaesthetic-ECT time interval) will determine the anaesthetic plasma-brain concentration when the ECT stimulus is administered. The aim of this study was to examine the effect of the anaesthetic-ECT time interval on ECT seizure quality and duration. The anaesthetic-ECT time interval was recorded in 771 ECT sessions (84 patients). Right unilateral brief pulse ECT was applied. Anaesthesia given was propofol (1-2 mg/kg) and succinylcholine (0.5-1.0 mg/kg). Seizure quality indices (slow wave onset, amplitude, regularity, stereotypy and post-ictal suppression) and duration were rated through a structured rating scale by a single blinded trained rater. Linear Mixed Effects Models analysed the effect of the anaesthetic-ECT time interval on seizure quality indices, controlling for propofol dose (mg), ECT charge (mC), ECT session number, days between ECT, age (years), initial seizure threshold (mC) and concurrent medication. Longer anaesthetic-ECT time intervals led to significantly higher-quality seizures (p < 0.001 for amplitude, regularity, stereotypy and post-ictal suppression). These results suggest that the anaesthetic-ECT time interval is an important factor to consider in ECT practice. This time interval should be extended to as long as practically possible to facilitate the production of better quality seizures. Close collaboration between the anaesthetist and the psychiatrist is essential. Copyright © 2015 Elsevier Inc. All rights reserved.
Analysis of aggregated tick returns: Evidence for anomalous diffusion
NASA Astrophysics Data System (ADS)
Weber, Philipp
2007-01-01
In order to investigate the origin of large price fluctuations, we analyze stock price changes of ten frequently traded NASDAQ stocks in the year 2002. Though the influence of the trading frequency on the aggregate return in a certain time interval is important, it cannot alone explain the heavy-tailed distribution of stock price changes. For this reason, we analyze intervals with a fixed number of trades in order to eliminate the influence of the trading frequency and investigate the relevance of other factors for the aggregate return. We show that in tick time the price follows a discrete diffusion process with a variable step width while the difference between the number of steps in positive and negative direction in an interval is Gaussian distributed. The step width is given by the return due to a single trade and is long-term correlated in tick time. Hence, its mean value can well characterize an interval of many trades and turns out to be an important determinant for large aggregate returns. We also present a statistical model reproducing the cumulative distribution of aggregate returns. For an accurate agreement with the empirical distribution, we also take into account asymmetries of the step widths in different directions together with cross correlations between these asymmetries and the mean step width as well as the signs of the steps.
Pattern of spread and prognosis in lower limb-onset ALS
TURNER, MARTIN R.; BROCKINGTON, ALICE; SCABER, JAKUB; HOLLINGER, HANNAH; MARSDEN, RACHAEL; SHAW, PAMELA J.; TALBOT, KEVIN
2011-01-01
Our objective was to establish the pattern of spread in lower limb-onset ALS (contra- versus ipsi-lateral) and its contribution to prognosis within a multivariate model. Pattern of spread was established in 109 sporadic ALS patients with lower limb-onset, prospectively recorded in Oxford and Sheffield tertiary clinics from 2001 to 2008. Survival analysis was by univariate Kaplan-Meier log-rank and multivariate Cox proportional hazards. Variables studied were time to next limb progression, site of next progression, age at symptom onset, gender, diagnostic latency and use of riluzole. Initial progression was either to the contralateral leg (76%) or ipsilateral arm (24%). Factors independently affecting survival were time to next limb progression, age at symptom onset, and diagnostic latency. Time to progression as a prognostic factor was independent of initial direction of spread. In a regression analysis of the deceased, overall survival from symptom onset approximated to two years plus the time interval for initial spread. In conclusion, rate of progression in lower limb-onset ALS is not influenced by whether initial spread is to the contralateral limb or ipsilateral arm. The time interval to this initial spread is a powerful factor in predicting overall survival, and could be used to facilitate decision-making and effective care planning. PMID:20001488
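The univariate survival analysis above rests on the Kaplan-Meier product-limit estimator. A minimal sketch of that estimator follows; the toy cohort is invented for illustration and is not the Oxford/Sheffield data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up times (e.g. months from symptom onset)
    events : 1 = death observed, 0 = censored
    Returns the distinct event times and the survival probabilities S(t).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    uniq = np.unique(times[events == 1])    # distinct event times
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)        # still under observation just before t
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk         # product-limit step
        surv.append(s)
    return uniq, np.array(surv)

# toy cohort: survival in months, 0 = censored
t = [10, 14, 20, 25, 25, 31, 40, 46]
e = [ 1,  0,  1,  1,  0,  1,  0,  1]
times, s = kaplan_meier(t, e)
```

A log-rank comparison between groups (as in the study) would then be built on the same risk-set counts.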
Active, capable, and potentially active faults - a paleoseismic perspective
Machette, M.N.
2000-01-01
Maps of faults (geologically defined source zones) may portray seismic hazards in a wide range of completeness depending on which types of faults are shown. Three fault terms - active, capable, and potential - are used in a variety of ways for different reasons or applications. Nevertheless, to be useful for seismic-hazards analysis, fault maps should encompass a time interval that includes several earthquake cycles. For example, if the common recurrence in an area is 20,000-50,000 years, then maps should include faults that are 50,000-100,000 years old (two to five typical earthquake cycles), thus allowing for temporal variability in slip rate and recurrence intervals. Conversely, in more active areas such as plate boundaries, maps showing faults that are <10,000 years old should include those with at least 2 to as many as 20 paleoearthquakes. For the International Lithosphere Programs' Task Group II-2 Project on Major Active Faults of the World our maps and database will show five age categories and four slip rate categories that allow one to select differing time spans and activity rates for seismic-hazard analysis depending on tectonic regime. The maps are accompanied by a database that describes evidence for Quaternary faulting, geomorphic expression, and paleoseismic parameters (slip rate, recurrence interval and time of most recent surface faulting). These maps and databases provide an inventory of faults that would be defined as active, capable, and potentially active for seismic-hazard assessments.
Solar-Terrestrial Signal Record in Tree Ring Width Time Series from Brazil
NASA Astrophysics Data System (ADS)
Rigozo, Nivaor Rodolfo; Lisi, Cláudio Sergio; Filho, Mário Tomazello; Prestes, Alan; Nordemann, Daniel Jean Roger; de Souza Echer, Mariza Pereira; Echer, Ezequiel; da Silva, Heitor Evangelista; Rigozo, Valderez F.
2012-12-01
This work investigates the sunspot number and Southern Oscillation Index (SOI) signals recorded in tree ring time series for three different locations in Brazil: Humaitá in Amazônia State, Porto Ferreira in São Paulo State, and Passo Fundo in Rio Grande do Sul State, using wavelet and cross-wavelet analysis techniques. The wavelet spectra of the tree ring time series showed periodicities of 11 and 22 years, possibly related to solar activity, and periods of 2-8 years, possibly related to El Niño events. The cross-wavelet spectra for all tree ring time series from Brazil present a significant response to the 11-year solar cycle in the time interval from 1921 to after 1981. These tree ring time series also respond to the second harmonic of the solar cycle (5.5 years), but in different time intervals. The cross-wavelet maps also showed that the relationship between the SOI and the tree ring time series is strongest for oscillations in the range of 4-8 years.
Visibility graph analysis on heartbeat dynamics of meditation training
NASA Astrophysics Data System (ADS)
Jiang, Sen; Bian, Chunhua; Ning, Xinbao; Ma, Qianli D. Y.
2013-06-01
We apply visibility graph analysis to human heartbeat dynamics by constructing complex networks from heartbeat interval time series and investigating the statistical properties of the networks before and during chi and yoga meditation. The experimental results show that visibility graph analysis can reveal the dynamical changes caused by meditation training, manifested as a more regular heartbeat, which is closely related to the adjustment of the autonomic nervous system; visibility graph analysis is thus effective for evaluating the effect of meditation.
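The visibility graph maps a time series onto a network: each sample is a node, and two samples are linked when they can "see" each other over the intervening samples. A minimal sketch of the natural visibility criterion (Lacasa-style construction; the four-point toy series is an assumption, not heartbeat data):

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph of a time series.

    Nodes are the sample indices; i and j (i < j) are linked when every
    intermediate sample lies strictly below the straight line joining
    (i, y[i]) and (j, y[j]).
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            # visibility criterion: y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

def degrees(edges, n):
    """Node degrees, the statistic usually summarized in such analyses."""
    d = [0] * n
    for i, j in edges:
        d[i] += 1
        d[j] += 1
    return d

edges = visibility_edges([1, 3, 2, 4])  # → [(0, 1), (1, 2), (1, 3), (2, 3)]
```

Degree distributions of the resulting network are then compared across conditions (here, before versus during meditation).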
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vujovic, Olga, E-mail: olga.vujovic@lhsc.on.ca; Yu, Edward; Cherian, Anil
Purpose: A retrospective chart review was conducted to determine whether the time interval from breast-conserving surgery to breast irradiation (surgery-radiation therapy interval) in early stage node-negative breast cancer had any detrimental effects on recurrence rates. Methods and Materials: There were 566 patients with T1 to T3, N0 breast cancer treated with breast-conserving surgery and breast irradiation and without adjuvant systemic treatment between 1985 and 1992. The surgery-to-radiation therapy intervals used for analysis were 0 to 8 weeks (201 patients), >8 to 12 weeks (233 patients), >12 to 16 weeks (91 patients), and >16 weeks (41 patients). Kaplan-Meier estimates of time to local recurrence, disease-free survival, distant disease-free survival, cause-specific survival, and overall survival rates were calculated. Results: Median follow-up was 17.4 years. Patients in all 4 time intervals were similar in terms of characteristics and pathologic features. There were no statistically significant differences among the 4 time groups in local recurrence (P=.67) or disease-free survival (P=.82). The local recurrence rates at 5, 10, and 15 years were 4.9%, 11.5%, and 15.0%, respectively. The distant disease relapse rates at 5, 10, and 15 years were 10.6%, 15.4%, and 18.5%, respectively. The disease-free failure rates at 5, 10, and 15 years were 20%, 32.3%, and 39.8%, respectively. Cause-specific survival rates at 5, 10, and 15 years were 92%, 84.6%, and 79.8%, respectively. The overall survival rates at 5, 10, and 15 years were 89.3%, 79.2%, and 66.9%, respectively. Conclusions: Surgery-radiation therapy intervals up to 16 weeks from breast-conserving surgery are not associated with any increased risk of recurrence in early stage node-negative breast cancer. There is a steady local recurrence rate of 1% per year with adjuvant radiation alone.
Yang, Haichen; Laurenza, Antonio; Williams, Betsy; Patten, Anna; Hussein, Ziad; Ferry, Jim
2015-08-01
Perampanel is a selective, noncompetitive AMPA receptor antagonist approved as adjunctive treatment for partial seizures. To assess potential for delayed cardiac repolarization, a Phase I thorough QT study was performed, supplemented by plasma concentration-QT data modeled from 3 pooled Phase III studies. The Phase I thorough QT study (double-blind, combined fixed-sequence, parallel-group) quantified the effect of perampanel (6 mg once daily for 7 days, followed by dose escalation to a single 8-mg dose, a single 10-mg dose, then 12 mg once daily for 7 days), moxifloxacin positive control (single 400-mg dose on Day 16), and placebo on QT interval duration in healthy subjects (N = 261). Electrocardiograms were recorded at baseline, Day 7 (post 6 mg dose), and Day 16 (post 12 mg dose). Statistical comparisons were between the highest approved perampanel dose (12 mg) versus placebo, a "mid-therapeutic" dose (6 mg) versus placebo, and moxifloxacin versus placebo. Acknowledging that the Phase I thorough QT study could not incorporate a true "supratherapeutic" dose due to length of titration and tolerability concerns in healthy subjects, Phase III studies of perampanel included expanded electrocardiogram safety evaluations specifically intended to support concentration-QT response modeling. The lack of effect of perampanel on the QT interval is shown from pooled analysis of 3 double-blind, placebo-controlled, 19-week, Phase III studies with perampanel doses ≤ 12 mg (N = 1038, total perampanel; and N=442, placebo) in patients with partial seizures. QT measures were corrected for heart rate using Fridericia's (QTcF; the primary endpoint) and Bazett's (QTcB) formulas. In the Phase I thorough QT study, the positive control moxifloxacin caused peak time-matched, baseline-adjusted, placebo-corrected (ΔΔ) QTcF of 12.15 ms at 4h postdose, confirming a drug effect on QTc interval and study assessment sensitivity. 
Mean baseline-adjusted (Δ) QTcF versus nominal time curves were comparable between perampanel 12 mg and placebo, with most ΔQTcF values being slightly negative. Healthy subjects receiving perampanel 6 and 12 mg doses for 7 days showed no evidence of effects on cardiac repolarization. Peak ΔΔQTcF was 2.34 ms at 1.5h postdose for perampanel 6 mg and 3.92 ms at 0.5h postdose for perampanel 12 mg. At every time point, the upper 95% confidence limit of ΔΔQTcF for perampanel 6 and 12 mg was <10 ms. Phase III studies revealed no clinically significant difference between patients with partial seizures treated with perampanel or placebo in QTcF and QTcB values >450 ms, with no dose-dependent increases or large incremental changes from baseline of >60 ms. Regression analysis of individual plasma perampanel concentrations versus corresponding QTc interval values in Phase I thorough QT and Phase III studies demonstrated no relationship between perampanel concentrations and QT interval duration. Treatment with perampanel 6 mg and 12 mg for 7 days did not delay cardiac repolarization in healthy volunteers. In a population analysis of 1480 patients with partial seizures treated with perampanel doses ≤ 12 mg or placebo, no clinically significant trends in QT interval data were noted. Based on the thorough QT study and evaluations from pooled Phase III studies, there is no evidence of prolonged QT interval duration with perampanel treatment. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
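The two heart-rate corrections named above are standard closed-form formulas: Bazett divides QT by the square root of the RR interval, Fridericia by its cube root. A minimal sketch (the numeric inputs are illustrative, not study data):

```python
def qtc(qt_ms, rr_s, formula="fridericia"):
    """Heart-rate-corrected QT interval.

    qt_ms : measured QT interval in milliseconds
    rr_s  : preceding RR interval in seconds
    Bazett:     QTcB = QT / RR**(1/2)
    Fridericia: QTcF = QT / RR**(1/3)
    """
    if formula == "bazett":
        return qt_ms / rr_s ** 0.5
    if formula == "fridericia":
        return qt_ms / rr_s ** (1.0 / 3.0)
    raise ValueError("unknown formula: " + formula)

# at 60 bpm (RR = 1 s) both corrections leave QT unchanged
print(qtc(400, 1.0, "bazett"))      # 400.0
print(qtc(400, 1.0, "fridericia"))  # 400.0
```

At heart rates above 60 bpm (RR < 1 s) Bazett inflates the corrected value more than Fridericia, which is one reason QTcF was the primary endpoint here.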
Oka, Tomoko; Matsukura, Makoto; Okamoto, Miwako; Harada, Noriaki; Kitano, Takao; Miike, Teruhisa; Futatsuka, Makoto
2002-12-01
In order to assess the cardiovascular autonomic nervous functions in patients with fetal type Minamata disease (FMD), we investigated blood pressure (BP), and conducted time and frequency domain analysis of heart rate variability (HRV). Subjects were 9 patients in Meisuien recognized as FMD, and 13 healthy age matched control subjects. HRV and BP were assessed after subjects rested in a supine position for 10 minutes. Electrocardiographic (ECG) data were collected for 3 minutes during natural breathing. Time domain analysis (the average of R-R intervals [Mean RR], standard deviation of R-R intervals [SD RR], coefficient of variation [CV]), and frequency domain analysis by fast Fourier transformation (FFT) (power of low frequency [LF] and high frequency [HF] component, expressed in normalized units[nu]) were then conducted. In the time domain analysis, the mean RR of the FMD group was significantly lower than that of the control group. Neither SD RR nor CV showed significant differences between the two groups, but both tended to be lower in the FMD group. In the frequency domain analysis, the HF component of the FMD group was significantly lower than that of the control group. Pulse pressure (PP) was significantly lower in the FMD subjects. These findings suggest that parasympathetic nervous dysfunction might exist in FMD patients, who were exposed to high doses of methylmercury (MeHg) during the prenatal period. Decrease of PP might be due to degenerative changes of blood vessels driven by exposure to high doses of MeHg.
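The HRV measures used in this study combine time-domain statistics of the RR series (mean RR, SD RR, CV) with FFT-based band powers in normalized units. A sketch of both; the 4-Hz resampling rate and the synthetic respiratory-modulated tachogram are illustrative assumptions, not the study's processing pipeline:

```python
import numpy as np

def hrv_measures(rr_s, fs=4.0):
    """Basic time- and frequency-domain HRV measures from RR intervals (seconds).

    Time domain: mean RR, SD of RR intervals, coefficient of variation.
    Frequency domain: the RR tachogram is interpolated onto an even fs-Hz
    grid, its periodogram integrated over LF (0.04-0.15 Hz) and HF
    (0.15-0.40 Hz), and both bands expressed in normalized units.
    """
    rr = np.asarray(rr_s, dtype=float)
    t = np.cumsum(rr)                       # beat times
    mean_rr, sd_rr = rr.mean(), rr.std(ddof=1)
    cv = sd_rr / mean_rr

    # evenly resampled tachogram for spectral analysis
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    x = np.interp(grid, t, rr)
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return {"mean_rr": mean_rr, "sd_rr": sd_rr, "cv": cv,
            "lf_nu": lf / (lf + hf), "hf_nu": hf / (lf + hf)}

# synthetic tachogram: 0.8-s beats modulated at a 0.25-Hz respiratory rate,
# so the HF (parasympathetically mediated) band should dominate
rr, t = [], 0.0
for _ in range(300):
    beat = 0.80 + 0.05 * np.sin(2 * np.pi * 0.25 * t)
    rr.append(beat)
    t += beat
m = hrv_measures(rr)
```

A reduced HF component, as reported for the FMD group, would show up here as a lower `hf_nu`.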
An informational transition in conditioned Markov chains: Applied to genetics and evolution.
Zhao, Lei; Lascoux, Martin; Waxman, David
2016-08-07
In this work we assume that we have some knowledge about the state of a population at two known times, when the dynamics is governed by a Markov chain such as a Wright-Fisher model. Such knowledge could be obtained, for example, from observations made on ancient and contemporary DNA, or during laboratory experiments involving long term evolution. A natural assumption is that the behaviour of the population, between observations, is related to (or constrained by) what was actually observed. The present work shows that this assumption has limited validity. When the time interval between observations is larger than a characteristic value, which is a property of the population under consideration, there is a range of intermediate times where the behaviour of the population has reduced or no dependence on what was observed and an equilibrium-like distribution applies. Thus, for example, if the frequency of an allele is observed at two different times, then for a large enough time interval between observations, the population has reduced or no dependence on the two observed frequencies for a range of intermediate times. Given observations of a population at two times, we provide a general theoretical analysis of the behaviour of the population at all intermediate times, and determine an expression for the characteristic time interval, beyond which the observations do not constrain the population's behaviour over a range of intermediate times. The findings of this work relate to what can be meaningfully inferred about a population at intermediate times, given knowledge of terminal states. Copyright © 2016 Elsevier Ltd. All rights reserved.
Emergency department patient satisfaction survey in Imam Reza Hospital, Tabriz, Iran
2011-01-01
Introduction Patient satisfaction is an important indicator of the quality of care and service delivery in the emergency department (ED). The objective of this study was to evaluate patient satisfaction with the Emergency Department of Imam Reza Hospital in Tabriz, Iran. Methods This study was carried out for 1 week during all shifts. Trained researchers used the standard Press Ganey questionnaire. Patients were asked to complete the questionnaire prior to discharge. The study questionnaire included 30 questions based on a Likert scale. Descriptive and analytical statistics were used throughout data analysis in a number of ways using SPSS version 13. Results Five hundred patients who attended our ED were included in this study. The highest satisfaction rates were observed for physicians' communication with patients (82.5%), security guards' courtesy (78.3%) and nurses' communication with patients (78%). The average waiting time for the first visit to a physician was 24 min 15 s. The overall satisfaction rate was dependent on the mean waiting time: the mean waiting time for a low rate of satisfaction was 47 min 11 s with a confidence interval of (19.31, 74.51), and for a very good level of satisfaction it was 14 min 57 s with a (10.58, 18.57) confidence interval. Approximately 63% of the patients rated their general satisfaction with the emergency setting as good or very good. On the whole, the patient satisfaction rate at the lowest level was 7.7% with a confidence interval of (5.1, 10.4), and at the low level it was 5.8% with a confidence interval of (3.7, 7.9). The rate of satisfaction for the mediocre level was 23.3% with a confidence interval of (19.1, 27.5); for the high level of satisfaction it was 28.3% with a confidence interval of (22.9, 32.8), and for the very high level of satisfaction, this rate was 32.9% with a confidence interval of (28.4, 37.4). 
Conclusion The study findings indicated the need for evidence-based interventions in emergency care services in areas such as medical care, nursing care, courtesy of staff, physical comfort and waiting time. Efforts should focus on shortening waiting intervals and improving patients' perceptions about waiting in the ED, and also improving the overall cleanliness of the emergency room. PMID:21407998
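The satisfaction rates above are each reported with a confidence interval; the most common construction for a proportion is the normal-approximation (Wald) interval. The sketch below is illustrative only and is not claimed to be the authors' exact method, so its limits need not reproduce the published ones:

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# e.g. 7.7% of 500 respondents at the lowest satisfaction level
lo, hi = proportion_ci(0.077, 500)
```

For proportions near 0 or 1, or small n, a Wilson or exact (Clopper-Pearson) interval behaves better than the Wald form.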
NASA Technical Reports Server (NTRS)
Pikkujamsa, S. M.; Makikallio, T. H.; Sourander, L. B.; Raiha, I. J.; Puukka, P.; Skytta, J.; Peng, C. K.; Goldberger, A. L.; Huikuri, H. V.
1999-01-01
BACKGROUND: New methods of R-R interval variability based on fractal scaling and nonlinear dynamics ("chaos theory") may give new insights into heart rate dynamics. The aims of this study were to (1) systematically characterize and quantify the effects of aging from early childhood to advanced age on 24-hour heart rate dynamics in healthy subjects; (2) compare age-related changes in conventional time- and frequency-domain measures with changes in newly derived measures based on fractal scaling and complexity (chaos) theory; and (3) further test the hypothesis that there is loss of complexity and altered fractal scaling of heart rate dynamics with advanced age. METHODS AND RESULTS: The relationship between age and cardiac interbeat (R-R) interval dynamics from childhood to senescence was studied in 114 healthy subjects (age range, 1 to 82 years) by measurement of the slope, beta, of the power-law regression line (log power-log frequency) of R-R interval variability (10(-4) to 10(-2) Hz), approximate entropy (ApEn), short-term (alpha(1)) and intermediate-term (alpha(2)) fractal scaling exponents obtained by detrended fluctuation analysis, and traditional time- and frequency-domain measures from 24-hour ECG recordings. Compared with young adults (<40 years old, n=29), children (<15 years old, n=27) showed similar complexity (ApEn) and fractal correlation properties (alpha(1), alpha(2), beta) of R-R interval dynamics despite lower spectral and time-domain measures. Progressive loss of complexity (decreased ApEn, r=-0.69, P<0.001) and alterations of long-term fractal-like heart rate behavior (increased alpha(2), r=0.63, decreased beta, r=-0.60, P<0.001 for both) were observed thereafter from middle age (40 to 60 years, n=29) to old age (>60 years, n=29). CONCLUSIONS: Cardiac interbeat interval dynamics change markedly from childhood to old age in healthy subjects. 
Children show complexity and fractal correlation properties of R-R interval time series comparable to those of young adults, despite lower overall heart rate variability. Healthy aging is associated with R-R interval dynamics showing higher regularity and altered fractal scaling consistent with a loss of complex variability.
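Approximate entropy (ApEn), the complexity measure whose age-related decline is reported above, quantifies the irregularity of a series: it compares how often length-m patterns that match within tolerance r continue to match at length m+1. A compact sketch under Pincus' definition (self-matches included, as is conventional); the window length m = 2 and tolerance r = 0.2 × SD are the customary defaults, not values taken from this study:

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def phi(m):
        # all length-m template vectors
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)   # match fraction per template (self-matches kept)
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))   # predictable: low ApEn
irregular = rng.standard_normal(300)               # unpredictable: high ApEn
```

Lower ApEn in the elderly subjects corresponds to the "regular" end of this scale.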
Integrating Behavioral Health in Primary Care Using Lean Workflow Analysis: A Case Study.
van Eeghen, Constance; Littenberg, Benjamin; Holman, Melissa D; Kessler, Rodger
2016-01-01
Primary care offices are integrating behavioral health (BH) clinicians into their practices. Implementing such a change is complex, difficult, and time consuming. Lean workflow analysis may be an efficient, effective, and acceptable method for use during integration. The objectives of this study were to observe BH integration into primary care and to measure its impact. This was a prospective, mixed-methods case study in a primary care practice that served 8,426 patients over a 17-month period, with 652 patients referred to BH services. Secondary measures included primary care visits resulting in BH referrals, referrals resulting in scheduled appointments, time from referral to the scheduled appointment, and time from the referral to the first visit. Providers and staff were surveyed on the Lean method. Referrals increased from 23 to 37 per 1000 visits (P < .001). Referrals resulted in more scheduled (60% to 74%; P < .001) and arrived visits (44% to 53%; P = .025). Time from referral to the first scheduled visit decreased (hazard ratio, 1.60; 95% confidence interval, 1.37-1.88) as did time to first arrived visit (hazard ratio, 1.36; 95% confidence interval, 1.14-1.62). Survey responses and comments were positive. This pilot integration of BH showed significant improvements in treatment initiation and other measures. Strengths of Lean analysis included workflow improvement, system perspective, and project success. Further evaluation is indicated. © Copyright 2016 by the American Board of Family Medicine.
Rougier, Patrice; Burdet, Cyril; Genthon, Nicolas
2006-10-01
To assess whether prior stretching of a muscle can induce improved postural control, 15 healthy adults stood still upright with their eyes closed before and after a series of bilateral stretches of the triceps surae muscles. The analysis focused on the center of pressure (CP) and the vertical projection of the center of gravity (CGv) trajectories and their difference (CP - CGv). The prolonged stretching induced a forward shift of the mean position of the CGv. The frequency analysis showed constancy of the amplitudes of both basic movements, whereas an increased mean power frequency was seen for the CP - CGv movements. Fractional Brownian motion modeling of the trajectories indicates shorter time intervals and shorter distances covered by the CGv before a change in its control occurs along the antero-posterior axis. This reorganization is thought to result from improved body movement detection, which allows postural control over the longest time intervals to be triggered more rapidly.
A fatigue monitoring system based on time-domain and frequency-domain analysis of pulse data
NASA Astrophysics Data System (ADS)
Shen, Jiaai
2018-04-01
Fatigue is a problem that nearly everyone faces. If people's fatigue condition can be detected and they can be reminded of their tiredness, dangers such as traffic accidents and sudden death can be reduced, fatigued operation of machinery can be avoided, and people can keep track of their own and others' physical condition in time to alternate work with rest. This article develops a wearable bracelet based on FFT pulse frequency spectrum analysis and on the standard deviation and range of the inter-beat interval (IBI), exploiting the differences in heart rate (BPM) and IBI between tired and alert states. The hardware is based on an Arduino, a pulse rate sensor, and a Bluetooth module; the software relies on a networked micro database and an app. Sample experiments were conducted to obtain more accurate threshold values for judging tiredness, showing that a person's fatigue condition can be judged from heart rate (BPM) and inter-beat interval (IBI).
Jitter Reduces Response-Time Variability in ADHD: An Ex-Gaussian Analysis.
Lee, Ryan W Y; Jacobson, Lisa A; Pritchard, Alison E; Ryan, Matthew S; Yu, Qilu; Denckla, Martha B; Mostofsky, Stewart; Mahone, E Mark
2015-09-01
"Jitter" involves randomization of intervals between stimulus events. Compared with controls, individuals with ADHD demonstrate greater intrasubject variability (ISV) performing tasks with fixed interstimulus intervals (ISIs). Because Gaussian curves mask the effect of extremely slow or fast response times (RTs), ex-Gaussian approaches have been applied to study ISV. This study applied ex-Gaussian analysis to examine the effects of jitter on RT variability in children with and without ADHD. A total of 75 children, aged 9 to 14 years (44 ADHD, 31 controls), completed a go/no-go test with two conditions: fixed ISI and jittered ISI. ADHD children showed greater variability, driven by elevations in exponential (tau), but not normal (sigma) components of the RT distribution. Jitter decreased tau in ADHD to levels not statistically different than controls, reducing lapses in performance characteristic of impaired response control. Jitter may provide a nonpharmacologic mechanism to facilitate readiness to respond and reduce lapses from sustained (controlled) performance. © 2012 SAGE Publications.
Odors Bias Time Perception in Visual and Auditory Modalities
Yue, Zhenzhu; Gao, Tianyu; Chen, Lihan; Wu, Jiashuang
2016-01-01
Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors, such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in visual and auditory modalities. Participants either saw a visual dot or heard a continuous tone for 1000 or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants produced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for both visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short interval condition, but they produced shorter durations in the long interval condition. The effect sizes were larger for the auditory modality than those for the visual modality. Moreover, by comparing performance across the initial and the final blocks of the experiment, we found odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, and there was a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, and they were constrained by different sensory modalities, valence of the emotional events, and target durations. 
Biases in time perception could be accounted for by a framework of attentional deployment between the inducers (odors) and emotionally neutral stimuli (visual dots and sound beeps). PMID:27148143
Mouse Activity across Time Scales: Fractal Scenarios
Lima, G. Z. dos Santos; Lobão-Soares, B.; do Nascimento, G. C.; França, Arthur S. C.; Muratori, L.; Ribeiro, S.; Corso, G.
2014-01-01
In this work we devise a classification of mouse activity patterns based on accelerometer data using Detrended Fluctuation Analysis. We use two characteristic mouse behavioural states as benchmarks in this study: waking in free activity and slow-wave sleep (SWS). In both situations we find roughly the same pattern: for short time intervals we observe high correlation in activity - a typical 1/f complex pattern - while for large time intervals there is anti-correlation. High correlation at short intervals (in both the waking state and SWS) is related to highly coordinated muscle activity. In the waking state we associate high correlation both with muscle activity and with stereotyped mouse movements (grooming, walking, etc.). On the other hand, the anti-correlation observed over large time scales during SWS appears related to a feedback autonomic response. The transition from the correlated regime at short scales to the anti-correlated regime at large scales during SWS is given by the respiratory cycle interval, while during the waking state this transition occurs at the time scale corresponding to the duration of the stereotyped mouse movements. Furthermore, we find that the waking state is characterized by longer time scales than SWS and by a softer transition from correlation to anti-correlation. Moreover, this soft transition in the waking state encompasses a behavioural time-scale window that gives rise to a multifractal pattern. We believe that the observed multifractality in mouse activity is formed by the integration of several stereotyped movements, each one with a characteristic time correlation. Finally, we compare scaling properties of body acceleration fluctuation time series during sleep and wake periods for healthy mice. Interestingly, differences between sleep and wake in the scaling exponents are comparable to previous works regarding human heartbeat. 
Complementarily, the nature of these sleep-wake dynamics could lead to a better understanding of neuroautonomic regulation mechanisms. PMID:25275515
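Detrended Fluctuation Analysis, the method underlying these correlation/anti-correlation findings, integrates the series, removes a local trend in windows of increasing size n, and reads the scaling exponent alpha off the slope of log F(n) versus log n (alpha ≈ 0.5 for uncorrelated data, > 0.5 for persistent correlations). A sketch with first-order detrending; the white-noise and random-walk signals are standard benchmarks, not the accelerometer data:

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis; returns the scaling exponent alpha.

    The series is integrated, cut into non-overlapping windows of each
    scale, linearly detrended per window, and the RMS fluctuation F(n)
    regressed against n on log-log axes.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    fluct = []
    for n in scales:
        f2 = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)        # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(2)
white = rng.standard_normal(4096)   # uncorrelated: alpha ~ 0.5
walk = np.cumsum(white)             # integrated noise: alpha ~ 1.5
scales = [8, 16, 32, 64, 128]
```

Fitting separate slopes over short- and long-scale windows, as in this study, reveals the crossover from correlation to anti-correlation.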
Extended time-interval analysis
NASA Astrophysics Data System (ADS)
Fynbo, H. O. U.; Riisager, K.
2014-01-01
Several extensions of the half-life analysis method recently suggested by Horvat and Hardy are put forward. Goodness-of-fit testing is included, and the method is extended to cases where more information is available for each decay event, which allows applications also to, e.g., γ-decay data. The results are tested with Monte Carlo simulations and are applied to the decays of 64Cu and 56Mn.
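At the core of any half-life analysis of event data is the exponential decay-time likelihood. The sketch below covers only the simplest case of a pure source observed without background or dead time, a strong simplification relative to the method discussed above: there the maximum-likelihood estimate of the mean lifetime is just the sample mean of the decay times, and T½ = ln 2 · τ.

```python
import math
import random

def half_life_mle(decay_times):
    """MLE of the half-life from fully observed exponential decay times.

    For t_i ~ Exp(lambda), the MLE of the mean lifetime tau = 1/lambda is
    the sample mean, and T_1/2 = ln(2) * tau.
    """
    tau_hat = sum(decay_times) / len(decay_times)
    return math.log(2.0) * tau_hat

# Monte Carlo check against a known half-life (64Cu: 12.701 h, in seconds)
random.seed(3)
true_half_life = 12.701 * 3600.0
tau = true_half_life / math.log(2.0)
sample = [random.expovariate(1.0 / tau) for _ in range(100_000)]
est = half_life_mle(sample)
```

Handling background events, finite observation windows, and per-event side information requires the extended likelihoods the paper develops.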
Intact Interval Timing in Circadian CLOCK Mutants
Cordes, Sara; Gallistel, C. R.
2008-01-01
While progress has been made in determining the molecular basis for the circadian clock, the mechanism by which mammalian brains time intervals measured in seconds to minutes remains a mystery. An obvious question is whether the interval timing mechanism shares molecular machinery with the circadian timing mechanism. In the current study, we trained circadian CLOCK +/− and −/− mutant male mice in a peak-interval procedure with 10 and 20-s criteria. The mutant mice were more active than their wild-type littermates, but there were no reliable deficits in the accuracy or precision of their timing as compared with wild-type littermates. This suggests that expression of the CLOCK protein is not necessary for normal interval timing. PMID:18602902
LING, YANG; FAN, LIEYING; DONG, CHUNLEI; ZHU, JING; LIU, YONGPING; NI, YAN; ZHU, CHANGTAI; ZHANG, CHANGSONG
2010-01-01
The aim of this study was to investigate possible differences in cellular immunity between chemo- and/or radiotherapy groups during a long interval after surgery in esophageal squamous cell carcinoma (ESCC) patients. Cellular immunity was assessed as peripheral lymphocyte subsets in response to chemotherapy (CT), radiotherapy (RT) and CT+RT by flow cytometric analysis. There were 139 blood samples obtained at different time points relative to surgery from 73 patients with ESCC. The changes in the absolute and relative proportions of lymphocyte phenotypes were significant among the adjuvant therapy groups. There were significant differences in the absolute counts of CD4+ and CD8+ T cells among the interval groups, and a lower CD4/CD8 ratio was found in patients following a prolonged interval. RT alone had a profound effect on the absolute counts of CD3+, CD4+ and CD8+ T cells compared with the other groups. CD4+ T cells exhibited a decreasing trend during a long interval, leading to a prolonged T-cell imbalance after surgery. Univariate analysis revealed that the interaction of the type of adjuvant therapy and the interval after surgery was correlated only with the percentage of CD4+ T cells. The percentage of CD4+ T cells can be used as an indicator of the cellular immunity after surgery in ESCC patients. However, natural killer cells consistently remained suppressed in ESCC patients following adjuvant therapy after surgery. These findings confirm an interaction between adjuvant therapy and the interval after surgery on peripheral CD4+ T cells, and implies that adjuvant therapy may have selective influence on the cellular immunity of ESCC patients after surgery. PMID:23136603
Mannion, Melissa L; Xie, Fenglong; Baddley, John; Chen, Lang; Curtis, Jeffrey R; Saag, Kenneth; Zhang, Jie; Beukelman, Timothy
2016-09-05
To investigate the utilization of health care services before and after transfer from pediatric to adult rheumatology care in clinical practice. Using US commercial claims data from January 2005 through August 2012, we identified individuals with a JIA diagnosis code from a pediatric rheumatologist followed by any diagnosis code from an adult rheumatologist. Individuals had 6 months of observable time before the last pediatric visit and 6 months after the first adult visit. Medication, emergency room, and physical therapy use and diagnosis codes were compared between the pediatric and adult intervals using McNemar's test. The proportion of days covered (PDC) of TNFi for the time between the last pediatric and first adult visit was calculated. We identified 58 individuals with JIA who transferred from pediatric to adult rheumatology care after the age of 14. The median age at the last pediatric rheumatology visit was 18.1 years and the median transfer interval was 195 days. 29 % of patients received NSAIDs in the adult interval compared to 43 % in the pediatric interval (p = 0.06). In the pediatric interval, 71 % received a JRA and 0 % received an RA physician diagnosis code, compared to 28 and 45 %, respectively, in the adult interval. The median PDC for patients receiving a TNFi was 0.75 during the transfer interval. Individuals with JIA who transferred to adult care were more likely to receive a diagnosis of RA instead of JRA and were less likely to receive NSAIDs, but had no significant immediate changes to other medication use.
Dubben, H H; Beck-Bornholdt, H P
2000-12-01
The statistical quality of the contributions to "Strahlentherapie und Onkologie" is assessed, aiming for improvement of the journal and consequently its impact factor. All 181 articles published during 1998 and 1999 in the categories "review", "original contribution", and "short communication" were analyzed concerning actuarial analysis of time-failure data. One hundred and twenty-three publications without time-failure data were excluded from analysis. Forty-five of the remaining 58 publications with time-failure data were evaluated actuarially. This corresponds to 78% (95% confidence interval: 64 to 88%) of papers in which data were adequately analyzed. Complications were reported in 16 of 58 papers, but in only 3 cases actuarially. The number of patients at risk during the course of follow-up was documented adequately in 22 of the 45 publications with actuarial analysis. Authors, peer reviewers, and editors could help improve the quality of the journal by setting value on actuarial analysis of time-failure data.
Cryptocurrency price drivers: Wavelet coherence analysis revisited.
Phillips, Ross C; Gorse, Denise
2018-01-01
Cryptocurrencies have experienced recent surges in interest and price. It has been discovered that there are time intervals where cryptocurrency prices and certain online and social media factors appear related. In addition it has been noted that cryptocurrencies are prone to experience intervals of bubble-like price growth. The hypothesis investigated here is that relationships between online factors and price are dependent on market regime. In this paper, wavelet coherence is used to study co-movement between a cryptocurrency price and its related factors, for a number of examples. This is used alongside a well-known test for financial asset bubbles to explore whether relationships change dependent on regime. The primary finding of this work is that medium-term positive correlations between online factors and price strengthen significantly during bubble-like regimes of the price series; this explains why these relationships have previously been seen to appear and disappear over time. A secondary finding is that short-term relationships between the chosen factors and price appear to be caused by particular market events (such as hacks / security breaches), and are not consistent from one time interval to another in the effect of the factor upon the price. In addition, for the first time, wavelet coherence is used to explore the relationships between different cryptocurrencies.
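The regime-dependent co-movement described above can be illustrated with a minimal sketch. Wavelet coherence resolves both time and scale; the rolling-window correlation below captures only the time-localized aspect, and all series values are toy data, not from the study:

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(va * vb)

def rolling_corr(x, y, window):
    """Correlation of x and y over each sliding window (time-localized view)."""
    return [pearson(x[i:i + window], y[i:i + window])
            for i in range(len(x) - window + 1)]

# toy series: hypothetical price and online-factor values; they co-move
# only in the second, bubble-like half of the record
price = [1, 2, 1, 2, 1, 2, 10, 20, 30, 40, 50, 60]
factor = [4, 3, 4, 5, 4, 4, 11, 22, 33, 41, 52, 63]
corrs = rolling_corr(price, factor, window=6)
print(round(corrs[0], 2), round(corrs[-1], 2))  # near 0 early, near 1 late
```

The appearing-and-disappearing correlations reported in the paper correspond to this kind of windowed view strengthening during particular regimes.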
Working times of elastomeric impression materials determined by dimensional accuracy.
Tan, E; Chai, J; Wozniak, W T
1996-01-01
The working times of five poly(vinyl siloxane) impression materials were estimated by evaluating the dimensional accuracy of stone dies of impressions of a standard model made at successive time intervals. The stainless steel standard model was represented by two abutments having known distances between landmarks in three dimensions. Three dimensions in the x-, y-, and z-axes of the stone dies were measured with a traveling microscope. A time interval was rejected as being within the working time if the percentage change of the resultant dies, in any dimension, was statistically different from those measured from stone dies from previous time intervals. The absolute dimensions of those dies from the rejected time interval also must have exceeded all those from previous time intervals. Results showed that the working times estimated with this method generally were about 30 seconds longer than those recommended by the manufacturers.
Single-channel autocorrelation functions: the effects of time interval omission.
Ball, F G; Sansom, M S
1988-01-01
We present a general mathematical framework for analyzing the dynamic aspects of single channel kinetics incorporating time interval omission. An algorithm for computing model autocorrelation functions, incorporating time interval omission, is described. We show, under quite general conditions, that the form of these autocorrelations is identical to that which would be obtained if time interval omission was absent. We also show, again under quite general conditions, that zero correlations are necessarily a consequence of the underlying gating mechanism and not an artefact of time interval omission. The theory is illustrated by a numerical study of an allosteric model for the gating mechanism of the locust muscle glutamate receptor-channel. PMID:2455553
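A bare sketch of the dwell-time autocorrelation such analyses compute (toy data; the paper's actual contribution, correcting for time interval omission, is not reproduced here). For a gating scheme with a single open state, successive open times are independent and the lag-1 autocorrelation is near zero:

```python
import random

def autocorr(xs, lag):
    """Sample autocorrelation of a sequence of dwell times at a given lag."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag))
    return cov / var

random.seed(0)
# toy open dwell times drawn from one exponential distribution:
# successive intervals are independent, so correlations should be ~0
xs = [random.expovariate(1.0) for _ in range(5000)]
r1 = autocorr(xs, 1)
print(round(r1, 3))  # near zero
```

Nonzero correlations at several lags would instead point to multiple interconverting open or closed states in the underlying gating mechanism.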
NASA Astrophysics Data System (ADS)
Lyubushin, Alexey
2016-04-01
The problem of estimating current seismic danger from monitoring of seismic noise properties recorded by the broadband seismic network F-net in Japan (84 stations) is considered. Variations of the following seismic noise parameters are analyzed: multifractal singularity spectrum support width, generalized Hurst exponent, minimum Hölder-Lipschitz exponent, and minimum normalized entropy of squared orthogonal wavelet coefficients. These parameters are estimated within adjacent time windows of length 1 day for seismic noise waveforms from each station. Calculating daily median values of these parameters over all stations provides a 4-dimensional time series that describes integral properties of the seismic noise in the region covered by the network. Cluster analysis is applied to the sequence of clouds of 4-dimensional vectors within a moving time window of length 365 days with a mutual shift of 3 days, starting from the beginning of 1997 up to the current time. The purpose of the cluster analysis is to find the best number of clusters (BNC) from probe numbers varying from 1 up to a maximum value of 40. The BNC is found from the maximum of the pseudo-F-statistic (PFS). A 2D map can then be created that presents the dependence of the PFS on the tested probe number of clusters and on the right-hand end of the moving time window, rather similar to the usual spectral time-frequency diagrams. In the paper [1] it was shown that the BNC before the Tohoku mega-earthquake of March 11, 2011, exhibited a strongly chaotic regime, with jumps from minimum up to maximum values in the year before the event, and that this time interval was characterized by high PFS values. The PFS map is proposed as a method for extracting time intervals of high current seismic danger. The next dangerous time interval after the Tohoku mega-earthquake began at the end of 2012 and ended in the middle of 2013. Starting from the middle of 2015, high PFS values and the chaotic regime of BNC variations returned. This could be interpreted as an increase in the danger of the next mega-earthquake in Japan in the region of the Nankai Trough [1] in the first half of 2016. References 1. Lyubushin, A., 2013. How soon would the next mega-earthquake occur in Japan? Natural Science, 5(8A1), 1-7. http://dx.doi.org/10.4236/ns.2013.58A1001
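The pseudo-F statistic used to pick the best number of clusters is the Calinski-Harabasz ratio of between-cluster to within-cluster variance. A minimal 1-D sketch (toy points, not the 4-dimensional F-net noise parameters) shows how it prefers the true cluster count:

```python
def pseudo_f(points, labels, k):
    """Calinski-Harabasz pseudo-F statistic for a clustering of 1-D data."""
    n = len(points)
    grand = sum(points) / n
    clusters = {c: [p for p, l in zip(points, labels) if l == c]
                for c in set(labels)}
    # between-cluster and within-cluster sums of squares
    bss = sum(len(c) * (sum(c) / len(c) - grand) ** 2
              for c in clusters.values())
    wss = sum(sum((p - sum(c) / len(c)) ** 2 for p in c)
              for c in clusters.values())
    return (bss / (k - 1)) / (wss / (n - k))

pts = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]      # two well-separated clumps
f2 = pseudo_f(pts, [0, 0, 0, 1, 1, 1], 2)    # the natural partition
f3 = pseudo_f(pts, [0, 0, 1, 2, 2, 2], 3)    # needlessly splits a clump
print(f2 > f3)  # True: pseudo-F is maximized at the true number of clusters
```

In the paper this maximization is repeated for each position of the moving window, producing the time-dependent BNC series.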
Time series regression studies in environmental epidemiology.
Bhaskaran, Krishnan; Gasparrini, Antonio; Hajat, Shakoor; Smeeth, Liam; Armstrong, Ben
2013-08-01
Time series regression studies have been widely used in environmental epidemiology, notably in investigating the short-term associations between exposures such as air pollution, weather variables or pollen, and health outcomes such as mortality, myocardial infarction or disease-specific hospital admissions. Typically, for both exposure and outcome, data are available at regular time intervals (e.g. daily pollution levels and daily mortality counts) and the aim is to explore short-term associations between them. In this article, we describe the general features of time series data, and we outline the analysis process, beginning with descriptive analysis, then focusing on issues in time series regression that differ from other regression methods: modelling short-term fluctuations in the presence of seasonal and long-term patterns, dealing with time varying confounding factors and modelling delayed ('lagged') associations between exposure and outcome. We finish with advice on model checking and sensitivity analysis, and some common extensions to the basic model.
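The core of such a time series regression is a Poisson model with a log link for daily counts. A minimal sketch via iteratively reweighted least squares with a single exposure covariate (no seasonal spline, lags, or confounders; the data and the 0.8 coefficient are simulated for illustration):

```python
import math, random

def rpois(lam):
    """Knuth's method for a Poisson random draw (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def poisson_fit(x, y, iters=25):
    """Fit log E[y] = b0 + b1*x by iteratively reweighted least squares."""
    b0, b1 = math.log(sum(y) / len(y)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # working response for the log link
        z = [b0 + b1 * xi + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        sw = sum(mu)
        swx = sum(m * xi for m, xi in zip(mu, x))
        swxx = sum(m * xi * xi for m, xi in zip(mu, x))
        swz = sum(m * zi for m, zi in zip(mu, z))
        swxz = sum(m * xi * zi for m, xi, zi in zip(mu, x, z))
        det = sw * swxx - swx * swx
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1

random.seed(1)
pollution = [random.random() for _ in range(2000)]            # scaled exposure
counts = [rpois(math.exp(1.0 + 0.8 * p)) for p in pollution]  # true b1 = 0.8
b0, b1 = poisson_fit(pollution, counts)
print(round(b1, 2))  # close to the true log-rate-ratio of 0.8
```

In practice a GLM routine (e.g. statsmodels or R's glm) would be used, with splines of time added to the linear predictor to absorb seasonal and long-term patterns.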
Multiresolution analysis of Bursa Malaysia KLCI time series
NASA Astrophysics Data System (ADS)
Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed
2017-05-01
In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series processing is concerned with the theory and practice of analyzing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time and frequency domain analysis. Prediction can then be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
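A one-level sketch of the DWT decomposition underlying such a multiresolution analysis, using the Haar wavelet (in practice a library such as PyWavelets would be used; this hand-rolled version only shows the approximation/detail split and its exact invertibility):

```python
def haar_step(x):
    """One Haar DWT level: approximation (smooth) and detail coefficients."""
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar level, recovering the original series exactly."""
    s = 2 ** -0.5
    x = []
    for a, d in zip(approx, detail):
        x.append(s * (a + d))
        x.append(s * (a - d))
    return x

prices = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]  # toy closing prices
a, d = haar_step(prices)
rec = haar_inverse(a, d)
print(all(abs(p - r) < 1e-12 for p, r in zip(prices, rec)))  # True
```

Applying `haar_step` recursively to the approximation coefficients yields the multiresolution hierarchy; the detail series at each level isolate fluctuations at a particular time scale.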
Effect of time of day and duration into shift on hazardous exposures to biological fluids.
Macias, D J; Hafner, J; Brillman, J C; Tandberg, D
1996-06-01
To determine whether hospital employee biological hazardous exposure rates varied with time of day or increased with time interval into shift. This was a retrospective occurrence report review conducted at a university hospital with an emergency medicine residency program. Health care worker biological hazardous exposure data over a 30-month period were reviewed. Professional status, date, time, and type of exposure (needlestick, laceration, splash), time interval into shift of exposure, and hospital location of exposure were recorded. Hourly employee counts and risky procedure counts were matched by location with each reported exposure, to determine hourly rates of biological hazardous exposures. Analysis of 411 recorded exposures demonstrated that more people were exposed between 9:00 AM and 11:00 AM (p < 0.05), yet the exposure risk did not vary significantly when expressed as the number of exposures per worker or per procedure. Of the 393 exposures with data describing time interval into shift when the exposure occurred, significant numbers of exposures occurred during the first hour and at shift's end [when corrected for exposures per worker (p < 0.05) or exposures per procedure (p < 0.05)]. While the number of exposures is increased in the AM hours, the exposure rate (as a function of workers or procedures) does not vary with time of day. However, the exposure rate is increased during the first hour and last 2 hours of a shift. Efforts to increase worker precautions at the beginning and end of shifts are warranted.
NASA Astrophysics Data System (ADS)
Martinez, Mathieu; Bodin, Stéphane; Krencker, François-Nicolas
2015-04-01
The Pliensbachian and Toarcian stages (Early Jurassic) are marked by a series of carbon cycle disturbances, major climatic changes and severe faunal turnovers. An accurate knowledge of the timing of the Pliensbachian and Toarcian ages is key to quantifying fluxes and rhythms of faunal and geochemical processes during these major environmental perturbations. Although many studies have provided astrochronological frameworks for the Toarcian Stage and the Toarcian oceanic anoxic event, no precise time frame exists for the Pliensbachian-Toarcian transition, often condensed in the previously studied sections. Here, we provide an astrochronology of the Pliensbachian-Toarcian transition in the Foum Tillicht section (central High Atlas, Morocco). The section is composed of decimetric hemipelagic marl-limestone alternations accompanied by cyclic fluctuations in the δ13Cmicrite. In this section, the marl-limestone alternations reflect cyclic sea-level/climatic changes, which trigger rhythmic migrations of the surrounding carbonate platforms and modulate the amount of carbonate exported to the basin. The studied interval encompasses 142.15 m of the section, from the base of the series to a hiatus in the Early Toarcian, marked by an erosional surface. The Pliensbachian-Toarcian (P-To) Event, a negative excursion in carbonate δ13Cmicrite, is observed pro parte in this studied interval. δ13Cmicrite measurements were performed every ~2 m at the base of the section and every 0.20 m within the P-To Event interval. Spectral analyses were performed using the multi-taper method and the evolutive Fast Fourier Transform to obtain an accurate assessment of the main significant periods and their evolution throughout the studied interval. Two main cycles are observed in the series: the 405-kyr eccentricity cycle is observed throughout the series, while the obliquity cycle is observed within the P-To Event, in the most densely sampled interval. The studied interval covers a 3.6-Myr time span. The duration of the part of the P-To Event covered in this analysis is assessed at 0.70 Myr. In addition, the interval from the base of the Toarcian to the first occurrence of the calcareous nannofossil C. superbus has a duration assessed at 0.47 to 0.55 Myr. This duration is significantly longer than most assessments obtained by previous cyclostratigraphic analyses, showing that earlier studies underestimated the duration of this interval, often condensed in the Western Tethys. This study shows the potential of the Foum Tillicht section to provide a refined time frame of the Pliensbachian-Toarcian boundary, which could be integrated in the next Geological Time Scale.
Huang, Si-Si; Xie, Dong-Mei; Cai, Yi-Jing; Wu, Jian-Min; Chen, Rui-Chong; Wang, Xiao-Dong; Song, Mei; Zheng, Ming-Hua; Wang, Yu-Qun; Lin, Zhuo; Shi, Ke-Qing
2017-04-01
Hepatitis B virus (HBV) infection remains a major health problem and HBV-related-decompensated cirrhosis (HBV-DC) usually leads to a poor prognosis. Our aim was to determine the utility of inflammatory biomarkers in predicting mortality of HBV-DC. A total of 329 HBV-DC patients were enrolled. Survival estimates for the entire study population were generated using the Kaplan-Meier method. The prognostic values for model for end-stage liver disease (MELD) score, Child-Pugh score, and inflammatory biomarkers neutrophil/lymphocyte ratio, C-reactive protein-to-albumin ratio (CAR), and lymphocyte-to-monocyte ratio (LMR) for HBV-DC were compared using time-dependent receiver operating characteristic curves and time-dependent decision curves. The survival time was 23.1±15.8 months. Multivariate analysis identified age, CAR, LMR, and platelet count as prognostic independent risk factors. Kaplan-Meier analysis indicated that CAR of at least 1.0 (hazard ratio, 7.19; 95% confidence interval, 4.69-11.03), and LMR less than 1.9 (hazard ratio, 2.40; 95% confidence interval, 1.69-3.41) were independently associated with mortality of HBV-DC. The time-dependent receiver operating characteristic indicated that CAR showed the best performance in predicting mortality of HBV-DC compared with LMR, MELD score, and Child-Pugh score. The results were also confirmed by time-dependent decision curves. CAR and LMR were associated with the prognosis of HBV-DC. CAR was superior to LMR, MELD score, and Child-Pugh score in HBV-DC mortality prediction.
2D Slightly Compressible Ideal Flow in an Exterior Domain
NASA Astrophysics Data System (ADS)
Secchi, Paolo
2006-12-01
We consider the Euler equations of barotropic inviscid compressible fluids in the exterior domain. It is well known that, as the Mach number goes to zero, the compressible flows approximate the solution of the equations of motion of inviscid, incompressible fluids. In dimension 2 such limit solution exists on any arbitrary time interval, with no restriction on the size of the initial data. It is then natural to expect the same for the compressible solution, if the Mach number is sufficiently small. First we study the life span of smooth irrotational solutions, i.e. the largest time interval T(ɛ) of existence of classical solutions, when the initial data are a small perturbation of size ɛ from a constant state. Then, we study the nonlinear interaction between the irrotational part and the incompressible part of a general solution. This analysis yields the existence of smooth compressible flow on any arbitrary time interval and with no restriction on the size of the initial velocity, for any Mach number sufficiently small. Finally, the approach is applied to the study of the incompressible limit. For the proofs we use a combination of energy estimates and a decay estimate for the irrotational part.
NASA Technical Reports Server (NTRS)
Humphreys, E. A.
1981-01-01
A computerized, analytical methodology was developed to study damage accumulation during low velocity lateral impact of layered composite plates. The impact event was modeled as perfectly plastic with complete momentum transfer to the plate structure. A transient dynamic finite element approach was selected to predict the displacement time response of the plate structure. Composite ply and interlaminar stresses were computed at selected time intervals and subsequently evaluated to predict layer and interlaminar damage. The effects of damage on elemental stiffness were then incorporated back into the analysis for subsequent time steps. Damage predicted included fiber failure, matrix ply failure and interlaminar delamination.
Neuronal and network computation in the brain
NASA Astrophysics Data System (ADS)
Babloyantz, A.
1999-03-01
The concepts and methods of non-linear dynamics have been a powerful tool for studying some aspects of brain dynamics. In this paper we show how, from time series analysis of electroencephalograms in sick and healthy subjects, the chaotic nature of brain activity could be unveiled. This finding gave rise to the concept of spatiotemporal cortical chaotic networks, which in turn was the foundation for a simple brain-like device that is able to become attentive and perform pattern recognition and motion detection. A new method of time series analysis is also proposed which demonstrates for the first time the existence of a neuronal code in the interspike intervals of cochlear cells.
Li, Zhen; Han, Xiu-Guo; Sheng, Jing; Ma, Shao-Jun
2016-05-01
To evaluate the effectiveness of virtual reality interventions for improving balance in people after stroke. Systematic review and meta-analysis of randomized controlled trials. Studies were obtained by searching the following databases: MEDLINE, CINAHL, EMBASE, Web of Science and CENTRAL. Two reviewers assessed studies for inclusion, extracted data and assessed trial quality. Sixteen studies involving 428 participants were included. People who received virtual reality interventions showed marked improvements in Berg Balance Scale (mean difference: 1.46, 95% confidence interval: 0.09-2.83, P<0.05, I²=0%) and Timed Up and Go Test (mean difference: -1.62, 95% confidence interval: -3.07- -0.16, P<0.05, I²=24%) compared with controls. This meta-analysis of randomized controlled trials supports the use of virtual reality to improve balance after stroke. © The Author(s) 2015.
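Pooled mean differences with confidence intervals like those reported are typically computed by inverse-variance weighting; with I²=0%, a fixed-effect model is a natural choice. A minimal sketch with hypothetical trial summaries (not the data of this review):

```python
def fixed_effect_pool(effects, std_errs):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI."""
    weights = [1 / (s * s) for s in std_errs]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5
    return est, est - 1.96 * se, est + 1.96 * se

# three hypothetical trials: mean difference and its standard error
est, lo, hi = fixed_effect_pool([1.0, 2.0, 1.5], [0.5, 1.0, 0.5])
print(round(est, 2), round(lo, 2), round(hi, 2))  # 1.33 0.68 1.99
```

More precise trials (smaller standard errors) dominate the pooled estimate, which is why small trials contribute little to results such as the Berg Balance Scale difference above.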
Clonal differences in generation times of GPK epithelial cells in monolayer culture.
Riley, P A; Hola, M
1980-01-01
Pedigrees of cells in eight clones of guinea pig keratocyte (GPK) cells in monolayer culture were analyzed from a time-lapse film. The generation times and the position in the field of observation were recorded up to the sixth generation when the cultures were still subconfluent. Statistical analysis of the results indicates that the position in the culture has less significance than the clonal origin of the cell in determining the interval between successive mitoses.
Validation of Heart Rate Monitor Polar RS800 for Heart Rate Variability Analysis During Exercise.
Hernando, David; Garatachea, Nuria; Almeida, Rute; Casajús, Jose A; Bailón, Raquel
2018-03-01
Hernando, D, Garatachea, N, Almeida, R, Casajús, JA, and Bailón, R. Validation of heart rate monitor Polar RS800 for heart rate variability analysis during exercise. J Strength Cond Res 32(3): 716-725, 2018-Heart rate variability (HRV) analysis during exercise is an interesting noninvasive tool to measure the cardiovascular response to the stress of exercise. Wearable heart rate monitors are a comfortable option to measure interbeat (RR) intervals while doing physical activities. It is necessary to evaluate the agreement between HRV parameters derived from the RR series recorded by wearable devices and those derived from an electrocardiogram (ECG) during dynamic exercise of low to high intensity. Twenty-three male volunteers performed an exercise stress test on a cycle ergometer. Subjects wore a Polar RS800 device, whereas ECG was also recorded simultaneously to extract the reference RR intervals. A time-frequency spectral analysis was performed to extract the instantaneous mean heart rate (HRM), and the power of low-frequency (PLF) and high-frequency (PHF) components, the latter centered on the respiratory frequency. Analysis was done in intervals of different exercise intensity based on oxygen consumption. Linear correlation, reliability, and agreement were computed in each interval. The agreement between the RR series obtained from the Polar device and from the ECG is high throughout the whole test although the shorter the RR is, the more differences there are. Both methods are interchangeable when analyzing HRV at rest. At high exercise intensity, HRM and PLF still presented a high correlation (ρ > 0.8) and excellent reliability and agreement indices (above 0.9). However, the PHF measurements from the Polar showed reliability and agreement coefficients around 0.5 or lower when the level of the exercise increases (for levels of O2 above 60%).
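The LF/HF decomposition described above comes from a spectrum of the evenly resampled RR series. A direct-DFT sketch with toy RR values carrying purely respiratory modulation (real pipelines use time-frequency methods, proper resampling of detected beats, and a respiration-centered HF band):

```python
import math

def band_power(x, fs, f_lo, f_hi):
    """Periodogram power of x in [f_lo, f_hi] Hz via a direct DFT."""
    n = len(x)
    mean = sum(x) / n
    xs = [v - mean for v in x]
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(xs[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(xs[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

fs = 4.0                          # Hz, evenly resampled RR series
t = [i / fs for i in range(256)]
# toy RR series: 0.8-s beats modulated at the respiratory rate (0.25 Hz)
rr = [0.8 + 0.05 * math.sin(2 * math.pi * 0.25 * ti) for ti in t]
plf = band_power(rr, fs, 0.04, 0.15)   # low-frequency power (PLF)
phf = band_power(rr, fs, 0.15, 0.40)   # high-frequency power (PHF)
print(phf > plf)  # True: respiratory modulation lands in the HF band
```

The PHF degradation the study reports at high intensity reflects errors in the device's RR detection concentrating exactly in this respiratory band.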
A microcomputer system for on-line study of atrioventricular node accommodation.
Jenkins, J R; Clemo, H F; Belardinelli, L
1987-11-01
An automated on-line programmable stimulator and interval measurement system was developed to study atrioventricular node (AVN) accommodation. This dedicated microcomputer system measures and stores the stimulus-to-His bundle (S-H) interval from His bundle electrogram (HBE) recordings. Interval measurements for each beat are accurate to within 500 microsecond. This user-controlled system has been used to stimulate at any rate up to 6.5 Hz and to measure intervals up to 125 ms in isolated perfused guinea pig hearts. A built-in timer-reset mechanism prevents failure of the system in the absence of a His potential (i.e., 2:1 AV block). It may be modified for use in clinical studies or other experimental systems and has the ability to measure other physiological intervals. The system provides the precision in pacing and accuracy in the measurement of AVN conduction time that is necessary for meaningful analysis of AVN accommodation and has the simplicity of design and use that is not available in previously described systems. Furthermore, this computer system can be used not only in studies involving AV conduction, but also in any setting where programmed stimulation and interval measurement and recording need to be performed simultaneously.
NASA Technical Reports Server (NTRS)
Dupuis, L. R.; Scoggins, J. R.
1979-01-01
Results of analyses revealed that nonlinear changes or differences formed centers or systems that were mesosynoptic in nature. These systems correlated well in space with upper-level short waves, frontal zones, and radar-observed convection, and were very systematic in time and space. Many of the centers of differences were well established in the vertical, extending up to the tropopause. Statistical analysis showed that, on average, nonlinear changes were larger in convective areas than in nonconvective regions. Errors often exceeding 100 percent were made by assuming variables to change linearly through a 12-h period in areas of thunderstorms, indicating that these nonlinear changes are important in the development of severe weather. Linear changes, however, accounted for more and more of an observed change as the time interval (within the 12-h interpolation period) increased, implying that the accuracy of linear interpolation increased over larger time intervals.
Statistical analysis of strait time index and a simple model for trend and trend reversal
NASA Astrophysics Data System (ADS)
Chen, Kan; Jayaprakash, C.
2003-06-01
We analyze the daily closing prices of the Strait Time Index (STI) as well as the individual stocks traded in Singapore's stock market from 1988 to 2001. We find that the Hurst exponent is approximately 0.6 for both the STI and individual stocks, while the normal correlation functions show the random walk exponent of 0.5. We also investigate the conditional average of the price change in an interval of length T given the price change in the previous interval. We find strong correlations for price changes larger than a threshold value proportional to T; this indicates that there is no uniform crossover to Gaussian behavior. A simple model based on short-time trend and trend reversal is constructed. We show that the model exhibits statistical properties and market swings similar to those of the real market.
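A rescaled-range (R/S) sketch of the Hurst exponent estimate reported above, on simulated uncorrelated returns rather than STI data (R/S is known to be biased slightly upward in short windows, so an i.i.d. series tends to come out a little above 0.5):

```python
import math, random

def rs_hurst(x, sizes):
    """Estimate the Hurst exponent as the log-log slope of R/S vs window size."""
    logs, logn = [], []
    for n in sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            m = sum(w) / n
            dev = [v - m for v in w]
            cum, s = [], 0.0
            for dv in dev:              # cumulative deviations from the mean
                s += dv
                cum.append(s)
            r = max(cum) - min(cum)     # range of the cumulative series
            sd = math.sqrt(sum(dv * dv for dv in dev) / n)
            if sd > 0:
                rs_vals.append(r / sd)
        logs.append(math.log(sum(rs_vals) / len(rs_vals)))
        logn.append(math.log(n))
    # least-squares slope of log(R/S) against log(n)
    mn, ms = sum(logn) / len(logn), sum(logs) / len(logs)
    num = sum((a - mn) * (b - ms) for a, b in zip(logn, logs))
    den = sum((a - mn) ** 2 for a in logn)
    return num / den

random.seed(2)
returns = [random.gauss(0, 1) for _ in range(4096)]
h = rs_hurst(returns, [16, 32, 64, 128, 256])
print(round(h, 2))  # in the vicinity of 0.5 for uncorrelated returns
```

A persistent series such as the one the paper finds (H of approximately 0.6) would yield a correspondingly steeper log-log slope.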
The clinical evaluation of platelet-rich plasma on free gingival graft's donor site wound healing.
Samani, Mahmoud Khosravi; Saberi, Bardia Vadiati; Ali Tabatabaei, S M; Moghadam, Mahdjoube Goldani
2017-01-01
It has been proved that platelet-rich plasma (PRP) can promote wound healing. In this way, PRP can be advantageous in periodontal plastic surgeries, free gingival graft (FGG) being one such surgery. In this randomized split-mouth controlled trial, 10 patients who needed bilateral FGG were selected, and the two donor sites were randomly assigned to experience either natural healing or healing assisted with PRP. The outcome was assessed based on the comparison of the extent of wound closure, the Manchester scale, the Landry healing scale, a visual analog scale, and tissue thickness between the study groups at different time intervals. Repeated-measures analysis of variance and paired t-tests were used. Statistical significance was set at P ≤ 0.05. Significant differences between the study groups, and also across different time intervals, were seen in all parameters except for the changes in tissue thickness. PRP accelerates the healing process of wounds and reduces the healing time.
Yagi, Maiko; Yasunaga, Hideo; Matsui, Hiroki; Morita, Kojiro; Fushimi, Kiyohide; Fujimoto, Masashi; Koyama, Teruyuki; Fujitani, Junko
2017-03-01
We aimed to examine the concurrent effects of timing and intensity of rehabilitation on improving activities of daily living (ADL) among patients with ischemic stroke. Using the Japanese Diagnosis Procedure Combination inpatient database, we retrospectively analyzed consecutive patients with ischemic stroke at admission who received rehabilitation (n=100 719) from April 2012 to March 2014. Early rehabilitation was defined as that starting within 3 days after admission. The average rehabilitation intensity per day was calculated as the total units of rehabilitation during hospitalization divided by the length of hospital stay. A multivariable logistic regression analysis with multiple imputation and an instrumental variable analysis were performed to examine the association of early and intensive rehabilitation with the proportion of improved ADL score. The proportion of improved ADL score was higher in the early and intensive rehabilitation group. The multivariable logistic regression analysis showed that significant improvements in ADL were observed for early rehabilitation (odds ratio: 1.08; 95% confidence interval: 1.04-1.13; P <0.01) and intensive rehabilitation of >5.0 U/d (odds ratio: 1.87; 95% confidence interval: 1.69-2.07; P <0.01). The instrumental variable analysis showed that an increased proportion of improved ADL was associated with early rehabilitation (risk difference: 2.8%; 95% confidence interval: 2.0-3.4%; P <0.001) and intensive rehabilitation (risk difference: 5.6%; 95% confidence interval: 4.6-6.6%; P <0.001). The present results suggested that early and intensive rehabilitation improved ADL during hospitalization in patients with ischemic stroke. © 2017 American Heart Association, Inc.
NASA Astrophysics Data System (ADS)
Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan
2016-10-01
This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems, thus forming an uncertain ensemble that combines non-parametric with mixed fuzzy and interval parametric uncertainties. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA method (SFIPFE/SEA) are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by the first-order Taylor series, while SFIPFE/SEA improves the accuracy by considering the second-order terms of the Taylor series, in which all the mixed second-order terms are neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.
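The first-order interval perturbation idea can be sketched for a scalar response: an interval parameter is propagated through a first-order Taylor expansion of the response function. The paper does this for FE/SEA responses with fuzzy bounds resolved by level cuts; this is a bare one-parameter sketch with a hypothetical response function:

```python
def first_order_interval(f, df, x0, dx):
    """First-order Taylor bounds of f over the parameter interval [x0-dx, x0+dx]."""
    center = f(x0)
    radius = abs(df(x0)) * dx     # |f'(x0)| times the interval half-width
    return center - radius, center + radius

# hypothetical response function and its derivative
f = lambda x: x ** 2 + 3 * x
df = lambda x: 2 * x + 3
lo, hi = first_order_interval(f, df, 2.0, 0.1)
print(round(lo, 2), round(hi, 2))  # 9.3 10.7 vs. the exact range [9.31, 10.71]
```

Second-order and Chebyshev variants, as in the paper, tighten these bounds when the response is noticeably nonlinear over the interval; repeating the computation at each membership level cut produces the fuzzy-bounded interval.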
NASA Astrophysics Data System (ADS)
Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik
2018-05-01
Time series data are a sequence of observations measured at equal time intervals. Time series analysis is used to analyze data while accounting for the effect of time; its purpose is to characterize the patterns in the data and to predict future values from observations in the past. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with selection of the optimal autoregressive (AR) order, determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results of this research show that a state space model of order 4 forecasts electric energy consumption with a Mean Absolute Percentage Error (MAPE) of 3.655%, placing the model in the very good forecasting category.
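The forecast quality above is judged by the Mean Absolute Percentage Error. A minimal sketch of the MAPE computation (the function name and the sample series are illustrative, not data from the study):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Illustrative consumption values (not from the study).
actual = [120.0, 130.0, 125.0, 140.0]
forecast = [118.0, 128.0, 131.0, 137.0]
print(round(mape(actual, forecast), 3))  # 2.537
```

A MAPE under roughly 10% is conventionally read as highly accurate forecasting, which is the sense in which a 3.655% MAPE places the model in the "very good" category.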
Echolocation system of the bottlenose dolphin
NASA Astrophysics Data System (ADS)
Dubrovsky, N. A.
2004-05-01
The hypothesis put forward by Vel’min and Dubrovsky [1] is discussed. The hypothesis suggests that bottlenose dolphins possess two functionally separate auditory subsystems: one serves for analyzing extraneous sounds, as in nonecholocating terrestrial animals, and the other performs the analysis of echoes caused by the echolocation clicks of the animal itself. The first subsystem is called passive hearing, and the second, active hearing. The results of experimental studies of the dolphin’s echolocation system are discussed in support of the proposed hypothesis. For the active hearing of dolphins, the notion of a critical interval is considered: the interval of time within which a merged auditory image of the echolocation object is formed, provided that all highlights of the echo from this object fall within the critical interval.
García-Ruiz, Jose M; Fernández-Jiménez, Rodrigo; García-Alvarez, Ana; Pizarro, Gonzalo; Galán-Arriola, Carlos; Fernández-Friera, Leticia; Mateos, Alonso; Nuno-Ayala, Mario; Aguero, Jaume; Sánchez-González, Javier; García-Prieto, Jaime; López-Melgar, Beatriz; Martínez-Tenorio, Pedro; López-Martín, Gonzalo J; Macías, Angel; Pérez-Asenjo, Braulio; Cabrera, José A; Fernández-Ortiz, Antonio; Fuster, Valentín; Ibáñez, Borja
2016-05-10
Pre-reperfusion administration of intravenous (IV) metoprolol reduces infarct size in ST-segment elevation myocardial infarction (STEMI). This study sought to determine how this cardioprotective effect is influenced by the timing of metoprolol therapy, with either a long or a short metoprolol bolus-to-reperfusion interval. We performed a post hoc analysis of the METOCARD-CNIC (effect of METOprolol in CARDioproteCtioN during an acute myocardial InfarCtion) trial, which randomized anterior STEMI patients to IV metoprolol or control before mechanical reperfusion. Treated patients were divided into short- and long-interval groups, split by the median time from 15 mg metoprolol bolus to reperfusion. We also performed a controlled validation study in 51 pigs subjected to 45 min of ischemia/reperfusion. Pigs were allocated to IV metoprolol with a long (-25 min) or short (-5 min) pre-reperfusion interval, IV metoprolol post-reperfusion (+60 min), or IV vehicle. Cardiac magnetic resonance (CMR) was performed in the acute and chronic phases in both the clinical and experimental settings. For 218 patients (105 receiving IV metoprolol), the median time from 15 mg metoprolol bolus to reperfusion was 53 min. Compared with patients in the short-interval group, those with longer metoprolol exposure had smaller infarcts (22.9 g vs. 28.1 g; p = 0.06) and higher left ventricular ejection fraction (LVEF) (48.3% vs. 43.9%; p = 0.019) on day 5 CMR. These differences occurred despite total ischemic time being significantly longer in the long-interval group (214 min vs. 160 min; p < 0.001). There was no between-group difference in the time from symptom onset to metoprolol bolus. In the animal study, the long-interval group (IV metoprolol 25 min before reperfusion) had the smallest infarcts (day 7 CMR) and highest long-term LVEF (day 45 CMR).
In anterior STEMI patients undergoing primary angioplasty, the sooner IV metoprolol is administered in the course of infarction, the smaller the infarct and the higher the LVEF. These hypothesis-generating clinical data are supported by a dedicated experimental large animal study. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Kalim, Shahid; Nazir, Shaista; Khan, Zia Ullah
2013-01-01
Protocols based on newer high-sensitivity troponin T (hsTropT) assays can rule in a suspected acute myocardial infarction (AMI) as early as 3 hours. We conducted this study to audit adherence to our Trust's newly introduced AMI diagnostic protocol based on paired hsTropT testing at 0 and 3 hours. We retrospectively reviewed data on all patients who had an hsTropT test between 1st and 7th May 2012. Patients' demographics, use of single or paired samples, time interval between paired samples, presenting symptoms, and ECG findings were noted, and their means, medians, standard deviations, and proportions were calculated. A total of 66 patients had an hsTropT test during this period. Mean age was 63.30 +/- 17.46 years, and 38 (57.57%) were males. Twenty-four (36.36%) patients had only a single sample taken rather than the protocol-recommended paired hsTropT samples. Among the 42 (63.63%) patients with paired samples, the mean time interval was 4.41 +/- 5.7 hours. Contrary to the recommendations, 15 (22.73%) had a very long and 2 (3.03%) a very short time interval between the two samples. A subgroup analysis of patients with single samples found only 2 (3.03%) patients with ST-segment elevation, for whom single testing was appropriate. Our study confirmed that in a large number of patients the protocol for paired sampling, or the recommended 3-hour interval between the two samples, was not being followed.
Lee, Taeheon; Park, Jung Ho; Sohn, Chongil; Yoon, Kyung Jae; Lee, Yong-Taek; Park, Jung Hwan; Jung, Il Seok
2017-01-30
We attempted to examine the relationship between abnormal findings on high-resolution manometry (HRM) and videofluoroscopic swallowing study (VFSS) of the pharynx and upper esophageal sphincter (UES), and to identify the risk factors for aspiration. We performed VFSS and HRM on the same day in 36 ischemic stroke patients (mean age, 67.5 years) with dysphagia. Pressure (basal, median intrabolus, and nadir), relaxation time interval of the UES, and mesopharyngeal and hypopharyngeal contractility (as a contractile integral) were examined using HRM. The parameters of VFSS were vallecular residue, pyriform sinus residue, vallecular overflow, penetration, and aspiration. The association between the parameters of VFSS and HRM was analyzed by Student's t test. Three (8.3%) and 4 (11.1%) stroke patients with dysphagia had pyriform sinus residue and vallecular sinus residue, respectively, and 5 (13.8%) patients showed aspiration. Mesopharyngeal and hypopharyngeal contractile integrals in patients with residue in the pyriform sinus were significantly lower than those in patients without residue in the pyriform sinus (P < 0.05). Relaxation time intervals in patients with aspiration were significantly shorter than those in patients without aspiration (P < 0.05), and multivariate regression analysis revealed a shorter relaxation time interval as the main risk factor for aspiration (OR, 0.03; 95% CI, 0.01-0.65; P < 0.05). Manometric measurements of the pharynx and UES correlated well with abnormal findings on VFSS, and a shorter relaxation time interval of the UES during deglutition is an important parameter for the development of aspiration.
NASA Astrophysics Data System (ADS)
Lee, Zoe; Baas, Andreas
2013-04-01
It is widely recognised that boundary layer turbulence plays an important role in sediment transport dynamics in aeolian environments. Improvements in the design and affordability of ultrasonic anemometers have provided significant contributions to studies of aeolian turbulence, by facilitating high frequency monitoring of three dimensional wind velocities. Consequently, research has moved beyond studies of mean airflow properties, to investigations into quasi-instantaneous turbulent fluctuations at high spatio-temporal scales. To fully understand, how temporal fluctuations in shear stress drive wind erosivity and sediment transport, research into the best practice for calculating shear stress is necessary. This paper builds upon work published by Lee and Baas (2012) on the influence of streamline correction techniques on Reynolds shear stress, by investigating the time-averaging interval used in the calculation. Concerns relating to the selection of appropriate averaging intervals for turbulence research, where the data are typically non-stationary at all timescales, are well documented in the literature (e.g. Treviño and Andreas, 2000). For example, Finnigan et al. (2003) found that underestimating the required averaging interval can lead to a reduction in the calculated momentum flux, as contributions from turbulent eddies longer than the averaging interval are lost. To avoid the risk of underestimating fluxes, researchers have typically used the total measurement duration as a single averaging period. For non-stationary data, however, using the whole measurement run as a single block average is inadequate for defining turbulent fluctuations. The data presented in this paper were collected in a field study of boundary layer turbulence conducted at Tramore beach near Rosapenna, County Donegal, Ireland. 
High-frequency (50 Hz) 3D wind velocity measurements were collected using ultrasonic anemometry at thirteen different heights between 0.11 and 1.62 metres above the bed. A technique for determining time-averaging intervals for a series of anemometers stacked in a close vertical array is presented. A minimum timescale is identified using spectral analysis to determine the inertial sub-range, where energy is neither produced nor dissipated but passed down to increasingly smaller scales. An autocorrelation function is then used to derive a scaling pattern between anemometer heights, which defines a series of averaging intervals of increasing length with height above the surface. Results demonstrate the effect of different averaging intervals on the calculation of Reynolds shear stress and highlight the inadequacy of using the total measurement duration as a single block average. Lee, Z. S. & Baas, A. C. W. (2012). Streamline correction for the analysis of boundary layer turbulence. Geomorphology, 171-172, 69-82. Treviño, G. and Andreas, E.L., 2000. Averaging Intervals For Spectral Analysis Of Nonstationary Turbulence. Boundary-Layer Meteorology, 95(2): 231-247. Finnigan, J.J., Clement, R., Malhi, Y., Leuning, R. and Cleugh, H.A., 2003. Re-evaluation of long-term flux measurement techniques. Part I: Averaging and coordinate rotation. Boundary-Layer Meteorology, 107(1): 1-48.
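The quantity at stake in the averaging-interval choice above is the kinematic Reynolds shear stress, -⟨u'w'⟩, where the fluctuations u' and w' are defined relative to the chosen averaging block. A minimal sketch, assuming the block length has already been selected (e.g. by the autocorrelation-based scaling the authors describe); this is an illustration of the general calculation, not the authors' code:

```python
import numpy as np

def reynolds_stress(u, w, block):
    """Kinematic Reynolds shear stress -<u'w'>, with fluctuations
    defined relative to non-overlapping block means of length `block`."""
    n = (len(u) // block) * block        # drop the incomplete trailing block
    stresses = []
    for i in range(0, n, block):
        ub, wb = u[i:i + block], w[i:i + block]
        up = ub - ub.mean()              # fluctuation about the block mean
        wp = wb - wb.mean()
        stresses.append(-(up * wp).mean())
    return float(np.mean(stresses))
```

Setting `block = len(u)` reproduces the single whole-run block average the abstract argues against for non-stationary data, while blocks shorter than the dominant eddy timescale discard part of the momentum flux, which is precisely the trade-off examined in the paper.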
Time to brain imaging in acute stroke is improving: secondary analysis of the INSTINCT trial.
Sauser, Kori; Burke, James F; Levine, Deborah A; Scott, Phillip A; Meurer, William J
2014-01-01
Patients with acute ischemic stroke benefit from rapid evaluation and treatment, and timely brain imaging is a necessary component. We determined the effect of a targeted behavioral intervention on door-to-imaging time (DIT) among patients with ischemic stroke treated with tissue-type plasminogen activator. Second, we examined the variation in DIT accounted for by patient-level and hospital-level factors. The Increasing Stroke Treatment through Interventional behavioral Change Tactics (INSTINCT) trial was a cluster-randomized, controlled trial involving 24 Michigan hospitals. The intervention aimed to increase tissue-type plasminogen activator utilization. Detailed chart abstractions collected data for 557 patients with ischemic stroke. We used a series of hierarchical linear mixed-effects models to evaluate the effect of the intervention on DIT (difference-in-differences analysis) and used patient-level and hospital-level explanatory variables to decompose variation in DIT. DIT improved over time, without a difference between intervention and control hospitals (intervention: 23.7-19.3 minutes, control: 28.9-19.2 minutes; P=0.56). Adjusted DIT was faster in patients who arrived by ambulance (7.2 minutes; 95% confidence interval, 4.1-10.2), had severe strokes (1.0 minute per +5-point National Institutes of Health Stroke Scale; 95% confidence interval, 0.1-2.0), and presented in the postintervention period (4.9 minutes; 95% confidence interval, 2.3-7.4). After accounting for these factors, 13.8% of variation in DIT was attributable to hospital. Neither hospital stroke volume nor stroke center status was associated with DIT. Performance on DIT improved similarly in intervention and control hospitals, suggesting that nonintervention factors explain the improvement. Hospital-level factors explain a modest proportion of variation in DIT, but further research is needed to identify the hospital-level factors responsible.
Variations in rupture process with recurrence interval in a repeated small earthquake
Vidale, J.E.; Ellsworth, W.L.; Cole, A.; Marone, Chris
1994-01-01
In theory and in laboratory experiments, friction on sliding surfaces such as rock, glass and metal increases with time since the previous episode of slip. This time dependence is a central pillar of the friction laws widely used to model earthquake phenomena. On natural faults, other properties, such as rupture velocity, porosity and fluid pressure, may also vary with the recurrence interval. Eighteen repetitions of the same small earthquake, separated by intervals ranging from a few days to several years, allow us to test these laboratory predictions in situ. The events with the longest time since the previous earthquake tend to have about 15% larger seismic moment than those with the shortest intervals, although this trend is weak. In addition, the rupture durations of the events with the longest recurrence intervals are more than a factor of two shorter than for the events with the shortest intervals. Both decreased duration and increased friction are consistent with progressive fault healing during the time of stationary contact.
Stephenson, John; Farndon, Lisa; Concannon, Michael
2016-06-01
This study assesses the effect of salicylic acid plasters on the time to resolution of 324 corns experienced by 201 participants taking part in a randomized controlled trial. While the rate of corn resolution was substantively higher in the treatment group than in the control group, treatment was not significantly related to time to corn resolution when analyzed over the full 12-month follow-up period. Parametric survival analysis modeling of interval-censored data, incorporating patient-specific frailty terms to model the correlation of corns within patients, was utilized (hazard ratio [HR], 1.189; 95% confidence interval [CI], 0.780-1.813; P = 0.422). Median resolution times were 10.0 months for corns in the treatment group and 13.4 months for corns in the control group. Controlling for treatment, corn type was found to be related to resolution time, with dorsal/interdigital (ID) corns showing better resolution than plantar corns (HR, 1.670; 95% CI, 1.061-2.630; P = 0.027). Median resolution times were 5.9 months for dorsal/ID corns and 14.9 months for plantar corns. Secondary measures relating to quality of life (QoL) and foot-related disability, using the EQ-5D questionnaire and the Manchester Foot Pain and Disability Index (MFPDI), were also assessed at the patient level in multivariate models. Treatment was not significantly related to any of these measures over the whole period of analysis. However, a trend analysis revealed a quadratic trend in QoL and MFPDI scores, arising from a substantive initial improvement between baseline and 3 months, followed by a gradual decrease between 3 and 12 months. © 2015 Japanese Dermatological Association.
Hossain, Monir; Wright, Steven; Petersen, Laura A
2002-04-01
One way to monitor patient access to emergent health care services is to use patient characteristics to predict arrival time at the hospital after onset of symptoms. This predicted arrival time can then be compared with actual arrival time to allow monitoring of access to services. Predicted arrival time could also be used to estimate potential effects of changes in health care service availability, such as closure of an emergency department or an acute care hospital. Our goal was to determine the best statistical method for prediction of arrival intervals for patients with acute myocardial infarction (AMI) symptoms. We compared the performance of multinomial logistic regression (MLR) and discriminant analysis (DA) models. Models for MLR and DA were developed using a dataset of 3,566 male veterans hospitalized with AMI in 81 VA Medical Centers in 1994-1995 throughout the United States. The dataset was randomly divided into a training set (n = 1,846) and a test set (n = 1,720). Arrival times were grouped into three intervals on the basis of treatment considerations: <6 hours, 6-12 hours, and >12 hours. One model for MLR and two models for DA were developed using the training dataset. One DA model had equal prior probabilities, and one DA model had proportional prior probabilities. Predictive performance of the models was compared using the test (n = 1,720) dataset. Using the test dataset, the proportions of patients in the three arrival time groups were 60.9% for <6 hours, 10.3% for 6-12 hours, and 28.8% for >12 hours after symptom onset. Whereas the overall predictive performance by MLR and DA with proportional priors was higher, the DA models with equal priors performed much better in the smaller groups. Correct classifications were 62.6% by MLR, 62.4% by DA using proportional prior probabilities, and 48.1% using equal prior probabilities of the groups. 
The misclassifications by MLR for the three groups were 9.5%, 100.0%, 74.2% for each time interval, respectively. Misclassifications by DA models were 9.8%, 100.0%, and 74.4% for the model with proportional priors and 47.6%, 79.5%, and 51.0% for the model with equal priors. The choice of MLR or DA with proportional priors, or DA with equal priors for monitoring time intervals of predicted hospital arrival time for a population should depend on the consequences of misclassification errors.
Buffered coscheduling for parallel programming and enhanced fault tolerance
Petrini, Fabrizio [Los Alamos, NM; Feng, Wu-chun [Los Alamos, NM
2006-01-31
A computer-implemented method schedules processor jobs on a network of parallel-machine or distributed-system processors. Control information communications generated by each process performed by each processor during a defined time interval are accumulated in buffers, where adjacent time intervals are separated by strobe intervals for a global exchange of control information. A global exchange of the control information communications at the end of each defined time interval is performed during an intervening strobe interval, so that each processor is informed by all of the other processors of the number of incoming jobs to be received by each processor in a subsequent time interval. The buffered coscheduling method of this invention also enhances the fault tolerance of a network of parallel-machine or distributed-system processors.
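The interval/strobe cycle described above can be illustrated with a toy simulation. This is a sketch of the idea only, not the patented implementation; the class and method names are invented for illustration:

```python
from collections import defaultdict

class Processor:
    def __init__(self, pid):
        self.pid = pid
        self.buffer = defaultdict(int)   # destination pid -> buffered job count
        self.total_expected = 0

    def queue_job(self, dest_pid):
        # During a time interval, outgoing work is only accumulated in buffers.
        self.buffer[dest_pid] += 1

def strobe(processors):
    """Strobe interval: global exchange of the buffered control information.

    Afterwards every processor knows how many jobs it will receive in the
    next time interval, before any job data is actually transferred."""
    incoming = defaultdict(int)
    for p in processors:
        for dest, n in p.buffer.items():
            incoming[dest] += n
    for p in processors:
        p.total_expected = incoming[p.pid]
        p.buffer.clear()                 # start the next interval empty
```

Because each processor learns its expected incoming load at every strobe, a missing or inconsistent report also gives a natural hook for the fault-tolerance aspect the patent mentions.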
NASA Technical Reports Server (NTRS)
Rodgers, T. E.; Johnson, J. F.
1977-01-01
The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection, in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.
Wang, L; Wu, L; Ji, G; Zhang, X; Chen, T; Wang, L
1998-12-01
The effects of upright tilt on the autonomic nervous regulation of the cardiovascular system and on the characteristics of heart rate variability (HRV) were observed in sixty healthy male pilots. The relations between time-domain and frequency-domain indexes of short-time HRV (5 min) were analysed before and after upright tilt. The results showed significant differences in short-time HRV parameters before and after upright tilt, and a significant relationship between time-domain and frequency-domain indexes of HRV. This suggests that time-domain and frequency-domain HRV analysis can reveal information embedded in a short series of R-R intervals and can help to evaluate the status of autonomic regulation of cardiovascular function in pilots.
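As an illustration of the time-domain side of such an analysis, here is a minimal sketch of two standard HRV indexes computed from a short series of R-R intervals. The abstract does not state which indexes were used, so SDNN and RMSSD (and the sample values) are assumptions for illustration:

```python
import math

def sdnn(rr):
    """SDNN: sample standard deviation of the R-R intervals (ms)."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """RMSSD: root mean square of successive R-R differences (ms),
    a marker of short-term (vagally mediated) variability."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A short illustrative R-R series in milliseconds.
rr = [800.0, 810.0, 790.0, 805.0]
print(round(sdnn(rr), 2), round(rmssd(rr), 2))  # 8.54 15.55
```

A real 5-minute recording at rest contains a few hundred R-R intervals; the tilt comparison in the study would amount to computing such indexes before and after tilt and testing the difference.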
NASA Astrophysics Data System (ADS)
Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.
2009-12-01
Quantitative analyses of solar wind fluctuations are often performed in the context of intermittent turbulence and center on methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large-scale secular changes, so the question arises as to whether the time series of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed. Hence, natural systems such as the solar wind unavoidably provide observations over restricted intervals. Consequently, due to a reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known, for finite-variance processes and certain statistical estimators, to vary as ~1/N as N becomes large; however, the convergence to this behavior depends on the details of the process and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and “real world” time series, and we find that, in particular for heavy-tailed processes, for realizable N one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some “real world” time series from the solar wind. With fewer data points a stationary time series becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
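The scaling-exponent estimate whose sample-size dependence is discussed above can be sketched for the second-order structure function; the lag choices and the Brownian-motion test series are illustrative, not the authors' data:

```python
import numpy as np

def scaling_exponent(x, taus):
    """Exponent zeta of the second-order structure function
    S2(tau) = <(x(t+tau) - x(t))**2> ~ tau**zeta, via a log-log fit."""
    s2 = [np.mean((x[tau:] - x[:-tau]) ** 2) for tau in taus]
    slope, _intercept = np.polyfit(np.log(taus), np.log(s2), 1)
    return float(slope)

# Brownian motion has S2(tau) proportional to tau, so zeta should be near 1.
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(100_000))
zeta = scaling_exponent(bm, [1, 2, 4, 8, 16, 32])
```

Repeating this on shorter and shorter prefixes of `bm` shows the estimator's variance growing as N shrinks, which is the ~1/N convergence question the abstract examines.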
Nonparametric methods in actigraphy: An update
Gonçalves, Bruno S.B.; Cavalcanti, Paula R.A.; Tavares, Gracilene R.; Campos, Tania F.; Araujo, John F.
2014-01-01
Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), which describe the rest-activity rhythm. Simulated data and actigraphy data from humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis, and for each variable we calculated the average value (IVm and ISm) over each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas no difference was identified using IV60. Rhythmic synchronization of activity and rest was significantly higher in young subjects than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to detect sleep-wake cycle fragmentation and synchronization more precisely. PMID:26483921
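The two variables above have standard nonparametric definitions. A minimal sketch at a single fixed sampling interval, assuming equal-length days of data (the paper's modified IVm/ISm averaging over multiple interval lengths is not reproduced here):

```python
import numpy as np

def intradaily_variability(x):
    """IV: mean squared successive difference divided by the variance.
    Higher values indicate a more fragmented rest-activity rhythm."""
    x = np.asarray(x, dtype=float)
    return float(np.mean(np.diff(x) ** 2) / np.var(x))

def interdaily_stability(x, period=24):
    """IS: variance of the average daily profile over the total variance.
    1.0 means a perfectly repeated daily pattern; values near 0 mean noise."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // period) * period      # keep whole days only
    days = x[:n].reshape(-1, period)
    profile = days.mean(axis=0)          # mean activity at each clock hour
    return float(np.var(profile) / np.var(x[:n]))
```

With hourly bins, `period=24` corresponds to the classical 60-minute variants (IS60/IV60); recomputing with other bin widths and averaging the results is the spirit of the ISm/IVm modification described in the abstract.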
Methods of measurement signal acquisition from the rotational flow meter for frequency analysis
NASA Astrophysics Data System (ADS)
Świsulski, Dariusz; Hanus, Robert; Zych, Marcin; Petryka, Leszek
One of the simplest and most commonly used instruments for measuring the flow of homogeneous substances is the rotational flow meter. The main part of such a device is a rotor (vane or screw) rotating at a speed that is a function of the fluid or gas flow rate. A pulse signal with a frequency proportional to the speed of the rotor is obtained at the sensor output. Under dynamic conditions, the variable interval between pulses prevents direct analysis of the measurement signal. Therefore, the authors of the article developed a method in which the measured value is determined from the last inter-pulse interval preceding each moment designated by a timing generator. For larger changes of the measured value, the value at a predetermined time can instead be determined by extrapolation from the two adjacent inter-pulse intervals, assuming a linear change in the flow. The proposed methods yield samples with constant spacing between measurements, allowing analysis of the dynamics of changes in the test flow, e.g. using a Fourier transform. To present the advantages of these methods, simulations of flow measurement were carried out with a DRH-1140 rotor flow meter from the company Kobold.
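The first method described (taking the last complete inter-pulse interval preceding each timer tick) can be sketched as follows; the function name, units, and calibration constant are illustrative assumptions, not the authors' implementation:

```python
def flow_at_ticks(pulse_times, tick_times, pulses_per_unit_volume):
    """Flow estimate at each timer tick, from the last complete
    inter-pulse interval preceding the tick (None before two pulses).

    Frequency ~ 1/interval, and flow = frequency / calibration factor,
    so the result is on a uniform time grid suitable for e.g. an FFT."""
    flows = []
    j = 1                                # index of the next candidate pulse
    for t in tick_times:
        while j < len(pulse_times) and pulse_times[j] <= t:
            j += 1
        if j < 2:                        # fewer than two pulses seen so far
            flows.append(None)
            continue
        interval = pulse_times[j - 1] - pulse_times[j - 2]
        flows.append(1.0 / (interval * pulses_per_unit_volume))
    return flows
```

For pulses at 0.0, 0.1 and 0.25 s and ticks at 0.05, 0.2 and 0.3 s with a unit calibration factor, this yields no estimate at the first tick, 10.0 at the second, and 1/0.15 ≈ 6.67 at the third; the paper's second method would instead extrapolate linearly from the two most recent intervals.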
Changes in infant disposable diaper weights at selected intervals post-wetting.
Carlisle, Joan; Moore, Amanda; Cooper, Alyssa; Henderson, Terri; Mayfield, Debbie; Taylor, Randa; Thomas, Jennifer; Van Fleet, Laduska; Askanazi, David; Fineberg, Naomi; Sun, Yanhui
2012-01-01
Pediatric acute care nurses questioned the practice of weighing disposable infant diapers immediately after voiding. This study asked the research question, "Does the volume of saline, diaper configuration, and/or size of diaper statistically affect changes in diaper weight over time?" The method was an experimental, laboratory model. Pre-set volumes of saline were added to disposable diapers that were then left folded or unfolded. Each diaper was weighed immediately post-wetting and re-weighed at hourly intervals for seven hours. Data were analyzed using a repeated measures analysis of variance (RMANOVA) with balanced data (F-test). Diaper weight changes over time were statistically significant for all time points and for all volumes regardless of diaper size; however, the changes in weight were small and without clinical significance. It is appropriate to weigh diapers at the end of eight hours without risk of altering subsequent fluid management of patients in open-air, non-humidified environments. This practice has led to more efficient use of nurses' time with fewer interruptions for patients and families.
NASA Technical Reports Server (NTRS)
Roman, Monserrate C.; Jones, Kathy U.; Oubre, Cherie M.; Castro, Victoria; Ott, Mark C.; Birmele, Michele; Venkateswaran, Kasthuri J.; Vaishampayan, Parag A.
2013-01-01
Current methods for microbial detection: (a) labor- and time-intensive cultivation-based approaches that can fail to detect or characterize all cells present; (b) collection of samples on orbit and transportation back to the ground for analysis. Disadvantages of current detection methods: (a) inability to perform quick and reliable detection on orbit; (b) lengthy sampling intervals; (c) no microbe identification.
James S. Rentch; B. Desta Fekedulegn; Gary W. Miller
2002-01-01
This study evaluated the use of radial growth averaging as a technique of identifying canopy disturbances in a thinned 55-year-old mixed-oak stand in West Virginia. We used analysis of variance to determine the time interval (averaging period) and lag period (time between thinning and growth increase) that best captured the growth increase associated with different...
Hung, Shih-Chiang; Kung, Chia-Te; Hung, Chih-Wei; Liu, Ber-Ming; Liu, Jien-Wei; Chew, Ghee; Chuang, Hung-Yi; Lee, Wen-Huei; Lee, Tzu-Chi
2014-08-23
The adverse effects of delayed admission to the intensive care unit (ICU) have been recognized in previous studies; however, the definition of delayed admission varies across studies. This study proposed a model to define "delayed admission" and explored the effect of ICU waiting time on patient outcome. This retrospective cohort study included non-traumatic adult patients on mechanical ventilation in the emergency department (ED) from July 2009 to June 2010. The primary outcome measures were 21-ventilator-day mortality and prolonged hospital stay (over 30 days). Cox regression and logistic regression models were used for multivariate analysis. Non-delayed ICU waiting was defined as a period in which the time effect on mortality was not statistically significant in a Cox regression model. To identify a suitable cut-off point between "delayed" and "non-delayed", subsets of the overall data were formed based on ICU waiting time, and the hazard ratio per ICU-waiting hour in each subset was iteratively calculated. The cut-off time was then used to evaluate the impact of delayed ICU admission on mortality and prolonged length of hospital stay. The final analysis included 1,242 patients. The time effect on mortality emerged after 4 hours; we therefore defined an ICU waiting time in the ED of more than 4 hours as delayed. By logistic regression analysis, delayed ICU admission affected both 21-ventilator-day mortality and prolonged hospital stay, with odds ratios of 1.41 (95% confidence interval, 1.05 to 1.89) and 1.56 (95% confidence interval, 1.07 to 2.27), respectively. For patients on mechanical ventilation in the ED, delayed ICU admission is associated with a higher probability of mortality and additional resource expenditure. A benchmark waiting time of no more than 4 hours for ICU admission is recommended.
Carlsen, Katrine; Houen, Gunnar; Jakobsen, Christian; Kallemose, Thomas; Paerregaard, Anders; Riis, Lene B; Munkholm, Pia; Wewer, Vibeke
2017-09-01
To individualize timing of infliximab (IFX) treatment in children and adolescents with inflammatory bowel disease (IBD) using a patient-managed eHealth program. Patients with IBD, 10 to 17 years old, treated with IFX were prospectively included. Starting 4 weeks after their last infusion, patients reported a weekly symptom score and provided a stool sample for fecal calprotectin analysis. Based on symptom scores and fecal calprotectin results, the eHealth program calculated a total inflammation burden score that determined the timing of the next IFX infusion (4-12 wk after the previous infusion). Quality of Life was scored by IMPACT III. A control group was included to compare trough levels of IFX antibodies and concentrations and treatment intervals. Patients and their parents evaluated the eHealth program. There were 29 patients with IBD in the eHealth group and 21 patients with IBD in the control group. During the control period, 94 infusions were provided in the eHealth group (mean interval 9.5 wk; SD 2.3) versus 105 infusions in the control group (mean interval 6.9 wk; SD 1.4). Treatment intervals were longer in the eHealth group (P < 0.001). Quality of Life did not change during the study. Appearance of IFX antibodies did not differ between the 2 groups. Eighty percent of patients reported increased disease control and 63% (86% of parents) reported an improved knowledge of the disease. Self-managed, eHealth-individualized timing of IFX treatments, with treatment intervals of 4 to 12 weeks, was accompanied by no significant development of IFX antibodies. Patients reported better control and improved knowledge of their IBD.
Novel method for high-throughput phenotyping of sleep in mice.
Pack, Allan I; Galante, Raymond J; Maislin, Greg; Cater, Jacqueline; Metaxas, Dimitris; Lu, Shan; Zhang, Lin; Von Smith, Randy; Kay, Timothy; Lian, Jie; Svenson, Karen; Peters, Luanne L
2007-01-17
Assessment of sleep in mice currently requires initial implantation of chronic electrodes for assessment of electroencephalogram (EEG) and electromyogram (EMG), followed by time to recover from surgery. Hence, it is not ideal for high-throughput screening. To address this deficiency, a method of assessment of sleep and wakefulness in mice has been developed based on assessment of activity/inactivity, either by digital video analysis or by breaking infrared beams in the mouse cage. It is based on the algorithm that any episode of continuous inactivity of ≥40 s is predicted to be sleep. The method gives excellent agreement in C57BL/6J male mice with simultaneous assessment of sleep by EEG/EMG recording. The average agreement over 8,640 10-s epochs in 24 h is 92% (n = 7 mice), with agreement in individual mice being 88-94%. Average EEG/EMG-determined sleep per 2-h interval across the day was 59.4 min. The estimated mean difference (bias) per 2-h interval between inactivity-defined sleep and EEG/EMG-defined sleep was only 1.0 min (95% confidence interval for mean bias -0.06 to +2.6 min). The standard deviation of differences (precision) was 7.5 min per 2-h interval, with 95% limits of agreement ranging from -13.7 to +15.7 min. Although bias varied significantly by time of day (P = 0.0007), the magnitude of time-of-day differences was not large (average bias during lights on and lights off was +5.0 and -3.0 min per 2-h interval, respectively). This method has applications in chemical mutagenesis and for studies of molecular changes in the brain with sleep/wakefulness.
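The scoring rule is simple enough to sketch directly. A minimal implementation of "≥40 s of continuous inactivity is scored as sleep" over the paper's 10-s epochs; the boolean-per-epoch activity encoding is an assumption for illustration:

```python
# Minimal sketch of the inactivity-based sleep-scoring rule: any run of
# continuous inactivity lasting >= 40 s is scored as sleep. Activity is
# assumed to arrive as one boolean per 10-s epoch (True = movement).

def score_sleep(active_epochs, epoch_s=10, min_sleep_s=40):
    """Return one label per epoch: 'sleep' or 'wake'."""
    labels = ['wake'] * len(active_epochs)
    run_start = None
    for i, active in enumerate(active_epochs + [True]):  # sentinel flushes last run
        if not active and run_start is None:
            run_start = i
        elif active and run_start is not None:
            run_len_s = (i - run_start) * epoch_s
            if run_len_s >= min_sleep_s:       # inactivity >= 40 s -> sleep
                for j in range(run_start, i):
                    labels[j] = 'sleep'
            run_start = None
    return labels

# Example: 3 quiet epochs (30 s, stays wake), then 5 quiet epochs (50 s -> sleep).
epochs = [False, False, False, True, False, False, False, False, False, True]
print(score_sleep(epochs))
```

Runs shorter than the 40-s threshold remain "wake", which is what lets brief quiet pauses in an active mouse avoid being misclassified as sleep.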
Nixdorff, Uwe; Feddersen, Isa; Voigt, Jens-Uwe; Flachskampf, Frank A
2005-01-01
Three-dimensional echocardiography (3DE) improves the accuracy of left ventricular (LV) volumetry compared with the two-dimensional echocardiography (2DE) approach because geometric assumptions in the algorithms may be eliminated. The relationship between accuracy of mode (short- versus long-axis planimetry) and the number of component images versus time required for analysis remains to be determined. Sixteen latex models simulating heterogeneously distorted (aneurysmatic) human LVs (56-303 ml; mean 182 ± 82 ml) were scanned from an 'apical' position (simultaneous 2DE and 3DE). For 3DE volumetry, the slice thickness was varied for the short (C-scan) and long axes (B-scan) in 5-mm steps between 1 and 25 mm. The mean differences (true minus echocardiographic volumes) were 16.5 ± 44.3 ml in the 2DE approach (95% confidence intervals -27.8 to +60.8) and 0.6 ± 4.0 ml (short axis; 95% confidence intervals -3.4 to +4.6) as well as 2.1 ± 9.9 ml (long axis; 95% confidence intervals -7.8 to +12.0) in the 3DE approach (in both cases, the slice thickness was 1 mm). Above a slice thickness of 15 mm, the 95% confidence intervals increased steeply; in the short versus long axes, these were -6.5 to +8.5 versus -7.0 to +10.6 at 15 mm and -10.1 to +15.7 versus -11.3 to +10.9 at 20 mm. The intra-observer variance differed significantly (p < 0.001) only above 15 mm (short axis). Time required for analysis, measured at short-axis slice thicknesses of 1, 15, and 25 mm, was 58 ± 16, 7 ± 2 and 3 ± 1 min, respectively. The most rational component image analysis for 3DE volumetry in the in vitro model uses short-axis slices with a thickness of 15 mm. Copyright (c) 2005 S. Karger AG, Basel.
Schwing, P T; O'Malley, B J; Romero, I C; Martínez-Colón, M; Hastings, D W; Glabach, M A; Hladky, E M; Greco, A; Hollander, D J
2017-01-01
Following the Deepwater Horizon (DWH) event in 2010, subsurface hydrocarbon intrusions (1000-1300 m) and an order-of-magnitude increase in flocculent hydrocarbon deposition caused increased concentrations of hydrocarbons in continental slope sediments. This study sought to characterize the variability [density, Fisher's alpha (S), equitability (E), Shannon index (H)] of benthic foraminifera following the DWH event. A series of sediment cores was collected at two sites in the northeastern Gulf of Mexico from 2010 to 2012. At each site, three cores were utilized for benthic faunal analysis, organic geochemistry, and redox metal chemistry, respectively. The surface intervals (∼0-10 mm) of the sedimentary records collected in December 2010 at DSH08 and February 2011 at PCB06 were characterized by significant decreases in foraminiferal density, S, E, and H, relative to the down-core intervals as well as previous surveys. Non-metric multidimensional scaling (nMDS) analysis suggested that a 3-fold increase in polycyclic aromatic hydrocarbon (PAH) concentration in the surface interval, relative to the down-core interval, was the environmental driver of benthic foraminiferal variability. These records suggested that the benthic foraminiferal recovery time, following an event such as the DWH, was on the order of 1-2 years.
Reliability-based management of buried pipelines considering external corrosion defects
NASA Astrophysics Data System (ADS)
Miran, Seyedeh Azadeh
Corrosion is one of the main deterioration mechanisms that degrade energy pipeline integrity, as pipelines transport corrosive fluids or gases and interact with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. To ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed approach is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The growth models' unknown parameters are evaluated from the ILI data through Bayesian updating with the Markov chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and they can account for defects newly generated since the last inspection. Results of this part of the study show that both the depth and length growth models predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models, considering the prevailing uncertainties, where three failure modes are considered: small leak, large leak, and rupture. Performance of the pipeline is evaluated through the failure probability per km (a sub-system), where each sub-system is treated as a series system of the detected and newly generated defects within that sub-system.
Sensitivity analysis is also performed to determine which parameters incorporated in the growth models most affect the reliability of the studied pipeline. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for predicting long-term pipeline performance, and that the impact of statistical uncertainty in the model parameters is significant and should be considered in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspections, repair, and failure. A repair is conducted when the failure probability for any of the described failure modes exceeds a pre-defined probability threshold after an inspection. Moreover, this study investigates the impact of repair threshold values and the unit costs of inspection and failure on the expected total life-cycle cost and optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but can lower the failure cost, and that the repair cost is less significant compared with the inspection and failure costs.
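The inspection-interval trade-off in the last step (more frequent inspections cost more but reduce expected failure cost) can be illustrated with a toy cost model; all costs and the risk-growth function below are invented for illustration, not values from the study:

```python
# Toy sketch of the life-cycle cost trade-off: shorter inspection intervals
# raise total inspection cost but lower expected failure cost. The numbers
# and the quadratic risk-growth assumption are made up.

def total_cost(interval_yr, horizon_yr=50,
               c_construct=1000.0, c_inspect=40.0, c_failure=5000.0):
    n_inspections = horizon_yr // interval_yr           # inspections over service life
    p_failure = min(1.0, 0.004 * interval_yr ** 2)      # assumed growth of risk
    return c_construct + n_inspections * c_inspect + p_failure * c_failure

candidates = range(1, 16)                               # candidate intervals [years]
best = min(candidates, key=total_cost)
print(best, round(total_cost(best), 1))
```

Minimizing the expected total cost over candidate intervals is the same structure as the study's optimization, with the Bayesian failure probabilities replaced by the toy risk function.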
Chládková, Jirina; Havlínová, Zuzana; Chyba, Tomás; Krcmová, Irena; Chládek, Jaroslav
2008-11-01
Current guidelines recommend the single-breath measurement of the fractional concentration of exhaled nitric oxide (FE(NO)) at an expiratory flow rate of 50 mL/s as the gold standard. The time profile of exhaled FE(NO) consists of a washout phase followed by a plateau phase with a stable concentration. This study performed measurements of FE(NO) using a chemiluminescence analyzer (Ecomedics CLD88sp) and an electrochemical monitor (NIOX MINO) in 82 children and adolescents (44 males) from 4.9 to 18.7 years of age with corticosteroid-treated allergic rhinitis (N = 58) and/or asthma (N = 59). Duration of exhalation was 6 seconds for children less than 12 years of age and 10 seconds for older children. The first aim was to compare the evaluation of FE(NO)-time profiles from the Ecomedics analyzer by its software in fixed intervals of 7 to 10 seconds (older children) and 2 to 4 seconds (younger children) from the start of exhalation (method A) with the guideline-based analysis of plateau concentrations at variable time intervals (method B). The second aim was to assess the between-analyzer agreement. In children over 12 years of age, the median ratio of FE(NO) concentrations of 1.00 (95% CI: 0.99-1.02) indicated excellent agreement between methods A and B. Compared with the NIOX MINO, the Ecomedics results were higher by 11% (95% CI: 1-22) (method A) and 14% (95% CI: 4-26) (method B), respectively. In children less than 12 years of age, the FE(NO) concentrations obtained by method B were 34% (95% CI: 21-48) higher and more reproducible (p < 0.02) compared with method A. The Ecomedics results of method A were 11% lower (95% CI: 2-20) than the NIOX MINO concentrations, while method B gave 21% higher concentrations (95% CI: 9-35).
We conclude that in children less than 12 years of age, the guideline-based analysis of FE(NO)-time profiles from the Ecomedics analyzer at variable times yields FE(NO) concentrations that are higher and more reproducible than those from the fixed interval of 2 to 4 seconds, and higher than NIOX MINO concentrations obtained during a short exhalation (6 seconds). The Ecomedics FE(NO) concentrations of children more than 12 years of age calculated in the interval of 7 to 10 seconds represent plateau values and agree well with NIOX MINO results obtained during a standard 10-second exhalation.
Faber, V.
1994-11-29
Livelock-free message routing is provided in a network of interconnected nodes that is flushable in time T. An input message processor generates sequences of at least N time intervals, each of duration T. An input register provides for receiving and holding each input message, where the message is assigned a priority state p during an nth one of the N time intervals. At each of the network nodes, a message processor reads the assigned priority state and awards priority to messages with priority state (p-1) during an nth time interval and to messages with priority state p during an (n+1)th time interval. The messages that are awarded priority are output on an output path toward the addressed output message processor. Thus, no message remains in the network for a time longer than T. 4 figures.
Faber, Vance
1994-01-01
Livelock-free message routing is provided in a network of interconnected nodes that is flushable in time T. An input message processor generates sequences of at least N time intervals, each of duration T. An input register provides for receiving and holding each input message, where the message is assigned a priority state p during an nth one of the N time intervals. At each of the network nodes, a message processor reads the assigned priority state and awards priority to messages with priority state (p-1) during an nth time interval and to messages with priority state p during an (n+1)th time interval. The messages that are awarded priority are output on an output path toward the addressed output message processor. Thus, no message remains in the network for a time longer than T.
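The arbitration rule can be sketched as a predicate. This is one possible reading of the patent's priority scheme, with integer interval indices as an assumed encoding (the patent's exact state encoding is not given in the abstract):

```python
# Hedged sketch, not the patent's exact mechanism: a message injected during
# interval n carries priority state p = n, and during any interval k a node
# awards priority to states k-1 and k. Under this reading every message is
# serviced within two intervals of duration T, so none lingers indefinitely.

def awarded_priority(state, interval):
    """True if a message with the given priority state wins arbitration
    during the given time interval."""
    return state in (interval - 1, interval)

# A message injected in interval 3 is eligible in intervals 3 and 4:
print([k for k in range(3, 7) if awarded_priority(3, k)])  # -> [3, 4]
```

The bounded eligibility window is what rules out livelock: older messages cannot be starved forever by a stream of newer ones.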
A New Approach for Accuracy Improvement of Pulsed LIDAR Remote Sensing Data
NASA Astrophysics Data System (ADS)
Zhou, G.; Huang, W.; Zhou, X.; He, C.; Li, X.; Huang, Y.; Zhang, L.
2018-05-01
In remote sensing applications, the accuracy of time-interval measurement is one of the most important parameters affecting the quality of pulsed lidar data. Traditional time-interval measurement techniques suffer from low measurement accuracy, complicated circuit structure, and large error, and cannot provide high-precision time-interval data. In order to obtain higher-quality remote sensing cloud images based on time-interval measurement, a higher-accuracy time-interval measurement method is proposed. The method is based on charging a capacitor while simultaneously sampling the change in the capacitor voltage. First, an approximate model of the capacitor voltage curve during the pulse's time of flight is fitted from the sampled data. Then, the whole charging time is obtained from the fitting function. In this method, only a high-speed A/D sampler and a capacitor are required in a single receiving channel, and the collected data are processed directly in the main control unit. The experimental results show that the proposed method achieves an error of less than 3 ps. Compared with other methods, the proposed method improves the time-interval accuracy by at least 20%.
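The fit-then-invert idea can be shown numerically. Assuming an ideal RC charging curve V(t) = V0(1 - exp(-t/tau)), we fit tau from a few voltage samples and invert the curve to recover the elapsed charging time; V0, tau, and the sample times are illustrative values, not from the paper:

```python
# Hedged numerical sketch of the capacitor-based time-interval idea: sample
# the charging curve during the pulse's time of flight, fit it, and invert
# it for the elapsed time. All numeric values are assumptions.
import math

V0, tau = 3.3, 50e-9          # supply voltage [V], RC constant [s] (assumed)

def v(t):                      # ideal charging curve
    return V0 * (1.0 - math.exp(-t / tau))

# "A/D samples" taken while the pulse is in flight (noise-free for clarity).
ts = [10e-9, 20e-9, 30e-9, 40e-9]
vs = [v(t) for t in ts]

# Linearise: ln(1 - V/V0) = -t/tau, then least-squares fit for the slope.
xs = ts
ys = [math.log(1.0 - vi / V0) for vi in vs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
tau_est = -1.0 / slope

# Invert the fitted curve at the last sampled voltage to get the elapsed time.
t_flight = -tau_est * math.log(1.0 - vs[-1] / V0)
print(round(t_flight * 1e9, 3))  # -> 40.0 (ns), recovering the true elapsed time
```

In the real system the resolution comes from the A/D sampling of the smooth analog ramp rather than from a fast digital counter, which is what allows picosecond-scale precision.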
Fan, Shu-Han; Chou, Chia-Ching; Chen, Wei-Chen; Fang, Wai-Chi
2015-01-01
In this study, an effective real-time obstructive sleep apnea (OSA) detection method based on frequency analysis of ECG-derived respiration (EDR) and heart rate variability (HRV) is proposed. Compared with traditional polysomnography (PSG), which requires several physiological signals measured from patients, the proposed OSA detection method uses only ECG signals to determine the time interval of OSA. To make the method feasible to implement in hardware for real-time detection and portable application, the simplified Lomb periodogram is utilized to perform the frequency analysis of EDR and HRV. The experimental results of this work indicate that the overall accuracy can be effectively increased by integrating the EDR and HRV indexes, with a specificity (Sp) of 91%, sensitivity (Se) of 95.7%, and accuracy of 93.2%.
Candida parapsilosis biofilm identification by Raman spectroscopy.
Samek, Ota; Mlynariková, Katarina; Bernatová, Silvie; Ježek, Jan; Krzyžánek, Vladislav; Šiler, Martin; Zemánek, Pavel; Růžička, Filip; Holá, Veronika; Mahelová, Martina
2014-12-22
Colonies of Candida parapsilosis on culture plates were probed directly in situ using Raman spectroscopy for rapid identification of specific strains separated by given time intervals (up to months apart). To classify the Raman spectra, data analysis was performed using principal component analysis (PCA). The analysis of the data sets generated during the scans of individual colonies reveals that, despite the inhomogeneity of the biological samples, unambiguous associations to individual strains (two biofilm-positive and two biofilm-negative) could be made.
Candida parapsilosis Biofilm Identification by Raman Spectroscopy
Samek, Ota; Mlynariková, Katarina; Bernatová, Silvie; Ježek, Jan; Krzyžánek, Vladislav; Šiler, Martin; Zemánek, Pavel; Růžička, Filip; Holá, Veronika; Mahelová, Martina
2014-01-01
Colonies of Candida parapsilosis on culture plates were probed directly in situ using Raman spectroscopy for rapid identification of specific strains separated by given time intervals (up to months apart). To classify the Raman spectra, data analysis was performed using principal component analysis (PCA). The analysis of the data sets generated during the scans of individual colonies reveals that, despite the inhomogeneity of the biological samples, unambiguous associations to individual strains (two biofilm-positive and two biofilm-negative) could be made. PMID:25535081
Recurrence time statistics for finite size intervals
NASA Astrophysics Data System (ADS)
Altmann, Eduardo G.; da Silva, Elton C.; Caldas, Iberê L.
2004-12-01
We investigate the statistics of recurrences to finite size intervals for chaotic dynamical systems. We find that the typical distribution presents an exponential decay for almost all recurrence times except for a few short times affected by a kind of memory effect. We interpret this effect as being related to the unstable periodic orbits inside the interval. Although it is restricted to a few short times, it changes the whole distribution of recurrences. We show that for systems with strong mixing properties the exponential decay converges to Poissonian statistics when the width of the interval goes to zero. However, we caution that special attention to the size of the interval is required in order to guarantee that the short-time memory effect is negligible when one is interested in numerically or experimentally calculated Poincaré recurrence time statistics.
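Recurrence times to a finite interval are easy to compute for a concrete chaotic map. The sketch below uses the fully chaotic logistic map as an illustrative system (the abstract does not name specific maps); by Kac's lemma the mean recurrence time should be close to 1/mu(I) for the interval's invariant measure:

```python
# Illustrative sketch: recurrence times of the chaotic logistic map
# x -> 4x(1-x) to a finite interval I = [0.1, 0.2). The invariant density
# 1/(pi*sqrt(x(1-x))) gives mu(I) ~ 0.09, so by Kac's lemma the mean
# recurrence time should be roughly 1/0.09 ~ 11 iterations.
def recurrence_times(x0=0.123, interval=(0.1, 0.2), n_steps=200_000):
    lo, hi = interval
    x, times, last_hit = x0, [], None
    for t in range(n_steps):
        x = 4.0 * x * (1.0 - x)             # logistic map iteration
        if lo <= x < hi:
            if last_hit is not None:
                times.append(t - last_hit)  # gap between consecutive visits
            last_hit = t
    return times

times = recurrence_times()
mean_rt = sum(times) / len(times)
print(round(mean_rt, 1))  # close to the Kac estimate of ~11 iterations
```

A histogram of `times` would show the near-exponential bulk with deviations at a few short return times, which is the memory effect the paper attributes to periodic orbits inside the interval.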
Novel Screening Tool for Stroke Using Artificial Neural Network.
Abedi, Vida; Goyal, Nitin; Tsivgoulis, Georgios; Hosseinichimeh, Niyousha; Hontecillas, Raquel; Bassaganya-Riera, Josep; Elijovich, Lucas; Metter, Jeffrey E; Alexandrov, Anne W; Liebeskind, David S; Alexandrov, Andrei V; Zand, Ramin
2017-06-01
The timely diagnosis of stroke at the initial examination is extremely important given the disease morbidity and narrow time window for intervention. The goal of this study was to develop a supervised learning method to recognize acute cerebral ischemia (ACI) and differentiate it from stroke mimics in an emergency setting. Consecutive patients presenting to the emergency department with stroke-like symptoms, within 4.5 hours of symptom onset, in 2 tertiary care stroke centers were randomized for inclusion in the model. We developed an artificial neural network (ANN) model. The learning algorithm was based on backpropagation. To validate the model, we used a 10-fold cross-validation method. A total of 260 patients (equal numbers of stroke mimics and ACIs) were enrolled for the development and validation of our ANN model. Our analysis indicated that the average sensitivity and specificity of the ANN for the diagnosis of ACI based on the 10-fold cross-validation analysis were 80.0% (95% confidence interval, 71.8-86.3) and 86.2% (95% confidence interval, 78.7-91.4), respectively. The median precision of the ANN for the diagnosis of ACI was 92% (95% confidence interval, 88.7-95.3). Our results show that the ANN can be an effective tool for the recognition of ACI and differentiation of ACI from stroke mimics at the initial examination. © 2017 American Heart Association, Inc.
Bayesian lead time estimation for the Johns Hopkins Lung Project data.
Jang, Hyejeong; Kim, Seongho; Wu, Dongfeng
2013-09-01
Lung cancer screening using X-rays has been controversial for many years. A major concern is whether lung cancer screening really brings any survival benefit, which depends on effective treatment after early detection. The problem was analyzed from a different point of view, and estimates were presented of the projected lead time for participants in a lung cancer screening program using the Johns Hopkins Lung Project (JHLP) data. A newly developed method of lead time estimation was applied in which the lifetime T was treated as a random variable rather than a fixed value, so that the number of future screenings for a given individual is also a random variable. Using the actuarial life table available from the United States Social Security Administration, the lifetime distribution was first obtained; the lead time distribution was then projected using the JHLP data. The data analysis with the JHLP data shows that, for a male heavy smoker with an initial screening age of 50, 60, or 70, the probability of no early detection with semiannual screens will be 32.16%, 32.45%, and 33.17%, respectively, while the mean lead time is 1.36, 1.33 and 1.23 years. The probability of no early detection increases monotonically as the screening interval increases, and it increases slightly as the initial age increases for the same screening interval. The mean lead time and its standard error decrease when the screening interval increases for all age groups, and both decrease when the initial age increases with the same screening interval. The overall mean lead time estimated with a random lifetime T is slightly less than that with a fixed value of T. It is hoped that this result will help improve current screening programs. Copyright © 2013 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.
Fast transfer of crossmodal time interval training.
Chen, Lihan; Zhou, Xiaolin
2014-06-01
Sub-second time perception is essential for many important sensory and perceptual tasks including speech perception, motion perception, motor coordination, and crossmodal interaction. This study investigates to what extent the ability to discriminate sub-second time intervals acquired in one sensory modality can be transferred to another modality. To this end, we used perceptual classification of visual Ternus display (Ternus in Psychol Forsch 7:81-136, 1926) to implicitly measure participants' interval perception in pre- and posttests and implemented an intra- or crossmodal sub-second interval discrimination training protocol in between the tests. The Ternus display elicited either an "element motion" or a "group motion" percept, depending on the inter-stimulus interval between the two visual frames. The training protocol required participants to explicitly compare the interval length between a pair of visual, auditory, or tactile stimuli with a standard interval or to implicitly perceive the length of visual, auditory, or tactile intervals by completing a non-temporal task (discrimination of auditory pitch or tactile intensity). Results showed that after fast explicit training of interval discrimination (about 15 min), participants improved their ability to categorize the visual apparent motion in Ternus displays, although the training benefits were mild for visual timing training. However, the benefits were absent for implicit interval training protocols. This finding suggests that the timing ability in one modality can be rapidly acquired and used to improve timing-related performance in another modality and that there may exist a central clock for sub-second temporal processing, although modality-specific perceptual properties may constrain the functioning of this clock.
Liu, Xiao-Ping; Gao, Bao-Zhen; Han, Feng-Qing; Fang, Zhi-Yuan; Yang, Li-Mei; Zhuang, Mu; Lv, Hong-Hao; Liu, Yu-Mei; Li, Zhan-Sheng; Cai, Cheng-Cheng; Yu, Hai-Long; Li, Zhi-Yuan; Zhang, Yang-Yong
2017-03-14
Due to its variegated and colorful leaves, ornamental kale (Brassica oleracea L. var. acephala) has become a popular ornamental plant. In this study, we report the fine mapping and analysis of a candidate purple leaf gene using a backcross population and an F2 population derived from two parental lines: W1827 (with white leaves) and P1835 (with purple leaves). Genetic analysis indicated that the purple leaf trait is controlled by a single dominant gene, which we named BoPr. Using markers developed based on the reference genome '02-12', the BoPr gene was preliminarily mapped to a 280-kb interval of chromosome C09, with flanking markers M17 and BoID4714 at genetic distances of 4.3 cM and 1.5 cM, respectively. The recombination rate within this interval is almost 12 times higher than the usual level, which could be caused by an assembly error in the reference genome '02-12' at this interval. Primers were designed based on 'TO1000', another B. oleracea reference genome. Among the newly designed InDel markers, BRID485 and BRID490 were found to be the closest to BoPr, flanking the gene at genetic distances of 0.1 cM and 0.2 cM, respectively; the interval between the two markers is 44.8 kb (reference genome 'TO1000'). Seven annotated genes are located within the 44.8-kb genomic region, of which only Bo9g058630 shows high homology to AT5G42800 (dihydroflavonol reductase), which was identified as a candidate gene for BoPr. BLAST analysis revealed that this 44.8-kb interval is located on an unanchored scaffold (Scaffold000035_P2) of '02-12', confirming the existence of an assembly error in the interval between M17 and BoID4714 for the reference genome '02-12'. This study identified a candidate gene for BoPr and lays a foundation for the cloning and functional analysis of this gene.
NASA Astrophysics Data System (ADS)
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) It explicitly adjusts the statistical significance to any bias introduced by variance reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric computer intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is of public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). 
Different examples (with simulated and real data) are described in this paper to corroborate the methodology and the implementation of these two new programs.
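The estimator underlying both programs is the standard Lomb-Scargle periodogram for unevenly sampled series. A minimal numpy-only version is sketched below on simulated data (the papers' Fortran programs add the bias-adjusted significance and permutation tests, which are omitted here):

```python
# Minimal numpy-only Lomb-Scargle periodogram for unevenly sampled data:
# a sketch of the standard estimator, without the smoothing-bias correction
# or permutation significance tests described in the abstract.
import numpy as np

def lomb_scargle(t, y, omegas):
    y = y - y.mean()
    p = np.empty(len(omegas))
    for i, w in enumerate(omegas):
        # Phase offset tau makes the estimate invariant to time shifts.
        tau = np.arctan2(np.sum(np.sin(2*w*t)), np.sum(np.cos(2*w*t))) / (2*w)
        c, s = np.cos(w*(t - tau)), np.sin(w*(t - tau))
        p[i] = 0.5 * ((y @ c)**2 / (c @ c) + (y @ s)**2 / (s @ s))
    return p

# Unevenly sampled sinusoid at 0.7 Hz (simulated, seeded for reproducibility).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 60, 200))           # irregular sample times [s]
y = np.sin(2*np.pi*0.7*t) + 0.3*rng.standard_normal(200)

freqs = np.linspace(0.05, 2.0, 400)            # trial frequencies [Hz]
power = lomb_scargle(t, y, 2*np.pi*freqs)
print(round(freqs[np.argmax(power)], 2))       # peak near 0.7 Hz
```

Because the periodogram is evaluated directly at the irregular sample times, no interpolation to a regular grid is needed, which is precisely the advantage the abstract highlights for gapped geoscience series.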
Cerebral oxygenation and desaturations in preterm infants - a longitudinal data analysis.
Mayer, Benjamin; Pohl, Moritz; Hummler, Helmut D; Schmid, Manuel B
2017-01-01
Hypoxemic episodes commonly occur in very preterm infants and may be associated with several adverse effects. Cerebral tissue oxygen saturation (StO2) as measured by near-infrared spectroscopy (NIRS) may be a useful measure to assess brain oxygenation. However, knowledge of the variability of StO2 in preterm infants is limited at this time, so StO2 dependency on arterial oxygenation (SpO2) and heart rate (HR) was assessed in preterm infants using statistical methods of time series analysis. StO2, SpO2, and HR were recorded from 15 preterm infants every 2 seconds for six hours. Statistical methods of time series and longitudinal data analysis were applied to the data. The mean StO2 level was found to be 72% (95% confidence interval (CI) 55.5%-85.5%) based on a moving average process of order corresponding to 5 minutes. Accordingly, longitudinal SpO2 measurements showed a mean level of 91% (95% CI 69%-98%). Generally, compensation strategies to cope with both StO2 and SpO2 desaturations were observed in the studied patients. SpO2 had a significant effect on cerebral oxygenation (p < 0.001), but HR did not, which led to inconclusive results when considering different time intervals. In infants with intermittent hypoxemia and bradycardia, we found a mean StO2 level of 72% and a strong correlation with SpO2. We observed large differences between individuals in the ability to maintain StO2 at a stable level.
Caminal, Pere; Sola, Fuensanta; Gomis, Pedro; Guasch, Eduard; Perera, Alexandre; Soriano, Núria; Mont, Lluis
2018-03-01
This study was conducted to test, in mountain running route conditions, the accuracy of the Polar V800™ monitor as a suitable device for monitoring the heart rate variability (HRV) of runners. Eighteen healthy subjects ran a route that included a range of running slopes such as those encountered in trail and ultra-trail races. The comparative study of a V800 and a Holter SEER 12 ECG Recorder™ included the analysis of RR time series and short-term HRV analysis. A correction algorithm was designed to obtain the corrected Polar RR intervals. Six 5-min segments related to different running slopes were considered for each subject. The correlation between corrected V800 RR intervals and Holter RR intervals was very high (r = 0.99, p < 0.001), and the bias was less than 1 ms. The limits of agreement (LoA) obtained for SDNN and RMSSD were (-0.25 to 0.32 ms) and (-0.90 to 1.08 ms), respectively. The effect size (ES) obtained in the time domain HRV parameters was considered small (ES < 0.2). Frequency domain HRV parameters did not differ (p > 0.05) and were well correlated (r ≥ 0.96, p < 0.001). Narrow limits of agreement, high correlations and small effect size suggest that the Polar V800 is a valid tool for the analysis of heart rate variability in athletes while running high endurance events such as marathon, trail, and ultra-trail races.
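The two time-domain HRV measures compared in this study are standard statistics of the RR series: SDNN (the standard deviation of RR intervals) and RMSSD (the root mean square of successive RR differences). A small sketch, with a made-up RR series:

```python
# Hedged sketch of the two time-domain HRV measures: SDNN and RMSSD.
# The RR series below is invented example data, not from the study.
import math

def sdnn(rr_ms):
    """Standard deviation of RR intervals (population SD)."""
    m = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - m) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 800, 790, 805, 795, 810, 798]   # RR intervals in ms (illustrative)
print(round(sdnn(rr), 2), round(rmssd(rr), 2))
```

RMSSD is dominated by beat-to-beat differences and so tracks short-term (parasympathetic) variability, which is why sub-millisecond agreement in the RR series, as reported above, matters for these indices.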
Boshuisen, Kim; Schmidt, Dieter; Uiterwaal, Cuno S P M; Arzimanoglou, Alexis; Braun, Kees P J; Study Group, TimeToStop
2014-09-01
It was recently suggested that early postoperative seizure relapse implicates a failure to define and resect the epileptogenic zone, that late recurrences reflect the persistence or re-emergence of epileptogenic pathology, and that early recurrences are associated with poor treatment response. The timing of antiepileptic drug withdrawal policies, however, has never been taken into account when investigating time to relapse following epilepsy surgery. Of the European paediatric epilepsy surgery cohort from the "TimeToStop" study, all 95 children with postoperative seizure recurrence following antiepileptic drug (AED) withdrawal were selected. We investigated how time intervals from surgery to AED withdrawal, as well as other previously suggested determinants of (timing of) seizure recurrence, related to time to relapse and to relapse treatability. Uni- and multivariable linear and logistic regression models were used. Based on multivariable analysis, a shorter interval to AED reduction was the only independent predictor of a shorter time to relapse. Based on univariable analysis, incomplete resection of the epileptogenic zone related to a shorter time to recurrence. Timing of recurrence was not related to the chance of regaining seizure freedom after reinstitution of medical treatment. For children in whom AED reduction is initiated following epilepsy surgery, the time to relapse is largely influenced by the timing of AED withdrawal, rather than by disease- or surgery-specific factors. We could not confirm a relationship between time to recurrence and treatment response. Timing of AED withdrawal should be taken into account when studying time to relapse following epilepsy surgery, as early withdrawal reveals more rapidly whether surgery had the intended curative effect, independently of the other factors involved.
Spatial-temporal analysis of the risk of Rift Valley Fever in Kenya
NASA Astrophysics Data System (ADS)
Bett, B.; Omolo, A.; Hansen, F.; Notenbaert, A.; Kemp, S.
2012-04-01
Historical data on Rift Valley Fever (RVF) outbreaks in Kenya covering the period 1951 - 2010 were analyzed using a logistic regression model to identify factors associated with RVF occurrence. The analysis used a division, an administrative unit below a district, as the unit of analysis. The infection status of each division was defined on a monthly time scale and used as a dependent variable. Predictors investigated included: monthly precipitation (minimum, maximum and total), normalized difference vegetation index, altitude, agro-ecological zone, presence of game, livestock and human population densities, the number of times a division has had an outbreak before, and the time interval in months between successive outbreaks (used as a proxy for immunity). Both univariable and multivariable analyses were conducted. The models used incorporated an auto-regressive correlation matrix to account for clustering of observations in time, while dummy variables were fitted in the multivariable model to account for spatial relatedness/topology between divisions. This last procedure was followed because it is expected that the risk of RVF occurring in a given division increases when its immediate neighbor gets infected. Functional relationships between the continuous predictors and the outcome variable were assessed to ensure that the linearity assumption was met. Deviance and leverage residuals were also generated from the final model and used for evaluating the goodness of fit of the model. Descriptive analyses indicate that a total of 91 divisions in 42 districts (of the original 69 districts in place by 1999) reported RVF outbreaks at least once over the period. The mean interval between outbreaks was determined to be about 43 months. Factors that were positively associated with RVF occurrence include increased precipitation, a longer interval between outbreaks, and the number of times a division has been infected or reported an outbreak.
The model will be validated and used for developing an RVF forecasting system. This forecasting system can then be used with the existing regional RVF prediction tools such as EMPRES-i to downscale RVF risk predictions to country-specific scales and subsequently link them with decision support systems. The ultimate aim is to increase the capacity of the national institutions to formulate appropriate RVF mitigation measures.
Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A
2017-02-01
High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
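The first step of the detector, reducing inspection to candidate intervals of elevated high-frequency activity, can be sketched as a simple threshold-run search (illustrative Python; the envelope values, threshold, and minimum run length are assumptions, not the authors' seven-feature implementation):

```python
def candidate_intervals(envelope, threshold, min_len=3):
    """Step 1 (simplified): return (start, stop) index pairs of
    contiguous runs where a high-frequency amplitude envelope stays
    above threshold for at least min_len samples."""
    runs, start = [], None
    for i, v in enumerate(envelope):
        if v > threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(envelope) - start >= min_len:
        runs.append((start, len(envelope)))
    return runs

# Hypothetical envelope samples; two runs satisfy the minimum length.
demo = candidate_intervals([0, 0, 5, 6, 7, 0, 9, 0, 4, 4, 4, 4], 3)
```

Step 2 would then score each candidate against the morphological features (approximately sinusoidal shape, at least three cycles, co-occurring large-amplitude discharge) before storing it for visual validation.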
Chu, Catherine. J.; Chan, Arthur; Song, Dan; Staley, Kevin J.; Stufflebeam, Steven M.; Kramer, Mark A.
2017-01-01
Background: High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. New Method: The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. Results: We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. Comparison with Existing Method: The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Conclusions: Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. PMID:27988323
Whys and Hows of the Parameterized Interval Analyses: A Guide for the Perplexed
NASA Astrophysics Data System (ADS)
Elishakoff, I.
2013-10-01
Novel elements of the parameterized interval analysis developed in [1, 2] are emphasized in this response to Professor E.D. Popova, or to others who may be perplexed by the parameterized interval analysis. It is also shown that the overwhelming majority of the comments by Popova [3] are based on a misreading of our paper [1]. Partial responsibility for this misreading can be attributed to the fact that the explanations provided in [1] were laconic; they could have been more extensive in view of the novelty of our approach [1, 2]. It is our duty, therefore, to reiterate in this response the whys and hows of the parameterization of intervals, introduced in [1] to incorporate possibly available information on dependencies between the various intervals describing the problem at hand. This possibility appears to have been discarded by standard interval analysis, which may as a result lead to overdesign and possibly to the divorce of engineers from the otherwise beautiful interval analysis.
Estimating clinical chemistry reference values based on an existing data set of unselected animals.
Dimauro, Corrado; Bonelli, Piero; Nicolussi, Paola; Rassu, Salvatore P G; Cappio-Borlino, Aldo; Pulina, Giuseppe
2008-11-01
In an attempt to standardise the determination of biological reference values, the International Federation of Clinical Chemistry (IFCC) has published a series of recommendations on developing reference intervals. The IFCC recommends the use of an a priori sampling of at least 120 healthy individuals. However, collecting and analysing such a large number of samples is expensive, time-consuming and not always feasible, especially in veterinary medicine. In this paper, an alternative (a posteriori) method is described and used to determine reference intervals for biochemical parameters of farm animals from an existing laboratory data set. The method was based on the detection and removal of outliers to obtain, from the existing data set, a large sample of animals likely to be healthy. This allowed the estimation of reliable reference intervals for biochemical parameters in Sarda dairy sheep. The method may also be useful for determining reference intervals for different species, ages and genders.
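The a posteriori idea, screening an unselected data set for probably-healthy animals by outlier removal and then taking a central 95% interval, might look like this (a Python sketch; the Tukey-fence rule is one common outlier criterion and is an assumption here, not necessarily the authors' exact procedure):

```python
import statistics

def reference_interval(values, k=1.5):
    """A posteriori sketch: iteratively remove Tukey-fence outliers
    (outside [Q1 - k*IQR, Q3 + k*IQR]), then report the central 95%
    of the retained sample as the reference interval."""
    vals = sorted(values)
    while True:
        q1, _, q3 = statistics.quantiles(vals, n=4)
        iqr = q3 - q1
        kept = [v for v in vals if q1 - k * iqr <= v <= q3 + k * iqr]
        if len(kept) == len(vals):
            break
        vals = kept
    cuts = statistics.quantiles(vals, n=40)  # 2.5%, 5.0%, ..., 97.5%
    return cuts[0], cuts[-1]
```

On a well-behaved sample contaminated with a gross outlier, the outlier is discarded before the percentile bounds are taken.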
Azunre, P.
2016-09-21
In this paper, two novel techniques for bounding the solutions of parametric weakly coupled second-order semilinear parabolic partial differential equations are developed. The first provides a theorem for constructing interval bounds, while the second provides a theorem for constructing lower bounds convex and upper bounds concave in the parameter. The convex/concave bounds can be significantly tighter than the interval bounds because of the wrapping effect suffered by interval analysis in dynamical systems. Both types of bounds are computationally cheap to construct, requiring the solution of auxiliary systems two and four times larger than the original system, respectively. An illustrative numerical example of bound construction and use for deterministic global optimization within a simple serial branch-and-bound algorithm, implemented numerically using interval arithmetic and a generalization of McCormick's relaxation technique, is presented. Finally, problems within the important class of reaction-diffusion systems may be optimized with these tools.
A single-loop optimization method for reliability analysis with second order uncertainty
NASA Astrophysics Data System (ADS)
Xie, Shaojun; Pan, Baisong; Du, Xiaoping
2015-08-01
Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
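For a linear limit state with independent normal variables, the interplay between the reliability loop and the interval loop reduces to evaluating the reliability index at the interval endpoints, which conveys the extreme-response intuition that the single-loop method formalizes through KKT conditions for general nonlinear problems (illustrative Python; all numbers are hypothetical and this is not the article's algorithm):

```python
import math
from statistics import NormalDist

def beta_index(mu_r, sigma_r, mu_s, sigma_s):
    """FORM reliability index for the linear limit state g = R - S
    with independent normal resistance R and load S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

# Second order uncertainty: the mean resistance is only known to lie
# in an interval (hypothetical numbers).
mu_r_lo, mu_r_hi = 480.0, 520.0
sigma_r, mu_s, sigma_s = 30.0, 400.0, 40.0

# beta is monotone in mu_r here, so the inner interval loop reduces to
# evaluating the two endpoints.
betas = [beta_index(m, sigma_r, mu_s, sigma_s) for m in (mu_r_lo, mu_r_hi)]
p_fail_max = NormalDist().cdf(-min(betas))  # worst-case failure probability
p_fail_min = NormalDist().cdf(-max(betas))  # best-case failure probability
```

The article's contribution is precisely that, for nonlinear problems where such monotonicity cannot be assumed, the inner optimization can be replaced by its KKT optimality conditions and absorbed into a single loop.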
Place avoidance learning and memory in a jumping spider.
Peckmezian, Tina; Taylor, Phillip W
2017-03-01
Using a conditioned passive place avoidance paradigm, we investigated the relative importance of three experimental parameters on learning and memory in a salticid, Servaea incana. Spiders encountered an aversive electric shock stimulus paired with one side of a two-sided arena. Our three parameters were the ecological relevance of the visual stimulus, the time interval between trials and the time interval before test. We paired electric shock with either a black or white visual stimulus, as prior studies in our laboratory have demonstrated that S. incana prefer dark 'safe' regions to light ones. We additionally evaluated the influence of two temporal features (time interval between trials and time interval before test) on learning and memory. Spiders exposed to the shock stimulus learned to associate shock with the visual background cue, but the extent to which they did so was dependent on which visual stimulus was present and the time interval between trials. Spiders trained with a long interval between trials (24 h) maintained performance throughout training, whereas spiders trained with a short interval (10 min) maintained performance only when the safe side was black. When the safe side was white, performance worsened steadily over time. There was no difference between spiders tested after a short (10 min) or long (24 h) interval before test. These results suggest that the ecological relevance of the stimuli used and the duration of the interval between trials can influence learning and memory in jumping spiders.
Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.
2013-01-01
A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, which quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on a quadratic time-frequency distribution, has been used to combine information from several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate is estimated from the photoplethysmographic (PPG) signal. Respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal, which suggests that combining information from these sources will improve the accuracy of the respiratory rate estimate. A further aim of this paper is to implement an algorithm that provides a robust estimate. Therefore, the respiratory rate was estimated only in those intervals where the features extracted from the PPG signal are linearly coupled. In 38 spontaneously breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with a median error of {0.00; 0.98} mHz ({0.00; 0.31}%) and an interquartile range of the error of {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was considerably lower than that obtained without combining the different respiration-related PPG features. PMID:24363777
TIME-INTERVAL MEASURING DEVICE
Gross, J.E.
1958-04-15
An electronic device for measuring the time interval between two control pulses is presented. The device incorporates part of a previous approach for time measurement, in that pulses from a constant-frequency oscillator are counted during the interval between the control pulses. To reduce the possible error in counting caused by the operation of the counter gating circuit at various points in the pulse cycle, the described device provides means for successively delaying the pulses by a fraction of the pulse period so that a final delay of one period is obtained, and means for counting the pulses before and after each stage of delay during the time interval, whereby a plurality of totals is obtained which may be averaged and multiplied by the pulse period to obtain an accurate time-interval measurement.
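The averaging idea, counting with successive sub-period delays and scaling the mean count by the oscillator period, can be simulated as follows (a Python sketch of the principle only, not the electronic implementation; gate times and period are hypothetical):

```python
import math

def count_pulses(t_start, t_stop, period, phase):
    """Pulses of a free-running oscillator (times phase + k*period)
    falling inside the half-open gate [t_start, t_stop)."""
    first = math.ceil((t_start - phase) / period)
    last = math.ceil((t_stop - phase) / period)
    return max(0, last - first)

def measure_interval(t_start, t_stop, period, n_delays):
    """Count n_delays times, each with an extra delay of period/n_delays,
    then average the totals and scale by the oscillator period."""
    counts = [count_pulses(t_start, t_stop, period, i * period / n_delays)
              for i in range(n_delays)]
    return sum(counts) / n_delays * period
```

A single gated count quantizes the interval to a whole number of periods; averaging over the delayed counts improves the resolution to roughly period/n_delays, which is the point of the patented scheme.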
[Estimation of the atrioventricular time interval by pulse Doppler in the normal fetal heart].
Hamela-Olkowska, Anita; Dangel, Joanna
2009-08-01
To assess normative values of the fetal atrioventricular (AV) time interval by pulsed-wave Doppler on the 5-chamber view. Fetal echocardiography exams were performed using an Acuson Sequoia 512 in 140 singleton fetuses at 18 to 40 weeks of gestation with sinus rhythm and normal cardiac and extracardiac anatomy. Pulsed Doppler derived AV intervals were measured from the left ventricular inflow/outflow view using a transabdominal convex 3.5-6 MHz probe. The values of the AV time interval ranged from 100 to 150 ms (mean 123 ± 11.2 ms). The AV interval was negatively correlated with the heart rate (p<0.001). Fetal heart rate decreased as gestation progressed (p<0.001); thus, the AV intervals increased with the age of gestation (p=0.007). However, within the same fetal heart rate subgroup there was no relation between AV intervals and gestational age; the AV intervals therefore showed only a heart rate dependence. The 95th percentiles of AV intervals according to FHR ranged from 135 to 148 ms. 1. The AV interval duration was negatively correlated with the heart rate. 2. Measurement of the AV time interval is easy to perform and has good reproducibility; it may be used for fetal heart block screening in anti-Ro and anti-La positive pregnancies. 3. The normative values established in the study may help obstetricians in assessing fetal abnormalities of AV conduction.
Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.
Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L
2008-06-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques, is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools--multiscale entropy and multiscale time irreversibility--are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
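The coarse-graining step that underlies multiscale analysis, together with a crude time-irreversibility index, can be sketched as follows (illustrative Python; the irreversibility index shown is a simple up/down-increment imbalance, an assumption for illustration rather than the authors' specific measure):

```python
def coarse_grain(series, scale):
    """Multiscale analysis step: average consecutive non-overlapping
    windows of length `scale` to form the coarse-grained series."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale
            for i in range(n)]

def irreversibility(series):
    """A crude time-irreversibility index: imbalance between upward and
    downward increments (zero for a time-symmetric series)."""
    ups = sum(1 for a, b in zip(series, series[1:]) if b > a)
    downs = sum(1 for a, b in zip(series, series[1:]) if b < a)
    return (ups - downs) / max(1, ups + downs)
```

Entropy or asymmetry statistics are then computed on the coarse-grained series at each scale, which is what lets these tools see structure invisible to single-scale mean, variance, or spectral measures.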
Smelling time: a neural basis for olfactory scene analysis
Ache, Barry W.; Hein, Andrew M.; Bobkov, Yuriy V.; Principe, Jose C.
2016-01-01
Behavioral evidence from phylogenetically diverse animals and humans suggests that olfaction could be much more involved in interpreting space and time than heretofore imagined by extracting temporal information inherent in the olfactory signal. If this is the case, the olfactory system must have neural mechanisms capable of encoding time at intervals relevant to the turbulent odor world in which many animals live. We review evidence that animals can use populations of rhythmically active or ‘bursting’ olfactory receptor neurons (bORNs) to extract and encode temporal information inherent in natural olfactory signals. We postulate that bORNs represent an unsuspected neural mechanism through which time can be accurately measured, and that ‘smelling time’ completes the requirements for true olfactory scene analysis. PMID:27594700
Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures
Peng, Chung-Kang; Goldberger, Ary L.
2016-01-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques, is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools— multiscale entropy and multiscale time irreversibility—are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763
Stability of Predictors of Mortality after Spinal Cord Injury
Krause, James S.; Saunders, Lee L.; Zhai, Yusheng
2011-01-01
Objective: To identify the stability of socio-environmental, behavioral, and health predictors of mortality over an eight-year time frame. Study Design: Cohort study. Setting: Data were analyzed at a large medical university in the Southeast United States of America (USA). Methods: Adults with residual impairment from a spinal cord injury (SCI) who were at least one year post-injury at assessment were recruited through a large specialty hospital in the Southeast USA; 1209 participants were included in the final analysis. A piecewise exponential model with two equal time intervals (eight years total) was used to assess the stability of the hazard and the predictors over time. Results: The hazard changed significantly over time: the hazard in the first time interval was significantly lower than in the second. There were no interactions between the socio-environmental, behavioral, or health factors and time, although there was a significant interaction between age at injury (a demographic variable) and time. Conclusion: These results suggest there is stability in the association between the predictors and mortality, even over an eight-year period, and reinforce the use of historic variables for the prediction of mortality in persons with SCI. PMID:22231541
Robust motion tracking based on adaptive speckle decorrelation analysis of OCT signal.
Wang, Yuewen; Wang, Yahui; Akansu, Ali; Belfield, Kevin D; Hubbi, Basil; Liu, Xuan
2015-11-01
Speckle decorrelation analysis of the optical coherence tomography (OCT) signal has been used in motion tracking. In our previous study, we demonstrated that the cross-correlation coefficient (XCC) between A-scans has an explicit functional dependency on the magnitude of lateral displacement (δx). In this study, we evaluated the sensitivity of speckle motion tracking using the derivative of the function XCC(δx) with respect to δx. We demonstrated that the magnitude of the derivative can be maximized; in other words, the sensitivity of OCT speckle tracking can be optimized by using signals with an appropriate amount of decorrelation for the XCC calculation. Based on this finding, we developed an adaptive speckle decorrelation analysis strategy to achieve motion tracking with optimized sensitivity. Briefly, we used successively acquired A-scans, as well as A-scans obtained at larger time intervals, to obtain multiple values of XCC, and chose the XCC value that maximized motion tracking sensitivity for the displacement calculation. Instantaneous motion speed can be calculated by dividing the obtained displacement by the time interval between the A-scans involved in the XCC calculation. We implemented the above-described algorithm in real time using a graphics processing unit (GPU) and demonstrated its effectiveness in reconstructing distortion-free OCT images using data obtained from a manually scanned OCT probe. The adaptive speckle tracking method was validated in manually scanned OCT imaging, on a phantom as well as on in vivo skin tissue.
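The core quantity, a normalized cross-correlation coefficient between A-scans, and the conversion of a recovered displacement into an instantaneous speed, can be sketched as follows (illustrative Python; inverting XCC(δx) to δx requires the calibrated decorrelation curve and is not shown):

```python
import math

def xcc(a, b):
    """Normalized cross-correlation coefficient between two A-scans."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

def instantaneous_speed(delta_x, dt):
    """Speed once XCC has been inverted to a lateral displacement
    delta_x, using the time dt between the correlated A-scans."""
    return delta_x / dt
```

The adaptive step amounts to computing xcc over several A-scan time separations and keeping the pair whose decorrelation sits where the slope of XCC(δx) is steepest.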
Robust motion tracking based on adaptive speckle decorrelation analysis of OCT signal
Wang, Yuewen; Wang, Yahui; Akansu, Ali; Belfield, Kevin D.; Hubbi, Basil; Liu, Xuan
2015-01-01
Speckle decorrelation analysis of the optical coherence tomography (OCT) signal has been used in motion tracking. In our previous study, we demonstrated that the cross-correlation coefficient (XCC) between A-scans has an explicit functional dependency on the magnitude of lateral displacement (δx). In this study, we evaluated the sensitivity of speckle motion tracking using the derivative of the function XCC(δx) with respect to δx. We demonstrated that the magnitude of the derivative can be maximized; in other words, the sensitivity of OCT speckle tracking can be optimized by using signals with an appropriate amount of decorrelation for the XCC calculation. Based on this finding, we developed an adaptive speckle decorrelation analysis strategy to achieve motion tracking with optimized sensitivity. Briefly, we used successively acquired A-scans, as well as A-scans obtained at larger time intervals, to obtain multiple values of XCC, and chose the XCC value that maximized motion tracking sensitivity for the displacement calculation. Instantaneous motion speed can be calculated by dividing the obtained displacement by the time interval between the A-scans involved in the XCC calculation. We implemented the above-described algorithm in real time using a graphics processing unit (GPU) and demonstrated its effectiveness in reconstructing distortion-free OCT images using data obtained from a manually scanned OCT probe. The adaptive speckle tracking method was validated in manually scanned OCT imaging, on a phantom as well as on in vivo skin tissue. PMID:26600996
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Highly comparative time-series analysis: the empirical structure of time series and their methods
Fulcher, Ben D.; Little, Max A.; Jones, Nick S.
2013-01-01
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344
Characterization of cracking behavior using posttest fractographic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, T.; Shockey, D.A.
A determination of the time to initiation of stress corrosion cracking in structures and test specimens is important for performing structural failure analysis and for setting inspection intervals. Yet it is seldom possible to establish how much of a component's lifetime represents the time to initiation of fracture and how much represents postinitiation crack growth. This exploratory research project was undertaken to examine the feasibility of determining crack initiation times and crack growth rates from posttest examination of the fracture surfaces of constant-extension-rate-test (CERT) specimens by using the fracture reconstruction applying surface topography analysis (FRASTA) technique. The specimens used in this study were Type 304 stainless steel fractured in several boiling water reactor (BWR) aqueous environments. 2 refs., 25 figs., 2 tabs.
Magro-Malosso, Elena R; Saccone, Gabriele; Di Tommaso, Mariarosaria; Roman, Amanda; Berghella, Vincenzo
2017-08-01
Gestational hypertensive disorders, including gestational hypertension and preeclampsia, are one of the leading causes of maternal morbidity and mortality. The aim of our study was to evaluate the effect of exercise during pregnancy on the risk of gestational hypertensive disorders. Electronic databases were searched from their inception to February 2017. Selection criteria included only randomized controlled trials of uncomplicated pregnant women assigned before 23 weeks to an aerobic exercise regimen or not. The summary measures were reported as relative risk with 95% confidence intervals. The primary outcome was the incidence of gestational hypertensive disorders, defined as either gestational hypertension or preeclampsia. Seventeen trials, including 5075 pregnant women, were analyzed. Of them, seven contributed data to quantitative meta-analysis for the primary outcome. Women who were randomized in early pregnancy to aerobic exercise for about 30-60 min two to seven times per week had a significant lower incidence of gestational hypertensive disorders (5.9% vs. 8.5%; relative risk 0.70, 95% confidence interval 0.53-0.83; seven studies, 2517 participants), specifically a lower incidence of gestational hypertension (2.5% vs. 4.6%; relative risk 0.54, 95% confidence interval 0.40-0.74; 16 studies, 4641 participants) compared with controls. The incidence of preeclampsia (2.3% vs. 2.8%; relative risk 0.79, 95% confidence interval 0.45-1.38; six studies, 2230 participants) was similar in both groups. The incidence of cesarean delivery was decreased by 16% in the exercise group. Aerobic exercise for about 30-60 min two to seven times per week during pregnancy, as compared with being more sedentary, is associated with a significantly reduced risk of gestational hypertensive disorders overall, gestational hypertension, and cesarean delivery. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
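The summary measure used here, a relative risk with a 95% confidence interval from the log-RR normal approximation, can be computed as follows (Python sketch; the event counts are hypothetical, chosen only to illustrate the calculation at event rates near those reported):

```python
import math

def relative_risk(events_exp, n_exp, events_ctl, n_ctl):
    """Relative risk with a 95% CI via the log-RR normal approximation."""
    rr = (events_exp / n_exp) / (events_ctl / n_ctl)
    se = math.sqrt(1 / events_exp - 1 / n_exp + 1 / events_ctl - 1 / n_ctl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts mirroring event rates of roughly 5.9% vs 8.5%.
rr, lo, hi = relative_risk(74, 1250, 108, 1267)
```

A meta-analysis pools such log-RR estimates across trials, weighting each by the inverse of its variance.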
Uncertainty analysis for absorbed dose from a brain receptor imaging agent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aydogan, B.; Miller, L.F.; Sparks, R.B.
Absorbed dose estimates are known to contain uncertainties. A recent literature search indicates that prior to this study no rigorous investigation of uncertainty associated with absorbed dose had been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent ¹²³I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the Latin hypercube sampling method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+07 Gy/MBq and 4.65E+07 Gy/MBq, with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving ¹²³I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.
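Latin hypercube sampling, the method named above for propagating the input uncertainties, stratifies each input dimension into n equal-probability strata and uses every stratum exactly once. A minimal sketch of the sampling step itself (function and variable names are assumptions, and the dose model is omitted):

```python
import random

def latin_hypercube(n, dims, rng=random.Random(0)):
    """n stratified samples in [0,1)^dims: each dimension is split into
    n equal strata, and each stratum is hit exactly once."""
    # one independently shuffled stratum order per dimension
    orders = [rng.sample(range(n), n) for _ in range(dims)]
    samples = []
    for i in range(n):
        # jitter uniformly within the assigned stratum of each dimension
        point = [(orders[d][i] + rng.random()) / n for d in range(dims)]
        samples.append(point)
    return samples

# e.g. two uncertain inputs: residence time and organ mass (as quantiles)
pts = latin_hypercube(100, 2)
```

Mapping each coordinate through an inverse CDF then yields the input distributions whose dose outputs give the confidence intervals.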
Fixed-interval matching-to-sample: intermatching time and intermatching error runs
Nelson, Thomas D.
1978-01-01
Four pigeons were trained on a matching-to-sample task in which reinforcers followed either the first matching response (fixed interval) or the fifth matching response (tandem fixed-interval fixed-ratio) that occurred 80 seconds or longer after the last reinforcement. Relative frequency distributions of the matching-to-sample responses that concluded intermatching times and runs of mismatches (intermatching error runs) were computed for the final matching responses directly followed by grain access and also for the three matching responses immediately preceding the final match. Comparison of these two distributions showed that the fixed-interval schedule arranged for the preferential reinforcement of matches concluding relatively extended intermatching times and runs of mismatches. Differences in matching accuracy and rate during the fixed interval, compared to the tandem fixed-interval fixed-ratio, suggested that reinforcers following matches concluding various intermatching times and runs of mismatches influenced the rate and accuracy of the last few matches before grain access, but did not control rate and accuracy throughout the entire fixed-interval period. PMID:16812032
Improved confidence intervals when the sample is counted an integer times longer than the blank.
Potter, William Edward; Strzelczyk, Jadwiga Jodi
2011-05-01
Past computer solutions for confidence intervals in paired counting are extended to the case where the ratio of the sample count time to the blank count time is taken to be an integer, IRR. Previously, confidence intervals have been named Neyman-Pearson confidence intervals; more correctly they should have been named Neyman confidence intervals or simply confidence intervals. The technique utilized mimics a technique used by Pearson and Hartley to tabulate confidence intervals for the expected value of the discrete Poisson and Binomial distributions. The blank count and the contribution of the sample to the gross count are assumed to be Poisson distributed. The expected value of the blank count, in the sample count time, is assumed known. The net count, OC, is taken to be the gross count minus the product of IRR with the blank count. The probability density function (PDF) for the net count can be determined in a straightforward manner.
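The PDF of the net count described above can be tabulated directly by combining the two Poisson distributions over a truncated support. This is an illustrative sketch with assumed means, not the authors' tabulation procedure; the blank's expected count in its own (shorter) count time is taken as the sample-time expectation divided by IRR.

```python
import math

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

def net_count_pmf(mu_s, mu_b, irr, kmax=60):
    """PMF of the net count OC = G - IRR*B: the gross count
    G ~ Poisson(mu_s + mu_b) over the sample count time, and the blank
    count B ~ Poisson(mu_b / irr) over a count time shorter by the
    integer factor irr. Support truncated at kmax for illustration."""
    pmf = {}
    for g in range(kmax):
        pg = poisson_pmf(g, mu_s + mu_b)
        for b in range(kmax):
            p = pg * poisson_pmf(b, mu_b / irr)
            oc = g - irr * b
            pmf[oc] = pmf.get(oc, 0.0) + p
    return pmf

pmf = net_count_pmf(mu_s=5.0, mu_b=2.0, irr=2)
```

The mean of this PMF is mu_s, since the scaled blank subtraction removes the expected background contribution.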
The Time Is Up: Compression of Visual Time Interval Estimations of Bimodal Aperiodic Patterns
Duarte, Fabiola; Lemus, Luis
2017-01-01
The ability to estimate time intervals subserves many of our behaviors and perceptual experiences. However, it is not clear how aperiodic (AP) stimuli affect our perception of time intervals across sensory modalities. To address this question, we evaluated the human capacity to discriminate between two acoustic (A), visual (V) or audiovisual (AV) time intervals of trains of scattered pulses. We first measured the periodicity of those stimuli and then sought for correlations with the accuracy and reaction times (RTs) of the subjects. We found that, for all time intervals tested in our experiment, the visual system consistently perceived AP stimuli as being shorter than the periodic (P) ones. In contrast, such a compression phenomenon was not apparent during auditory trials. Our conclusions are: first, the subjects exposed to P stimuli are more likely to measure their durations accurately. Second, perceptual time compression occurs for AP visual stimuli. Lastly, AV discriminations are determined by A dominance rather than by AV enhancement. PMID:28848406
Dual ant colony operational modal analysis parameter estimation method
NASA Astrophysics Data System (ADS)
Sitarz, Piotr; Powałka, Bartosz
2018-01-01
Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain, others in the frequency domain; the former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.
Time estimation by patients with frontal lesions and by Korsakoff amnesics.
Mimura, M; Kinsbourne, M; O'Connor, M
2000-07-01
We studied time estimation in patients with frontal damage (F) and alcoholic Korsakoff (K) patients in order to differentiate between the contributions of working memory and episodic memory to temporal cognition. In Experiment 1, F and K patients estimated time intervals between 10 and 120 s less accurately than matched normal and alcoholic control subjects. F patients were less accurate than K patients at short (< 1 min) time intervals whereas K patients increasingly underestimated durations as intervals grew longer. F patients overestimated short intervals in inverse proportion to their performance on the Wisconsin Card Sorting Test. As intervals grew longer, overestimation yielded to underestimation for F patients. Experiment 2 involved time estimation while counting at a subjective 1/s rate. F patients' subjective tempo, though relatively rapid, did not fully explain their overestimation of short intervals. In Experiment 3, participants produced predetermined time intervals by depressing a mouse key. K patients underproduced longer intervals. F patients produced comparably to normal participants, but were extremely variable. Findings suggest that both working memory and episodic memory play an individual role in temporal cognition. Turnover within a short-term working memory buffer provides a metric for temporal decisions. The depleted working memory that typically attends frontal dysfunction may result in quicker turnover, and this may inflate subjective duration. On the other hand, temporal estimation beyond 30 s requires episodic remembering, and this puts K patients at a disadvantage.
Lekanidi, Katerina; Dilks, Phil; Suaris, Tamara; Kennett, Steffan; Purushothaman, Hema
2017-09-01
The aim of this study was to determine the features that make interval cancers apparent on the preceding screening mammogram and determine whether changes in the ways of performing the interval cancer review will affect the true interval cancer rate. This study was approved by the clinical governance committee. Mammograms of women diagnosed with an interval cancer were included in the study if they had been allocated to either the "suspicious signs" group or "subtle signs" group, during the historic interval cancer review. Three radiologists, individually and blinded to the site of interval cancer, reviewed the mammograms and documented the presence, site, characteristics and classification of any abnormality. Findings were compared with the appearances of the abnormality at the site of subsequent cancer development by a different breast radiologist. The chi-squared test was used in the analysis of the results, seeking associations between recall concordance and cancer mammographic or histological characteristics. 111/590 interval cancers fulfilled the study inclusion criteria. In 17% of the cases none of the readers identified the relevant abnormality on the screening mammogram. 1/3 readers identified the relevant lesion in 22% of the cases, 2/3 readers in 28% of cases and all 3 readers in 33% of cases. The commonest unanimously recalled abnormality was microcalcification and the most challenging mammographic abnormality to detect was asymmetric density. We did not find any statistically significant association between recall concordance and time to interval cancer, position of lesion in the breast, breast density or cancer grade. Even the simple step of performing an independent blinded review of interval cancers reduces the rate of interval cancers classified as missed by up to 39%. Copyright © 2017 Elsevier B.V. All rights reserved.
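A chi-squared test of association like the one applied above can be computed from a 2x2 contingency table of counts (e.g. concordant/discordant recall vs. a dichotomized cancer characteristic). The counts below are hypothetical, not the study's data.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for the 2x2 contingency table
    [[a, b], [c, d]], without continuity correction. Compare against
    the chi-squared critical value with 1 degree of freedom (3.84 at
    the 0.05 level)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical table: rows = recall concordant/discordant,
# columns = e.g. dense/non-dense breast
stat = chi2_2x2(20, 15, 10, 25)
```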
Lu, Hongwei; Zhang, Chenxi; Sun, Ying; Hao, Zhidong; Wang, Chunfang; Tian, Jiajia
2015-08-01
Predicting the termination of paroxysmal atrial fibrillation (AF) may provide a signal for deciding whether timely intervention is needed. We proposed a novel RdR RR-interval scatter plot in our study. The abscissa of the RdR scatter plot was set to the RR interval and the ordinate to the difference between successive RR intervals. The RdR scatter plot thus combines information on RR intervals and on differences between successive RR intervals, capturing more heart rate variability (HRV) information. RdR scatter plot analysis of one minute of RR intervals for 50 segments with non-terminating AF and with immediately terminating AF showed that the points in the RdR scatter plot of non-terminating AF were more scattered than those of immediately terminating AF. By dividing the RdR scatter plot into uniform grids and counting the number of non-empty grids, non-terminating AF and immediately terminating AF segments were differentiated. Utilizing 49 RR intervals, 17 of 20 segments of the learning set were correctly detected, and 20 of 30 segments of the test set. Utilizing 66 RR intervals, 16 of 18 segments of the learning set were correctly detected, and 20 of 28 segments of the test set. The results demonstrated that during the last minute before the termination of paroxysmal AF, the variance of the RR intervals and the difference between neighboring RR intervals became smaller. The termination of paroxysmal AF could be successfully predicted using the RdR scatter plot, although the prediction accuracy needs further improvement.
Method and apparatus for assessing cardiovascular risk
NASA Technical Reports Server (NTRS)
Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)
1998-01-01
The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power law dependence on frequency over a selected frequency range such as 10.sup.-4 to 10.sup.-2 Hz. Characteristics of the long-time structure fluctuations in the intervals is used to assess risk of an adverse clinical event.
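Fitting a power-law dependence of the power spectrum on frequency, as described above, amounts to a least-squares slope in log-log coordinates. The sketch below fits a synthetic 1/f^1.5 spectrum over the 10^-4 to 10^-2 Hz band; it illustrates the fit only, not the patented risk-assessment procedure.

```python
import math

def loglog_slope(freqs, power):
    """Least-squares slope of log10(power) vs. log10(frequency):
    the exponent beta in power ~ f**beta."""
    xs = [math.log10(f) for f in freqs]
    ys = [math.log10(p) for p in power]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

freqs = [10 ** (-4 + 0.1 * i) for i in range(21)]  # 1e-4 .. 1e-2 Hz
power = [f ** -1.5 for f in freqs]                 # synthetic 1/f^1.5 PSD
beta = loglog_slope(freqs, power)
```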
Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula
NASA Astrophysics Data System (ADS)
Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María; Wiper, Michael P.
2016-03-01
A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes are changed over time under the impact of climate change, and accordingly the long-term decision making strategies should be updated based on the anomalies of the nonstationary environment.
[Forensic entomology exemplified by a homicide. A combined stain and postmortem time analysis].
Benecke, M; Seifert, B
1999-01-01
The combined analysis of both ant and blow fly evidence recovered from a corpse, and from the boot of a suspect, suggested that an assumed scenario in a high profile murder case was likely to be true. The ants (Lasius fuliginous) were used as classical crime scene stains that linked the suspect to the scene. Blow fly maggots (Calliphora spec.) helped to determine the post mortem interval (PMI) with the calculated PMI overlapping with the assumed time of the killing. In the trial, the results of the medico-legal analysis of the insects was understood to be crucial scientific evidence, and the suspect was sentenced to 8 years in prison.
Stochastic flux analysis of chemical reaction networks
Kahramanoğulları, Ozan; Lynch, James F
2013-12-07
Background Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. Results We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux-behavior, and compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of the same network, that is, Rho GTP-binding proteins, and on other models from biology and ecology. Conclusions We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network. PMID:24314153
The determination of the pulse pile-up reject (PUR) counting for X and gamma ray spectrometry
NASA Astrophysics Data System (ADS)
Karabıdak, S. M.; Kaya, S.
2017-02-01
The collection of the charged particles produced by the incident radiation in a detector requires a finite time interval. If this time interval is not sufficiently short compared with the peaking time of the amplifier, a loss in the recovered signal amplitude occurs. Another major constraint on the throughput of modern x- or gamma-ray spectrometers is the time required for the subsequent pulse processing by the electronics. These two limitations cause counting losses arising from dead time and pile-up. Pulse pile-up is a common problem in x- and gamma-ray radiation detection systems, and piled-up pulses can cause significant errors in spectroscopic analysis; inhibiting these pulses is therefore a vital step. One way to reduce errors due to pulse pile-up is a pile-up reject (PUR) inspection circuit. Such a circuit rejects some of the piled-up pulses and therefore leads to counting losses. Determining these counting losses is an important problem. In this work, a new method is suggested for determining the pulse pile-up reject counting losses.
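For Poisson arrivals at rate r, the probability that a pulse is separated from its predecessor by more than a resolving time tau is exp(-r*tau), so an idealized reject that discards any pulse arriving within tau of the previous one keeps that fraction of counts. The Monte Carlo sketch below checks this; the rate, tau, and the simple reject rule are illustrative assumptions, not the method proposed in the paper.

```python
import math
import random

def pur_accept_fraction(rate, tau, n=200_000, rng=random.Random(0)):
    """Monte Carlo estimate of the fraction of pulses surviving an
    idealized pile-up reject that discards any pulse arriving within
    the resolving time tau of the previous pulse, for Poisson arrivals
    at the given rate (counts per second)."""
    kept = 0
    for _ in range(n):
        gap = rng.expovariate(rate)  # exponential inter-arrival time
        if gap > tau:
            kept += 1
    return kept / n

# 50 kcps input rate, 5 microsecond resolving time: rate*tau = 0.25,
# so theory predicts an accepted fraction of exp(-0.25)
frac = pur_accept_fraction(rate=50_000.0, tau=5e-6)
```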
Inui, N; Ichihara, T
2001-10-01
To examine the relation between timing and force control during finger tapping sequences by both pianists and nonpianists, participants tapped a force plate connected to strain gauges. A series of finger tapping tasks consisted of 16 combinations of pace (intertap interval: 180, 200, 400, or 800 ms) and peak force (50, 100, 200, or 400 g). Analysis showed that, although movement timing was independent of force control under low- or medium-pace conditions, there were strong interactions between the 2 parameters under high-pace conditions. The results indicate that participants adapted the movement by switching from separately controlling these parameters in slow and moderate movements to coupling them in fast movements. While variations in the intertap interval affected force production by nonpianists, they had little effect for pianists. The ratios of time-to-peak force to press duration increased linearly in pianists but varied irregularly in nonpianists as the required force decreased. Thus, pianists regulate peak force by timing control of peak force relative to press duration, suggesting that training affects the relationship between the 2 parameters.
Holland, Alexander; Aboy, Mateo
2009-07-01
We present a novel method to iteratively calculate discrete Fourier transforms for discrete time signals with sample time intervals that may be widely nonuniform. The proposed recursive Fourier transform (RFT) does not require interpolation of the samples to uniform time intervals, and each iterative transform update of N frequencies has computational order N. Because of the inherent non-uniformity in the time between successive heart beats, an application particularly well suited to this transform is power spectral density (PSD) estimation for heart rate variability. We compare RFT-based spectrum estimation with Lomb-Scargle transform (LST) based estimation. PSD estimation based on the LST also does not require uniform time samples, but the LST has a computational order greater than N log(N). We conducted an assessment study involving the analysis of quasi-stationary signals with various levels of randomly missing heart beats. Our results indicate that the RFT leads to estimation performance comparable to the LST with significantly less computational overhead and complexity for applications requiring iterative spectrum estimation.
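The O(N)-per-update idea can be sketched as accumulating, for each of the N analysis frequencies, the new sample's contribution x * exp(-2*pi*i*f*t) at its actual (nonuniform) time. This illustrates the principle only; it is not the authors' exact recursion, and the frequencies and test signal are assumptions.

```python
import cmath
import math
import random

class RecursiveFourier:
    """Per-sample update of Fourier coefficients at chosen analysis
    frequencies. Each update costs O(N) for N frequencies and needs no
    interpolation of the samples to a uniform time grid."""
    def __init__(self, freqs):
        self.freqs = freqs
        self.coef = [0j] * len(freqs)

    def update(self, t, x):
        for k, f in enumerate(self.freqs):
            self.coef[k] += x * cmath.exp(-2j * cmath.pi * f * t)

rng = random.Random(0)
times = sorted(400.0 * rng.random() for _ in range(400))  # nonuniform times
rft = RecursiveFourier(freqs=[0.5, 1.0, 2.0])
for t in times:
    rft.update(t, math.cos(2.0 * math.pi * 1.0 * t))      # 1 Hz test signal
```

The accumulated coefficient at 1 Hz dominates the off-frequency bins, despite the irregular sampling.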
Common View Time Transfer Using Worldwide GPS and DMA Monitor Stations
NASA Technical Reports Server (NTRS)
Reid, Wilson G.; McCaskill, Thomas B.; Oaks, Orville J.; Buisson, James A.; Warren, Hugh E.
1996-01-01
Analysis of the on-orbit Navstar clocks and the Global Positioning System (GPS) monitor station reference clocks is performed by the Naval Research Laboratory using both broadcast and postprocessed precise ephemerides. The precise ephemerides are produced by the Defense Mapping Agency (DMA) for each of the GPS space vehicles from pseudo-range measurements collected at five GPS and at five DMA monitor stations spaced around the world. Recently, DMA established an additional site co-located with the US Naval Observatory precise time site. The time reference for the new DMA site is the DoD Master Clock. Now, for the first time, it is possible to transfer time every 15 minutes via common view from the DoD Master Clock to the 11 GPS and DMA monitor stations. The estimated precision of a single common-view time transfer measurement taken over a 15-minute interval was between 1.4 and 2.7 nanoseconds. Using the measurements from all Navstar space vehicles in common view during the 15-minute interval, typically 3-7 space vehicles, improved the estimate of the precision to between 0.65 and 1.13 nanoseconds. The mean phase error obtained from closure of the time transfer around the world using the 11 monitor stations and the 25 space vehicle clocks over a period of 4 months had a magnitude of 31 picoseconds. Analysis of the low-noise time transfer from the DoD Master Clock to each of the monitor stations not only yields the bias in the time of the reference clock but also reveals structure in the behaviour of the reference clock not previously seen. Furthermore, the time transfer provides a uniformly sampled database of 15-minute measurements that makes possible, for the first time, the direct and exhaustive computation of the frequency stability of the monitor station reference clocks.
To lend perspective to the analysis, a summary is given of the discontinuities in phase and frequency that occurred in the reference clock at the Master Control Station during the period covered by the analysis.
Payne, Thomas G.
1982-01-01
REGIONAL MAPPER is a menu-driven system in the BASIC language for computing and plotting (1) time, depth, and average velocity to geologic horizons, (2) interval time, thickness, and interval velocity of stratigraphic intervals, and (3) subcropping and onlapping intervals at unconformities. The system consists of three programs: FILER, TRAVERSER, and PLOTTER. A control point is a shot point with velocity analysis or a shot point at or near a well with a velocity check-shot survey. Reflection time to, and code number of, seismic horizons are filed by digitizing tablet from record sections. TRAVERSER starts at a point of geologic control and, in traversing to another, parallels seismic events, records loss of horizons by onlap and truncation, and stores reflection time for geologic horizons at traversed shot points. TRAVERSER is basically a phantoming procedure. Permafrost thickness and velocity variations, buried canyons with low-velocity fill, and error in seismically derived velocity cause velocity anomalies that complicate depth mapping. Two depths to the top of the pebble shale are computed for each control point. One depth, designated Zs, is based on seismically derived velocity. The other (Zw) is based on interval velocity interpolated linearly between wells and multiplied by interval time (isochron) to give interval thickness. Zw is computed for all geologic horizons by downward summation of interval thickness. Unknown true depth (Z) to the pebble shale may be expressed as Z = Zs + es and Z = Zw + ew, where the e terms represent error. Equating the two expressions gives the depth difference D = Zs - Zw = ew - es. A plot of D for the top of the pebble shale is readily contourable, but smoothing is required to produce a reasonably simple surface. Seismically derived velocity used in computing Zs includes the effect of velocity anomalies but is subject to some large, randomly distributed errors, resulting in depth errors (es).
Well-derived velocity used in computing Zw does not include the effect of velocity anomalies, but the error (ew) should reflect these anomalies and should be contourable (non-random). The D surface as contoured with smoothing is assumed to represent ew, that is, the depth effect of variations in permafrost thickness and velocity and buried canyon depth. Estimated depth (Zest) to each geologic horizon is the sum of Zw for that horizon and the ew contoured for the pebble shale, which is the first highly continuous seismic horizon below the zone of anomalous velocity. Results of this 'depthing' procedure are compared with those of Tetra Tech, Inc., the subcontractor responsible for geologic and geophysical interpretation and mapping.
NASA Astrophysics Data System (ADS)
Roberts, Sean; Eykholt, R.; Thaut, Michael H.
2000-08-01
We investigate rhythmic finger tapping in both the presence and the absence of a metronome. We examine both the time intervals between taps and the time lags between the stimulus tones from the metronome and the response taps by the subject. We analyze the correlations in these data sets, and we search for evidence of deterministic chaos, as opposed to randomness, in the fluctuations.
Measuring the Performance and Intelligence of Systems: Proceedings of the 2001 PerMIS Workshop
2001-09-04
1.1 Interval Mathematics for Analysis of Multiresolutional Systems, V. Kreinovich, Univ. of Texas, R. Alo, Univ. of Houston-Downtown...the possible combinations. In non-deterministic real-time systems, the problem is compounded by the uncertainty in the execution times of various...multiresolutional, multiscale) in their essence because of the multiresolutional character of the meaning of words [Rieger, 01]. In integrating systems, the presence of a
Modeling and simulation of count data.
Plan, E L
2014-08-13
Count data, or number of events per time interval, are discrete data arising from repeated time to event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
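Counts over a fixed interval arise from repeated time-to-event draws, as the tutorial notes. The sketch below generates Poisson counts by accumulating exponential waiting times and checks the variance-to-mean (dispersion) ratio that diagnoses over- or underdispersion; the rate and interval are illustrative assumptions.

```python
import random

def simulate_counts(rate, interval, n, rng=random.Random(0)):
    """Draw n event counts for a fixed observation interval by summing
    exponential waiting times: the discrete data that count models from
    the Poisson family describe."""
    counts = []
    for _ in range(n):
        t, k = rng.expovariate(rate), 0
        while t <= interval:
            k += 1
            t += rng.expovariate(rate)
        counts.append(k)
    return counts

counts = simulate_counts(rate=2.0, interval=3.0, n=5000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
dispersion = var / mean  # ~1 for pure Poisson; >1 signals overdispersion
```

A dispersion ratio well above 1 in real trial data is the cue to move beyond the plain Poisson model (e.g. to negative binomial).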
Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR
Mobli, Mehdi; Hoch, Jeffrey C.
2017-01-01
Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. PMID:25456315
Heilbronner, Sarah R.; Meck, Warren. H.
2014-01-01
The goal of our study was to characterize the relationship between intertemporal choice and interval timing, including determining how drugs that modulate brain serotonin and dopamine levels influence these two processes. In Experiment 1, rats were tested on a standard 40-s peak-interval procedure following administration of fluoxetine (3, 5, or 8 mg/kg) or vehicle to assess basic effects on interval timing. In Experiment 2, rats were tested in a novel behavioral paradigm intended to simultaneously examine interval timing and impulsivity. Rats performed a variant of the bi-peak procedure using 10-s and 40-s target durations with an additional “defection” lever that provided the possibility of a small, immediate reward. Timing functions remained relatively intact, and ‘patience’ across subjects correlated with peak times, indicating a negative relationship between ‘patience’ and clock speed. We next examined the effects of fluoxetine (5 mg/kg), cocaine (15 mg/kg), or methamphetamine (1 mg/kg) on task performance. Fluoxetine reduced impulsivity as measured by defection time without corresponding changes in clock speed. In contrast, cocaine and methamphetamine both increased impulsivity and clock speed. Thus, variations in timing may mediate intertemporal choice via dopaminergic inputs. However, a separate, serotonergic system can affect intertemporal choice without affecting interval timing directly. PMID:24135569
NUMERICAL METHODS FOR SOLVING THE MULTI-TERM TIME-FRACTIONAL WAVE-DIFFUSION EQUATION.
Liu, F; Meerschaert, M M; McGough, R J; Zhuang, P; Liu, Q
2013-03-01
In this paper, the multi-term time-fractional wave-diffusion equations are considered. The multi-term time fractional derivatives are defined in the Caputo sense, whose orders belong to the intervals [0,1], [1,2), [0,2), [0,3), [2,3) and [2,4), respectively. Some computationally effective numerical methods are proposed for simulating the multi-term time-fractional wave-diffusion equations. The numerical results demonstrate the effectiveness of theoretical analysis. These methods and techniques can also be extended to other kinds of the multi-term fractional time-space models with fractional Laplacian.
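The abstract does not reproduce the schemes themselves, but the standard L1 discretization of a Caputo derivative of order α ∈ (0, 1) is the usual building block for such solvers. Below is a minimal sketch (not the authors' code; `caputo_l1` and the closed-form check are purely illustrative), verified against the exact Caputo derivative of f(t) = t².

```python
import math

def caputo_l1(f, t, alpha, n):
    """Approximate the Caputo derivative of order alpha in (0,1)
    of f at time t using the classical L1 scheme on n uniform steps."""
    tau = t / n
    coef = tau ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for k in range(n):
        # L1 weights: b_k = (k+1)^(1-alpha) - k^(1-alpha)
        b_k = (k + 1) ** (1.0 - alpha) - k ** (1.0 - alpha)
        total += b_k * (f(tau * (n - k)) - f(tau * (n - k - 1)))
    return coef * total

# Check against the exact Caputo derivative of f(t) = t^2:
# D^alpha t^2 = Gamma(3)/Gamma(3-alpha) * t^(2-alpha)
alpha, t = 0.5, 1.0
approx = caputo_l1(lambda s: s * s, t, alpha, 2000)
exact = math.gamma(3.0) / math.gamma(3.0 - alpha) * t ** (2.0 - alpha)
```

The L1 scheme converges at rate O(τ^(2-α)), so with 2000 steps the approximation agrees with the closed form to several digits.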
Freise, Kevin J; Dunbar, Martin; Jones, Aksana K; Hoffman, David; Enschede, Sari L Heitner; Wong, Shekman; Salem, Ahmed Hamed
2016-10-01
Venetoclax (ABT-199/GDC-0199) is a selective first-in-class B cell lymphoma-2 inhibitor being developed for the treatment of hematological malignancies. The aim of this study was to determine the potential of venetoclax to prolong the corrected QT (QTc) interval and to evaluate the relationship between systemic venetoclax concentration and QTc interval. The study population included 176 male and female patients with relapsed or refractory chronic lymphocytic leukemia/small lymphocytic lymphoma (n = 105) or non-Hodgkin's lymphoma (n = 71) enrolled in a phase 1 safety, pharmacokinetic, and efficacy study. Electrocardiograms were collected in triplicate at time-matched points (2, 4, 6, and 8 h) prior to the first venetoclax administration and after repeated venetoclax administration to achieve steady state conditions. Venetoclax doses ranged from 100 to 1200 mg daily. Plasma venetoclax samples were collected after steady state electrocardiogram measurements. The mean and upper bound of the 2-sided 90 % confidence interval (CI) QTc change from baseline were <5 and <10 ms, respectively, at all time points and doses (<400, 400, and >400 mg). Three subjects had single QTc values >500 ms and/or ΔQTc > 60 ms. The effect of venetoclax concentration on both ΔQTc and QTc was not statistically significant (P > 0.05). At the mean maximum concentrations achieved with therapeutic (400 mg) and supra-therapeutic (1200 mg) venetoclax doses, the estimated drug effects on QTc were 0.137 (90 % CI [-1.01 to 1.28]) and 0.263 (90 % CI [-1.92 to 2.45]) ms, respectively. Venetoclax does not prolong QTc interval even at supra-therapeutic doses, and there is no relationship between venetoclax concentrations and QTc interval.
da Costa, D W; Dijksman, L M; Bouwense, S A; Schepers, N J; Besselink, M G; van Santvoort, H C; Boerma, D; Gooszen, H G; Dijkgraaf, M G W
2016-11-01
Same-admission cholecystectomy is indicated after gallstone pancreatitis to reduce the risk of recurrent disease or other gallstone-related complications, but its impact on overall costs is unclear. This study analysed the cost-effectiveness of same-admission versus interval cholecystectomy after mild gallstone pancreatitis. In a multicentre RCT (Pancreatitis of biliary Origin: optimal timiNg of CHOlecystectomy; PONCHO) patients with mild gallstone pancreatitis were randomized before discharge to either cholecystectomy within 72 h (same-admission cholecystectomy) or cholecystectomy after 25-30 days (interval cholecystectomy). Healthcare use of all patients was recorded prospectively using clinical report forms. Unit costs of resources used were determined, and patients completed multiple Health and Labour Questionnaires to record pancreatitis-related absence from work. Cost-effectiveness analyses were performed from societal and healthcare perspectives, with the costs per readmission prevented as primary outcome with a time horizon of 6 months. All 264 trial participants were included in the present analysis, 128 randomized to same-admission cholecystectomy and 136 to interval cholecystectomy. Same-admission cholecystectomy reduced the risk of acute readmission for recurrent gallstone-related complications from 16·9 to 4·7 per cent (P = 0·002). Mean total costs from a societal perspective were €234 (95 per cent c.i. -1249 to 738) less per patient in the same-admission cholecystectomy group. Same-admission cholecystectomy was superior to interval cholecystectomy, with a societal incremental cost-effectiveness ratio of -€1918 to prevent one readmission for gallstone-related complications. In mild biliary pancreatitis, same-admission cholecystectomy was more effective and less costly than interval cholecystectomy. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
Continuous inventories and the components of change
Francis A. Roesch
2004-01-01
The consequences of conducting a continuous inventory that utilizes measurements on overlapping temporal intervals of varying length on compatible estimation systems for the components of growth are explored. The time interpenetrating sample design of the USDA Forest Service Forest Inventory and Analysis Program is used as an example. I show why estimation of the...
Kamran, Haroon; Salciccioli, Louis; Pushilin, Sergei; Kumar, Paraag; Carter, John; Kuo, John; Novotney, Carol; Lazar, Jason M
2011-01-01
Nonhuman primates are used frequently in cardiovascular research. Cardiac time intervals derived by phonocardiography have long been used to assess left ventricular function. Electronic stethoscopes are simple low-cost systems that display heart sound signals. We assessed the use of an electronic stethoscope to measure cardiac time intervals in 48 healthy bonnet macaques (age, 8 ± 5 y) based on recorded heart sounds. Technically adequate recordings were obtained from all animals and required 1.5 ± 1.3 min. The following cardiac time intervals were determined by simultaneously recording acoustic and single-lead electrocardiographic data: electromechanical activation time (QS1), electromechanical systole (QS2), the time interval between the first and second heart sounds (S1S2), and the time interval between the second and first sounds (S2S1). QS2 was correlated with heart rate, mean arterial pressure, diastolic blood pressure, and left ventricular ejection time determined by using echocardiography. S1S2 correlated with heart rate, mean arterial pressure, diastolic blood pressure, left ventricular ejection time, and age. S2S1 correlated with heart rate, mean arterial pressure, diastolic blood pressure, systolic blood pressure, and left ventricular ejection time. QS1 did not correlate with any anthropometric or echocardiographic parameter. The relation S1S2/S2S1 correlated with systolic blood pressure. On multivariate analyses, heart rate was the only independent predictor of QS2, S1S2, and S2S1. In conclusion, determination of cardiac time intervals is feasible and reproducible by using an electrical stethoscope in nonhuman primates. Heart rate is a major determinant of QS2, S1S2, and S2S1 but not QS1; regression equations for reference values for cardiac time intervals in bonnet macaques are provided. PMID:21439218
Interpregnancy interval and risk of autistic disorder.
Gunnes, Nina; Surén, Pål; Bresnahan, Michaeline; Hornig, Mady; Lie, Kari Kveim; Lipkin, W Ian; Magnus, Per; Nilsen, Roy Miodini; Reichborn-Kjennerud, Ted; Schjølberg, Synnve; Susser, Ezra Saul; Øyen, Anne-Siri; Stoltenberg, Camilla
2013-11-01
A recent California study reported increased risk of autistic disorder in children conceived within a year after the birth of a sibling. We assessed the association between interpregnancy interval and risk of autistic disorder using nationwide registry data on pairs of singleton full siblings born in Norway. We defined interpregnancy interval as the time from birth of the first-born child to conception of the second-born child in a sibship. The outcome of interest was autistic disorder in the second-born child. Analyses were restricted to sibships in which the second-born child was born in 1990-2004. Odds ratios (ORs) were estimated by fitting ordinary logistic models and logistic generalized additive models. The study sample included 223,476 singleton full-sibling pairs. In sibships with interpregnancy intervals <9 months, 0.25% of the second-born children had autistic disorder, compared with 0.13% in the reference category (≥ 36 months). For interpregnancy intervals shorter than 9 months, the adjusted OR of autistic disorder in the second-born child was 2.18 (95% confidence interval 1.42-3.26). The risk of autistic disorder in the second-born child was also increased for interpregnancy intervals of 9-11 months in the adjusted analysis (OR = 1.71 [95% CI = 1.07-2.64]). Consistent with a previous report from California, interpregnancy intervals shorter than 1 year were associated with increased risk of autistic disorder in the second-born child. A possible explanation is depletion of micronutrients in mothers with closely spaced pregnancies.
Lee, Taeheon; Park, Jung Ho; Sohn, Chongil; Yoon, Kyung Jae; Lee, Yong-Taek; Park, Jung Hwan; Jung, Il Seok
2017-01-01
Background/Aims We attempted to examine the relationship between abnormal findings on high-resolution manometry (HRM) and videofluoroscopic swallowing study (VFSS) of the pharynx and upper esophageal sphincter (UES), and to identify the risk factors for aspiration. Methods We performed VFSS and HRM on the same day in 36 ischemic stroke patients (mean age, 67.5 years) with dysphagia. Pressure (basal, median intra bolus, and nadir), relaxation time interval of the UES, and mesopharyngeal and hypopharyngeal contractility (as a contractile integral) were examined using HRM. The parameters of VFSS were vallecular residue, pyriform sinus residue, vallecular overflow, penetration, and aspiration. The association between the parameters of VFSS and HRM was analyzed by the Student’s t test. Results Three (8.3%) and 4 (11.1%) stroke patients with dysphagia had pyriform sinus residue and vallecular sinus residue, respectively, and 5 (13.8%) patients showed aspiration. Mesopharyngeal and hypopharyngeal contractile integrals in patients with residue in the pyriform sinus were significantly lower than those in patients without residue in the pyriform sinus (P < 0.05). Relaxation time intervals in patients with aspiration were significantly shorter than those in patients without aspiration (P < 0.05), and multivariate regression analysis revealed a shorter relaxation time interval as the main risk factor for aspiration (OR, 0.03; 95% CI, 0.01–0.65; P < 0.05). Conclusions Manometric measurements of the pharynx and UES were well correlated with abnormal findings in the VFSS, and a shorter relaxation time interval of the UES during deglutition is an important parameter for the development of aspiration. PMID:27510474
Kumar, Anupam; Kumar, Vijay
2017-05-01
In this paper, a novel concept of an interval type-2 fractional order fuzzy PID (IT2FO-FPID) controller, which requires a fractional order integrator and a fractional order differentiator, is proposed. The incorporation of a Takagi-Sugeno-Kang (TSK) type interval type-2 fuzzy logic controller (IT2FLC) with a fractional controller of PID-type is investigated for time response measures due to both unit step response and unit load disturbance. The resulting IT2FO-FPID controller is examined on different delayed linear and nonlinear benchmark plants, followed by robustness analysis. In order to design this controller, fractional order integrator-differentiator operators are considered as design variables, including input-output scaling factors. A new hybridized algorithm, the artificial bee colony-genetic algorithm (ABC-GA), is used to optimize the parameters of the controller while minimizing a weighted sum of the integral of time absolute error (ITAE) and the integral of the square of control output (ISCO). To assess the comparative performance of the IT2FO-FPID controller, the authors compared it against existing controllers, i.e., interval type-2 fuzzy PID (IT2-FPID), type-1 fractional order fuzzy PID (T1FO-FPID), type-1 fuzzy PID (T1-FPID), and conventional PID controllers. Furthermore, to show the effectiveness of the proposed controller, perturbed processes along with larger dead times are tested. Moreover, the proposed controllers are also implemented on a multi-input multi-output (MIMO), coupled, and highly complex nonlinear two-link robot manipulator system in the presence of un-modeled dynamics. Finally, the simulation results explicitly indicate that the performance of the proposed IT2FO-FPID controller is superior to its conventional counterparts in most of the cases. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Hughes, Daniel; Nair, Sunil; Harvey, John N
2017-12-01
Objectives To determine the necessary screening interval for retinopathy in diabetic patients with no retinopathy based on time to laser therapy and to assess long-term visual outcome following screening. Methods In a population-based community screening programme in North Wales, 2917 patients were followed until death or for approximately 12 years. At screening, 2493 had no retinopathy; 424 had mostly minor degrees of non-proliferative retinopathy. Data on timing of first laser therapy and visual outcome following screening were obtained from local hospitals and ophthalmology units. Results Survival analysis showed that very few of the no retinopathy at screening group required laser therapy in the early years compared with the non-proliferative retinopathy group (p < 0.001). After two years, <0.1% of the no retinopathy at screening group required laser therapy, and at three years 0.2% (cumulative), lower rates of treatment than have been suggested by analyses of sight-threatening retinopathy determined photographically. At follow-up (mean 7.8 ± 4.6 years), mild to moderate visual impairment in one or both eyes due to diabetic retinopathy was more common in those with retinopathy at screening (26% vs. 5%, p < 0.001), but blindness due to diabetes occurred in only 1 in 1000. Conclusions Optimum screening intervals should be determined from time to active treatment. Based on requirement for laser therapy, the screening interval for diabetic patients with no retinopathy can be extended to two to three years. Patients who attend for retinal screening and treatment who have no or non-proliferative retinopathy now have a very low risk of eventual blindness from diabetes.
Variable diffusion in stock market fluctuations
NASA Astrophysics Data System (ADS)
Hua, Jia-Chen; Chen, Lijian; Falcon, Liberty; McCauley, Joseph L.; Gunaratne, Gemunu H.
2015-02-01
We analyze intraday fluctuations in several stock indices to investigate the underlying stochastic processes using techniques appropriate for processes with nonstationary increments. The five most actively traded stocks each contain two time intervals during the day where the variance of increments can be fit by power law scaling in time. The fluctuations in return within these intervals follow asymptotic bi-exponential distributions. The autocorrelation function for increments vanishes rapidly, but decays slowly for absolute and squared increments. Based on these results, we propose an intraday stochastic model with linear variable diffusion coefficient as a lowest order approximation to the real dynamics of financial markets, and to test the effects of time averaging techniques typically used for financial time series analysis. We find that our model replicates major stylized facts associated with empirical financial time series. We also find that ensemble averaging techniques can be used to identify the underlying dynamics correctly, whereas time averages fail in this task. Our work indicates that ensemble average approaches will yield new insight into the study of financial markets' dynamics. Our proposed model also provides new insight into the modeling of financial markets dynamics in microscopic time scales.
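The power-law scaling of increment variance described above can be recovered from simulated data by the ensemble-average route the authors advocate. A sketch under the simplifying assumption of independent Gaussian increments (function names such as `fit_scaling_exponent` are ours, not from the paper): generate paths whose cumulative variance follows t^η, then fit η by log-log least squares.

```python
import math, random

random.seed(42)

def simulate_paths(eta, n_steps=64, n_paths=4000, dt=1.0):
    """Ensemble of independent-increment paths with Var[x(t)] = t**eta."""
    times = [dt * (k + 1) for k in range(n_steps)]
    # Per-step variances chosen so the cumulative variance follows t**eta.
    step_var = [times[0] ** eta] + [times[k] ** eta - times[k - 1] ** eta
                                    for k in range(1, n_steps)]
    paths = []
    for _ in range(n_paths):
        x, path = 0.0, []
        for v in step_var:
            x += random.gauss(0.0, math.sqrt(v))
            path.append(x)
        paths.append(path)
    return times, paths

def fit_scaling_exponent(times, paths):
    """Log-log least-squares fit of ensemble variance versus time."""
    n = len(times)
    log_t = [math.log(t) for t in times]
    log_v = []
    for k in range(n):
        m = sum(p[k] for p in paths) / len(paths)
        v = sum((p[k] - m) ** 2 for p in paths) / (len(paths) - 1)
        log_v.append(math.log(v))
    mt, mv = sum(log_t) / n, sum(log_v) / n
    return (sum((a - mt) * (b - mv) for a, b in zip(log_t, log_v))
            / sum((a - mt) ** 2 for a in log_t))

times, paths = simulate_paths(eta=0.8)
eta_hat = fit_scaling_exponent(times, paths)
```

With a few thousand paths the fitted exponent lands close to the true η, illustrating why ensemble averages identify the dynamics where single-path time averages can fail.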
Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling
NASA Astrophysics Data System (ADS)
Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing
2018-05-01
The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wideband sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
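A toy version of additive random sampling shows why randomizing the pulse interval lets a tone above the mean-rate Nyquist limit be identified: random intervals decohere the aliases that uniform sampling would produce. This is a generic least-squares periodogram sketch, not the authors' sNARS reconstruction; all numbers are invented for illustration.

```python
import math, random

random.seed(1)

# Additive random sampling: t_{k+1} = t_k + U, U uniform in [0.5, 1.5]*mean_dt.
mean_dt = 1.0          # mean rate 1 Hz -> mean-rate Nyquist limit 0.5 Hz
f_true = 0.9           # tone frequency deliberately above that limit
n = 2000
t, times = 0.0, []
for _ in range(n):
    t += random.uniform(0.5, 1.5) * mean_dt
    times.append(t)
signal = [math.sin(2 * math.pi * f_true * tk) for tk in times]

def periodogram_peak(times, signal, f_grid):
    """Least-squares (Lomb-style) power on a frequency grid; return argmax."""
    best_f, best_p = None, -1.0
    for f in f_grid:
        w = 2 * math.pi * f
        c = sum(x * math.cos(w * tk) for x, tk in zip(signal, times))
        s = sum(x * math.sin(w * tk) for x, tk in zip(signal, times))
        p = c * c + s * s
        if p > best_p:
            best_f, best_p = f, p
    return best_f

f_grid = [0.05 + 0.005 * k for k in range(240)]   # 0.05 .. 1.245 Hz
f_hat = periodogram_peak(times, signal, f_grid)
```

With uniform 1 Hz sampling the 0.9 Hz tone would alias to 0.1 Hz; with randomized intervals the periodogram peak appears at the true frequency.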
Prediction of future asset prices
NASA Astrophysics Data System (ADS)
Seong, Ng Yew; Hin, Pooi Ah; Ching, Soo Huei
2014-12-01
This paper attempts to incorporate trading volumes as an additional predictor for predicting asset prices. Denoting r(t) as the vector consisting of the time-t values of the trading volume and price of a given asset, we model the time-(t+1) asset price to be dependent on the present and l-1 past values r(t), r(t-1), …, r(t-l+1) via a conditional distribution which is derived from a (2l+1)-dimensional power-normal distribution. A prediction interval based on the 100(α/2)% and 100(1-α/2)% points of the conditional distribution is then obtained. By examining the average lengths of the prediction intervals found by using the composite indices of the Malaysian stock market for the period 2008 to 2013, we found that the value 2 appears to be a good choice for l. With the omission of the trading volume from the vector r(t), the corresponding prediction interval exhibits a slightly longer average length, showing that it might be desirable to keep trading volume as a predictor. From the above conditional distribution, the probability that the time-(t+1) asset price will be larger than the time-t asset price is next computed. When this probability differs from 0 (or 1) by less than 0.03, the observed time-(t+1) change in price tends to be negative (or positive). Thus the above probability has good potential for use as a market indicator in technical analysis.
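The paper's power-normal machinery is beyond a short sketch, but the underlying idea, a prediction interval taken from the conditional distribution of the next price given current observations, can be illustrated with a bivariate Gaussian stand-in. All parameter values and names below are invented for illustration; this is not the authors' model.

```python
import math, random

random.seed(7)

# Simulated (current price, next price) pairs from a bivariate normal.
rho, mu, sigma = 0.8, 100.0, 5.0
pairs = []
for _ in range(5000):
    x = random.gauss(mu, sigma)
    y = mu + rho * (x - mu) + random.gauss(0.0, sigma * math.sqrt(1 - rho ** 2))
    pairs.append((x, y))

def conditional_interval(pairs, x_now, alpha=0.05):
    """95% prediction interval for y given x, from a fitted bivariate normal."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxx = sum((x - mx) ** 2 for x, _ in pairs) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in pairs) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in pairs) / (n - 1)
    beta = sxy / sxx
    cond_mean = my + beta * (x_now - mx)          # E[y | x]
    cond_sd = math.sqrt(max(syy - beta * sxy, 1e-12))
    z = 1.959963984540054                          # 97.5% standard-normal point
    return cond_mean - z * cond_sd, cond_mean + z * cond_sd

lo, hi = conditional_interval(pairs, x_now=105.0)
```

The conditional interval is narrower than the marginal one whenever the predictor carries information, which is the paper's argument for keeping trading volume in r(t).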
Interval From Imaging to Treatment Delivery in the Radiation Surgery Age: How Long Is Too Long?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seymour, Zachary A., E-mail: seymourz@radonc.ucsf.edu; Fogh, Shannon E.; Westcott, Sarah K.
Purpose: The purpose of this study was to evaluate workflow and patient outcomes related to frameless stereotactic radiation surgery (SRS) for brain metastases. Methods and Materials: We reviewed all treatment demographics, clinical outcomes, and workflow timing, including time from magnetic resonance imaging (MRI), computed tomography (CT) simulation, insurance authorization, and consultation to the start of SRS for brain metastases. Results: A total of 82 patients with 151 brain metastases treated with SRS were evaluated. The median times from consultation, insurance authorization, CT simulation, and MRI to SRS were 15, 7, 6, and 11 days, respectively. Local freedom from progression (LFFP) was lower in metastases with MRI ≥14 days before treatment (P=.0003, log rank). The 6- and 12-month LFFP rates were 95% and 75% for metastases with an interval of <14 days from MRI to treatment, compared with 56% and 34% for metastases with MRI ≥14 days before treatment. On multivariate analysis, LFFP remained significantly lower for lesions with MRI ≥14 days at SRS (P=.002, Cox proportional hazards; hazard ratio: 3.4, 95% confidence interval: 1.6-7.3). Conclusions: Delay from MRI to SRS treatment delivery for brain metastases appears to reduce local control. Future studies should monitor the timing from imaging acquisition to treatment delivery. Our experience suggests that the time from MRI to treatment should be <14 days.
Powell, Jessica; Tarnow, Karen Gahan; Perucca, Roxanne
2008-01-01
The purpose of this study was to determine any relationship between peripheral IV catheter indwell time and phlebitis in hospitalized adults. A retrospective review of quarterly quality assurance data (monitoring indwell time, phlebitis rating, and site and tubing labels) was performed. Of 1,161 sites, only 679 had documented indwell times available for use. Average indwell time was 1.9 days, and the overall phlebitis rate was 3.7%. Analysis of variance revealed a significant association between phlebitis and indwell time. However, asymptomatic peripheral IVs may not need to be removed at regular intervals because there were healthy, asymptomatic sites with indwell times of up to 10 days.
Sood, Anshuman; Hakim, David N; Hakim, Nadey S
2016-04-01
The prevalence of obesity is increasing rapidly and globally, yet systemic reviews on this topic are scarce. Our meta-analysis and systemic review aimed to assess how obesity affects 5 postoperative outcomes: biopsy-proven acute rejection, patient death, allograft loss, type 2 diabetes mellitus after transplant, and delayed graft function. We evaluated peer-reviewed literature from 22 medical databases. Studies were included if they were conducted in accordance with the Meta-analysis of Observational Studies in Epidemiology criteria, only examined postoperative outcomes in adult patients, only examined the relation between recipient obesity at time of transplant and our 5 postoperative outcomes, and had a minimum score of > 5 stars on the Newcastle-Ottawa scale for nonrandomized studies. Reliable conclusions were ensured by having our studies examined against 2 internationally known scoring systems. Obesity was defined in accordance with the World Health Organization as having a body mass index of > 30 kg/m(2). All obese recipients were compared versus "healthy" recipients (body mass index of 18.5-24.9 kg/m(2)). Hazard ratios were calculated for biopsy-proven acute rejection, patient death, allograft loss, and type 2 diabetes mellitus after transplant. An odds ratio was calculated for delayed graft function. We assessed 21 retrospective observational studies in our meta-analysis (N = 241 381 patients). In obese transplant recipients, hazard ratios were 1.51 (95% confidence interval, 1.24-1.78) for presence of biopsy-proven acute rejection, 1.19 (95% confidence interval, 1.10-1.31) for patient death, 1.54 (95% confidence interval, 1.38-1.68) for allograft loss, and 1.01 (95% confidence interval, 0.98-1.07) for development of type 2 diabetes mellitus. The odds ratio for delayed graft function was 1.81 (95% confidence interval, 1.51-2.13). 
Our meta-analysis clearly demonstrated greater risks for obese renal transplant recipients and poorer postoperative outcomes with obesity. We confidently recommend renal transplant candidates seek medically supervised weight loss before transplant.
Shear Bond Strengths of Different Adhesive Systems to Biodentine
Odabaş, Mesut Enes; Bani, Mehmet; Tirali, Resmiye Ebru
2013-01-01
The aim of this study was to measure the shear bond strength of different adhesive systems to Biodentine with different time intervals. Eighty specimens of Biodentine were prepared and divided into 8 groups. After 12 minutes, 40 samples were randomly selected and divided into 4 groups of 10 each: group 1: (etch-and-rinse adhesive system) Prime & Bond NT; group 2: (2-step self-etch adhesive system) Clearfil SE Bond; group 3: (1-step self-etch adhesive systems) Clearfil S3 Bond; group 4: control (no adhesive). After the application of adhesive systems, composite resin was applied over Biodentine. This procedure was repeated 24 hours after mixing additional 40 samples, respectively. Shear bond strengths were measured using a universal testing machine, and the data were subjected to 1-way analysis of variance and Scheffé post hoc test. No significant differences were found between all of the adhesive groups at the same time intervals (12 minutes and 24 hours) (P > .05). Among the two time intervals, the lowest value was obtained for group 1 (etch-and-rinse adhesive) at a 12-minute period, and the highest was obtained for group 2 (two-step self-etch adhesive) at a 24-hour period. The placement of composite resin used with self-etch adhesive systems over Biodentine showed better shear bond strength. PMID:24222742
Mahajan, Ruhi; Viangteeravat, Teeradache; Akbilgic, Oguz
2017-12-01
A timely diagnosis of congestive heart failure (CHF) is crucial to evade a life-threatening event. This paper presents a novel probabilistic symbol pattern recognition (PSPR) approach to detect CHF in subjects from their cardiac interbeat (R-R) intervals. PSPR discretizes each continuous R-R interval time series by mapping them onto an eight-symbol alphabet and then models the pattern transition behavior in the symbolic representation of the series. The PSPR-based analysis of the discretized series from 107 subjects (69 normal and 38 CHF subjects) yielded discernible features to distinguish normal subjects and subjects with CHF. In addition to PSPR features, we also extracted features using the time-domain heart rate variability measures such as average and standard deviation of R-R intervals. An ensemble of bagged decision trees was used to classify two groups resulting in a five-fold cross-validation accuracy, specificity, and sensitivity of 98.1%, 100%, and 94.7%, respectively. However, a 20% holdout validation yielded an accuracy, specificity, and sensitivity of 99.5%, 100%, and 98.57%, respectively. Results from this study suggest that features obtained with the combination of PSPR and long-term heart rate variability measures can be used in developing automated CHF diagnosis tools. Copyright © 2017 Elsevier B.V. All rights reserved.
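The study's exact PSPR features are not given in the abstract, but its two named ingredients, an eight-symbol discretization of the R-R series and a model of symbol transition behavior, can be sketched as follows. Quantile (equal-frequency) binning is our assumption, and the R-R series here is synthetic, not the PhysioNet-style data of the study.

```python
import random

random.seed(3)

# Hypothetical R-R interval series (seconds) with a weak periodic component.
rr = [0.8 + 0.1 * random.random() + 0.05 * ((i % 7) / 7.0) for i in range(500)]

def symbolize(series, n_symbols=8):
    """Map each value to one of n_symbols equal-frequency (quantile) bins."""
    ranked = sorted(series)
    cuts = [ranked[int(len(series) * k / n_symbols)]
            for k in range(1, n_symbols)]
    return [sum(1 for c in cuts if x >= c) for x in series]  # indices 0..7

def transition_matrix(symbols, n_symbols=8):
    """Row-normalized first-order symbol transition probabilities."""
    counts = [[0] * n_symbols for _ in range(n_symbols)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        tot = sum(row)
        probs.append([c / tot if tot else 0.0 for c in row])
    return probs

syms = symbolize(rr)
P = transition_matrix(syms)
```

Entries of `P` (or summaries of them) can then serve as classifier features alongside the time-domain heart rate variability measures the paper combines them with.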
NASA Astrophysics Data System (ADS)
González, Jose S.; Dorantes, Guadalupe; Alba, Alfonso; Méndez, Martin O.; Camacho, Sergio; Luna-Rivera, Martin; Parrino, Liborio; Riccardi, Silvia; Terzano, Mario G.; Milioli, Giulia
The aim of this work is to study the behavior of the autonomic system through variations in the heart rate (HR) during the Cyclic Alternating Pattern (CAP), which is formed by A-phases. The analysis was carried out in 10 healthy subjects and 10 patients with Nocturnal Frontal Lobe Epilepsy (NFLE) who underwent one whole night of polysomnographic recording. In order to assess the relation of A-phases with the cardiovascular system, two time domain features were computed: the amplitude reduction and the time delay of the minimum of the R-R intervals with respect to A-phase onset. In addition, the same process was performed over randomly chosen R-R interval segments during NREM sleep for baseline comparison. A non-parametric bootstrap procedure was used to test differences in the kurtosis values of the two populations. The results suggest that the onset of the A-phases is correlated with a significant increase of the HR that peaks at around 4 s after the A-phase onset, independently of the A-phase subtype and sleep time, for both healthy subjects and NFLE patients. Furthermore, the behavior of the reduction in the R-R intervals during the A-phases was significantly different for NFLE patients with respect to control subjects.
NASA Astrophysics Data System (ADS)
Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.
2015-12-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having -Dst > 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
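The maximum-likelihood step has a closed form for a log-normal sample, and the per-century exceedance rate then follows from the fitted tail probability. The sketch below uses synthetic "storm maxima" with an invented observation window and storm rate, standing in for the -Dst data of the study.

```python
import math, random

random.seed(11)

# Synthetic storm maxima (nT); a stand-in for the -Dst series in the study.
mu_true, sigma_true = math.log(150.0), 0.8
maxima = [math.exp(random.gauss(mu_true, sigma_true)) for _ in range(560)]
years = 56.0   # pretend observation window, ~10 storms per year

def lognormal_mle(data):
    """Closed-form ML estimates (mu, sigma) for a log-normal sample."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)

def rate_per_century(data, years, threshold):
    """Expected number of events per century exceeding the threshold."""
    mu, sigma = lognormal_mle(data)
    z = (math.log(threshold) - mu) / sigma
    tail = 0.5 * math.erfc(z / math.sqrt(2.0))    # P(X > threshold)
    return (len(data) / years) * 100.0 * tail

rate = rate_per_century(maxima, years, threshold=850.0)
```

Bootstrap resampling of `maxima`, as the paper describes, would then give confidence limits on `rate` rather than a single point forecast.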
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of the Pearson's correlation coefficient between two time series to evaluate the influences of one time depended variable on another is one of the most often used statistical method in climate sciences. Various methods are used to estimate confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), where the main intention was to get an accurate confidence interval for correlation coefficient between two time series by taking the serial dependence of the process that generated the data into account. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique, which basically performs a second bootstrap loop or resamples from the bootstrap resamples. It offers, like the non-calibrated bootstrap confidence intervals, robustness against the data distribution. Pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard error based bootstrap Student's t confidence intervals. The performances of the calibrated confidence intervals are examined with Monte Carlo simulations, and compared with the performances of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20. 
One form of climate time series is output from numerical models that simulate the climate system. The method is applied to data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables at a 10-year lag, which is roughly the time it takes Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
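A minimal sketch of the pairwise moving-block bootstrap described above, with a percentile interval for Pearson's r (function names and parameters are illustrative, not taken from PearsonT3, and the second, calibrating bootstrap loop is omitted for brevity):

```python
import math, random

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def block_bootstrap_ci(x, y, block_len=5, n_boot=2000, alpha=0.05, seed=1):
    """Percentile CI for r from a pairwise moving-block bootstrap: the
    SAME block positions are drawn from both series, which preserves the
    serial correlation within each series and the pairing between them."""
    rng = random.Random(seed)
    n = len(x)
    starts = range(n - block_len + 1)
    rs = []
    for _ in range(n_boot):
        bx, by = [], []
        while len(bx) < n:
            s = rng.choice(starts)          # one start for both series
            bx.extend(x[s:s + block_len])
            by.extend(y[s:s + block_len])
        rs.append(pearson_r(bx[:n], by[:n]))
    rs.sort()
    return pearson_r(x, y), (rs[int(alpha / 2 * n_boot)],
                             rs[int((1 - alpha / 2) * n_boot) - 1])
```

The calibration step in PearsonT3 would wrap a second bootstrap loop around this one to adjust the nominal coverage level.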
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. 
We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
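The recommended approach, maximum-likelihood fitting of a parametric distribution to interval-censored retention times, can be sketched as follows; the distribution (log-normal), the grid-search optimizer, and the simulation helper are illustrative choices, not the authors' implementation:

```python
import math, random

def lognorm_cdf(t, mu, sigma):
    """CDF of a log-normal distribution."""
    if t <= 0:
        return 0.0
    return 0.5 * (1 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2))))

def interval_loglik(counts, bounds, mu, sigma):
    """Interval-censored log-likelihood: each propagule recovered in
    (bounds[i], bounds[i+1]] contributes the log of that interval's mass."""
    ll = 0.0
    for c, a, b in zip(counts, bounds[:-1], bounds[1:]):
        p = lognorm_cdf(b, mu, sigma) - lognorm_cdf(a, mu, sigma)
        ll += c * math.log(max(p, 1e-300))
    return ll

def fit_lognormal(counts, bounds):
    """Maximize the likelihood over a coarse (mu, sigma) grid -- a
    stand-in for a proper optimizer, adequate for illustration."""
    grid_mu = [i * 0.05 for i in range(101)]           # 0.00 .. 5.00
    grid_sigma = [0.05 + i * 0.05 for i in range(60)]  # 0.05 .. 3.00
    best = max((interval_loglik(counts, bounds, m, s), m, s)
               for m in grid_mu for s in grid_sigma)
    return best[1], best[2]

def simulate_counts(mu, sigma, n, bounds, seed=42):
    """Draw retention times and discretize them into sampling intervals."""
    rng = random.Random(seed)
    samples = [rng.lognormvariate(mu, sigma) for _ in range(n)]
    return [sum(a < s <= b for s in samples)
            for a, b in zip(bounds[:-1], bounds[1:])]
```

Fitting the interval masses directly, rather than treating the lower, mid, or upper bounds as exact observations, is what avoids the biases reported above.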
NASA Astrophysics Data System (ADS)
Faruk, Alfensi
2018-03-01
Survival analysis is a branch of statistics focused on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular for analyzing the effects of several covariates on survival time. However, the proportionality assumption of the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. The accelerated failure time (AFT) models, in contrast, do not require this assumption and can be used as an alternative to the PH model when it is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting first birth interval (FBI) data in Indonesia. The discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the considered models. Results of the best-fitting model (the log-normal AFT model) showed that covariates such as the woman's educational level, husband's educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
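The log-normal AFT model favored above can be illustrated with a small sketch. Under the simplifying assumption of no censoring (real birth-interval data would be right-censored, requiring a censored likelihood), the model log(T) = b0 + b1·x + σ·ε with Gaussian ε reduces to least squares on log-time; the covariate here is hypothetical:

```python
import math, random

def fit_lognormal_aft(times, x):
    """OLS of log(T) on one covariate = MLE of a log-normal AFT model
    when no observations are censored (a simplifying assumption)."""
    n = len(times)
    y = [math.log(t) for t in times]
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    b0 = my - b1 * mx
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / n)
    return b0, b1, sigma  # exp(b1) is the time ratio per unit of x

def simulate_aft(b0, b1, sigma, n, seed=7):
    """Hypothetical covariate (e.g. years of education) and event times."""
    rng = random.Random(seed)
    x = [rng.uniform(0, 12) for _ in range(n)]
    t = [math.exp(b0 + b1 * xi + sigma * rng.gauss(0, 1)) for xi in x]
    return t, x
```

The AFT interpretation is direct: exp(b1) multiplies the expected time to the event per unit of the covariate, with no proportional-hazards assumption.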
Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.
Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi
2015-10-01
In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju
2018-03-01
This study aimed to evaluate the components of test-retest reliability, including time interval, sample size, and statistical methods, used in patient-reported outcome measures for older people, and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. The review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic reviews published by the National Evidence-based Healthcare Collaborating Agency in Korea. Methodological quality was assessed with the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 presented models for the ICC calculations and 30 reported 95% confidence intervals for the ICCs. Additional analyses using the 17 studies that reported a strong ICC (>0.90) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items.
Particularly, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the studies that report the results of test-retest reliability. Copyright © 2017 Elsevier Ltd. All rights reserved.
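For continuous scores, the ICC recommended above can be computed directly from the two-way ANOVA mean squares. A minimal sketch for the two-occasion (test-retest) case, using the two-way random-effects, absolute-agreement, single-measures form ICC(2,1) (the review notes that studies should state which ICC model they use):

```python
def icc_2_1(scores):
    """ICC(2,1) for a list of (test, retest) score pairs: two-way random
    effects, absolute agreement, single measures (Shrout-Fleiss form)."""
    n, k = len(scores), 2
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # occasions
    sst = sum((v - grand) ** 2 for row in scores for v in row)
    sse = sst - ssr - ssc                                # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Unlike a simple Pearson correlation between the two occasions, this form also penalizes systematic shifts between test and retest.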
Palaeoclimate records 60-8 ka in the Austrian and Swiss Alps and their forelands
NASA Astrophysics Data System (ADS)
Heiri, Oliver; Koinig, Karin A.; Spötl, Christoph; Barrett, Sam; Brauer, Achim; Drescher-Schneider, Ruth; Gaar, Dorian; Ivy-Ochs, Susan; Kerschner, Hanns; Luetscher, Marc; Moran, Andrew; Nicolussi, Kurt; Preusser, Frank; Schmidt, Roland; Schoeneich, Philippe; Schwörer, Christoph; Sprafke, Tobias; Terhorst, Birgit; Tinner, Willy
2014-12-01
The European Alps and their forelands provide a range of different archives and climate proxies for developing climate records in the time interval 60-8 thousand years (ka) ago. We review quantitative and semi-quantitative approaches for reconstructing climatic variables in the Austrian and Swiss sector of the Alpine region within this time interval. Available quantitative to semi-quantitative climate records in this region are mainly based on fossil assemblages of biota such as chironomids, cladocerans, coleopterans, diatoms and pollen preserved in lake sediments and peat, the analysis of oxygen isotopes in speleothems and lake sediment records, the reconstruction of past variations in treeline altitude, the reconstruction of past equilibrium line altitude and extent of glaciers based on geomorphological evidence, and the interpretation of past soil formation processes, dust deposition and permafrost as apparent in loess-palaeosol sequences. Palaeoclimate reconstructions in the Alpine region are affected by dating uncertainties increasing with age, the fragmentary nature of most of the available records, which typically only incorporate a fraction of the time interval of interest, and the limited replication of records within and between regions. Furthermore, there have been few attempts to cross-validate different approaches across this time interval to confirm reconstructed patterns of climatic change by several independent lines of evidence. Based on our review we identify a number of developments that would provide major advances for palaeoclimate reconstruction for the period 60-8 ka in the Alps and their forelands. 
These include (1) the compilation of individual, fragmentary records to longer and continuous reconstructions, (2) replication of climate records and the development of regional reconstructions for different parts of the Alps, (3) the cross-validation of different proxy-types and approaches, and (4) the reconstruction of past variations in climate gradients across the Alps and their forelands. Furthermore, the development of downscaled climate model runs for the Alpine region 60-8 ka, and of forward modelling approaches for climate proxies would expand the opportunities for quantitative assessments of climatic conditions in Europe within this time-interval.
Interval stability for complex systems
NASA Astrophysics Data System (ADS)
Klinshov, Vladimir V.; Kirillov, Sergey; Kurths, Jürgen; Nekorkin, Vladimir I.
2018-04-01
Stability of dynamical systems against strong perturbations is an important problem of nonlinear dynamics, relevant to many applications in various areas. Here, we develop a novel concept of interval stability, referring to the behavior of the perturbed system during a finite time interval. Based on this concept, we suggest new measures of stability, namely interval basin stability (IBS) and interval stability threshold (IST). IBS characterizes the likelihood that the perturbed system returns to the stable regime (attractor) in a given time. IST provides the minimal magnitude of perturbation capable of disrupting the stable regime for a given interval of time. The suggested measures provide important information about the system's susceptibility to external perturbations, which may be useful for practical applications. Moreover, from a theoretical viewpoint the interval stability measures are shown to bridge the gap between linear and asymptotic stability. We also suggest numerical algorithms for quantifying the interval stability characteristics and demonstrate their potential for several dynamical systems of various nature, such as power grids and neural networks.
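The IBS idea can be illustrated on a toy bistable system rather than the power-grid or neural-network models of the paper: perturb the attractor state at random and count the fraction of trajectories that return within the finite horizon (system, integrator, and parameter choices below are all illustrative):

```python
import random

def returns_within(x0, attractor=1.0, horizon=10.0, dt=0.01, tol=0.05):
    """Euler-integrate the bistable toy system dx/dt = x - x^3 (stable
    fixed points at +/-1) and check whether the state is back within
    `tol` of the chosen attractor after the given horizon."""
    x = x0
    for _ in range(int(horizon / dt)):
        x += dt * (x - x ** 3)
    return abs(x - attractor) < tol

def interval_basin_stability(n=1000, spread=2.0, horizon=10.0, seed=0):
    """IBS estimate: fraction of uniform random perturbations of the
    attractor state that the system absorbs within the horizon."""
    rng = random.Random(seed)
    hits = sum(returns_within(1.0 + rng.uniform(-spread, spread),
                              horizon=horizon)
               for _ in range(n))
    return hits / n
```

Small perturbations that stay inside the basin of x = +1 give IBS = 1; once the perturbation range crosses the basin boundary at x = 0, IBS drops below 1, and shortening the horizon would lower it further, which is the finite-time aspect the measure captures.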
Atomic temporal interval relations in branching time: calculation and application
NASA Astrophysics Data System (ADS)
Anger, Frank D.; Ladkin, Peter B.; Rodriguez, Rita V.
1991-03-01
A practical method is presented for reasoning about intervals in a branching-time model that is dense, unbounded, future-branching, and without rejoining branches. The discussion is based on heuristic constraint-propagation techniques using the relation algebra of binary temporal relations among the intervals over the branching-time model. This technique has been applied with success to models of intervals over linear time by Allen and others, and is of cubic-time complexity. To extend it to branching-time models, it is necessary to calculate compositions of the relations; thus, the table of compositions for the 'atomic' relations is computed, enabling the rapid determination of the composition of arbitrary relations, expressed as disjunctions or unions of the atomic relations.
Martin, A.D.
1986-05-09
A method and apparatus are provided for generating an output pulse that follows a trigger pulse at a preset time-delay interval, with a resolution that is high relative to the low resolution available from the supplied clock pulses. A first lumped-constant delay (LCD) provides a first output signal at predetermined interpolation intervals corresponding to the desired high-resolution time interval. Latching circuits latch the high-resolution data to form a first synchronizing data set. A selected time interval is preset in internal counters and corrected for circuit propagation delay times of the same order of magnitude as the desired high resolution. Internal system clock pulses count down the counters to generate an internal pulse delayed by an interval that is functionally related to the preset time interval. A second LCD corrects the internal signal with the high-resolution time delay. A second internal pulse is then applied to a third LCD to generate a second set of synchronizing data, complementary to the first set, for presentation to logic circuits. The logic circuits further delay the internal output signal with the internal pulses. The final delayed output signal thereafter enables the output pulse generator to produce the desired output pulse at the preset time-delay interval following input of the trigger pulse.
NASA Astrophysics Data System (ADS)
Engelke, Julia; Esser, Klaus J. K.; Linnert, Christian; Mutterlose, Jörg; Wilmsen, Markus
2016-12-01
The benthic macroinvertebrates of the Lower Maastrichtian chalk of Saturn quarry at Kronsmoor (northern Germany) have been studied taxonomically based on more than 1,000 specimens. Two successive benthic macrofossil assemblages were recognised: the lower interval in the upper part of the Kronsmoor Formation (Belemnella obtusa Zone) is characterized by low abundances of macroinvertebrates while the upper interval in the uppermost Kronsmoor and lowermost Hemmoor formations (lower to middle Belemnella sumensis Zone) shows a high macroinvertebrate abundance (eight times more than in the B. obtusa Zone) and a conspicuous dominance of brachiopods. The palaeoecological analysis of these two assemblages indicates the presence of eight different guilds, of which epifaunal suspension feeders (fixo-sessile and libero-sessile guilds), comprising approximately half of the trophic nucleus of the lower interval, increased to a dominant 86% in the upper interval, including a considerable proportion of rhynchonelliform brachiopods. It is tempting to relate this shift from the lower to the upper interval to an increase in nutrient supply and/or a shallowing of the depositional environment but further data including geochemical proxies are needed to fully understand the macrofossil distribution patterns in the Lower Maastrichtian of Kronsmoor.
Marinaccio, Christian; Giudice, Giuseppe; Nacchiero, Eleonora; Robusto, Fabio; Opinto, Giuseppina; Lastilla, Gaetano; Maiorano, Eugenio; Ribatti, Domenico
2016-08-01
The presence of interval sentinel lymph nodes in melanoma is documented in several studies, but controversies still exist about the management of these lymph nodes. In this study, an immunohistochemical evaluation of tumor cell proliferation and neo-angiogenesis has been performed with the aim of establishing a correlation between these two parameters between positive and negative interval sentinel lymph nodes. This retrospective study reviewed data of 23 patients diagnosed with melanoma. Bioptic specimens of interval sentinel lymph node were retrieved, and immunohistochemical reactions on tissue sections were performed using Ki67 as a marker of proliferation and CD31 as a blood vessel marker for the study of angiogenesis. The entire stained tissue sections for each case were digitized using Aperio Scanscope Cs whole-slide scanning platform and stored as high-resolution images. Image analysis was carried out on three selected fields of equal area using IHC Nuclear and Microvessel analysis algorithms to determine positive Ki67 nuclei and vessel number. Patients were divided into positive and negative interval sentinel lymph node groups, and the positive interval sentinel lymph node group was further divided into interval positive with micrometastasis and interval positive with macrometastasis subgroups. The analysis revealed a significant difference between positive and negative interval sentinel lymph nodes in the percentage of Ki67-positive nuclei and mean vessel number suggestive of an increased cellular proliferation and angiogenesis in positive interval sentinel lymph nodes. Further analysis in the interval positive lymph node group showed a significant difference between micro- and macrometastasis subgroups in the percentage of Ki67-positive nuclei and mean vessel number. Percentage of Ki67-positive nuclei was increased in the macrometastasis subgroup, while mean vessel number was increased in the micrometastasis subgroup. 
The results of this study suggest that the correlation between tumor cell proliferation and neo-angiogenesis in interval sentinel lymph nodes in melanoma could be used as a good predictive marker to distinguish interval positive sentinel lymph nodes with micrometastasis from interval positive lymph nodes with macrometastasis subgroups.
Sensitivity Analysis of Multicriteria Choice to Changes in Intervals of Value Tradeoffs
NASA Astrophysics Data System (ADS)
Podinovski, V. V.
2018-03-01
An approach is proposed for the sensitivity (stability) analysis of nondominated alternatives to changes in the bounds of intervals of value tradeoffs, where the alternatives are selected on the basis of interval data on criteria tradeoffs. Computational methods are developed for analyzing the sensitivity of individual nondominated alternatives and of the set of such alternatives as a whole.
Nielsen, Merete Willemoes; Søndergaard, Birthe; Kjøller, Mette; Hansen, Ebba Holme
2008-09-01
This study compared national self-reported data on medicine use and national prescription records at the individual level. Data from the nationally representative Danish health survey conducted in 2000 (n=16,688) were linked at the individual level to national prescription records covering 1999-2000. Kappa statistics and 95% confidence intervals were calculated. Applying the legend time method to medicine groups used mainly on a chronic basis revealed good to very good agreement between the two data sources, whereas medicines used as needed showed fair to moderate agreement. When a fixed-time window was applied for analysis, agreement was unchanged for medicines used mainly on a chronic basis, whereas agreement increased somewhat compared to the legend time method when analyzing medicines used as needed. Agreement between national self-reported data and national prescription records differed according to method of analysis and therapeutic group. A fixed-time window is an appropriate method of analysis for most therapeutic groups.
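The agreement measure used in the study above is Cohen's kappa. A minimal sketch for the binary case (did/did not use a medicine in the window, per data source); the real analysis computed kappa with 95% confidence intervals per therapeutic group, which this sketch omits:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for two binary ratings, given (a, b) pairs of 0/1:
    observed agreement corrected for agreement expected by chance."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n          # observed agreement
    pa1 = sum(a for a, _ in pairs) / n              # rater A's base rate
    pb1 = sum(b for _, b in pairs) / n              # rater B's base rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)          # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa near 1 corresponds to the "very good agreement" reported for chronically used medicines; values in the 0.2-0.6 range correspond to the "fair to moderate" agreement for medicines used as needed.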
Steerable dyadic wavelet transform and interval wavelets for enhancement of digital mammography
NASA Astrophysics Data System (ADS)
Laine, Andrew F.; Koren, Iztok; Yang, Wuhai; Taylor, Fred J.
1995-04-01
This paper describes two approaches for accomplishing interactive feature analysis with overcomplete multiresolution representations. We show quantitatively that transform coefficients, modified by an adaptive non-linear operator, can make unseen or barely seen features of mammograms more obvious without requiring additional radiation. Our results are compared with traditional image enhancement techniques by measuring the local contrast of known mammographic features. We design a filter bank representing a steerable dyadic wavelet transform that can be used for multiresolution analysis along arbitrary orientations. Digital mammograms are enhanced through orientation analysis performed by the steerable dyadic wavelet transform. Arbitrary regions of interest (ROIs) are enhanced by Deslauriers-Dubuc interpolation representations on an interval. We demonstrate that our methods can provide radiologists with an interactive capability to support localized processing of selected (suspicious) areas (lesions). Features extracted from multiscale representations can provide an adaptive mechanism for accomplishing local contrast enhancement. Improving the visualization of breast pathology can improve the chances of early detection while requiring less time to evaluate mammograms for most patients.
Generalization of Turbulent Pair Dispersion to Large Initial Separations
NASA Astrophysics Data System (ADS)
Shnapp, Ron; Liberzon, Alex; International Collaboration for Turbulence Research
2018-06-01
We present a generalization of turbulent pair dispersion to large initial separations (η
Compression based entropy estimation of heart rate variability on multiple time scales.
Baumert, Mathias; Voss, Andreas; Javorka, Michal
2013-01-01
Heart rate fluctuates beat by beat in a complex manner. The aim of this study was to develop a framework for entropy assessment of heart rate fluctuations on multiple time scales. We employed the Lempel-Ziv algorithm for lossless data compression to investigate the compressibility of RR interval time series on different time scales, using a coarse-graining procedure. We estimated the entropy of RR interval time series of 20 young and 20 old subjects and also investigated the compressibility of randomly shuffled surrogate RR time series. The original RR time series displayed significantly smaller compression entropy values than the randomized RR interval data. The RR interval time series of older subjects showed significantly different entropy characteristics over multiple time scales than those of younger subjects. In conclusion, data compression may be a useful approach for multiscale entropy assessment of heart rate variability.
Palmer, E; Ciechanowicz, S; Reeve, A; Harris, S; Wong, D J N; Sultan, P
2018-07-01
We conducted a 5-year retrospective cohort study on women undergoing caesarean section to investigate factors influencing the operating room-to-incision interval. Time-to-event analysis was performed for category-1 caesarean section using a Cox proportional hazards regression model. Covariates included: anaesthetic technique; body mass index; age; parity; time of delivery; and gestational age. Binary logistic regression was performed for 5-min Apgar score ≥ 7. There were 677 women who underwent category-1 caesarean section and who met the entry criteria. Unadjusted median (IQR [range]) operating room-to-incision intervals were: epidural top-up 11 (7-17 [0-87]) min; general anaesthesia 6 (4-11 [0-69]) min; spinal 13 (10-20 [0-83]) min; and combined spinal-epidural 24 (13-35 [0-75]) min. Cox regression showed general anaesthesia to be the most rapid method with a hazard ratio (95%CI) of 1.97 (1.60-2.44; p < 0.0001), followed by epidural top-up (reference group), spinal anaesthesia 0.79 (0.65-0.96; p = 0.02) and combined spinal-epidural 0.48 (0.35-0.67; p < 0.0001). Underweight and overweight body mass indexes were associated with longer operating room-to-incision intervals. General anaesthesia was associated with fewer 5-min Apgar scores ≥ 7 with an odds ratio (95%CI) of 0.28 (0.11-0.68; p < 0.01). There was no difference in neonatal outcomes between the first and fifth quintiles for operating room-to-incision intervals. General anaesthesia is associated with the most rapid operating room-to-incision interval for category-1 caesarean section, but is also associated with worse short term neonatal outcomes. Longer operating room-to-incision intervals were not associated with worse neonatal outcomes. © 2018 The Association of Anaesthetists of Great Britain and Ireland.
Automatic Error Analysis Using Intervals
ERIC Educational Resources Information Center
Rothwell, E. J.; Cloud, M. J.
2012-01-01
A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
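The idea behind interval-based error analysis can be shown with a minimal interval type; INTLAB is a MATLAB toolbox, so this Python analogue is only a sketch, and it ignores the directed rounding of endpoints that a rigorous implementation requires:

```python
class Interval:
    """Minimal interval-arithmetic type: every operation returns bounds
    guaranteed to contain the true result, so measurement uncertainty
    propagates through a formula automatically."""
    def __init__(self, lo, hi=None):
        self.lo = lo
        self.hi = lo if hi is None else hi
    def __add__(self, other):
        o = _as_interval(other)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, other):
        o = _as_interval(other)
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, other):
        o = _as_interval(other)
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))
    def width(self):
        return self.hi - self.lo

def _as_interval(v):
    return v if isinstance(v, Interval) else Interval(v)
```

For example, a power computation P = I·I·R with I = 2 ± 0.05 A and R = 100 ± 1 Ω yields an interval guaranteed to contain the nominal 400 W. Note the classic caveat: naive interval arithmetic treats each occurrence of a variable as independent (x - x has nonzero width), so it can overestimate the error, unlike first-order propagation formulas.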
What Could Be Causing Global Ozone Depletion
NASA Technical Reports Server (NTRS)
Singer, S. Fred
1990-01-01
The reported decline trend in global ozone between 1970 and 1986 may be in part an artifact of the analysis; the trend value appears to depend on the time interval selected for analysis--in relation to the 11-year solar cycle. If so, then the decline should diminish as one approaches solar maximum and includes data from 1987 to 1990. If the decline is real, its cause could be the result of natural and human factors other than just chlorofluorocarbons.
Relaxation estimation of RMSD in molecular dynamics immunosimulations.
Schreiner, Wolfgang; Karch, Rudolf; Knapp, Bernhard; Ilieva, Nevena
2012-01-01
Molecular dynamics simulations have to be sufficiently long to draw reliable conclusions. However, no method exists to prove that a simulation has converged. We suggest the method of "lagged RMSD-analysis" as a tool to judge if an MD simulation has not yet run long enough. The analysis is based on RMSD values between pairs of configurations separated by variable time intervals Δt. Unless RMSD(Δt) has reached a stationary shape, the simulation has not yet converged.
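The lagged RMSD analysis reduces to computing, for each lag Δt, the mean RMSD over all frame pairs separated by that lag; if the resulting curve RMSD(Δt) still changes shape as the simulation is extended, the run has not converged. A minimal sketch (frames are flat coordinate lists, assumed already superposed, which skips the structural alignment a real MD analysis needs):

```python
import math

def lagged_rmsd(frames, lag):
    """Average RMSD between all pairs of frames separated by `lag` steps."""
    vals = [math.sqrt(sum((a - b) ** 2
                          for a, b in zip(frames[i], frames[i + lag]))
                      / (len(frames[i]) // 3))
            for i in range(len(frames) - lag)]
    return sum(vals) / len(vals)
```

For a drifting (unconverged) trajectory the curve keeps growing with the lag, whereas for an equilibrated one it levels off at a plateau.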
Fajt, Virginia R; Apley, Michael D; Brogden, Kim A; Skogerboe, Terry L; Shostrom, Valerie K; Chin, Ya-Lin
2004-05-01
To examine effects of danofloxacin and tilmicosin on continuously recorded body temperature in beef calves with pneumonia experimentally induced by inoculation of Mannheimia haemolytica. 41 Angus-cross heifers (body weight, 160 to 220 kg) without a recent history of respiratory tract disease or antimicrobial treatment, all from a single ranch. Radiotransmitters were implanted intravaginally in each calf. Pneumonia was induced intrabronchially by use of logarithmic-phase cultures of M. haemolytica. At 21 hours after inoculation, calves were treated with saline (0.9% NaCl) solution, danofloxacin, or tilmicosin. Body temperature was monitored from 66 hours before inoculation until 72 hours after treatment. Area under the curve (AUC) of the temperature-time plot and mean temperature were calculated for 3-hour intervals and compared among treatment groups. The AUCs for 3-hour intervals did not differ significantly among treatment groups for any of the time periods. Analysis of the mean temperature for 3-hour intervals revealed significantly higher temperatures at most time periods for saline-treated calves, compared with temperatures for antimicrobial-treated calves; however, we did not detect significant differences between the danofloxacin- and tilmicosin-treated calves. The circadian rhythm of temperatures before exposure was detected again approximately 48 hours after bacterial inoculation. Danofloxacin and tilmicosin did not differ in their effect on mean body temperature for 3-hour intervals but significantly decreased body temperature, compared with body temperature in saline-treated calves. Normal daily variation in body temperature must be considered in the face of respiratory tract disease during clinical evaluation of feedlot cattle.
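The per-interval AUC statistic used above can be computed with the trapezoidal rule; this is a generic sketch, as the abstract does not specify the authors' exact preprocessing of the radiotransmitter data:

```python
def auc_trapezoid(times, values):
    """Trapezoidal area under a sampled curve -- e.g. the temperature-time
    plot over one 3-hour interval (times in hours, values in deg C)."""
    return sum((t1 - t0) * (v0 + v1) / 2.0
               for t0, t1, v0, v1 in zip(times, times[1:],
                                         values, values[1:]))
```

Dividing the AUC by the interval length recovers a time-weighted mean temperature, the second statistic compared among the treatment groups.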
Model for the respiratory modulation of the heart beat-to-beat time interval series
NASA Astrophysics Data System (ADS)
Capurro, Alberto; Diambra, Luis; Malta, C. P.
2005-09-01
In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embedding of pre-meditation and control cases has a roughly circular shape, it acquires a polygonal shape during meditation: triangular for the Kundalini Yoga data and quadrangular for the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave whose rising branch is of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
Reliability analysis based on the losses from failures.
Todinov, M T
2006-04-01
The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with them. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed.
For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the early-life failures region and the expected losses given failure characterizing the corresponding time intervals. For complex systems whose components are not logically arranged in series, discrete simulation algorithms and software have been created for determining the losses from failures in terms of expected lost production time, cost of intervention, and cost of replacement. Different system topologies are assessed to determine the effect of modifications of the system topology on the expected losses from failures. It is argued that the reliability allocation in a production system should be done to maximize the profit/value associated with the system. Consequently, a method for setting reliability requirements and reliability allocation maximizing the profit by minimizing the total cost has been developed. Reliability allocation that maximizes the profit in the case of a system consisting of blocks arranged in series is achieved by determining for each block individually the reliabilities of the components in the block that minimize the sum of the capital, operation costs, and the expected losses from failures. A Monte Carlo simulation based net present value (NPV) cash-flow model has also been proposed, which has significant advantages over cash-flow models based on the expected value of the losses from failures per time interval. Unlike these models, the proposed model has the capability to reveal the variation of the NPV due to the different numbers of failures occurring during a specified time interval (e.g., during one year). The model also permits tracking the impact of the distribution pattern of failure occurrences and the time dependence of the losses from failures.
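The sum-of-products rule for early-life losses stated above translates directly into code. This is a toy sketch with hypothetical per-interval figures, not numbers from the paper.

```python
# Hypothetical per-interval inputs covering the early-life failure region.
expected_failures = [0.12, 0.05, 0.02]                 # E[N_i] in each interval
expected_loss_given_failure = [40_000, 55_000, 60_000]  # E[loss | failure] per interval

# Expected early-life losses = sum over intervals of E[N_i] * E[loss | failure]_i
expected_early_life_losses = sum(
    n * c for n, c in zip(expected_failures, expected_loss_given_failure)
)
# 0.12*40000 + 0.05*55000 + 0.02*60000 = 8750.0
```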
Comment on short-term variation in subjective sleepiness.
Eriksen, Claire A; Akerstedt, Torbjörn; Kecklund, Göran; Akerstedt, Anna
2005-12-01
Subjective sleepiness at different times is often measured in studies on sleep loss, night work, or drug effects. However, the context at the time of rating may influence results. The present study examined sleepiness throughout the day at hourly intervals and, during controlled activities [reading, writing, walking, social interaction (discussion), etc.], at 10-min intervals for 3 hr. This was done on a normal working day preceded by a scheduled early rising (to invite sleepiness) for six subjects. Analysis showed a significant U-shaped pattern across the day with peaks in the early morning and late evening. A walk and social interaction were associated with low sleepiness compared to sedentary and quiet office work. None of this was visible in the hourly ratings. There was also a pronounced afternoon increase in sleepiness that was not observable with hourly ratings. It was concluded that there are large variations in sleepiness related to time of day and also to context, and that sparse sampling of subjective sleepiness may miss much of this variation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain agnostic-but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
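The feature-extraction idea described above can be sketched in a few lines: fit a local linear model in each moving window, then summarize the coefficients over user-defined intervals. This is a minimal sketch assuming ordinary least squares via `np.polyfit` and made-up window and interval sizes; it does not show the package's actual API.

```python
import numpy as np

def window_slopes(series, win):
    """Slope of a degree-1 least-squares fit in each sliding window."""
    t = np.arange(win)
    # np.polyfit with deg=1 returns [slope, intercept]
    return np.array([np.polyfit(t, series[i:i + win], 1)[0]
                     for i in range(len(series) - win + 1)])

def summarize(values, interval):
    """Mean of the window coefficients over consecutive intervals."""
    m = len(values) // interval
    return values[:m * interval].reshape(m, interval).mean(axis=1)

x = np.arange(100, dtype=float)   # noiseless linear series with slope 1
feats = summarize(window_slopes(x, win=10), interval=20)
```

For a multivariate series, the same summaries would be computed per channel and stacked into a feature matrix for downstream multivariate analysis.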
Integrating Security in Real-Time Embedded Systems
2017-04-26
b) detect any intrusions/attacks once they occur and (c) keep the overall system safe in the event of an attack. 4. Analysis and evaluation of...beyond), we expanded our work in both security integration and attack mechanisms, and worked on demonstrations and evaluations in hardware. Year I...scheduling for each busy interval with the calculated arrival time window. Step 1 focuses on the problem of finding the quantity of each task
Zhuang, Katie Z.; Lebedev, Mikhail A.
2014-01-01
Correlation between cortical activity and electromyographic (EMG) activity of limb muscles has long been a subject of neurophysiological studies, especially in terms of corticospinal connectivity. Interest in this issue has recently increased due to the development of brain-machine interfaces with output signals that mimic muscle force. For this study, three monkeys were implanted with multielectrode arrays in multiple cortical areas. One monkey performed self-timed touch pad presses, whereas the other two executed arm reaching movements. We analyzed the dynamic relationship between cortical neuronal activity and arm EMGs using a joint cross-correlation (JCC) analysis that evaluated trial-by-trial correlation as a function of time intervals within a trial. JCCs revealed transient correlations between the EMGs of multiple muscles and neural activity in motor, premotor and somatosensory cortical areas. Matching results were obtained using spike-triggered averages corrected by subtracting trial-shuffled data. Compared with spike-triggered averages, JCCs more readily revealed dynamic changes in cortico-EMG correlations. JCCs showed that correlation peaks often sharpened around movement times and broadened during delay intervals. Furthermore, JCC patterns were directionally selective for the arm-reaching task. We propose that such highly dynamic, task-dependent and distributed relationships between cortical activity and EMGs should be taken into consideration for future brain-machine interfaces that generate EMG-like signals. PMID:25210153
NASA Astrophysics Data System (ADS)
Rebolledo, M. A.; Martinez-Betorz, J. A.
1989-04-01
In this paper the accuracy in the determination of the period of an oscillating signal, when obtained from the photon statistics time-interval probability, is studied as a function of the precision (the inverse of the cutoff frequency of the photon counting system) with which time intervals are measured. The results are obtained by means of an experiment with a square-wave signal, where the Fourier or square-wave transforms of the time-interval probability are measured. It is found that for values of the frequency of the signal near the cutoff frequency the errors in the period are small.
The influence of interpregnancy interval on infant mortality.
McKinney, David; House, Melissa; Chen, Aimin; Muglia, Louis; DeFranco, Emily
2017-03-01
In Ohio, the infant mortality rate is above the national average and the black infant mortality rate is more than twice the white infant mortality rate. Having a short interpregnancy interval has been shown to correlate with preterm birth and low birthweight, but the effect of short interpregnancy interval on infant mortality is less well established. We sought to quantify the population impact of interpregnancy interval on the risk of infant mortality. This was a statewide population-based retrospective cohort study of all births (n = 1,131,070) and infant mortalities (n = 8,152) using linked Ohio birth and infant death records from January 2007 through September 2014. For this study we analyzed 5 interpregnancy interval categories: 0-<6, 6-<12, 12-<24, 24-<60, and ≥60 months. The primary outcome for this study was infant mortality. During the study period, 3,701 infant mortalities were linked to a live birth certificate with an interpregnancy interval available. We calculated the frequency and relative risk of infant mortality for each interval compared to a referent interval of 12-<24 months. Stratified analyses by maternal race were also performed. Adjusted risks were estimated after accounting for statistically significant and biologically plausible confounding variables. Adjusted relative risk was utilized to calculate the attributable risk percent of short interpregnancy intervals on infant mortality. Short interpregnancy intervals were common in Ohio during the study period. Of all multiparous births, 20.5% followed an interval of <12 months. The overall infant mortality rate during this time was 7.2 per 1000 live births (6.0 for white mothers and 13.1 for black mothers). Infant mortalities occurred more frequently for births following short intervals of 0-<6 months (9.2 per 1000) and 6-<12 months (7.1 per 1000) compared to 12-<24 months (5.6 per 1000) (both P < .001).
The highest risk for infant mortality followed interpregnancy intervals of 0-<6 months (adjusted relative risk, 1.32; 95% confidence interval, 1.17-1.49), followed by interpregnancy intervals of 6-<12 months (adjusted relative risk, 1.16; 95% confidence interval, 1.04-1.30). Analysis stratified by maternal race revealed similar findings. Attributable risk calculation showed that 24.2% of infant mortalities following intervals of 0-<6 months and 14.1% with intervals of 6-<12 months are attributable to the short interpregnancy interval. By avoiding short interpregnancy intervals of ≤12 months, we estimate that in the state of Ohio 31 infant mortalities (20 white and 8 black) per year could have been prevented and the infant mortality rate could have been reduced from 7.2 to 7.0 during this time frame. An interpregnancy interval of 12-60 months (1-5 years) between birth and conception of the next pregnancy is associated with the lowest risk of infant mortality. Public health initiatives and provider counseling to optimize birth spacing have the potential to significantly reduce infant mortality for both white and black mothers. Copyright © 2017 Elsevier Inc. All rights reserved.
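The attributable risk percent quoted above is consistent with the standard attributable fraction among the exposed, (RR − 1)/RR. A quick check against the reported adjusted relative risks (the published 14.1% figure for 6-<12 months presumably used an unrounded RR):

```python
def attributable_fraction_exposed(rr):
    """Attributable risk percent among the exposed: (RR - 1) / RR."""
    return (rr - 1.0) / rr

af_0_6 = attributable_fraction_exposed(1.32) * 100    # 0-<6 months: ~24.2%
af_6_12 = attributable_fraction_exposed(1.16) * 100   # 6-<12 months: ~13.8%
```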
Liu, Xuejiao; Zhang, Dongdong; Liu, Yu; Sun, Xizhuo; Han, Chengyi; Wang, Bingyuan; Ren, Yongcheng; Zhou, Junmei; Zhao, Yang; Shi, Yuanyuan; Hu, Dongsheng; Zhang, Ming
2017-05-01
Despite the inverse association between physical activity (PA) and incident hypertension, a comprehensive assessment of the quantitative dose-response association between PA and hypertension has not been reported. We performed a meta-analysis, including dose-response analysis, to quantitatively evaluate this association. We searched PubMed and Embase databases for articles published up to November 1, 2016. Random effects generalized least squares regression models were used to assess the quantitative association between PA and hypertension risk across studies. Restricted cubic splines were used to model the dose-response association. We identified 22 articles (29 studies) investigating the risk of hypertension with leisure-time PA or total PA, including 330,222 individuals and 67,698 incident cases of hypertension. The risk of hypertension was reduced by 6% (relative risk, 0.94; 95% confidence interval, 0.92-0.96) with each 10 metabolic equivalent of task (MET) h/wk increment of leisure-time PA. We found no evidence of a nonlinear dose-response association of PA and hypertension (P for nonlinearity = 0.094 for leisure-time PA and 0.771 for total PA). With the linear cubic spline model, when compared with inactive individuals, for those who met the guideline-recommended minimum level of moderate PA (10 MET h/wk), the risk of hypertension was reduced by 6% (relative risk, 0.94; 95% confidence interval, 0.92-0.97). This meta-analysis suggests that additional benefits for hypertension prevention occur as the amount of PA increases. © 2017 American Heart Association, Inc.
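Given the reported per-10-MET-h/wk relative risk and the absence of nonlinearity, the relative risk at other activity volumes can be sketched under a log-linear assumption. Extrapolating beyond the studied dose range is an assumption of this sketch, not a result of the meta-analysis.

```python
def rr_at(met_h_per_wk, rr_per_10=0.94):
    """Relative risk at a given PA volume, assuming the log-linear
    dose-response (RR 0.94 per 10 MET h/wk) holds across the range."""
    return rr_per_10 ** (met_h_per_wk / 10.0)

rr_guideline = rr_at(10)   # the reported 6% risk reduction at 10 MET h/wk
rr_double = rr_at(20)      # ~12% reduction if the trend held at 20 MET h/wk
```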
DNA and RNA profiling of excavated human remains with varying postmortem intervals.
van den Berge, M; Wiskerke, D; Gerretsen, R R R; Tabak, J; Sijen, T
2016-11-01
When postmortem intervals (PMIs) increase, such as with longer burial times, human remains suffer increasingly from the taphonomic effects of decomposition processes such as autolysis and putrefaction. In this study, various DNA analysis techniques and a messenger RNA (mRNA) profiling method were applied to examine for trends in nucleic acid degradation and the postmortem interval. The DNA analysis techniques include highly sensitive DNA quantitation (with and without degradation index), standard and low template STR profiling, insertion and null alleles (INNUL) of retrotransposable elements typing and mitochondrial DNA profiling. The mRNA profiling system used targets genes with tissue-specific expression for seven human organs, as reported by Lindenbergh et al. (Int J Legal Med 127:891-900, 27), and has been applied to forensic evidentiary traces but not to excavated tissues. The techniques were applied to a total of 81 brain, lung, liver, skeletal muscle, heart, kidney and skin samples obtained from 19 excavated graves with burial times ranging from 4 to 42 years. Results show that brain and heart are the organs in which both DNA and RNA remain remarkably stable, notwithstanding long PMIs. The other organ tissues either show poor overall profiling results or vary in DNA and RNA profiling success, with sometimes DNA and other times RNA profiling being more successful. No straightforward relations were observed between nucleic acid profiling results and the PMI. This study shows that not only DNA but also RNA molecules can be remarkably stable and used for profiling of long-buried human remains, which supports forensic applications. The insight that brain and heart tissues tend to provide the best profiling results may change sampling policies in identification cases of degrading cadavers.
Dedoncker, Josefien; Brunoni, Andre R; Baeken, Chris; Vanderhasselt, Marie-Anne
2016-10-01
Recently, there has been wide interest in the effects of transcranial direct current stimulation (tDCS) of the dorsolateral prefrontal cortex (DLPFC) on cognitive functioning. However, many methodological questions remain unanswered. One of them is whether the time interval between active and sham-controlled stimulation sessions, i.e. the interval between sessions (IBS), influences DLPFC tDCS effects on cognitive functioning. Therefore, a systematic review and meta-analysis was performed of experimental studies published in PubMed, Science Direct, and other databases from the first data available to February 2016. Single-session sham-controlled within-subject studies reporting the effects of tDCS of the DLPFC on cognitive functioning in healthy controls and neuropsychiatric patients were included. Cognitive tasks were categorized as tasks assessing memory, attention, and executive functioning. Evaluation of 188 trials showed that anodal vs. sham tDCS significantly decreased response times and increased accuracy, particularly for executive functioning tasks, in a sample of healthy participants and neuropsychiatric patients (although a slightly different pattern of improvement was found in analyses of the two samples separately). The effects of cathodal vs. sham tDCS (45 trials), on the other hand, were not significant. IBS ranged from less than 1 h to up to 1 week (for cathodal tDCS) or 2 weeks (for anodal tDCS). IBS length had no influence on the estimated effect size when performing a meta-regression of IBS on reaction time and accuracy outcomes in all three cognitive categories, for both anodal and cathodal stimulation. Practical recommendations and limitations of the study are further discussed.
Geosocial process and its regularities
NASA Astrophysics Data System (ADS)
Vikulina, Marina; Vikulin, Alexander; Dolgaya, Anna
2015-04-01
Natural disasters and social events (wars, revolutions, genocides, epidemics, fires, etc.) accompany each other throughout human civilization, reflecting a close relationship between these seemingly different phenomena. In order to study this relationship, the authors compiled and analyzed a list of 2,400 natural disasters and social phenomena, weighted by their magnitude, that occurred during the last 36 centuries of our history. Statistical analysis was performed separately for each aggregate (natural disasters and social phenomena) and for particular statistically representative types of events; there were 5 + 5 = 10 types. It is shown that the numbers of events in the list follow a logarithmic law: the bigger the event, the less likely it happens. For each type of event and each aggregate, the existence of periodicities with periods of 280 ± 60 years was established. Statistical analysis of the time intervals between adjacent events for both aggregates showed good agreement with the Weibull-Gnedenko distribution with shape parameter less than 1, which is equivalent to the conclusion that events group at small time intervals. Modeling the statistics of time intervals with a Pareto distribution allowed the authors to identify an emergent property of all events in the aggregate. This result led the authors to conclude that natural disasters and social phenomena interact. The list of events compiled by the authors, and the properties of cyclicity, grouping, and interaction first identified in it, form the basis for modeling an essentially unified geosocial process at a sufficiently high statistical level. Evidence of interaction between "lifeless" Nature and Society is fundamental and provides a new approach to forecasting demographic crises that takes into account both natural disasters and social phenomena.
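The clustering signature reported above (a Weibull shape parameter below 1 for inter-event times) is equivalent to the intervals having a coefficient of variation above 1, whereas a memoryless (Poisson) process gives a coefficient of variation of 1. A small simulation illustrating the diagnostic on synthetic data, not on the authors' catalogue:

```python
import numpy as np

rng = np.random.default_rng(0)

def coeff_of_variation(gaps):
    """CV of inter-event intervals; CV > 1 signals event clustering
    (consistent with a Weibull fit with shape parameter < 1)."""
    return gaps.std() / gaps.mean()

poisson_gaps = rng.exponential(1.0, 20_000)   # memoryless process: CV ~ 1
clustered_gaps = rng.weibull(0.6, 20_000)     # Weibull shape 0.6 < 1: CV > 1
```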
Nomura, Shuhei; Blangiardo, Marta; Tsubokura, Masaharu; Nishikawa, Yoshitaka; Gilmour, Stuart; Kami, Masahiro; Hodgson, Susan
2016-01-01
Considering the health impacts of evacuation is fundamental to disaster planning, especially for vulnerable elderly populations; however, evacuation-related mortality risks have not been well investigated. We conducted an analysis to compare survival of evacuated and non-evacuated residents of elderly care facilities following the Great East Japan Earthquake and subsequent Fukushima Dai-ichi nuclear power plant incident on 11th March 2011. The aims were to assess associations between evacuation and mortality after the Fukushima nuclear incident, and to present discussion points on disaster planning with reference to vulnerable elderly populations. The study population comprised 1,215 residents admitted to seven elderly care facilities located 20-40km from the nuclear plant in the five years before the incident. Demographic and clinical characteristics were obtained from medical records. Evacuation histories were tracked until mid-2013. Main outcome measures were hazard ratios for evacuees versus non-evacuees using random-effects Cox proportional hazards models, and pre- and post-disaster survival probabilities and relative mortality incidence. Experiencing the disasters did not have a significant influence on mortality (hazard ratio 1.10, 95% confidence interval: 0.84-1.43). Evacuation was associated with 1.82 times higher mortality (95% confidence interval: 1.22-2.70) after adjusting for confounders, with the initial evacuation from the original facility associated with 3.37 times higher mortality risk (95% confidence interval: 1.66-6.81) than non-evacuation. The government should consider updating its requirements for emergency planning for elderly facilities and ensure that, in a disaster setting, these facilities have the capacity and support to shelter in place at least long enough to adequately prepare an initial evacuation. Copyright © 2015 Elsevier Inc. All rights reserved.
Stirling, Aaron D; Moran, Neil R; Kelly, Michael E; Ridgway, Paul F; Conlon, Kevin C
2017-10-01
Using revised Atlanta classification-defined outcomes, we compared absolute values of C-reactive protein (CRP) with interval changes in CRP for severity stratification in acute pancreatitis (AP). A retrospective study of all first-incidence AP was conducted over a 5-year period. The interval change in CRP values from admission to days 1, 2 and 3 was compared against the absolute values. Receiver-operator characteristic (ROC) curve analysis and likelihood ratios (LRs) were used to compare the ability to predict severe and mild disease. 337 cases of first-incidence AP were included in our analysis. ROC curve analysis identified the second day as the most useful time for repeat CRP measurement. A CRP interval change >90 mg/dL at 48 h (+LR 2.15, -LR 0.26) was equivalent to an absolute value of >150 mg/dL within 48 h (+LR 2.32, -LR 0.25). The optimal cut-off for absolute CRP based on the new, more stringent definition of severity was >190 mg/dL (+LR 2.72, -LR 0.24). Interval change in CRP is a comparable measure to absolute CRP in the prognostication of AP severity. This study suggests that a rise of >90 mg/dL from admission or an absolute value of >190 mg/dL at 48 h predicts severe disease with the greatest accuracy. Copyright © 2017 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.
Foster, J D; Ewings, P; Falk, S; Cooper, E J; Roach, H; West, N P; Williams-Yesson, B A; Hanna, G B; Francis, N K
2016-10-01
The optimal time of rectal resection after long-course chemoradiotherapy (CRT) remains unclear. A feasibility study was undertaken for a multi-centre randomized controlled trial evaluating the impact of the interval after chemoradiotherapy on the technical complexity of surgery. Patients with rectal cancer were randomized to either a 6- or 12-week interval between CRT and surgery between June 2012 and May 2014 (ISRCTN registration number: 88843062). For blinded technical complexity assessment, the Observational Clinical Human Reliability Analysis technique was used to quantify technical errors enacted within video recordings of operations. Other measured outcomes included resection completeness, specimen quality, radiological down-staging, tumour cell density down-staging and surgeon-reported technical complexity. Thirty-one patients were enrolled across 7 centres: 15 were randomized to the 6-week interval and 16 to the 12-week interval. Fewer eligible patients were identified than had been predicted. Of 23 patients who underwent resection, a mean of 12.3 errors per case was observed at 6 weeks vs. 10.7 at 12 weeks (p = 0.401). Other measured outcomes were similar between groups. The feasibility of measuring operative performance of rectal cancer surgery as an endpoint was confirmed in this exploratory study. Recruitment of sufficient numbers of patients represented a challenge, and a proportion of patients did not proceed to resection surgery. These results suggest that the interval after CRT may not substantially impact upon surgical technical performance.
A new NASA/MSFC mission analysis global cloud cover data base
NASA Technical Reports Server (NTRS)
Brown, S. C.; Jeffries, W. R., III
1985-01-01
A global cloud cover data set, derived from the USAF 3D NEPH Analysis, was developed for use in climate studies and for Earth-viewing applications. This data set contains a single parameter - total sky cover - separated in time by 3- or 6-hr intervals and in space by approximately 50 n.mi. Cloud cover amount is recorded for each grid point (of a square grid) by a single alphanumeric character representing each 5 percent increment of sky cover. The data are arranged in both quarterly and monthly formats. The data base currently provides daily, 3-hr observed total sky cover for the Northern Hemisphere from 1972 through 1977, excluding 1976. For the Southern Hemisphere, there are data at 6-hr intervals for 1976 through 1978 and at 3-hr intervals for 1979 and 1980. More years of data are being added. To validate the data base, the percent frequency of ≤0.3 and ≥0.8 cloud cover was compared with ground-observed cloud amounts at several locations, with generally good agreement. Mean or other desired cloud amounts can be calculated for any time period and any size area, from a single grid point to a hemisphere. The data base is especially useful in evaluating the consequence of cloud cover on Earth-viewing space missions. The temporal and spatial frequency of the data allow simulations that closely approximate any projected viewing mission. No adjustments are required to account for cloud continuity.
NASA Astrophysics Data System (ADS)
Villa, Valentina; Pereira, Alison; Chaussé, Christine; Nomade, Sébastien; Giaccio, Biagio; Limondin-Lozouet, Nicole; Fusco, Fabio; Regattieri, Eleonora; Degeai, Jean-Philippe; Robert, Vincent; Kuzucuoglu, Catherine; Boschian, Giovanni; Agostini, Silvano; Aureli, Daniele; Pagli, Marina; Bahain, Jean Jacques; Nicoud, Elisa
2016-11-01
An integrated geological study, including sedimentology, stable isotope analysis (δ18O, δ13C), geochemistry, micromorphology, biomarker analysis, 40Ar/39Ar geochronology and tephrochronology, was undertaken on the Quaternary infill of the Valle Giumentina basin in Central Italy, which also includes an outstanding archaeological succession, composed of nine human occupation levels ascribed to the Lower and Middle Palaeolithic. 40Ar/39Ar dating, and other palaeoenvironmental and tephrochronological data, constrain the sedimentary history of the whole succession to the MIS 15-MIS 12 interval, between 618 ± 13 ka and 456 ± 2 ka. Palaeoenvironmental proxies suggest that over this time interval of about 150 ka, sedimentary and pedogenic processes were mainly influenced by climatic changes, in particular by the pulsing of local mountain glaciers of the Majella massif. Specifically, the Valle Giumentina succession records glacio-fluvial and lacustrine sedimentation during the colder glacial periods and pedogenesis and/or alluvial sedimentation during the warmer interglacial and/or interstadial periods. During this interval, tectonics played a negligible role as a driving factor of local morphogenesis and sedimentation, whereas the general regional uplift experienced in the Middle Pleistocene led to capture of the basin and its definitive extinction after MIS 12. These data substantially improve previous knowledge of the chronology and sedimentary evolution of the succession, providing for the first time, a well constrained chronological and palaeoenvironmental framework for the archaeological and human palaeoecological record of Valle Giumentina.
Multifactor analysis of multiscaling in volatility return intervals
NASA Astrophysics Data System (ADS)
Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H. Eugene
2009-01-01
We study the volatility time series of the 1137 most traded stocks in the U.S. stock markets for the two-year period 2001-2002 and analyze their return intervals τ, which are time intervals between volatilities above a given threshold q. We explore the probability density function of τ, P_q(τ), assuming a stretched exponential function, P_q(τ) ~ exp(-τ^γ). We find that the exponent γ depends on the threshold in the range between q=1 and 6 standard deviations of the volatility. This finding supports the multiscaling nature of the return interval distribution. To better understand the multiscaling origin, we study how γ depends on four essential factors: capitalization, risk, number of trades, and return. We show that γ depends on the capitalization, risk, and return but almost does not depend on the number of trades. This suggests that γ relates to portfolio selection but not to market activity. To further characterize the multiscaling of individual stocks, we fit the moments of τ, μ_m ≡ ⟨(τ/⟨τ⟩)^m⟩^(1/m), in the range 10 < ⟨τ⟩ ≤ 100 by a power law, μ_m ~ ⟨τ⟩^δ. The exponent δ is found also to depend on the capitalization, risk, and return but not on the number of trades, and its tendency is opposite to that of γ. Moreover, we show that δ decreases with increasing γ approximately by a linear relation. The return intervals demonstrate the temporal structure of volatilities and our findings suggest that their multiscaling features may be helpful for portfolio optimization.
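One common way to recover a stretched-exponential exponent γ is a log(−log) regression on the empirical survival function. The sketch below uses synthetic Weibull-distributed intervals, whose survival function S(t) = exp(−t^γ) is exactly the stretched-exponential form; the paper's actual fitting procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
gamma_true = 0.4
# Weibull(k) waiting times have survival S(t) = exp(-t**k),
# i.e. a stretched exponential with exponent gamma = k.
tau = rng.weibull(gamma_true, 50_000)

t = np.sort(tau)
surv = 1.0 - np.arange(1, t.size + 1) / (t.size + 1)   # empirical survival

# log(-log S(t)) = gamma * log t, so gamma is the regression slope
mask = (surv > 0.01) & (surv < 0.99)
gamma_hat, _ = np.polyfit(np.log(t[mask]), np.log(-np.log(surv[mask])), 1)
```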
Cadenaro, Milena; Navarra, Chiara Ottavia; Mazzoni, Annalisa; Nucci, Cesare; Matis, Bruce A; Di Lenarda, Roberto; Breschi, Lorenzo
2010-04-01
In an in vivo study, the authors tested the hypothesis that no difference in enamel surface roughness is detectable either during or after bleaching with a high-concentration in-office whitening agent. The authors performed profilometric and scanning electron microscopic (SEM) analyses of epoxy resin replicas of the upper right incisors of 20 participants at baseline (control) and after each bleaching treatment with a 38 percent hydrogen peroxide whitening agent, applied four times, at one-week intervals. The authors used analysis of variance for repeated measures to analyze the data statistically. The profilometric analysis of the enamel surface replicas after the in vivo bleaching protocol showed no significant difference in surface roughness parameters (P > .05) compared with those at baseline, irrespective of the time interval. Results of the correlated SEM analysis showed no relevant alteration on the enamel surface. Results of this in vivo study support the tested hypothesis that the application of a 38 percent hydrogen peroxide in-office whitening agent does not alter enamel surface roughness, even after multiple applications. The use of a 38 percent hydrogen peroxide in-office whitening agent induced no roughness alterations of the enamel surface, even after prolonged and repeated applications.
Rahman, Mohammad Mahmudur; Brown, Richard J C; Kim, Ki-Hyun; Yoon, Hye-On; Phan, Nhu-Thuc
2013-01-01
In an effort to reduce the experimental bias involved in the analysis of gaseous elemental mercury (Hg(o)), the blank response from gold-coated adsorption tubes has been investigated using cold vapor atomic absorption spectrometry (CVAAS). Our study is compared with our recent investigation of the memory effect in cold vapour atomic fluorescence spectrometry (CVAFS). The pattern of blank responses was quantified after loading different amounts of mercury and after different time intervals of 1, 14, and 45 days. In the case of the one-day interval, the result of five to six instant blank heating cycles confirmed successful liberation of mercury following the second and third blank heating cycles. The results at 14 or 45 days generally suggest that liberation of excess mercury is affected by both the initial loading amount and the length of storage time prior to analysis. We have demonstrated a possibly effective way to reduce memory effects. Some similarities between these results and those from the CVAFS experiment suggest that the blank response is caused by a combination of mercury absorbed within the bulk gold and micro- and nanoparticles liberated during heating, and not by coabsorbing interfering gaseous species.
Belke, Terry W; Christie-Fougere, Melissa M
2006-11-01
Across two experiments, a peak procedure was used to assess the timing of the onset and offset of an opportunity to run as a reinforcer. The first experiment investigated the effect of reinforcer duration on temporal discrimination of the onset of the reinforcement interval. Three male Wistar rats were exposed to fixed-interval (FI) 30-s schedules of wheel-running reinforcement, and the duration of the opportunity to run was varied across values of 15, 30, and 60 s. Each session consisted of 50 reinforcers and 10 probe trials. Results showed that as reinforcer duration increased, the percentage of postreinforcement pauses longer than the 30-s schedule interval increased. On probe trials, peak response rates occurred near the time of reinforcer delivery, and peak times varied with reinforcer duration. In a second experiment, seven female Long-Evans rats were exposed to FI 30-s schedules leading to 30-s opportunities to run. Timing of the onset and offset of the reinforcement period was assessed by probe trials during the schedule interval and during the reinforcement interval in separate conditions. The results provided evidence of timing of the onset, but not the offset, of the wheel-running reinforcement period. Further research is required to assess whether timing occurs during a wheel-running reinforcement period.
High resolution digital delay timer
Martin, Albert D.
1988-01-01
Method and apparatus are provided for generating an output pulse following a trigger pulse at a preset time delay interval, with a resolution that is high relative to the low resolution available from the supplied clock pulses. A first lumped constant delay (LCD) (20) provides a first output signal (24) at predetermined interpolation intervals corresponding to the desired high resolution time interval. Latching circuits (26, 28) latch the high resolution data (24) to form a first synchronizing data set (60). A selected time interval is preset into internal counters (142, 146, 154) and corrected for circuit propagation delay times, which are of the same order of magnitude as the desired high resolution. Internal system clock pulses (32, 34) count down the counters to generate an internal pulse delayed by an interval functionally related to the preset time interval. A second LCD (184) corrects the internal signal with the high resolution time delay. A second internal pulse is then applied to a third LCD (74) to generate a second set of synchronizing data (76), complementary with the first set of synchronizing data (60), for presentation to logic circuits (64). The logic circuits (64) further delay the internal output signal (72) to obtain the proper phase relationship of an output signal (80) with the internal pulses (32, 34). The final delayed output signal (80) thereafter enables the output pulse generator (82) to produce the desired output pulse (84) at the preset time delay interval following input of the trigger pulse (10, 12).
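The coarse/fine scheme the patent describes (a clock-counted coarse delay plus a fine lumped-constant-delay tap, corrected for circuit propagation delay) can be sketched numerically. The clock period, tap size, and propagation delay below are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch: split a preset delay into a coarse count of system
# clock periods plus a fine lumped-constant-delay (LCD) tap, after
# correcting for a fixed circuit propagation delay. Numbers are illustrative.

def split_delay(preset_ns, clock_period_ns=20.0, tap_ns=1.0, prop_delay_ns=3.0):
    """Return (coarse clock counts, fine LCD tap index) for a preset delay."""
    effective = preset_ns - prop_delay_ns        # correct for propagation delay
    coarse = int(effective // clock_period_ns)   # counted down by the clock
    fine = round((effective - coarse * clock_period_ns) / tap_ns)
    return coarse, fine

print(split_delay(147.0))  # -> (7, 4): 7 x 20 ns + 4 x 1 ns + 3 ns = 147 ns
```

The counters would count down the coarse value while the fine tap selects the interpolated offset, mirroring the division of labor between the counters and the LCDs in the patent.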
Can PPG be used for HRV analysis?
Pinheiro, N; Couceiro, R; Henriques, J; Muehlsteff, J; Quintal, I; Goncalves, L; Carvalho, P
2016-08-01
Heart rate variability (HRV) is one of the most promising markers of autonomic nervous system (ANS) regulation. However, it requires acquisition of the ECG signal in order to reliably detect RR intervals, which is not always easily and comfortably available in personal health applications. Owing to progress in single-spot optical sensors, photoplethysmography (PPG) is an interesting alternative for heartbeat interval measurement, since it is a more convenient and less intrusive technique. Driven by the technological advances in such sensors, wrist-worn devices are becoming a commodity, and interest in assessing HRV indexes from PPG analysis (pulse rate variability - PRV) is rising. In this study, we investigate the hypothesis of using PRV features as surrogates for HRV indexes in three different contexts: healthy subjects at rest, healthy subjects after physical exercise, and subjects with cardiovascular diseases (CVD). We also evaluate which characteristic points are better suited for PRV analysis in these contexts, i.e., the PPG waveform characteristic points leading to the PRV features that best estimate HRV (correlation and error analysis). The results suggest that PRV can often be used as an alternative to HRV analysis in healthy subjects, with significant correlations above 82% for both time and frequency features. In contrast, in the post-exercise and CVD subjects, time and (most importantly) frequency domain features should be used with caution (mean correlations ranging from 68% to 88%).
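As a minimal illustration of the time-domain features compared in such studies, the standard SDNN and RMSSD statistics can be computed from a series of beat times, whether those come from ECG R-peaks (HRV) or PPG pulse peaks (PRV). The beat times and function name below are illustrative, not from the paper:

```python
# A minimal sketch of time-domain variability features (mean interval, SDNN,
# RMSSD) computed from beat times in seconds. Works identically for ECG
# R-peaks (HRV) and PPG pulse peaks (PRV); the input series is illustrative.
import math

def time_domain_features(beat_times_s):
    """Return (mean interval, SDNN, RMSSD) in milliseconds."""
    ibis = [1000.0 * (b - a) for a, b in zip(beat_times_s, beat_times_s[1:])]
    mean = sum(ibis) / len(ibis)
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in ibis) / len(ibis))
    diffs = [b - a for a, b in zip(ibis, ibis[1:])]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return mean, sdnn, rmssd

beats = [0.0, 0.80, 1.62, 2.41, 3.24]   # peak times in seconds (illustrative)
mean_ibi, sdnn, rmssd = time_domain_features(beats)
print(round(mean_ibi), round(sdnn, 1), round(rmssd, 1))
```

Comparing such features computed from R-peaks against the same features computed from PPG characteristic points is the essence of the HRV-vs-PRV correlation analysis the abstract describes.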
Mette, Christian; Grabemann, Marco; Zimmermann, Marco; Strunz, Laura; Scherbaum, Norbert; Wiltfang, Jens; Kis, Bernhard
2015-01-01
Altered time reproduction is exhibited by patients with adult attention deficit hyperactivity disorder (ADHD). It remains unclear whether memory capacity influences the ability of adults with ADHD to reproduce time intervals. We conducted a behavioral study of 30 adult ADHD patients medicated with methylphenidate, 29 unmedicated adult ADHD patients, and 32 healthy controls (HCs). We assessed time reproduction using six time intervals (1 s, 4 s, 6 s, 10 s, 24 s and 60 s) and assessed memory performance using the Wechsler memory scale. The patients with ADHD exhibited lower memory performance scores than the HCs. No significant between-group differences were found in the raw scores for any of the time intervals (p > .05), except for the variability at the short time intervals (1 s, 4 s and 6 s) (p < .01). The overall analyses failed to reveal any significant correlations between time reproduction at any of the intervals examined and working memory performance (p > .05). We found no evidence that working memory influences time reproduction in adult patients with ADHD. Further studies of time reproduction and memory capacity among adult patients with ADHD are therefore needed to verify and replicate the present findings.
ERIC Educational Resources Information Center
Hooper, Martin
2017-01-01
TIMSS and PIRLS assess representative samples of students at regular intervals, measuring trends in student achievement and student contexts for learning. Because individual students are not tracked over time, analysis of international large-scale assessment data is usually conducted cross-sectionally. Gustafsson (2007) proposed examining the data…
ERIC Educational Resources Information Center
Hays, Ron D.; And Others
1994-01-01
Applied structural equation modeling to evaluation of cross-lagged panel models. Self-reports of physical and mental health at three time points spanning four-year interval were analyzed to illustrate cross-lagged analysis methodology. Data were analyzed from 856 patients with hypertension, diabetes, heart disease, or depression. Cross-lagged…
Analyzing Impulse Using iPhone and Tracker
ERIC Educational Resources Information Center
Ayop, Shahrul Kadri
2017-01-01
The iPhone 6 introduced a new feature of recording video in Slo-Mo mode at 240 fps (4.17 ms interval). This great capability when integrated with video analysis freeware such as Tracker offers in-depth exploration for physical phenomena such as collisions that occur in a very short duration of time. This article discusses one such usage in…
NASA Technical Reports Server (NTRS)
Leskovar, B.; Turko, B.
1977-01-01
The development of a high precision time interval digitizer is described. The time digitizer is a 10 ps resolution stopwatch covering a range of up to 340 ms. The measured time interval is determined as the separation between the leading edges of a pair of pulses applied externally to the start and stop inputs of the digitizer. Employing an interpolation technique and a 50 MHz high precision master oscillator, the equivalent of a 100 GHz clock frequency standard is achieved. The absolute accuracy and stability of the digitizer are determined by the external 50 MHz master oscillator, which serves as a standard time marker. The start and stop pulses are fast signals with 1 ns rise times, conforming to Nuclear Instrument Module (NIM) conventions, and are detected by means of tunnel diode discriminators. The firing levels of the discriminators define the start and stop points between which the time interval is digitized.
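The interpolation arithmetic can be sketched as follows: the 50 MHz clock gives 20 ns coarse ticks, and interpolators resolve each pulse's offset from the nearest clock edge, yielding an effective 10 ps (100 GHz equivalent) resolution. The function and the example offsets below are illustrative assumptions, not the digitizer's actual implementation:

```python
# Sketch of interval reconstruction from a coarse clock count plus two
# interpolator measurements. Constants follow the abstract; the function
# and example values are hypothetical.

CLOCK_HZ = 50e6            # master oscillator frequency
TICK_S = 1.0 / CLOCK_HZ    # 20 ns coarse resolution
LSB_S = 10e-12             # 10 ps interpolated resolution (100 GHz equivalent)

def measured_interval(n_ticks, start_frac_s, stop_frac_s):
    """Interval = coarse clock counts plus interpolator corrections.

    start_frac_s: time from the start pulse to the next clock edge
    stop_frac_s:  time from the stop pulse to the next clock edge
    """
    return n_ticks * TICK_S + start_frac_s - stop_frac_s

# e.g. 5 coarse ticks with interpolator offsets of 7.3 ns and 2.1 ns
t = measured_interval(5, 7.3e-9, 2.1e-9)
print(round(t / LSB_S))    # interval expressed in 10 ps units
```

Only the offsets need picosecond-class hardware; the bulk of the 340 ms range is carried by the ordinary 20 ns counter, which is the point of the interpolation technique.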
Tsivgoulis, Georgios; Zand, Ramin; Katsanos, Aristeidis H; Goyal, Nitin; Uchino, Ken; Chang, Jason; Dardiotis, Efthimios; Putaala, Jukka; Alexandrov, Anne W; Malkoff, Marc D; Alexandrov, Andrei V
2015-05-01
Shortening door-to-needle time may lead to inadvertent intravenous thrombolysis (IVT) administration in stroke mimics (SMs). We sought to determine the safety of IVT in SMs using prospective, single-center data and by conducting a comprehensive meta-analysis of reported case series. We prospectively analyzed consecutive IVT-treated patients during a 5-year period at a tertiary care stroke center. A systematic review and meta-analysis of case series reporting the safety of IVT in SMs and confirmed acute ischemic stroke were conducted. Symptomatic intracerebral hemorrhage was defined as imaging evidence of ICH with a National Institutes of Health Stroke Scale score increase of ≥4 points. Favorable functional outcome at hospital discharge was defined as a modified Rankin Scale score of 0 to 1. Of 516 consecutive IVT patients at our tertiary care center (50% men; mean age, 60±14 years; median National Institutes of Health Stroke Scale score, 11; range, 3-22), SMs comprised 75 cases. Symptomatic intracerebral hemorrhage occurred in 1 patient, whereas we documented no cases of orolingual edema or major extracranial hemorrhagic complications. In a meta-analysis of 9 studies (8942 IVT-treated patients), the pooled rates of symptomatic intracerebral hemorrhage and orolingual edema among 392 patients with SM treated with IVT were 0.5% (95% confidence interval, 0%-2%) and 0.3% (95% confidence interval, 0%-2%), respectively. Patients with SM had a significantly lower risk of symptomatic intracerebral hemorrhage than patients with acute ischemic stroke (risk ratio=0.33; 95% confidence interval, 0.14-0.77; P=0.010), with no evidence of heterogeneity or publication bias. Favorable functional outcome was almost 3-fold more frequent in patients with SM than in patients with acute ischemic stroke (risk ratio=2.78; 95% confidence interval, 2.07-3.73; P<0.00001).
Our prospective, single-center experience coupled with the findings of the comprehensive meta-analysis underscores the safety of IVT in SM. © 2015 American Heart Association, Inc.
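For readers unfamiliar with the risk-ratio statistics quoted above, a risk ratio and its 95% confidence interval can be computed from event counts using the standard log-scale approximation. The counts below are illustrative, not the study's pooled data:

```python
# Sketch of a risk ratio with a 95% CI via the usual log-scale standard
# error. The 2x2 event counts here are hypothetical, chosen only to
# resemble the style of figures such as "RR=0.33; 95% CI, 0.14-0.77".
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(5, 100, 15, 100)   # hypothetical counts
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

In an actual meta-analysis, study-level log risk ratios would be pooled with inverse-variance weights rather than computed from a single table, but the per-study arithmetic is as above.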
Eliciting interval beliefs: An experimental study
Peeters, Ronald; Wolk, Leonard
2017-01-01
In this paper we study the interval scoring rule as a mechanism to elicit subjective beliefs under varying degrees of uncertainty. In our experiment, subjects forecast the termination time of a time series to be generated from a given but unknown stochastic process. Subjects gradually learn more about the underlying process over time and hence the true distribution over termination times. We conduct two treatments, one with a high and one with a low volatility process. We find that elicited intervals are better when subjects are facing a low volatility process. In this treatment, participants learn to position their intervals almost optimally over the course of the experiment. This is in contrast with the high volatility treatment, where subjects, over the course of the experiment, learn to optimize the location of their intervals but fail to provide the optimal length. PMID:28380020
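As an illustration of how an interval scoring rule can reward accuracy while penalizing width, here is one simple linear form. This is an assumed, generic rule for illustration (the function name and parameters are hypothetical), not necessarily the exact scoring rule used in the experiment:

```python
# Sketch of a simple interval scoring rule: the forecaster earns a reward
# that shrinks linearly with interval width, and only when the realized
# value falls inside the interval. Illustrative form, not the paper's rule.

def interval_score(lower, upper, realized, max_reward=1.0, width_penalty=0.01):
    """Reward coverage, penalize width; never pay below zero."""
    if not lower <= realized <= upper:
        return 0.0                                  # missed the realization
    return max(0.0, max_reward - width_penalty * (upper - lower))

print(interval_score(10, 30, 25))   # covered: 1.0 - 0.01*20 = 0.8
print(interval_score(10, 30, 35))   # missed the realization -> 0.0
```

Under a rule of this shape, the optimal report trades off coverage probability against width, which is exactly the location-versus-length tension the subjects in the high volatility treatment struggled with.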