Sample records for sampling interval effects

  1. Sampling Theory and Confidence Intervals for Effect Sizes: Using ESCI To Illustrate "Bouncing" Confidence Intervals.

    ERIC Educational Resources Information Center

    Du, Yunfei

    This paper discusses the impact of sampling error on the construction of confidence intervals around effect sizes. Sampling error affects the location and precision of confidence intervals. Meta-analytic resampling demonstrates that confidence intervals can bounce haphazardly around the true population parameter. Special software with graphical…

  2. Effect of increasing the sampling interval to 2 seconds on the radiation dose and accuracy of CT perfusion of the head and neck.

    PubMed

    Tawfik, Ahmed M; Razek, Ahmed A; Elhawary, Galal; Batouty, Nihal M

    2014-01-01

    To evaluate the effect of increasing the sampling interval from 1 second (1 image per second) to 2 seconds (1 image every 2 seconds) on computed tomographic (CT) perfusion (CTP) of head and neck tumors. Twenty patients underwent CTP studies of head and neck tumors with images acquired in cine mode for 50 seconds using a sampling interval of 1 second. Using deconvolution-based software, CTP analysis was performed with a sampling interval of 1 second and then 2 seconds. Perfusion maps representing blood flow, blood volume, mean transit time, and permeability surface area product (PS) were obtained. Quantitative tumor CTP values were compared between the 2 sampling intervals. Two blinded radiologists compared the subjective quality of CTP maps between the 2 sampling intervals using a 3-point scale. Radiation dose parameters were recorded for the 2 sampling intervals. No significant differences were observed between the means of the 4 perfusion parameters generated using the 2 sampling intervals; all P > 0.05. The 95% limits of agreement between the 2 sampling intervals were -65.9 to 48.1 mL/min per 100 g for blood flow, -3.6 to 3.1 mL/100 g for blood volume, -2.9 to 3.8 seconds for mean transit time, and -10.0 to 12.5 mL/min per 100 g for PS. There was no significant difference between the subjective quality scores of CTP maps obtained using the 2 sampling intervals; all P > 0.05. Radiation dose was halved when the sampling interval increased from 1 to 2 seconds. Increasing the sampling interval to 1 image every 2 seconds does not compromise image quality and has no significant effect on quantitative perfusion parameters of head and neck tumors, while halving the radiation dose.

  3. Investigation of within- and between-herd variability of bovine leukaemia virus bulk tank milk antibody levels over different sampling intervals in the Canadian Maritimes.

    PubMed

    John, Emily E; Nekouei, Omid; McClure, J T; Cameron, Marguerite; Keefe, Greg; Stryhn, Henrik

    2018-06-01

    Bulk tank milk (BTM) samples are used to determine the infection status and estimate dairy herd prevalence for bovine leukaemia virus (BLV) using an antibody ELISA assay. BLV ELISA variability between samples from the same herd or from different herds has not been investigated over long time periods. The main objective of this study was to determine the within-herd and between-herd variability of a BTM BLV ELISA assay over 1-month, 3-month, and 3-year sampling intervals. All of the Canadian Maritime region dairy herds (n = 523) that were active in 2013 and 2016 were included (83.9% and 86.9% of total herds in 2013 and 2016, respectively). BLV antibody levels were measured in three BTM samples collected at 1-month intervals in early 2013 as well as two BTM samples collected over a 3-month interval in early 2016. Random-effects models, with fixed effects for sample replicate and province and random effects for herd, were used to estimate the variability between BTM samples from the same herd and between herds for 1-month, 3-month, and 3-year sampling intervals. The majority of variability of BTM BLV ELISA results was seen between herds (1-month, 6.792 ± 0.533; 3-month, 7.806 ± 0.652; 3-year, 6.222 ± 0.528). Unexplained variance between samples from the same herd, on the square-root scale, was greatest for the 3-year (0.976 ± 0.104), followed by the 1-month (0.611 ± 0.035) then the 3-month (0.557 ± 0.071) intervals. Variability of BTM antibody levels within the same herd was present but was much smaller than the variability between herds, and was greatest for the 3-year sampling interval. The 3-month sampling interval resulted in the least variability and is appropriate to use for estimating the baseline level of within-herd prevalence for BLV control programs. Knowledge of the baseline variability and within-herd prevalence can help to determine effectiveness of control programs when BTM sampling is repeated at longer intervals. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. The effect of sampling rate on observed statistics in a correlated random walk

    PubMed Central

    Rosser, G.; Fletcher, A. G.; Maini, P. K.; Baker, R. E.

    2013-01-01

    Tracking the movement of individual cells or animals can provide important information about their motile behaviour, with key examples including migrating birds, foraging mammals and bacterial chemotaxis. In many experimental protocols, observations are recorded with a fixed sampling interval and the continuous underlying motion is approximated as a series of discrete steps. The size of the sampling interval significantly affects the tracking measurements, the statistics computed from observed trajectories, and the inferences drawn. Despite the widespread use of tracking data to investigate motile behaviour, many open questions remain about these effects. We use a correlated random walk model to study the variation with sampling interval of two key quantities of interest: apparent speed and angle change. Two variants of the model are considered, in which reorientations occur instantaneously and with a stationary pause, respectively. We employ stochastic simulations to study the effect of sampling on the distributions of apparent speeds and angle changes, and present novel mathematical analysis in the case of rapid sampling. Our investigation elucidates the complex nature of sampling effects for sampling intervals ranging over many orders of magnitude. Results show that inclusion of a stationary phase significantly alters the observed distributions of both quantities. PMID:23740484
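The sampling effect described above is easy to reproduce in simulation. A minimal sketch (not the authors' code), assuming a unit-speed walker whose heading changes by a Gaussian turning angle at each step; apparent speed is net displacement per observation window, and it drops as the sampling interval grows because the track between observations is not straight:

```python
import math
import random

def crw_positions(n_steps, sigma=0.4, seed=1):
    """Simulate a unit-speed correlated random walk: at each step the
    heading changes by a Gaussian turning angle with std dev sigma."""
    rng = random.Random(seed)
    x, y, heading = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for _ in range(n_steps):
        heading += rng.gauss(0.0, sigma)
        x += math.cos(heading)
        y += math.sin(heading)
        pts.append((x, y))
    return pts

def apparent_speed(pts, k):
    """Mean apparent speed when the track is observed every k steps:
    net displacement per window divided by the window length."""
    disps = [math.dist(pts[i], pts[i + k]) for i in range(0, len(pts) - k, k)]
    return sum(disps) / len(disps) / k

pts = crw_positions(20000)
speeds = {k: apparent_speed(pts, k) for k in (1, 5, 20, 100)}
```

With these (hypothetical) parameters the persistence length is a few steps, so the apparent speed is exactly 1 at the true sampling rate and decays steadily as the interval spans many reorientations.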

  5. Accuracy in parameter estimation for targeted effects in structural equation modeling: sample size planning for narrow confidence intervals.

    PubMed

    Lai, Keke; Kelley, Ken

    2011-06-01

    In addition to evaluating a structural equation model (SEM) as a whole, often the model parameters are of interest and confidence intervals for those parameters are formed. Given a model with a good overall fit, it is entirely possible for the targeted effects of interest to have very wide confidence intervals, thus giving little information about the magnitude of the population targeted effects. With the goal of obtaining sufficiently narrow confidence intervals for the model parameters of interest, sample size planning methods for SEM are developed from the accuracy in parameter estimation approach. One method plans for the sample size so that the expected confidence interval width is sufficiently narrow. An extended procedure ensures that the obtained confidence interval will be no wider than desired, with some specified degree of assurance. A Monte Carlo simulation study was conducted that verified the effectiveness of the procedures in realistic situations. The methods developed have been implemented in the MBESS package in R so that they can be easily applied by researchers. © 2011 American Psychological Association
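The accuracy-in-parameter-estimation logic is easiest to see in its simplest case: choosing n so that the expected confidence interval for a single normal mean is sufficiently narrow. The sketch below is a toy analogue under an assumed known sigma, not the SEM procedure or the MBESS implementation:

```python
import math

def n_for_ci_width(sigma, omega, z=1.959964):
    """Smallest n for which the full width of the 95% CI for a normal
    mean with known sigma, 2*z*sigma/sqrt(n), is at most omega."""
    return math.ceil((2 * z * sigma / omega) ** 2)

# Hypothetical planning values: sigma = 1.0, desired full width 0.2
n = n_for_ci_width(1.0, 0.2)
width = 2 * 1.959964 * 1.0 / math.sqrt(n)
```

The SEM methods in the paper follow the same template, with the analytic width formula replaced by model-implied standard errors, and the "assurance" extension inflating n so the realized (not just expected) width stays below the target.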

  6. Short-Term Memory for Temporal Intervals: Contrasting Explanations of the Choose-Short Effect in Pigeons

    ERIC Educational Resources Information Center

    Pinto, Carlos; Machado, Armando

    2011-01-01

    To better understand short-term memory for temporal intervals, we re-examined the choose-short effect. In Experiment 1, to contrast the predictions of two models of this effect, the subjective shortening and the coding models, pigeons were exposed to a delayed matching-to-sample task with three sample durations (2, 6 and 18 s) and retention…

  7. Sample Size Calculations for Precise Interval Estimation of the Eta-Squared Effect Size

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2015-01-01

    Analysis of variance is one of the most frequently used statistical analyses in the behavioral, educational, and social sciences, and special attention has been paid to the selection and use of an appropriate effect size measure of association in analysis of variance. This article presents the sample size procedures for precise interval estimation…

  8. Effects of sampling interval on spatial patterns and statistics of watershed nitrogen concentration

    USGS Publications Warehouse

    Wu, S.-S.D.; Usery, E.L.; Finn, M.P.; Bosch, D.D.

    2009-01-01

    This study investigates how spatial patterns and statistics of a 30 m resolution, model-simulated, watershed nitrogen concentration surface change as the sampling interval increases from 30 m to 600 m in 30 m increments for the Little River Watershed (Georgia, USA). The results indicate that the mean, standard deviation, and variogram sills do not show consistent trends with increasing sampling interval, whereas the variogram ranges remain constant. A sampling interval smaller than or equal to 90 m is necessary to build a representative variogram. The interpolation accuracy, clustering level, and total hot spot areas show decreasing trends approximating a logarithmic function. The trends correspond to the nitrogen variogram and start to level off at a sampling interval of 360 m, which is therefore regarded as a critical spatial scale of the Little River Watershed. Copyright © 2009 by Bellwether Publishing, Ltd. All rights reserved.

  9. Towards the estimation of effect measures in studies using respondent-driven sampling.

    PubMed

    Rotondi, Michael A

    2014-06-01

    Respondent-driven sampling (RDS) is an increasingly common sampling technique to recruit hidden populations. Statistical methods for RDS are not straightforward due to the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and convenience study of injection drug users. MOVER is then applied to obtain a confidence interval for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method to construct confidence intervals for effect measures in RDS studies. Since it only relies on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
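The MOVER construction referred to above can be sketched for a risk difference: recover variance estimates from the confidence limits of each proportion separately, then combine them. The counts below are hypothetical, and Wilson score limits are one common choice of single-proportion interval (not necessarily the one used in the paper):

```python
import math

def wilson_ci(x, n, z=1.959964):
    """Wilson score interval for a binomial proportion x/n."""
    p = x / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

def mover_diff(x1, n1, x2, n2):
    """MOVER confidence interval for p1 - p2: each tail of the difference
    borrows the corresponding tail distance of the single-proportion CIs."""
    p1, p2 = x1 / n1, x2 / n2
    l1, u1 = wilson_ci(x1, n1)
    l2, u2 = wilson_ci(x2, n2)
    d = p1 - p2
    lower = d - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = d + math.sqrt((u1 - p1) ** 2 + (l2 - p2) ** 2)
    return lower, upper

# Hypothetical counts: 24/80 in one sample vs 12/75 in the other
lo, hi = mover_diff(24, 80, 12, 75)
```

In an RDS analysis the single-proportion limits would come from an RDS-adjusted estimator rather than raw counts, which is exactly why the method is convenient: only a proportion and its confidence limits are needed.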

  10. Effects of spectrometer band pass, sampling, and signal-to-noise ratio on spectral identification using the Tetracorder algorithm

    USGS Publications Warehouse

    Swayze, G.A.; Clark, R.N.; Goetz, A.F.H.; Chrien, T.H.; Gorelick, N.S.

    2003-01-01

    Estimates of spectrometer band pass, sampling interval, and signal-to-noise ratio required for identification of pure minerals and plants were derived using reflectance spectra convolved to AVIRIS, HYDICE, MIVIS, VIMS, and other imaging spectrometers. For each spectral simulation, various levels of random noise were added to the reflectance spectra after convolution, and then each was analyzed with the Tetracorder spectra identification algorithm [Clark et al., 2003]. The outcome of each identification attempt was tabulated to provide an estimate of the signal-to-noise ratio at which a given percentage of the noisy spectra were identified correctly. Results show that spectral identification is most sensitive to the signal-to-noise ratio at narrow sampling interval values but is more sensitive to the sampling interval itself at broad sampling interval values because of spectral aliasing, a condition in which absorption features of different materials can resemble one another. The band pass is less critical to spectral identification than the sampling interval or signal-to-noise ratio because broadening the band pass does not induce spectral aliasing. These conclusions are empirically corroborated by analysis of mineral maps of AVIRIS data collected at Cuprite, Nevada, between 1990 and 1995, a period during which the sensor signal-to-noise ratio increased up to sixfold. There are values of spectrometer sampling and band pass beyond which spectral identification of materials will require an abrupt increase in sensor signal-to-noise ratio due to the effects of spectral aliasing. Factors that control this threshold are the uniqueness of a material's diagnostic absorptions in terms of shape and wavelength isolation, and the spectral diversity of the materials found in nature and in the spectral library used for comparison. Array spectrometers provide the best data for identification when they critically sample spectra. The sampling interval should not be broadened to increase the signal-to-noise ratio in a photon-noise-limited system when high levels of accuracy are desired. It is possible, using this simulation method, to select optimum combinations of band pass, sampling interval, and signal-to-noise ratio values for a particular application that maximize identification accuracy and minimize the volume of imaging data.

  11. Confidence intervals and sample size calculations for the standardized mean difference effect size between two normal populations under heteroscedasticity.

    PubMed

    Shieh, G

    2013-12-01

    The use of effect sizes and associated confidence intervals in all empirical research has been strongly emphasized by journal publication guidelines. To help advance theory and practice in the social sciences, this article describes an improved procedure for constructing confidence intervals of the standardized mean difference effect size between two independent normal populations with unknown and possibly unequal variances. The presented approach has advantages over the existing formula in both theoretical justification and computational simplicity. In addition, simulation results show that the suggested one- and two-sided confidence intervals are more accurate in achieving the nominal coverage probability. The proposed estimation method provides a feasible alternative to the most commonly used measure of Cohen's d and the corresponding interval procedure when the assumption of homogeneous variances is not tenable. To further improve the potential applicability of the suggested methodology, the sample size procedures for precise interval estimation of the standardized mean difference are also delineated. The desired precision of a confidence interval is assessed with respect to the control of expected width and to the assurance probability of interval width within a designated value. Supplementary computer programs are developed to aid in the usefulness and implementation of the introduced techniques.

  12. Evaluating the Impact of Guessing and Its Interactions With Other Test Characteristics on Confidence Interval Procedures for Coefficient Alpha

    PubMed Central

    Paek, Insu

    2015-01-01

    The effect of guessing on the point estimate of coefficient alpha has been studied in the literature, but the impact of guessing and its interactions with other test characteristics on the interval estimators for coefficient alpha has not been fully investigated. This study examined the impact of guessing and its interactions with other test characteristics on four confidence interval (CI) procedures for coefficient alpha in terms of coverage rate (CR), length, and the degree of asymmetry of CI estimates. In addition, interval estimates of coefficient alpha when data follow the essentially tau-equivalent condition were investigated as a supplement to the case of dichotomous data with examinee guessing. For dichotomous data with guessing, the results did not reveal salient negative effects of guessing and its interactions with other test characteristics (sample size, test length, coefficient alpha levels) on CR and the degree of asymmetry, but the effect of guessing was salient as a main effect and in interaction with sample size on the length of the CI estimates, producing longer CI estimates as guessing increases, especially when combined with a small sample size. Other important effects (e.g., CI procedures on CR) are also discussed. PMID:29795863

  13. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine if analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that of the 27 general chemistry analytes studied, 19 showed sufficiently small between-method biases as to not prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  14. Alternative Confidence Interval Methods Used in the Diagnostic Accuracy Studies

    PubMed Central

    Gülhan, Orekıcı Temel

    2016-01-01

    Background/Aim. It is necessary to decide whether newly developed methods are better than the standard or reference test. To decide whether a new diagnostic test is better than the gold standard test/imperfect standard test, the differences in estimated sensitivity/specificity are calculated from sample data. However, to generalize this value to the population, it should be reported with confidence intervals. The aim of this study is to evaluate, in a clinical application, the confidence interval methods developed for the difference between two dependent sensitivity/specificity values. Materials and Methods. Confidence interval methods including Asymptotic Intervals, Conditional Intervals, Unconditional Intervals, Score Intervals, and Nonparametric Methods Based on Relative Effects Intervals are used. As a clinical application, data from the diagnostic study by Dickel et al. (2010) were used as an example. Results. Results for the alternative confidence interval methods for Nickel Sulfate, Potassium Dichromate, and Lanolin Alcohol are presented in a table. Conclusion. When choosing a confidence interval method, researchers must consider whether the comparison involves a single proportion or a difference of dependent binary proportions, the correlation between the rates of the two dependent proportions, and the sample sizes. PMID:27478491

  15. Alternative Confidence Interval Methods Used in the Diagnostic Accuracy Studies.

    PubMed

    Erdoğan, Semra; Gülhan, Orekıcı Temel

    2016-01-01

    Background/Aim. It is necessary to decide whether newly developed methods are better than the standard or reference test. To decide whether a new diagnostic test is better than the gold standard test/imperfect standard test, the differences in estimated sensitivity/specificity are calculated from sample data. However, to generalize this value to the population, it should be reported with confidence intervals. The aim of this study is to evaluate, in a clinical application, the confidence interval methods developed for the difference between two dependent sensitivity/specificity values. Materials and Methods. Confidence interval methods including Asymptotic Intervals, Conditional Intervals, Unconditional Intervals, Score Intervals, and Nonparametric Methods Based on Relative Effects Intervals are used. As a clinical application, data from the diagnostic study by Dickel et al. (2010) were used as an example. Results. Results for the alternative confidence interval methods for Nickel Sulfate, Potassium Dichromate, and Lanolin Alcohol are presented in a table. Conclusion. When choosing a confidence interval method, researchers must consider whether the comparison involves a single proportion or a difference of dependent binary proportions, the correlation between the rates of the two dependent proportions, and the sample sizes.

  16. Estimating the duration of geologic intervals from a small number of age determinations: A challenge common to petrology and paleobiology

    NASA Astrophysics Data System (ADS)

    Glazner, Allen F.; Sadler, Peter M.

    2016-12-01

    The duration of a geologic interval, such as the time over which a given volume of magma accumulated to form a pluton, or the lifespan of a large igneous province, is commonly determined from a relatively small number of geochronologic determinations (e.g., 4-10) within that interval. Such sample sets can underestimate the true length of the interval by a significant amount. For example, the average interval determined from a sample of size n = 5, drawn from a uniform random distribution, will underestimate the true interval by one-third. Even for n = 10, the average sample only captures ~80% of the interval. If the underlying distribution is known, then a correction factor can be determined from theory or Monte Carlo analysis; for a uniform random distribution, this factor is (n + 1)/(n - 1). Systematic undersampling of interval lengths can have a large effect on calculated magma fluxes in plutonic systems. The problem is analogous to determining the duration of an extinct species from its fossil occurrences. Confidence interval statistics developed for species origination and extinction times are applicable to the onset and cessation of magmatic events.
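The undersampling factor quoted above follows from the expected range of n uniform draws, E[range] = (n - 1)/(n + 1), and is easy to verify by simulation. A short sketch, assuming dates drawn uniformly on a unit-length interval:

```python
import random

def mean_observed_fraction(n, trials=100_000, seed=7):
    """Monte Carlo estimate of the fraction of a unit interval spanned
    by the min-max range of n uniformly drawn age determinations."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        dates = [rng.random() for _ in range(n)]
        total += max(dates) - min(dates)
    return total / trials

frac5 = mean_observed_fraction(5)    # theory: (5 - 1)/(5 + 1) = 0.667
frac10 = mean_observed_fraction(10)  # theory: (10 - 1)/(10 + 1) = 0.818
```

Multiplying an observed range by (n + 1)/(n - 1) therefore gives an unbiased estimate of the true interval length under the uniform assumption.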

  17. A Comparison of Momentary Time Sampling and Partial-Interval Recording for Assessment of Effects of Social Skills Training

    ERIC Educational Resources Information Center

    Radley, Keith C.; O'Handley, Roderick D.; Labrot, Zachary C.

    2015-01-01

    Assessment in social skills training often utilizes procedures such as partial-interval recording (PIR) and momentary time sampling (MTS) to estimate changes in duration in social engagements due to intervention. Although previous research suggests PIR to be more inaccurate than MTS in estimating levels of behavior, treatment analysis decisions…

  18. What Is the Shape of Developmental Change?

    PubMed Central

    Adolph, Karen E.; Robinson, Scott R.; Young, Jesse W.; Gill-Alvarez, Felix

    2009-01-01

    Developmental trajectories provide the empirical foundation for theories about change processes during development. However, the ability to distinguish among alternative trajectories depends on how frequently observations are sampled. This study used real behavioral data, with real patterns of variability, to examine the effects of sampling at different intervals on characterization of the underlying trajectory. Data were derived from a set of 32 infant motor skills indexed daily during the first 18 months. Larger sampling intervals (2-31 days) were simulated by systematically removing observations from the daily data and interpolating over the gaps. Infrequent sampling caused decreasing sensitivity to fluctuations in the daily data: Variable trajectories erroneously appeared as step-functions and estimates of onset ages were increasingly off target. Sensitivity to variation decreased as an inverse power function of sampling interval, resulting in severe degradation of the trajectory with intervals longer than 7 days. These findings suggest that sampling rates typically used by developmental researchers may be inadequate to accurately depict patterns of variability and the shape of developmental change. Inadequate sampling regimes therefore may seriously compromise theories of development. PMID:18729590
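The removal-and-interpolation procedure described above can be sketched on synthetic data. This is a toy daily series with a skill onset and day-to-day fluctuation (not the authors' infant data), showing that the reconstruction error from sparser sampling grows with the sampling interval:

```python
import random

def daily_series(n_days=10_000, onset=500, p_absent=0.25, seed=3):
    """Toy daily skill record: never observed before onset, then present
    on most days, absent with probability p_absent (daily fluctuation)."""
    rng = random.Random(seed)
    return [0 if d < onset else int(rng.random() >= p_absent)
            for d in range(n_days)]

def resample(series, k):
    """Keep every k-th day and linearly interpolate the gaps, mimicking
    a study design that observes the behavior once every k days."""
    out = list(series)
    for i in range(0, len(series) - k, k):
        for j in range(1, k):
            out[i + j] = series[i] + (series[i + k] - series[i]) * j / k
    return out

def disagreement(series, k):
    """Mean absolute difference between the daily data and its k-day
    resampled reconstruction."""
    est = resample(series, k)
    return sum(abs(a - b) for a, b in zip(series, est)) / len(series)

s = daily_series()
errs = {k: disagreement(s, k) for k in (1, 7, 31)}
```

Daily sampling reproduces the record exactly, while weekly and monthly sampling smooth over the fluctuations, which is the degradation the study quantifies on real trajectories.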

  19. Magnetic Resonance Fingerprinting with short relaxation intervals.

    PubMed

    Amthor, Thomas; Doneva, Mariya; Koken, Peter; Sommer, Karsten; Meineke, Jakob; Börnert, Peter

    2017-09-01

    The aim of this study was to investigate a technique for improving the performance of Magnetic Resonance Fingerprinting (MRF) in repetitive sampling schemes, in particular for 3D MRF acquisition, by shortening relaxation intervals between MRF pulse train repetitions. A calculation method for MRF dictionaries adapted to short relaxation intervals and non-relaxed initial spin states is presented, based on the concept of stationary fingerprints. The method is applicable to many different k-space sampling schemes in 2D and 3D. For accuracy analysis, T1 and T2 values of a phantom are determined by single-slice Cartesian MRF for different relaxation intervals and are compared with quantitative reference measurements. The relevance of slice profile effects is also investigated in this case. To further illustrate the capabilities of the method, an application to in-vivo spiral 3D MRF measurements is demonstrated. The proposed computation method enables accurate parameter estimation even for the shortest relaxation intervals, as investigated for different sampling patterns in 2D and 3D. In 2D Cartesian measurements, we achieved a scan acceleration of more than a factor of two, while maintaining acceptable accuracy: The largest T1 values of a sample set deviated from their reference values by 0.3% (longest relaxation interval) and 2.4% (shortest relaxation interval). The largest T2 values showed systematic deviations of up to 10% for all relaxation intervals, which is discussed. The influence of slice profile effects for multislice acquisition is shown to become increasingly relevant for short relaxation intervals. In 3D spiral measurements, a scan time reduction of 36% was achieved, maintaining the quality of in-vivo T1 and T2 maps. Reducing the relaxation interval between MRF sequence repetitions using stationary fingerprint dictionaries is a feasible method to improve the scan efficiency of MRF sequences. The method enables fast implementations of 3D spatially resolved MRF. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Estimation of aquifer scale proportion using equal area grids: assessment of regional scale groundwater quality

    USGS Publications Warehouse

    Belitz, Kenneth; Jurgens, Bryant C.; Landon, Matthew K.; Fram, Miranda S.; Johnson, Tyler D.

    2010-01-01

    The proportion of an aquifer with constituent concentrations above a specified threshold (high concentrations) is taken as a nondimensional measure of regional scale water quality. If computed on the basis of area, it can be referred to as the aquifer scale proportion. A spatially unbiased estimate of aquifer scale proportion and a confidence interval for that estimate are obtained through the use of equal area grids and the binomial distribution. Traditionally, the confidence interval for a binomial proportion is computed using either the standard interval or the exact interval. Research from the statistics literature has shown that the standard interval should not be used and that the exact interval is overly conservative. On the basis of coverage probability and interval width, the Jeffreys interval is preferred. If more than one sample per cell is available, cell declustering is used to estimate the aquifer scale proportion, and Kish's design effect may be useful for estimating an effective number of samples. The binomial distribution is also used to quantify the adequacy of a grid with a given number of cells for identifying a small target, defined as a constituent that is present at high concentrations in a small proportion of the aquifer. Case studies illustrate a consistency between approaches that use one well per grid cell and many wells per cell. The methods presented in this paper provide a quantitative basis for designing a sampling program and for utilizing existing data.
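The Jeffreys interval preferred above is the central posterior interval for a binomial proportion under a Beta(1/2, 1/2) prior. A standard-library-only Monte Carlo sketch (an exact version would use a Beta quantile function such as scipy's beta.ppf; the 3-of-30 grid-cell counts are hypothetical):

```python
import random

def jeffreys_interval(x, n, conf=0.95, draws=100_000, seed=11):
    """Approximate Jeffreys interval for a binomial proportion: central
    quantiles of the Beta(x + 1/2, n - x + 1/2) posterior, estimated
    here by Monte Carlo sampling."""
    rng = random.Random(seed)
    samples = sorted(rng.betavariate(x + 0.5, n - x + 0.5)
                     for _ in range(draws))
    alpha = (1 - conf) / 2
    lo = samples[int(alpha * draws)]
    hi = samples[int((1 - alpha) * draws) - 1]
    return lo, hi

# E.g., high concentrations observed in 3 of 30 equal-area grid cells:
lo, hi = jeffreys_interval(3, 30)
```

This is the calculation behind the aquifer scale proportion estimate: one well per equal-area cell gives n Bernoulli trials, and the Jeffreys interval bounds the areal proportion of the aquifer with high concentrations.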

  1. A confidence interval analysis of sampling effort, sequencing depth, and taxonomic resolution of fungal community ecology in the era of high-throughput sequencing.

    PubMed

    Oono, Ryoko

    2017-01-01

    High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh those of greater sequencing depths per sample for accurate estimations of ecological inferences. However, excluding or not sequencing rare taxa may mislead the answers to the questions 'how and why are communities different?' This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies through a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sample size reliably decreased confidence intervals across multiple community comparisons. However, the effects of sequencing depths on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities and did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depths are important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that otherwise may be ignored when communities are only compared for statistically significant differences.

  2. A confidence interval analysis of sampling effort, sequencing depth, and taxonomic resolution of fungal community ecology in the era of high-throughput sequencing

    PubMed Central

    2017-01-01

    High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh those of greater sequencing depths per sample for accurate estimations of ecological inferences. However, excluding or not sequencing rare taxa may mislead the answers to the questions ‘how and why are communities different?’ This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies through a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sample size reliably decreased confidence intervals across multiple community comparisons. However, the effects of sequencing depths on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities and did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depths are important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that otherwise may be ignored when communities are only compared for statistically significant differences. PMID:29253889

  3. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    PubMed

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.
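    The expected-width planning logic can be sketched for the simplest case, a normal-theory confidence interval for a mean: solve the width formula for n. This is a simplified analog of the authors' accuracy-in-parameter-estimation approach, not their method for composite reliability coefficients.

```python
from math import ceil
from statistics import NormalDist

def n_for_ci_width(sd, target_width, conf=0.95):
    """Smallest n so a normal-theory CI for a mean has expected
    full width <= target_width (AIPE-style planning, simplified)."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    # full width = 2*z*sd/sqrt(n)  =>  n = (2*z*sd/target_width)**2
    return ceil((2 * z * sd / target_width) ** 2)
```

    For example, halving the target width quadruples the required sample size, which is why narrow intervals for reliability coefficients can demand large studies.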

  4. Extended Task Space Control for Robotic Manipulators

    NASA Technical Reports Server (NTRS)

    Backes, Paul G. (Inventor); Long, Mark K. (Inventor)

    1996-01-01

    The invention is a method of operating a robot in successive sampling intervals to perform a task, the robot having joints and joint actuators with actuator control loops, by decomposing the task into behavior forces, accelerations, velocities, and positions of plural behaviors to be exhibited by the robot simultaneously, computing actuator accelerations of the joint actuators for the current sampling interval from both the behavior forces, accelerations, velocities, and positions of the current sampling interval and the actuator velocities and positions of the previous sampling interval, computing actuator velocities and positions of the joint actuators for the current sampling interval from the actuator velocities and positions of the previous sampling interval, and, finally, controlling the actuators in accordance with the actuator accelerations, velocities, and positions of the current sampling interval. The actuator accelerations, velocities, and positions of the current sampling interval are stored for use during the next sampling interval.
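    The per-interval update amounts to discrete integration of commanded accelerations into the next interval's velocities and positions. A loose sketch (simple forward-Euler integration; the patent's actual control law is more involved):

```python
def step(a_cmd, v_prev, p_prev, dt):
    """One sampling interval for one joint: integrate the commanded
    acceleration into velocity and position (forward Euler)."""
    v = v_prev + a_cmd * dt   # velocity for the current interval
    p = p_prev + v * dt       # position for the current interval
    return v, p               # stored for use in the next interval
```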

  5. Exact intervals and tests for median when one sample value possibly an outlier

    NASA Technical Reports Server (NTRS)

    Keller, G. J.; Walsh, J. E.

    1973-01-01

    Available are independent observations (continuous data) that are believed to be a random sample. Desired are distribution-free confidence intervals and significance tests for the population median. However, there is the possibility that either the smallest or the largest observation is an outlier. Then, use of a procedure for rejection of an outlying observation might seem appropriate. Such a procedure would consider that two alternative situations are possible and would select one of them. Either (1) the n observations are truly a random sample, or (2) an outlier exists and its removal leaves a random sample of size n-1. For either situation, confidence intervals and tests are desired for the median of the population yielding the random sample. Unfortunately, satisfactory rejection procedures of a distribution-free nature do not seem to be available. Moreover, all rejection procedures impose undesirable conditional effects on the observations and can select the wrong one of the two situations. It is found that two-sided intervals and tests based on two symmetrically located order statistics (not the largest and smallest) of the n observations are valid for either of the two situations.
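    The exact coverage of an interval formed from symmetrically located order statistics follows from the binomial distribution: each observation falls below the median with probability 1/2. A small sketch of the standard sign-test construction (assumed here as the generic version of such intervals):

```python
from math import comb

def order_stat_ci_coverage(n, k):
    """Exact confidence level of [x_(k), x_(n-k+1)] as a
    distribution-free CI for the median (continuous data)."""
    # P(fewer than k observations fall on one side of the median)
    tail = sum(comb(n, i) for i in range(k)) / 2 ** n
    return 1 - 2 * tail

def choose_k(n, conf=0.95):
    """Innermost symmetric order-statistic pair with coverage >= conf."""
    k = 1
    while k + 1 <= n // 2 and order_stat_ci_coverage(n, k + 1) >= conf:
        k += 1
    return k
```

    For n = 10 this picks the interval [x_(2), x_(9)], whose exact coverage is about 97.9%; note neither endpoint is the extreme value, matching the paper's point.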

  6. Programmable noise bandwidth reduction by means of digital averaging

    NASA Technical Reports Server (NTRS)

    Poklemba, John J. (Inventor)

    1993-01-01

    Predetection noise bandwidth reduction is effected by a pre-averager capable of digitally averaging the samples of an input data signal over two or more symbols, the averaging interval being defined by the input sampling rate divided by the output sampling rate. As the averaged sample is clocked to a suitable detector at a much slower rate than the input signal sampling rate, the noise bandwidth at the input to the detector is reduced, the input to the detector having an improved signal-to-noise ratio as a result of the averaging process, and the rate at which subsequent processing must operate is correspondingly reduced. The pre-averager forms a data filter having an output sampling rate of one sample per symbol of received data. More specifically, selected ones of a plurality of samples accumulated over two or more symbol intervals are output in response to clock signals at a rate of one sample per symbol interval. The pre-averager includes circuitry for weighting digitized signal samples using stored finite impulse response (FIR) filter coefficients. A method according to the present invention is also disclosed.
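    The noise-variance benefit can be checked numerically: block-averaging N white-noise samples per symbol cuts the noise variance by roughly N, i.e., an SNR gain of about 10*log10(N) dB. A hedged sketch using a plain boxcar average (ignoring the FIR weighting the patent describes):

```python
import random
from statistics import mean, pvariance

def pre_average(samples, n_per_symbol):
    """Block-average n_per_symbol input samples into one output
    sample per symbol (simple boxcar pre-averager)."""
    return [mean(samples[i:i + n_per_symbol])
            for i in range(0, len(samples), n_per_symbol)]

rng = random.Random(0)
N = 16                                   # input samples per symbol
noise = [rng.gauss(0, 1) for _ in range(N * 500)]
avg = pre_average(noise, N)
# variance at the detector input drops by roughly a factor of N
```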

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurley, D.F.; Whitehouse, J.M.

    A dedicated low-flow groundwater sample collection system was designed for implementation in a post-closure ACL monitoring program at the Yaworski Lagoon NPL site in Canterbury, Connecticut. The system includes dedicated bladder pumps with intake ports located in the screened interval of the monitoring wells. This sampling technique was implemented in the spring of 1993. The system was designed to simultaneously obtain samples directly from the screened interval of nested wells in three distinct water bearing zones. Sample collection is begun upon stabilization of field parameters. Other than line volume, no prior purging of the well is required. It was found that dedicated low-flow sampling from the screened interval provides a method of representative sample collection without the bias of suspended solids introduced by traditional techniques of pumping and bailing. Analytical data indicate that measured chemical constituents are representative of groundwater migrating through the screened interval. Upon implementation of the low-flow monitoring system, analytical results exhibited a decrease in concentrations of some organic compounds and metals. The system has also proven to be a cost effective alternative to pumping and bailing, which generate large volumes of purge water requiring containment and disposal.

  8. Repeat sample intraocular pressure variance in induced and naturally ocular hypertensive monkeys.

    PubMed

    Dawson, William W; Dawson, Judyth C; Hope, George M; Brooks, Dennis E; Percicot, Christine L

    2005-12-01

    To compare the repeat-sample mean variance of laser induced ocular hypertension (OH) in rhesus monkeys with the repeat-sample mean variance of natural OH in age-range matched monkeys of similar and dissimilar pedigrees. Multiple monocular, retrospective, intraocular pressure (IOP) measures were recorded repeatedly during a short sampling interval (SSI, 1-5 months) and a long sampling interval (LSI, 6-36 months). There were 5-13 eyes in each SSI and LSI subgroup. Each interval contained subgroups of Florida monkeys with natural hypertension (NHT), Florida monkeys with induced hypertension (IHT1), unrelated induced hypertensives from Strasbourg, France (IHT2), and age-range matched Florida controls (C). Repeat-sample individual variance means and related IOPs were analyzed by a parametric analysis of variance (ANOV) and the results compared to a nonparametric Kruskal-Wallis ANOV. As designed, all group intraocular pressure distributions were significantly different (P < or = 0.009) except for the two (Florida/Strasbourg) induced OH groups. A parametric 2 x 4 design ANOV for mean variance showed large significant effects due to treatment group and sampling interval. Similar results were produced by the nonparametric ANOV. Induced OH sample variance (LSI) was 43 times the natural OH sample variance mean; the same ratio for the SSI was 12 times. Laser induced ocular hypertension in rhesus monkeys produces much larger repeat-sample IOP variance means than controls and natural OH.

  9. The effects of morphine on fixed-interval patterning and temporal discrimination.

    PubMed Central

    Odum, A L; Schaal, D W

    2000-01-01

    Changes produced by drugs in response patterns under fixed-interval schedules of reinforcement have been interpreted to result from changes in temporal discrimination. To examine this possibility, this experiment determined the effects of morphine on the response patterning of 4 pigeons during a fixed-interval 1-min schedule of food delivery with interpolated temporal discrimination trials. Twenty of the 50 total intervals were interrupted by choice trials. Pecks to one key color produced food if the interval was interrupted after a short time (after 2 or 4.64 s). Pecks to another key color produced food if the interval was interrupted after a long time (after 24.99 or 58 s). Morphine (1.0 to 10.0 mg/kg) decreased the index of curvature (a measure of response patterning) during fixed intervals and accuracy during temporal discrimination trials. Accuracy was equally disrupted following short and long sample durations. Although morphine disrupted temporal discrimination in the context of a fixed-interval schedule, these effects are inconsistent with interpretations of the disruption of response patterning as a selective overestimation of elapsed time. The effects of morphine may be related to the effects of more conventional external stimuli on response patterning. PMID:11029024

  10. Effect on perfusion values of sampling interval of CT perfusion acquisitions in neuroendocrine liver metastases and normal liver

    PubMed Central

    Ng, Chaan S.; Hobbs, Brian P.; Wei, Wei; Anderson, Ella F.; Herron, Delise H.; Yao, James C.; Chandler, Adam G.

    2014-01-01

    Objective To assess the effects of sampling interval (SI) of CT perfusion acquisitions on CT perfusion values in normal liver and liver metastases from neuroendocrine tumors. Methods CT perfusion studies in 16 patients with neuroendocrine liver metastases were analyzed by distributed parameter modeling to yield tissue blood flow, blood volume, mean transit time, permeability, and hepatic arterial fraction, for tumor and normal liver. CT perfusion values for the reference sampling interval of 0.5s (SI0.5) were compared with those of SI datasets of 1s, 2s, 3s and 4s, using mixed-effects model analyses. Results Increases in SI beyond 1s were associated with significant and increasing departures of CT perfusion parameters from reference values at SI0.5 (p≤0.0009). CT perfusion values deviated from reference with increasing uncertainty with increasing SIs. Findings for normal liver were concordant. Conclusion Increasing SIs beyond 1s yield significantly different CT perfusion parameter values compared to reference values at SI0.5. PMID:25626401

  11. The Effect of Number of Ability Intervals on the Stability of Item Bias Detection.

    ERIC Educational Resources Information Center

    Loyd, Brenda

    The chi-square procedure has been suggested as a viable index of test bias because it provides the best agreement with the three parameter item characteristic curve without the large sample requirement, computer complexity, and cost. This study examines the effect of using different numbers of ability intervals on the reliability of chi-square…

  12. Salmonella enteritidis surveillance by egg immunology: impact of the sampling scheme on the release of contaminated table eggs.

    PubMed

    Klinkenberg, Don; Thomas, Ekelijn; Artavia, Francisco F Calvo; Bouma, Annemarie

    2011-08-01

    Design of surveillance programs to detect infections could benefit from more insight into sampling schemes. We address the effect of sampling schemes for Salmonella Enteritidis surveillance in laying hens. Based on experimental estimates for the transmission rate in flocks, and the characteristics of an egg immunological test, we have simulated outbreaks with various sampling schemes, and with the current boot swab program with a 15-week sampling interval. Declaring a flock infected based on a single positive egg was not possible because test specificity was too low. Thus, a threshold number of positive eggs was defined to declare a flock infected, and, for small sample sizes, eggs from previous samplings had to be included in a cumulative sample to guarantee a minimum flock level specificity. Effectiveness of surveillance was measured by the proportion of outbreaks detected, and by the number of contaminated table eggs brought on the market. The boot swab program detected 90% of the outbreaks, with 75% fewer contaminated eggs compared to no surveillance, whereas the baseline egg program (30 eggs every 15 weeks) detected 86%, with 73% fewer contaminated eggs. We conclude that a larger sample size results in more detected outbreaks, whereas a smaller sampling interval decreases the number of contaminated eggs. Decreasing sample size and interval simultaneously reduces the number of contaminated eggs, but not indefinitely: the advantage of more frequent sampling is counterbalanced by the cumulative sample including less recently laid eggs. Apparently, optimizing surveillance has its limits when test specificity is taken into account. © 2011 Society for Risk Analysis.
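    The threshold rule can be made concrete with a binomial calculation: given an egg-level test specificity, the probability that an uninfected flock stays below the positives threshold is a binomial tail, and the threshold can be raised until the flock-level specificity meets a target. The numbers below are illustrative, not the paper's estimates.

```python
from math import comb

def flock_specificity(n_eggs, threshold, egg_specificity):
    """P(an uninfected flock is NOT declared infected): fewer than
    `threshold` of n_eggs test falsely positive (binomial model)."""
    p_fp = 1 - egg_specificity
    return sum(comb(n_eggs, k) * p_fp**k * (1 - p_fp)**(n_eggs - k)
               for k in range(threshold))

def min_threshold(n_eggs, egg_specificity, target=0.99):
    """Smallest positives threshold giving flock specificity >= target."""
    for k in range(1, n_eggs + 1):
        if flock_specificity(n_eggs, k, egg_specificity) >= target:
            return k
    return n_eggs + 1
```

    With 30 eggs and a hypothetical 95% egg-level specificity, a single positive egg gives flock specificity of only about 0.21, which is why a single positive cannot trigger a declaration; a threshold of several positives is required.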

  13. Outcome-Dependent Sampling with Interval-Censored Failure Time Data

    PubMed Central

    Zhou, Qingning; Cai, Jianwen; Zhou, Haibo

    2017-01-01

    Summary Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computation difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method works well for practical situations and is more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664

  14. Seasonal variation in size-dependent survival of juvenile Atlantic salmon (Salmo salar): Performance of multistate capture-mark-recapture models

    USGS Publications Warehouse

    Letcher, B.H.; Horton, G.E.

    2008-01-01

    We estimated the magnitude and shape of size-dependent survival (SDS) across multiple sampling intervals for two cohorts of stream-dwelling Atlantic salmon (Salmo salar) juveniles using multistate capture-mark-recapture (CMR) models. Simulations designed to test the effectiveness of multistate models for detecting SDS in our system indicated that error in SDS estimates was low and that both time-invariant and time-varying SDS could be detected with sample sizes of >250, average survival of >0.6, and average probability of capture of >0.6, except for cases of very strong SDS. In the field (N ≈ 750, survival 0.6-0.8 among sampling intervals, probability of capture 0.6-0.8 among sampling occasions), about one-third of the sampling intervals showed evidence of SDS, with poorer survival of larger fish during the age-2+ autumn and quadratic survival (opposite direction between cohorts) during age-1+ spring. The varying magnitude and shape of SDS among sampling intervals suggest a potential mechanism for the maintenance of the very wide observed size distributions. Estimating SDS using multistate CMR models appears complementary to established approaches, can provide estimates with low error, and can be used to detect intermittent SDS. © 2008 NRC Canada.

  15. Confidence Interval Coverage for Cohen's Effect Size Statistic

    ERIC Educational Resources Information Center

    Algina, James; Keselman, H. J.; Penfield, Randall D.

    2006-01-01

    Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and biased-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…

  16. Normal reference intervals and the effects of time and feeding on serum bile acid concentrations in llamas.

    PubMed

    Andreasen, C B; Pearson, E G; Smith, B B; Gerros, T C; Lassen, E D

    1998-04-01

    Fifty clinically healthy llamas, 0.5-13 years of age (22 intact males, 10 neutered males, 18 females), with no biochemical evidence of liver disease or hematologic abnormalities, were selected to establish serum bile acid reference intervals. Serum samples submitted to the clinical pathology laboratory were analyzed using a colorimetric enzymatic assay to establish bile acid reference intervals. A nonparametric distribution of llama bile acid concentrations was 1-23 micromol/liter for llamas >1 year of age and 10-44 micromol/liter for llamas < or = 1 year of age. A significant difference was found between these 2 age groups. No correlation was detected between gender and bile acid concentrations. The reference intervals were 1.1-22.9 micromol/liter for llamas >1 year of age and 1.8-49.8 micromol/liter for llamas < or = 1 year of age. Additionally, a separate group of 10 healthy adult llamas (5 males, 5 females, 5-11 years of age) without biochemical or hematologic abnormalities was selected to assess the effects of feeding and time intervals on serum bile acid concentrations. These 10 llamas were provided fresh water and hay ad libitum, and serum samples were obtained via an indwelling jugular catheter hourly for 11 hours. Llamas were then kept from food overnight (12 hours), and subsequent samples were taken prior to feeding (fasting baseline time, 23 hours after trial initiation) and postprandially at 0.5, 1, 2, 4, and 8 hours. In feeding trials, there was no consistent interaction between bile acid concentrations and time, feeding, or 12-hour fasting. Prior feeding or time of day did not result in serum bile acid concentrations outside the reference interval, but concentrations from individual llamas varied within this interval over time.

  17. Efficiently estimating salmon escapement uncertainty using systematically sampled data

    USGS Publications Warehouse

    Reynolds, Joel H.; Woody, Carol Ann; Gove, Nancy E.; Fair, Lowell F.

    2007-01-01

    Fish escapement is generally monitored using nonreplicated systematic sampling designs (e.g., via visual counts from towers or hydroacoustic counts). These sampling designs support a variety of methods for estimating the variance of the total escapement. Unfortunately, all the methods give biased results, with the magnitude of the bias being determined by the underlying process patterns. Fish escapement commonly exhibits positive autocorrelation and nonlinear patterns, such as diurnal and seasonal patterns. For these patterns, poor choice of variance estimator can needlessly increase the uncertainty managers have to deal with in sustaining fish populations. We illustrate the effect of sampling design and variance estimator choice on variance estimates of total escapement for anadromous salmonids from systematic samples of fish passage. Using simulated tower counts of sockeye salmon Oncorhynchus nerka escapement on the Kvichak River, Alaska, five variance estimators for nonreplicated systematic samples were compared to determine the least biased. Using the least biased variance estimator, four confidence interval estimators were compared for expected coverage and mean interval width. Finally, five systematic sampling designs were compared to determine the design giving the smallest average variance estimate for total annual escapement. For nonreplicated systematic samples of fish escapement, all variance estimators were positively biased. Compared to the other estimators, the least biased estimator reduced bias by, on average, from 12% to 98%. All confidence intervals gave effectively identical results. Replicated systematic sampling designs consistently provided the smallest average estimated variance among those compared.
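    The bias issue can be illustrated with two textbook variance estimators applied to a systematic sample of a smooth (hence positively autocorrelated) passage pattern: the naive formula that treats the sample as a simple random sample, and a successive-difference estimator, one standard choice for nonreplicated systematic designs. The linear "escapement" series below is purely illustrative, not the Kvichak River data.

```python
from statistics import variance

def var_srs(sample, N):
    """Variance of the estimated total N*ybar using the SRS formula
    (no finite-population correction) -- badly biased for trends."""
    n = len(sample)
    return N**2 * variance(sample) / n

def var_succdiff(sample, N):
    """Successive-difference variance estimator for a systematic
    sample: uses local differences, so a smooth trend inflates it far
    less than the SRS formula."""
    n = len(sample)
    sd2 = sum((sample[i + 1] - sample[i])**2 for i in range(n - 1))
    return N**2 * sd2 / (2 * (n - 1) * n)

# smooth (perfectly trending) hourly passage, sampled every 10th hour
pop = [float(t) for t in range(240)]
sample = pop[::10]          # n = 24 systematic sample
```

    For this trending series the SRS formula gives a variance estimate two orders of magnitude larger than the successive-difference estimator, which is the kind of needless extra uncertainty the abstract warns about.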

  18. Confidence intervals for the population mean tailored to small sample sizes, with applications to survey sampling.

    PubMed

    Rosenblum, Michael A; Laan, Mark J van der

    2009-01-07

    The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
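    A tail-bound interval of the kind described can be sketched with Hoeffding's inequality (a simpler cousin of Bernstein's) for data bounded in [0, 1]: the interval has guaranteed coverage at every sample size but is visibly wider than the CLT-based interval. This is an illustrative analog, not the authors' construction.

```python
from math import log, sqrt
from statistics import NormalDist, mean, stdev

def hoeffding_ci(data, alpha=0.05):
    """Finite-sample CI for the mean of [0,1]-bounded data via
    Hoeffding's inequality: valid at every n, but conservative."""
    n = len(data)
    half = sqrt(log(2 / alpha) / (2 * n))
    ybar = mean(data)
    return max(0.0, ybar - half), min(1.0, ybar + half)

def normal_ci(data, alpha=0.05):
    """Standard CLT-based CI: narrower, but coverage can fail
    when n is small."""
    n = len(data)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    half = z * stdev(data) / sqrt(n)
    return mean(data) - half, mean(data) + half
```

    On ten binary observations the Hoeffding interval is wider than the normal interval, illustrating the robustness-versus-width trade-off the abstract describes.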

  19. Effect of Periodic Burning on Soil Nitrogen Concentrations in Ponderosa Pine

    Treesearch

    W. W. Covington; S. S. Sackett

    1986-01-01

    To determine the effects of different burning intervals on soil N status in substands of sapling-, pole-, and sawtimber-sized ponderosa pine (Pinus ponderosa Laws.) we sampled plots burned at 1-, 2-, and 4-yr intervals by three strata at two depths (0-5 and 5-15 cm). Generally, NH4+ and NO3- concentrations were higher on plots repeatedly burned than on unburned...

  20. Cleaning frequency and the microbial load in ice-cream.

    PubMed

    Holm, Sonya; Toma, Ramses B; Reiboldt, Wendy; Newcomer, Chris; Calicchia, Melissa

    2002-07-01

    This study investigates the efficacy of a 62 h cleaning frequency in the manufacturing of ice-cream. Various product and product contact surfaces were sampled progressively throughout the time period between cleaning cycles, and analyzed for microbial growth. The coliform and standard plate counts (SPC) of these samples did not vary significantly over time after 0, 24, 48, or 62 h from Cleaning in Place (CiP). Data for product contact surfaces were significant for the SPC representing sample locations. Some of the variables in cleaning practices had significant influence on microbial loads. An increase in the number of flavors manufactured caused a decrease in SPC within the 24 h interval, but by the 48 h interval the SPC increased. More washouts within the first 24 h interval were favorable, as indicated by decreased SPC. The more frequently the liquefier was sanitized within the 62 h interval, the lower the SPC. This study indicates that food safety was not compromised and safety practices were effectively implemented throughout the process.

  1. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results. Part II; Cloud Coverage

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series is further and further away from the "truth" as the sampling interval becomes larger and larger. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data is subsampled at these frequencies, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
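    The effect of sampling interval on the estimated mean can be mimicked by subsampling a synthetic series with a 24 h cycle at several intervals and examining the spread of the phase-shifted subsample means: intervals that resonate with the diurnal period inflate the uncertainty, echoing the 8, 12, and 24 h features noted above. A toy sketch (signal shape assumed, not GEOS-5 output):

```python
from math import pi, sin
from statistics import mean, pstdev

# hourly "truth": a diurnal cloud-fraction-like cycle plus a slow drift
truth = [0.5 + 0.2 * sin(2 * pi * t / 24) + 0.001 * t for t in range(240)]

def subsample_means(series, interval):
    """Mean of each phase-shifted subsample taken every `interval` steps."""
    return [mean(series[offset::interval]) for offset in range(interval)]

for k in (2, 6, 12, 24):
    spread = pstdev(subsample_means(truth, k))
    # spread across phases = sampling-induced uncertainty in the mean;
    # it jumps when the interval matches the 24 h period
```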

  2. Exponential synchronization of neural networks with discrete and distributed delays under time-varying sampling.

    PubMed

    Wu, Zheng-Guang; Shi, Peng; Su, Hongye; Chu, Jian

    2012-09-01

    This paper investigates the problem of master-slave synchronization for neural networks with discrete and distributed delays under variable sampling with a known upper bound on the sampling intervals. An improved method is proposed, which captures the characteristic of sampled-data systems. Some delay-dependent criteria are derived to ensure the exponential stability of the error systems, and thus the master systems synchronize with the slave systems. The desired sampled-data controller can be obtained by solving a set of linear matrix inequalities, which depend upon the maximum sampling interval and the decay rate. The obtained conditions are not only less conservative but also involve fewer decision variables than existing results. Simulation results are given to show the effectiveness and benefits of the proposed methods.

  3. The Effects of Acute High-Intensity Interval Training on Hematological Parameters in Sedentary Subjects.

    PubMed

    Belviranli, Muaz; Okudan, Nilsel; Kabak, Banu

    2017-07-19

    The objective of the study was to determine the effects of acute high-intensity interval training (HIIT) on hematological parameters in sedentary men. Ten healthy, non-smoker, and sedentary men aged between 18 and 24 years participated in the study. All subjects performed four Wingate tests with 4 min intervals between the tests. Blood samples were collected at pre-exercise, immediately after, 3 and 6 h after the fourth Wingate test. Hematological parameters were analyzed in these samples. The results showed that hematocrit percentage, hemoglobin values, red cell count, mean cell volume, platelet count, total white cell count, and counts of the white cell subgroups increased immediately after the acute HIIT and their values began to return to resting levels 3 h after exercise, and completely returned to resting levels 6 h after exercise. In conclusion, acute HIIT causes an inflammatory response in blood.

  4. Research on the principle and experimentation of optical compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin

    2013-12-01

    The optical compressive spectral imaging method is a novel spectral imaging technique inspired by compressed sensing, offering advantages such as a reduced volume of acquired data, snapshot imaging, and an increased signal-to-noise ratio. Because sampling quality influences the ultimate imaging quality, previously reported systems matched the sampling interval to the modulation interval, but the reduced sampling rate sacrifices some of the original spectral resolution. To overcome this defect, the requirement that the sampling interval match the modulation interval is dropped, and the number of spectral channels of the designed experimental device increases more than threefold compared with the previous method. An imaging experiment is carried out with the experimental setup, and the spectral data cube of the imaged target is reconstructed from the acquired compressed image using two-step iterative shrinkage/thresholding algorithms. The experimental results indicate that the number of spectral channels increases effectively and the reconstructed data remain high-fidelity. The images and spectral curves accurately reflect the spatial and spectral character of the target.

  5. Confidence Intervals for Squared Semipartial Correlation Coefficients: The Effect of Nonnormality

    ERIC Educational Resources Information Center

    Algina, James; Keselman, H. J.; Penfield, Randall D.

    2010-01-01

    The increase in the squared multiple correlation coefficient ([delta]R[superscript 2]) associated with a variable in a regression equation is a commonly used measure of importance in regression analysis. Algina, Keselman, and Penfield found that intervals based on asymptotic principles were typically very inaccurate, even though the sample size…

  6. Estimating Standardized Linear Contrasts of Means with Desired Precision

    ERIC Educational Resources Information Center

    Bonett, Douglas G.

    2009-01-01

    L. Wilkinson and the Task Force on Statistical Inference (1999) recommended reporting confidence intervals for measures of effect sizes. If the sample size is too small, the confidence interval may be too wide to provide meaningful information. Recently, K. Kelley and J. R. Rausch (2006) used an iterative approach to computer-generate tables of…

  7. Lone star tick abundance, fire, and bison grazing in tall-grass prairie

    USGS Publications Warehouse

    Cully, J.F.

    1999-01-01

    Lone star ticks (Amblyomma americanum L.) were collected by drag samples of 1 km transects on 12 watersheds at Konza Prairie Research Natural Area near Manhattan, Kans., during summer 1995-1996. Watersheds received 2 experimental treatments: 3 burn intervals (1-year, 4-year, and 20-year) and 2 grazing treatments (grazed by bison (Bos bison L.) or ungrazed). The objectives were to determine whether fire interval, time since the most recent burn, and the presence of large ungulate grazers cause changes in lone star tick abundance in tallgrass prairie in central Kansas. Watersheds burned at 1-year intervals had fewer larvae and adults than watersheds burned at 4-year or 20-year intervals. Watersheds burned during the year of sampling had fewer ticks than watersheds burned one or more years in the past; among the latter, there was no effect of time since burn. The presence of bison did not affect tick abundance. Spring burning is an effective method to reduce tick populations in tallgrass prairie during the year of the burn.

  8. Effects of High-Intensity Interval Training on Aerobic Capacity in Cardiac Patients: A Systematic Review with Meta-Analysis

    PubMed Central

    Xie, Bin; Yan, Xianfeng

    2017-01-01

    Purpose. The aim of this study was to compare the effects of high-intensity interval training (INTERVAL) and moderate-intensity continuous training (CONTINUOUS) on aerobic capacity in cardiac patients. Methods. A meta-analysis identified by searching the PubMed, Cochrane Library, EMBASE, and Web of Science databases from inception through December 2016 compared the effects of INTERVAL and CONTINUOUS among cardiac patients. Results. Twenty-one studies involving 736 participants with cardiac diseases were included. Compared with CONTINUOUS, INTERVAL was associated with greater improvement in peak VO2 (mean difference 1.76 mL/kg/min, 95% confidence interval 1.06 to 2.46 mL/kg/min, p < 0.001) and VO2 at AT (mean difference 0.90 mL/kg/min, 95% confidence interval 0.0 to 1.72 mL/kg/min, p = 0.03). No significant difference between the INTERVAL and CONTINUOUS groups was observed in terms of peak heart rate, peak minute ventilation, VE/VCO2 slope and respiratory exchange ratio, body mass, systolic or diastolic blood pressure, triglyceride or low- or high-density lipoprotein cholesterol level, flow-mediated dilation, or left ventricular ejection fraction. Conclusions. This study showed that INTERVAL improves aerobic capacity more effectively than does CONTINUOUS in cardiac patients. Further studies with larger samples are needed to confirm our observations. PMID:28386556
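    The pooled mean differences quoted above come from inverse-variance weighting of per-study effects. As a minimal sketch of the fixed-effect version (the review itself may have used a random-effects model; the function name and the 1.96 normal quantile are illustrative assumptions):

    ```python
    import math

    def fixed_effect_meta(diffs, ses):
        """Inverse-variance fixed-effect pooling of per-study mean
        differences; returns the pooled estimate and its 95% CI."""
        weights = [1.0 / se ** 2 for se in ses]          # weight = 1 / variance
        pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    ```

    For example, two trials with mean differences of 1.5 and 2.0 mL/kg/min and equal standard errors pool to 1.75 mL/kg/min, midway between the two.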

  9. Assessing total fungal concentrations on commercial passenger aircraft using mixed-effects modeling.

    PubMed

    McKernan, Lauralynn Taylor; Hein, Misty J; Wallingford, Kenneth M; Burge, Harriet; Herrick, Robert

    2008-01-01

    The primary objective of this study was to compare airborne fungal concentrations onboard commercial passenger aircraft at various in-flight times with concentrations measured inside and outside airport terminals. A secondary objective was to investigate the use of mixed-effects modeling of repeat measures from multiple sampling intervals and locations. Sequential triplicate culturable and total spore samples were collected on wide-body commercial passenger aircraft (n = 12) in the front and rear of coach class during six sampling intervals: boarding, midclimb, early cruise, midcruise, late cruise, and deplaning. Comparison samples were collected inside and outside airport terminals at the origin and destination cities. The MIXED procedure in SAS was used to model the mean and the covariance matrix of the natural log transformed fungal concentrations. Five covariance structures were tested to determine the appropriate models for analysis. Fixed effects considered included the sampling interval and, for samples obtained onboard the aircraft, location (front/rear of coach section), occupancy rate, and carbon dioxide concentrations. Overall, both total culturable and total spore fungal concentrations were low while the aircraft were in flight. No statistical difference was observed between measurements made in the front and rear sections of the coach cabin for either culturable or total spore concentrations. Both culturable and total spore concentrations were significantly higher outside the airport terminal compared with inside the airport terminal (p-value < 0.0001) and inside the aircraft (p-value < 0.0001). On the aircraft, the majority of total fungal exposure occurred during the boarding and deplaning processes, when the aircraft utilized ancillary ventilation and passenger activity was at its peak.

  10. Effects of season and stage of gestation on luteinizing hormone release in gilts.

    PubMed Central

    Smith, C A; Almond, G W

    1991-01-01

    This study was designed to examine the effects of two seasons and stage of gestation on luteinizing hormone (LH) release in the gilt. Eleven Yorkshire-Landrace crossbred gilts were each fitted with an indwelling vena caval cannula. Blood samples were collected at 6 h intervals for six days during early (day 39 to 44) or mid-gestation (day 69 to 74), and serum progesterone, estradiol-17 beta, and LH concentrations were determined in these samples. Early and mid-gestation occurred during August and September in group 1 (n = 6) and during January and February in group 2 gilts (n = 5). To characterize pulsatile LH release, samples were collected at 15 min intervals for 8 h on days 40, 43, 70 and 73 of gestation. Following each 8 h sampling period, gilts were treated intravenously with 0.5 µg gonadotropin-releasing hormone (GnRH)/kg body weight and blood was collected at 10 min intervals for 3 h. Progesterone concentrations decreased (p < 0.01) from 22.1 ± 0.4 ng/mL during early gestation to 18.2 ± 0.4 ng/mL during mid-gestation. Estradiol-17 beta concentrations increased (p < 0.01) from early to mid-gestation (13.5 ± 0.8 versus 28.4 ± 0.7 pg/mL). Frequency of LH pulses and LH pulse amplitude were higher (p < 0.05) in pregnant gilts during January and February compared to August and September.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1889040

  11. Sex preference and third birth intervals in a traditional Indian society.

    PubMed

    Nath, D C; Land, K C

    1994-07-01

    The traditional preference for sons may be the main hindrance to India's current population policy of two children per family. In this study, the effects of various sociodemographic covariates, particularly sex preference, on the length of the third birth interval are examined for the scheduled caste population in Assam, India. Life table and hazards regression techniques are applied to retrospective sample data. The analysis shows that couples having two surviving sons are less likely to have a third child than those without a surviving son and those with only one surviving son. Age at first marriage, length of preceding birth intervals, age of mother, and household income have strong effects on the length of the third birth interval.

  12. Effect of the time interval from harvesting to the pre-drying step on natural fumonisin contamination in freshly harvested corn from the State of Parana, Brazil.

    PubMed

    Da Silva, M; Garcia, G T; Vizoni, E; Kawamura, O; Hirooka, E Y; Ono, E Y S

    2008-05-01

    Natural mycoflora and fumonisins were analysed in 490 samples of freshly harvested corn (Zea mays L.) (2003 and 2004 crops) collected at three points in the producing chain from the northern region of Parana State, Brazil, and correlated with the time interval between harvesting and the pre-drying step. The two crops showed a similar profile of fungal frequency, and Fusarium sp. was the prevalent genus (100%) at the sampling sites of both crops. Fumonisins were detected in all samples from the three points of the producing chain (2003 and 2004 crops). The levels ranged from 0.11 to 15.32 µg g(-1) in field samples, from 0.16 to 15.90 µg g(-1) in reception samples, and from 0.02 to 18.78 µg g(-1) in pre-drying samples (2003 crop). Samples from the 2004 crop showed lower contamination, with fumonisin levels ranging from 0.07 to 4.78 µg g(-1) in field samples, from 0.03 to 4.09 µg g(-1) in reception samples, and from 0.11 to 11.21 µg g(-1) in pre-drying samples. The mean fumonisin level increased gradually from ≤ 5.0 to 19.0 µg g(-1) as the time interval between harvesting and the pre-drying step increased from 3.22 to 8.89 h (2003 crop). The same profile was observed for samples from the 2004 crop. Fumonisin levels and the time interval were positively correlated (rho = 0.96, p ≤ 0.05), indicating that delay in the drying process can increase fumonisin levels.
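    The rho = 0.96 above is a Spearman rank correlation. A minimal sketch of the statistic for untied data (the helper names are illustrative; real data with tied ranks needs the tie-corrected form):

    ```python
    def spearman_rho(x, y):
        """Spearman rank correlation for untied data:
        rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
        def ranks(values):
            order = sorted(range(len(values)), key=lambda i: values[i])
            r = [0] * len(values)
            for rank, i in enumerate(order, start=1):
                r[i] = rank
            return r

        rx, ry = ranks(x), ranks(y)
        n = len(x)
        d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
        return 1 - 6 * d2 / (n * (n ** 2 - 1))
    ```

    A perfectly monotone-increasing relationship (e.g. fumonisin level rising with every extra hour of delay) yields rho = 1.0 regardless of the raw magnitudes.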

  13. Examining Temporal Sample Scale and Model Choice with Spatial Capture-Recapture Models in the Common Leopard Panthera pardus.

    PubMed

    Goldberg, Joshua F; Tempa, Tshering; Norbu, Nawang; Hebblewhite, Mark; Mills, L Scott; Wangchuk, Tshewang R; Lukacs, Paul

    2015-01-01

    Many large carnivores occupy a wide geographic distribution and face threats from habitat loss and fragmentation, poaching, prey depletion, and human-wildlife conflicts. Conservation requires robust techniques for estimating population densities and trends, but the elusive nature and low densities of many large carnivores make them difficult to detect. Spatial capture-recapture (SCR) models provide a means for handling imperfect detectability, while linking population estimates to individual movement patterns to provide more accurate estimates than standard approaches. Within this framework, we investigate the effect of different sample interval lengths on density estimates, using simulations and a common leopard (Panthera pardus) model system. We apply Bayesian SCR methods to 89 simulated datasets and camera-trapping data from 22 leopards captured 82 times during winter 2010-2011 in Royal Manas National Park, Bhutan. We show that sample interval lengths of daily, weekly, monthly, or quarterly periods did not appreciably affect median abundance or density, but did influence precision. We observed the largest gains in precision when moving from quarterly to shorter intervals. We therefore recommend daily sampling intervals for monitoring rare or elusive species where practicable, but note that monthly or quarterly sample periods can have similar informative value. We further develop a novel application of Bayes factors to select models where multiple ecological factors are integrated into density estimation. Our simulations demonstrate that these methods can help identify the "true" explanatory mechanisms underlying the data. Using this method, we found strong evidence for sex-specific movement distributions in leopards, suggesting that sexual patterns of space-use influence density. This model estimated a density of 10.0 leopards/100 km² (95% credibility interval: 6.25-15.93), comparable to contemporary estimates in Asia.
These SCR methods provide a guide to monitor and observe the effect of management interventions on leopards and other species of conservation interest.

  15. Reversing the Signaled Magnitude Effect in Delayed Matching to Sample: Delay-Specific Remembering?

    ERIC Educational Resources Information Center

    White, K. Geoffrey; Brown, Glenn S.

    2011-01-01

    Pigeons performed a delayed matching-to-sample task in which large or small reinforcers for correct remembering were signaled during the retention interval. Accuracy was low when small reinforcers were signaled, and high when large reinforcers were signaled (the signaled magnitude effect). When the reinforcer-size cue was switched from small to…

  16. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. 
Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
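    The coverage results above come from Monte Carlo simulation. A much-reduced sketch of the idea: draw one normally distributed measurement per patient, construct the 95% interval under the no-bias assumption, and count how often it covers the truth when the instrument actually carries a fixed bias (all parameter values below are illustrative, not the study's):

    ```python
    import random

    def coverage(n_sim=20000, bias=0.0, sigma=1.0, true_value=10.0, seed=1):
        """Fraction of no-bias 95% CIs (measured ± 1.96*sigma) that cover
        the true biomarker value when measurements have a fixed bias."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_sim):
            measured = true_value + bias + rng.gauss(0.0, sigma)
            if measured - 1.96 * sigma <= true_value <= measured + 1.96 * sigma:
                hits += 1
        return hits / n_sim
    ```

    With no bias the empirical coverage sits near the nominal 95%; injecting a fixed bias of two standard deviations drives coverage far below nominal, which is the qualitative pattern the simulation study quantifies.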

  17. Only pick the right grains: Modelling the bias due to subjective grain-size interval selection for chronometric and fingerprinting approaches.

    NASA Astrophysics Data System (ADS)

    Dietze, Michael; Fuchs, Margret; Kreutzer, Sebastian

    2016-04-01

    Many modern approaches to radiometric dating and geochemical fingerprinting rely on sampling sedimentary deposits. A key assumption of most concepts is that the extracted grain-size fraction of the sampled sediment adequately represents the actual process to be dated or the source area to be fingerprinted. However, these assumptions are not always well constrained. Rather, they have to align with arbitrary, method-determined size intervals such as "coarse grain" or "fine grain", whose definitions partly differ between methods. Such arbitrary intervals violate principal process-based concepts of sediment transport and can thus introduce significant bias into the analysis outcome (i.e., a deviation of the measured from the true value). We present a flexible numerical framework (numOlum) for the statistical programming language R that allows quantifying the bias due to any given analysis size interval for different types of sediment deposits. This framework is applied to synthetic samples from the realms of luminescence dating and geochemical fingerprinting, i.e., a virtual reworked loess section. We show independent validation data from artificially dosed and subsequently mixed grain-size proportions, and we present a statistical approach (end-member modelling analysis, EMMA) that accounts for the effect of measuring the compound dosimetric history or geochemical composition of a sample. EMMA separates polymodal grain-size distributions into the underlying transport-process-related distributions and their contribution to each sample. These underlying distributions can then be used to adjust grain-size preparation intervals to minimise the incorporation of "undesired" grain-size fractions.

  18. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    PubMed

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    The time to donating blood plays a major role in a regular donor becoming a continuing one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, in the capital city of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, residence, and job were recorded as independent variables. Data analysis was based on a log-normal hazard model with gamma correlated frailty, in which the frailties are the sum of two independent components, each assumed to follow a gamma distribution. The analysis was done via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criterion using the BOA program in R. Age, job, and education had significant effects on the chance of donating blood (P < 0.05). The chances of blood donation were higher for older donors, clerical workers, laborers, the self-employed, students, and educated donors, and accordingly the time intervals between their blood donations were shorter. Given the significant effects of some variables in the log-normal correlated frailty model, educational and cultural programs should be planned to encourage people with longer inter-donation intervals to donate more frequently.

  19. A modified Wald interval for the area under the ROC curve (AUC) in diagnostic case-control studies

    PubMed Central

    2014-01-01

    Background The area under the receiver operating characteristic (ROC) curve, referred to as the AUC, is an appropriate measure for describing the overall accuracy of a diagnostic test or a biomarker in early phase trials without having to choose a threshold. There are many approaches for estimating the confidence interval for the AUC. However, all are relatively complicated to implement. Furthermore, many approaches perform poorly for large AUC values or small sample sizes. Methods The AUC is actually a probability. So we propose a modified Wald interval for a single proportion, which can be calculated on a pocket calculator. We performed a simulation study to compare this modified Wald interval (without and with continuity correction) with other intervals regarding coverage probability and statistical power. Results The main result is that the proposed modified Wald intervals maintain and exploit the type I error much better than the intervals of Agresti-Coull, Wilson, and Clopper-Pearson. The interval suggested by Bamber, the Mann-Whitney interval without transformation and also the interval of the binormal AUC are very liberal. For small sample sizes the Wald interval with continuity has a comparable coverage probability as the LT interval and higher power. For large sample sizes the results of the LT interval and of the Wald interval without continuity correction are comparable. Conclusions If individual patient data is not available, but only the estimated AUC and the total sample size, the modified Wald intervals can be recommended as confidence intervals for the AUC. For small sample sizes the continuity correction should be used. PMID:24552686
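    The appeal of the approach above is that the AUC is a probability, so a proportion-style interval applies. For orientation, a plain Wald interval for a proportion with the optional 1/(2n) continuity correction can be computed as below; note that the paper's modified interval adjusts the variance term for the AUC setting, which this sketch does not reproduce:

    ```python
    import math

    def wald_interval(p_hat, n, continuity=False, z=1.96):
        """Plain Wald 95% CI for a proportion (here, an estimated AUC),
        optionally widened by the 1/(2n) continuity correction and
        clipped to [0, 1]."""
        se = math.sqrt(p_hat * (1 - p_hat) / n)
        cc = 1.0 / (2 * n) if continuity else 0.0
        lo = max(0.0, p_hat - z * se - cc)
        hi = min(1.0, p_hat + z * se + cc)
        return lo, hi
    ```

    For an estimated AUC of 0.8 with n = 100 this gives roughly (0.72, 0.88); the continuity correction widens each side by 0.005, matching the recommendation to use it for small samples.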

  1. The effects of sampling frequency on the climate statistics of the European Centre for Medium-Range Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Phillips, Thomas J.; Gates, W. Lawrence; Arpe, Klaus

    1992-12-01

    The effects of sampling frequency on the first- and second-moment statistics of selected European Centre for Medium-Range Weather Forecasts (ECMWF) model variables are investigated in a simulation of "perpetual July" with a diurnal cycle included and with surface and atmospheric fields saved at hourly intervals. The shortest characteristic time scales (as determined by the e-folding time of lagged autocorrelation functions) are those of ground heat fluxes and temperatures, precipitation and runoff, convective processes, cloud properties, and atmospheric vertical motion, while the longest time scales are exhibited by soil temperature and moisture, surface pressure, and atmospheric specific humidity, temperature, and wind. The time scales of surface heat and momentum fluxes and of convective processes are substantially shorter over land than over oceans. An appropriate sampling frequency for each model variable is obtained by comparing the estimates of first- and second-moment statistics determined at intervals ranging from 2 to 24 hours with the "best" estimates obtained from hourly sampling. Relatively accurate estimation of first- and second-moment climate statistics (10% errors in means, 20% errors in variances) can be achieved by sampling a model variable at intervals that usually are longer than the bandwidth of its time series but that often are shorter than its characteristic time scale. For the surface variables, sampling at intervals that are nonintegral divisors of a 24-hour day yields relatively more accurate time-mean statistics because of a reduction in errors associated with aliasing of the diurnal cycle and higher-frequency harmonics. The superior estimates of first-moment statistics are accompanied by inferior estimates of the variance of the daily means due to the presence of systematic biases, but these probably can be avoided by defining a different measure of low-frequency variability. 
Estimates of the intradiurnal variance of accumulated precipitation and surface runoff also are strongly impacted by the length of the storage interval. In light of these results, several alternative strategies for storage of the ECMWF model variables are recommended.
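    The aliasing point can be illustrated with a toy diurnal signal: storing a value every 24 h locks onto a single phase of the daily cycle and biases the time mean, whereas a 7 h interval (a nonintegral divisor of 24) cycles through all phases. All constants below are illustrative, not ECMWF settings:

    ```python
    import math

    def subsample_mean_error(interval_hours, days=30):
        """Absolute error in the time mean of an hourly sinusoidal diurnal
        signal when it is stored only every `interval_hours` hours.  The
        +3 h phase offset keeps 24 h sampling off the sine's zero crossing."""
        hourly = [math.sin(2 * math.pi * (t + 3) / 24) for t in range(24 * days)]
        true_mean = sum(hourly) / len(hourly)   # ~0 over whole days
        sampled = hourly[::interval_hours]
        return abs(sum(sampled) / len(sampled) - true_mean)
    ```

    Sampling every 24 h returns the same phase every day, so the estimated mean inherits that phase's full amplitude, while the 7 h interval walks through all 24 hourly phases and nearly cancels the diurnal cycle.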

  2. Technical note: Instantaneous sampling intervals validated from continuous video observation for behavioral recording of feedlot lambs.

    PubMed

    Pullin, A N; Pairis-Garcia, M D; Campbell, B J; Campler, M R; Proudfoot, K L

    2017-11-01

    When considering methodologies for collecting behavioral data, continuous sampling provides the most complete and accurate data set, whereas instantaneous sampling can provide similar results and also increase the efficiency of data collection. However, instantaneous time intervals require validation to ensure accurate estimation of the data. Therefore, the objective of this study was to validate scan sampling intervals for lambs housed in a feedlot environment. Feeding, lying, standing, drinking, locomotion, and oral manipulation were measured on 18 crossbred lambs housed in an indoor feedlot facility for 14 h (0600-2000 h). Data from continuous sampling were compared with data from instantaneous scan sampling intervals of 5, 10, 15, and 20 min using a linear regression analysis. Three criteria determined whether a time interval accurately estimated behaviors: 1) R² ≥ 0.90, 2) slope not statistically different from 1 (P > 0.05), and 3) intercept not statistically different from 0 (P > 0.05). Estimations for lying behavior were accurate up to 20-min intervals, whereas feeding and standing behaviors were accurate only at 5-min intervals (i.e., met all 3 regression criteria). Drinking, locomotion, and oral manipulation demonstrated poor associations for all tested intervals. The results from this study suggest that a 5-min instantaneous sampling interval will accurately estimate lying, feeding, and standing behaviors for lambs housed in a feedlot, whereas continuous sampling is recommended for the remaining behaviors. This methodology will contribute toward the efficiency, accuracy, and transparency of future behavioral data collection in lamb behavior research.
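    The validation logic above reduces to regressing scan-sample estimates on continuous observations and checking slope, intercept, and coefficient of determination. A minimal ordinary-least-squares sketch (the significance tests behind the slope and intercept criteria are omitted; function and variable names are illustrative):

    ```python
    from statistics import mean

    def regression_criteria(continuous, scan):
        """OLS slope, intercept, and R^2 of scan-sample estimates (y)
        regressed on continuous observations (x); a valid interval should
        give R^2 >= 0.90, slope ~ 1, and intercept ~ 0."""
        x, y = continuous, scan
        mx, my = mean(x), mean(y)
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        intercept = my - slope * mx
        ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return slope, intercept, 1 - ss_res / ss_tot
    ```

    A scan interval that reproduces the continuous record exactly yields slope 1, intercept 0, and R² of 1, passing all three criteria by construction.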

  3. BRIDGING GAPS BETWEEN ZOO AND WILDLIFE MEDICINE: ESTABLISHING REFERENCE INTERVALS FOR FREE-RANGING AFRICAN LIONS (PANTHERA LEO).

    PubMed

    Broughton, Heather M; Govender, Danny; Shikwambana, Purvance; Chappell, Patrick; Jolles, Anna

    2017-06-01

    The International Species Information System has set forth an extensive database of reference intervals for zoologic species, allowing veterinarians and game park officials to distinguish normal health parameters from underlying disease processes in captive wildlife. However, several recent studies comparing reference values from captive and free-ranging animals have found significant variation between populations, necessitating the development of separate reference intervals in free-ranging wildlife to aid in the interpretation of health data. Thus, this study characterizes reference intervals for six biochemical analytes, eleven hematologic or immune parameters, and three hormones using samples from 219 free-ranging African lions ( Panthera leo ) captured in Kruger National Park, South Africa. Using the original sample population, exclusion criteria based on physical examination were applied to yield a final reference population of 52 clinically normal lions. Reference intervals were then generated via 90% confidence intervals on log-transformed data using parametric bootstrapping techniques. In addition to the generation of reference intervals, linear mixed-effect models and generalized linear mixed-effect models were used to model associations of each focal parameter with the following independent variables: age, sex, and body condition score. Age and sex were statistically significant drivers for changes in hepatic enzymes, renal values, hematologic parameters, and leptin, a hormone related to body fat stores. Body condition was positively correlated with changes in monocyte counts. Given the large variation in reference values taken from captive versus free-ranging lions, it is our hope that this study will serve as a baseline for future clinical evaluations and biomedical research targeting free-ranging African lions.
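    The reference-limit construction described above can be sketched as a parametric bootstrap on log-transformed data. This is a simplification: the study reports 90% confidence intervals around each limit, whereas this sketch summarizes the bootstrap replicates by their means, and all names and constants are illustrative:

    ```python
    import math
    import random

    def bootstrap_reference_interval(values, n_boot=2000, seed=42):
        """Parametric bootstrap of 2.5th/97.5th percentile reference limits:
        fit a normal to log-transformed data, resample from that fit, and
        average the back-transformed limits across replicates."""
        logs = [math.log(v) for v in values]
        mu = sum(logs) / len(logs)
        sd = math.sqrt(sum((x - mu) ** 2 for x in logs) / (len(logs) - 1))
        rng = random.Random(seed)
        los, his = [], []
        for _ in range(n_boot):
            rep = [rng.gauss(mu, sd) for _ in logs]
            m = sum(rep) / len(rep)
            s = math.sqrt(sum((x - m) ** 2 for x in rep) / (len(rep) - 1))
            los.append(math.exp(m - 1.96 * s))   # lower reference limit
            his.append(math.exp(m + 1.96 * s))   # upper reference limit
        return sum(los) / n_boot, sum(his) / n_boot
    ```

    Working on the log scale keeps the limits positive and accommodates the right-skew typical of analyte concentrations, which is presumably why the study log-transformed before bootstrapping.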

  4. Sample size requirements for the design of reliability studies: precision consideration.

    PubMed

    Shieh, Gwowen

    2014-09-01

    In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands the sample size methodology for the design of reliability studies that have not previously been discussed in the literature.

  5. A pilot study of the effects of interview content, retention interval, and grade on accuracy of dietary information from children

    PubMed Central

    Baxter, Suzanne Domel; Hitchcock, David B; Guinn, Caroline H; Royer, Julie A; Wilson, Dawn K; Pate, Russell R; McIver, Kerry L; Dowda, Marsha

    2013-01-01

    Objective: Investigate differences in dietary recall accuracy by interview content (diet-only; diet-and-physical-activity), retention interval (same-day; previous-day), and grade (3rd; 5th). Methods: Thirty-two children were observed eating school-provided meals and interviewed once each; interview content and retention interval were randomly assigned. Multivariate analysis of variance was performed on rates of omissions (foods observed but unreported) and intrusions (foods reported but unobserved); independent variables were interview content, retention interval, and grade. Results: Accuracy differed by retention interval (P = .05; better for same-day [omission rate, intrusion rate: 28%, 20%] than previous-day [54%, 45%]) but not interview content (P > .48; diet-only: 41%, 33%; diet-and-physical-activity: 41%, 33%) or grade (P > .27; 3rd: 48%, 42%; 5th: 34%, 24%). Conclusions and Implications: Although the small sample limits firm conclusions, results provide evidence-based direction to enhance accuracy; specifically, to shorten the retention interval. Larger validation studies need to investigate the combined effect of interview content, retention interval, and grade on accuracy. PMID:23562487

  6. Online Doppler Effect Elimination Based on Unequal Time Interval Sampling for Wayside Acoustic Bearing Fault Detecting System

    PubMed Central

    Ouyang, Kesai; Lu, Siliang; Zhang, Shangbin; Zhang, Haibin; He, Qingbo; Kong, Fanrang

    2015-01-01

    Railways occupy an important position in transportation because of their high speed and large carrying capacity, so guaranteeing the continuous running and transportation safety of trains is a key issue. The time consumed by the diagnosis procedure is likewise of great importance for a detecting system. However, most of the techniques currently adopted in the wayside acoustic defective bearing detector (ADBD) system are offline strategies, meaning that the signal is analyzed after the sampling process; this results in unavoidable time latency. Moreover, the acquired acoustic signal is corrupted by the Doppler effect because of the high relative speed between the train and the data acquisition system (DAS), making it difficult to diagnose bearing defects immediately. In this paper, a new strategy called online Doppler effect elimination (ODEE) is proposed to remove the Doppler distortion online by means of an unequal time interval sampling scheme. The steps of the proposed strategy are as follows: the essential parameters are acquired in advance; the unequal time interval sampling strategy is then used to restore the Doppler-distorted signal, with the amplitude of the signal demodulated as well; thus, the restored Doppler-free signal is obtained online. The proposed ODEE method was first evaluated in a simulation analysis and ultimately implemented in an embedded system for fault diagnosis of train bearings. The results are in good accordance with the bearing defects, which verifies the good performance of the proposed strategy. PMID:26343657

  7. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
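
    The three interval sampling methods examined in the simulation can be sketched as follows. The observation length, interval duration, and event parameters below are illustrative assumptions, not the values studied in the paper, but the sketch reproduces the methods' known directional biases.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(obs_len=600.0, interval=10.0, n_events=20, event_dur=5.0):
    # Mark event occupancy on a 0.1-s grid across the observation period.
    grid = np.zeros(int(obs_len * 10), dtype=bool)
    for onset in rng.uniform(0, obs_len - event_dur, size=n_events):
        grid[int(onset * 10):int((onset + event_dur) * 10)] = True
    true_prop = grid.mean()  # true proportion of time the event occurred

    n_int = int(obs_len / interval)
    chunks = grid[:n_int * int(interval * 10)].reshape(n_int, -1)
    mts = chunks[:, -1].mean()       # momentary time sampling: last moment only
    pir = chunks.any(axis=1).mean()  # partial-interval: any occurrence scores
    wir = chunks.all(axis=1).mean()  # whole-interval: full occupancy required
    return true_prop, mts, pir, wir

true_prop, mts, pir, wir = simulate()
# PIR overestimates and WIR underestimates event duration; MTS is roughly unbiased.
print(f"true={true_prop:.3f} MTS={mts:.3f} PIR={pir:.3f} WIR={wir:.3f}")
```

    Repeating `simulate` across combinations of `interval`, `event_dur`, and `n_events` yields error tables of the kind the study reports.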

  8. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    PubMed

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time-consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned, based on the method of Harris and Boyd, into three scenarios: combined gender; males and females; and age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.

  9. Investigation of modulation parameters in multiplexing gas chromatography.

    PubMed

    Trapp, Oliver

    2010-10-22

    Combination of information technology and separation sciences opens a new avenue to achieve high sample throughputs and therefore is of great interest to bypass bottlenecks in catalyst screening of parallelized reactors or using multitier well plates in reaction optimization. Multiplexing gas chromatography utilizes pseudo-random injection sequences derived from Hadamard matrices to perform rapid sample injections which gives a convoluted chromatogram containing the information of a single sample or of several samples with similar analyte composition. The conventional chromatogram is obtained by application of the Hadamard transform using the known injection sequence or in case of several samples an averaged transformed chromatogram is obtained which can be used in a Gauss-Jordan deconvolution procedure to obtain all single chromatograms of the individual samples. The performance of such a system depends on the modulation precision and on the parameters, e.g. the sequence length and modulation interval. Here we demonstrate the effects of the sequence length and modulation interval on the deconvoluted chromatogram, peak shapes and peak integration for sequences between 9-bit (511 elements) and 13-bit (8191 elements) and modulation intervals Δt between 5 s and 500 ms using a mixture of five components. It could be demonstrated that even for high-speed modulation at time intervals of 500 ms the chromatographic information is very well preserved and that the separation efficiency can be improved by very narrow sample injections. Furthermore this study shows that the relative peak areas in multiplexed chromatograms do not deviate from conventionally recorded chromatograms. Copyright © 2010 Elsevier B.V. All rights reserved.
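
    The deconvolution idea can be illustrated with a toy example. The sketch below is an assumption-laden miniature, not the paper's implementation: it uses a 3-bit pseudo-random injection sequence (the paper studies 9- to 13-bit sequences) and a made-up five-point chromatogram, recovering the single-injection signal by inverting the circulant matrix built from the injection sequence.

```python
import numpy as np

def m_sequence(nbits=3, taps=(3, 2)):
    # Maximal-length LFSR sequence of length 2**nbits - 1 (here 7 elements,
    # standing in for the 511- to 8191-element sequences in the paper).
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(seq)

s = m_sequence()                 # pseudo-random injection sequence (1 = inject)
n = len(s)
S = np.array([[s[(i - j) % n] for j in range(n)] for i in range(n)])  # circulant

x = np.array([0.0, 3.0, 7.0, 2.0, 0.0, 0.0, 0.0])  # toy single-injection chromatogram
y = S @ x                        # multiplexed (convolved) detector signal
x_rec = np.linalg.solve(S, y)    # deconvolution using the known injection sequence
print(np.round(x_rec, 6))
```

    In practice the Hadamard transform exploits the structure of `S` instead of a general linear solve, but the recovered chromatogram is the same.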

  10. Effects of Behavioral Stimuli on Plasma Interleukin-1 Activity in Humans at Rest.

    ERIC Educational Resources Information Center

    Keppel, William H.; And Others

    1993-01-01

    Performed Interleukin-1 (IL-1) bioassays on 208 serum samples from seven volunteers at 5-minute intervals before, during, and after relaxation-related behavioral stimulus. Individuals showed up to 267% increase in IL-1, and for group mean, 48.1% elevation occurred, during stimulus interval relative to baseline. Such changes in plasma IL-1,…

  11. Improving laboratory results turnaround time by reducing the pre-analytical phase.

    PubMed

    Khalifa, Mohamed; Khalid, Parwaiz

    2014-01-01

    Laboratory turnaround time is considered one of the most important indicators of work efficiency in hospitals. Physicians always need timely results to make effective clinical decisions, especially in the emergency department, where results can guide physicians on whether to admit patients to the hospital, discharge them home, or do further investigations. A retrospective data analysis study was performed to identify the effects of training ER and lab staff on new routines for sample collection and transportation on the pre-analytical phase of turnaround time. Renal profile tests requested by the ER and performed in 2013 were selected as a sample, and data on 7,519 tests were retrieved and analyzed to compare turnaround time intervals before and after implementing the new routines. Results showed significant time reduction in the "Request to Sample Collection" and "Collection to In Lab Delivery" intervals, with less significant improvement in the analytical phase of the turnaround time.

  12. Construction of prediction intervals for Palmer Drought Severity Index using bootstrap

    NASA Astrophysics Data System (ADS)

    Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan

    2018-04-01

    In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-months) and mid-term (six-months) drought observations. The effects of North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
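
    A residual-based bootstrap prediction interval of this general kind can be sketched on a toy autoregressive series. The AR(1) model, coefficient, and noise scale below are illustrative assumptions, not the PDSI data or the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy monthly index series from an AR(1) process, standing in for the
# drought series; phi_true and the noise scale are made up.
phi_true, n = 0.7, 240
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Least-squares AR(1) fit and centred residuals.
phi = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
resid = y[1:] - phi * y[:-1]
resid -= resid.mean()

# Residual-based bootstrap: re-draw residuals to simulate next-step values,
# then read the prediction interval off the simulated quantiles.
sims = phi * y[-1] + rng.choice(resid, size=2000, replace=True)
lo, hi = np.quantile(sims, [0.05, 0.95])
print(f"90% prediction interval for the next value: [{lo:.2f}, {hi:.2f}]")
```

    Because the interval is built from the fitted model's own residuals, its validity can be checked by the empirical coverage of future observations, which is the criterion the simulation study evaluates.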

  13. Psychophysics of remembering.

    PubMed Central

    White, K G; Wixted, J T

    1999-01-01

    We present a new model of remembering in the context of conditional discrimination. For procedures such as delayed matching to sample, the effect of the sample stimuli at the time of remembering is represented by a pair of Thurstonian (normal) distributions of effective stimulus values. The critical assumption of the model is that, based on prior experience, each effective stimulus value is associated with a ratio of reinforcers obtained for previous correct choices of the comparison stimuli. That ratio determines the choice that is made on the basis of the matching law. The standard deviations of the distributions are assumed to increase with increasing retention-interval duration, and the distance between their means is assumed to be a function of other factors that influence overall difficulty of the discrimination. It is a behavioral model in that choice is determined by its reinforcement history. The model predicts that the biasing effects of the reinforcer differential increase with decreasing discriminability and with increasing retention-interval duration. Data from several conditions using a delayed matching-to-sample procedure with pigeons support the predictions. PMID:10028693

  14. Temporal Stability of the Dutch Version of the Wechsler Memory Scale-Fourth Edition (WMS-IV-NL).

    PubMed

    Bouman, Zita; Hendriks, Marc P H; Aldenkamp, Albert P; Kessels, Roy P C

    2015-01-01

    The Wechsler Memory Scale-Fourth Edition (WMS-IV) is one of the most widely used memory batteries. We examined the test-retest reliability, practice effects, and standardized regression-based (SRB) change norms for the Dutch version of the WMS-IV (WMS-IV-NL) after both short and long retest intervals. The WMS-IV-NL was administered twice after either a short (M = 8.48 weeks, SD = 3.40 weeks, range = 3-16) or a long (M = 17.87 months, SD = 3.48, range = 12-24) retest interval in a sample of 234 healthy participants (M = 59.55 years, range = 16-90; 118 completed the Adult Battery; and 116 completed the Older Adult Battery). The test-retest reliability estimates varied across indexes. They were adequate to good after a short retest interval (ranging from .74 to .86), with the exception of the Visual Working Memory Index (r = .59), yet generally lower after a long retest interval (ranging from .56 to .77). Practice effects were only observed after a short retest interval (overall group mean gains up to 11 points), whereas no significant change in performance was found after a long retest interval. Furthermore, practice effect-adjusted SRB change norms were calculated for all WMS-IV-NL index scores. Overall, this study shows that the test-retest reliability of the WMS-IV-NL varied across indexes. Practice effects were observed after a short retest interval, but no evidence was found for practice effects after a long retest interval from one to two years. Finally, the SRB change norms were provided for the WMS-IV-NL.

  15. Factors affecting blood sample haemolysis: a cross-sectional study.

    PubMed

    Barnard, Ed B G; Potter, David L; Ayling, Ruth M; Higginson, Ian; Bailey, Andrew G; Smith, Jason E

    2016-04-01

    To determine the effect of blood sampling through an intravenous catheter compared with a needle in Emergency Department blood sampling. We undertook a prospective, cross-sectional study in a UK university teaching hospital Emergency Department. A convenience sample of 985 patients who required blood sampling via venepuncture was collected. A total of 844 complete sets of data were analysed. The median age was 63 years, and 57% of patients were male. The primary outcome measure was the incidence of haemolysis in blood samples obtained via a needle compared with samples obtained via an intravenous catheter. Secondary outcome measures defined the effect on sample haemolysis of the side of the patient the sample was obtained from, the anatomical location of sampling, the perceived difficulty in obtaining the sample, the order of sample tubes collected, estimated tourniquet time and bench time. Data were analysed with logistic regression, and expressed as odds ratios (95% confidence intervals; P-values). Blood samples obtained through an intravenous catheter were more likely to be haemolysed than those obtained via a needle, odds ratio 5.63 (95% confidence interval 2.49-12.73; P<0.001). Blood sampling via an intravenous catheter was significantly associated with an increase in the likelihood of sample haemolysis compared with sampling with a needle. Wherever practicable, blood samples should be obtained via a needle in preference to an intravenous catheter. Future research should include both an economic evaluation, and staff and patient satisfaction of separating blood sampling and intravenous catheter placement.

  16. Temporal Structure of Volatility Fluctuations

    NASA Astrophysics Data System (ADS)

    Wang, Fengzhong; Yamasaki, Kazuko; Stanley, H. Eugene; Havlin, Shlomo

    Volatility fluctuations are of great importance for the study of financial markets, and the temporal structure is an essential feature of fluctuations. To explore the temporal structure, we employ a new approach based on the return interval, which is defined as the time interval between two successive volatility values that are above a given threshold. We find that the distribution of the return intervals follows a scaling law over a wide range of thresholds, and over a broad range of sampling intervals. Moreover, this scaling law is universal for stocks of different countries, for commodities, for interest rates, and for currencies. However, further and more detailed analysis of the return intervals shows some systematic deviations from the scaling law. We also demonstrate a significant memory effect in the return intervals time organization. We find that the distribution of return intervals is strongly related to the correlations in the volatility.

  17. Evaluating the efficiency of environmental monitoring programs

    USGS Publications Warehouse

    Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina

    2014-01-01

    Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.

  18. TESTING ACUTE TOXICITY OF CONTAMINATED SEDIMENT IN JINZHOU BAY WITH MARINE AMPHIPODS

    EPA Science Inventory

    Sediments in some areas of Jinzhou Bay are seriously contaminated by heavy metals and organic contaminants. To assess the biological effects of these compounds in the sediment, seven surface samples of sediment were collected at intervals of about 2 km between sampling stations ...

  19. On the Development and Mechanics of Delayed Matching-to-Sample Performance

    PubMed Central

    Kangas, Brian D; Berry, Meredith S; Branch, Marc N

    2011-01-01

    Despite its frequent use to assess effects of environmental and pharmacological variables on short-term memory, little is known about the development of delayed matching-to-sample (DMTS) performance. This study was designed to examine the dimensions and dynamics of DMTS performance development over a long period of exposure to provide a more secure foundation for assessing stability in future research. Six pigeons were exposed to a DMTS task with variable delays for 300 sessions (i.e., 18,000 total trials; 3,600 trials per retention interval). Percent-correct and log-d measures used to quantify the development of conditional stimulus control under the procedure generally and at each of five retention intervals (0, 2, 4, 8 and 16-s) individually revealed that high levels of accuracy developed relatively quickly under the shorter retention intervals, but increases in accuracy under the longer retention intervals sometimes were not observed until 100–150 sessions had passed, with some still increasing at Session 300. Analyses of errors suggested that retention intervals induced biases by shifting control from the sample stimulus to control by position, something that was predicted by observed response biases during initial training. These results suggest that although it may require a great deal of exposure to DMTS prior to obtaining asymptotic steady state, quantification of model parameters may help predict trends when extended exposure is not feasible. PMID:21541127

  20. Black carbon content in a ponderosa pine forest of eastern Oregon with varying seasons and intervals of prescribed burns

    NASA Astrophysics Data System (ADS)

    Matosziuk, L.; Hatten, J. A.

    2016-12-01

    Soil carbon represents a significant component of the global carbon cycle. While fire-based disturbance of forest ecosystems acts as a carbon source, the increased temperatures can initiate molecular changes to forest biomass that convert fast-cycling organic carbon into more stable forms such as black carbon (BC), a product of incomplete combustion that contains highly condensed aromatic structures and very low hydrogen and oxygen content. Such forms of carbon can remain in the soil for hundreds to thousands of years, effectively creating a long-term carbon sink. The goal of this project is to understand how specific characteristics of prescribed burns, specifically the season of burn and the interval between burns, affect the formation, structure, and retention of these slowly degrading forms of carbon in the soil. Both O-horizon (forest floor) and mineral soil (0-15 cm cores) samples were collected from a season and interval of burn study in Malheur National Forest. The study area is divided into six replicate units, each of which is sub-divided into four treatment areas and a control. Beginning in 1997, each treatment area was subjected to: i) spring burns at five-year intervals, ii) fall burns at five-year intervals, iii) spring burns at 15-year intervals, or iv) fall burns at 15-year intervals. The bulk density, pH, and C/N content of each soil were measured to assess the effect of the burn treatments on the soil. Additionally, the amount and molecular structure of BC in each sample was quantified using the distribution of specific molecular markers (benzene polycarboxylic acids or BPCAs) that are present in the soil following acid digestion.

  1. Optical effects module and passive sample array

    NASA Technical Reports Server (NTRS)

    Linton, R. C.

    1983-01-01

    The objective of the Optical Effects Module (OEM) is to monitor the effects of the deposition and adhesion of both molecular species and particles on optical surfaces in the Shuttle cargo bay environment. The OEM performs in-flight measurements of the ultraviolet (253.7 nm) transmittance and diffuse reflectance of five optical samples at regular intervals throughout the orbital mission. Most of the results obtained indicate or imply the absence of a significant accumulation of contamination other than particulates on the samples. The contaminant species (or particulates) adhering to the samples of the Passive Sample Array (PSA) were identified by means of Auger and X-ray energy-dispersive analyses. The elements silicon, chlorine, and phosphorus were detected.

  2. First-year effects of rootraking on available nutrients in Piedmont Plateau soils

    Treesearch

    R.E. Banker; James H. Miller; D.E. Davis

    1983-01-01

    The effects of rootraking on the levels of Ca, Mg, K, PO4 and Na and on infiltration rates in Piedmont Plateau soils were investigated. Soil samples were taken before and after treatments at 10-foot intervals along permanent 100-foot lines located on the ridge, upper slope and lower slopes. Samples were taken at 0-2, 2-4, 4-6, 6-12, 12-18...

  3. Effects of Paradigm and Inter-Stimulus Interval on Age Differences in Eyeblink Classical Conditioning in Rabbits

    ERIC Educational Resources Information Center

    Woodruff-Pak, Diana S.; Seta, Susan E.; Roker, LaToya A.; Lehr, Melissa A.

    2007-01-01

    The aim of this study was to examine parameters affecting age differences in eyeblink classical conditioning in a large sample of young and middle-aged rabbits. A total of 122 rabbits of mean ages of 4 or 26 mo were tested at inter-stimulus intervals (ISIs) of 600 or 750 msec in the delay or trace paradigms. Paradigm affected both age groups…

  4. Improved confidence intervals when the sample is counted an integer times longer than the blank.

    PubMed

    Potter, William Edward; Strzelczyk, Jadwiga Jodi

    2011-05-01

    Past computer solutions for confidence intervals in paired counting are extended to the case where the ratio of the sample count time to the blank count time is taken to be an integer, IRR. Previously, confidence intervals have been named Neyman-Pearson confidence intervals; more correctly they should have been named Neyman confidence intervals or simply confidence intervals. The technique utilized mimics a technique used by Pearson and Hartley to tabulate confidence intervals for the expected value of the discrete Poisson and Binomial distributions. The blank count and the contribution of the sample to the gross count are assumed to be Poisson distributed. The expected value of the blank count, in the sample count time, is assumed known. The net count, OC, is taken to be the gross count minus the product of IRR with the blank count. The probability density function (PDF) for the net count can be determined in a straightforward manner.
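
    The distributional setup described can be sketched directly: with the blank counted for 1/IRR of the sample count time, the PDF of the net count follows by summing over blank outcomes. The rates below are made-up illustrative values, and the tabulation of confidence limits from this PDF is left out.

```python
from math import exp, factorial

def pois(k, lam):
    # Poisson probability mass function.
    return exp(-lam) * lam ** k / factorial(k)

def net_count_pdf(lam_sample, lam_blank, irr, max_n=60):
    # Gross count G ~ Poisson(lam_sample + lam_blank) over the sample count
    # time; blank count B ~ Poisson(lam_blank / irr), counted for 1/irr of
    # that time. Net count OC = G - irr * B.
    pdf = {}
    for b in range(max_n):
        pb = pois(b, lam_blank / irr)
        for g in range(max_n):
            oc = g - irr * b
            pdf[oc] = pdf.get(oc, 0.0) + pb * pois(g, lam_sample + lam_blank)
    return pdf

pdf = net_count_pdf(lam_sample=5.0, lam_blank=3.0, irr=2)
mean = sum(n * p for n, p in pdf.items())
print(round(mean, 3))  # -> 5.0: the net count is unbiased for lam_sample
```

    Note that OC takes negative values with positive probability, which is why careful interval construction, rather than a normal approximation, matters at low counts.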

  5. Effect of harvest dates on yield and nutritive value of eastern gamagrass

    USDA-ARS?s Scientific Manuscript database

    Yield of 'Pete' eastern gamagrass [Tripsacum dactyloides (L.) L.] was evaluated for 3 yr. Forage samples were harvested at 7-d intervals beginning on May 15 and ending on July 17, during 2000, 2001, and 2002. Samples from 2000 and 2001 were analyzed to determine nutrient composition. Canopy height i...

  6. Comparing interval estimates for small sample ordinal CFA models

    PubMed Central

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis models (CFA) for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative were studied. Undercoverage of confidence intervals and underestimation of standard errors was common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more positive biased than negatively biased, that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. 
The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading. Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research. PMID:26579002

  8. Introduction to Sample Size Choice for Confidence Intervals Based on "t" Statistics

    ERIC Educational Resources Information Center

    Liu, Xiaofeng Steven; Loudermilk, Brandon; Simpson, Thomas

    2014-01-01

    Sample size can be chosen to achieve a specified width in a confidence interval. The probability of obtaining a narrow width given that the confidence interval includes the population parameter is defined as the power of the confidence interval, a concept unfamiliar to many practitioners. This article shows how to utilize the Statistical Analysis…

  9. Design and Operation of a Borehole Straddle Packer for Ground-Water Sampling and Hydraulic Testing of Discrete Intervals at U.S. Air Force Plant 6, Marietta, Georgia

    USGS Publications Warehouse

    Holloway, Owen G.; Waddell, Jonathan P.

    2008-01-01

    A borehole straddle packer was developed and tested by the U.S. Geological Survey to characterize the vertical distribution of contaminants, head, and hydraulic properties in open-borehole wells as part of an ongoing investigation of ground-water contamination at U.S. Air Force Plant 6 (AFP6) in Marietta, Georgia. To better understand contaminant fate and transport in a crystalline bedrock setting and to support remedial activities at AFP6, numerous wells have been constructed that include long open-hole intervals in the crystalline bedrock. These wells can include several discontinuities that produce water, which may contain contaminants. Because of the complexity of ground-water flow and contaminant movement in the crystalline bedrock, it is important to characterize the hydraulic and water-quality characteristics of discrete intervals in these wells. The straddle packer facilitates ground-water sampling and hydraulic testing of discrete intervals, and delivery of fluids including tracer suites and remedial agents into these discontinuities. The straddle packer consists of two inflatable packers, a dual-pump system, a pressure-sensing system, and an aqueous injection system. Tests were conducted to assess the accuracy of the pressure-sensing systems, and water samples were collected for analysis of volatile organic compound (VOCs) concentrations. Pressure-transducer readings matched computed water-column height, with a coefficient of determination of greater than 0.99. The straddle packer incorporates both an air-driven piston pump and a variable-frequency, electronic, submersible pump. Only slight differences were observed between VOC concentrations in samples collected using the two different types of sampling pumps during two sampling events in July and August 2005. A test conducted to assess the effect of stagnation on VOC concentrations in water trapped in the system's pump-tubing reel showed that concentrations were not affected. 
A comparison was conducted to assess differences between three water-sampling methods: collecting samples from the well by pumping a packer-isolated zone using a submersible pump, by using a grab sampler, and by using a passive diffusion sampler. Concentrations of tetrachloroethylene, trichloroethylene, and 1,2-dichloropropane were greatest for samples collected using the submersible pump in the packer-isolated interval, suggesting that the straddle packer yielded the least dilute sample.

  10. Interval estimation and optimal design for the within-subject coefficient of variation for continuous and binary variables

    PubMed Central

    Shoukri, Mohamed M; Elkum, Nasser; Walter, Stephen D

    2006-01-01

    Background In this paper we propose the use of the within-subject coefficient of variation as an index of a measurement's reliability. For continuous variables, and based on its maximum likelihood estimation, we derive a variance-stabilizing transformation and discuss confidence interval construction within the framework of a one-way random effects model. We investigate sample size requirements for the within-subject coefficient of variation for continuous and binary variables. Methods We investigate the validity of the approximate normal confidence interval by Monte Carlo simulations. In designing a reliability study, a crucial issue is the balance between the number of subjects to be recruited and the number of repeated measurements per subject. We discuss efficiency of estimation and cost considerations for the optimal allocation of the sample resources. The approach is illustrated by an example on Magnetic Resonance Imaging (MRI). We also discuss the issue of sample size estimation for dichotomous responses with two examples. Results For the continuous variable, we found that the variance-stabilizing transformation improves the asymptotic coverage probabilities of the confidence interval for the within-subject coefficient of variation. The maximum likelihood estimation and the sample size estimation based on a pre-specified confidence interval width are novel contributions to the literature for the binary variable. Conclusion Using the sample size formulas, we hope to help clinical epidemiologists and practicing statisticians to efficiently design reliability studies using the within-subject coefficient of variation, whether the variable of interest is continuous or binary. PMID:16686943
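
A minimal sketch of the point estimate behind this index may help: in a one-way random-effects layout, the within-subject coefficient of variation is the square root of the within-subject mean square divided by the grand mean. The data below are made up, and the paper's variance-stabilizing transformation and confidence-interval construction are not reproduced here.

```python
import numpy as np

def within_subject_cv(data):
    """Within-subject coefficient of variation from a one-way
    random-effects layout (rows = subjects, columns = replicate
    measurements): sqrt(within-subject mean square) / grand mean."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    subject_means = data.mean(axis=1)
    # Pooled within-subject variance (within-subject mean square).
    msw = ((data - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return np.sqrt(msw) / data.mean()

# Two replicate measurements on four subjects (made-up numbers).
cv = within_subject_cv([[10.0, 10.4], [12.1, 11.7], [9.8, 10.2], [11.0, 11.4]])
```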

  11. Overlap between treatment and control distributions as an effect size measure in experiments.

    PubMed

    Hedges, Larry V; Olkin, Ingram

    2016-03-01

    The proportion π of treatment group observations that exceed the control group mean has been proposed as an effect size measure for experiments that randomly assign independent units into 2 groups. We give the exact distribution of a simple estimator of π based on the standardized mean difference and use it to study the small sample bias of this estimator. We also give the minimum variance unbiased estimator of π under 2 models, one in which the variance of the mean difference is known and one in which the variance is unknown. We show how to use the relation between the standardized mean difference and the overlap measure to compute confidence intervals for π and show that these results can be used to obtain unbiased estimators, large sample variances, and confidence intervals for 3 related effect size measures based on the overlap. Finally, we show how the effect size π can be used in a meta-analysis. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
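
The simple estimator referred to above maps the standardized mean difference to pi through the normal CDF, and a confidence interval for pi can be obtained by transforming an interval for the mean difference. The sketch below assumes the usual large-sample variance of a standardized mean difference; the exact and unbiased estimators derived in the paper are not reproduced.

```python
from scipy.stats import norm

def overlap_pi(d):
    """Simple estimator of pi, the proportion of treatment-group
    observations exceeding the control mean, from the standardized
    mean difference d: pi_hat = Phi(d)."""
    return norm.cdf(d)

def pi_confidence_interval(d, n1, n2, alpha=0.05):
    """Large-sample CI for pi, obtained by transforming a normal-
    approximation CI for d (the variance formula is the usual
    large-sample one, an assumption of this sketch)."""
    se = ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))) ** 0.5
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf(d - z * se), norm.cdf(d + z * se)
```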

  12. Confidence Intervals for Proportion Estimates in Complex Samples. Research Report. ETS RR-06-21

    ERIC Educational Resources Information Center

    Oranje, Andreas

    2006-01-01

    Confidence intervals are an important tool to indicate uncertainty of estimates and to give an idea of probable values of an estimate if a different sample from the population were drawn or a different sample of measures were used. Standard symmetric confidence intervals for proportion estimates based on a normal approximation can yield bounds…
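
For reference, the standard symmetric (Wald) interval can be sketched alongside the Wilson score interval, a common alternative whose bounds always stay inside [0, 1]; the comparison below is illustrative and is not drawn from the (truncated) report.

```python
from math import sqrt
from scipy.stats import norm

def wald_ci(p_hat, n, alpha=0.05):
    """Standard symmetric (Wald) interval; its bounds can fall
    outside [0, 1] for proportions near 0 or 1."""
    z = norm.ppf(1 - alpha / 2)
    half = z * sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

def wilson_ci(p_hat, n, alpha=0.05):
    """Wilson score interval: bounds always stay inside [0, 1]."""
    z = norm.ppf(1 - alpha / 2)
    centre = (p_hat + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z / (1 + z**2 / n)) * sqrt(p_hat * (1 - p_hat) / n
                                       + z**2 / (4 * n**2))
    return centre - half, centre + half
```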

  13. EFFECT OF SITE ON BACTERIAL POPULATIONS IN THE SAPWOOD OF COARSE WOODY DEBRIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, Emma G.; Waldrop, Thomas A.; McElreath, Susan D.

    1998-01-01

    Porter, Emma G., T.A. Waldrop, Susan D. McElreath, and Frank H. Tainter. 1998. Effect of site on bacterial populations in the sapwood of coarse woody debris. Pp. 480-484. In: Proc. 9th Bienn. South. Silv. Res. Conf. T.A. Waldrop (ed). USDA Forest Service, Southern Research Station. Gen. Tech. Rep. SRS-20. Abstract: Coarse woody debris (CWD) is an important structural component of southeastern forest ecosystems, yet little is known about its dynamics in these systems. This project identified bacterial populations associated with CWD and their dynamics across landscape ecosystem classification (LEC) units. Bolts of red oak and loblolly pine were placed on plots at each of three hydric, mesic, and xeric sites at the Savannah River Station. After the controls were processed, samples were taken at four intervals over a 16-week period. Samples were ground within an anaerobe chamber and plated on nonselective media. Aerobic and facultative anaerobic bacteria were identified using the Biolog system and the anaerobes were identified using the API 20A system. Major genera isolated were: Bacillus, Buttiauxella, Cedecea, Enterobacter, Erwinia, Escherichia, Klebsiella, Pantoea, Pseudomonas, Serratia, and Xanthomonas. The mean total isolates were determined by LEC units and sample intervals. Differences occurred between the sample intervals, with total isolates of 6.67, 13.33, 10.17, and 9.50 at 3, 6, 10, and 16 weeks, respectively. No significant differences in the numbers of bacteria isolated were found between LEC units.

  14. Reference intervals for plasma-free amino acid in a Japanese population.

    PubMed

    Yamamoto, Hiroyuki; Kondo, Kazuhiro; Tanaka, Takayuki; Muramatsu, Takahiko; Yoshida, Hiroo; Imaizumi, Akira; Nagao, Kenji; Noguchi, Yasushi; Miyano, Hiroshi

    2016-05-01

    Plasma amino acid concentrations vary with various diseases. Although reference intervals are useful in daily clinical practice, no reference intervals have been reported for plasma amino acids in a large Japanese population. Reference individuals were selected from 7685 subjects examined with the Japanese Ningen Dock in 2008. A total of 1890 individuals were selected based on exclusion criteria, and the reference samples were selected after the outlier samples for each amino acid concentration were excluded. The lower limit of the reference intervals for the plasma amino acid concentrations was set at the 2.5th percentile and the upper limit at the 97.5th percentile. Using nested analysis of variance, we analysed a large dataset of plasma samples and the effects of background factors (sex, age and body mass index [BMI]) on the plasma amino acid concentrations. Most amino acid concentrations were related to sex, especially those of the branched-chain amino acids. The citrulline, glutamine, ornithine and lysine concentrations were related to age. The glutamate concentration was related to BMI. The concentrations of most amino acids are more strongly related to sex than to age or BMI. Our results indicate that the reference intervals for plasma amino acid concentrations should be stratified by sex when the background factors of age and BMI are considered. © The Author(s) 2015.
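
The percentile rule used here is easy to sketch; a minimal nonparametric version, omitting the study's exclusion criteria and outlier-removal step:

```python
import numpy as np

def reference_interval(values, low=2.5, high=97.5):
    """Nonparametric reference interval: the central 95% of the
    reference-sample distribution (2.5th to 97.5th percentile)."""
    values = np.asarray(values, dtype=float)
    return np.percentile(values, low), np.percentile(values, high)
```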

  15. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    PubMed

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of road-kills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating from <40% (frogs and toads, snakes) to >60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. Spatial accuracy decay with increasing time interval between surveys was higher for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. Copyright © 2015 Elsevier Ltd. All rights reserved.
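
The hotspot rule can be sketched as follows; for simplicity this version uses the Poisson distribution's upper 95% quantile at the mean count per segment as the threshold, which approximates (but is not identical to) the confidence-limit calculation described in the study.

```python
from scipy.stats import poisson

def hotspot_segments(counts, alpha=0.05):
    """Flag road segments as hotspots when their roadkill count
    exceeds the upper (1 - alpha) Poisson quantile at the mean
    count per segment."""
    mean = sum(counts) / len(counts)
    threshold = poisson.ppf(1 - alpha, mean)  # upper 95% limit
    return [i for i, c in enumerate(counts) if c > threshold]
```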

  16. Sequential effects in pigeon delayed matching-to-sample performance.

    PubMed

    Roitblat, H L; Scopatz, R A

    1983-04-01

    Pigeons were tested in a three-alternative delayed matching-to-sample task in which second-choices were permitted following first-choice errors. Sequences of responses both within and between trials were examined in three experiments. The first experiment demonstrates that the sample information contained in first-choice errors is not sufficient to account for the observed pattern of second choices. This result implies that second-choices following first-choice errors are based on a second examination of the contents of working memory. Proactive interference was found in the second experiment in the form of a dependency, beyond that expected on the basis of trial independent response bias, of first-choices from one trial on the first-choice emitted on the previous trial. Samples from the previous trial were not found to exert a significant influence on later trials. The magnitude of the intertrial association (Experiment 3) did not depend on the duration of the intertrial interval. In contrast, longer intertrial intervals and longer sample durations did facilitate choice accuracy, by strengthening the association between current samples and choices. These results are incompatible with a trace-decay and competition model; they suggest strongly that multiple influences act simultaneously and independently to control delayed matching-to-sample responding. These multiple influences include memory for the choice occurring on the previous trial, memory for the sample, and general effects of trial spacing.

  17. Adjusted Wald Confidence Interval for a Difference of Binomial Proportions Based on Paired Data

    ERIC Educational Resources Information Center

    Bonett, Douglas G.; Price, Robert M.

    2012-01-01

    Adjusted Wald intervals for binomial proportions in one-sample and two-sample designs have been shown to perform about as well as the best available methods. The adjusted Wald intervals are easy to compute and have been incorporated into introductory statistics courses. An adjusted Wald interval for paired binomial proportions is proposed here and…
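
Since the abstract is truncated, the exact adjustment proposed by Bonett and Price is not shown; the sketch below illustrates the general idea with an Agresti-Min-style adjustment (add 0.5 to every cell of the paired 2x2 table, then apply the ordinary Wald formula), which is an assumption of this example rather than the article's method.

```python
from math import sqrt
from scipy.stats import norm

def adjusted_wald_paired(n11, n12, n21, n22, alpha=0.05):
    """Adjusted Wald CI for the difference of paired binomial
    proportions. Illustrative Agresti-Min-style adjustment: add
    0.5 to each cell, then use the ordinary Wald formula on the
    discordant cells (b, c)."""
    a, b, c, d = (x + 0.5 for x in (n11, n12, n21, n22))
    n = a + b + c + d
    diff = (b - c) / n
    se = sqrt((b + c - (b - c) ** 2 / n) / n ** 2)
    z = norm.ppf(1 - alpha / 2)
    return diff - z * se, diff + z * se
```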

  18. The influence of sampling interval on the accuracy of trail impact assessment

    USGS Publications Warehouse

    Leung, Y.-F.; Marion, J.L.

    1999-01-01

    Trail impact assessment and monitoring (IA&M) programs have been growing in importance and application in recreation resource management at protected areas. Census-based and sampling-based approaches have been developed in such programs, with systematic point sampling being the most common survey design. This paper examines the influence of sampling interval on the accuracy of estimates for selected trail impact problems. A complete census of four impact types on 70 trails in Great Smoky Mountains National Park was utilized as the base data set for the analyses. The census data were resampled at increasing intervals to create a series of simulated point data sets. Estimates of frequency of occurrence and lineal extent for the four impact types were compared with the census data set. Accuracy loss in lineal-extent estimates responded differently to increasing sampling intervals across impact types, while the response of frequency-of-occurrence estimates was consistent, approximating an inverse asymptotic curve. These findings suggest that systematic point sampling may be an appropriate method for estimating the lineal extent but not the frequency of trail impacts. Sampling intervals of less than 100 m appear to yield an excellent level of accuracy for the four impact types evaluated. Multiple regression analysis results suggest that appropriate sampling intervals are more likely to be determined by the type of impact in question than by the length of the trail. The census-based trail survey and the resampling-simulation method developed in this study can be a valuable first step in establishing long-term trail IA&M programs, in which an optimal sampling interval range with acceptable accuracy is determined before investing effort in data collection.
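
The resampling-simulation step (thinning a complete census to mimic coarser systematic point sampling) can be sketched as follows; the presence data, spacing, and the aliasing effect in the example are illustrative only.

```python
def resample_at_interval(census_points, interval_m, spacing_m=1.0):
    """Simulate systematic point sampling by keeping every k-th
    point of a complete census recorded at spacing_m resolution."""
    step = max(1, int(interval_m / spacing_m))
    return census_points[::step]

def frequency_of_occurrence(presence, interval_m):
    """Estimated share of sampled points at which an impact occurs."""
    sample = resample_at_interval(presence, interval_m)
    return sum(sample) / len(sample)

# Made-up census: an impact present at every 4th metre of trail.
# Sampling at a 4 m interval happens to hit only impact points,
# badly overestimating the frequency of occurrence.
presence = [1, 0, 0, 0] * 25
full = frequency_of_occurrence(presence, 1)    # census estimate
coarse = frequency_of_occurrence(presence, 4)  # aliased estimate
```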

  19. National Survey of Adult and Pediatric Reference Intervals in Clinical Laboratories across Canada: A Report of the CSCC Working Group on Reference Interval Harmonization.

    PubMed

    Adeli, Khosrow; Higgins, Victoria; Seccombe, David; Collier, Christine P; Balion, Cynthia M; Cembrowski, George; Venner, Allison A; Shaw, Julie

    2017-11-01

    Reference intervals are widely used decision-making tools in laboratory medicine, serving as health-associated standards to interpret laboratory test results. Numerous studies have shown wide variation in reference intervals, even between laboratories using assays from the same manufacturer. Lack of consistency in either sample measurement or reference intervals across laboratories challenges the expectation of standardized patient care regardless of testing location. Here, we present data from a national survey conducted by the Canadian Society of Clinical Chemists (CSCC) Reference Interval Harmonization (hRI) Working Group that examines variation in laboratory reference sample measurements, as well as pediatric and adult reference intervals currently used in clinical practice across Canada. Data on reference intervals currently used by 37 laboratories were collected through a national survey to examine the variation in reference intervals for seven common laboratory tests. Additionally, 40 clinical laboratories participated in a baseline assessment by measuring six analytes in a reference sample. Of the seven analytes examined, alanine aminotransferase (ALT), alkaline phosphatase (ALP), and creatinine reference intervals were most variable. As expected, reference interval variation was more substantial in the pediatric population and varied between laboratories using the same instrumentation. Reference sample results differed between laboratories, particularly for ALT and free thyroxine (FT4). Reference interval variation was greater than test result variation for the majority of analytes. It is evident that there is a critical lack of harmonization in laboratory reference intervals, particularly for the pediatric population. Furthermore, the observed variation in reference intervals across instruments cannot be explained by the bias between the results obtained on instruments by different manufacturers. 
Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  20. Average variograms to guide soil sampling

    NASA Astrophysics Data System (ADS)

    Kerry, R.; Oliver, M. A.

    2004-10-01

    To manage land in a site-specific way for agriculture requires detailed maps of the variation in the soil properties of interest. To predict accurately for mapping, the interval at which the soil is sampled should relate to the scale of spatial variation. A variogram can be used to guide sampling in two ways. A sampling interval of less than half the range of spatial dependence can be used, or the variogram can be used with the kriging equations to determine an optimal sampling interval that achieves a given tolerable error. A variogram might not be available for the site, but if variograms of several soil properties are available for a similar parent material and/or particular topographic positions, an average variogram can be calculated from them. Averages of the variogram ranges and standardized average variograms from four different parent materials in southern England were used to suggest suitable sampling intervals for future surveys in similar pedological settings, based on half the variogram range. The standardized average variograms were also used to determine optimal sampling intervals using the kriging equations. Similar sampling intervals were suggested by each method, and the maps of predictions based on data at different grid spacings were evaluated for the different parent materials. Variograms of loss on ignition (LOI) taken from the literature for other sites in southern England with similar parent materials had ranges close to the average for a given parent material, showing the possible wider application of such averages to guide sampling.
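
The half-range rule is simple to state in code; the ranges below are made up, and the alternative kriging-equation route to an optimal interval is not shown.

```python
def suggested_sampling_interval(variogram_ranges_m):
    """Average the variogram ranges of several soil properties on a
    given parent material and take half the average range as the
    suggested grid spacing for future surveys."""
    avg_range = sum(variogram_ranges_m) / len(variogram_ranges_m)
    return avg_range / 2.0

# Hypothetical variogram ranges (metres) for three soil properties.
interval = suggested_sampling_interval([200.0, 300.0, 250.0])
```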

  1. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    NASA Astrophysics Data System (ADS)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To execute variance-based global sensitivity analysis efficiently, the law of total variance over successive non-overlapping intervals is first proved, and an efficient space-partition sampling-based approach is then built on it in this paper. By partitioning the sample points of the output into different subsets according to the different inputs, the proposed approach can evaluate all the main effects concurrently from one group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which ensures that the convergence condition of the space-partition approach is satisfied. Furthermore, a new interpretation of the idea of partitioning is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
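
A sketch of the space-partition idea for main effects: one set of sample points is re-partitioned per input into successive non-overlapping, equal-count subsets, and the law of total variance gives the first-order index as the variance of the conditional means over the total variance. The test function and partitioning details are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def main_effect_indices(X, y, n_bins=20):
    """Estimate first-order (main-effect) sensitivity indices from a
    single sample set: sort by each input, split the outputs into
    successive non-overlapping equal-count subsets, and apply the
    law of total variance: S_i ~ Var(E[y | X_i in bin]) / Var(y)."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    total_var = y.var()
    indices = []
    for i in range(X.shape[1]):
        order = np.argsort(X[:, i])
        bins = np.array_split(y[order], n_bins)
        bin_means = np.array([b.mean() for b in bins])
        weights = np.array([len(b) for b in bins]) / len(y)
        var_of_means = np.sum(weights * (bin_means - y.mean()) ** 2)
        indices.append(var_of_means / total_var)
    return indices

# Toy model: y depends almost entirely on the first input.
rng = np.random.default_rng(0)
X = rng.uniform(size=(20000, 2))
y = 4.0 * X[:, 0] + 0.1 * rng.normal(size=20000)
s = main_effect_indices(X, y)
```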

  2. On the Effects of Signaling Reinforcer Probability and Magnitude in Delayed Matching to Sample

    ERIC Educational Resources Information Center

    Brown, Glenn S.; White, K. Geoffrey

    2005-01-01

    Two experiments examined whether postsample signals of reinforcer probability or magnitude affected the accuracy of delayed matching to sample in pigeons. On each trial, red or green choice responses that matched red or green stimuli seen shortly before a variable retention interval were reinforced with wheat access. In Experiment 1, the…

  3. Baseline-dependent sampling and windowing for radio interferometry: data compression, field-of-interest shaping, and outer field suppression

    NASA Astrophysics Data System (ADS)

    Atemkeng, M.; Smirnov, O.; Tasse, C.; Foster, G.; Keimpema, A.; Paragi, Z.; Jonas, J.

    2018-07-01

    Traditional radio interferometric correlators produce regular-gridded samples of the true uv-distribution by averaging the signal over constant, discrete time-frequency intervals. This regular sampling and averaging translates into irregular-gridded samples in the uv-space and results in a baseline-length-dependent loss of amplitude and phase coherence, which depends on the distance from the image phase centre. The effect is often referred to as `decorrelation' in the uv-space, which is equivalent in the source domain to `smearing'. This work discusses and implements a regular-gridded sampling scheme in the uv-space (baseline-dependent sampling) and windowing that allow for data compression, field-of-interest shaping, and source suppression. Baseline-dependent sampling requires irregular-gridded sampling in the time-frequency space, i.e. the time-frequency interval becomes baseline dependent. Analytic models and simulations are used to show that decorrelation remains constant across all baselines when applying baseline-dependent sampling and windowing. Simulations using the MeerKAT telescope and the European Very Long Baseline Interferometry Network show that data compression, field-of-interest shaping, and outer field-of-interest suppression are all achieved.

  4. Event- and interval-based measurement of stuttering: a review.

    PubMed

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review of the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge agreement values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups than for non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results.
An interval duration of 5 s appears to yield acceptable agreement. Explanations for the high reproducibility values, as well as the choice of parameters used to report those data, are discussed. Both interval- and event-based methodologies used trained or experienced judges for inter- and intra-judge determination, and the data were beyond the references for good reproducibility values. Inter- and intra-judge values were reported on different metric scales in the event- and interval-based studies, making it unfeasible to quantify the agreement between the two methods. © 2014 Royal College of Speech and Language Therapists.

  5. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... rates or total flow sampled into a batch sampling system over a test interval. You may use the... rates or total raw exhaust flow over a test interval. (b) Component requirements. We recommend that you... averaging Pitot tube, or a hot-wire anemometer. Note that your overall system for measuring sample flow must...

  6. Determinants of birth interval in a rural Mediterranean population (La Alpujarra, Spain).

    PubMed

    Polo, V; Luna, F; Fuster, V

    2000-10-01

    The fertility pattern, in terms of birth intervals, in a rural population not practicing contraception belonging to La Alta Alpujarra Oriental (southeast Spain) is analyzed. During the first half of the 20th century, this population experienced a considerable degree of geographical and cultural isolation. Because of this population's high variability in fertility and therefore in birth intervals, the analysis was limited to a homogeneous subsample of 154 families, each with at least five pregnancies. This limitation allowed us to analyze, among and within families, the effects of a set of variables on the interbirth pattern, and to avoid possible problems of pseudoreplication. Information on the birth date of the mother, age at marriage, children's birth and death dates, birth order, and frequency of miscarriages was collected. Our results indicate that interbirth intervals depend on an exponential effect of maternal age, which becomes especially significant after the age of 35. This effect is probably related to the biological degenerative processes of female fertility with age. A linear increase of birth intervals with birth order within families was found, as well as a reduction of intervals among families experiencing an infant death. Our sample size was insufficient to detect possible replacement behavior in the case of infant death. High natality and mortality rates, a secular decrease in natality rates, and log-normal birth-interval and family-size distributions suggest that La Alpujarra has been a natural-fertility population undergoing a demographic transition process.

  7. Dynamic response analysis of structure under time-variant interval process model

    NASA Astrophysics Data System (ADS)

    Xia, Baizhan; Qin, Yuan; Yu, Dejie; Jiang, Chao

    2016-10-01

    Due to the aggressiveness of environmental factors, the variation of the dynamic load, the degeneration of material properties and the wear of machine surfaces, parameters related to a structure are distinctly time-variant. The typical model for time-variant uncertainties is the random process model, which is constructed on the basis of a large number of samples. In this work, we propose a time-variant interval process model which can be effectively used to deal with time-variant uncertainties with limited information. Two methods are then presented for the dynamic response analysis of a structure under the time-variant interval process model. The first one is the direct Monte Carlo method (DMCM), whose computational burden is relatively high. The second one is the Monte Carlo method based on the Chebyshev polynomial expansion (MCM-CPE), whose computational efficiency is high. In MCM-CPE, the dynamic response of the structure is approximated by Chebyshev polynomials, which can be efficiently calculated, and the variational range of the dynamic response is then estimated from the samples yielded by the Monte Carlo method. To address the dependency phenomenon of interval arithmetic, affine arithmetic is integrated into the Chebyshev polynomial expansion. The computational effectiveness and efficiency of MCM-CPE are verified by two numerical examples, including a spring-mass-damper system and a shell structure.
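
A minimal sketch of the surrogate idea behind MCM-CPE: approximate an expensive response with a Chebyshev expansion, then estimate its variational range by cheap Monte Carlo sampling of the surrogate. The response function, polynomial degree, and interval are illustrative; the paper's affine-arithmetic treatment and the time-variant interval process itself are not reproduced.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def expensive_response(x):
    """Stand-in for a costly structural dynamic-response evaluation."""
    return np.sin(3 * x) + 0.3 * x ** 2

# Degree-8 Chebyshev surrogate built from 9 collocation points
# (Chebyshev nodes) on the interval [-1, 1].
nodes = np.cos(np.pi * (np.arange(9) + 0.5) / 9)
coeffs = C.chebfit(nodes, expensive_response(nodes), 8)

# Estimate the response's variational range by Monte Carlo sampling
# of the cheap surrogate instead of the expensive model.
samples = np.random.default_rng(2).uniform(-1.0, 1.0, 100_000)
approx = C.chebval(samples, coeffs)
lo, hi = approx.min(), approx.max()
```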

  8. Ehrenfest model with large jumps in finance

    NASA Astrophysics Data System (ADS)

    Takahashi, Hisanao

    2004-02-01

    Changes (returns) in stock index prices and exchange rates for currencies are argued, based on empirical data, to obey a stable distribution with characteristic exponent α<2 for short sampling intervals and a Gaussian distribution for long sampling intervals. To explain this phenomenon, an Ehrenfest model with large jumps (ELJ) is introduced to reproduce the empirical density function of price changes for both short and long sampling intervals.

  9. Effect of preliminary thermal treatment on decomposition kinetics of austenite in low-alloyed pipe steel in intercritical temperature interval

    NASA Astrophysics Data System (ADS)

    Makovetskii, A. N.; Tabatchikova, T. I.; Yakovleva, I. L.; Tereshchenko, N. A.; Mirzaev, D. A.

    2013-06-01

    The decomposition kinetics of austenite that appears in the 13KhFA low-alloyed pipe steel upon heating the samples in an intercritical temperature interval (ICI) and exposure for 5 or 30 min has been studied by the method of high-speed dilatometry. The results of dilatometry are supplemented by the microstructure analysis. Thermokinetic diagrams of the decomposition of the γ phase are represented. The conclusion has been drawn that an increase in the duration of exposure in the intercritical interval leads to a significant increase in the stability of the γ phase.

  10. Cost-effectiveness of one versus two sample faecal immunochemical testing for colorectal cancer screening.

    PubMed

    Goede, S Lucas; van Roon, Aafke H C; Reijerink, Jacqueline C I Y; van Vuuren, Anneke J; Lansdorp-Vogelaar, Iris; Habbema, J Dik F; Kuipers, Ernst J; van Leerdam, Monique E; van Ballegooijen, Marjolein

    2013-05-01

    The sensitivity and specificity of a single faecal immunochemical test (FIT) are limited. The performance of FIT screening can be improved by increasing the screening frequency or by providing more than one sample in each screening round. This study aimed to evaluate whether two-sample FIT screening is cost-effective compared with one-sample FIT. The MISCAN-colon microsimulation model was used to estimate the costs and benefits of strategies with either one- or two-sample FIT screening. The FIT cut-off level varied between 50 and 200 ng haemoglobin/ml, and the screening schedule was varied with respect to age range and interval. In addition, different definitions of positivity for the two-sample FIT were considered: at least one positive sample, two positive samples, or the mean of both samples being positive. Within an exemplary screening strategy, biennial FIT from the age of 55-75 years, one-sample FIT provided 76.0-97.0 life-years gained (LYG) per 1000 individuals, at a cost of € 259,000-264,000 (the range reflects different FIT cut-off levels). Two-sample FIT screening with at least one sample being positive provided 7.3-12.4 additional LYG compared with one-sample FIT, at an extra cost of € 50,000-59,000. However, when all screening intervals and age ranges were considered, intensifying screening with one-sample FIT provided equal or more LYG at lower costs compared with two-sample FIT. If attendance to screening does not differ between strategies, it is recommended to increase the number of screening rounds with one-sample FIT screening before considering increasing the number of FIT samples provided per screening round.
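
The incremental comparison reported above implies a cost per additional life-year that can be read off directly (a back-of-the-envelope calculation from the abstract's per-1000-individuals figures, not the model's full output):

```python
def icer(extra_cost, extra_lyg):
    """Incremental cost-effectiveness ratio: extra cost per
    additional life-year gained."""
    return extra_cost / extra_lyg

# Two-sample vs one-sample FIT, per 1000 individuals (abstract
# figures): € 50,000-59,000 extra cost for 7.3-12.4 extra LYG.
best = icer(50_000, 12.4)   # most favourable pairing of the ranges
worst = icer(59_000, 7.3)   # least favourable pairing
```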

  11. Cuing effects for informational masking

    NASA Astrophysics Data System (ADS)

    Richards, Virginia M.; Neff, Donna L.

    2004-01-01

    The detection of a tone added to a random-frequency, multitone masker can be very poor even when the maskers have little energy in the frequency region of the signal. This paper examines the effects of adding a pretrial cue to reduce uncertainty for the masker or the signal. The first two experiments examined the effect of cuing a fixed-frequency signal as the number of masker components and presentation methods were manipulated. Cue effectiveness varied across observers, but could reduce thresholds by as much as 20 dB. Procedural comparisons indicated observers benefited more from having two masker samples to compare, with or without a signal cue, than having a single interval with one masker sample and a signal cue. The third experiment used random-frequency signals and compared no-cue, signal-cue, and masker-cue conditions, and also systematically varied the time interval between cue offset and trial onset. Thresholds with a cued random-frequency signal remained higher than for a cued fixed-frequency signal. For time intervals between the cue and trial of 50 ms or longer, thresholds were approximately the same with a signal or a masker cue and lower than when there was no cue. Without a cue or with a masker cue, analyses of possible decision strategies suggested observers attended to the potential signal frequencies, particularly the highest signal frequency. With a signal cue, observers appeared to attend to the frequency of the subsequent signal.

  12. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    PubMed

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. 
We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
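The recommended approach, maximum-likelihood fitting of a parametric distribution to interval-censored retention times, can be sketched as follows. This is a simplified illustration with simulated lognormal data and a crude grid search standing in for a proper optimizer; all names and values are assumptions, not the authors' code:

```python
import math, random

def lognorm_cdf(x, mu, sigma):
    """CDF of the lognormal distribution via the error function."""
    if x <= 0:
        return 0.0
    return 0.5 * (1 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

def fit_interval_censored(edges, counts, mu_grid, sigma_grid):
    """Grid-search MLE: each observation is only known to lie in [edges[i], edges[i+1])."""
    best = (None, None, -float("inf"))
    for mu in mu_grid:
        for sigma in sigma_grid:
            ll = 0.0
            for i, n in enumerate(counts):
                p = lognorm_cdf(edges[i + 1], mu, sigma) - lognorm_cdf(edges[i], mu, sigma)
                ll += n * math.log(max(p, 1e-300))
            if ll > best[2]:
                best = (mu, sigma, ll)
    return best[:2]

# Simulate retention times (hours) and discretize at 0.5-h sampling intervals
random.seed(1)
times = [random.lognormvariate(1.0, 0.5) for _ in range(2000)]
edges = [0.5 * i for i in range(41)] + [float("inf")]
counts = [0] * (len(edges) - 1)
for t in times:
    for i in range(len(edges) - 1):
        if edges[i] <= t < edges[i + 1]:
            counts[i] += 1
            break

mu_grid = [0.5 + 0.02 * i for i in range(51)]     # mu in [0.5, 1.5]
sigma_grid = [0.3 + 0.02 * i for i in range(26)]  # sigma in [0.3, 0.8]
mu_hat, sigma_hat = fit_interval_censored(edges, counts, mu_grid, sigma_grid)
print(mu_hat, sigma_hat)  # should land near the true (1.0, 0.5)
```

Fitting the interval probabilities directly, rather than treating lower, mid or upper bounds as exact observations, is what avoids the biases the study reports.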

  13. Effects of High Intensity Interval Training on Increasing Explosive Power, Speed, and Agility

    NASA Astrophysics Data System (ADS)

    Fajrin, F.; Kusnanik, N. W.; Wijono

    2018-01-01

High Intensity Interval Training (HIIT) is a type of exercise that combines high-intensity and low-intensity exercise within a set time interval. This type of training is very effective and efficient for improving physical components. Improving athletes' achievement depends on improving these physical components, so selecting a good training method is very helpful. This study aims to analyze the effects of HIIT on increasing explosive power, speed, and agility. The research is quantitative, using quasi-experimental methods. The design of this study used the Matching-Only Design, with data analysis using the t-test (paired sample t-test). After the treatment was given for six weeks, the results showed significant increases in explosive power, speed, and agility. HIIT in this study used plyometric exercise as the high-intensity component and jogging as the mild- to moderate-intensity component. The increase was attributed to improvements in neuromuscular characteristics that affect muscle strength and performance. From the data analysis, the researchers concluded that High Intensity Interval Training significantly increases leg explosive power, speed, and agility.

  14. Changes in the saltwater interface corresponding to the installation of a seepage barrier near Lake Okeechobee, Florida

    USGS Publications Warehouse

    Prinos, Scott T.; Valderrama, Robert

    2015-01-01

    At five of the monitoring-well cluster locations, a long-screened well was also installed for monitoring and comparison purposes. These long-screened wells are 160 to 200 ft deep, and have open intervals ranging from 145 to 185 ft in length. Water samples were collected at depth intervals of about 5 to 10 ft, using 3-ft-long straddle packers to isolate each sampling interval. The results of monitoring conducted using these long-screened interval wells were generally too variable to identify any changes that might be associated with the seepage barrier. Samples from one of these long-screened interval wells failed to detect the saltwater interface evident in samples and TSEMIL datasets from a collocated well cluster. This failure may have been caused by downward flow of freshwater from above the saltwater interface in the well bore.

  15. Estimating fluvial wood discharge from timelapse photography with varying sampling intervals

    NASA Astrophysics Data System (ADS)

    Anderson, N. K.

    2013-12-01

There is recent focus on calculating wood budgets for streams and rivers to help inform management decisions, ecological studies and carbon/nutrient cycling models. Most work has measured in situ wood in temporary storage along stream banks or estimated wood inputs from banks. Little effort has been employed monitoring and quantifying wood in transport during high flows. This paper outlines a procedure for estimating total seasonal wood loads using non-continuous coarse interval sampling and examines differences in estimation between sampling at 1, 5, 10 and 15 minutes. Analysis is performed on wood transport for the Slave River in Northwest Territories, Canada. Relative to the 1-minute dataset, precision decreased by 23%, 46% and 60% for the 5, 10 and 15 minute datasets, respectively. Five and 10 minute sampling intervals provided unbiased equal variance estimates of 1 minute sampling, whereas 15 minute intervals were biased towards underestimation by 6%. Stratifying estimates by day and by discharge increased precision over non-stratification by 4% and 3%, respectively. Not including wood transported during ice break-up, the total minimum wood load estimated at this site is 3300 ± 800 m3 for the 2012 runoff season. The vast majority of the imprecision in total wood volumes came from variance in estimating average volume per log. Proportions and variance were compared across sample intervals using bootstrap sampling to achieve equal n: each trial was resampled at n = 100, 10,000 times, and averaged, and all trials were then averaged to obtain an estimate for each sample interval.
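The resampling-to-equal-n step described above can be sketched as follows. A hedged illustration with made-up exponential "wood count" data, not the study's records; defaults mirror the stated n = 100 draws repeated many times:

```python
import random
random.seed(42)

def bootstrap_mean(data, n=100, trials=10_000):
    """Average of `trials` bootstrap means, each computed from n draws with
    replacement, so intervals with different frame counts compare at equal n."""
    total = 0.0
    for _ in range(trials):
        total += sum(random.choice(data) for _ in range(n)) / n
    return total / trials

# Hypothetical per-frame wood counts at two sampling intervals (illustrative only)
counts_1min = [random.expovariate(0.5) for _ in range(600)]
counts_15min = [random.expovariate(0.5) for _ in range(40)]

m1 = bootstrap_mean(counts_1min)
m15 = bootstrap_mean(counts_15min)
print(m1, m15)
```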

  16. Confidence intervals for correlations when data are not normal.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2017-02-01

With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval, for example leading to a 95% confidence interval that had actual coverage as low as 68%. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoidance of the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in supplementary materials.
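For reference, the Fisher z' interval whose coverage degrades under nonnormality is straightforward to compute: transform r with atanh, add a normal margin with standard error 1/sqrt(n - 3), and back-transform. A minimal stdlib sketch (this assumes bivariate normality, which is exactly the assumption the study shows can fail):

```python
import math

def fisher_ci(r, n, z_crit=1.96):
    """Fisher z' confidence interval for a correlation coefficient."""
    z = math.atanh(r)
    half = z_crit / math.sqrt(n - 3)
    return math.tanh(z - half), math.tanh(z + half)

lo, hi = fisher_ci(0.5, 50)
print(f"95% CI for r=0.5, n=50: ({lo:.3f}, {hi:.3f})")
```

Note the interval is asymmetric about r because of the back-transformation.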

  17. Estimating statistical uncertainty of Monte Carlo efficiency-gain in the context of a correlated sampling Monte Carlo code for brachytherapy treatment planning with non-normal dose distribution.

    PubMed

    Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr

    2012-01-01

Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
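The "shortest 95% confidence interval from a bootstrap distribution" step can be sketched generically: sort the bootstrap replicates and slide a window covering 95% of them, keeping the narrowest. Below, a ratio of two sample variances stands in for the efficiency-gain statistic; the data and statistic are illustrative, not the paper's dosimetry quantities:

```python
import random
random.seed(7)

def shortest_interval(samples, coverage=0.95):
    """Narrowest window containing `coverage` of the bootstrap replicates."""
    s = sorted(samples)
    k = int(coverage * len(s))
    i = min(range(len(s) - k + 1), key=lambda j: s[j + k - 1] - s[j])
    return s[i], s[i + k - 1]

def var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / (len(x) - 1)

# Illustrative stand-in for an efficiency-gain-like statistic: a variance ratio
a = [random.gauss(0, 1) for _ in range(200)]
b = [random.gauss(0, 2) for _ in range(200)]

reps = []
for _ in range(2000):
    ra = [random.choice(a) for _ in range(len(a))]
    rb = [random.choice(b) for _ in range(len(b))]
    reps.append(var(rb) / var(ra))

lo, hi = shortest_interval(reps)
print(lo, hi)
```

The shortest interval is preferred over the equal-tailed percentile interval precisely when the bootstrap distribution is skewed, as reported here.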

  18. Temporal Doppler Effect and Future Orientation: Adaptive Function and Moderating Conditions.

    PubMed

    Gan, Yiqun; Miao, Miao; Zheng, Lei; Liu, Haihua

    2017-06-01

    The objectives of this study were to examine whether the temporal Doppler effect exists in different time intervals and whether certain individual and environmental factors act as moderators of the effect. Using hierarchical linear modeling, we examined the existence of the temporal Doppler effect and the moderating effect of future orientation among 139 university students (Study 1), and then the moderating conditions of the temporal Doppler effect using two independent samples of 143 and 147 university students (Studies 2 and 3). Results indicated that the temporal Doppler effect existed in all of our studies, and that future orientation moderated the temporal Doppler effect. Further, time interval perception mediated the relationship between future orientation and the motivation to cope at long time intervals. Finally, positive affect was found to enhance the temporal Doppler effect, whereas control deprivation did not influence the effect. The temporal Doppler effect is moderated by the personality trait of future orientation and by the situational variable of experimentally manipulated positive affect. We have identified personality and environmental processes that could enhance the temporal Doppler effect, which could be valuable in cases where attention to a future task is necessary. © 2016 Wiley Periodicals, Inc.

  19. A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn

    2006-01-01

    A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…

  20. Pigeons exhibit higher accuracy for chosen memory tests than for forced memory tests in duration matching-to-sample.

    PubMed

    Adams, Allison; Santi, Angelo

    2011-03-01

    Following training to match 2- and 8-sec durations of feederlight to red and green comparisons with a 0-sec baseline delay, pigeons were allowed to choose to take a memory test or to escape the memory test. The effects of sample omission, increases in retention interval, and variation in trial spacing on selection of the escape option and accuracy were studied. During initial testing, escaping the test did not increase as the task became more difficult, and there was no difference in accuracy between chosen and forced memory tests. However, with extended training, accuracy for chosen tests was significantly greater than for forced tests. In addition, two pigeons exhibited higher accuracy on chosen tests than on forced tests at the short retention interval and greater escape rates at the long retention interval. These results have not been obtained in previous studies with pigeons when the choice to take the test or to escape the test is given before test stimuli are presented. It appears that task-specific methodological factors may determine whether a particular species will exhibit the two behavioral effects that were initially proposed as potentially indicative of metacognition.

  1. Hydrogeology and water quality of the Dublin and Midville aquifer systems at Waynesboro, Burke County, Georgia, 2011

    USGS Publications Warehouse

    Gonthier, Gerard

    2013-01-01

The hydrogeology and water quality of the Dublin and Midville aquifer systems were characterized in the City of Waynesboro area in Burke County, Georgia, based on geophysical and drillers’ logs, flowmeter surveys, a 24-hour aquifer test, and the collection and chemical analysis of water samples in a newly constructed well. At the test site, the Dublin aquifer system consists of interlayered sands and clays between depths of 396 and 691 feet, and the Midville aquifer system consists of a sandy clay layer overlying a sand and gravel layer between depths of 728 and 936 feet. The new well was constructed with three screened intervals in the Dublin aquifer system and four screened intervals in the Midville aquifer system. Wellbore-flowmeter testing at a pumping rate of 1,000 gallons per minute indicated that 52.2 percent of the total flow was from the shallower Dublin aquifer system with the remaining 47.8 percent from the deeper Midville aquifer system. The lower part of the lower Midville aquifer (900 to 930 feet deep) contributed only 0.1 percent of the total flow. Hydraulic properties of the two aquifer systems were estimated using data from two wellbore-flowmeter surveys and a 24-hour aquifer test. Estimated values of transmissivity for the Dublin and Midville aquifer systems were 2,000 and 1,000 feet squared per day, respectively. The upper and lower Dublin aquifers have a combined thickness of about 150 feet and the horizontal hydraulic conductivity of the Dublin aquifer system averages 10 feet per day. The upper Midville aquifer, lower Midville confining unit, and lower Midville aquifer have a combined thickness of about 210 feet, and the horizontal hydraulic conductivity of the Midville aquifer system averages 6 feet per day. Storage coefficient of the Dublin aquifer system, computed using the Theis method on water-level data from one observation well, was estimated to be 0.0003.
With a thickness of about 150 feet, the specific storage of the Dublin aquifer system averages about 2×10^-6 per foot. Water quality of the Dublin and Midville aquifer systems was characterized on the basis of water samples collected during the aquifer test from composite well flow originating from five depths in the completed production well. Samples were analyzed for total dissolved solids, specific conductance, pH, alkalinity, and major ions. Water-quality results from composite samples, known flow contribution from individual screens, and a mixing equation were used to calculate water-quality values for sample intervals between sample depths or below the bottom sample depth. With the exception of iron and manganese, constituent concentrations of water from each of the sampled intervals and total flow from the well were within U.S. Environmental Protection Agency primary and secondary drinking-water standards. Water from the bottommost sample interval in the lower part of the lower Midville aquifer (900 to 930 feet) contained manganese and iron concentrations of 59.1 and 1,160 micrograms per liter, respectively, which exceeded secondary drinking-water standards. Because this interval contributed only 0.1 percent of the total flow to the well, water quality of this interval had little effect on the composite well water quality. Two other sample intervals from the Midville aquifer system and the total flow from both aquifer systems contained iron concentrations that slightly exceeded the secondary drinking-water standard of 300 micrograms per liter.
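The "mixing equation" step can be sketched as a flow-weighted mass balance, under the assumption (ours, for illustration) that the pumped composite at each sample depth is a mixture of all inflow entering the well below that depth; all numbers are illustrative, not the report's data:

```python
def interval_concentration(q_deep, c_deep, q_shallow, c_shallow):
    """Concentration contributed by the screen between two sample depths, from
    the mass balance  c_shallow*q_shallow = c_deep*q_deep + c_int*(q_shallow - q_deep)."""
    return (c_shallow * q_shallow - c_deep * q_deep) / (q_shallow - q_deep)

# Illustrative: 47.8% of a 1,000 gal/min total enters below the upper sample point
print(interval_concentration(q_deep=478, c_deep=120.0, q_shallow=1000, c_shallow=80.0))
```

Rearranging the forward mixture this way is why the flow fraction of each screen (from the flowmeter survey) is needed alongside the composite chemistry.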

  2. Fixed-interval matching-to-sample: intermatching time and intermatching error runs1

    PubMed Central

    Nelson, Thomas D.

    1978-01-01

    Four pigeons were trained on a matching-to-sample task in which reinforcers followed either the first matching response (fixed interval) or the fifth matching response (tandem fixed-interval fixed-ratio) that occurred 80 seconds or longer after the last reinforcement. Relative frequency distributions of the matching-to-sample responses that concluded intermatching times and runs of mismatches (intermatching error runs) were computed for the final matching responses directly followed by grain access and also for the three matching responses immediately preceding the final match. Comparison of these two distributions showed that the fixed-interval schedule arranged for the preferential reinforcement of matches concluding relatively extended intermatching times and runs of mismatches. Differences in matching accuracy and rate during the fixed interval, compared to the tandem fixed-interval fixed-ratio, suggested that reinforcers following matches concluding various intermatching times and runs of mismatches influenced the rate and accuracy of the last few matches before grain access, but did not control rate and accuracy throughout the entire fixed-interval period. PMID:16812032

  3. The effect of short-duration sprint interval exercise on plasma postprandial triacylglycerol levels in young men.

    PubMed

    Allen, Edward; Gray, Partick; Kollias-Pearson, Angeliki; Oag, Erlend; Pratt, Katrina; Henderson, Jennifer; Gray, Stuart Robert

    2014-01-01

It is well established that regular exercise can reduce the risk of cardiovascular disease, although the most time-efficient exercise protocol to confer benefits has yet to be established. The aim of the current study was to determine the effects of short-duration sprint interval exercise on postprandial triacylglycerol. Fifteen healthy male participants completed two 2-day trials. On day 1, participants rested (control) or carried out twenty 6-s sprints, interspersed with 24 s of recovery (sprint interval exercise: 14 min total exercise session). On day 2, participants consumed a high-fat meal for breakfast, with blood samples collected at baseline, 2 h and 4 h. Gas exchange was also measured at these time points. On day 2 of the control and sprint interval exercise trials, there were no differences (P > 0.05) between trials in plasma glucose, triacylglycerol, insulin or respiratory exchange ratio (RER). The area under the curve for plasma triacylglycerol was 7.67 ± 2.37 mmol·l^-1·4 h in the control trial and 7.26 ± 2.49 mmol·l^-1·4 h in the sprint interval exercise trial. Although the sprint exercise protocol employed had no significant effect on postprandial triacylglycerol, there was a clear variability in responses that warrants further investigation.

  4. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    PubMed

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
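The idea behind sampling windows can be illustrated without MCMC: score candidate times by design efficiency and keep the times whose efficiency stays above a threshold. A hedged sketch for a hypothetical one-compartment bolus model C(t) = (Dose/V)·exp(-k·t) with a two-point design; the paper's actual method handles full nonlinear mixed effects models, and every value below is an assumption:

```python
import math

DOSE, V, K = 100.0, 20.0, 0.2  # hypothetical parameter values

def grad(t):
    """Sensitivities of C(t) = (DOSE/V)*exp(-K*t) with respect to (V, K)."""
    c = (DOSE / V) * math.exp(-K * t)
    return (-c / V, -c * t)

def det_fim(times):
    """Determinant of the 2x2 Fisher information matrix (unit error variance)."""
    a = b = d = 0.0
    for t in times:
        gv, gk = grad(t)
        a += gv * gv
        b += gv * gk
        d += gk * gk
    return a * d - b * b

grid = [0.25 * i for i in range(1, 97)]  # candidate times, 0.25-24 h

# D-optimal two-point design by exhaustive search over the grid
d_opt, t1_opt, t2_opt = max(
    (det_fim((t1, t2)), t1, t2) for t1 in grid for t2 in grid if t1 < t2
)

# Sampling window for the later time: keep times with D-efficiency >= 90%
window = [t for t in grid
          if t > t1_opt and math.sqrt(det_fim((t1_opt, t)) / d_opt) >= 0.9]
print(t1_opt, t2_opt, (min(window), max(window)))
```

The window gives clinicians a whole interval for the blood draw instead of a single optimal minute, at a bounded loss of estimation efficiency.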

  5. Design tradeoffs for trend assessment in aquatic biological monitoring programs

    USGS Publications Warehouse

    Gurtz, Martin E.; Van Sickle, John; Carlisle, Daren M.; Paulsen, Steven G.

    2013-01-01

    Assessments of long-term (multiyear) temporal trends in biological monitoring programs are generally undertaken without an adequate understanding of the temporal variability of biological communities. When the sources and levels of variability are unknown, managers cannot make informed choices in sampling design to achieve monitoring goals in a cost-effective manner. We evaluated different trend sampling designs by estimating components of both short- and long-term variability in biological indicators of water quality in streams. Invertebrate samples were collected from 32 sites—9 urban, 6 agricultural, and 17 relatively undisturbed (reference) streams—distributed throughout the United States. Between 5 and 12 yearly samples were collected at each site during the period 1993–2008, plus 2 samples within a 10-week index period during either 2007 or 2008. These data allowed calculation of four sources of variance for invertebrate indicators: among sites, among years within sites, interaction among sites and years (site-specific annual variation), and among samples collected within an index period at a site (residual). When estimates of these variance components are known, changes to sampling design can be made to improve trend detection. Design modifications that result in the ability to detect the smallest trend with the fewest samples are, from most to least effective: (1) increasing the number of years in the sampling period (duration of the monitoring program), (2) decreasing the interval between samples, and (3) increasing the number of repeat-visit samples per year (within an index period). This order of improvement in trend detection, which achieves the greatest gain for the fewest samples, is the same whether trends are assessed at an individual site or an average trend of multiple sites. 
In multiple-site surveys, increasing the number of sites has an effect similar to that of decreasing the sampling interval; the benefit of adding sites is greater when a new set of different sites is selected for each sampling effort than when the same sites are sampled each time. Understanding variance components of the ecological attributes of interest can lead to more cost-effective monitoring designs to detect trends.
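The ranking of design changes above can be illustrated with a back-of-envelope variance formula for an OLS trend slope at a single site. A sketch assuming yearly means have variance σ²(site×year) + σ²(residual)/m with m repeat visits per year; the unit variance components and design numbers are illustrative, not the study's estimates:

```python
def slope_variance(duration, interval, m, var_siteyear=1.0, var_resid=1.0):
    """Variance of an OLS trend slope at one site, sampling every `interval`
    years over `duration` years, with m repeat visits per sampled year."""
    times = list(range(0, duration + 1, interval))
    tbar = sum(times) / len(times)
    s_tt = sum((t - tbar) ** 2 for t in times)
    return (var_siteyear + var_resid / m) / s_tt

base = slope_variance(duration=10, interval=2, m=1)
longer = slope_variance(duration=20, interval=2, m=1)   # double the duration
denser = slope_variance(duration=10, interval=1, m=1)   # halve the interval
revisit = slope_variance(duration=10, interval=2, m=2)  # double within-year visits
print(base, longer, denser, revisit)
```

Because the spread of sampling times enters the denominator roughly cubically, extending the duration lowers the slope variance most, halving the interval helps next, and extra revisits help least, matching the ordering reported above.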

  6. Appropriate time scales for nonlinear analyses of deterministic jump systems

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya

    2011-06-01

    In the real world, there are many phenomena that are derived from deterministic systems but which fluctuate with nonuniform time intervals. This paper discusses the appropriate time scales that can be applied to such systems to analyze their properties. The financial markets are an example of such systems wherein price movements fluctuate with nonuniform time intervals. However, it is common to apply uniform time scales such as 1-min data and 1-h data to study price movements. This paper examines the validity of such time scales by using surrogate data tests to ascertain whether the deterministic properties of the original system can be identified from uniform sampled data. The results show that uniform time samplings are often inappropriate for nonlinear analyses. However, for other systems such as neural spikes and Internet traffic packets, which produce similar outputs, uniform time samplings are quite effective in extracting the system properties. Nevertheless, uniform samplings often generate overlapping data, which can cause false rejections of surrogate data tests.
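A minimal sketch of a surrogate data test for determinism, using simple shuffle surrogates (which preserve the amplitude distribution but destroy temporal structure) and a nearest-neighbour prediction statistic; the paper's analyses use more refined surrogates, so this is only an illustration of the testing logic:

```python
import random
random.seed(0)

def predict_error(x):
    """Mean one-step nearest-neighbour prediction error: a simple determinism statistic."""
    err = 0.0
    for i in range(len(x) - 1):
        # nearest neighbour in value, excluding i and its immediate neighbours
        j = min((k for k in range(len(x) - 1) if abs(k - i) > 1),
                key=lambda k: abs(x[k] - x[i]))
        err += abs(x[j + 1] - x[i + 1])
    return err / (len(x) - 1)

# Deterministic test series: the logistic map
x = [0.4]
for _ in range(199):
    x.append(3.9 * x[-1] * (1 - x[-1]))

orig = predict_error(x)
surrogates = []
for _ in range(19):
    s = x[:]
    random.shuffle(s)  # destroys temporal structure, keeps the amplitude distribution
    surrogates.append(predict_error(s))

print(orig, min(surrogates))
```

If the original statistic falls outside the surrogate distribution, the null hypothesis behind the surrogates is rejected; the false rejections mentioned above arise when resampling artifacts (such as overlapping data) bias this comparison.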

  7. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline, and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined. Gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference-interval transference technique based on a method comparison study.
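The "95% central range" definition can be sketched nonparametrically: sort the reference values and take the ranks bracketing the central 95%. A minimal illustration with simulated donor results (the rank formula follows the common p·(n+1) convention; real guideline work adds outlier screening and partitioning checks):

```python
import random
random.seed(3)

def reference_interval(values, central=0.95):
    """Nonparametric reference interval: the central 95% of the sorted reference
    values, using the rank formula r = p*(n+1)."""
    s = sorted(values)
    n = len(s)
    lo_rank = int(round((1 - central) / 2 * (n + 1)))
    hi_rank = int(round((1 + central) / 2 * (n + 1)))
    return s[max(lo_rank - 1, 0)], s[min(hi_rank - 1, n - 1)]

# 120 illustrative healthy-donor results for a single analyte
vals = [random.gauss(100, 10) for _ in range(120)]
lo, hi = reference_interval(vals)
print(lo, hi)
```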

  8. Discrimination of edible oils and fats by combination of multivariate pattern recognition and FT-IR spectroscopy: A comparative study between different modeling methods

    NASA Astrophysics Data System (ADS)

    Javidnia, Katayoun; Parish, Maryam; Karimi, Sadegh; Hemmateenejad, Bahram

    2013-03-01

By using FT-IR spectroscopy, many researchers from different disciplines enrich the experimental scope of their research to obtain more precise information, and chemometrics techniques have boosted the use of IR instruments. In the present study we aimed to emphasize the power of FT-IR spectroscopy for discrimination between different oil samples (especially fat from vegetable oils), and to use our data to compare the performance of different classification methods. FT-IR transmittance spectra of oil samples (corn, canola, sunflower, soya, olive, and butter) were measured in the wave-number interval of 450-4000 cm^-1. Classification analysis was performed utilizing PLS-DA, interval PLS-DA, extended canonical variate analysis (ECVA) and interval ECVA (iECVA) methods. The effect of data preprocessing by extended multiplicative signal correction was investigated. While all of the employed methods could distinguish butter from vegetable oils, iECVA gave the best performance for the calibration and external test sets, with 100% sensitivity and specificity.

  9. Doppler-corrected differential detection system

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1991-01-01

Doppler in a communication system operating with a multiple differential phase-shift-keyed format (MDPSK) creates an adverse phase shift in an incoming signal. An open loop frequency estimation is derived from a Doppler-contaminated incoming signal. It is based upon the recognition that, whereas the change in phase of the received signal over a full symbol contains both the differentially encoded data and the Doppler induced phase shift, the same change in phase over half a symbol (within a given symbol interval) contains only the Doppler induced phase shift, so the Doppler effect can be estimated and removed from the incoming signal. Doppler correction occurs prior to the receiver's final output of decoded data. A multiphase system can operate with two samplings per symbol interval at no penalty in signal-to-noise ratio provided that an ideal low pass pre-detection filter is employed, and two samples, at 1/4 and 3/4 of the symbol interval T sub s, are taken and summed together prior to incoming signal data detection.
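The half-symbol estimation idea can be sketched in a few lines: because the data phase is constant within a symbol, the phase rotation between the samples at Ts/4 and 3Ts/4 is attributable to Doppler alone. A noiseless illustration with made-up numbers (the estimate is unambiguous as long as the half-symbol rotation stays below pi, i.e. |f_d| < 1/Ts):

```python
import cmath, math

Ts = 1e-3    # symbol duration in seconds (illustrative)
f_d = 400.0  # Doppler shift in Hz (illustrative)

def rx_sample(t, data_phase):
    """Received MDPSK sample: constant data phase plus a Doppler phase ramp."""
    return cmath.exp(1j * (data_phase + 2 * math.pi * f_d * t))

# Within one symbol the data phase is constant, so the phase change between the
# samples taken at Ts/4 and 3Ts/4 is due to Doppler alone:
s1 = rx_sample(Ts / 4, data_phase=math.pi / 2)
s2 = rx_sample(3 * Ts / 4, data_phase=math.pi / 2)
f_hat = cmath.phase(s2 * s1.conjugate()) / (2 * math.pi * (Ts / 2))
print(f_hat)
```

Over a full symbol the same phase difference would also contain the encoded data, which is exactly why the half-symbol spacing isolates the Doppler term.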

  10. Pigeons' Choices between Fixed-Interval and Random-Interval Schedules: Utility of Variability?

    ERIC Educational Resources Information Center

    Andrzejewski, Matthew E.; Cardinal, Claudia D.; Field, Douglas P.; Flannery, Barbara A.; Johnson, Michael; Bailey, Kathleen; Hineline, Philip N.

    2005-01-01

    Pigeons' choosing between fixed-interval and random-interval schedules of reinforcement was investigated in three experiments using a discrete-trial procedure. In all three experiments, the random-interval schedule was generated by sampling a probability distribution at an interval (and in multiples of the interval) equal to that of the…
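The truncated description of the random-interval schedule above can be sketched as a Bernoulli draw at every multiple of the base interval; a hedged illustration with made-up values (the actual probabilities and intervals in the experiments are not given here):

```python
import random
random.seed(11)

def random_interval_times(base, p, horizon):
    """Arm a reinforcer by sampling probability p at every multiple of `base`
    seconds; the expected interval between reinforcers is base/p."""
    return [t * base for t in range(1, int(horizon / base) + 1) if random.random() < p]

times = random_interval_times(base=5.0, p=0.25, horizon=10_000)
gaps = [b - a for a, b in zip([0.0] + times, times)]
print(sum(gaps) / len(gaps))  # long-run mean interval near base/p = 20 s
```

Matching base/p to the fixed-interval value equates the mean intervals of the two schedules while only the random one carries variability, which is the comparison the study exploits.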

  11. Effect of dietary crude proteins on the reproductive function in the postpartum dairy cow.

    PubMed

    Benaich, S; Guerouali, A; Belahsen, R; Mokhtar, N; Aguenaou, H

    1999-01-01

The study was conducted on 216 dairy cows. Samples of the feeds distributed to the cows were collected monthly to determine their dry matter, energy, crude protein, and mineral content. Milk samples were collected weekly from every cow, from calving until confirmation of pregnancy by rectal palpation at least 2 months after artificial insemination. These samples were used for progesterone assays in skimmed milk, in order to assess the intervals between calving and return to ovarian activity [C-ROA], calving and first insemination [C-I1], and calving and conception [C-C], and the number of inseminations per conception (nI/C). Results showed a significant negative correlation between the duration of the [C-ROA] and [C-C] intervals and the dietary crude protein content (r = -0.720, p < 0.05 and r = -0.914, p < 0.01, respectively).

  12. Streamflow and suspended-sediment transport in Garvin Brook, Winona County, southeastern Minnesota: Hydrologic data for 1982

    USGS Publications Warehouse

    Payne, G.A.

    1983-01-01

    Streamflow and suspended-sediment-transport data were collected in Garvin Brook watershed in Winona County, southeastern Minnesota, during 1982. The data collection was part of a study to determine the effectiveness of agricultural best-management practices designed to improve rural water quality. The study is part of a Rural Clean Water Program demonstration project undertaken by the U.S. Department of Agriculture. Continuous streamflow data were collected at three gaging stations during March through September 1982. Suspended-sediment samples were collected at two of the gaging stations. Samples were collected manually at weekly intervals. During periods of rapidly changing stage, samples were collected at 30-minute to 12-hour intervals by stage-activated automatic samplers. The samples were analyzed for suspended-sediment concentration and particle-size distribution. Particle-size distributions were also determined for one set of bed-material samples collected at each sediment-sampling site. The streamflow and suspended-sediment-concentration data were used to compute records of mean-daily flow, mean-daily suspended-sediment concentration, and daily suspended-sediment discharge. The daily records are documented and results of analyses for particle-size distribution and of vertical sampling in the stream cross sections are given.

  13. PRIMUS/NAVCARE Cost-Effectiveness Analysis

    DTIC Science & Technology

    1991-04-08

    ICD-9-CM diagnosis codes that occurred most frequently in the medical record sample - 328.9 (otitis media, unspecified) and 465.9 (upper...when attention is focused upon a single diagnosis, the MTF CECs are no longer consistently above the PRIMUS CECs. For otitis media, the MTF CECs are...CHAMPUS-EQUIVALENT COSTS FOR SELECTED DIAGNOSES 328.9 OTITIS MEDIA, UNSPECIFIED Sample Size Mean 95% Confidence Interval Upper Limit Lower

  14. Confidence intervals from single observations in forest research

    Treesearch

    Harry T. Valentine; George M. Furnival; Timothy G. Gregoire

    1991-01-01

    A procedure for constructing confidence intervals and testing hypotheses from a single trial or observation is reviewed. The procedure requires a prior, fixed estimate or guess of the outcome of an experiment or sampling. Two examples of applications are described: a confidence interval is constructed for the expected outcome of a systematic sampling of a forested tract...

  15. Multiport well design for sampling of ground water at closely spaced vertical intervals

    USGS Publications Warehouse

    Delin, G.N.; Landon, M.K.

    1996-01-01

    Detailed vertical sampling is useful in aquifers where vertical mixing is limited and steep vertical gradients in chemical concentrations are expected. Samples can be collected at closely spaced vertical intervals from nested wells with short screened intervals. However, this approach may not be appropriate in all situations. An easy-to-construct and easy-to-install multiport sampling well to collect ground-water samples from closely spaced vertical intervals was developed and tested. The multiport sampling well was designed to sample ground water from surficial sand-and-gravel aquifers. The device consists of multiple stainless-steel tubes within a polyvinyl chloride (PVC) protective casing. The tubes protrude through the wall of the PVC casing at the desired sampling depths. A peristaltic pump is used to collect ground-water samples from the sampling ports. The difference in hydraulic head between any two sampling ports can be measured with a vacuum pump and a modified manometer. The usefulness and versatility of this multiport well design were demonstrated at an agricultural research site near Princeton, Minnesota, where sampling ports were installed to a maximum depth of about 12 m below land surface. Tracer experiments were conducted using potassium bromide to document the degree to which short-circuiting occurred between sampling ports. Samples were successfully collected for analysis of major cations and anions, nutrients, selected herbicides, isotopes, dissolved gases, and chlorofluorocarbon concentrations.

  16. Effects of active recovery during interval training on plasma catecholamines and insulin.

    PubMed

    Nalbandian, Harutiun M; Radak, Zsolt; Takeda, Masaki

    2018-06-01

    BACKGROUND: Active recovery has been used as a method to accelerate recovery during intense exercise. It has also been shown to improve performance in subsequent exercises, but little is known about its acute effects on the hormonal and metabolic profile. The aim of this research was to study the effects of active recovery on plasma catecholamines and plasma insulin during a high-intensity interval exercise. METHODS: Seven subjects performed two high-intensity interval training protocols which consisted of three 30-second high-intensity bouts (constant intensity), separated by a recovery of 4 minutes. The recovery was either active or passive. During the main test, blood samples were collected and plasma insulin, plasma catecholamines and blood lactate were determined. Furthermore, respiratory gases were also measured. RESULTS: Plasma insulin and blood lactate were significantly higher in the passive recovery trial, while plasma adrenaline was higher in the active recovery trial. Additionally, VO2 and VCO2 were significantly higher during the active recovery trials. CONCLUSIONS: These results suggest that active recovery affects the hormonal and metabolic responses to high-intensity interval exercise. Active recovery produces a hormonal environment which may favor lipolysis and oxidative metabolism, while passive recovery may favor glycolysis.

  17. Confidence intervals for population allele frequencies: the general case of sampling from a finite diploid population of any size.

    PubMed

    Fung, Tak; Keenan, Kevin

    2014-01-01

    The estimation of population allele frequencies using sample data forms a central component of studies in population genetics. These estimates can be used to test hypotheses on the evolutionary processes governing changes in genetic variation among populations. However, existing studies frequently do not account for sampling uncertainty in these estimates, thus compromising their utility. Incorporation of this uncertainty has been hindered by the lack of a method for constructing confidence intervals containing the population allele frequencies, for the general case of sampling from a finite diploid population of any size. In this study, we address this important knowledge gap by presenting a rigorous mathematical method to construct such confidence intervals. For a range of scenarios, the method is used to demonstrate that for a particular allele, in order to obtain accurate estimates within 0.05 of the population allele frequency with high probability (> or = 95%), a sample size of > 30 is often required. This analysis is augmented by an application of the method to empirical sample allele frequency data for two populations of the checkerspot butterfly (Melitaea cinxia L.), occupying meadows in Finland. For each population, the method is used to derive > or = 98.3% confidence intervals for the population frequencies of three alleles. These intervals are then used to construct two joint > or = 95% confidence regions, one for the set of three frequencies for each population. These regions are then used to derive a > or = 95% confidence interval for Jost's D, a measure of genetic differentiation between the two populations. Overall, the results demonstrate the practical utility of the method with respect to informing sampling design and accounting for sampling uncertainty in studies of population genetics, important for scientific hypothesis-testing and also for risk-based natural resource management.
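    For intuition about the sample sizes quoted above, a much simpler stand-in (the standard Wilson score interval for a binomial proportion, which ignores the paper's finite diploid population structure; the counts below are made up) already shows that n = 30 gives an interval far wider than +/- 0.05:

```python
from math import sqrt

def wilson_interval(k, n, z=1.96):
    """Wilson score CI for a proportion; a simplified illustration,
    not the finite-population method of the paper."""
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# 12 copies of an allele observed among n = 30 sampled gene copies
lo, hi = wilson_interval(12, 30)   # roughly (0.25, 0.58)
```

The half-width here is about 0.17, so even under this idealized binomial model a sample of 30 falls well short of +/- 0.05 accuracy.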

  18. Observer-based output feedback control of networked control systems with non-uniform sampling and time-varying delay

    NASA Astrophysics Data System (ADS)

    Meng, Su; Chen, Jie; Sun, Jian

    2017-10-01

    This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.

  19. Reliability of confidence intervals calculated by bootstrap and classical methods using the FIA 1-ha plot design

    Treesearch

    H. T. Schreuder; M. S. Williams

    2000-01-01

    In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
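    A minimal sketch of this kind of comparison (the plot data, gamma distribution, and 2,000 resamples are assumptions for illustration; none of this comes from the FIA populations):

```python
import numpy as np

# Percentile-bootstrap vs classical-t confidence intervals for a mean.
rng = np.random.default_rng(1)
plot_volumes = rng.gamma(shape=2.0, scale=50.0, size=40)  # 40 synthetic plots

# Percentile bootstrap: resample plots with replacement, take means
boot_means = np.array([
    rng.choice(plot_volumes, size=plot_volumes.size, replace=True).mean()
    for _ in range(2000)
])
pct_lo, pct_hi = np.percentile(boot_means, [2.5, 97.5])

# Classical t interval for comparison
xbar = plot_volumes.mean()
se = plot_volumes.std(ddof=1) / np.sqrt(plot_volumes.size)
t_crit = 2.023                      # t quantile, 0.975, 39 df
t_lo, t_hi = xbar - t_crit * se, xbar + t_crit * se
```

With skewed plot data the percentile interval is typically asymmetric about the sample mean, which is exactly the behavior such simulation comparisons examine.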

  20. Assessing the effects of land use changes on soil sensitivity to erosion in a highland ecosystem of semi-arid Turkey.

    PubMed

    Bayramin, Ilhami; Basaran, Mustafa; Erpul, Günay; Canga, Mustafa R

    2008-05-01

    There has been increasing concern in highlands of semiarid Turkey that conversion of these systems results in excessive soil erosion, ecosystem degradation, and loss of sustainable resources. An increasing rate of land use/cover changes, especially in semiarid mountainous areas, has resulted in important effects on physical and ecological processes, causing many regions to undergo accelerated environmental degradation in terms of soil erosion, mass movement, and reservoir sedimentation. This paper, therefore, explores the impact of land use changes on land degradation in linkage to the soil erodibility, RUSLE-K, in Cankiri-Indagi Mountain Pass, Turkey. The characterization of soil erodibility in this ecosystem is important from the standpoint of conserving fragile ecosystems and planning management practices. Five adjacent land uses (cropland, grassland, woodland, plantation, and recreational land) were selected for this research. Analysis of variance showed that soil properties and RUSLE-K changed statistically with land use, and that soils of the recreational land and cropland were more sensitive to water erosion than those of the woodland, grassland, and plantation. This was mainly due to the significant decreases in soil organic matter (SOM) and hydraulic conductivity (HC) in those lands. Additionally, soil samples randomly collected from the depths of 0-10 cm (D1) and 10-20 cm (D2) at irregular intervals in an area of 1,200 by 4,200 m sufficiently characterized not only the spatial distribution of soil organic matter (SOM), hydraulic conductivity (HC), clay (C), silt (Si), sand (S), and silt plus very fine sand (Si + VFS), but also the spatial distribution of RUSLE-K as an algebraic estimate of these parameters, together with field assessment of soil structure, to assess the dynamic relationships between soil properties and land use types.
In this study, in order to perform the spatial analyses, the mean sampling intervals were 43, 50, 64, 78, and 85 m for woodland, plantation, grassland, recreation, and cropland, with sample numbers of 56, 79, 72, 13, and 69, respectively, resulting in an average interval of 64 m for the whole study area. Although the nugget effect and the nugget effect-sill ratio gave an idea of the adequacy of the sampling design, better results would likely be obtained by both equi-probable spatial sampling and random sampling representative of all land uses.

  1. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including: setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.

  2. Approximate Confidence Intervals for Moment-Based Estimators of the Between-Study Variance in Random Effects Meta-Analysis

    ERIC Educational Resources Information Center

    Jackson, Dan; Bowden, Jack; Baker, Rose

    2015-01-01

    Moment-based estimators of the between-study variance are very popular when performing random effects meta-analyses. This type of estimation has many advantages including computational and conceptual simplicity. Furthermore, by using these estimators in large samples, valid meta-analyses can be performed without the assumption that the treatment…
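    The most widely used moment-based estimator is the DerSimonian-Laird estimator of the between-study variance; a minimal sketch with made-up study estimates and within-study variances:

```python
import numpy as np

# DerSimonian-Laird moment estimator of the between-study variance tau^2;
# the effect estimates y and within-study variances v below are made up.
y = np.array([0.1, 0.8, 0.3, 1.2, 0.5])   # per-study treatment effects
v = np.array([0.04, 0.09, 0.05, 0.06, 0.08])

w = 1.0 / v                                # fixed-effect weights
mu_fe = np.sum(w * y) / np.sum(w)          # fixed-effect pooled mean
Q = np.sum(w * (y - mu_fe) ** 2)           # Cochran's Q statistic
k = y.size
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled mean using the moment estimate of tau^2
w_re = 1.0 / (v + tau2)
mu_re = np.sum(w_re * y) / np.sum(w_re)
```

The estimator is truncated at zero, which is the conventional handling of a negative moment estimate; no normality assumption is needed to compute it, matching the abstract's point about large-sample validity.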

  3. A passive integrative sampler for mercury vapor in air and neutral mercury species in water

    USGS Publications Warehouse

    Brumbaugh, W.G.; Petty, J.D.; May, T.W.; Huckins, J.N.

    2000-01-01

    A passive integrative mercury sampler (PIMS) based on a sealed polymeric membrane was effective for the collection and preconcentration of Hg0. Because the Hg is both oxidized and stabilized in the PIMS, sampling intervals of weeks to months are possible. The effective air sampling rate for a 15 x 2.5 cm device was about 2 L-equivalents/day (0.002 m3/day), and the detection limit for 4-week sampling was about 2 ng/m3 for conventional ICP-MS determination without clean-room preparation. Sampling precision was about 5% RSD for laboratory exposures and 5-10% RSD for field exposures. These results suggest that the PIMS could be useful for screening assessments of Hg contamination and exposure in the environment, the laboratory, and the workplace. The PIMS approach may be particularly useful for applications requiring unattended sampling for extended periods at remote locations. Preliminary results indicate that sampling for dissolved gaseous mercury (DGM) and potentially other neutral mercury species from water is also feasible. Rigorous validation of the sampler performance is currently in progress.

  4. Deep Space Environmental Effects on Immune, Oxidative Stress and Damage, and Health and Behavioral Biomarkers in Humans

    NASA Astrophysics Data System (ADS)

    Crucian, B.; Zwart, S.; Smith, S. M.; Simonsen, L. C.; Williams, T.; Antonsen, E.

    2018-02-01

    Biomarkers will be assessed in biological samples (saliva, blood, urine, feces) collected from crewmembers and returned to Earth at various intervals, mirroring (where feasible) collection timepoints used on the International Space Station (ISS).

  5. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption, and misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When the sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied for any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I2 from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.

  6. Combining Speed Information Across Space

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti; Stone, Leland S.

    1995-01-01

    We used speed discrimination tasks to measure the ability of observers to combine speed information from multiple stimuli distributed across space. We compared speed discrimination thresholds in a classical discrimination paradigm to those in an uncertainty/search paradigm. Thresholds were measured using a temporal two-interval forced-choice design. In the discrimination paradigm, the n gratings in each interval all moved at the same speed and observers were asked to choose the interval with the faster gratings. Discrimination thresholds for this paradigm decreased as the number of gratings increased. This decrease was not due to increasing the effective stimulus area as a control experiment that increased the area of a single grating did not show a similar improvement in thresholds. Adding independent speed noise to each of the n gratings caused thresholds to decrease at a rate similar to the original no-noise case, consistent with observers combining an independent sample of speed from each grating in both the added- and no-noise cases. In the search paradigm, observers were asked to choose the interval in which one of the n gratings moved faster. Thresholds in this case increased with the number of gratings, behavior traditionally attributed to an input bottleneck. However, results from the discrimination paradigm showed that the increase was not due to observers' inability to process these gratings. We have also shown that the opposite trends of the data in the two paradigms can be predicted by a decision theory model that combines independent samples of speed information across space. This demonstrates that models typically used in classical detection and discrimination paradigms are also applicable to search paradigms. As our model does not distinguish between samples in space and time, it predicts that discrimination performance should be the same regardless of whether the gratings are presented in two spatial intervals or two temporal intervals. 
Our last experiment largely confirmed this prediction.

  7. Comparison of two methods for recovering migrating Ascaris suum larvae from the liver and lungs of pigs.

    PubMed

    Slotved, H C; Roepstorff, A; Barnes, E H; Eriksen, L; Nansen, P

    1996-08-01

    Nine groups of 5 pigs were inoculated with Ascaris suum eggs on day 0. Groups 1, 2, and 3 were inoculated with 100 eggs, groups 4, 5, and 6 with 1,000 eggs, and groups 7, 8, and 9 with 10,000 eggs. On day 3, groups 1, 4, and 7 were slaughtered, on day 7 groups 2, 5, and 8, and on day 10 groups 3, 6, and 9. The liver (days 3 and 7) and lungs (days 3, 7, and 10) were removed and two 25% samples of each organ were collected. Larvae were recovered from 1 sample by the Baermann method and from the other by an agar-gel method. Overall there were no significant differences in the liver larval recovery between the 2 methods. The use of the agar-gel method resulted in a very clean suspension of larvae and thereby reduced the sample counting time by a factor of 5-10 compared to the Baermann method. With both methods larval recovery from the lungs resulted in a clean larval suspension that was easy to count, and there were overall no significant differences between the 2 methods, although there was a tendency toward the Baermann method recovering more larvae from the lungs than the agar-gel method. The tissue sample dry weight did not significantly influence larval recovery by the agar-gel method, and the time interval from slaughtering to start of incubation on day 3 (interval 51-92 min), day 7 (interval 37-114 min), and day 10 (interval 50-129 min) had no significant effect on recovery by either method.

  8. Tracking a changing environment: optimal sampling, adaptive memory and overnight effects.

    PubMed

    Dunlap, Aimee S; Stephens, David W

    2012-02-01

    Foraging in a variable environment presents a classic problem of decision making with incomplete information. Animals must track the changing environment, remember the best options and make choices accordingly. While several experimental studies have explored the idea that sampling behavior reflects the amount of environmental change, we take the next logical step in asking how change influences memory. We explore the hypothesis that memory length should be tied to the ecological relevance and the value of the information learned, and that environmental change is a key determinant of the value of memory. We use a dynamic programming model to confirm our predictions and then test memory length in a factorial experiment. In our experimental situation we manipulate rates of change in a simple foraging task for blue jays over a 36 h period. After jays experienced an experimentally determined change regime, we tested them at a range of retention intervals, from 1 to 72 h. Manipulated rates of change influenced learning and sampling rates: subjects sampled more and learned more quickly in the high change condition. Tests of retention revealed significant interactions between retention interval and the experienced rate of change. We observed a striking and surprising difference between the high and low change treatments at the 24-h retention interval. In agreement with earlier work we find that a circadian retention interval is special, but we find that the extent of this 'specialness' depends on the subject's prior experience of environmental change. Specifically, experienced rates of change seem to influence how subjects balance recent information against past experience in a way that interacts with the passage of time.

  9. Knowledge level of effect size statistics, confidence intervals and meta-analysis in Spanish academic psychologists.

    PubMed

    Badenes-Ribera, Laura; Frias-Navarro, Dolores; Pascual-Soler, Marcos; Monterde-I-Bort, Héctor

    2016-11-01

    The statistical reform movement and the American Psychological Association (APA) defend the use of estimators of the effect size and its confidence intervals, as well as the interpretation of the clinical significance of the findings. A survey was conducted in which academic psychologists were asked about their behavior in designing and carrying out their studies. The sample was composed of 472 participants (45.8% men). The mean number of years as a university professor was 13.56 (SD = 9.27). The use of effect-size estimators is becoming generalized, as is the consideration of meta-analytic studies. However, several inadequate practices still persist. A traditional model of methodological behavior based on statistical significance tests is maintained, based on the predominance of Cohen's d and the unadjusted R2/η2, which are not robust to outliers, departures from normality, or violations of statistical assumptions, and on the under-reporting of confidence intervals of effect-size statistics. The paper concludes with recommendations for improving statistical practice.

  10. Evaluation of spatial variability of soil arsenic adjacent to a disused cattle-dip site, using model-based geostatistics.

    PubMed

    Niazi, Nabeel K; Bishop, Thomas F A; Singh, Balwant

    2011-12-15

    This study investigated the spatial variability of total and phosphate-extractable arsenic (As) concentrations in soil adjacent to a cattle-dip site, employing a linear mixed model-based geostatistical approach. The soil samples in the study area (n = 102 in 8.1 m(2)) were taken at the nodes of a 0.30 × 0.35 m grid. The results showed that total As concentration (0-0.2 m depth) and phosphate-extractable As concentration (at depths of 0-0.2, 0.2-0.4, and 0.4-0.6 m) in soil adjacent to the dip varied greatly. Both total and phosphate-extractable soil As concentrations significantly (p = 0.004-0.048) increased toward the cattle-dip. Using the linear mixed model, we suggest that 5 samples are sufficient to assess a dip site for soil As contamination (95% confidence interval of ±475.9 mg kg(-1)), but 15 samples (95% confidence interval of ±212.3 mg kg(-1)) are desirable as a baseline when the ultimate goal is to evaluate the effects of phytoremediation. Such guidelines on sampling requirements are crucial for the assessment of As contamination levels at other cattle-dip sites, and to determine the effect of phytoremediation on soil As.

  11. Confidence intervals for a difference between lognormal means in cluster randomization trials.

    PubMed

    Poirier, Julia; Zou, G Y; Koval, John

    2017-04-01

    Cluster randomization trials, in which intact social units are randomized to different interventions, have become popular in the last 25 years. Outcomes from these trials in many cases are positively skewed, following approximately lognormal distributions. When inference is focused on the difference between treatment arm arithmetic means, existing confidence interval procedures either make restrictive assumptions or are complex to implement. We approach this problem by assuming log-transformed outcomes from each treatment arm follow a one-way random effects model. The treatment arm means are functions of multiple parameters for which separate confidence intervals are readily available, suggesting that the method of variance estimates recovery may be applied to obtain closed-form confidence intervals. A simulation study showed that this simple approach performs well in small sample sizes in terms of empirical coverage, relatively balanced tail errors, and interval widths as compared to existing methods. The methods are illustrated using data arising from a cluster randomization trial investigating a critical pathway for the treatment of community acquired pneumonia.

  12. Effects of aniracetam on delayed matching-to-sample performance of monkeys and pigeons.

    PubMed

    Pontecorvo, M J; Evans, H L

    1985-05-01

    A 3-choice, variable-delay, matching-to-sample procedure was used to evaluate drugs in pigeons and monkeys tested under nearly identical conditions. Aniracetam (Roche 13-5057) improved accuracy of matching at all retention intervals following oral administration (12.5, 25 and 50 mg/kg) to macaque monkeys, with a maximal effect at 25 mg/kg. Aniracetam also antagonized scopolamine-induced impairment of the monkeys' performance. Intramuscular administration of these same doses of aniracetam produced a similar, but not significant, trend toward improved matching accuracy in pigeons.

  13. Fourier-transform infrared derivative spectroscopy with an improved signal-to-noise ratio.

    PubMed

    Fetterman, M R

    2005-09-01

    Infrared derivative spectroscopy is a useful technique for finding peaks hidden in broad spectral features. A data acquisition technique is shown that improves the signal-to-noise ratio (SNR) of Fourier-transform infrared (FTIR) derivative spectroscopy. Typically, in an FTIR measurement one samples each point for the same time interval. The effect of using a graded time interval is studied. The simulations presented show that the SNR of first-derivative FTIR spectroscopy will improve by 15% and that the SNR of second-derivative FTIR will improve by 34%.

  14. Depth-dependent groundwater quality sampling at City of Tallahassee test well 32, Leon County, Florida, 2013

    USGS Publications Warehouse

    McBride, W. Scott; Wacker, Michael A.

    2015-01-01

    A test well was drilled by the City of Tallahassee to assess the suitability of the site for the installation of a new well for public water supply. The test well is in Leon County in north-central Florida. The U.S. Geological Survey delineated high-permeability zones in the Upper Floridan aquifer, using borehole-geophysical data collected from the open interval of the test well. A composite water sample was collected from the open interval during high-flow conditions, and three discrete water samples were collected from specified depth intervals within the test well during low-flow conditions. Water-quality, source tracer, and age-dating results indicate that the open interval of the test well produces water of consistently high quality throughout its length. The cavernous nature of the open interval makes it likely that the highly permeable zones are interconnected in the aquifer by secondary porosity features.

  15. Quantitative investigation of resolution increase of free-flow electrophoresis via simple interval sample injection and separation.

    PubMed

    Shao, Jing; Fan, Liu-Yin; Cao, Cheng-Xi; Huang, Xian-Qing; Xu, Yu-Quan

    2012-07-01

    Interval free-flow zone electrophoresis (FFZE) has been used to suppress the sample band broadening that greatly hinders the development of free-flow electrophoresis (FFE). However, there has been no quantitative study of the resolution increase afforded by interval FFZE. Here, we compared bandwidths in the interval and continuous FFZE modes. A commercial dye containing methyl green and crystal violet was chosen to visualize the bandwidth. The comparative experiments were conducted at the same sample loading of the model dye (viz. 3.49, 1.75, 1.17, and 0.88 mg/h), the same running time (viz. 5, 10, 15, and 20 min), and the same flux ratio between sample and background buffer (= 10.64 × 10⁻³). Under the given conditions, the experiments demonstrated that (i) the band broadening in continuous mode was caused mainly by hydrodynamic factors, and (ii) the interval mode clearly eliminated the hydrodynamic broadening present in continuous mode, greatly increasing the resolution of dye separation. Finally, interval FFZE was successfully used for the complete separation of two model antibiotics (pyoluteorin and phenazine-1-carboxylic acid coexisting in the fermentation broth of a new Pseudomonas aeruginosa strain, M18), demonstrating the feasibility of the interval FFZE mode for separation of biomolecules. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Measuring Safety Performance: A Comparison of Whole, Partial, and Momentary Time-Sampling Recording Methods

    ERIC Educational Resources Information Center

    Alvero, Alicia M.; Struss, Kristen; Rappaport, Eva

    2008-01-01

    Partial-interval (PIR), whole-interval (WIR), and momentary time sampling (MTS) estimates were compared against continuous measures of safety performance for three postural behaviors: feet, back, and shoulder position. Twenty-five samples of safety performance across five undergraduate students were scored using a second-by-second continuous…

  17. Effect of Missing Inter-Beat Interval Data on Heart Rate Variability Analysis Using Wrist-Worn Wearables.

    PubMed

    Baek, Hyun Jae; Shin, JaeWook

    2017-08-15

    Most wrist-worn devices on the market provide continuous heart rate measurement using photoplethysmography, but do not yet measure continuous heart rate variability (HRV) from the beat-to-beat pulse interval. The reason is the difficulty of measuring a continuous pulse interval during movement with a wearable device: photoplethysmography is inherently susceptible to motion noise. This study investigated the effect of missing heart beat interval data on HRV analysis in cases where the pulse interval cannot be measured because of movement noise. First, we performed simulations by randomly removing data from the RR intervals of electrocardiograms measured from 39 subjects and observed the changes in the relative and normalized errors of the HRV parameters as a function of the total length of missing heart beat interval data. Second, we measured the pulse interval from 20 subjects using a wrist-worn device for 24 h and observed the errors caused by pulse interval data missing due to movement during actual daily life. The experimental results showed that mean NN and RMSSD were the most robust to missing heart beat interval data among all parameters in the time and frequency domains. Most of the pulse interval data could not be obtained during daily life; in other words, the sample number was too small for spectral analysis because of the long missing durations. Therefore, the frequency domain parameters often could not be calculated, except in the sleep state with little motion. The errors of the HRV parameters were proportional to the missing data duration. Based on these results, the maximum missing duration yielding acceptable errors for each parameter is recommended for HRV analysis on a wrist-worn device.
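
    The simulation described in this record, randomly deleting RR intervals and checking how the HRV parameters shift, can be sketched as follows. This is a minimal illustration with synthetic RR data, not the authors' code; the 10% missing fraction and the Gaussian RR model are arbitrary assumptions.

    ```python
    import random
    import statistics

    def rmssd(rr_ms):
        """Root mean square of successive RR-interval differences (ms)."""
        diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
        return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

    random.seed(0)
    # Synthetic RR series: ~800 ms beats with mild beat-to-beat variability.
    rr = [random.gauss(800, 30) for _ in range(1000)]

    # Simulate missing beats by randomly dropping 10% of the intervals.
    kept = [x for x in rr if random.random() > 0.10]

    # Compare parameters computed with and without the missing data.
    full_mean, part_mean = statistics.mean(rr), statistics.mean(kept)
    full_rmssd, part_rmssd = rmssd(rr), rmssd(kept)
    print(f"mean NN error: {abs(part_mean - full_mean) / full_mean:.3%}")
    print(f"RMSSD error:   {abs(part_rmssd - full_rmssd) / full_rmssd:.3%}")
    ```

    On correlated real RR series the relative errors grow with the missing-data fraction, which is the dependence the study quantifies.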

  18. Electromagnetic-induction logging to monitor changing chloride concentrations

    USGS Publications Warehouse

    Metzger, Loren F.; Izbicki, John A.

    2013-01-01

    Water from the San Joaquin Delta, having chloride concentrations up to 3590 mg/L, has intruded fresh water aquifers underlying Stockton, California. Changes in chloride concentrations at depth within these aquifers were evaluated using sequential electromagnetic (EM) induction logs collected during 2004 through 2007 at seven multiple-well sites as deep as 268 m. Sequential EM logging is useful for identifying changes in groundwater quality through polyvinyl chloride-cased wells in intervals not screened by wells. These unscreened intervals represent more than 90% of the aquifer at the sites studied. Sequential EM logging suggested degrading groundwater quality in numerous thin intervals, typically between 1 and 7 m in thickness, especially in the northern part of the study area. Some of these intervals were unscreened by wells, and would not have been identified by traditional groundwater sample collection. Sequential logging also identified intervals with improving water quality, possibly due to groundwater management practices that have limited pumping and promoted artificial recharge. EM resistivity was correlated with chloride concentrations in sampled wells and in water from core material. Natural gamma log data were used to account for the effect of aquifer lithology on EM resistivity. Results of this study show that sequential EM logging is useful for identifying and monitoring the movement of high-chloride water, having lower salinities and chloride concentrations than sea water, in aquifer intervals not screened by wells, and that increases in chloride in water from wells in the area are consistent with high-chloride water originating from the San Joaquin Delta rather than from the underlying saline aquifer.

  19. Effects of environmental and pharmacological manipulations on a novel delayed nonmatching-to-sample 'working memory' procedure in unrestrained rhesus monkeys.

    PubMed

    Hutsell, Blake A; Banks, Matthew L

    2015-08-15

    Working memory is a domain of 'executive function.' Delayed nonmatching-to-sample (DNMTS) procedures are commonly used to examine working memory in both human laboratory and preclinical studies. The aim was to develop an automated DNMTS procedure maintained by food pellets in rhesus monkeys using a touch-sensitive screen attached to the housing chamber. Specifically, the DNMTS procedure was a 2-stimulus, 2-choice recognition memory task employing unidimensional discriminative stimuli and randomized delay interval presentations. DNMTS maintained a delay-dependent decrease in discriminability that was independent of the retention interval distribution. Eliminating reinforcer availability during a single delay session or providing food pellets before the session did not systematically alter accuracy, but did reduce total choices. Increasing the intertrial interval enhanced accuracy at short delays. Acute Δ(9)-THC pretreatment produced delay interval-dependent changes in the forgetting function at doses that did not alter total choices. Acute methylphenidate pretreatment only decreased total choices. All monkeys were trained to perform NMTS at the 1s training delay within 60 days of initiating operant touch training. Furthermore, forgetting functions were reliably delay interval-dependent and stable over the experimental period (∼6 months). Consistent with previous studies, increasing the intertrial interval improved DNMTS performance, whereas Δ(9)-THC disrupted DNMTS performance independent of changes in total choices. Overall, the touchscreen-based DNMTS procedure described provides an efficient method for training and testing experimental manipulations on working memory in unrestrained rhesus monkeys. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Analysis of single ion channel data incorporating time-interval omission and sampling

    PubMed Central

    The, Yu-Kai; Timmer, Jens

    2005-01-01

    Hidden Markov models are widely used to describe single-channel currents from patch-clamp experiments. The inevitable anti-aliasing filter limits the time resolution of the measurements, so the standard hidden Markov model is no longer adequate. The notion of time-interval omission has been introduced, whereby brief events are not detected. The exact solutions developed for this problem do not take into account that the measured intervals are limited by the sampling time; in this case the dead-time that specifies the minimal detectable interval length is not defined unambiguously. We show that a wrong choice of the dead-time leads to considerably biased estimates and present the appropriate equations to describe sampled data. PMID:16849220
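
    The interplay of sampling and dead-time can be illustrated with a toy simulation: exponential dwell times are quantized to a sampling grid, events at or below an assumed dead-time are missed, and the naive dwell-time estimate depends on which dead-time is assumed. This is a sketch of the general idea only, not the authors' hidden-Markov treatment; the parameter values are arbitrary.

    ```python
    import random

    random.seed(1)
    TRUE_MEAN = 5.0   # true mean dwell time (ms)
    DT = 1.0          # sampling interval (ms)

    # Draw exponential dwell times and quantize them to the sampling grid,
    # as an idealized sampled detector would report them.
    dwells = [random.expovariate(1 / TRUE_MEAN) for _ in range(50_000)]
    measured = [round(d / DT) * DT for d in dwells]

    # Events at or below the dead-time are not detected; the naive mean
    # dwell-time estimate shifts with the dead-time we assume.
    for dead_time in (0.5 * DT, 1.0 * DT, 1.5 * DT):
        detected = [m for m in measured if m > dead_time]
        est = sum(detected) / len(detected)
        print(f"dead time {dead_time:.1f} ms -> naive mean {est:.2f} ms")
    ```

    Because short events are censored, each assumed dead-time yields a different biased estimate, which is why an unambiguous dead-time definition matters for sampled data.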

  1. VACUUM DISTILLATION COUPLED WITH GAS CHROMATOGRAPHY/MASS SPECTROMETRY FOR THE ANALYSIS OF ENVIRONMENTAL SAMPLES

    EPA Science Inventory

    A procedure is presented that uses a vacuum distillation/gas chromatography/mass spectrometry system for analysis of problematic matrices of volatile organic compounds. The procedure compensates for matrix effects and provides both analytical results and confidence intervals from...

  2. Prolonged dry periods between rainfall events shorten the growth period of the resurrection plant Reaumuria soongorica.

    PubMed

    Zhang, Zhengzhong; Shan, Lishan; Li, Yi

    2018-01-01

    The resurrection plant Reaumuria soongorica is widespread across Asia, southern Europe, and North Africa and is considered a keystone species in desert ecosystems, but the impacts of climate change on this species are unclear. Here, the morphological responses of R. soongorica to changes in rainfall quantity (a 30% reduction or a 30% increase) and rainfall interval (a 50% longer drought interval between rainfall events) were tested. Stage-specific changes in growth were monitored by sampling at the beginning, middle, and end of the growing season. Reduced rainfall decreased the aboveground and total biomass, while additional precipitation generally advanced R. soongorica growth and biomass accumulation. An increased interval between rainfall events resulted in an increase in root biomass in the middle of the growing season, followed by a decrease toward the end. The response to the combination of increased rainfall quantity and interval was similar to the response to increased interval alone, suggesting that changes in rainfall pattern exert a greater influence than increased rainfall quantity. Thus, despite the short duration of this experiment, consequences of changes in rainfall regime for seedling growth were observed. In particular, a prolonged rainfall interval shortened the growth period, suggesting that climate change-induced rainfall variability may have significant effects on the structure and functioning of desert ecosystems.

  3. The effect of the interval-between-sessions on prefrontal transcranial direct current stimulation (tDCS) on cognitive outcomes: a systematic review and meta-analysis.

    PubMed

    Dedoncker, Josefien; Brunoni, Andre R; Baeken, Chris; Vanderhasselt, Marie-Anne

    2016-10-01

    Recently, there has been wide interest in the effects of transcranial direct current stimulation (tDCS) of the dorsolateral prefrontal cortex (DLPFC) on cognitive functioning, but many methodological questions remain unanswered. One of them is whether the time interval between active and sham-controlled stimulation sessions, i.e. the interval between sessions (IBS), influences the effects of DLPFC tDCS on cognitive functioning. Therefore, a systematic review and meta-analysis was performed of experimental studies published in PubMed, Science Direct, and other databases from the earliest available records to February 2016. Single-session, sham-controlled, within-subject studies reporting the effects of tDCS of the DLPFC on cognitive functioning in healthy controls and neuropsychiatric patients were included. Cognitive tasks were categorized as assessing memory, attention, or executive functioning. Evaluation of 188 trials showed that anodal vs. sham tDCS significantly decreased response times and increased accuracy, particularly for the executive functioning tasks, in a combined sample of healthy participants and neuropsychiatric patients (although a slightly different pattern of improvement emerged when the two samples were analysed separately). The effects of cathodal vs. sham tDCS (45 trials), on the other hand, were not significant. The IBS ranged from less than 1 h up to 1 week (cathodal tDCS) or 2 weeks (anodal tDCS). IBS length had no influence on the estimated effect size in a meta-regression of IBS on reaction time and accuracy outcomes across all three cognitive categories, for both anodal and cathodal stimulation. Practical recommendations and limitations of the study are discussed.

  4. Influence of In-Well Convection on Well Sampling

    USGS Publications Warehouse

    Vroblesky, Don A.; Casey, Clifton C.; Lowery, Mark A.

    2006-01-01

    Convective transport of dissolved oxygen (DO) from shallow to deeper parts of wells was observed as the shallow water in wells in South Carolina became cooler than the deeper water in the wells due to seasonal changes. Wells having a relatively small depth to water were more susceptible to thermally induced convection than wells where the depth to water was greater because the shallower water levels were more influenced by air temperature. The potential for convective transport of DO to maintain oxygenated conditions in a well was diminished as ground-water exchange through the well screen increased and as oxygen demand increased. Convective flow did not transport oxygen to the screened interval when the screened interval was deeper than the range of the convective cell. The convective movement of water in wells has potential implications for passive, or no-purge, and low-flow sampling approaches. Transport of DO to the screened interval can adversely affect the ability of passive samplers to produce accurate concentrations of oxygen-sensitive solutes, such as iron. Other potential consequences include mixing the screened-interval water with casing water and potentially allowing volatilization loss at the water surface. A field test of diffusion samplers in a convecting well during the winter, however, showed good agreement of chlorinated solvent concentrations with pumped samples, indicating that there was no negative impact of the convection on the utility of the samplers to collect volatile organic compound concentrations in that well. In the cases of low-flow sampling, convective circulation can cause the pumped sample to be a mixture of casing water and aquifer water. This can substantially increase the equilibration time of oxygen as an indicator parameter and can give false indications of the redox state. Data from this investigation show that simple in-well devices can effectively mitigate convective transport of oxygen. 
The devices can range from inflatable packers to simple baffle systems.

  5. The thresholds for statistical and clinical significance – a five-step procedure for evaluation of intervention effects in randomised clinical trials

    PubMed Central

    2014-01-01

    Background Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results For a more valid assessment of results from a randomised clinical trial we propose the following five steps: (1) report the confidence intervals and the exact P-values; (2) report the Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to the number of outcome comparisons; and (5) assess the clinical significance of the trial results. Conclusions If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
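
    Step (2), the Bayes factor comparing a null effect with the effect hypothesised in the sample-size calculation, can be sketched under a normal approximation. This is an illustrative computation under that assumption, not the paper's procedure verbatim, and the trial numbers below are made up.

    ```python
    import math

    def normal_pdf(x, mean, sd):
        """Density of a normal distribution at x."""
        return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

    def bayes_factor_null_vs_alt(estimate, se, hypothesised_effect):
        """P(observed estimate | null) / P(observed estimate | hypothesised effect),
        treating the estimate as normally distributed around the true effect."""
        p_null = normal_pdf(estimate, 0.0, se)
        p_alt = normal_pdf(estimate, hypothesised_effect, se)
        return p_null / p_alt

    # Hypothetical trial: observed log odds ratio -0.30 (SE 0.15),
    # where the sample size was calculated for a log odds ratio of -0.40.
    bf = bayes_factor_null_vs_alt(-0.30, 0.15, -0.40)
    print(f"Bayes factor (null vs. hypothesised effect): {bf:.3f}")
    # A value well below 1 favours the hypothesised effect over the null.
    ```

    The normal-density ratio is one common way to operationalise the probability ratio described in step (2); other likelihood choices are possible.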

  6. Method and apparatus for measuring nuclear magnetic properties

    DOEpatents

    Weitekamp, D.P.; Bielecki, A.; Zax, D.B.; Zilm, K.W.; Pines, A.

    1987-12-01

    A method for studying the chemical and structural characteristics of materials is disclosed. The method includes placement of a sample material in a high strength polarizing magnetic field to order the sample nuclei. The condition used to order the sample is then removed abruptly and the ordering of the sample allowed to evolve for a time interval. At the end of the time interval, the ordering of the sample is measured by conventional nuclear magnetic resonance techniques. 5 figs.

  7. Method and apparatus for measuring nuclear magnetic properties

    DOEpatents

    Weitekamp, Daniel P.; Bielecki, Anthony; Zax, David B.; Zilm, Kurt W.; Pines, Alexander

    1987-01-01

    A method for studying the chemical and structural characteristics of materials is disclosed. The method includes placement of a sample material in a high strength polarizing magnetic field to order the sample nuclei. The condition used to order the sample is then removed abruptly and the ordering of the sample allowed to evolve for a time interval. At the end of the time interval, the ordering of the sample is measured by conventional nuclear magnetic resonance techniques.

  8. An analysis of first-time blood donors' return behaviour using regression models.

    PubMed

    Kheiri, S; Alibeigi, Z

    2015-08-01

    Blood products have a vital role in saving many patients' lives. The aim of this study was to analyse blood donor return behaviour. Using a cross-sectional follow-up design of 5-year duration, 864 first-time blood donors were selected by systematic sampling. Donor behaviour was analysed via three response variables (return to donation, frequency of return to donation, and the time interval between donations) using logistic regression, negative binomial regression, and Cox's shared frailty model for recurrent events, respectively. The successful return-to-donation rate was 49.1% and the deferral rate was 13.3%. There was a significant inverse relationship between the frequency of return to donation and the time interval between donations. Sex, body weight, and job had an effect on return to donation; weight and frequency of donation during the first year had a direct effect on the total frequency of donations. Age, weight, and job had a significant effect on the time intervals between donations. Ageing decreases the chance of return to donation and increases the time interval between donations. Body weight affects all three response variables: the higher the weight, the greater the chance of return to donation and the shorter the time interval between donations. There is a positive correlation between the frequency of donations in the first year and the total number of returns to donation; also, the shorter the time interval between donations, the higher the frequency of donations. © 2015 British Blood Transfusion Society.

  9. Monitoring the effective population size of a brown bear (Ursus arctos) population using new single-sample approaches.

    PubMed

    Skrbinšek, Tomaž; Jelenčič, Maja; Waits, Lisette; Kos, Ivan; Jerina, Klemen; Trontelj, Peter

    2012-02-01

    The effective population size (N(e)) could be the ideal parameter for monitoring populations of conservation concern as it conveniently summarizes both the evolutionary potential of the population and its sensitivity to genetic stochasticity. However, tracing its change through time is difficult in natural populations. We applied four new methods for estimating N(e) from a single sample of genotypes to trace temporal change in N(e) for bears in the Northern Dinaric Mountains. We genotyped 510 bears using 20 microsatellite loci and determined their age. The samples were organized into cohorts with regard to the year when the animals were born and yearly samples with age categories for every year when they were alive. We used the Estimator by Parentage Assignment (EPA) to directly estimate both N(e) and generation interval for each yearly sample. For cohorts, we estimated the effective number of breeders (N(b)) using linkage disequilibrium, sibship assignment and approximate Bayesian computation methods and extrapolated these estimates to N(e) using the generation interval. The N(e) estimate by EPA is 276 (183-350 95% CI), meeting the inbreeding-avoidance criterion of N(e) > 50 but short of the long-term minimum viable population goal of N(e) > 500. The results obtained by the other methods are highly consistent with this result, and all indicate a rapid increase in N(e) probably in the late 1990s and early 2000s. The new single-sample approaches to the estimation of N(e) provide efficient means for including N(e) in monitoring frameworks and will be of great importance for future management and conservation. © 2012 Blackwell Publishing Ltd.

  10. Measuring discharge with ADCPs: Inferences from synthetic velocity profiles

    USGS Publications Warehouse

    Rehmann, C.R.; Mueller, D.S.; Oberg, K.A.

    2009-01-01

    Synthetic velocity profiles are used to determine guidelines for sampling discharge with acoustic Doppler current profilers (ADCPs). The analysis allows the effects of instrument characteristics, sampling parameters, and properties of the flow to be studied systematically. For mid-section measurements, the averaging time required for a single profile measurement always exceeded the 40 s usually recommended for velocity measurements, and it increased with increasing sample interval and increasing time scale of the large eddies. Similarly, simulations of transect measurements show that discharge error decreases as the number of large eddies sampled increases. The simulations allow sampling criteria that account for the physics of the flow to be developed. © 2009 ASCE.

  11. Hematology and serum clinical chemistry reference intervals for free-ranging Scandinavian gray wolves (Canis lupus).

    PubMed

    Thoresen, Stein I; Arnemo, Jon M; Liberg, Olof

    2009-06-01

    Scandinavian free-ranging wolves (Canis lupus) are endangered, such that laboratory data to assess their health status is increasingly important. Although wolves have been studied for decades, most biological information comes from captive animals. The objective of the present study was to establish reference intervals for 30 clinical chemical and 8 hematologic analytes in Scandinavian free-ranging wolves. All wolves were tracked and chemically immobilized from a helicopter before examination and blood sampling in the winter of 7 consecutive years (1998-2004). Seventy-nine blood samples were collected from 57 gray wolves, including 24 juveniles (24 samples), 17 adult females (25 samples), and 16 adult males (30 samples). Whole blood and serum samples were stored at refrigeration temperature for 1-3 days before hematologic analyses and for 1-5 days before serum biochemical analyses. Reference intervals were calculated as 95% confidence intervals except for juveniles where the minimum and maximum values were used. Significant differences were observed between adult and juvenile wolves for RBC parameters, alkaline phosphatase and amylase activities, and total protein, albumin, gamma-globulins, cholesterol, creatinine, calcium, chloride, magnesium, phosphate, and sodium concentrations. Compared with published reference values for captive wolves, reference intervals for free-ranging wolves reflected exercise activity associated with capture (higher creatine kinase activity, higher glucose concentration), and differences in nutritional status (higher urea concentration).

  12. Blood and Plasma Biochemistry Reference Intervals for Wild Juvenile American Alligators (Alligator mississippiensis).

    PubMed

    Hamilton, Matthew T; Kupar, Caitlin A; Kelley, Meghan D; Finger, John W; Tuberville, Tracey D

    2016-07-01

    American alligators (Alligator mississippiensis) are one of the most studied crocodilian species in the world, yet blood and plasma biochemistry information is limited for juvenile alligators in their northern range, where individuals may be exposed to extreme abiotic and biotic stressors. We collected blood samples over a 2-yr period from 37 juvenile alligators in May, June, and July to establish reference intervals for 22 blood and plasma analytes. We observed no effect of either sex or blood collection time on any analyte investigated. However, our results indicate a significant correlation between a calculated body condition index and aspartate aminotransferase and creatine kinase. Glucose, total protein, and potassium varied significantly between sampling sessions. In addition, glucose and potassium were highly correlated between the two point-of-care devices used, although they were significantly lower with the i-STAT 1 CG8+ cartridge than with the Vetscan VS2 Avian/Reptile Rotor. The reference intervals presented herein should provide baseline data for evaluating wild juvenile alligators in the northern portion of their range.

  13. Discrimination of edible oils and fats by combination of multivariate pattern recognition and FT-IR spectroscopy: a comparative study between different modeling methods.

    PubMed

    Javidnia, Katayoun; Parish, Maryam; Karimi, Sadegh; Hemmateenejad, Bahram

    2013-03-01

    By using FT-IR spectroscopy, researchers from many disciplines enrich the experimental scope of their work to obtain more precise information, and chemometric techniques have boosted the use of IR instruments. In the present study we aimed to demonstrate the power of FT-IR spectroscopy for discriminating between different oil samples (especially fat versus vegetable oils), and our data were used to compare the performance of different classification methods. FT-IR transmittance spectra of oil samples (Corn, Colona, Sunflower, Soya, Olive, and Butter) were measured in the wave-number interval 450-4000 cm(-1). Classification analysis was performed using PLS-DA, interval PLS-DA, extended canonical variate analysis (ECVA), and interval ECVA methods, and the effect of data preprocessing by extended multiplicative signal correction was investigated. While all of the employed methods could distinguish butter from the vegetable oils, iECVA gave the best performance on both the calibration and external test sets, with 100% sensitivity and specificity. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. [Evaluation of the principles of distribution of electrocardiographic R-R intervals for elaboration of methods of automated diagnosis of cardiac rhythm disorders].

    PubMed

    Tsukerman, B M; Finkel'shteĭn, I E

    1987-07-01

    A statistical analysis of prolonged ECG records was carried out in patients with various heart rhythm and conductivity disorders. The distribution of absolute R-R duration values and the relationships between adjacent intervals were examined. A two-step algorithm was constructed that excludes anomalous and "suspicious" intervals from a sample of consecutively recorded R-R intervals until only the intervals between contractions of genuine sinus origin remain in the sample. The algorithm has been implemented as a programme for the Electronica NC-80 microcomputer. It operates reliably even in cases of complex combined rhythm and conductivity disorders.

  15. Parametric methods outperformed non-parametric methods in comparisons of discrete numerical variables.

    PubMed

    Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter

    2011-04-13

    The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for the analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
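
    The recommended Welch test (the T test with adjustment for unequal variances) is straightforward to apply to such count-type outcomes. A minimal sketch with made-up groups of per-patient event counts, not the paper's simulation code:

    ```python
    import math

    def welch_t(a, b):
        """Welch's t statistic and Satterthwaite degrees of freedom
        for the difference between the means of two independent samples."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        se2 = va / na + vb / nb
        t = (ma - mb) / math.sqrt(se2)
        df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
        return t, df

    # Hypothetical event counts per patient in two treatment groups.
    group_a = [0, 1, 2, 1, 0, 2, 3, 1, 0, 1]
    group_b = [1, 2, 3, 2, 2, 3, 4, 2, 1, 3]
    t, df = welch_t(group_a, group_b)
    print(f"t = {t:.3f}, df = {df:.1f}")
    ```

    The same statistic is available as `scipy.stats.ttest_ind(a, b, equal_var=False)` when SciPy is at hand; the hand-rolled version above just makes the variance adjustment explicit.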

  16. Performance of informative priors skeptical of large treatment effects in clinical trials: A simulation study.

    PubMed

    Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E

    2018-01-01

    One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
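
    Skeptical priors like these are usually specified on the log scale, and converting a 95% prior interval for RR or OR into a normal prior SD is a one-line computation. This is an illustrative calculation assuming a normal prior centred at log(effect) = 0 and the 1.96 normal quantile; it is not taken from the paper's code.

    ```python
    import math

    def prior_sd_from_interval(lower, upper, z=1.96):
        """SD of a normal prior on the log effect, centred at 0, whose
        95% interval on the ratio scale is (lower, upper)."""
        return (math.log(upper) - math.log(lower)) / (2 * z)

    # 95% prior interval of 0.50-2.0 for RR (as recommended above):
    sd_rr = prior_sd_from_interval(0.50, 2.0)
    # Wider 0.23-4.35 interval for OR:
    sd_or = prior_sd_from_interval(0.23, 4.35)
    print(f"prior SD on log(RR): {sd_rr:.3f}")
    print(f"prior SD on log(OR): {sd_or:.3f}")
    ```

    The resulting SDs are what a simulation like this one would vary to tune the level of skepticism about large treatment effects.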

  17. Effect of adhesive materials on shear bond strength of a mineral trioxide aggregate.

    PubMed

    Ali, Ahmed; Banerjee, Avijit; Mannocci, Francesco

    2016-02-01

    To compare the shear bond strength (SBS) and fractography between mineral trioxide aggregate (MTA) and glass-ionomer cement (GIC) or resin composite (RC) after varying MTA setting time intervals. MTA was mixed and packed into standardized cavities (4 mm diameter x 3 mm depth) in acrylic blocks. RC with 37% H₃PO₄ and type 2 (etch and rinse) adhesive, or conventional GIC, was bonded to the exposed MTA sample surfaces after 10-minute, 24-hour, 72-hour and 30-day MTA setting intervals (n = 10/group, eight groups). Samples were stored (37°C, 24 hours, 100% humidity) before SBS testing and statistical analysis (ANOVA, Tukey LSD, P < 0.05). Fractography was undertaken using stereomicroscopy for all samples and using SEM for three random samples/group. Significant differences between all groups were found (P = 0.002). SBS of RC:MTA (max 5.09 ± 1.79 MPa) was higher than the SBS of GIC:MTA (max 3.74 ± 0.70 MPa) in the 24-hour, 72-hour and 30-day groups, but not in the 10-minute MTA setting time groups, where SBS of GIC:MTA was higher. There was a significant effect of time on SBS of RC:MTA (P = 0.008) and no effect on SBS of GIC:MTA (P = 3.00). Fractography revealed mixed (adhesive/cohesive) failures in all groups; in RC:MTA groups, adhesive failure decreased with time, in contrast to the GIC:MTA groups.

  18. Estimation of treatment effect in a subpopulation: An empirical Bayes approach.

    PubMed

    Shen, Changyu; Li, Xiaochun; Jeong, Jaesik

    2016-01-01

    It is well recognized that the benefit of a medical intervention may not be distributed evenly in the target population due to patient heterogeneity, and conclusions based on conventional randomized clinical trials may not apply to every person. Given the increasing cost of randomized trials and difficulties in recruiting patients, there is a strong need to develop analytical approaches to estimate treatment effect in subpopulations. In particular, due to limited sample size for subpopulations and the need for multiple comparisons, standard analysis tends to yield wide confidence intervals of the treatment effect that are often noninformative. We propose an empirical Bayes approach to combine both information embedded in a target subpopulation and information from other subjects to construct confidence intervals of the treatment effect. The method is appealing in its simplicity and tangibility in characterizing the uncertainty about the true treatment effect. Simulation studies and a real data analysis are presented.

  19. Plasma creatinine in dogs: intra- and inter-laboratory variation in 10 European veterinary laboratories

    PubMed Central

    2011-01-01

    Background There is substantial variation in reported reference intervals for canine plasma creatinine among veterinary laboratories, thereby influencing the clinical assessment of analytical results. The aims of the study were to determine the inter- and intra-laboratory variation in plasma creatinine among 10 veterinary laboratories, and to compare results from each laboratory with the upper limit of its reference interval. Methods Samples were collected from 10 healthy dogs, 10 dogs with expected intermediate plasma creatinine concentrations, and 10 dogs with azotemia. Overlap was observed for the first two groups. The 30 samples were divided into 3 batches and shipped in random order by postal delivery for plasma creatinine determination. Statistical testing was performed in accordance with ISO standard methodology. Results Inter- and intra-laboratory variation was clinically acceptable, as plasma creatinine values for most samples were of the same magnitude. A few extreme outliers caused three laboratories to fail statistical testing for consistency. Laboratory sample means above or below the overall sample mean did not unequivocally reflect high or low reference intervals in that laboratory. Conclusions In spite of close analytical results, further standardization among laboratories is warranted. The discrepant reference intervals seem to largely reflect different populations used in establishing the reference intervals, rather than analytical variation due to different laboratory methods. PMID:21477356

  20. The Quality of Reporting of Measures of Precision in Animal Experiments in Implant Dentistry: A Methodological Study.

    PubMed

    Faggion, Clovis Mariano; Aranda, Luisiana; Diaz, Karla Tatiana; Shih, Ming-Chieh; Tu, Yu-Kang; Alarcón, Marco Antonio

    2016-01-01

    Information on precision of treatment-effect estimates is pivotal for understanding research findings. In animal experiments, which provide important information for supporting clinical trials in implant dentistry, inaccurate information may lead to biased clinical trials. The aim of this methodological study was to determine whether sample size calculation, standard errors, and confidence intervals for treatment-effect estimates are reported accurately in publications describing animal experiments in implant dentistry. MEDLINE (via PubMed), Scopus, and SciELO databases were searched to identify reports involving animal experiments with dental implants published from September 2010 to March 2015. Data from publications were extracted into a standardized form with nine items related to precision of treatment estimates and experiment characteristics. Data selection and extraction were performed independently and in duplicate, with disagreements resolved by discussion-based consensus. The chi-square and Fisher exact tests were used to assess differences in reporting according to study sponsorship type and impact factor of the journal of publication. The sample comprised reports of 161 animal experiments. Sample size calculation was reported in five (3%) publications. P values and confidence intervals were reported in 152 (94%) and 13 (8%) of these publications, respectively. Standard errors were reported in 19 (12%) publications. Confidence intervals were better reported in publications describing industry-supported animal experiments (P = .03) and in journals with a higher impact factor (P = .02). Information on precision of estimates is rarely reported in publications describing animal experiments in implant dentistry. This lack of information makes it difficult to evaluate whether the translation of animal research findings to clinical trials is adequate.

  1. Effects of small-scale vertical variations in well-screen inflow rates and concentrations of organic compounds on the collection of representative ground-water-quality samples

    USGS Publications Warehouse

    Gibs, Jacob; Brown, G. Allan; Turner, Kenneth S.; MacLeod, Cecilia L.; Jelinski, James; Koehnlein, Susan A.

    1993-01-01

    Because a water sample collected from a well is an integration of water from different depths along the well screen, measured concentrations can be biased if analyte concentrations are not uniform along the length of the well screen. The resulting concentration in the sample, therefore, is a function of variations in well-screen inflow rate and analyte concentration with depth. A multiport sampler with seven short screened intervals was designed and used to investigate small-scale vertical variations in water chemistry and aquifer hydraulic conductivity in ground water contaminated by leaded gasoline at Galloway Township, Atlantic County, New Jersey. The multiport samplers were used to collect independent samples from seven intervals within the screened zone that were flow-rate weighted and integrated to simulate a 5-foot-long, 2.375-inch-outside-diameter conventional wire-wound screen. The integration of the results of analyses of samples collected from two multiport samplers showed that a conventional 5-foot-long well screen would integrate contaminant concentrations over its length, resulting in an apparent contaminant concentration that was as little as 28 percent of the maximum concentration observed in the multiport sampler.

  2. Oversampling of digitized images. [effects on interpolation in signal processing]

    NASA Technical Reports Server (NTRS)

    Fischel, D.

    1976-01-01

    Oversampling is defined as sampling with a device whose characteristic width is greater than the interval between samples. This paper shows why oversampling should be avoided and discusses the limitations in data processing if circumstances dictate that oversampling cannot be circumvented. Principally, oversampling should not be used to provide interpolating data points. Rather, the time spent oversampling should be used to obtain more signal with less relative error, and the Sampling Theorem should be employed to provide any desired interpolated values. The concepts are applicable to single-element and multielement detectors.

  3. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    PubMed

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often is impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were the minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, the 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation, but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
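The competing small-sample estimators compared above can be sketched side by side. In this sketch the data are hypothetical lognormal values standing in for plasma creatinine, and a log transformation is used as a simple stand-in for Box-Cox:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical small reference sample (n = 27) from a right-skewed
# distribution, mimicking plasma creatinine values in micromol/L.
sample = rng.lognormal(mean=4.4, sigma=0.25, size=27)

# Nonparametric estimate: 2.5th and 97.5th percentiles (unstable at n = 27).
np_lo, np_hi = np.percentile(sample, [2.5, 97.5])

# Parametric estimate on native values: mean +/- 2 SD (assumes normality).
m, s = sample.mean(), sample.std(ddof=1)
par_lo, par_hi = m - 2 * s, m + 2 * s

# Parametric estimate after log transformation (stand-in for Box-Cox),
# back-transformed to the original scale.
lm, ls = np.log(sample).mean(), np.log(sample).std(ddof=1)
log_lo, log_hi = np.exp(lm - 2 * ls), np.exp(lm + 2 * ls)

print(f"nonparametric:            ({np_lo:.0f}, {np_hi:.0f})")
print(f"mean +/- 2 SD, native:    ({par_lo:.0f}, {par_hi:.0f})")
print(f"mean +/- 2 SD, log scale: ({log_lo:.0f}, {log_hi:.0f})")
```

With skewed data the native mean +/- 2 SD limits can even go negative, which is one reason the transformation-based estimates come closer to the large-sample interval.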

  4. Intraseptal infusion of oxotremorine impairs memory in a delayed-non-match-to-sample radial maze task.

    PubMed

    Bunce, J G; Sabolek, H R; Chrobak, J J

    2003-01-01

    The medial septal nucleus is part of the forebrain circuitry that supports memory. This nucleus is rich in cholinergic receptors and is a putative target for the development of cholinomimetic cognitive-enhancing drugs. Septal neurons, primarily cholinergic and GABAergic, innervate the entire hippocampal formation and regulate hippocampal formation physiology and emergent function. Direct intraseptal drug infusions can produce amnestic or promnestic effects depending upon the type of drug administered. However, intraseptal infusion of the cholinomimetic oxotremorine has been reported to produce both promnestic and amnestic effects when administered prior to task performance. The present study examined whether post-acquisition intraseptal infusion of oxotremorine would be promnestic or amnestic in a delayed-non-match-to-sample radial maze task. In this task rats must remember information about spatial locations visited during a daily sample session and maintain that information over extended retention intervals (hours) in order to perform accurately on the daily test session. Treatments may then be administered during the retention interval. Alterations in maze performance during the test session an hour or more after treatment evidence effects on memory. In the present study, intraseptal infusion of oxotremorine (1.0-10.0 microg) produced a linear dose-related impairment of memory performance. Importantly, we also observed disrupted performance on the day after treatment. This persistent deficit was related only to memory over the retention interval and did not affect indices of short-term memory (ability to avoid repetitive or proactive errors during both the pre- and post-delay sessions). The persistent deficit contrasts with the acute amnestic effects of other intraseptally administered drugs, including the cholinomimetics carbachol and tacrine. Thus, intraseptal oxotremorine produced a preferential disruption of memory consolidation as well as a persistent alteration of medial septal circuits. These findings are discussed with regard to multi-stage models of hippocampal-dependent memory formation and the further development of therapeutic strategies in the treatment of mild cognitive impairment as well as age-related decline and Alzheimer's dementia.

  5. Weighted regression analysis and interval estimators

    Treesearch

    Donald W. Seegrist

    1974-01-01

    A method is presented for deriving the weighted least squares estimators for the parameters of a multiple regression model. Confidence intervals for expected values and prediction intervals for the means of future samples are given.
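A minimal sketch of weighted least squares on hypothetical heteroscedastic data, assuming the observation variances are known up to a constant: rows of the design matrix and the response are scaled by the square root of the weights, after which ordinary least squares applies:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: y = 2 + 3x with noise whose standard deviation grows
# with x, so unweighted least squares is inefficient.
x = np.linspace(1, 10, 40)
y = 2 + 3 * x + rng.normal(scale=0.5 * x)

# Weight each observation by the inverse of its (assumed known) variance.
w = 1.0 / (0.5 * x) ** 2

# Scale the design matrix and response by sqrt(w), then solve by OLS.
X = np.column_stack([np.ones_like(x), x])
sw = np.sqrt(w)
beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
print(f"intercept = {beta[0]:.2f}, slope = {beta[1]:.2f}")
```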

  6. Nanoscale Surface Characterization of Aqueous Copper Corrosion: Effects of Immersion Interval and Orthophosphate Concentration

    EPA Science Inventory

    Morphology changes for copper surfaces exposed to different water parameters were investigated at the nanoscale with atomic force microscopy (AFM), as influenced by changes in pH and the levels of orthophosphate ions. Synthetic water samples were designed to mimic physiological c...

  7. Reference intervals for selected serum biochemistry analytes in cheetahs Acinonyx jubatus.

    PubMed

    Hudson-Lamb, Gavin C; Schoeman, Johan P; Hooijberg, Emma H; Heinrich, Sonja K; Tordiffe, Adrian S W

    2016-02-26

    Published haematologic and serum biochemistry reference intervals are very scarce for captive cheetahs and even more so for free-ranging cheetahs. The current study was performed to establish reference intervals for selected serum biochemistry analytes in cheetahs. Baseline serum biochemistry analytes were analysed from 66 healthy Namibian cheetahs. Samples were collected from 30 captive cheetahs at the AfriCat Foundation and 36 free-ranging cheetahs from central Namibia. The effects of captivity status, age, sex and haemolysis score on the tested serum analytes were investigated. The biochemistry analytes that were measured were sodium, potassium, magnesium, chloride, urea and creatinine. The 90% confidence interval of the reference limits was obtained using the non-parametric bootstrap method. Reference intervals were preferentially determined by the non-parametric method and were as follows: sodium (128 mmol/L - 166 mmol/L), potassium (3.9 mmol/L - 5.2 mmol/L), magnesium (0.8 mmol/L - 1.2 mmol/L), chloride (97 mmol/L - 130 mmol/L), urea (8.2 mmol/L - 25.1 mmol/L) and creatinine (88 µmol/L - 288 µmol/L). Reference intervals from the current study were compared with International Species Information System values for cheetahs and found to be narrower. Moreover, age, sex and haemolysis score had no significant effect on the serum analytes in this study. Separate reference intervals for captive and free-ranging cheetahs were also determined. Captive cheetahs had higher urea values, most likely due to dietary factors. This study is the first to establish reference intervals for serum biochemistry analytes in cheetahs according to international guidelines. These results can be used for future health and disease assessments in both captive and free-ranging cheetahs.

  8. Indirect methods for reference interval determination - review and recommendations.

    PubMed

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as those produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to analyze results generated as part of routine pathology testing, using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used, and also to support the development of improved statistical techniques for these studies.

  9. Use of Ketorolac Is Associated with Decreased Pneumonia Following Rib Fractures

    PubMed Central

    Yang, Yifan; Young, Jason B.; Schermer, Carol R.; Utter, Garth H.

    2015-01-01

    Background The effectiveness of the non-steroidal anti-inflammatory drug ketorolac in reducing pulmonary morbidity following rib fractures remains largely unknown. Methods We conducted a retrospective cohort study spanning January, 2003 to June, 2011, comparing pneumonia within 30 days and potential adverse effects of ketorolac among all patients with rib fractures who received ketorolac within four days post-injury to a random sample of those who did not. Results Among 202 patients who received ketorolac and 417 who did not, ketorolac use was associated with decreased pneumonia [odds ratio 0.14 (95% confidence interval 0.04–0.46)] and increased ventilator- and intensive care unit-free days [1.8 (95% confidence interval 1.1–2.5) and 2.1 (95% confidence interval 1.3–3.0) days, respectively] within 30 days. The rates of acute kidney injury, gastrointestinal hemorrhage, and fracture non-union were not different. Conclusions Early administration of ketorolac to patients with rib fractures is associated with a decreased likelihood of pneumonia, without apparent risks. PMID:24112670

  10. Use of ketorolac is associated with decreased pneumonia following rib fractures.

    PubMed

    Yang, Yifan; Young, Jason B; Schermer, Carol R; Utter, Garth H

    2014-04-01

    The effectiveness of the nonsteroidal anti-inflammatory drug ketorolac in reducing pulmonary morbidity after rib fractures remains largely unknown. A retrospective cohort study was conducted spanning January 2003 to June 2011 assessing pneumonia within 30 days and potential adverse effects of ketorolac among all patients with rib fractures who received ketorolac <4 days after injury compared with a random sample of those who did not. Among 202 patients who received ketorolac and 417 who did not, ketorolac use was associated with decreased pneumonia (odds ratio, .14; 95% confidence interval, .04 to .46) and increased ventilator-free days (difference, 1.8 days; 95% confidence interval, 1.1 to 2.5) and intensive care unit-free days (difference, 2.1 days; 95% confidence interval, 1.3 to 3.0) within 30 days. The rates of acute kidney injury, gastrointestinal hemorrhage, and fracture nonunion were not different. Early administration of ketorolac to patients with rib fractures is associated with a decreased likelihood of pneumonia, without apparent risks. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Generalized SAMPLE SIZE Determination Formulas for Investigating Contextual Effects by a Three-Level Random Intercept Model.

    PubMed

    Usami, Satoshi

    2017-03-01

    Behavioral and psychological researchers have shown strong interest in investigating contextual effects (i.e., the influences of combinations of individual- and group-level predictors on individual-level outcomes). The present research provides generalized formulas for determining the sample size needed to investigate contextual effects according to the desired level of statistical power as well as the width of the confidence interval. These formulas are derived within a three-level random intercept model that includes one predictor/contextual variable at each level, to simultaneously cover the various kinds of contextual effects in which researchers may be interested. The relative influences of indices included in the formulas on the standard errors of contextual effect estimates are investigated with the aim of further simplifying sample size determination procedures. In addition, simulation studies are performed to investigate the finite sample behavior of calculated statistical power, showing that estimated sample sizes based on the derived formulas can be both positively and negatively biased due to the complex effects of unreliability of contextual variables, multicollinearity, and violation of the assumption of known variances. Thus, it is advisable to compare estimated sample sizes under various specifications of indices and to evaluate their potential bias, as illustrated in the example.
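The abstract does not reproduce the three-level formulas themselves. As a baseline for comparison, the standard single-level two-sample formula for detecting a mean difference can be sketched as follows; multilevel designs inflate this n by additional variance terms for each level:

```python
import math
from scipy.stats import norm

def two_sample_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size to detect a mean difference `delta` with
    two-sided significance level `alpha` and the given power, assuming
    common standard deviation `sigma`. This is the standard single-level
    formula, not the paper's three-level contextual-effect formulas."""
    z_a = norm.ppf(1 - alpha / 2)  # critical value for the test
    z_b = norm.ppf(power)          # quantile for the desired power
    return math.ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

print(two_sample_n(delta=0.5, sigma=1.0))  # 63 per group
```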

  12. Multiple Independent Genetic Factors at NOS1AP Modulate the QT Interval in a Multi-Ethnic Population

    PubMed Central

    Arking, Dan E.; Khera, Amit; Xing, Chao; Kao, W. H. Linda; Post, Wendy; Boerwinkle, Eric; Chakravarti, Aravinda

    2009-01-01

    Extremes of electrocardiographic QT interval are associated with increased risk for sudden cardiac death (SCD); thus, identification and characterization of genetic variants that modulate QT interval may elucidate the underlying etiology of SCD. Previous studies have revealed an association between a common genetic variant in NOS1AP and QT interval in populations of European ancestry, but this finding has not been extended to other ethnic populations. We sought to characterize the effects of NOS1AP genetic variants on QT interval in the multi-ethnic population-based Dallas Heart Study (DHS, n = 3,072). The SNP most strongly associated with QT interval in previous samples of European ancestry, rs16847548, was the most strongly associated in White (P = 0.005) and Black (P = 3.6×10−5) participants, with the same direction of effect in Hispanics (P = 0.17), and further showed a significant SNP × sex-interaction (P = 0.03). A second SNP, rs16856785, uncorrelated with rs16847548, was also associated with QT interval in Blacks (P = 0.01), with qualitatively similar results in Whites and Hispanics. In a previously genotyped cohort of 14,107 White individuals drawn from the combined Atherosclerotic Risk in Communities (ARIC) and Cardiovascular Health Study (CHS) cohorts, we validated both the second locus at rs16856785 (P = 7.63×10−8), as well as the sex-interaction with rs16847548 (P = 8.68×10−6). These data extend the association of genetic variants in NOS1AP with QT interval to a Black population, with similar trends, though not statistically significant at P<0.05, in Hispanics. In addition, we identify a strong sex-interaction and the presence of a second independent site within NOS1AP associated with the QT interval. These results highlight the consistent and complex role of NOS1AP genetic variants in modulating QT interval. PMID:19180230

  13. The effect of stimulation interval on plasticity following repeated blocks of intermittent theta burst stimulation.

    PubMed

    Tse, Nga Yan; Goldsworthy, Mitchell R; Ridding, Michael C; Coxon, James P; Fitzgerald, Paul B; Fornito, Alex; Rogasch, Nigel C

    2018-06-04

    This study assessed the effect of interval duration on the direction and magnitude of changes in cortical excitability and inhibition when applying repeated blocks of intermittent theta burst stimulation (iTBS) over motor cortex. Fifteen participants received three different iTBS conditions on separate days: single iTBS; repeated iTBS with a 5-minute interval (iTBS-5-iTBS); and with a 15-minute interval (iTBS-15-iTBS). Changes in cortical excitability and short-interval cortical inhibition (SICI) were assessed via motor-evoked potentials (MEPs) before and up to 60 mins following stimulation. iTBS-15-iTBS increased MEP amplitude for up to 60 mins post stimulation, whereas iTBS-5-iTBS decreased MEP amplitude. In contrast, MEP amplitude was not altered by single iTBS. Despite the group-level findings, only 53% of individuals showed facilitated MEPs following iTBS-15-iTBS, and only 40% showed inhibited MEPs following iTBS-5-iTBS. Modulation of SICI did not differ between conditions. These results suggest interval duration between spaced iTBS plays an important role in determining the direction of plasticity in excitatory, but not inhibitory, circuits in human motor cortex. While repeated iTBS can increase the magnitude of MEP facilitation/inhibition in some individuals compared to single iTBS, the response to repeated iTBS appears variable between individuals in this small sample.

  14. Resampling methods in Microsoft Excel® for estimating reference intervals

    PubMed Central

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions which lend themselves well to this purpose, including the recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
    Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples. PMID:26527366
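The same percentile-bootstrap procedure can be sketched outside Excel. A minimal Python version, assuming a hypothetical reference sample of 40 results and the recommended 500-1000 resamples with replacement:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical reference sample of 40 results (right-skewed, non-Gaussian).
sample = rng.lognormal(mean=4.4, sigma=0.3, size=40)

# Percentile bootstrap: resample with replacement 1000 times and collect
# the 2.5th/97.5th percentile estimates from each resample.
n_boot = 1000
lows, highs = [], []
for _ in range(n_boot):
    resample = rng.choice(sample, size=sample.size, replace=True)
    lo, hi = np.percentile(resample, [2.5, 97.5])
    lows.append(lo)
    highs.append(hi)

# Average across resamples for the limits; the spread of the bootstrap
# estimates indicates the uncertainty of each limit.
print(f"reference interval: ({np.mean(lows):.0f}, {np.mean(highs):.0f})")
print(f"90% CI of lower limit: {np.percentile(lows, [5, 95]).round(0)}")
```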

  15. Resampling methods in Microsoft Excel® for estimating reference intervals.

    PubMed

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions which lend themselves well to this purpose, including the recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
    Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.

  16. In vivo gamma-rays induced initial DNA damage and the effect of famotidine in mouse leukocytes as assayed by the alkaline comet assay.

    PubMed

    Mozdarani, Hossein; Nasirian, Borzo; Haeri, S Abolghasem

    2007-03-01

    Ionizing radiation induces a variety of lesions in DNA, each of which can be used as a bio-indicator for biological dosimetry or the study of the radioprotective effects of substances. To assess gamma ray-induced DNA damage in vivo in mouse leukocytes at various doses and the effect of famotidine, blood was collected from Balb/c male mice after irradiation with 4 Gy gamma-rays at different time intervals post-irradiation. To assess the dose response, mice were irradiated with gamma-ray doses of 1 to 4 Gy. Famotidine was injected intra-peritoneally (i.p.) at a dose of 5 mg/kg at various time intervals before irradiation. Four slides were prepared from each sample, and the alkaline comet assay was performed using standard protocols. The results show that radiation significantly increases DNA damage in leukocytes in a dose-dependent manner (p < 0.01) when an appropriate sampling time after irradiation is used, because increasing the sampling time after irradiation resulted in a time-dependent disappearance of DNA damage. Treatment with only 5 mg/kg famotidine before 4 Gy irradiation led to an almost 50% reduction in DNA damage compared with animals that received radiation alone. The radioprotective capability of famotidine might be attributed to radical scavenging properties and an anti-oxidation mechanism.

  17. Power in Bayesian Mediation Analysis for Small Sample Research

    PubMed Central

    Miočević, Milica; MacKinnon, David P.; Levy, Roy

    2018-01-01

    It was suggested that Bayesian methods have potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This paper compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product, percentile, and bias-corrected bootstrap confidence intervals at N ≤ 200. Bayesian methods with diffuse priors had power comparable to the distribution of the product and bootstrap methods, and Bayesian methods with informative priors had the most power. Varying degrees of precision of prior distributions were also examined. Increased precision led to greater power only when N ≥ 100 and the effects were small, N < 60 and the effects were large, and N < 200 and the effects were medium. An empirical example from psychology illustrated a Bayesian analysis of the single mediator model from prior selection to interpreting results. PMID:29662296

  18. Power in Bayesian Mediation Analysis for Small Sample Research.

    PubMed

    Miočević, Milica; MacKinnon, David P; Levy, Roy

    2017-01-01

    It was suggested that Bayesian methods have potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This paper compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product, percentile, and bias-corrected bootstrap confidence intervals at N ≤ 200. Bayesian methods with diffuse priors had power comparable to the distribution of the product and bootstrap methods, and Bayesian methods with informative priors had the most power. Varying degrees of precision of prior distributions were also examined. Increased precision led to greater power only when N ≥ 100 and the effects were small, N < 60 and the effects were large, and N < 200 and the effects were medium. An empirical example from psychology illustrated a Bayesian analysis of the single mediator model from prior selection to interpreting results.
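A Monte Carlo approximation to the distribution-of-the-product interval compared above can be sketched as follows; the path estimates and standard errors below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical single-mediator estimates: a (X -> M) and b (M -> Y given X),
# with their standard errors, as would come from two fitted regressions.
a_hat, se_a = 0.50, 0.10
b_hat, se_b = 0.40, 0.10

# Monte Carlo version of the distribution-of-the-product interval:
# draw a and b from their (approximately normal) sampling distributions
# and take quantiles of the product a*b, the mediated effect.
n_draws = 100_000
draws = rng.normal(a_hat, se_a, n_draws) * rng.normal(b_hat, se_b, n_draws)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"mediated effect = {a_hat * b_hat:.2f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```

Unlike a normal-theory interval, the quantiles of the simulated product are allowed to be asymmetric around the point estimate, which is where such methods gain power.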

  19. Comparison of passive diffusion bag samplers and submersible pump sampling methods for monitoring volatile organic compounds in ground water at Area 6, Naval Air Station, Whidbey Island, Washington

    USGS Publications Warehouse

    Huffman, Raegan L.

    2002-01-01

    Ground-water samples were collected in April 1999 at Naval Air Station Whidbey Island, Washington, with passive diffusion samplers and a submersible pump to compare concentrations of volatile organic compounds (VOCs) in water samples collected using the two sampling methods. Single diffusion samplers were installed in wells with 10-foot screened intervals, and multiple diffusion samplers were installed in wells with 20- to 40-foot screened intervals. The diffusion samplers were recovered after 20 days, and the wells were then sampled using a submersible pump. In the 10-foot screened wells, VOC concentrations in water samples collected with diffusion samplers closely matched concentrations in samples collected with the submersible pump. Analysis of VOC concentrations in samples collected from the 20- to 40-foot screened wells with multiple diffusion samplers indicated vertical concentration variation within the screened interval, whereas the analysis of VOC concentrations in samples collected with the submersible pump indicated mixing during pumping. The results obtained using the two sampling methods indicate that samples collected with the diffusion samplers were comparable to those collected with the submersible pump and can be considerably less expensive to obtain.

  20. Gas, water, and oil production from Wattenberg field in the Denver Basin, Colorado

    USGS Publications Warehouse

    Nelson, Philip H.; Santus, Stephen L.

    2011-01-01

    Gas, oil, and water production data were compiled from selected wells in two tight gas reservoirs in the Wattenberg field in the Denver Basin of Colorado: the Codell-Niobrara interval, composed of the Codell Sandstone Member of the Carlile Shale and the Niobrara Formation, and the Dakota J interval, composed mostly of the Muddy (J) Sandstone of the Dakota Group; both intervals are of Cretaceous age. Production from each well is represented by two samples spaced five years apart, the first sample typically taken two years after production commenced, which generally was in the 1990s. For each producing interval, summary diagrams and tables of oil-versus-gas production and water-versus-gas production are shown with fluid-production rates, the change in production over five years, the water-gas and oil-gas ratios, and the fluid type. These diagrams and tables permit well-to-well and field-to-field comparisons. Fields producing water at low rates (water dissolved in gas in the reservoir) can be distinguished from fields producing water at moderate or high rates, and the water-gas ratios are quantified. The Dakota J interval produces gas on a per-well basis at roughly three times the rate of the Codell-Niobrara interval. After five years of production, gas data from the second samples show that both intervals produce gas, on average, at about one-half the rate of the first sample. Oil-gas ratios in the Codell-Niobrara interval are characteristic of a retrograde gas and are considerably higher than oil-gas ratios in the Dakota J interval, which are characteristic of a wet gas. Water production from both intervals is low, and records in many wells are discontinuous, particularly in the Codell-Niobrara interval. Water-gas ratios are broadly variable, with some of the variability possibly due to the difficulty of measuring small production rates. Most wells for which water is reported have water-gas ratios exceeding the amount that could exist dissolved in gas at reservoir pressure and temperature. The Codell-Niobrara interval is reported to be overpressured (that is, pressure greater than hydrostatic), whereas the underlying Dakota J interval is underpressured (less than hydrostatic), demonstrating a lack of hydraulic communication between the two intervals despite their proximity over a broad geographical area. The underpressuring in the Dakota J interval has been attributed by others to outcropping strata east of the basin. We agree with this interpretation and postulate that the gas accumulation also may contribute to hydraulic isolation from outcrops immediately west of the basin.

  1. Global Geopotential Modelling from Satellite-to-Satellite Tracking,

    DTIC Science & Technology

    1981-10-01

    measured range-rate sampled at regular intervals. The expansion of the potential has been truncated at degree n = 331, because little information on… averaging interval is 4 s, and sampling takes place every 4 s; if residual data are used, with respect to a reference model of specified accuracy, complete…

  2. Serum TSH reference interval in healthy Finnish adults using the Abbott Architect 2000i Analyzer.

    PubMed

    Schalin-Jäntti, Camilla; Tanner, Pirjo; Välimäki, Matti J; Hämäläinen, Esa

    2011-07-01

    Current serum TSH reference intervals have been criticized as they were established from unselected background populations. A special concern is that the upper limit, which defines subclinical hypothyroidism, is too high. The objective was to redefine the TSH reference interval in the adult Finnish population. The current reference interval for the widely used Abbott Architect method in Finland is 0.4-4.0 mU/L. Serum TSH and free T4 concentrations were derived from 606 healthy, non-pregnant, 18-91-year-old Finns from the Nordic Reference Interval Project (NORIP) and the possible effects of age, sex and thyroid peroxidase antibody (TPOAb) status were evaluated. After excluding TPOAb-positive subjects and outliers, a reference population of 511 subjects was obtained. In the reference population, no statistically significant gender- or age-specific differences in mean TSH (1.55 ± 3.30 mU/L) or TSH reference intervals were observed. The new reference interval was 0.5-3.6 mU/L (2.5th-97.5th percentiles). The current upper TSH reference limit is 10% too high. A TSH > 3.6 mU/L, confirmed with a repeat TSH sampling, may indicate subclinical hypothyroidism. Differences in ethnicity, regional iodine-intake and analytical methods underline the need for redefining the TSH reference interval in central laboratories in different countries.
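The 2.5th-97.5th percentile interval reported above is a standard nonparametric reference interval. As a minimal sketch of that computation (with a simulated right-skewed sample of 511 values standing in for the NORIP reference population, not the actual data):

```python
import random

def reference_interval(values, lower=2.5, upper=97.5):
    """Nonparametric reference interval: the central 95% of the sample
    (2.5th to 97.5th percentiles, linear interpolation between ranks)."""
    s = sorted(values)

    def pct(p):
        k = (len(s) - 1) * p / 100.0
        f = int(k)
        c = min(f + 1, len(s) - 1)
        return s[f] + (s[c] - s[f]) * (k - f)

    return pct(lower), pct(upper)

# Illustrative only: a simulated lognormal "TSH" sample, not NORIP data.
random.seed(1)
tsh = [round(random.lognormvariate(0.2, 0.5), 2) for _ in range(511)]
lo, hi = reference_interval(tsh)
print(f"reference interval: {lo:.2f}-{hi:.2f} mU/L")
```

In practice, outlier and antibody-positive exclusions (as in the abstract) would be applied before taking the percentiles.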

  3. Improving regression-model-based streamwater constituent load estimates derived from serially correlated data

    USGS Publications Warehouse

    Aulenbach, Brent T.

    2013-01-01

    A regression-model-based approach is a commonly used, efficient method for estimating streamwater constituent load when there is a relationship between streamwater constituent concentration and continuous variables such as streamwater discharge, season, and time. A subsetting experiment using a 30-year dataset of daily suspended sediment observations from the Mississippi River at Thebes, Illinois, was performed to determine optimal sampling frequency, model calibration period length, and regression model methodology, as well as to determine the effect of serial correlation of model residuals on load estimate precision. Two regression-based methods were used to estimate streamwater loads: the Adjusted Maximum Likelihood Estimator (AMLE) and the composite method, a hybrid load estimation approach. While both methods accurately and precisely estimated loads at the model's calibration period time scale, precision was progressively worse for shorter reporting periods, from annual to monthly. Serial correlation in model residuals caused observed AMLE precision to be significantly worse than the model-calculated standard errors of prediction. The composite method effectively improved upon AMLE loads for shorter reporting periods, but required a sampling interval of 15 days or shorter when the serial correlations in the observed load residuals were greater than 0.15. AMLE precision was better at shorter sampling intervals and with the shortest model calibration periods, such that the regression models better fit the temporal changes in the concentration-discharge relationship. The models with the largest errors typically had poor high-flow sampling coverage, resulting in unrepresentative models. Increasing sampling frequency and/or targeted high-flow sampling are more efficient approaches to ensuring sufficient sampling and avoiding poorly performing models than increasing calibration period length.

  4. Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.

    PubMed

    Falk, Carl F; Biesanz, Jeremy C

    2011-11-30

    Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables or provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, & Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1,680 3.06-GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
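The percentile bootstrap that performed well here is simple to sketch for a single-mediator model. A minimal stdlib-only illustration with simulated (not study) data, using observed-variable regression rather than the paper's latent-variable models; all names and parameter values below are of my own choosing:

```python
import random

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def indirect_effect(x, m, y):
    """a*b for the single-mediator model: a = slope of m on x;
    b = partial slope of y on m controlling for x, obtained by
    residualizing m and y on x (Frisch-Waugh)."""
    a = slope(x, m)
    c = slope(x, y)
    m_res = [mi - a * xi for mi, xi in zip(m, x)]
    y_res = [yi - c * xi for yi, xi in zip(y, x)]
    return a * slope(m_res, y_res)

def percentile_boot_ci(x, m, y, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap CI: resample cases, recompute a*b, take
    the alpha/2 and 1-alpha/2 quantiles of the bootstrap distribution."""
    rng = random.Random(seed)
    n = len(x)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(indirect_effect([x[i] for i in idx],
                                     [m[i] for i in idx],
                                     [y[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Toy data with a true indirect effect of 0.5 * 0.5 = 0.25.
rng = random.Random(3)
x = [rng.gauss(0, 1) for _ in range(200)]
m = [0.5 * xi + rng.gauss(0, 1) for xi in x]
y = [0.5 * mi + rng.gauss(0, 1) for mi in m]
lo, hi = percentile_boot_ci(x, m, y)
print(f"95% percentile bootstrap CI for a*b: ({lo:.3f}, {hi:.3f})")
```

The bias-corrected variants criticized in the abstract differ only in how the two quantile cut points are chosen.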

  5. [Effects of the late marriage of Korean women on the first-birth interval].

    PubMed

    Chung, Woojin; Lee, Kyoungae; Lee, Sunmi

    2006-05-01

    The purpose of this study was to examine the effect of women's late age at marriage on the interval between marriage and first birth. Data from the 2000 Korea National Fertility Survey were collected through direct interviews of a randomly selected sample. The married women (N = 5,648) were analyzed for factors determining the first-birth interval using Cox proportional hazards survival analysis. Unlike previous findings, women who married at age 30 or older were more likely to delay the birth of their first baby than women who married earlier. Further, a woman's age at marriage, her residence before marriage, her husband's religion, her husband's level of education, and the age difference between the woman and her husband significantly influenced the first-birth interval. In contrast, a married woman's age, level of education, current residence, and religion were not significant predictors of her first-birth interval. Our study showed that women in Korea who marry at age 30 or older tend to postpone their first birth. Facing an increasing number of women who marry late, the Korean government should implement population and social policies that encourage married women to have their first child as early as possible.

  6. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands using a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods in a real-world data set.

  7. Relations between Parental Psychological Control and Childhood Relational Aggression: Reciprocal in Nature?

    ERIC Educational Resources Information Center

    Kuppens, Sofie; Grietens, Hans; Onghena, Patrick; Michiels, Daisy

    2009-01-01

    Using a cross-lagged panel design, this study examined the directionality of relations between parental psychological control and child relational aggression. Data were collected from a proportionally stratified sample of 600 Flemish 8- to 10-year-old children at 3 measurement points with 1-year intervals. Reciprocal effects were evident in…

  8. The effect of covariate mean differences on the standard error and confidence interval for the comparison of treatment means.

    PubMed

    Liu, Xiaofeng Steven

    2011-05-01

    The use of covariates is commonly believed to reduce the unexplained error variance and the standard error for the comparison of treatment means, but the reduction in the standard error is neither guaranteed nor uniform over different sample sizes. The covariate mean differences between the treatment conditions can inflate the standard error of the covariate-adjusted mean difference and can actually produce a larger standard error for the adjusted mean difference than for the unadjusted mean difference. When the covariate observations are conceived of as randomly varying from one study to another, the covariate mean differences can be related to a Hotelling's T(2) statistic. Using this statistic, one can always find a minimum sample size to achieve a high probability of reducing the standard error and confidence interval width for the adjusted mean difference. ©2010 The British Psychological Society.

  9. Methods for estimating confidence intervals in interrupted time series analyses of health interventions.

    PubMed

    Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis

    2009-02-01

    Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used the multivariate delta method (MDM) and a bootstrapping method (BM) to construct CIs around relative changes in level and trend, and around absolute changes in outcome, based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for interacting medications with warfarin on the rate of prescriptions per 10,000 warfarin users per month. The MDM and the BM produced similar results. The BM is preferred for calculating CIs of relative changes in outcomes of time series studies because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when sample size is small.
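The quantity being interval-estimated here, a relative change in level at the interruption, can be illustrated with a deliberately simplified two-segment fit and a case-resampling bootstrap. This is a stdlib-only sketch with simulated data; it omits the paper's full segmented-regression model and autocorrelation correction:

```python
import random

def fit_line(t, y):
    """OLS intercept and slope of y on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    b = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
         / sum((ti - mt) ** 2 for ti in t))
    return my - b * mt, b

def relative_level_change(t_pre, y_pre, t_post, y_post, t0):
    """Relative change in level at interruption time t0:
    (post-fit level - pre-trend counterfactual) / counterfactual."""
    a1, b1 = fit_line(t_pre, y_pre)
    a2, b2 = fit_line(t_post, y_post)
    counterfactual = a1 + b1 * t0
    return (a2 + b2 * t0 - counterfactual) / counterfactual

# Simulated monthly prescribing rates: flat near 100 before an alert at
# month 24, then an immediate ~20% drop, plus noise.
rng = random.Random(5)
t_pre, t_post = list(range(24)), list(range(24, 48))
y_pre = [100 + rng.gauss(0, 3) for _ in t_pre]
y_post = [80 + rng.gauss(0, 3) for _ in t_post]

# Case-resampling bootstrap within each segment -> percentile CI.
boots = []
for _ in range(2000):
    ip = [rng.randrange(24) for _ in range(24)]
    iq = [rng.randrange(24) for _ in range(24)]
    boots.append(relative_level_change(
        [t_pre[i] for i in ip], [y_pre[i] for i in ip],
        [t_post[i] for i in iq], [y_post[i] for i in iq], 24))
boots.sort()
lo, hi = boots[50], boots[1949]
print(f"95% bootstrap CI for relative level change: ({lo:.3f}, {hi:.3f})")
```

The delta-method alternative would instead propagate the covariance matrix of the regression coefficients through this ratio analytically.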

  10. Testing equality and interval estimation of the generalized odds ratio in ordinal data under a three-period crossover design.

    PubMed

    Lui, Kung-Jong; Chang, Kuang-Chao; Lin, Chii-Dean

    2017-06-01

    The crossover design can reduce the number of patients required or improve the power of a parallel-groups design in studying treatments for noncurable chronic diseases. We propose using the generalized odds ratio for paired-sample data to measure the relative effects between treatments and between periods in ordinal data. We show that one can apply the commonly used asymptotic and exact test procedures for stratified analysis in epidemiology to test non-equality of treatments in ordinal data, as well as obtain asymptotic and exact interval estimators for the generalized odds ratio under a three-period crossover design. We further show that one can apply procedures for testing the homogeneity of the odds ratio under stratified sampling to examine whether there are treatment-by-period interactions. We use data taken from a three-period crossover trial studying the effects of low and high doses of an analgesic versus a placebo for the relief of pain in primary dysmenorrhea to illustrate the use of the test procedures and estimators proposed here.

  11. Variance Estimation, Design Effects, and Sample Size Calculations for Respondent-Driven Sampling

    PubMed Central

    2006-01-01

    Hidden populations, such as injection drug users and sex workers, are central to a number of public health problems. However, because of the nature of these groups, it is difficult to collect accurate information about them, and this difficulty complicates disease prevention efforts. A recently developed statistical approach called respondent-driven sampling improves our ability to study hidden populations by allowing researchers to make unbiased estimates of the prevalence of certain traits in these populations. Yet, not enough is known about the sample-to-sample variability of these prevalence estimates. In this paper, we present a bootstrap method for constructing confidence intervals around respondent-driven sampling estimates and demonstrate in simulations that it outperforms the naive method currently in use. We also use simulations and real data to estimate the design effects for respondent-driven sampling in a number of situations. We conclude with practical advice about the power calculations that are needed to determine the appropriate sample size for a study using respondent-driven sampling. In general, we recommend a sample size twice as large as would be needed under simple random sampling. PMID:16937083
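The closing recommendation (a sample twice as large as under simple random sampling) is a design-effect inflation, and translates directly into a sample-size formula. A small sketch; the function name and defaults are my own choices, not from the paper:

```python
import math

def rds_sample_size(p, margin, deff=2.0, z=1.96):
    """Sample size to estimate a prevalence p within +/- margin at 95%
    confidence, inflated by a design effect. deff=2 reflects the paper's
    rule of thumb of doubling the simple-random-sampling size for RDS."""
    n_srs = math.ceil(z ** 2 * p * (1 - p) / margin ** 2)
    return math.ceil(deff * n_srs)

# Worst-case prevalence 0.5, +/- 5 percentage points:
print(rds_sample_size(0.5, 0.05))  # -> 770 (i.e., 2 x 385)
```

Simulation-based design effects, as estimated in the paper, would replace the fixed `deff` here.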

  12. Use of Self-Matching to Control for Stable Patient Characteristics While Addressing Time-Varying Confounding on Treatment Effect: A Case Study of Older Intensive Care Patients.

    PubMed

    Han, Ling; Pisani, M A; Araujo, K L B; Allore, Heather G

    Exposure-crossover design offers a non-experimental option to control for stable baseline confounding through self-matching while examining the causal effect of an exposure on an acute outcome. This study extends this approach to longitudinal data with repeated measures of exposure and outcome, using data from a cohort of 340 older medical patients in an intensive care unit (ICU). The analytic sample included 92 patients who received ≥1 dose of haloperidol, an antipsychotic medication often used for patients with delirium. Exposure-crossover design was implemented by sampling the 3-day time segments prior (Induction) and posterior (Subsequent) to each treatment episode of receiving haloperidol. In the full cohort, there was a trend of increasing delirium severity scores (mean ± SD: 4.4 ± 1.7) over the course of the ICU stay. After exposure-crossover sampling, the delirium severity score decreased from the Induction (4.9) to the Subsequent (4.1) intervals, with the treatment episode falling in between (4.5). Based on a GEE Poisson model accounting for self-matching and within-subject correlation, the unadjusted mean delirium severity score was -0.55 (95% CI: -1.10, -0.01) points lower for the Subsequent than the Induction intervals. The association diminished by 32% (-0.38; 95% CI: -0.99, 0.24) after adjusting only for ICU confounding, while increasing slightly, by 7% (-0.60; 95% CI: -1.15, -0.04), when adjusting only for baseline characteristics. These results suggest that the longitudinal exposure-crossover design is feasible and capable of partially removing stable baseline confounding through self-matching. Loss of power due to eliminating treatment-irrelevant person-time and uncertainty around allocating person-time to comparison intervals remain methodological challenges.

  13. The effect of packaging materials on the stability of sunscreen emulsions.

    PubMed

    Santoro, Maria Inês R M; Da Costa E Oliveira, Daniella Almança Gonçalves; Kedor-Hackmann, Erika R M; Singh, Anil K

    2005-06-13

    The purpose of this research was to study the stability of an emulsion containing UVA, UVB, and infrared sunscreens after storage in different types of packaging materials (glass and plastic flasks; plastic and metallic tubes). The samples, emulsions containing benzophenone-3 (B-3), octyl methoxycinnamate (OM), and Phycocorail, were stored at 10, 25, 35, and 45 degrees C, and representative samples were analyzed after 2, 7, 30, 60, and 90 days. The stability studies were conducted by analyzing samples at pre-determined intervals by high-performance liquid chromatography (HPLC), along with periodic rheological measurements.

  14. Annealing Increases Stability Of Iridium Thermocouples

    NASA Technical Reports Server (NTRS)

    Germain, Edward F.; Daryabeigi, Kamran; Alderfer, David W.; Wright, Robert E.; Ahmed, Shaffiq

    1989-01-01

    Metallurgical studies carried out on samples of iridium versus iridium/40-percent rhodium thermocouples in condition received from manufacturer. Metallurgical studies included x-ray, macroscopic, resistance, and metallographic studies. Revealed large amount of internal stress caused by cold-working during manufacturing, and large number of segregations and inhomogeneities. Samples annealed in furnace at temperatures from 1,000 to 2,000 degrees C for intervals up to 1 h to study effects of heat treatment. Wire annealed by this procedure found to be ductile.

  15. Interlaboratory Reproducibility and Proficiency Testing within the Human Papillomavirus Cervical Cancer Screening Program in Catalonia, Spain

    PubMed Central

    Ibáñez, R.; Félez-Sánchez, M.; Godínez, J. M.; Guardià, C.; Caballero, E.; Juve, R.; Combalia, N.; Bellosillo, B.; Cuevas, D.; Moreno-Crespi, J.; Pons, L.; Autonell, J.; Gutierrez, C.; Ordi, J.; de Sanjosé, S.

    2014-01-01

    In Catalonia, a screening protocol for cervical cancer, including human papillomavirus (HPV) DNA testing using the Digene Hybrid Capture 2 (HC2) assay, was implemented in 2006. In order to monitor interlaboratory reproducibility, a proficiency testing (PT) survey of the HPV samples was launched in 2008. The aim of this study was to explore the repeatability of the HC2 assay's performance. Participating laboratories provided 20 samples annually, 5 randomly chosen samples from each of the following relative light unit (RLU) intervals: <0.5, 0.5 to 0.99, 1 to 9.99, and ≥10. Kappa statistics were used to determine the agreement levels between the original and the PT readings. The nature and origin of the discrepant results were assessed by bootstrapping. A total of 946 specimens were retested. The kappa values were 0.91 for positive/negative categorical classification and 0.79 for the four RLU intervals studied. Sample retesting yielded systematically lower RLU values than the original test (P < 0.005), independently of the time elapsed between the two determinations (median, 53 days), possibly due to freeze-thaw cycles. The probability for a sample to show clinically discrepant results upon retesting was a function of the RLU value; samples with RLU values in the 0.5 to 5 interval showed 10.80% probability to yield discrepant results (95% confidence interval [CI], 7.86 to 14.33) compared to 0.85% probability for samples outside this interval (95% CI, 0.17 to 1.69). Globally, the HC2 assay shows high interlaboratory concordance. We have identified differential confidence thresholds and suggested the guidelines for interlaboratory PT in the future, as analytical quality assessment of HPV DNA detection remains a central component of the screening program for cervical cancer prevention. PMID:24574284
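The kappa statistic used to score positive/negative agreement between original and proficiency-test readings is straightforward to compute. A minimal sketch with hypothetical HC2 calls (not the study's 946 specimens):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Unweighted Cohen's kappa: observed agreement corrected for the
    agreement expected by chance from each reading's marginal rates."""
    assert len(r1) == len(r2)
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_exp = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical original vs. retest positive/negative calls.
orig = ["pos"] * 60 + ["neg"] * 40
retest = ["pos"] * 55 + ["neg"] * 5 + ["neg"] * 38 + ["pos"] * 2
print(round(cohens_kappa(orig, retest), 2))
```

The same function applied to the four RLU interval labels instead of binary calls gives the multi-category (unweighted) kappa reported above.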

  16. Cigarette smoke chemistry market maps under Massachusetts Department of Public Health smoking conditions.

    PubMed

    Morton, Michael J; Laffoon, Susan W

    2008-06-01

    This study extends the market mapping concept introduced by Counts et al. (Counts, M.E., Hsu, F.S., Tewes, F.J., 2006. Development of a commercial cigarette "market map" comparison methodology for evaluating new or non-conventional cigarettes. Regul. Toxicol. Pharmacol. 46, 225-242) to include both temporal cigarette and testing variation and also machine smoking with more intense puffing parameters, as defined by the Massachusetts Department of Public Health (MDPH). The study was conducted over a two year period and involved a total of 23 different commercial cigarette brands from the U.S. marketplace. Market mapping prediction intervals were developed for 40 mainstream cigarette smoke constituents and the potential utility of the market map as a comparison tool for new brands was demonstrated. The over-time character of the data allowed for the variance structure of the smoke constituents to be more completely characterized than is possible with one-time sample data. The variance was partitioned among brand-to-brand differences, temporal differences, and the remaining residual variation using a mixed random and fixed effects model. It was shown that a conventional weighted least squares model typically gave similar prediction intervals to those of the more complicated mixed model. For most constituents there was less difference in the prediction intervals calculated from over-time samples and those calculated from one-time samples than had been anticipated. One-time sample maps may be adequate for many purposes if the user is aware of their limitations. Cigarette tobacco fillers were analyzed for nitrate, nicotine, tobacco-specific nitrosamines, ammonia, chlorogenic acid, and reducing sugars. The filler information was used to improve predicting relationships for several of the smoke constituents, and it was concluded that the effects of filler chemistry on smoke chemistry were partial explanations of the observed brand-to-brand variation.

  17. Resonance Shift of Single-Axis Acoustic Levitation

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jun; Wei, Bing-Bo

    2007-01-01

    The resonance shift due to the presence and movement of a rigid spherical sample in a single-axis acoustic levitator is studied with the boundary element method on the basis of a two-cylinder model of the levitator. The introduction of a sample into the sound pressure nodes, where it is usually levitated, reduces the resonant interval Hn (n is the mode number) between the reflector and emitter. The larger the sample radius, the greater the resonance shift. When the sample moves along the symmetric axis, the resonance interval Hn varies in an approximately periodic manner, reaching minima near the pressure nodes and maxima near the pressure antinodes. This suggests a resonance interval oscillation around its minimum if the stably levitated sample is slightly perturbed. The dependence of the resonance shift on the sample radius R and position h for the single-axis acoustic levitator is compared with Leung's theory for a closed rectangular chamber, which shows a good agreement.

  18. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed-interval (1-32 days) or a rule-based (decision-tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimate increased with the fixed-interval length. A 4-day and an 8-day fixed sampling interval were required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as the fixed-interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed-interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux. These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of estimates of cumulative N2O fluxes obtained with the discrete chamber-based method.

  19. The association between subclinical mastitis around calving and reproductive performance in grazing dairy cows.

    PubMed

    Villa-Arcila, N A; Sanchez, J; Ratto, M H; Rodriguez-Lecompte, J C; Duque-Madrid, P C; Sanchez-Arias, S; Ceballos-Marquez, A

    2017-10-01

    The objective of this study was to evaluate the effect of subclinical mastitis (SCM) on calving-to-first-service interval (CFS), calving-to-conception interval (CC), and the number of services per conception (S/C) in grazing Holstein and Normande cows. Primiparous (n=43) and multiparous (n=165) cows were selected from five dairy herds. Two composite milk samples were aseptically collected from each cow at drying-off, and then every week during the first postpartum month. One sample was used for somatic cell count (SCC) and the other for bacteriological analysis. Cows were followed up to 300 d after calving. Non-parametric and parametric survival models and negative binomial regression were used to assess the association between SCM, evaluated by SCC and milk culture, and reproductive indices. Staphylococcus aureus, CNS, and Streptococcus uberis were the most frequently isolated pathogens. Subclinical mastitis in the first month of lactation was not associated with CFS; however, the CC interval was longer in cows with SCM compared to healthy cows, and the former also had a higher number of S/C. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Practical Advice on Calculating Confidence Intervals for Radioprotection Effects and Reducing Animal Numbers in Radiation Countermeasure Experiments

    PubMed Central

    Landes, Reid D.; Lensing, Shelly Y.; Kodell, Ralph L.; Hauer-Jensen, Martin

    2014-01-01

    The dose of a substance that causes death in P% of a population is called an LDP, where LD stands for lethal dose. In radiation research, a common LDP of interest is the radiation dose that kills 50% of the population by a specified time, i.e., lethal dose 50 or LD50. When comparing LD50 between two populations, relative potency is the parameter of interest. In radiation research, this is commonly known as the dose reduction factor (DRF). Unfortunately, statistical inference on dose reduction factor is seldom reported. We illustrate how to calculate confidence intervals for dose reduction factor, which may then be used for statistical inference. Further, most dose reduction factor experiments use hundreds, rather than tens of animals. Through better dosing strategies and the use of a recently available sample size formula, we also show how animal numbers may be reduced while maintaining high statistical power. The illustrations center on realistic examples comparing LD50 values between a radiation countermeasure group and a radiation-only control. We also provide easy-to-use spreadsheets for sample size calculations and confidence interval calculations, as well as SAS® and R code for the latter. PMID:24164553
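    The paper supplies spreadsheets and SAS®/R code for these calculations; as a rough illustration of the idea (not the authors' exact method), a confidence interval for the DRF can be computed on the log scale with a delta-method standard error from two LD50 estimates and their standard errors. The LD50 values and standard errors below are hypothetical:

```python
import math

def drf_confidence_interval(ld50_treat, se_treat, ld50_ctrl, se_ctrl):
    """Approximate 95% CI for the dose reduction factor
    DRF = LD50_treat / LD50_ctrl, assuming the LD50 estimates are
    approximately log-normally distributed (delta method on the log scale)."""
    z = 1.959963984540054  # 97.5th percentile of the standard normal
    log_drf = math.log(ld50_treat) - math.log(ld50_ctrl)
    se_log = math.sqrt((se_treat / ld50_treat) ** 2 + (se_ctrl / ld50_ctrl) ** 2)
    lo = math.exp(log_drf - z * se_log)
    hi = math.exp(log_drf + z * se_log)
    return math.exp(log_drf), (lo, hi)

# Hypothetical numbers: countermeasure group LD50 = 9.2 Gy (SE 0.25),
# radiation-only control LD50 = 8.0 Gy (SE 0.2).
drf, (lo, hi) = drf_confidence_interval(9.2, 0.25, 8.0, 0.2)
print(round(drf, 3), round(lo, 3), round(hi, 3))
```

A lower bound above 1 would support a statistically significant radioprotective effect, which is exactly the inference the point estimate alone cannot provide.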

  1. Excitation-based and informational masking of a tonal signal in a four-tone masker.

    PubMed

    Leibold, Lori J; Hitchens, Jack J; Buss, Emily; Neff, Donna L

    2010-04-01

    This study examined contributions of peripheral excitation and informational masking to the variability in masking effectiveness observed across samples of multi-tonal maskers. Detection thresholds were measured for a 1000-Hz signal presented simultaneously with each of 25, four-tone masker samples. Using a two-interval, forced-choice adaptive task, thresholds were measured with each sample fixed throughout trial blocks for ten listeners. Average thresholds differed by as much as 26 dB across samples. An excitation-based model of partial loudness [Moore, B. C. J. et al. (1997). J. Audio Eng. Soc. 45, 224-237] was used to predict thresholds. These predictions accounted for a significant portion of variance in the data of several listeners, but no relation between the model and data was observed for many listeners. Moreover, substantial individual differences, on the order of 41 dB, were observed for some maskers. The largest individual differences were found for maskers predicted to produce minimal excitation-based masking. In subsequent conditions, one of five maskers was randomly presented in each interval. The difference in performance for samples with low versus high predicted thresholds was reduced in random compared to fixed conditions. These findings are consistent with a trading relation whereby informational masking is largest for conditions in which excitation-based masking is smallest.

  2. A genetic algorithm-based framework for wavelength selection on sample categorization.

    PubMed

    Anzanello, Michel J; Yamashita, Gabrielli; Marcelo, Marcelo; Fogliatto, Flávio S; Ortiz, Rafael S; Mariotti, Kristiane; Ferrão, Marco F

    2017-08-01

    In forensic and pharmaceutical scenarios, the application of chemometrics and optimization techniques has unveiled common and peculiar features of seized medicine and drug samples, helping investigative forces to track illegal operations. This paper proposes a novel framework aimed at identifying relevant subsets of attenuated total reflectance Fourier transform infrared (ATR-FTIR) wavelengths for classifying samples into two classes, for example authentic or forged categories in the case of medicines, or salt or base form in cocaine analysis. In the first step of the framework, the ATR-FTIR spectra were partitioned into equidistant intervals and the k-nearest neighbour (KNN) classification technique was applied to each interval to insert samples into proper classes. In the next step, selected intervals were refined through the genetic algorithm (GA) by identifying a limited number of wavelengths from the intervals previously selected, aimed at maximizing classification accuracy. When applied to Cialis®, Viagra®, and cocaine ATR-FTIR datasets, the proposed method substantially decreased the number of wavelengths needed for categorization and increased the classification accuracy. From a practical perspective, the proposed method provides investigative forces with valuable information towards monitoring illegal production of drugs and medicines. In addition, focusing on a reduced subset of wavelengths allows the development of portable devices capable of testing the authenticity of samples during police checking events, avoiding the need for later laboratory analyses and reducing equipment expenses. Theoretically, the proposed GA-based approach yields more refined solutions than the current methods relying on interval approaches, which tend to insert irrelevant wavelengths in the retained intervals. Copyright © 2016 John Wiley & Sons, Ltd.
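    A stripped-down version of the wrapper idea, a GA over binary wavelength masks with fitness from KNN accuracy, can be sketched as follows. This is not the authors' implementation: the data are synthetic, the classifier is a leave-one-out 1-NN, and the GA operators are deliberately minimal:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class "spectra": 60 wavelengths, of which only three carry
# class information (purely illustrative, not real ATR-FTIR data).
n_per_class, n_wl = 20, 60
informative = [5, 23, 47]
X = rng.normal(0.0, 1.0, (2 * n_per_class, n_wl))
y = np.repeat([0, 1], n_per_class)
X[n_per_class:, informative] += 2.0   # shift class 1 at informative wavelengths

def loo_knn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the wavelengths selected by mask."""
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask]
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)           # a sample cannot be its own neighbour
    return float(np.mean(y[d.argmin(axis=1)] == y))

def ga_select(X, y, pop=30, gens=25):
    """Tiny GA: tournament selection, uniform crossover, bit-flip mutation.
    Fitness is LOO accuracy minus a small penalty per selected wavelength."""
    n_wl = X.shape[1]
    P = rng.random((pop, n_wl)) < 0.2     # sparse random initial population
    for _ in range(gens):
        fit = np.array([loo_knn_accuracy(X, y, m) - 0.001 * m.sum() for m in P])
        children = []
        for _ in range(pop):
            parents = []
            for _ in range(2):            # two tournaments of size 2
                a, b = rng.integers(0, pop, 2)
                parents.append(P[a] if fit[a] >= fit[b] else P[b])
            cross = rng.random(n_wl) < 0.5
            child = np.where(cross, parents[0], parents[1])
            child ^= rng.random(n_wl) < 0.01   # bit-flip mutation
            children.append(child)
        P = np.array(children)
    fit = np.array([loo_knn_accuracy(X, y, m) for m in P])
    return P[fit.argmax()], float(fit.max())

mask, acc = ga_select(X, y)
print(acc, np.flatnonzero(mask))
```

The per-wavelength penalty in the fitness plays the role of the paper's drive toward small subsets suitable for portable devices.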

  3. Feasibility and effects of a combined adjuvant high-intensity interval/strength training in breast cancer patients: a single-center pilot study.

    PubMed

    Schulz, Sebastian Viktor Waldemar; Laszlo, Roman; Otto, Stephanie; Prokopchuk, Dmytro; Schumann, Uwe; Ebner, Florian; Huober, Jens; Steinacker, Jürgen Michael

    2018-06-01

    To evaluate the feasibility of an exercise intervention consisting of high-intensity interval endurance and strength training in breast cancer patients. Twenty-six women with nonmetastatic breast cancer were consecutively assigned to the exercise intervention group (n = 15, mean age 51.9 ± 9.8 years) and the control group (n = 11, mean age 56.9 ± 7.0 years). Cardiopulmonary exercise testing that included lactate sampling, one-repetition maximum tests and a HADS-D questionnaire were used to monitor patients both before and after a supervised six-week period of either combined high-intensity interval endurance and strength training (intervention group, twice a week) or leisure training (control group). In contrast to the control group, endurance (mean change of VO2peak 12.0 ± 13.0%), strength performance (mean change of cumulative load 25.9 ± 11.2%), and quality of life increased in the intervention group. No training-related adverse events were observed. Our guided exercise intervention could be used effectively for the initiation and improvement of performance capacity and quality of life in breast cancer patients in a relatively short time. This might be especially attractive during medical treatment. Long-term effects have to be evaluated in randomized controlled studies, also with a longer follow-up. Implications for Rehabilitation High-intensity interval training allows improvement of aerobic capacity within a comparably short time. Standard leisure training in breast cancer patients is rather suitable for the maintenance of performance capacity and quality of life. Guided high-intensity interval training combined with strength training can be used effectively for the improvement of endurance and strength capacity and also quality of life. After exclusion of contraindications, guided adjuvant high-intensity interval training combined with strength training can be safely used in breast cancer patients.

  4. The role of fire-return interval and season of burn in snag dynamics in a south Florida slash pine forest

    USGS Publications Warehouse

    Lloyd, John D.; Slater, Gary L.; Snyder, James R.

    2012-01-01

    Standing dead trees, or snags, are an important habitat element for many animal species. In many ecosystems, fire is a primary driver of snag population dynamics because it can both create and consume snags. The objective of this study was to examine how variation in two key components of the fire regime—fire-return interval and season of burn—affected population dynamics of snags. Using a factorial design, we exposed 1 ha plots, located within larger burn units in a south Florida slash pine (Pinus elliottii var. densa Little and Dorman) forest, to prescribed fire applied at two intervals (approximately 3-year intervals vs. approximately 6-year intervals) and during two seasons (wet season vs. dry season) over a 12- to 13-year period. We found no consistent effect of fire season or frequency on the density of lightly to moderately decayed or heavily decayed snags, suggesting that variation in these elements of the fire regime at the scale we considered is relatively unimportant in the dynamics of snag populations. However, our confidence in these findings is limited by small sample sizes, potentially confounding effects of unmeasured variation in fire behavior and effects (e.g., intensity, severity, synergy with drought cycles) and wide variation in responses within a treatment level. The generalizing of our findings is also limited by the narrow range of treatment levels considered. Future experiments incorporating a wider range of fire regimes and directly quantifying fire intensity would prove useful in identifying more clearly the role of fire in shaping the dynamics of snag populations.

  5. Effect of Hurricane Hugo on molluscan skeletal distributions,Salt River Bay, St. Croix, U.S. Virgin Islands

    NASA Astrophysics Data System (ADS)

    Miller, Arnold I.; Llewellyn, Ghislaine; Parsons, Karla M.; Cummins, Hays; Boardman, Mark R.; Greenstein, Benjamin J.; Jacobs, David K.

    1992-01-01

    Just prior to the passage of Hurricane Hugo over St. Croix, U.S. Virgin Islands, 35 molluscan skeletal samples were collected at 30 m intervals along a sampling transect in Salt River Bay, on the north-central coast. Three months after the hurricane, the transect was resampled to permit direct assessment of storm effects on skeletal distributions. Results indicate that spatial zonation of molluscan accumulations, associated with environmental transitions along the transect, was maintained in the wake of the hurricane. However, limited transport was diagnosed by comparing the compositions of prestorm and poststorm samples from the deepest, mud-rich subenvironment on the transect. In aggregate, the species richness of samples from the southern half of this zone increased from 16 to 40, and the abundance of species that were not among the characteristic molluscs of this subenvironment increased from 11% to 26%. These storm effects could probably not have been recognized, and attributed directly to Hugo, had there been no prestorm samples with which to compare directly the poststorm samples.

  6. Levonorgestrel release rates over 5 years with the Liletta® 52-mg intrauterine system.

    PubMed

    Creinin, Mitchell D; Jansen, Rolf; Starr, Robert M; Gobburu, Joga; Gopalakrishnan, Mathangi; Olariu, Andrea

    2016-10-01

    To understand the potential duration of action for Liletta®, we conducted this study to estimate levonorgestrel (LNG) release rates over approximately 5½ years of product use. Clinical sites in the U.S. Phase 3 study of Liletta collected the LNG intrauterine systems (IUSs) from women who discontinued the study. We randomly selected samples within 90-day intervals after discontinuation of IUS use through 900 days (approximately 2.5 years) and 180-day intervals for the remaining duration through 5.4 years (1980 days) to evaluate residual LNG content. We also performed an initial LNG content analysis using 10 randomly selected samples from a single lot. We calculated the average ex vivo release rate using the residual LNG content over the duration of the analysis. We analyzed 64 samples within 90-day intervals (range 6-10 samples per interval) through 900 days and 36 samples within 180-day intervals (6 samples per interval) for the remaining duration. The initial content analysis averaged 52.0 ± 1.8 mg. We calculated an average initial release rate of 19.5 mcg/day that decreased to 17.0, 14.8, 12.9, 11.3 and 9.8 mcg/day after 1, 2, 3, 4 and 5 years, respectively. The 5-year average release rate is 14.7 mcg/day. The estimated initial LNG release rate and gradual decay of the estimated release rate are consistent with the target design and function of the product. The calculated LNG content and release rate curves support the continued evaluation of Liletta as a contraceptive for 5 or more years of use. Liletta LNG content and release rates are comparable to published data for another LNG 52-mg IUS. The release rate at 5 years is more than double the published release rate at 3 years with an LNG 13.5-mg IUS, suggesting continued efficacy of Liletta beyond 5 years. Copyright © 2016 Elsevier Inc. All rights reserved.
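    The average ex vivo release rate follows from simple mass balance: released content divided by days of use. With the measured initial content of 52 mg and a hypothetical residual content (the per-interval residual values are not given above), the calculation looks like:

```python
def average_release_rate(initial_mg, residual_mg, days_in_use):
    """Average ex vivo LNG release rate in mcg/day over the period of use."""
    return (initial_mg - residual_mg) * 1000.0 / days_in_use

# Hypothetical residual content of 26 mg after 5 years (1825 days) of use,
# against the measured initial content of 52 mg:
rate = average_release_rate(52.0, 26.0, 1825)
print(round(rate, 1))  # 14.2 mcg/day, near the reported 5-year average of 14.7
```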

  7. Surveying drainage culvert use by carnivores: sampling design and cost-benefit analyzes of track-pads vs. video-surveillance methods.

    PubMed

    Mateus, Ana Rita A; Grilo, Clara; Santos-Reis, Margarida

    2011-10-01

    Environmental assessment studies often evaluate the effectiveness of drainage culverts as habitat linkages for species; however, the efficiency of the sampling designs and the survey methods is not known. Our main goal was to estimate the most cost-effective method for monitoring carnivore culvert use with track-pads and video-surveillance. We estimated the most efficient (lower costs and high detection success) interval between visits (days) when using track-pads and also determined the advantages of using each method. In 2006, we selected two highways in southern Portugal and sampled 15 culverts over two 10-day sampling periods (spring and summer). Using the track-pad method, 90% of the animal tracks were detected using a 2-day interval between visits. We recorded a higher number of crossings for most species using video-surveillance (n = 129) when compared with the track-pad technique (n = 102); however, the detection ability using the video-surveillance method varied with type of structure and species. More crossings were detected in circular culverts (1 m and 1.5 m diameter) than in box culverts (2 m to 4 m width), likely because video cameras had a reduced vision coverage area. On the other hand, carnivore species with small feet such as the common genet Genetta genetta were detected less often using the track-pad surveying method. The cost-benefit analysis shows that the track-pad technique is the most appropriate, but video-surveillance allows year-round surveys as well as analysis of the behavioral responses of species using crossing structures.

  8. The Applicability of Confidence Intervals of Quantiles for the Generalized Logistic Distribution

    NASA Astrophysics Data System (ADS)

    Shin, H.; Heo, J.; Kim, T.; Jung, Y.

    2007-12-01

    The generalized logistic (GL) distribution has been widely used for frequency analysis. However, few studies have addressed the confidence intervals that indicate the prediction accuracy of quantile estimates for the GL distribution. In this paper, the estimation of the confidence intervals of quantiles for the GL distribution is presented based on the method of moments (MOM), maximum likelihood (ML), and probability weighted moments (PWM), and the asymptotic variances of each quantile estimator are derived as functions of the sample sizes, return periods, and parameters. Monte Carlo simulation experiments are also performed to verify the applicability of the derived confidence intervals of quantiles. The results show that the relative bias (RBIAS) and relative root mean square error (RRMSE) of the confidence intervals generally increase as the return period increases and decrease as the sample size increases. PWM performs better than the other methods in terms of RRMSE when the data are almost symmetric, while ML shows the smallest RBIAS and RRMSE when the data are more skewed and the sample size is moderately large. The GL model was applied to fit the distribution of annual maximum rainfall data. The results show that there are few differences in the estimated quantiles between ML and PWM, while MOM shows distinct differences.
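    The Monte Carlo part of this design can be sketched compactly. The code below uses the standard GLO quantile function and, as a simplified stand-in for the paper's MOM/ML/PWM estimators, a plug-in sample quantile; the parameter values are arbitrary:

```python
import numpy as np

def gl_quantile(F, xi=0.0, alpha=1.0, kappa=-0.1):
    """Quantile function of the generalized logistic distribution (kappa != 0):
    x(F) = xi + (alpha/kappa) * (1 - ((1-F)/F)**kappa)."""
    return xi + (alpha / kappa) * (1.0 - ((1.0 - F) / F) ** kappa)

def mc_quantile_error(n, T, reps=2000, seed=7):
    """Monte Carlo RBIAS and RRMSE of a plug-in sample quantile estimator
    of the T-year event for samples of size n."""
    rng = np.random.default_rng(seed)
    F = 1.0 - 1.0 / T                   # non-exceedance probability
    true_q = gl_quantile(F)
    u = rng.random((reps, n))           # inverse-CDF sampling of GL variates
    est = np.quantile(gl_quantile(u), F, axis=1)
    err = est - true_q
    rbias = float(err.mean() / true_q)
    rrmse = float(np.sqrt((err ** 2).mean()) / true_q)
    return rbias, rrmse

for n in (20, 50, 100):
    print(n, mc_quantile_error(n, T=100))
```

This reproduces the qualitative pattern reported above: the relative error of the 100-year quantile shrinks as the sample size grows, and small samples systematically underestimate the tail.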

  9. Shelf-life extension of convenience meat products sold in Indian supermarkets by radiation processing

    NASA Astrophysics Data System (ADS)

    Kanatt, Sweetie R.; Shobita Rao, M.; Chawla, S. P.; Sharma, Arun

    2010-12-01

    A variety of ready-to-cook meat products available in Indian supermarkets (mutton mince, chicken mince, chicken chunks, and chicken legs) were studied. The samples were irradiated (2.5 kGy), or left untreated as control, and stored at 0-3 °C for up to 21 days. The effect of irradiation on the microbiological, chemical, and sensory properties was evaluated at intervals during the storage period. Irradiated samples had a longer shelf-life at 0-3 °C compared with the corresponding non-irradiated samples. Fecal coliforms were eliminated by irradiation treatment. Radiation processed samples had lower counts of Staphylococcus spp. There were no significant organoleptic changes in irradiated samples stored at chilled temperatures.

  10. Dispersion durations of P-wave and QT interval in children treated with a ketogenic diet.

    PubMed

    Doksöz, Önder; Güzel, Orkide; Yılmaz, Ünsal; Işgüder, Rana; Çeleğen, Kübra; Meşe, Timur

    2014-04-01

    Limited data are available on the effects of a ketogenic diet on dispersion duration of P-wave and QT-interval measures in children. We searched for the changes in these measures with serial electrocardiograms in patients treated with a ketogenic diet. Twenty-five drug-resistant patients with epilepsy treated with a ketogenic diet were enrolled in this study. Electrocardiography was performed in all patients before the beginning and at the sixth month after implementation of the ketogenic diet. Heart rate, maximum and minimum P-wave duration, P-wave dispersion, and maximum and minimum corrected QT interval and QT dispersion were manually measured from the 12-lead surface electrocardiogram. Minimum and maximum corrected QT and QT dispersion measurements showed nonsignificant increase at month 6 compared with baseline values. Other previously mentioned electrocardiogram parameters also showed no significant changes. A ketogenic diet of 6 months' duration has no significant effect on electrocardiogram parameters in children. Further studies with larger samples and longer duration of follow-up are needed to clarify the effects of ketogenic diet on P-wave dispersion and corrected QT and QT dispersion. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. [Mammographic screening. An analysis of the characteristics of interval carcinomas observed in the program in the province of Firenze (1989-1991)].

    PubMed

    Ciatto, S; Rosselli del Turco, M; Bonardi, R; Bianchi, S

    1994-04-01

    The authors evaluated 30 interval cancers consecutively observed from 1989 to 1991 and compared them to 98 screening-detected cancers observed in the same period. Interval cancers have a more advanced stage (stage I = 13 lesions, stage II + = 17 lesions) with respect to screening-detected cancers (stage 0 = 10 lesions, stage I = 61 lesions, stage II + = 27 lesions). This finding seems unrelated to an intrinsically higher aggressiveness of interval cancers (length biased sampling), which do not differ significantly from screening-detected cancers as far as histopathologic characteristics of prognostic value are concerned. Diagnostic delay due to technical or reading error (9 cases), or to radiologically occult cancer in clear (10 cases) or dense parenchymal areas (11 cases), is most likely. This seems to be confirmed by the low frequency observed among interval cancers of easily visible lesions such as isolated microcalcifications (3% vs. 35%) or stellate opacities (13% vs. 31%), and by the higher frequency of opacities with irregular margins (57% vs. 26%) which are more likely masked by dense parenchyma. The chances of reducing interval cancer rate by attempting to increase sensitivity or by increasing screening frequency are discussed, as well as the possible negative consequences of such protocols in terms of cost-effectiveness.

  12. Statistical power analysis in wildlife research

    USGS Publications Warehouse

    Steidl, R.J.; Hayes, J.P.

    1997-01-01

    Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be ≤0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true.
We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the α-level, effect sizes, and sample sizes used in calculations must also be reported.
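The prospective power analysis recommended in point (1) is straightforward to compute. The sketch below uses a two-sided two-sample z-approximation for a standardized effect size (Cohen's d); this is a generic textbook formula, not a method from the paper itself:

```python
from statistics import NormalDist

_N = NormalDist()

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a
    standardized effect size d with n subjects per group."""
    z = _N.inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5   # noncentrality of the test statistic
    return (1 - _N.cdf(z - ncp)) + _N.cdf(-z - ncp)

def n_for_power(d, target=0.8, alpha=0.05):
    """Smallest per-group sample size reaching the target power --
    the prospective (a priori) calculation advocated above."""
    n = 2
    while power_two_sample(d, n, alpha) < target:
        n += 1
    return n

print(n_for_power(0.5))   # medium effect -> 63 per group (z approximation)
```

A t-based calculation would add a subject or so per group; the z version keeps the arithmetic transparent.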

  13. An evaluation of inferential procedures for adaptive clinical trial designs with pre-specified rules for modifying the sample size.

    PubMed

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2014-09-01

    Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to yield shorter confidence intervals on average and to produce higher probabilities of P-values below important thresholds than alternative approaches. The bias adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.

  14. Interception loss, throughfall and stemflow in a maritime pine stand. I. Variability of throughfall and stemflow beneath the pine canopy

    NASA Astrophysics Data System (ADS)

    Loustau, D.; Berbigier, P.; Granier, A.; Moussa, F. El Hadj

    1992-10-01

    Patterns of spatial variability of throughfall and stemflow were determined in a maritime pine ( Pinus pinaster Ait.) stand for two consecutive years. Data were obtained from 52 fixed rain gauges and 12 stemflow measuring devices located in a 50m × 50m plot at the centre of an 18-year-old stand. The pine trees had been sown in rows 4m apart and had reached an average height of 12.6m. The spatial distribution of stems had a negligible effect on the throughfall partitioning beneath the canopy. Variograms of throughfall computed for a sample of storms did not reveal any spatial autocorrelation of throughfall for the sampling design used. Differences in throughfall, in relation to the distance from the rows, were not consistently significant. In addition, the distance from the tree stem did not influence the amount of throughfall. The confidence interval on the amount of throughfall per storm was between 3 and 8%. The stemflow was highly variable between trees. The effect of individual trees on stemflow was significant but the amount of stemflow per tree was not related to tree size (i.e. height, trunk diameter, etc.). The cumulative sampling errors on stemflow and throughfall for a single storm created a confidence interval of between ±7 and ±51% on interception. This resulted mainly from the low interception rate and sampling error on throughfall.

  15. Effect of temporal sampling and timing for soil moisture measurements at field scale

    NASA Astrophysics Data System (ADS)

    Snapir, B.; Hobbs, S.

    2012-04-01

    Estimating soil moisture at field scale is valuable for various applications such as irrigation scheduling in cultivated watersheds, flood and drought prediction, waterborne disease spread assessment, or even determination of mobility with lightweight vehicles. Synthetic aperture radar on satellites in low Earth orbit can provide fine resolution images with a repeat time of a few days. For an Earth observing satellite, the choice of the orbit is driven in particular by the frequency of measurements required to meet a certain accuracy in retrieving the parameters of interest. For a given target, having only one image every week may not capture the full dynamic range of soil moisture, which can change significantly within a day when rainfall occurs. Hence this study focuses on the effect of the temporal sampling and timing of measurements on the error in the retrieved signal. All the analyses are based on in situ measurements of soil moisture (acquired every 30 min) from the OzNet Hydrological Monitoring Network in Australia for different fields over several years. The first study concerns sampling frequency. Measurements at different frequencies were simulated by sub-sampling the original data. Linear interpolation was used to estimate the missing intermediate values, and then this time series was compared to the original. The difference between these two signals was computed for different levels of sub-sampling. Results show that the error increases linearly when the interval is less than 1 day. For intervals longer than a day, a sinusoidal component appears on top of the linear growth due to the diurnal variation of surface soil moisture. Thus, for example, the error with measurements every 4.5 days can be slightly less than the error with measurements every 2 days. Next, for a given sampling interval, this study evaluated the effect of the time during the day at which measurements are made.
Of course when measurements are very frequent the time of acquisition does not matter, but when few measurements are available (sampling interval > 1 day), the time of acquisition can be important. It is shown that with daily measurements the error can double depending on the time of acquisition. This result is very sensitive to the phase of the sinusoidal variation of soil moisture. For example, in autumn for a given field with soil moisture ranging from 7.08% to 11.44% (mean and standard deviation being respectively 8.68% and 0.74%), daily measurements at 2 pm lead to a mean error of 0.47% v/v, while daily measurements at 9 am/pm produce a mean error of 0.24% v/v. The minimum of the sinusoid occurs every afternoon around 2 pm; after interpolation, measurements acquired at this time underestimate soil moisture, whereas measurements around 9 am/pm correspond to nodes of the sinusoid and hence represent the average soil moisture. These results concerning the frequency and the timing of measurements can potentially drive the schedule of satellite image acquisition over some fields.
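Both findings, error growth with the sampling interval and sensitivity to the time of day, can be reproduced with a toy series: a diurnal sinusoid plus a slow random trend, sub-sampled and linearly interpolated as in the study. All numbers below are synthetic, not OzNet data:

```python
import numpy as np

def subsample_error(series, step, offset=0):
    """RMSE between a series and its linear interpolation rebuilt from
    samples taken every `step` points, starting at index `offset`."""
    idx = np.arange(offset, len(series), step)
    rebuilt = np.interp(np.arange(len(series)), idx, series[idx])
    return float(np.sqrt(np.mean((rebuilt - series) ** 2)))

rng = np.random.default_rng(3)
n = 60 * 48                                  # 60 days of 30-min readings
t = np.arange(n) / 48.0                      # time in days
diurnal = 0.3 * np.sin(2 * np.pi * t)        # diurnal surface-moisture cycle
trend = np.cumsum(rng.normal(0.0, 0.02, n))  # slow wetting/drying trend
moisture = 9.0 + diurnal + trend             # % volumetric water content

# Error grows with the sampling interval...
print([round(subsample_error(moisture, s), 3) for s in (4, 16, 48, 96)])
# ...and, for daily sampling, depends on the time of day: sampling at the
# diurnal extreme (offset 6 h = 12 points) is worse than at a node (offset 0).
print(round(subsample_error(moisture, 48, 0), 3),
      round(subsample_error(moisture, 48, 12), 3))
```

Daily samples taken at an extreme of the sinusoid reconstruct a biased flat line, while samples at a node catch the daily mean, which is the mechanism behind the 2 pm vs 9 am/pm contrast reported above.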

  16. Meta-analysis of multiple outcomes: a multilevel approach.

    PubMed

    Van den Noortgate, Wim; López-López, José Antonio; Marín-Martínez, Fulgencio; Sánchez-Meca, Julio

    2015-12-01

    In meta-analysis, dependent effect sizes are very common. An example is where in one or more studies the effect of an intervention is evaluated on multiple outcome variables for the same sample of participants. In this paper, we evaluate a three-level meta-analytic model to account for this kind of dependence, extending the simulation results of Van den Noortgate, López-López, Marín-Martínez, and Sánchez-Meca Behavior Research Methods, 45, 576-594 (2013) by allowing for a variation in the number of effect sizes per study, in the between-study variance, in the correlations between pairs of outcomes, and in the sample size of the studies. At the same time, we explore the performance of the approach if the outcomes used in a study can be regarded as a random sample from a population of outcomes. We conclude that although this approach is relatively simple and does not require prior estimates of the sampling covariances between effect sizes, it gives appropriate mean effect size estimates, standard error estimates, and confidence interval coverage proportions in a variety of realistic situations.

  17. Reversing the Course of Forgetting

    ERIC Educational Resources Information Center

    White, K. Geoffrey; Brown, Glenn S.

    2011-01-01

    Forgetting functions were generated for pigeons in a delayed matching-to-sample task, in which accuracy decreased with increasing retention-interval duration. In baseline training with dark retention intervals, accuracy was high overall. Illumination of the experimental chamber by a houselight during the retention interval impaired performance…

  18. Optimization of Sample Preparation processes of Bone Material for Raman Spectroscopy.

    PubMed

    Chikhani, Madelen; Wuhrer, Richard; Green, Hayley

    2018-03-30

    Raman spectroscopy has recently been investigated for use in the calculation of postmortem interval from skeletal material. The fluorescence generated by samples, which affects the interpretation of Raman data, is a major limitation. This study compares the effectiveness of two sample preparation techniques, chemical bleaching and scraping, in the reduction of fluorescence from bone samples during testing with Raman spectroscopy. Visual assessment of Raman spectra obtained at 1064 nm excitation following the preparation protocols indicates an overall reduction in fluorescence. Results demonstrate that scraping is more effective at resolving fluorescence than chemical bleaching. The scraping of skeletonized remains prior to Raman analysis is a less destructive method and allows for the preservation of a bone sample in a state closest to its original form, which is beneficial in forensic investigations. It is recommended that bone scraping supersedes chemical bleaching as the preferred method for sample preparation prior to Raman spectroscopy. © 2018 American Academy of Forensic Sciences.

  19. Estimating daily fat yield from a single milking on test day for herds with a robotic milking system.

    PubMed

    Peeters, R; Galesloot, P J B

    2002-03-01

    The objective of this study was to estimate the daily fat yield and fat percentage from one sampled milking per cow per test day in an automatic milking system herd, when the milking times and milk yields of all individual milkings are recorded by the automatic milking system. Multiple regression models were used to estimate the 24-h fat percentage when only one milking is sampled for components and milk yields and milking times are known for all milkings in the 24-h period before the sampled milking. In total, 10,697 cow test day records, from 595 herd tests at 91 Dutch herds milked with an automatic milking system, were used. The best model to predict 24-h fat percentage included fat percentage, protein percentage, milk yield and milking interval of the sampled milking, milk yield, and milking interval of the preceding milking, and the interaction between milking interval and the ratio of fat and protein percentage of the sampled milking. This model gave a standard deviation of the prediction error (SE) for 24-h fat percentage of 0.321 and a correlation between the predicted and actual 24-h fat percentage of 0.910. For the 24-h fat yield, we found SE = 90 g and correlation = 0.967. This precision is slightly better than that of present a.m.-p.m. testing schemes. Extra attention must be paid to correctly matching the sample jars and the milkings. Furthermore, milkings with an interval of less than 4 h must be excluded from sampling as well as milkings that are interrupted or that follow an interrupted milking. Under these restrictions (correct matching, interval of at least 4 h, and no interrupted milking), one sampled milking suffices to get a satisfactory estimate for the test-day fat yield.

  20. Application of continuous-wave terahertz computed tomography for the analysis of chicken bone structure

    NASA Astrophysics Data System (ADS)

    Li, Bin; Wang, Dayong; Rong, Lu; Zhai, Changchao; Wang, Yunxin; Zhao, Jie

    2018-02-01

    Terahertz (THz) radiation is able to penetrate many different types of nonpolar and nonmetallic materials without the damaging effects of x-rays. THz technology can be combined with computed tomography (CT) to form THz CT, an effective imaging method used to visualize the internal structure of a three-dimensional sample as cross-sectional images. Here, we report an application of THz radiation as the source in CT imaging, replacing x-rays. In this method, the sample cross section is scanned in all translation and rotation directions, and the projection data are then reconstructed using a tomographic reconstruction algorithm. Two-dimensional (2-D) cross-sectional images of the chicken ulna were obtained with a continuous-wave (CW) THz CT system. Owing to differences in the THz absorption of different substances, the compact bone and spongy bone inside the chicken ulna are structurally distinguishable in the 2-D cross-sectional images. Using the filtered back projection algorithm, we reconstructed the projection data of the chicken ulna at different projection angle intervals and found that the artifacts and noise in the images increase strikingly when the projection angle intervals become larger, reflected in the blurred boundary of the compact bone. The quality and fidelity of the 2-D cross-sectional images could be substantially improved by reducing the projection angle intervals. Our experimental data demonstrate a feasible application of the CW THz CT system in biological imaging.

  1. Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.

    PubMed

    Lee, Sunbok; Lei, Man-Kit; Brody, Gene H

    2015-06-01

    Distinguishing between ordinal and disordinal interactions in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of the crossover point of 2 simple regression lines, confidence intervals for the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals for the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interactions requires sample sizes of more than 500 to provide confidence intervals sufficiently narrow to identify the location of the crossover point. (c) 2015 APA, all rights reserved.
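
    The crossover point in question is easy to make concrete. For a moderated regression y = b0 + b1·x + b2·z + b3·x·z, the simple regression lines of y on x at any two values of z intersect at x = -b2/b3, so a confidence interval for that ratio is what distinguishes the two kinds of interaction. A minimal sketch of the percentile bootstrap method (one of the six compared), using simulated data rather than anything from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def crossover(x, z, y):
    """Fit y = b0 + b1*x + b2*z + b3*x*z by OLS; simple lines cross at x = -b2/b3."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return -b[2] / b[3]

# Simulated data with a true crossover at x = -0.5/1.0 = -0.5
n = 500
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 1.0 + 0.3 * x + 0.5 * z + 1.0 * x * z + rng.normal(size=n)

# Percentile bootstrap: resample cases, refit, take the 2.5th/97.5th percentiles
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boots.append(crossover(x[idx], z[idx], y[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"crossover estimate: {crossover(x, z, y):.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```

    With these simulated coefficients the true crossover is at x = -0.5, so a disordinal interaction is indicated whenever the interval falls inside the observed range of x.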

  2. Assessing accuracy of point fire intervals across landscapes with simulation modelling

    Treesearch

    Russell A. Parsons; Emily K. Heyerdahl; Robert E. Keane; Brigitte Dorner; Joseph Fall

    2007-01-01

    We assessed accuracy in point fire intervals using a simulation model that sampled four spatially explicit simulated fire histories. These histories varied in fire frequency and size and were simulated on a flat landscape with two forest types (dry versus mesic). We used three sampling designs (random, systematic grids, and stratified). We assessed the sensitivity of...

  3. Sample Size Calculation for Estimating or Testing a Nonzero Squared Multiple Correlation Coefficient

    ERIC Educational Resources Information Center

    Krishnamoorthy, K.; Xia, Yanping

    2008-01-01

    The problems of hypothesis testing and interval estimation of the squared multiple correlation coefficient of a multivariate normal distribution are considered. It is shown that available one-sided tests are uniformly most powerful, and the one-sided confidence intervals are uniformly most accurate. An exact method of calculating sample size to…

  4. [Main Components of Xinjiang Lavender Essential Oil Determined by Partial Least Squares and Near Infrared Spectroscopy].

    PubMed

    Liao, Xiang; Wang, Qing; Fu, Ji-hong; Tang, Jun

    2015-09-01

    This work was undertaken to establish a quantitative analysis model that can rapidly determine the content of linalool and linalyl acetate in Xinjiang lavender essential oil. A total of 165 lavender essential oil samples were measured by near-infrared (NIR) absorption spectroscopy. Analysis of the NIR absorption peaks of all samples showed that lavender essential oil carries abundant chemical information, and that the interference of random noise is relatively low, in the spectral interval of 7100~4500 cm(-1); the PLS models were therefore constructed on this interval for further analysis. Eight abnormal samples were eliminated. Using a clustering method, the remaining 157 lavender essential oil samples were divided into a calibration set of 105 samples and a validation set of 52 samples. Gas chromatography-mass spectrometry (GC-MS) was used as the reference method to determine the content of linalool and linalyl acetate in the lavender essential oil, and the data matrix was established from the GC-MS values of the two compounds combined with the original NIR data. To optimize the model, different pretreatment methods were applied to the raw NIR spectra and their filtering effects compared; after analyzing the quantitative model results for linalool and linalyl acetate, orthogonal signal correction (OSC) gave root mean square errors of prediction (RMSEP) of 0.226 and 0.558, respectively, and was thus the optimal pretreatment method. In addition, forward interval partial least squares (FiPLS) was used to exclude wavelength points that are unrelated to the determined components or show nonlinear correlation; finally, 8 spectral intervals comprising 160 wavelength points were retained as the dataset.
Combining the data sets optimized by OSC-FiPLS with partial least squares (PLS), a rapid quantitative analysis model was established for determining the content of linalool and linalyl acetate in Xinjiang lavender essential oil; the number of latent variables was 8 for both components. The performance of the model was evaluated by the root mean square error of cross-validation (RMSECV) and the root mean square error of prediction (RMSEP). In the model, the RMSECV values for linalool and linalyl acetate were 0.170 and 0.416, respectively, and the RMSEP values were 0.188 and 0.364. The results indicate that, with the raw data pretreated by OSC and FiPLS, the NIR-PLS quantitative model is robust and precise, can quickly determine the content of linalool and linalyl acetate in lavender essential oil, and has favorable prediction ability. The study thus provides a new and effective method for rapid quantitative analysis of the major components of Xinjiang lavender essential oil.

  5. Efficacy of herbal toothpastes on salivary pH and salivary glucose - A preliminary study.

    PubMed

    Khairnar, Mahesh R; Dodamani, Arun S; Karibasappa, G N; Naik, Rahul G; Deshmukh, Manjiri A

    Due to the dearth of literature on the effect of herbal toothpaste on saliva and salivary constituents, the present study was undertaken to evaluate and compare the effect of three different herbal toothpastes on salivary pH and salivary glucose. Forty-five subjects aged 19-21 years were randomly divided into 3 groups (15 in each group) and randomly assigned one of three herbal toothpastes (Dant Kanti, Himalaya Complete Care, and Vicco Vajradanti). Unstimulated saliva samples were collected before and after brushing, and salivary glucose and pH levels were assessed at intervals of one week for a period of 4 weeks starting from day 1. All three toothpastes were effective in reducing salivary glucose levels, both overall (p < 0.05) and from pre-brushing to post-brushing at each interval (p < 0.05), and in increasing salivary pH levels, both overall and from pre-brushing to post-brushing at each interval (p < 0.05). Herbal toothpastes were effective in reducing salivary levels of glucose and improving the pH of saliva. Copyright © 2016 Transdisciplinary University, Bangalore and World Ayurveda Foundation. Published by Elsevier B.V. All rights reserved.

  6. Statistical inference for the within-device precision of quantitative measurements in assay validation.

    PubMed

    Liu, Jen-Pei; Lu, Li-Tien; Liao, C T

    2009-09-01

    Intermediate precision is one of the most important characteristics for evaluation of precision in assay validation. The current methods for evaluation of within-device precision recommended by the Clinical and Laboratory Standards Institute (CLSI) guideline EP5-A2 are based on the point estimator. On the other hand, in addition to point estimators, confidence intervals can provide a range for the within-device precision with a probability statement. Therefore, we suggest a confidence interval approach for assessment of the within-device precision. Furthermore, under the two-stage nested random-effects model recommended by the approved CLSI guideline EP5-A2, in addition to the current Satterthwaite's approximation and the modified large sample (MLS) methods, we apply the technique of generalized pivotal quantities (GPQ) to derive the confidence interval for the within-device precision. The data from the approved CLSI guideline EP5-A2 illustrate the applications of the confidence interval approach and comparison of results between the three methods. Results of a simulation study on the coverage probability and expected length of the three methods are reported. The proposed method of the GPQ-based confidence intervals is also extended to consider the between-laboratories variation for precision assessment.

  7. Pediatric-specific reference intervals in a nationally representative sample of Iranian children and adolescents: the CASPIAN-III study.

    PubMed

    Kelishadi, Roya; Marateb, Hamid Reza; Mansourian, Marjan; Ardalan, Gelayol; Heshmat, Ramin; Adeli, Khosrow

    2016-08-01

    This study aimed to determine for the first time the age- and gender-specific reference intervals for biomarkers of bone, metabolism, nutrition, and obesity in a nationally representative sample of Iranian children and adolescents. We assessed the data of blood samples obtained from healthy Iranian children and adolescents, aged 7 to 19 years. The reference intervals of glucose, lipid profile, liver enzymes, zinc, copper, chromium, magnesium, and 25-hydroxy vitamin D [25(OH)D] were determined according to the Clinical & Laboratory Standards Institute C28-A3 guidelines. The reference intervals were partitioned using the Harris-Boyd method according to age and gender. The study population consisted of 4800 school students (50% boys, mean age of 13.8 years). Twelve chemistry analytes were partitioned by age and gender, displaying the range of results between the 2.5th and 97.5th percentiles. Significant differences existed between boys and girls only at 18 to 19 years of age for low-density lipoprotein cholesterol. 25(OH)D was the only analyte whose reference interval was similar across all age groups and both sexes. This study presented the first national database of reference intervals for a number of biochemical markers in Iranian children and adolescents. It is the first report of its kind from the Middle East and North Africa. The findings underscore the importance of providing reference intervals for different ethnicities and in various regions.

  8. Reference interval estimation: Methodological comparison using extensive simulations and empirical data.

    PubMed

    Daly, Caitlin H; Higgins, Victoria; Adeli, Khosrow; Grey, Vijay L; Hamid, Jemila S

    2017-12-01

    To statistically compare and evaluate commonly used methods of estimating reference intervals and to determine which method is best based on characteristics of the distribution of various data sets. Three approaches for estimating reference intervals, i.e., parametric, non-parametric, and robust, were compared with simulated Gaussian and non-Gaussian data. The hierarchy of the performances of each method was examined based on bias and measures of precision. The findings of the simulation study were illustrated through real data sets. In all Gaussian scenarios, the parametric approach provided the least biased and most precise estimates. In non-Gaussian scenarios, no single method provided the least biased and most precise estimates for both limits of a reference interval across all sample sizes, although the non-parametric approach performed best in most scenarios. The hierarchy of the performances of the three methods was impacted only by sample size and skewness. Differences between reference interval estimates established by the three methods were inflated by variability. Whenever possible, laboratories should attempt to transform data to a Gaussian distribution and use the parametric approach to obtain optimal reference intervals. When this is not possible, laboratories should consider sample size and skewness as factors in their choice of reference interval estimation method. The consequences of false positives or false negatives may also serve as factors in this decision. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
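
    The parametric and non-parametric estimators compared here are simple to state: under a Gaussian assumption the central 95% reference interval is mean ± 1.96 SD, while the non-parametric estimate takes the 2.5th and 97.5th sample percentiles directly. A minimal sketch on simulated data (the robust method, which downweights extreme observations, is omitted; the analyte values are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
values = rng.normal(loc=5.0, scale=0.8, size=240)  # simulated healthy-reference analyte values

# Parametric approach: assumes Gaussian data; central 95% = mean +/- 1.96 SD
mean, sd = values.mean(), values.std(ddof=1)
param_ri = (mean - 1.96 * sd, mean + 1.96 * sd)

# Non-parametric approach: 2.5th and 97.5th sample percentiles, no distributional assumption
nonparam_ri = tuple(np.percentile(values, [2.5, 97.5]))

print("parametric RI:    ", param_ri)
print("non-parametric RI:", nonparam_ri)
```

    On genuinely Gaussian data like this the two intervals nearly coincide; the study's point is that they diverge, and their relative merits change, as skewness grows and sample size shrinks.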

  9. Using Screencast Videos to Enhance Undergraduate Students' Statistical Reasoning about Confidence Intervals

    ERIC Educational Resources Information Center

    Strazzeri, Kenneth Charles

    2013-01-01

    The purposes of this study were to investigate (a) undergraduate students' reasoning about the concepts of confidence intervals (b) undergraduate students' interactions with "well-designed" screencast videos on sampling distributions and confidence intervals, and (c) how screencast videos improve undergraduate students' reasoning ability…

  10. High sensitivity Troponin T: an audit of implementation of its protocol in a district general hospital.

    PubMed

    Kalim, Shahid; Nazir, Shaista; Khan, Zia Ullah

    2013-01-01

    Protocols based on newer high-sensitivity Troponin T (hsTropT) assays can rule in a suspected acute myocardial infarction (AMI) as early as 3 hours. We conducted this study to audit adherence to our Trust's newly introduced AMI diagnostic protocol based on paired hsTropT testing at 0 and 3 hours. We retrospectively reviewed data for all patients who had an hsTropT test done between 1st and 7th May 2012. Patients' demographics, use of single or paired samples, time interval between paired samples, presenting symptoms, and ECG findings were noted, and their means, medians, standard deviations, and proportions were calculated. A total of 66 patients had an hsTropT test done during this period. Mean age was 63.30 +/- 17.46 years, and 38 (57.57%) were males. Twenty-four (36.36%) patients had only a single hsTropT sample taken, rather than the protocol-recommended paired samples. Among the 42 (63.63%) patients with paired samples, the mean time interval was 4.41 +/- 5.7 hours. Contrary to the recommendations, 15 (22.73%) had a very long and 2 (3.03%) a very short time interval between the two samples. A subgroup analysis of patients with single samples found only 2 (3.03%) patients with ST-segment elevation, for whom single testing was appropriate. Our study confirmed that in a large number of patients the protocol for paired sampling, or the recommended time interval of 3 hours between the 2 samples, was not being followed.

  11. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results (Part I): Earths Radiation Budget

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    Satellites always sample the Earth-atmosphere system at a finite temporal resolution. This study investigates the effect of sampling frequency on the satellite-derived Earth radiation budget, with the Deep Space Climate Observatory (DSCOVR) as an example. The output from NASA's Goddard Earth Observing System Version 5 (GEOS-5) Nature Run is used as the truth. The Nature Run is a high spatial and temporal resolution atmospheric simulation spanning a two-year period. The effect of temporal resolution on potential DSCOVR observations is assessed by sampling the full Nature Run data with 1-h to 24-h frequencies. The uncertainty associated with a given sampling frequency is measured by computing means over daily, monthly, seasonal, and annual intervals and determining the spread across different possible starting points. The skill with which a particular sampling frequency captures the structure of the full time series is measured using correlations and normalized errors. Results show that higher sampling frequency gives more information and less uncertainty in the derived radiation budget. Sampling coarser than every 4 h results in significant error, and correlations between the true and sampled time series decrease rapidly as the sampling interval grows beyond 4 h.
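
    The uncertainty measure described, the spread of a long-term mean across the possible sampling start offsets, can be sketched with a toy series standing in for the GEOS-5 Nature Run (the diurnal signal and noise levels below are illustrative assumptions, not Nature Run values):

```python
import numpy as np

rng = np.random.default_rng(2)
hours = 24 * 60                      # 60 days of hourly "truth"
t = np.arange(hours)
# Synthetic flux with a diurnal cycle plus weather-like noise (illustrative only)
truth = 240 + 40 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 10, hours)

def mean_spread(interval):
    """Spread of the long-term mean across all possible sampling start offsets."""
    means = [truth[start::interval].mean() for start in range(interval)]
    return max(means) - min(means)

for interval in (1, 4, 8, 24):
    print(f"sampling every {interval:2d} h -> spread of estimated mean: {mean_spread(interval):6.2f}")
```

    Hourly sampling leaves no start-offset ambiguity at all, while 24-h sampling aliases the full diurnal cycle into the estimated mean, which is the qualitative behavior the study quantifies for DSCOVR.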

  12. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    PubMed

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
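
    The construction of a PL CI is model-agnostic: the 95% interval is the set of parameter values whose profile log-likelihood lies within χ²₁(0.95)/2 ≈ 1.92 of its maximum. A sketch for a one-parameter Poisson rate model (a deliberately simpler setting than the IRT models of the article; the counts are made up):

```python
import math

data = [3, 5, 2, 4, 6, 3, 4, 5, 2, 4]          # illustrative Poisson counts
n, total = len(data), sum(data)

def loglik(lam):
    # Poisson log-likelihood up to a constant (the log y! terms drop out)
    return total * math.log(lam) - n * lam

mle = total / n                                 # MLE of the rate is the sample mean
cut = loglik(mle) - 1.92                        # chi2_1(0.95) / 2 ~= 1.92

def solve(lo_, hi_):
    """Bisection for the lam in [lo_, hi_] where loglik crosses the cutoff."""
    for _ in range(60):
        mid = 0.5 * (lo_ + hi_)
        if (loglik(mid) - cut) * (loglik(lo_) - cut) <= 0:
            hi_ = mid
        else:
            lo_ = mid
    return 0.5 * (lo_ + hi_)

lower, upper = solve(0.5, mle), solve(mle, 20.0)
print(f"MLE = {mle:.2f}, 95% profile-likelihood CI = ({lower:.2f}, {upper:.2f})")
```

    Unlike a Wald interval, the result need not be symmetric about the MLE, which is exactly the property that helps for transformed or boundary-near parameters in IRT.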

  13. Empirical likelihood-based confidence intervals for mean medical cost with censored data.

    PubMed

    Jeyarajah, Jenny; Qin, Gengsheng

    2017-11-10

    In this paper, we propose empirical likelihood methods based on influence function and jackknife techniques for constructing confidence intervals for mean medical cost with censored data. We conduct a simulation study to compare the coverage probabilities and interval lengths of our proposed confidence intervals with that of the existing normal approximation-based confidence intervals and bootstrap confidence intervals. The proposed methods have better finite-sample performances than existing methods. Finally, we illustrate our proposed methods with a relevant example. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Dried blood spot testing for seven steroids using liquid chromatography-tandem mass spectrometry with reference interval determination in the Korean population.

    PubMed

    Kim, Borahm; Lee, Mi Na; Park, Hyung Doo; Kim, Jong Won; Chang, Yun Sil; Park, Won Soon; Lee, Soo Youn

    2015-11-01

    Conventional screening for congenital adrenal hyperplasia (CAH) using immunoassays generates a large number of false-positive results. A more specific liquid chromatography-tandem mass spectrometry (LC-MS/MS) method has been introduced to minimize unnecessary follow-ups. However, because of limited data on its use in the Korean population, LC-MS/MS has not yet been incorporated into newborn screening programs in this region. The present study aims to develop and validate an LC-MS/MS method for the simultaneous determination of seven steroids in dried blood spots (DBS) for CAH screening, and to define age-specific reference intervals in the Korean population. We developed and validated an LC-MS/MS method to determine the reference intervals of cortisol, 17-hydroxyprogesterone, 11-deoxycortisol, 21-deoxycortisol, androstenedione, corticosterone, and 11-deoxycorticosterone simultaneously in 453 DBS samples. The samples were from Korean subjects stratified by age group (78 full-term neonates, 76 premature neonates, 89 children, and 100 adults). The accuracy, precision, matrix effects, and extraction recovery were satisfactory for all the steroids at three concentrations; values of intra- and inter-day precision coefficients of variation, bias, and recovery were 0.7-7.7%, -1.5-9.8%, and 49.3-97.5%, respectively. The linearity range was 1-100 ng/mL for cortisol and 0.5-50 ng/mL for the other steroids (R²>0.99). The reference intervals were in agreement with previous reports. This LC-MS/MS method and the reference intervals validated in the Korean population can be successfully applied to analyze seven steroids in DBS for the diagnosis of CAH.

  15. Dating human skeletal remains: investigating the viability of measuring the equilibrium between 210Po and 210Pb as a means of estimating the post-mortem interval.

    PubMed

    Swift, B

    1998-11-30

    Estimating the post-mortem interval in skeletal remains is a notoriously difficult task; forensic pathologists often rely heavily upon experience in recognising morphological appearances. Previous techniques have involved measuring physical or chemical changes within the hydroxyapatite matrix, radiocarbon dating and 90Sr dating, though no individual test has been advocated. Within this paper it is proposed that measuring the equilibrium between two naturally occurring radio-isotopes, 210Po and 210Pb, and comparison with post-mortem examination samples would produce a new method of dating human skeletal remains. Possible limitations exist, notably the effect of diagenesis, time limitations and relative cost, though this technique could provide a relatively accurate means of determining the post-mortem interval. It is therefore proposed that a large study be undertaken to provide a calibration scale against which bones uncovered can be dated.

  16. Estimating clinical chemistry reference values based on an existing data set of unselected animals.

    PubMed

    Dimauro, Corrado; Bonelli, Piero; Nicolussi, Paola; Rassu, Salvatore P G; Cappio-Borlino, Aldo; Pulina, Giuseppe

    2008-11-01

    In an attempt to standardise the determination of biological reference values, the International Federation of Clinical Chemistry (IFCC) has published a series of recommendations on developing reference intervals. The IFCC recommends the use of an a priori sampling of at least 120 healthy individuals. However, such a high number of samples and laboratory analysis is expensive, time-consuming and not always feasible, especially in veterinary medicine. In this paper, an alternative (a posteriori) method is described and is used to determine reference intervals for biochemical parameters of farm animals using an existing laboratory data set. The method used was based on the detection and removal of outliers to obtain a large sample of animals likely to be healthy from the existing data set. This allowed the estimation of reliable reference intervals for biochemical parameters in Sarda dairy sheep. This method may also be useful for the determination of reference intervals for different species, ages and gender.
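
    The a posteriori idea, screening an existing unselected data set down to a "likely healthy" subsample before computing percentile-based reference intervals, can be sketched as follows. The Tukey-fence outlier rule and the simulated values are illustrative stand-ins; the paper's actual detection procedure may differ:

```python
import numpy as np

rng = np.random.default_rng(3)
# Existing lab records: mostly healthy animals plus a tail of unhealthy outliers (simulated)
healthy = rng.normal(30.0, 4.0, 900)
sick = rng.normal(55.0, 8.0, 100)
data = np.concatenate([healthy, sick])

# Iteratively drop values outside Tukey fences until none remain
subset = data.copy()
while True:
    q1, q3 = np.percentile(subset, [25, 75])
    iqr = q3 - q1
    keep = (subset >= q1 - 1.5 * iqr) & (subset <= q3 + 1.5 * iqr)
    if keep.all():
        break
    subset = subset[keep]

# Reference interval from the retained, presumably healthy, subsample
ri = np.percentile(subset, [2.5, 97.5])
print(f"retained {subset.size} of {data.size} records, RI = ({ri[0]:.1f}, {ri[1]:.1f})")
```

    The appeal is practical: no a priori recruitment of 120 healthy individuals is needed, because the healthy majority already present in routine laboratory data defines the interval once outliers are stripped.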

  17. Effects of Attenuation of Gas Hydrate-bearing Sediments on Seismic Data: Example from Mallik, Northwest Territories, Canada

    NASA Astrophysics Data System (ADS)

    Bellefleur, G.; Riedel, M.; Brent, T.

    2007-05-01

    Wave attenuation is an important physical property of hydrate-bearing sediments that is rarely taken into account in site characterization with seismic data. We present a field example showing improved images of hydrate-bearing sediments on seismic data after compensation of attenuation effects. Compressional quality factors (Q) are estimated from zero-offset Vertical Seismic Profiling data acquired at Mallik, Northwest Territories, Canada. During the last 10 years, two internationally-partnered research drilling programs have intersected three major intervals of sub-permafrost gas hydrates at Mallik, and have successfully extracted core samples containing significant amounts of gas hydrates. Individual gas hydrate intervals are up to 40m in thickness and are characterized by high in situ gas hydrate saturation, sometimes exceeding 80% of pore volume of unconsolidated clastic sediments having average porosities ranging from 25% to 40%. The Q-factors obtained from the VSP data demonstrate significant wave attenuation for permafrost and hydrate-bearing sediments. These results are in agreement with previous attenuation estimates from sonic logs and crosshole data at different frequency intervals. The Q-factors obtained from VSP data were used to compensate attenuation effects on surface 3D seismic data acquired over the Mallik gas hydrate research wells. Intervals of gas hydrate on surface seismic data are characterized by strong reflectivity, and effects from attenuation are not perceptible from a simple visual inspection of the data. However, the application of an inverse Q-filter increases the resolution of the data and improves correlation with log data, particularly for the shallowest gas hydrate interval. Compensation of the attenuation effects of the permafrost likely explains most of the improvements for the shallow gas hydrate zone. 
Our results show that characterization of the Mallik gas hydrates with seismic data not corrected for attenuation would tend to overestimate thicknesses and lateral extent of hydrate-bearing strata and hence, the volume of hydrates in place.

  18. Hematologic and serum biochemical reference intervals for free-ranging common bottlenose dolphins (Tursiops truncatus) and variation in the distributions of clinicopathologic values related to geographic sampling site.

    PubMed

    Schwacke, Lori H; Hall, Ailsa J; Townsend, Forrest I; Wells, Randall S; Hansen, Larry J; Hohn, Aleta A; Bossart, Gregory D; Fair, Patricia A; Rowles, Teresa K

    2009-08-01

    To develop robust reference intervals for hematologic and serum biochemical variables by use of data derived from free-ranging bottlenose dolphins (Tursiops truncatus) and examine potential variation in distributions of clinicopathologic values related to sampling sites' geographic locations. 255 free-ranging bottlenose dolphins. Data from samples collected during multiple bottlenose dolphin capture-release projects conducted at 4 southeastern US coastal locations in 2000 through 2006 were combined to determine reference intervals for 52 clinicopathologic variables. A nonparametric bootstrap approach was applied to estimate 95th percentiles and associated 90% confidence intervals; the need for partitioning by length and sex classes was determined by testing for differences in estimated thresholds with a bootstrap method. When appropriate, quantile regression was used to determine continuous functions for 95th percentiles dependent on length. The proportion of out-of-range samples for all clinicopathologic measurements was examined for each geographic site, and multivariate ANOVA was applied to further explore variation in leukocyte subgroups. A need for partitioning by length and sex classes was indicated for many clinicopathologic variables. For each geographic site, few significant deviations from expected number of out-of-range samples were detected. Although mean leukocyte counts did not vary among sites, differences in the mean counts for leukocyte subgroups were identified. Although differences in the centrality of distributions for some variables were detected, the 95th percentiles estimated from the pooled data were robust and applicable across geographic sites. The derived reference intervals provide critical information for conducting bottlenose dolphin population health studies.

  19. Reporting of research quality characteristics of studies published in 6 major clinical dental specialty journals.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Madianos, Phoebus; Makou, Margarita; Eliades, Theodore

    2011-06-01

    The objective of this article was to record reporting characteristics related to study quality of research published in major specialty dental journals with the highest impact factor (Journal of Endodontics, Journal of Oral and Maxillofacial Surgery, American Journal of Orthodontics and Dentofacial Orthopedics, Pediatric Dentistry, Journal of Clinical Periodontology, and International Journal of Prosthetic Dentistry). The included articles were classified into the following 3 broad subject categories: (1) cross-sectional (snap-shot), (2) observational, and (3) interventional. Multinomial logistic regression was conducted for effect estimation using the journal as the response and randomization, sample size calculation, confounding discussed, multivariate analysis, effect measurement, and confidence intervals as the explanatory variables. The results showed that cross-sectional studies were the dominant design (55%), whereas observational investigations accounted for 13%, and interventions/clinical trials for 32%. Reporting on quality characteristics was low for all variables: random allocation (15%), sample size calculation (7%), confounding issues/possible confounders (38%), effect measurements (16%), and multivariate analysis (21%). Eighty-four percent of the published articles reported a statistically significant main finding, and only 13% presented confidence intervals. The Journal of Clinical Periodontology showed the highest probability of including quality characteristics in reporting results among all dental journals. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
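A minimal sketch of the histogram/PDF construction this method relies on, assuming a pure cosine carrier sampled uniformly over one half cycle (the sample count and bin count are arbitrary choices, not parameters from the proposal):

```python
import numpy as np

# Uniform-in-time samples covering one half cycle of a unit cosine carrier.
n = 100_000
t = np.linspace(0.0, np.pi, n, endpoint=False)
samples = np.cos(t)

# Sort sample amplitudes into a histogram: an empirical PDF of the waveform.
counts, edges = np.histogram(samples, bins=20, range=(-1.0, 1.0))
pdf = counts / counts.sum()

# For a sinusoid this empirical PDF approximates the arcsine distribution,
# with mass concentrated near the amplitude extremes (U-shaped).
```

A modulator of the kind described would alter the waveform so that the shape of this sample PDF, rather than the instantaneous phase or amplitude alone, carries the digital information.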

  1. Min and Max Exponential Extreme Interval Values and Statistics

    ERIC Educational Resources Information Center

    Jance, Marsha; Thomopoulos, Nick

    2009-01-01

    The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter λ = 1 are examined for different observation (sample) sizes. An extreme interval value g_a is defined as a…
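For the exponential case with λ = 1, the expected min and max have simple closed forms that a quick Monte Carlo check reproduces; the sample size and replication count below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10            # observation (sample) size, illustrative
trials = 200_000  # Monte Carlo replications

x = rng.exponential(scale=1.0, size=(trials, n))   # Exp(λ = 1) variates
min_mean = x.min(axis=1).mean()
max_mean = x.max(axis=1).mean()

# Closed forms for Exp(1): the min of n independent Exp(1) variables is
# itself Exp(n), so E[min] = 1/n; E[max] is the n-th harmonic number H_n.
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(min_mean, 1.0 / n)     # both ≈ 0.1
print(max_mean, harmonic)    # both ≈ 2.929
```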

  2. Considerations for Time Sampling Interval Durations in the Measurement of Young Children's Classroom Engagement

    ERIC Educational Resources Information Center

    Zakszeski, Brittany N.; Hojnoski, Robin L.; Wood, Brenna K.

    2017-01-01

    Classroom engagement is important to young children's academic and social development. Accurate methods of capturing this behavior are needed to inform and evaluate intervention efforts. This study compared the accuracy of interval durations (i.e., 5 s, 10 s, 15 s, 20 s, 30 s, and 60 s) of momentary time sampling (MTS) in approximating the…
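Momentary time sampling (MTS) as compared in this study can be simulated directly. The engagement stream below is synthetic (the bout lengths are assumptions, not values from the study); MTS scores only the final second of each interval.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 15-min (900 s) observation at 1-s resolution: alternating
# engaged/disengaged bouts with assumed exponential mean lengths of
# 40 s and 20 s (illustrative values only).
session = np.zeros(900, dtype=bool)
t, engaged = 0, True
while t < 900:
    bout = max(1, int(rng.exponential(40.0 if engaged else 20.0)))
    session[t:t + bout] = engaged
    t += bout
    engaged = not engaged

true_pct = session.mean()   # true proportion of time engaged

# Momentary time sampling: score only the final second of each interval,
# for each of the interval durations compared in the study.
for interval in (5, 10, 15, 20, 30, 60):
    estimate = session[interval - 1::interval].mean()
    print(interval, round(abs(estimate - true_pct), 3))
```

Longer intervals yield fewer probes per session, so the estimate's sampling error grows even though MTS remains unbiased in expectation.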

  3. Asymptotic confidence intervals for the Pearson correlation via skewness and kurtosis.

    PubMed

    Bishara, Anthony J; Li, Jiexiang; Nash, Thomas

    2018-02-01

    When bivariate normality is violated, the default confidence interval of the Pearson correlation can be inaccurate. Two new methods were developed based on the asymptotic sampling distribution of Fisher's z' under the general case where bivariate normality need not be assumed. In Monte Carlo simulations, the most successful of these methods relied on the (Vale & Maurelli, 1983, Psychometrika, 48, 465) family to approximate a distribution via the marginal skewness and kurtosis of the sample data. In Simulation 1, this method provided more accurate confidence intervals of the correlation in non-normal data, at least as compared to no adjustment of the Fisher z' interval, or to adjustment via the sample joint moments. In Simulation 2, this approximate distribution method performed favourably relative to common non-parametric bootstrap methods, but its performance was mixed relative to an observed imposed bootstrap and two other robust methods (PM1 and HC4). No method was completely satisfactory. An advantage of the approximate distribution method, though, is that it can be implemented even without access to raw data if sample skewness and kurtosis are reported, making the method particularly useful for meta-analysis. Supporting information includes R code. © 2017 The British Psychological Society.
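The default interval the paper uses as its baseline is the normal-theory Fisher z' interval, sketched here; the paper's adjusted methods modify the variance of z' using skewness and kurtosis and are not reproduced.

```python
import math
from statistics import NormalDist

def pearson_ci_fisher(r, n, conf=0.95):
    # Fisher's z' = atanh(r) is approximately normal with SE = 1/sqrt(n - 3)
    # under bivariate normality; back-transform the bounds with tanh.
    z = math.atanh(r)
    se = 1.0 / math.sqrt(n - 3)
    zcrit = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    return math.tanh(z - zcrit * se), math.tanh(z + zcrit * se)

lo, hi = pearson_ci_fisher(0.5, 50)   # e.g. r = 0.5 observed in n = 50 pairs
```

When bivariate normality fails, the SE = 1/sqrt(n - 3) step is what breaks down, which is what the adjusted variance estimators target.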

  4. A pilot study examining the effects of low-volume high-intensity interval training and continuous low to moderate intensity training on quality of life, functional capacity and cardiovascular risk factors in cancer survivors.

    PubMed

    Toohey, Kellie; Pumpa, Kate L; Arnolda, Leonard; Cooke, Julie; Yip, Desmond; Craft, Paul S; Semple, Stuart

    2016-01-01

    The aim of this study was to evaluate the effects of low-volume high-intensity interval training and continuous low to moderate intensity training on quality of life, functional capacity and cardiovascular disease risk factors in cancer survivors. Cancer survivors within 24 months post-diagnosis were randomly assigned into the low-volume high-intensity interval training group (n = 8) or the continuous low to moderate intensity training group (n = 8) for 36 sessions (12 weeks) of supervised exercise. The low-volume high-intensity interval training (LVHIIT) group performed 7 × 30 s intervals (≥85% maximal heart rate) and the continuous low to moderate intensity training (CLMIT) group performed continuous aerobic training for 20 min (≤55% maximal heart rate) on a stationary bike or treadmill. Significant improvements (time) were observed for 13 of the 23 dependent variables (ES 0.05-0.61, p ≤ 0.05). An interaction effect was observed for the six minute walk test (18.53% [32.43-4.63] ES 0.50, p ≤ 0.01) with the LVHIIT group demonstrating greater improvements. These preliminary findings suggest that both interventions can induce improvements in quality of life, functional capacity and selected cardiovascular disease risk factors. The LVHIIT program was well tolerated by the participants and our results suggest that LVHIIT is the preferred modality to improve fitness (6MWT); it remains to be seen which intervention elicits the most clinically relevant outcomes for patients. A larger sample size with a control group is required to confirm the significance of these findings.

  5. A pilot study examining the effects of low-volume high-intensity interval training and continuous low to moderate intensity training on quality of life, functional capacity and cardiovascular risk factors in cancer survivors

    PubMed Central

    Pumpa, Kate L.; Arnolda, Leonard; Cooke, Julie; Yip, Desmond; Craft, Paul S.; Semple, Stuart

    2016-01-01

    Purpose The aim of this study was to evaluate the effects of low-volume high-intensity interval training and continuous low to moderate intensity training on quality of life, functional capacity and cardiovascular disease risk factors in cancer survivors. Methods Cancer survivors within 24 months post-diagnosis were randomly assigned into the low-volume high-intensity interval training group (n = 8) or the continuous low to moderate intensity training group (n = 8) for 36 sessions (12 weeks) of supervised exercise. The low-volume high-intensity interval training (LVHIIT) group performed 7 × 30 s intervals (≥85% maximal heart rate) and the continuous low to moderate intensity training (CLMIT) group performed continuous aerobic training for 20 min (≤55% maximal heart rate) on a stationary bike or treadmill. Results Significant improvements (time) were observed for 13 of the 23 dependent variables (ES 0.05–0.61, p ≤ 0.05). An interaction effect was observed for the six minute walk test (18.53% [32.43–4.63] ES 0.50, p ≤ 0.01) with the LVHIIT group demonstrating greater improvements. Conclusion These preliminary findings suggest that both interventions can induce improvements in quality of life, functional capacity and selected cardiovascular disease risk factors. The LVHIIT program was well tolerated by the participants and our results suggest that LVHIIT is the preferred modality to improve fitness (6MWT); it remains to be seen which intervention elicits the most clinically relevant outcomes for patients. A larger sample size with a control group is required to confirm the significance of these findings. PMID:27781180

  6. Prevalence of Posttraumatic Stress Disorder in Prisoners.

    PubMed

    Baranyi, Gergő; Cassidy, Megan; Fazel, Seena; Priebe, Stefan; Mundt, Adrian P

    2018-06-01

    People involved with criminal justice frequently are exposed to violence and traumatic experiences. This may lead to posttraumatic stress disorder (PTSD); however, no review, to our knowledge, has synthesized findings in this setting. We conducted a systematic review and meta-analysis to estimate prevalence rates of PTSD in prison populations. Original studies reporting prevalence rates of PTSD in unselected samples of incarcerated people were systematically searched for the period between 1980 and June 2017. Data were pooled using random-effects meta-analysis, and sources of heterogeneity for prespecified characteristics were assessed by meta-regression. We identified 56 samples comprising 21,099 imprisoned men and women from 20 countries. Point prevalence of PTSD ranged from 0.1% to 27% for male, and from 12% to 38% for female prisoner populations. The random-effects pooled point prevalence was 6.2% (95% confidence interval: 3.9, 9.0) in male prisoners and 21.1% (95% confidence interval: 16.9, 25.6) in female prisoners. The heterogeneity between the included studies was very high. Higher prevalence was reported in samples of female prisoners, in smaller studies (n < 100), and in investigations based in high-income countries. Existing evidence shows high levels of PTSD among imprisoned people, especially women. Psychosocial interventions to prevent violence, especially against children and women, and to mitigate its consequences in marginalized communities must be improved. Trauma-informed approaches for correctional programs and scalable PTSD treatments in prisons require further consideration.
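Random-effects pooling of prevalences is commonly done on the logit scale with a DerSimonian-Laird estimate of between-study variance; the sketch below illustrates that general idea with made-up study counts (the review's exact transformation and data are not reproduced).

```python
import numpy as np

def pool_prevalence_dl(events, n):
    """DerSimonian-Laird random-effects pooling of proportions on the
    logit scale; returns pooled prevalence and a 95% CI."""
    events, n = np.asarray(events, float), np.asarray(n, float)
    p = events / n
    y = np.log(p / (1 - p))                   # logit prevalence per study
    v = 1 / events + 1 / (n - events)         # within-study variance of logit
    w = 1 / v                                 # fixed-effect weights
    ybar = (w * y).sum() / w.sum()
    q = (w * (y - ybar) ** 2).sum()           # Cochran's Q
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)   # DL between-study variance
    wr = 1 / (v + tau2)                       # random-effects weights
    mu = (wr * y).sum() / wr.sum()
    se = (1 / wr.sum()) ** 0.5
    inv = lambda z: 1 / (1 + np.exp(-z))      # back-transform to a proportion
    return inv(mu), inv(mu - 1.96 * se), inv(mu + 1.96 * se)

# Illustrative study-level counts (cases, sample sizes), not the review's data:
est, lo, hi = pool_prevalence_dl([5, 12, 30, 8], [100, 150, 400, 90])
```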

  7. Differential modulatory effects of cocaine on marmoset monkey recognition memory.

    PubMed

    Melamed, Jonathan L; de Jesus, Fernando M; Aquino, Jéssica; Vannuchi, Clarissa R S; Duarte, Renata B M; Maior, Rafael S; Tomaz, Carlos; Barros, Marilia

    2017-01-01

    Acute and repeated exposure to cocaine alters the cognitive performance of humans and animals. How each administration schedule affects the same memory task has yet to be properly established in nonhuman primates. Therefore, we assessed the performance of marmoset monkeys in a spontaneous object-location (SOL) recognition memory task after acute and repeated exposure to cocaine (COC; 5 mg/kg, ip). Two identical neutral stimuli were explored on the 10-min sample trial, after which preferential exploration of the displaced vs the stationary object was analyzed on the 10-min test trial. For the acute treatment, cocaine was given immediately after the sample presentation, and spatial recognition was then tested after a 24-h interval. For the repeated exposure schedule, daily cocaine injections were given on 7 consecutive days. After a 7-day drug-free period, the SOL task was carried out with a 10-min intertrial interval. When given acutely postsample, COC improved the marmosets' recognition memory, whereas it had a detrimental effect after the repeated exposure. Thus, depending on the administration schedule, COC exerted opposing effects on the marmosets' ability to recognize spatial changes. This agrees with recent studies in rodents and with the recognition impairment seen in human addicts. Further studies on how the effects of acute cocaine interact with prior drug history in the same cognitive domain are warranted. © 2017 Elsevier B.V. All rights reserved.

  8. Examining mediators of child sexual abuse and sexually transmitted infections.

    PubMed

    Sutherland, Melissa A

    2011-01-01

    Interpersonal violence has increasingly been identified as a risk factor for sexually transmitted infections. Understanding the pathways between violence and sexually transmitted infections is essential to designing effective interventions. The aim of this study was to examine dissociative symptoms, alcohol use, and intimate partner physical violence and sexual coercion as mediators of child sexual abuse and lifetime sexually transmitted infection diagnosis among a sample of women. A convenience sample of 202 women was recruited from healthcare settings, with 189 complete cases for analysis. A multiple mediation model tested the proposed mediators of child sexual abuse and lifetime sexually transmitted infection diagnosis. Bootstrapping, a resampling method, was used to test for mediation. Key variables included child sexual abuse, dissociative symptoms, alcohol use, and intimate partner violence. Child sexual abuse was reported by 46% of the study participants (n = 93). Child sexual abuse was found to have an indirect effect on lifetime sexually transmitted infection diagnosis, with the effect occurring through dissociative symptoms (95% confidence interval = 0.0033-0.4714) and sexual coercion (95% confidence interval = 0.0359-0.7694). Alcohol use and physical violence were not found to be significant mediators. This study suggests that dissociation and intimate partner sexual coercion are important mediators of child sexual abuse and sexually transmitted infection diagnosis. Therefore, interventions that consider the roles of dissociative symptoms and interpersonal violence may be effective in preventing sexually transmitted infections among women.
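The bootstrap test of an indirect (mediated) effect works by resampling cases and recomputing the product of the two regression paths; the sketch below uses synthetic data (the variable names, coefficients, and the continuous outcome proxy are illustrative assumptions, not the study's binary-outcome model).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 189   # matches the study's complete cases; all data below are synthetic

# Invented mediation structure: abuse -> dissociation -> outcome (a
# continuous proxy stands in for the binary STI diagnosis for simplicity).
abuse = rng.binomial(1, 0.46, n).astype(float)
dissoc = 0.5 * abuse + rng.normal(0.0, 1.0, n)
outcome = 0.4 * dissoc + rng.normal(0.0, 1.0, n)

def indirect_effect(x, m, y):
    # Product of paths a*b: a from regressing m on x, b from regressing
    # y on m while controlling for x (ordinary least squares).
    a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap: resample cases, recompute a*b, take 2.5/97.5 quantiles.
boots = np.empty(2000)
for i in range(2000):
    idx = rng.integers(0, n, n)
    boots[i] = indirect_effect(abuse[idx], dissoc[idx], outcome[idx])
lo, hi = np.percentile(boots, [2.5, 97.5])
```

Mediation is inferred when the bootstrap confidence interval excludes zero, as it did for dissociative symptoms and sexual coercion in the study.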

  9. Common Genetic Variant Risk Score Is Associated With Drug-Induced QT Prolongation and Torsade de Pointes Risk: A Pilot Study.

    PubMed

    Strauss, David G; Vicente, Jose; Johannesen, Lars; Blinova, Ksenia; Mason, Jay W; Weeke, Peter; Behr, Elijah R; Roden, Dan M; Woosley, Ray; Kosova, Gulum; Rosenberg, Michael A; Newton-Cheh, Christopher

    2017-04-04

    Drug-induced QT interval prolongation, a risk factor for life-threatening ventricular arrhythmias, is a potential side effect of many marketed and withdrawn medications. The contribution of common genetic variants previously associated with baseline QT interval to drug-induced QT prolongation and arrhythmias is not known. We tested the hypothesis that a weighted combination of common genetic variants contributing to QT interval at baseline, identified through genome-wide association studies, can predict individual response to multiple QT-prolonging drugs. Genetic analysis of 22 subjects was performed in a secondary analysis of a randomized, double-blind, placebo-controlled, crossover trial of 3 QT-prolonging drugs with 15 time-matched QT and plasma drug concentration measurements. Subjects received single doses of dofetilide, quinidine, ranolazine, and placebo. The outcome was the correlation between a genetic QT score comprising 61 common genetic variants and the slope of an individual subject's drug-induced increase in heart rate-corrected QT (QTc) versus drug concentration. The genetic QT score was correlated with drug-induced QTc prolongation. Among white subjects, genetic QT score explained 30% of the variability in response to dofetilide (r = 0.55; 95% confidence interval, 0.09-0.81; P = 0.02), 23% in response to quinidine (r = 0.48; 95% confidence interval, -0.03 to 0.79; P = 0.06), and 27% in response to ranolazine (r = 0.52; 95% confidence interval, 0.05-0.80; P = 0.03). Furthermore, the genetic QT score was a significant predictor of drug-induced torsade de pointes in an independent sample of 216 cases compared with 771 controls (r² = 12%, P = 1×10⁻⁷). We demonstrate that a genetic QT score comprising 61 common genetic variants explains a significant proportion of the variability in drug-induced QT prolongation and is a significant predictor of drug-induced torsade de pointes.
These findings highlight an opportunity for recent genetic discoveries to improve individualized risk-benefit assessment for pharmacological therapies. Replication of these findings in larger samples is needed to more precisely estimate variance explained and to establish the individual variants that drive these effects. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01873950. © 2017 American Heart Association, Inc.
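A genetic score of this kind reduces to a weighted sum of per-variant risk-allele dosages; the sketch below uses random dosages and weights purely for illustration (the study's 61 published variant weights are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_variants = 22, 61   # study dimensions; the data below are fake

# Allele dosages are 0, 1, or 2 copies per variant per subject.
dosages = rng.integers(0, 3, size=(n_subjects, n_variants)).astype(float)
# Invented per-allele effects on baseline QT (ms per allele).
weights = rng.normal(0.5, 0.3, n_variants)

# The genetic score is the weighted sum of dosages for each subject.
qt_score = dosages @ weights
```

Each subject's score would then be correlated against the slope of that subject's drug-induced ΔQTc versus plasma concentration, as described in the abstract.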

  10. Summer effects on body mass index (BMI) gain and growth patterns of American Indian children from kindergarten to first grade: a prospective study

    PubMed Central

    2011-01-01

    Background Overweight and obesity are highly prevalent among American Indian children, especially those living on reservations. There is little scientific evidence about the effects of summer vacation on obesity development in children. The purpose of this study was to investigate the effects of summer vacation between kindergarten and first grade on growth in height, weight, and body mass index (BMI) for a sample of American Indian children. Methods Children had their height and weight measured in four rounds of data collection (yielding three intervals: kindergarten, summer vacation, and first grade) as part of a school-based obesity prevention trial (Bright Start) on a Northern Plains Indian Reservation. Demographic variables were collected at baseline from parent surveys. Growth velocities (Z-score units/year) for BMI, weight, and height were estimated and compared for each interval using generalized linear mixed models. Results The children were taller and heavier than the medians of their same-age counterparts. Height Z-scores were positively associated with increasing weight status category. The mean weight velocity during summer was significantly less than during the school year. More rapid growth velocity in height during summer than during the school year was observed. Obese children gained less adjusted-BMI in the first grade after gaining more than their counterparts during the previous two intervals. No statistically significant interval effects were found for height and BMI velocities. Conclusions There was no indication of a significant summer effect on children's BMI. Rather than seasonal or school-related patterns, the predominant pattern indicated by weight-Z and BMI-Z velocities might be related to age or maturation. Trial registration Bright Start: Obesity Prevention in American Indian Children Clinical Trial Govt ID# NCT00123032 PMID:22192795

  11. 40 CFR Table F-5 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized “Typical” Coarse Aerosol Size...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Estimated Mass Concentration... 53—Estimated Mass Concentration Measurement of PM2.5 for Idealized “Typical” Coarse Aerosol Size Distribution Particle Aerodynamic Diameter (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass...

  12. 40 CFR Table F-5 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized “Typical” Coarse Aerosol Size...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Estimated Mass Concentration... 53—Estimated Mass Concentration Measurement of PM2.5 for Idealized “Typical” Coarse Aerosol Size Distribution Particle Aerodynamic Diameter (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass...

  13. Venetoclax does not prolong the QT interval in patients with hematological malignancies: an exposure-response analysis.

    PubMed

    Freise, Kevin J; Dunbar, Martin; Jones, Aksana K; Hoffman, David; Enschede, Sari L Heitner; Wong, Shekman; Salem, Ahmed Hamed

    2016-10-01

    Venetoclax (ABT-199/GDC-0199) is a selective first-in-class B cell lymphoma-2 inhibitor being developed for the treatment of hematological malignancies. The aim of this study was to determine the potential of venetoclax to prolong the corrected QT (QTc) interval and to evaluate the relationship between systemic venetoclax concentration and QTc interval. The study population included 176 male and female patients with relapsed or refractory chronic lymphocytic leukemia/small lymphocytic lymphoma (n = 105) or non-Hodgkin's lymphoma (n = 71) enrolled in a phase 1 safety, pharmacokinetic, and efficacy study. Electrocardiograms were collected in triplicate at time-matched points (2, 4, 6, and 8 h) prior to the first venetoclax administration and after repeated venetoclax administration to achieve steady state conditions. Venetoclax doses ranged from 100 to 1200 mg daily. Plasma venetoclax samples were collected after steady state electrocardiogram measurements. The mean and upper bound of the 2-sided 90% confidence interval (CI) QTc change from baseline were <5 and <10 ms, respectively, at all time points and doses (<400, 400, and >400 mg). Three subjects had single QTc values >500 ms and/or ΔQTc > 60 ms. The effect of venetoclax concentration on both ΔQTc and QTc was not statistically significant (P > 0.05). At the mean maximum concentrations achieved with therapeutic (400 mg) and supra-therapeutic (1200 mg) venetoclax doses, the estimated drug effects on QTc were 0.137 (90% CI [-1.01 to 1.28]) and 0.263 (90% CI [-1.92 to 2.45]) ms, respectively. Venetoclax does not prolong the QTc interval even at supra-therapeutic doses, and there is no relationship between venetoclax concentrations and QTc interval.

  14. A comparison of single and multiple stressor protocols to assess acute stress in a coastal shark species, Rhizoprionodon terraenovae.

    PubMed

    Hoffmayer, Eric R; Hendon, Jill M; Parsons, Glenn R; Driggers, William B; Campbell, Matthew D

    2015-10-01

    Elasmobranch stress responses are traditionally measured in the field by either singly or serially sampling an animal after a physiologically stressful event. Although capture and handling techniques are effective at inducing a stress response, differences in protocols could affect the degree of stress experienced by an individual, making meaningful comparisons between the protocols difficult, if not impossible. This study acutely stressed Atlantic sharpnose sharks, Rhizoprionodon terraenovae, by standardized capture (rod and reel) and handling methods and implemented either a single or serial blood sampling protocol to monitor four indicators of the secondary stress response. Single-sampled sharks were hooked and allowed to swim around the boat until retrieved for a blood sample at either 0, 15, 30, 45, or 60 min post-hooking. Serially sampled sharks were retrieved, phlebotomized, released while still hooked, and subsequently resampled at 15, 30, 45, and 60 min intervals post-hooking. Blood was analyzed for hematocrit, and plasma glucose, lactate, and osmolality levels. Although both single and serial sampling protocols resulted in an increase in glucose, no significant difference in glucose level was found between protocols. Serially sampled sharks exhibited cumulatively heightened levels for lactate and osmolality at all time intervals when compared to single-sampled animals at the same time. Maximal concentration differences of 217.5, 9.8, and 41.6% were reported for lactate, osmolality, and glucose levels, respectively. Hematocrit increased significantly over time for the single sampling protocol but did not change significantly during the serial sampling protocol. The differences in resultant blood chemistry levels between implemented stress protocols and durations are significant and need to be considered when assessing stress in elasmobranchs.

  15. Practicability of monitoring soil Cd, Hg, and Pb pollution based on a geochemical survey in China.

    PubMed

    Xia, Xueqi; Yang, Zhongfang; Li, Guocheng; Yu, Tao; Hou, Qingye; Mutelo, Admire Muchimamui

    2017-04-01

    Repeated visiting, i.e., sampling and analysis at two or more temporal points, is one of the important ways of monitoring soil heavy metal contamination. Because of the cost involved, however, determining the number of samples and the temporal interval, and the change they are capable of detecting, is a key technical problem to be solved. This depends on the spatial variation of the parameters in the monitoring units. The "National Multi-Purpose Regional Geochemical Survey" (NMPRGS) project in China acquired the spatial distribution of heavy metals using a high-density sampling method in most arable regions of China. Based on soil Cd, Hg, and Pb data, and taking administrative regions as the monitoring units, the numbers of samples and temporal intervals that may be used for monitoring soil heavy metal contamination were determined. Spatial variation of the elements was found to differ widely across the NMPRGS regions, which makes it difficult to determine the minimum detectable changes (MDC), the number of samples, and the temporal intervals for revisiting. This paper recommends a suitable number of samples (n_r) for each region, balancing cost, practicability, and monitoring precision. Under n_r, MDC values are acceptable for all the regions, and the minimum temporal intervals are practical, ranging from 3.3 to 13.3 years. Copyright © 2017 Elsevier Ltd. All rights reserved.
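The link between sample number and detectable change can be sketched with the textbook minimum-detectable-change formula for a paired revisit design; this is an assumed simplification, not the paper's derivation, and the numbers below are illustrative.

```python
from statistics import NormalDist

def min_detectable_change(sd_diff, n, alpha=0.05, power=0.80):
    # Smallest true mean change between two sampling campaigns that a paired
    # test detects with the given power: (z_{1-alpha/2} + z_power) * SE,
    # where SE = sd of paired differences / sqrt(n).
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * sd_diff / n ** 0.5

# Hypothetical: sd of paired Cd differences 0.12 mg/kg over 64 revisited sites.
mdc = min_detectable_change(sd_diff=0.12, n=64)
```

Quadrupling the number of samples halves the MDC, which is the cost/precision trade-off balanced when recommending n_r per region.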

  16. Systematic bias in genomic classification due to contaminating non-neoplastic tissue in breast tumor samples.

    PubMed

    Elloumi, Fathi; Hu, Zhiyuan; Li, Yan; Parker, Joel S; Gulley, Margaret L; Amos, Keith D; Troester, Melissa A

    2011-06-30

    Genomic tests are available to predict breast cancer recurrence and to guide clinical decision making. These predictors provide recurrence risk scores along with a measure of uncertainty, usually a confidence interval. The confidence interval conveys random error and not systematic bias. Standard tumor sampling methods make this problematic, as it is common to have a substantial proportion (typically 30-50%) of a tumor sample composed of histologically benign tissue. This "normal" tissue could represent a source of non-random error or systematic bias in genomic classification. To assess the sensitivity of genomic classification to systematic error from normal contamination, we collected 55 tumor samples and paired tumor-adjacent normal tissue. Using genomic signatures from the tumor and paired normal, we evaluated how increasing normal contamination altered recurrence risk scores for various genomic predictors. Simulations of normal tissue contamination caused misclassification of tumors in all predictors evaluated, but different breast cancer predictors showed different types of vulnerability to normal tissue bias. While two predictors had an unpredictable direction of bias (either higher or lower risk of relapse resulted from normal contamination), one signature showed a predictable direction of normal tissue effects. Due to this predictable direction of effect, this signature (the PAM50) was adjusted for normal tissue contamination, and these corrections improved sensitivity and negative predictive value. For all three assays, quality control standards and/or appropriate bias adjustment strategies can be used to improve assay reliability. Normal tissue sampled concurrently with tumor is an important source of bias in breast genomic predictors.
All genomic predictors show some sensitivity to normal tissue contamination and ideal strategies for mitigating this bias vary depending upon the particular genes and computational methods used in the predictor.
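A predictable direction of bias follows from linearity: if a risk score is a weighted sum of expression values, mixing in a fraction f of normal tissue shifts the score linearly toward the pure-normal score. A toy sketch (all profiles and weights are invented; real signatures such as PAM50 use specific gene sets and more elaborate scoring):

```python
import numpy as np

rng = np.random.default_rng(5)
n_genes = 50

# Invented expression profiles and per-gene weights of a linear risk score.
tumor = rng.normal(1.0, 0.5, n_genes)
normal = rng.normal(0.0, 0.5, n_genes)
signature = rng.normal(0.2, 0.1, n_genes)

def risk_score(profile, w=signature):
    return float(profile @ w)

# A contaminated sample is a convex mixture, so its score moves linearly
# from the pure-tumor score toward the pure-normal score as f grows; a
# known contamination fraction therefore admits a direct correction.
for f in (0.0, 0.3, 0.5):
    mixed = (1 - f) * tumor + f * normal
    print(f, risk_score(mixed))
```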

  17. Using known populations of pronghorn to evaluate sampling plans and estimators

    USGS Publications Warehouse

    Kraft, K.M.; Johnson, D.H.; Samuelson, J.M.; Allen, S.H.

    1995-01-01

    Although sampling plans and estimators of abundance have good theoretical properties, their performance in real situations is rarely assessed because true population sizes are unknown. We evaluated widely used sampling plans and estimators of population size on 3 known clustered distributions of pronghorn (Antilocapra americana). Our criteria were accuracy of the estimate, coverage of 95% confidence intervals, and cost. Sampling plans were combinations of sampling intensities (16, 33, and 50%), sample selection (simple random sampling without replacement, systematic sampling, and probability proportional to size sampling with replacement), and stratification. We paired sampling plans with suitable estimators (simple, ratio, and probability proportional to size). We used area of the sampling unit as the auxiliary variable for the ratio and probability proportional to size estimators. All estimators were nearly unbiased, but precision was generally low (overall mean coefficient of variation [CV] = 29). Coverage of 95% confidence intervals was only 89% because of the highly skewed distribution of the pronghorn counts and small sample sizes, especially with stratification. Stratification combined with accurate estimates of optimal stratum sample sizes increased precision, reducing the mean CV from 33 without stratification to 25 with stratification; costs increased 23%. Precise results (mean CV = 13) but poor confidence interval coverage (83%) were obtained with simple and ratio estimators when the allocation scheme included all sampling units in the stratum containing most pronghorn. Although areas of the sampling units varied, ratio estimators and probability proportional to size sampling did not increase precision, possibly because of the clumped distribution of pronghorn. Managers should be cautious in using sampling plans and estimators to estimate abundance of aggregated populations.
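The core evaluation loop can be sketched for the simplest case considered, simple random sampling without replacement with the expansion estimator and a normal-theory confidence interval, on a synthetic clumped population (all numbers are illustrative, not the pronghorn data):

```python
import numpy as np

rng = np.random.default_rng(11)

# Clumped "population" of counts over N sampling units: most units empty,
# a few large aggregations, loosely mimicking clustered animal groups.
N = 120
pop = np.zeros(N)
pop[rng.choice(N, 15, replace=False)] = rng.poisson(40, 15)
true_total = pop.sum()

n = 40          # roughly the 33% sampling intensity from the study
covered = 0
reps = 2000
for _ in range(reps):
    idx = rng.choice(N, n, replace=False)    # SRS without replacement
    est = N * pop[idx].mean()                # expansion estimator of the total
    # Variance of the estimated total, with finite population correction.
    var = N ** 2 * (1 - n / N) * pop[idx].var(ddof=1) / n
    half = 1.96 * var ** 0.5
    covered += (est - half <= true_total <= est + half)
print("CI coverage:", covered / reps)  # typically below the nominal 0.95
                                       # for skewed counts, as the study found
```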

  18. The impact of hypnotic suggestibility in clinical care settings.

    PubMed

    Montgomery, Guy H; Schnur, Julie B; David, Daniel

    2011-07-01

    Hypnotic suggestibility has been described as a powerful predictor of outcomes associated with hypnotic interventions. However, there have been no systematic approaches to quantifying this effect across the literature. This meta-analysis evaluates the magnitude of the effect of hypnotic suggestibility on hypnotic outcomes in clinical settings. PsycINFO and PubMed were searched from their inception through July 2009. Thirty-four effects from 10 studies and 283 participants are reported. Results revealed a statistically significant overall effect size in the small to medium range (r = .24; 95% Confidence Interval = -0.28 to 0.75), indicating that greater hypnotic suggestibility led to greater effects of hypnosis interventions. Hypnotic suggestibility accounted for 6% of the variance in outcomes. Smaller sample size studies, use of the SHCS, and pediatric samples tended to result in larger effect sizes. The authors question the usefulness of assessing hypnotic suggestibility in clinical contexts.

  19. The impact of hypnotic suggestibility in clinical care settings

    PubMed Central

    Montgomery, Guy H.; Schnur, Julie B.; David, Daniel

    2013-01-01

    Hypnotic suggestibility has been described as a powerful predictor of outcomes associated with hypnotic interventions. However, there have been no systematic approaches to quantifying this effect across the literature. The present meta-analysis evaluates the magnitude of the effect of hypnotic suggestibility on hypnotic outcomes in clinical settings. PsycINFO and PubMed were searched from their inception through July 2009. Thirty-four effects from ten studies and 283 participants are reported. Results revealed a statistically significant overall effect size in the small to medium range (r = 0.24; 95% Confidence Interval = −0.28 to 0.75), indicating that greater hypnotic suggestibility led to greater effects of hypnosis interventions. Hypnotic suggestibility accounted for 6% of the variance in outcomes. Smaller sample size studies, use of the SHCS, and pediatric samples tended to result in larger effect sizes. Results question the usefulness of assessing hypnotic suggestibility in clinical contexts. PMID:21644122

  20. Screening for phaeochromocytoma and paraganglioma: impact of using supine reference intervals for plasma metanephrines with samples collected from fasted/seated patients.

    PubMed

    Casey, R; Griffin, T P; Wall, D; Dennedy, M C; Bell, M; O'Shea, P M

    2017-01-01

    Background The Endocrine Society Clinical Practice Guideline on Phaeochromocytoma and Paraganglioma recommends phlebotomy for plasma-free metanephrines with patients fasted and supine, using appropriately defined reference intervals. Studies have shown higher diagnostic sensitivities using these criteria. Further, when seated-sampling protocols are used, reference intervals that do not compromise diagnostic sensitivity should be employed for result interpretation. Objective To determine the impact on diagnostic performance and financial cost of using supine reference intervals for result interpretation with our current plasma-free metanephrines fasted/seated-sampling protocol. Methods We conducted a retrospective cohort study of patients who underwent screening for phaeochromocytoma and paraganglioma (PPGL) using plasma-free metanephrines from 2009 to 2014 at Galway University Hospitals. Plasma-free metanephrines were measured using liquid chromatography-tandem mass spectrometry. Supine thresholds for plasma normetanephrine and metanephrine set at 610 pmol/L and 310 pmol/L, respectively, were used. Results A total of 183 patients were evaluated. Mean age of participants was 53.4 (±16.3) years. Five of 183 (2.7%) patients had histologically confirmed PPGL (males, n=4). Using seated reference intervals for plasma-free metanephrines, diagnostic sensitivity and specificity were 100% and 98.9%, respectively, with two false-positive cases. Application of reference intervals established in subjects supine and fasted to this cohort gave diagnostic sensitivity of 100% with specificity of 74.7%. Financial analysis of each pretesting strategy demonstrated cost-equivalence (€147.27/patient). Conclusion Our cost analysis, together with the evidence that fasted/supine sampling for plasma-free metanephrines offers more reliable exclusion of PPGL, mandates changing our current practice. 
This study highlights the important advantages of standardized diagnostic protocols for plasma-free metanephrines to ensure the highest diagnostic accuracy for investigation of PPGL.
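The reported accuracy figures follow directly from the counts given in the abstract (183 patients screened, 5 confirmed PPGL, 2 false positives under the seated reference intervals); a quick check of the arithmetic:

```python
# Sensitivity/specificity arithmetic behind the reported figures. Counts
# are taken from the abstract: 183 patients screened, 5 with confirmed
# PPGL (all detected), 2 false positives under seated reference intervals.
cases, non_cases = 5, 183 - 5
tp, fp = 5, 2
fn = cases - tp
tn = non_cases - fp

sensitivity = tp / (tp + fn)    # 5/5
specificity = tn / (tn + fp)    # 176/178
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```

This reproduces the stated 100% sensitivity and 98.9% specificity for the seated reference intervals.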

  1. Observer Error when Measuring Safety-Related Behavior: Momentary Time Sampling versus Whole-Interval Recording

    ERIC Educational Resources Information Center

    Taylor, Matthew A.; Skourides, Andreas; Alvero, Alicia M.

    2012-01-01

Interval recording procedures are used by observers who collect data through observation to estimate the cumulative occurrence and nonoccurrence of behavior/events. Although interval recording procedures can increase the efficiency of observational data collection, they can also introduce observer error. In the present study, 50 observers were…
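The two recording procedures the title contrasts can be simulated to show why they disagree; a minimal sketch using a synthetic stream of independent one-second behavior states (a simplification, since real behavior streams are autocorrelated):

```python
import numpy as np

rng = np.random.default_rng(5)

# One-second behavior stream: True while the target behavior occurs.
# Independent seconds are a simplification of real, autocorrelated behavior.
stream = rng.random(600) < 0.3
true_pct = stream.mean()                # actual fraction of time occupied

interval = 10                           # seconds per observation interval
blocks = stream.reshape(-1, interval)

# Momentary time sampling (MTS): score an interval by its final instant.
mts = blocks[:, -1].mean()
# Whole-interval recording (WIR): score only if behavior fills the interval.
wir = blocks.all(axis=1).mean()

print(true_pct, mts, wir)
```

With these settings MTS is a roughly unbiased estimator of the fraction of time occupied, while WIR systematically underestimates it, since an interval counts only when all ten seconds are occupied.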

  2. Using an R Shiny to Enhance the Learning Experience of Confidence Intervals

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2018-01-01

    Many students find understanding confidence intervals difficult, especially because of the amalgamation of concepts such as confidence levels, standard error, point estimates and sample sizes. An R Shiny application was created to assist the learning process of confidence intervals using graphics and data from the US National Basketball…

  3. Machine learning approaches for estimation of prediction interval for the model output.

    PubMed

    Shrestha, Durga L; Solomatine, Dimitri P

    2006-03-01

A novel method for estimating prediction uncertainty using machine learning techniques is presented. Uncertainty is expressed in the form of the two quantiles (constituting the prediction interval) of the underlying distribution of prediction errors. The idea is to partition the input space into different zones or clusters having similar model errors using fuzzy c-means clustering. The prediction interval is constructed for each cluster on the basis of the empirical distribution of the errors associated with all instances belonging to the cluster under consideration, and is propagated from each cluster to the examples according to their membership grades in each cluster. Then a regression model is built for in-sample data using the computed prediction limits as targets, and finally, this model is applied to estimate the prediction intervals (limits) for out-of-sample data. The method was tested on artificial and real hydrologic data sets using various machine learning techniques. Preliminary results show that the method is superior to other methods of estimating the prediction interval. A new method for evaluating the performance of prediction interval estimators is proposed as well.
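The cluster-then-quantile idea can be sketched in a few lines; this toy version uses hard quantile bins in place of fuzzy c-means and skips the final regression step, so it illustrates the principle rather than the authors' full method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic problem: model error grows with x, so a single
# global prediction interval would be miscalibrated everywhere.
x = rng.uniform(0, 10, 500)
y_true = np.sin(x)
y_pred = y_true + rng.normal(0, 0.05 + 0.05 * x)     # simulated model output
errors = y_pred - y_true

# Hard partition of the input space (the paper uses fuzzy c-means and
# membership-weighted propagation; fixed quantile bins stand in here).
n_clusters = 5
edges = np.quantile(x, np.linspace(0, 1, n_clusters + 1))
cluster = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_clusters - 1)

# Per-cluster empirical 5%/95% error quantiles give a 90% prediction interval.
limits = np.array([np.quantile(errors[cluster == c], [0.05, 0.95])
                   for c in range(n_clusters)])
widths = limits[:, 1] - limits[:, 0]
print(widths)   # widths grow with x because the error variance does
```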

  4. Variability of reflectance measurements with sensor altitude and canopy type

    NASA Technical Reports Server (NTRS)

    Daughtry, C. S. T.; Vanderbilt, V. C.; Pollara, V. J.

    1981-01-01

Data were acquired on canopies of mature corn planted in 76 cm rows, mature soybeans planted in 96 cm rows with 71 percent soil cover, and mature soybeans planted in 76 cm rows with 100 percent soil cover. A LANDSAT band radiometer with a 15 degree field of view was used at ten altitudes ranging from 0.2 m to 10 m above the canopy. At each altitude, measurements were taken at 15 cm intervals along a 2.0 m transect perpendicular to the crop row direction. Reflectance data were plotted as a function of altitude and horizontal position to verify that the variance of measurements at low altitudes was attributable to row effects, which disappear at higher altitudes where the sensor integrates across several rows. The coefficient of variation of reflectance decreased exponentially as the sensor was elevated. Systematic sampling (at odd multiples of 0.5 times the row spacing interval) required fewer measurements than simple random sampling over row crop canopies.

  5. Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data

    NASA Astrophysics Data System (ADS)

    Stegmeir, Matthew; Kassen, Dan

    2016-11-01

As Particle Image Velocimetry (PIV) has matured, it has developed into a robust and flexible velocimetry technique used by expert and non-expert users alike. While historical estimates of PIV accuracy have relied heavily on "rules of thumb" and analysis of idealized synthetic images, increased emphasis has recently been placed on quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Real-world experimental conditions often complicate the collection of "optimal" data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work uses the results of PIV uncertainty quantification techniques to develop a framework in which PIV users can apply estimated confidence intervals to compute reliable convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and guidelines and procedures that leverage estimated PIV confidence intervals for efficient sampling toward converged statistics are provided.
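The core idea of confidence-interval-based convergence can be sketched generically: keep sampling until the confidence interval on the statistic of interest is narrow enough. The code below uses a plain Gaussian confidence interval on the mean of synthetic point samples, not the per-vector PIV uncertainty propagation developed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stream of instantaneous velocity samples (a synthetic stand-in for PIV
# vectors at one point): true mean 10.0, fluctuation std 2.0.
samples = rng.normal(10.0, 2.0, 20_000)

def n_for_convergence(data, rel_tol=0.01, z=1.96, min_n=30):
    """Smallest sample count at which the 95% CI half-width of the mean
    drops below rel_tol * |mean|. A sketch of CI-based convergence only;
    the paper's PIV-specific uncertainty estimates are not reproduced."""
    for n in range(min_n, len(data) + 1):
        chunk = data[:n]
        half_width = z * chunk.std(ddof=1) / np.sqrt(n)
        if half_width < rel_tol * abs(chunk.mean()):
            return n
    return None

n_needed = n_for_convergence(samples)
# Theory predicts roughly (z * sigma / (rel_tol * mu))**2 ≈ 1540 samples here.
print(n_needed)
```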

  6. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics.

    PubMed

    Feng, Shu; Gale, Michael J; Fay, Jonathan D; Faridi, Ambar; Titus, Hope E; Garg, Anupam K; Michaels, Keith V; Erker, Laura R; Peters, Dawn; Smith, Travis B; Pennesi, Mark E

    2015-09-01

    To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population.

  7. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  8. Automatic frequency control for FM transmitter

    NASA Technical Reports Server (NTRS)

    Honnell, M. A. (Inventor)

    1974-01-01

    An automatic frequency control circuit for an FM television transmitter is described. The frequency of the transmitter is sampled during what is termed the back porch portion of the horizontal synchronizing pulse which occurs during the retrace interval, the frequency sample compared with the frequency of a reference oscillator, and a correction applied to the frequency of the transmitter during this portion of the retrace interval.

  9. Ultrasonic sensor and method of use

    DOEpatents

    Condreva, Kenneth J.

    2001-01-01

An ultrasonic sensor system and method of use for measuring transit time through a liquid sample, using one ultrasonic transducer coupled to a precision time interval counter. The timing circuit captures changes in transit time, representing small changes in the velocity of sound transmitted, over necessarily small time intervals (nanoseconds) and uses the transit time changes to identify the presence of non-conforming constituents in the sample.

  10. Sex-specific reference intervals of hematologic and biochemical analytes in Sprague-Dawley rats using the nonparametric rank percentile method.

    PubMed

    He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping

    2017-01-01

Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine treatment-induced effects and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have not used internationally recommended statistical methods, nor have they been stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to Clinical and Laboratory Standards Institute C28-A3 and American Society for Veterinary Clinical Pathology guidelines. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer and 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using the nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol and urea compared to males. Sex partition was required for most hematologic and biochemical analytes in Sprague-Dawley rats. 
We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally sex-specific reference intervals allows a more precise evaluation of animal quality and experimental results of Sprague-Dawley rats in our toxicology safety assessment.
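The nonparametric rank percentile computation, with confidence intervals for the interval limits, can be sketched as follows; bootstrap confidence intervals stand in for the exact rank-order intervals of CLSI C28-A3, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic analyte values for 250 animals of one sex (the CLSI
# nonparametric approach requires at least 120 reference subjects).
values = rng.lognormal(mean=2.0, sigma=0.25, size=250)

# Nonparametric reference interval: 2.5th and 97.5th rank percentiles.
lower, upper = np.percentile(values, [2.5, 97.5])

# 90% confidence intervals for each reference limit via bootstrap
# (a common stand-in for the exact rank-order CIs of CLSI C28-A3).
boot = np.array([
    np.percentile(rng.choice(values, size=values.size, replace=True), [2.5, 97.5])
    for _ in range(2000)
])
lower_ci = np.percentile(boot[:, 0], [5, 95])
upper_ci = np.percentile(boot[:, 1], [5, 95])

print(f"reference interval: {lower:.2f}-{upper:.2f}")
print(f"90% CI of lower limit: {lower_ci}, of upper limit: {upper_ci}")
```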

  11. Evaluation of fertilization-to-planting and fertilization-to-harvest intervals for safe use of noncomposted bovine manure in Wisconsin vegetable production.

    PubMed

    Ingham, Steven C; Fanslau, Melody A; Engel, Rebecca A; Breuer, Jeffry R; Breuer, Jane E; Wright, Thomas H; Reith-Rozelle, Judith K; Zhu, Jun

    2005-06-01

Fresh bovine manure was mechanically incorporated into loamy sand and silty clay loam Wisconsin soils in April 2004. At varying fertilization-to-planting intervals, radish, lettuce, and carrot seeds were planted; crops were harvested 90, 100, 110 or 111, and 120 days after manure application. As an indicator of potential contamination with fecal pathogens, levels of Escherichia coli in the manure-fertilized soil and the presence of E. coli on harvested vegetables were monitored. From initial levels of 4.0 to 4.2 log CFU/g, E. coli levels in both manure-fertilized soils decreased by 2.4 to 2.5 log CFU/g during the first 7 weeks. However, E. coli was consistently detected in enriched soil samples through week 17, perhaps as a result of contamination by birds and other wildlife. In the higher-clay silty clay loam soil, the fertilization-to-planting interval affected the prevalence of E. coli on lettuce but not on radishes and carrots. Root crop contamination was consistent across different fertilization-to-harvest intervals in silty clay loam, including the National Organic Program minimum fertilization-to-harvest interval of 120 days. However, lettuce contamination in silty clay loam was significantly (P < 0.10) affected by the fertilization-to-harvest interval. Increasing the fertilization-to-planting interval in the lower-clay loamy sand soil decreased the prevalence of E. coli on root crops. The fertilization-to-harvest interval had no clear effect on vegetable contamination in loamy sand. Overall, these results do not provide grounds for reducing the National Organic Program minimum fertilization-to-harvest interval from the current 120-day standard.

  12. Questa baseline and premining ground-water quality investigation. 8. Lake-sediment geochemical record from 1960 to 2002, Eagle Rock and Fawn Lakes, Taos County, New Mexico

    USGS Publications Warehouse

    Church, S.E.; Fey, D.L.; Marot, M.E.

    2005-01-01

Geochemical studies of lake sediment from Eagle Rock Lake and upper Fawn Lake were conducted to evaluate the effect of mining at the Molycorp Questa porphyry molybdenum deposit located immediately north of the Red River. Two cores were taken, one from each lake near the outlet where the sediment was thinnest, and they were sampled at 1-cm intervals to provide geochemical data at less than 1-year resolution. Samples from the core intervals were digested and analyzed for 34 elements using ICP-AES (inductively coupled plasma-atomic emission spectrometry). The activity of 137Cs has been used to establish the beginning of sedimentation in the two lakes. Correlation of the geochemistry of heavy-mineral suites in the cores from both Fawn and Eagle Rock Lakes has been used to develop a sedimentation model to date the intervals sampled. The core from upper Fawn Lake, located upstream of the deposit, provided an annual sedimentary record of the geochemical baseline for material being transported in the Red River, whereas the core from Eagle Rock Lake, located downstream of the deposit, provided an annual record of the effect of mining at the Questa mine on the sediment in the Red River. Abrupt changes in the concentrations of many lithophile and deposit-related metals occur in the middle of the Eagle Rock Lake core, which we correlate with the major flood of record recorded at the Questa gage at Eagle Rock Lake in 1979. Sediment from the Red River collected at low flow in 2002 is a poor match for the geochemical data from the sediment core in Eagle Rock Lake. The change in sediment geochemistry in Eagle Rock Lake in the post-1979 interval is dramatic and requires that a new source of sediment be identified that has substantially different geochemistry from that in the pre-1979 core interval. Loss of mill tailings from pipeline breaks is most likely responsible for some of the spikes in trace-element concentrations in the Eagle Rock Lake core. 
Enrichment of Al2O3, Cu, and Zn occurred as a result of chemical precipitation of these metals from ground water upstream in the Red River. Comparisons of the geochemistry of the post-1979 sediment core with both mine wastes and with premining sediment from the vicinity of the Questa mine indicate that both are possible sources for this new component of sediment. Existing data have not resolved this enigma.

  13. Reference intervals for putative biomarkers of drug-induced liver injury and liver regeneration in healthy human volunteers.

    PubMed

    Francis, Ben; Clarke, Joanna I; Walker, Lauren E; Brillant, Nathalie; Jorgensen, Andrea L; Park, B Kevin; Pirmohamed, Munir; Antoine, Daniel J

    2018-05-02

The potential of mechanistic biomarkers to improve the prediction of drug-induced liver injury (DILI) and hepatic regeneration is widely acknowledged. We sought to determine reference intervals for new biomarkers of DILI and regeneration, and to characterize their natural variability and the impact of diurnal variation. Serum samples were collected from 200 healthy volunteers recruited as part of a cross-sectional study; of these, 50 subjects had weekly serial sampling over 3 weeks, while 24 had intensive blood sampling over a 24-h period. Alanine aminotransferase (ALT), MicroRNA-122 (miR-122), high mobility group box-1 (HMGB1), total keratin-18 (FL-K18), caspase cleaved keratin-18 (cc-K18), glutamate dehydrogenase (GLDH) and colony stimulating factor-1 (CSF-1) were assessed by validated assays. Reference intervals were established for each biomarker based on the 97.5% quantile (90% CI) following the assessment of fixed effects in univariate and multivariable models (ALT 50 (41-50) U/l, miR-122 3548 (2912-4321) copies/µl, HMGB1 2.3 (2.2-2.4) ng/ml, FL-K18 475 (456-488) U/l, cc-K18 272 (256-291) U/l, GLDH 27 (26-30) U/l and CSF-1 2.4 (2.3-2.9) ng/ml). A small but significant intra-individual random time effect was detected, but no significant impact of diurnal variation was observed, with the exception of GLDH. Reference intervals for novel DILI biomarkers have been described for the first time. The upper limit of a reference range may represent the most appropriate way to apply these data. Regulatory authorities have published letters of support encouraging further qualification of leading candidate biomarkers. These data can now be used to interpret data from exploratory clinical DILI studies and to assist their further qualification. Drug-induced liver injury (DILI) has a major impact on patient health and the development of new medicines. Unfortunately, currently used blood-based tests to assess liver injury and recovery have important limitations. 
Newer blood-based tests (biomarkers) have been described that accurately predict the onset of, and recovery from, DILI. Here we describe reference intervals from investigations designed, for the first time, with the intention of assessing the natural variation of these newer biomarkers in healthy volunteers. These results can be used to aid the interpretation of data from patients with suspected liver toxicity. Copyright © 2018. Published by Elsevier B.V.

  14. Profile local linear estimation of generalized semiparametric regression model for longitudinal data.

    PubMed

    Sun, Yanqing; Sun, Liuquan; Zhou, Jie

    2013-07-01

This paper studies the generalized semiparametric regression model for longitudinal data where the covariate effects are constant for some covariates and time-varying for others. Different link functions can be used to allow more flexible modelling of longitudinal data. The nonparametric components of the model are estimated using a local linear estimating equation, and the parametric components are estimated through a profile estimating function. The method automatically adjusts for heterogeneity of sampling times, allowing the sampling strategy to depend on the past sampling history as well as possibly time-dependent covariates without specifically modelling such dependence. A [Formula: see text]-fold cross-validation bandwidth selection is proposed as a working tool for locating an appropriate bandwidth. A criterion for selecting the link function is proposed to provide a better fit to the data. Large sample properties of the proposed estimators are investigated. Large sample pointwise and simultaneous confidence intervals for the regression coefficients are constructed. Formal hypothesis testing procedures are proposed to check for the covariate effects and whether the effects are time-varying. A simulation study is conducted to examine the finite sample performance of the proposed estimation and hypothesis testing procedures. The methods are illustrated with a data example.

  15. In vitro effects of 6 % hydroxyethyl starch 130/0.42 solution on feline whole blood coagulation measured by rotational thromboelastometry.

    PubMed

    Albrecht, Nathalie A; Howard, Judith; Kovacevic, Alan; Adamik, Katja N

    2016-07-26

The artificial colloid, hydroxyethyl starch (HES), is recommended for intravascular volume expansion and colloid-osmotic pressure enhancement in dogs and cats. A well-known side effect of HES solutions in humans and dogs is coagulopathy. However, HES-associated coagulopathy has thus far not been investigated in cats. The goal of this study was to assess the in vitro effects of 6 % HES 130/0.42 on feline whole blood samples using rotational thromboelastometry (ROTEM). A further goal was to develop feline reference intervals for ROTEM at our institution. In this in vitro experimental study, blood samples of 24 adult healthy cats were collected by atraumatic jugular phlebotomy following intramuscular sedation. Baseline ROTEM analyses (using ex-tem, in-tem and fib-tem assays) were performed in duplicate. Additionally, ROTEM analyses were performed on blood samples after dilution with either Ringer's acetate (RA) or 6 % HES 130/0.42 (HES) in a 1:6 dilution (i.e. 1 part solution and 6 parts blood). Coefficients of variation of duplicate measures were below 12 % in all ex-tem assays, in 3 of 4 in-tem assays, but in only 1 of 3 fib-tem assays. Reference intervals were similar to, albeit somewhat narrower than, those previously published. Dilution with both solutions led to significantly prolonged CT (in-tem), CFT (ex-tem and in-tem), and reduced MCF (ex-tem, in-tem, and fib-tem) and alpha (ex-tem and in-tem). Compared to RA, dilution with HES caused a significant prolongation of CT in fib-tem (P = 0.016), CFT in ex-tem (P = 0.017) and in-tem (P = 0.019), as well as a reduction in MCF in in-tem (P = 0.032) and fib-tem (P = 0.020), and alpha in ex-tem (P = 0.014). However, only a single parameter (CFT in ex-tem) was outside of the established reference interval after dilution with HES. In vitro hemodilution of feline blood with RA and HES causes a small but significant impairment of whole blood coagulation, with HES leading to a significantly greater effect on coagulation than RA. 
Further studies are necessary to evaluate the in vivo effects and the clinical significance of these findings.

  16. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

Prediction intervals quantify the range within which the outputs of a regression model can be expected to fall. These intervals can then be used to determine whether an observed output is anomalous, conditioned on the input. In this paper, a procedure for determining prediction intervals for the outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals, with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is proved theoretically, and their validity is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
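A minimal residual-bootstrap sketch of the idea, using a Nadaraya-Watson kernel smoother as the nonparametric regressor (the paper's exact resampling scheme and models are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D regression problem with additive noise (std 0.3).
x = np.sort(rng.uniform(0, 2 * np.pi, 300))
y = np.sin(x) + rng.normal(0, 0.3, x.size)

def smooth(x_train, y_train, x_eval, bandwidth=0.5):
    """Nadaraya-Watson kernel smoother: a simple nonparametric regressor
    standing in for whatever model the bootstrap procedure wraps."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Residual bootstrap: refit on (x, fitted + resampled residuals), then add
# a fresh resampled residual to each bootstrap fit to capture noise.
fitted = smooth(x, y, x)
resid = y - fitted
grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
preds = []
for _ in range(200):
    y_star = fitted + rng.choice(resid, size=resid.size)
    preds.append(smooth(x, y_star, grid) + rng.choice(resid, size=grid.size))
preds = np.array(preds)

# Pointwise 95% prediction interval from the bootstrap distribution.
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)
coverage_width = (hi - lo).mean()
print(round(coverage_width, 2))
```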

  17. The Incidence, Prevalence and Burden of OM in Unselected Children Aged 1 to 8 Years Followed by Weekly Otoscopy through the “Common Cold” Season

    PubMed Central

    Mandel, Ellen M.; Doyle, William J.; Winther, Birgit; Alper, Cuneyt M.

    2008-01-01

    Background There is a continuing interest in defining the incidence, prevalence and burden of otitis media (OM) in the individual and population for purposes of assigning “risk factors”. Often overlooked in past studies are the contributions of cold-like illnesses (CLIs) and sampling interval to those estimates. Objective Describe the incidence of symptomatic (AOM) and asymptomatic (OME) OM, the prevalence of OM, the contribution of CLI incidence, burden and other OM “risk factors” to the incidence and burden of OM, and the effect of sampling interval on those measures in children. Methods 148 children (74 male; 131 white, aged 1.0–8.6 years) were followed from November 1 to April 30 by weekly pneumatic otoscopy to diagnose OM presence/absence and by daily parental diary to assign CLI episodes. Data for previously identified OM “risk factors” were collected on 127. Results were summarized using standard measures of incidence, prevalence and burden, and multiple-regression techniques were used to identify OM “risk factors”. Results The basal OM prevalence was 20% with peaks in December and March, and the temporal pattern was correlated with CLI prevalence. The incidence of OME (per 27232 child-days) was 317, AOM was 74 and CLI was 456. The seasonal pattern of AOM and OME incidence tracked, and was correlated with, that of CLIs. New OM episodes were usually of short duration (≤7 days in 40%, ≤4 weeks in 75–90%) and the usual OM burden was low (median=12%). OM and breastfeeding histories and CLI incidence/prevalence were significant predictors of OME and AOM incidence and OM burden. Longer sampling intervals were less efficient in capturing AOM and OME durations and incidences, but not OM burden. Conclusions These results demonstrate a high incidence and prevalence of OM; most OM episodes were of short duration, and longer sampling intervals introduced biases into some parameter estimates. 
There was a significant relationship between OM and CLI incidence, prevalence and burden suggesting that CLI experience should be controlled for in assessing independent “risk factors” for AOM and OME. PMID:18272237

  18. Influence of acidic beverage (Coca-Cola) on pharmacokinetics of ibuprofen in healthy rabbits.

    PubMed

    Kondal, Amit; Garg, S K

    2003-11-01

The study was aimed at determining the effect of Coca-Cola on the pharmacokinetics of ibuprofen in rabbits. In a cross-over study, ibuprofen was given orally in a dose of 56 mg/kg, prepared as a 0.5% suspension in carboxymethyl cellulose (CMC), and blood samples (1 ml) were drawn at different time intervals from 0-12 hr. After a washout period of 7 days, Coca-Cola in a dose of 5 ml/kg was administered along with ibuprofen (56 mg/kg) and blood samples were drawn from 0-12 hr. To these rabbits, 5 ml/kg Coca-Cola was administered once daily for another 7 days. On the 8th day, Coca-Cola (5 ml/kg) along with ibuprofen (56 mg/kg), prepared as a suspension, was administered and blood samples (1 ml each) were drawn at the same time intervals. Plasma was separated and assayed for ibuprofen by HPLC, and various pharmacokinetic parameters were calculated. The Cmax and AUC0-alpha of ibuprofen were significantly increased after both single and multiple doses of Coca-Cola, indicating an increased extent of absorption of ibuprofen. The results warrant a reduction in ibuprofen daily dosage and frequency when administered with Coca-Cola.

  19. Effect of Acute Exercise on Fatigue in People with ME/CFS/SEID: A Meta-analysis.

    PubMed

    Loy, Bryan D; O'Connor, Patrick J; Dishman, Rodney K

    2016-10-01

    A prominent symptom of myalgic encephalomyelitis, chronic fatigue syndrome, or systemic exertion intolerance disease (ME/CFS/SEID) is persistent fatigue that is worsened by physical exertion. Here the population effect of a single bout of exercise on fatigue symptoms in people with ME/CFS/SEID was estimated, and effect moderators were identified. Google Scholar was systematically searched for peer-reviewed articles published between February 1991 and May 2015. Studies were included where people diagnosed with ME/CFS/SEID and matched control participants completed a single bout of exercise and fatigue self-reports were obtained before and after exercise. Fatigue means, standard deviations, and sample sizes were extracted to calculate effect sizes and the 95% confidence interval. Effects were pooled using a random-effects model and corrected for small sample bias to generate mean Δ. Multilevel regression modeling adjusted for nesting of effects within studies. Moderators identified a priori were diagnostic criteria, fibromyalgia comorbidity, exercise factors (intensity, duration, and type), and measurement factors. Seven studies examining 159 people with ME/CFS/SEID met inclusion criteria, and 47 fatigue effects were derived. The mean fatigue effect was Δ = 0.73 (95% confidence interval = 0.24-1.23). Fatigue increases were larger for people with ME/CFS/SEID when fatigue was measured 4 h or more after exercise ended rather than during or immediately after exercise ceased. This preliminary evidence indicates that acute exercise increases fatigue in people with ME/CFS/SEID more than in control groups, but effects were heterogeneous between studies. Future studies with no-exercise control groups of people with ME/CFS/SEID are needed to obtain a more precise estimate of the effect of exercise on fatigue in this population.
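The effect-size arithmetic the methods describe (a standardized mean difference with small-sample correction and a 95% confidence interval) can be sketched generically; the input numbers below are hypothetical and not taken from any included study:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with small-sample (Hedges) correction
    and its 95% CI: a generic sketch of the effect-size arithmetic, not
    the authors' exact multilevel model."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d (pooled SD)
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample bias correction
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical post-exercise fatigue ratings: patients vs. controls.
g, ci = hedges_g(m1=6.2, sd1=1.8, n1=20, m2=4.1, sd2=1.9, n2=20)
print(round(g, 2), [round(c, 2) for c in ci])
```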

  20. CLSI-based transference of CALIPER pediatric reference intervals to Beckman Coulter AU biochemical assays.

    PubMed

    Abou El Hassan, Mohamed; Stoianov, Alexandra; Araújo, Petra A T; Sadeghieh, Tara; Chan, Man Khun; Chen, Yunqi; Randell, Edward; Nieuwesteeg, Michelle; Adeli, Khosrow

    2015-11-01

    The CALIPER program has established a comprehensive database of pediatric reference intervals using largely the Abbott ARCHITECT biochemical assays. To expand clinical application of CALIPER reference standards, the present study is aimed at transferring CALIPER reference intervals from the Abbott ARCHITECT to Beckman Coulter AU assays. Transference of CALIPER reference intervals was performed based on the CLSI guidelines C28-A3 and EP9-A2. The new reference intervals were directly verified using up to 100 reference samples from the healthy CALIPER cohort. We found a strong correlation between Abbott ARCHITECT and Beckman Coulter AU biochemical assays, allowing the transference of the vast majority (94%; 30 out of 32 assays) of CALIPER reference intervals previously established using Abbott assays. Transferred reference intervals were, in general, similar to previously published CALIPER reference intervals, with some exceptions. Most of the transferred reference intervals were sex-specific and were verified using healthy reference samples from the CALIPER biobank based on CLSI criteria. It is important to note that the comparisons performed between the Abbott and Beckman Coulter assays make no assumptions as to assay accuracy or which system is more correct/accurate. The majority of CALIPER reference intervals were transferrable to Beckman Coulter AU assays, allowing the establishment of a new database of pediatric reference intervals. This further expands the utility of the CALIPER database to clinical laboratories using the AU assays; however, each laboratory should validate these intervals for their analytical platform and local population as recommended by the CLSI. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
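Transference of a reference interval through a method-comparison fit can be sketched as below; ordinary least squares stands in for the Deming or Passing-Bablok regression typically used in CLSI EP9-style comparisons, and the data and interval are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Paired measurements of one analyte on the established platform (x) and
# the new platform (y); hypothetical data with a small proportional bias.
x = rng.uniform(10, 100, 80)
y = 1.08 * x - 2.0 + rng.normal(0, 1.5, x.size)

# Method-comparison fit (ordinary least squares here; CLSI EP9 work often
# uses Deming or Passing-Bablok regression instead).
slope, intercept = np.polyfit(x, y, 1)

# Transfer the established reference interval through the fitted line.
ref_low, ref_high = 20.0, 80.0          # interval on the established platform
new_low = slope * ref_low + intercept
new_high = slope * ref_high + intercept
print(round(new_low, 1), round(new_high, 1))
```

The transferred limits should then be verified directly against healthy reference samples, as the study describes.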

  1. Predictive sensor method and apparatus

    NASA Technical Reports Server (NTRS)

    Nail, William L. (Inventor); Koger, Thomas L. (Inventor); Cambridge, Vivien (Inventor)

    1990-01-01

    A predictive algorithm is used to determine, in near real time, the steady state response of a slow responding sensor such as hydrogen gas sensor of the type which produces an output current proportional to the partial pressure of the hydrogen present. A microprocessor connected to the sensor samples the sensor output at small regular time intervals and predicts the steady state response of the sensor in response to a perturbation in the parameter being sensed, based on the beginning and end samples of the sensor output for the current sample time interval.

  2. OSL response bleaching of BeO samples, using fluorescent light and blue LEDs

    NASA Astrophysics Data System (ADS)

    Groppo, D. P.; Caldas, L. V. E.

    2016-07-01

    Optically stimulated luminescence (OSL) is widely used as a dosimetric technique for many applications. In this work, the OSL response bleaching of BeO samples was studied. The samples were irradiated using a beta radiation source (90Sr+90Y); the bleaching treatments (fluorescent light and blue LEDs) were performed, and the results were compared. Various optical treatment time intervals were tested until reaching complete bleaching of the OSL response. The best combination of time interval and bleaching type was analyzed.

  3. Commutability of Cytomegalovirus WHO International Standard in Different Matrices

    PubMed Central

    Jones, Sara; Webb, Erika M.; Barry, Catherine P.; Choi, Won S.; Abravaya, Klara B.; Schneider, George J.

    2016-01-01

    Commutability of quantitative standards allows patient results to be compared across molecular diagnostic methods and laboratories. This is critical to establishing quantitative thresholds for use in clinical decision-making. A matrix effect associated with the 1st cytomegalovirus (CMV) WHO international standard (IS) was identified using the Abbott RealTime CMV assay. A commutability study was performed to compare the CMV WHO IS and patient specimens diluted in plasma and whole blood. Patient specimens showed similar CMV DNA quantitation values regardless of the diluent or extraction procedure used. The CMV WHO IS, on the other hand, exhibited a matrix effect. The CMV concentration reported for the WHO IS diluted in plasma was within the 95% prediction interval established with patient samples. In contrast, the reported DNA concentration of the CMV WHO IS diluted in whole blood was reduced approximately 0.4 log copies/ml, and values fell outside the 95% prediction interval. Calibrating the assay by using the CMV WHO IS diluted in whole blood would introduce a bias for CMV whole-blood quantitation; samples would be reported as having higher measured concentrations, by approximately 0.4 log IU/ml. Based on the commutability study with patient samples, the RealTime CMV assay was standardized based on the CMV WHO IS diluted in plasma. A revision of the instructions for use of the CMV WHO IS should be considered to alert users of the potential impact from the diluent matrix. The identification of a matrix effect with the CMV WHO IS underscores the importance of assessing commutability of the IS in order to achieve consistent results across methods. PMID:27030491

  4. Mindfulness-based stress reduction for treating chronic headache: A systematic review and meta-analysis.

    PubMed

    Anheyer, Dennis; Leach, Matthew J; Klose, Petra; Dobos, Gustav; Cramer, Holger

    2018-01-01

    Background Mindfulness-based stress reduction/cognitive therapy are frequently used for pain-related conditions, but their effects on headache remain uncertain. This review aimed to assess the efficacy and safety of mindfulness-based stress reduction/cognitive therapy in reducing the symptoms of chronic headache. Data sources and study selection MEDLINE/PubMed, Scopus, CENTRAL, and PsycINFO were searched to 16 June 2017. Randomized controlled trials comparing mindfulness-based stress reduction/cognitive therapy with usual care or active comparators for migraine and/or tension-type headache, which assessed headache frequency, duration or intensity as a primary outcome, were eligible for inclusion. Risk of bias was assessed using the Cochrane Tool. Results Five randomized controlled trials (two on tension-type headache; one on migraine; two with mixed samples) with a total of 185 participants were included. Compared to usual care, mindfulness-based stress reduction/cognitive therapy did not improve headache frequency (three randomized controlled trials; standardized mean difference = 0.00; 95% confidence interval = -0.33, 0.32) or headache duration (three randomized controlled trials; standardized mean difference = -0.08; 95% confidence interval = -1.03, 0.87). Similarly, no significant difference between groups was found for pain intensity (five randomized controlled trials; standardized mean difference = -0.78; 95% confidence interval = -1.72, 0.16). Conclusions Due to the low number, small scale and often high or unclear risk of bias of included randomized controlled trials, the results are imprecise; this may be consistent with either an important or negligible effect. Therefore, more rigorous trials with larger sample sizes are needed.

  5. Theoretical evaluation of accuracy in position and size of brain activity obtained by near-infrared topography

    NASA Astrophysics Data System (ADS)

    Kawaguchi, Hiroshi; Hayashi, Toshiyuki; Kato, Toshinori; Okada, Eiji

    2004-06-01

    Near-infrared (NIR) topography can obtain a topographical distribution of the activated region in the brain cortex. Near-infrared light is strongly scattered in the head, and the volume of tissue sampled by a source-detector pair on the head surface is broadly distributed in the brain. This scattering effect results in poor resolution and contrast in the topographic image of the brain activity. In this study, a one-dimensional distribution of absorption change in a head model is calculated by mapping and reconstruction methods to evaluate the effect of the image reconstruction algorithm and the interval of measurement points on the accuracy of the topographic image. The light propagation in the head model is predicted by Monte Carlo simulation to obtain the spatial sensitivity profile for a source-detector pair. The measurement points are one-dimensionally arranged on the surface of the model, and the distance between adjacent measurement points is varied from 4 mm to 28 mm. Small intervals of the measurement points improve the topographic image calculated by both the mapping and reconstruction methods. In the conventional mapping method, the limit of the spatial resolution depends upon the interval of the measurement points and the spatial sensitivity profile for source-detector pairs. In the one-dimensional analysis, the reconstruction method outperforms the mapping method when the interval of measurement points is less than 12 mm. The overlap of spatial sensitivity profiles indicates that the reconstruction method may be effective in improving the spatial resolution of a two-dimensional topographic image obtained with larger intervals of measurement points. Near-infrared topography with the reconstruction method can potentially obtain an accurate distribution of absorption change in the brain even if the extent of the absorption change is less than 10 mm.

  6. Demodulator for binary-phase modulated signals having a variable clock rate

    NASA Technical Reports Server (NTRS)

    Wu, Ta Tzu (Inventor)

    1976-01-01

    Method and apparatus for demodulating binary-phase modulated signals recorded on a magnetic stripe on a card as the card is manually inserted into a card reader. Magnetic transitions are sensed as the card is read, and the time interval between immediately preceding basic transitions determines the duration of a data sampling pulse which detects the presence or absence of an intermediate transition pulse indicative of two respective logic states. The duration of the data sampling pulse is approximately 75 percent of the preceding interval between basic transitions to permit tracking succeeding time differences in basic transition intervals of up to approximately 25 percent.
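    A self-clocking decoder of this kind can be sketched from transition timestamps: a transition arriving within 75% of the previous bit interval is treated as a mid-bit transition (logic 1); otherwise the bit is a 0 and the clock estimate tracks the new interval. This is a hedged reconstruction from the abstract, not the patented circuit, and the timestamps below are hypothetical:

```python
def decode_f2f(times):
    """Decode biphase (F2F) data from a list of >= 2 transition timestamps.

    The interval between the first two transitions calibrates the bit clock;
    the 75% sampling window tolerates ~25% swipe-speed variation.
    """
    bits = []
    interval = times[1] - times[0]  # initial clock calibration
    i = 1
    while i + 1 <= len(times) - 1:
        gap = times[i + 1] - times[i]
        if gap < 0.75 * interval:
            # Transition arrived early: mid-bit transition, logic 1.
            bits.append(1)
            if i + 2 <= len(times) - 1:
                interval = times[i + 2] - times[i]  # full bit cell spans two gaps
                i += 2
            else:
                break
        else:
            # No mid-bit transition: logic 0; track the new bit interval.
            bits.append(0)
            interval = gap
            i += 1
    return bits

# Transitions at 0 and 1 s calibrate the clock; then bits 0, 1, 0 follow.
print(decode_f2f([0.0, 1.0, 2.0, 2.5, 3.0, 4.0]))
```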

  7. Effects of smoking cessation on heart rate variability among long-term male smokers.

    PubMed

    Harte, Christopher B; Meston, Cindy M

    2014-04-01

    Cigarette smoking has been shown to adversely affect heart rate variability (HRV), suggesting dysregulation of cardiac autonomic function. Conversely, smoking cessation is posited to improve cardiac regulation. The aim of the present study was to examine the effects of smoking cessation on HRV among a community sample of chronic smokers. Sixty-two healthy male smokers enrolled in an 8-week smoking cessation program involving a nicotine transdermal patch treatment. Participants were assessed at baseline (while smoking regularly), at mid-treatment (while using a high-dose patch), and at follow-up, 4 weeks after patch discontinuation. Both time-domain (standard deviation of normal-to-normal (NN) intervals (SDNN), square root of the mean squared difference of successive NN intervals (RMSSD), and percent of NN intervals for which successive heartbeat intervals differed by at least 50 ms (pNN50)) and frequency-domain (low frequency (LF), high frequency (HF), LF/HF ratio) parameters of HRV were assessed at each visit. Successful quitters (n = 20), compared to those who relapsed (n = 42), displayed significantly higher SDNN, RMSSD, pNN50, LF, and HF at follow-up, when both nicotine and smoke free. Smoking cessation significantly enhances HRV in chronic male smokers, indicating improved autonomic modulation of the heart. Results suggest that these findings may be primarily attributable to nicotine discontinuation rather than tobacco smoke discontinuation alone.

  8. Do flexible inter-injection intervals improve the effects of botulinum toxin A treatment in reducing impairment and disability in patients with spasticity?

    PubMed

    Trompetto, Carlo; Marinelli, Lucio; Mori, Laura; Puce, Luca; Pelosin, Elisa; Serrati, Carlo; Fattapposta, Francesco; Rinalduzzi, Steno; Abbruzzese, Giovanni; Currà, Antonio

    2017-05-01

    In patients treated with botulinum toxin-A (BoNT-A), toxin-directed antibody formation was related to the dosage and frequency of injections, leading to the empirical adoption of minimum time intervals between injections of 3 months or longer. However, recent data suggest that low immunogenicity of current BoNT-A preparations could allow more frequent injections. Our hypothesis is that a short time interval between injections may be safe and effective in reducing upper limb spasticity and related disability. IncobotulinumtoxinA was injected under ultrasound guidance in spastic muscles of 11 subjects, who were evaluated just before BoNT-A injection (T0), and 1 month (T1), 2 months (T2) and 4 months (T3) after injecting. At T1, in the case of persistent disability related to spasticity interfering with normal activities, patients received an additional toxin dose. Seven subjects received the additional dose at T1 because of persistent disability; 4 of them had a decrease of disability 1 month later (T2). Rethinking the injection scheme for BoNT-A treatment may have a major impact in the management of spasticity and related disability. Future studies with larger sample sizes are warranted to confirm that injection schedules with short time intervals should no longer be discouraged in clinical practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Effect of thermal maturity on remobilization of molybdenum in black shales

    NASA Astrophysics Data System (ADS)

    Ardakani, Omid H.; Chappaz, Anthony; Sanei, Hamed; Mayer, Bernhard

    2016-09-01

    Molybdenum (Mo) concentrations in sedimentary records have been widely used as a method to assess paleo-redox conditions prevailing in the ancient oceans. However, the potential effects of post-depositional processes, such as thermal maturity and burial diagenesis, on Mo concentrations in organic-rich shales have not been addressed, compromising its use as a redox proxy. This study investigates the distribution and speciation of Mo at various thermal maturities in the Upper Ordovician Utica Shale from southern Quebec, Canada. Samples display maturities ranging from the peak oil window (VRo ∼ 1%) to the dry gas zone (VRo ∼ 2%). While our data show a significant correlation between total organic carbon (TOC) and Mo (R2 = 0.40, n = 28, P < 0.0003) at lower thermal maturity, this correlation gradually deteriorates with increasing thermal maturity. Intervals within the thermally overmature section of the Utica Shale that contain elevated Mo levels (20-81 ppm) show petrographic and sulfur isotopic evidence of thermochemical sulfate reduction (TSR) along with formation of recrystallized pyrite. X-ray Absorption Fine Structure spectroscopy (XAFS) was used to determine Mo speciation in samples from intervals with elevated Mo contents (>30 ppm). Our results show the presence of two Mo species: molybdenite Mo(IV)S2 (39 ± 5%) and Mo(VI)-Organic Matter (61 ± 5%). This new evidence suggests that at higher thermal maturities, TSR causes sulfate reduction coupled with oxidation of organic matter (OM). This process is associated with H2S generation and pyrite formation and recrystallization. This in turn leads to the remobilization of Mo and co-precipitation of molybdenite with TSR-derived carbonates in the porous intervals. This could lead to alteration of the initial sedimentary signature of Mo in the affected intervals, hence challenging its use as a paleo-redox proxy in overmature black shales.

  10. Lack of effect of oral cabotegravir on the pharmacokinetics of a levonorgestrel/ethinyl oestradiol‐containing oral contraceptive in healthy adult women

    PubMed Central

    Trezza, Christine; Ford, Susan L.; Gould, Elizabeth; Lou, Yu; Huang, Chuyun; Ritter, James M.; Buchanan, Ann M.; Spreen, William

    2017-01-01

    Aims This study aimed to investigate whether cabotegravir (CAB), an integrase inhibitor in development for treatment and prevention of human immunodeficiency virus‐1, influences the pharmacokinetics (PK) of a levonorgestrel (LNG) and ethinyl oestradiol (EO)–containing oral contraceptive (OC) in healthy women. Methods In this open‐label, fixed‐sequence crossover study, healthy female subjects received LNG 0.15 mg/EO 0.03 mg tablet once daily Days 1–10 alone and with oral CAB 30 mg once daily Days 11–21. At the end of each treatment period, subjects underwent predose sampling for concentrations of follicle‐stimulating hormone, luteinizing hormone, and progesterone and serial PK sampling for plasma LNG, EO, and CAB concentrations. Results Twenty women were enrolled, and 19 completed the study. One subject was withdrawn due to an adverse event unrelated to study medications. Geometric least squares mean ratios (90% confidence interval) of LNG + CAB vs. LNG alone for LNG area under the plasma concentration–time curve over the dosing interval of duration τ and maximum observed plasma concentration were 1.12 (1.07–1.18) and 1.05 (0.96–1.15), respectively. Geometric least squares mean ratio (90% confidence interval) of EO + CAB vs. EO alone for EO area under the plasma concentration–time curve over the dosing interval of duration τ and maximum observed plasma concentration were 1.02 (0.97–1.08) and 0.92 (0.83–1.03), respectively. Steady‐state CAB PK parameters were comparable to historical values. There was no apparent difference in mean luteinizing hormone, follicle‐stimulating hormone, and progesterone concentrations between periods. No clinically significant trends in laboratory values, vital signs, or electrocardiography values were observed. Conclusions Repeat doses of oral CAB had no significant effect on LNG/EO PK or pharmacodynamics, which supports CAB coadministration with LNG/EO OCs in clinical practice. PMID:28087972

  11. Quantile regression models of animal habitat relationships

    USGS Publications Warehouse

    Cade, Brian S.

    2003-01-01

    Typically, all factors that limit an organism are not measured and included in statistical models used to investigate relationships with their environment. If important unmeasured variables interact multiplicatively with the measured variables, the statistical models often will have heterogeneous response distributions with unequal variances. Quantile regression is an approach for estimating the conditional quantiles of a response variable distribution in the linear model, providing a more complete view of possible causal relationships between variables in ecological processes. Chapter 1 introduces quantile regression and discusses the ordering characteristics, interval nature, sampling variation, weighting, and interpretation of estimates for homogeneous and heterogeneous regression models. Chapter 2 evaluates performance of quantile rankscore tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). A permutation F test maintained better Type I errors than the Chi-square T test for models with smaller n, greater number of parameters p, and more extreme quantiles τ. Both versions of the test required weighting to maintain correct Type I errors when there was heterogeneity under the alternative model. An example application related trout densities to stream channel width:depth. Chapter 3 evaluates a drop in dispersion, F-ratio like permutation test for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). Chapter 4 simulates from a large (N = 10,000) finite population representing grid areas on a landscape to demonstrate various forms of hidden bias that might occur when the effect of a measured habitat variable on some animal was confounded with the effect of another unmeasured variable (spatially and not spatially structured). 
Depending on whether interactions of the measured habitat and unmeasured variable were negative (interference interactions) or positive (facilitation interactions), either upper (τ > 0.5) or lower (τ < 0.5) quantile regression parameters were less biased than mean rate parameters. Sampling (n = 20 - 300) simulations demonstrated that confidence intervals constructed by inverting rankscore tests provided valid coverage of these biased parameters. Quantile regression was used to estimate effects of physical habitat resources on a bivalve mussel (Macomona liliana) in a New Zealand harbor by modeling the spatial trend surface as a cubic polynomial of location coordinates.
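    The quantile-regression estimates discussed in the chapters above minimize an asymmetric "check" (pinball) loss; for an intercept-only model this reduces to the τ-th sample quantile, which a few lines of code can verify:

```python
def pinball_loss(tau, y, q):
    # Check (pinball) loss rho_tau summed over the sample. Observations
    # above q are penalized with weight tau, observations below with
    # weight (1 - tau); the minimizer in q is the tau-th sample quantile.
    return sum((tau - (yi < q)) * (yi - q) for yi in y)

# Brute-force minimization over the observed values recovers the median
# for tau = 0.5; quantile regression generalizes this to covariates.
y = [1, 2, 3, 4, 5]
best = min(y, key=lambda q: pinball_loss(0.5, y, q))
print(best)
```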

  12. Normative data for a computer-assisted version of the auditory three-consonant Brown-Peterson paradigm in the elderly French-Quebec population.

    PubMed

    Callahan, Brandy L; Belleville, Sylvie; Ferland, Guylaine; Potvin, Olivier; Tremblay, Marie-Pier; Hudon, Carol; Macoir, Joël

    2014-01-01

    The Brown-Peterson task is used to assess verbal short-term memory as well as divided attention. In its auditory three-consonant version, trigrams are presented to participants who must recall the items in correct order after variable delays, during which an interference task is performed. The present study aimed to establish normative data for this test in the elderly French-Quebec population based on cross-sectional data from a retrospective, multi-center convenience sample. A total of 595 elderly native French-speakers from the province of Quebec performed the Memoria version of the auditory three-consonant Brown-Peterson test. For both series and item-by-item scoring methods, age, education, and, in most cases, recall after a 0-second interval were found to be significantly associated with recall performance after 10-second, 20-second, and 30-second interference intervals. Based on regression model results, equations to calculate Z scores are presented for the 10-second, 20-second and 30-second intervals and for each scoring method to allow estimation of expected performance based on participants' individual characteristics. As an important ceiling effect was observed at the 0-second interval, norms for this interference interval are presented in percentiles.

  13. Degradation of Insecticides in Poultry Manure: Determining the Insecticidal Treatment Interval for Managing House Fly (Diptera: Muscidae) Populations in Poultry Farms.

    PubMed

    Ong, Song-Quan; Ab Majid, Abdul Hafiz; Ahmad, Hamdan

    2016-04-01

    It is crucial to understand the degradation pattern of insecticides when designing a sustainable control program for the house fly, Musca domestica (L.), on poultry farms. The aim of this study was to determine the half-life and degradation rates of cyromazine, chlorpyrifos, and cypermethrin by spiking these insecticides into poultry manure, and then quantitatively analyzing the insecticide residue using ultra-performance liquid chromatography. The insecticides were then tested in the field to determine appropriate insecticidal treatment intervals. Bioassays of manure samples collected at 3, 7, 10, and 15 d assessed bio-efficacy against susceptible house fly larvae. Degradation analysis demonstrated that cyromazine has the shortest half-life (3.01 d) compared with chlorpyrifos (4.36 d) and cypermethrin (3.75 d). Cyromazine also had a significantly greater degradation rate compared with chlorpyrifos and cypermethrin. For the field insecticidal treatment interval study, an interval of 10 d was determined for cyromazine due to its significantly lower residue; for ChCy (a mixture of chlorpyrifos and cypermethrin), the suggested interval was 7 d. Future work should focus on the effects of insecticide metabolites on targeted pests and the poultry manure environment.
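    The reported half-lives imply first-order decay, so the residue remaining after any candidate treatment interval follows directly from C(t)/C0 = exp(-t · ln 2 / t½). A small sketch using the half-lives from the abstract (the first-order decay model is the standard assumption, not a claim from the paper):

```python
import math

def residue_fraction(t_days, half_life_days):
    # First-order decay: C(t) / C0 = exp(-k * t), with k = ln(2) / t_half.
    k = math.log(2) / half_life_days
    return math.exp(-k * t_days)

# Half-lives reported in the abstract (days).
for name, t_half in [("cyromazine", 3.01),
                     ("chlorpyrifos", 4.36),
                     ("cypermethrin", 3.75)]:
    print(f"{name}: {residue_fraction(10, t_half):.1%} remaining after 10 d")
```

After the 10-day interval suggested for cyromazine, roughly a tenth of the original dose remains; the shorter 7-day interval for ChCy reflects the slower decay of chlorpyrifos.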

  14. Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap

    ERIC Educational Resources Information Center

    Calzada, Maria E.; Gardner, Holly

    2011-01-01

    The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the Student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
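    The two interval constructions being compared can be sketched side by side: the Student's t interval uses a closed-form margin of error, while the percentile bootstrap resamples the data with replacement and reads off empirical quantiles. A minimal sketch (the critical value 2.262 used in the example is the two-sided 95% t value for 9 degrees of freedom):

```python
import random
import statistics as stats

def t_interval(data, t_crit):
    # Student's t interval for the mean: mean +/- t * s / sqrt(n).
    n = len(data)
    m = stats.mean(data)
    half = t_crit * stats.stdev(data) / n ** 0.5
    return m - half, m + half

def bootstrap_interval(data, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap: resample with replacement, take the
    # empirical alpha/2 and 1 - alpha/2 quantiles of the resampled means.
    rng = random.Random(seed)
    means = sorted(stats.mean(rng.choices(data, k=len(data)))
                   for _ in range(n_boot))
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0, 3.0, 6.0]
print(t_interval(data, 2.262))
print(bootstrap_interval(data))
```

Repeating this over many simulated samples and counting how often each interval covers the true mean is exactly the kind of coverage study the abstract describes.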

  15. Optimal estimation of suspended-sediment concentrations in streams

    USGS Publications Warehouse

    Holtschlag, D.J.

    2001-01-01

    Optimal estimators are developed for computation of suspended-sediment concentrations in streams. The estimators are a function of parameters, computed by use of generalized least squares, which simultaneously account for effects of streamflow, seasonal variations in average sediment concentrations, a dynamic error component, and the uncertainty in concentration measurements. The parameters are used in a Kalman filter for on-line estimation and an associated smoother for off-line estimation of suspended-sediment concentrations. The accuracies of the optimal estimators are compared with alternative time-averaging interpolators and flow-weighting regression estimators by use of long-term daily-mean suspended-sediment concentration and streamflow data from 10 sites within the United States. For sampling intervals from 3 to 48 days, the standard errors of on-line and off-line optimal estimators ranged from 52.7 to 107%, and from 39.5 to 93.0%, respectively. The corresponding standard errors of linear and cubic-spline interpolators ranged from 48.8 to 158%, and from 50.6 to 176%, respectively. The standard errors of simple and multiple regression estimators, which did not vary with the sampling interval, were 124 and 105%, respectively. Thus, the optimal off-line estimator (Kalman smoother) had the lowest error characteristics of those evaluated. Because suspended-sediment concentrations are typically measured at less than 3-day intervals, use of optimal estimators will likely result in significant improvements in the accuracy of continuous suspended-sediment concentration records. Additional research on the integration of direct suspended-sediment concentration measurements and optimal estimators applied at hourly or shorter intervals is needed.
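    The on-line estimator described above can be illustrated with a scalar random-walk Kalman filter that tolerates days with no sample; the noise variances and observations below are hypothetical stand-ins, not the generalized-least-squares parameters fitted in the study:

```python
def kalman_filter(obs, q, r, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter: x_k = x_{k-1} + w, y_k = x_k + v.

    obs entries may be None (no sample that day); q and r are the process
    and measurement noise variances. Returns filtered means and variances.
    """
    x, p = x0, p0
    xs, ps = [], []
    for y in obs:
        # Predict: the random-walk model only inflates the variance.
        p = p + q
        # Update: skip on days with no concentration measurement.
        if y is not None:
            k = p / (p + r)
            x = x + k * (y - x)
            p = (1 - k) * p
        xs.append(x)
        ps.append(p)
    return xs, ps

# Daily series with a 1-day gap (None): the filter coasts through it.
xs, ps = kalman_filter([1.0, 1.2, None, 1.1, 1.0], q=0.01, r=0.1)
print(xs[-1])
```

The off-line (smoother) variant would add a backward Rauch-Tung-Striebel pass over `xs` and `ps`, which is how the abstract's Kalman smoother achieves its lower standard errors.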

  16. Effect of Multiple Alloying Elements on the Glass-Forming Ability, Thermal Stability, and Crystallization Behavior of Zr-Based Alloys

    NASA Astrophysics Data System (ADS)

    Bazlov, A. I.; Tsarkov, A. A.; Ketov, S. V.; Suryanarayana, C.; Louzguine-Luzgin, D. V.

    2018-02-01

    The effects of multiple alloying elements on the glass-forming ability, thermal stability, and crystallization behavior of Zr-based glass-forming alloys were studied in the present work. We investigated the effect of complete or partial substitution of Ti and Ni with similar early and late transition metals, respectively, on the glass-forming ability and crystallization behavior of the Zr50Ti10Cu20Ni10Al10 alloy. Poor correlation was observed between the different parameters indicating the glass-forming ability and the critical size of the obtained glassy samples. The importance of the width of the crystallization interval is emphasized. The kinetics of primary crystallization, i.e., the rate of nucleation and rate of growth of the nuclei of primary crystals, is very different from that of the eutectic alloys. Thus, it is difficult to estimate the glass-forming ability on the basis of the empirical parameters alone, without taking into account the crystallization behavior and the crystallization interval.

  17. Effects of macro- and micronutrients on exercise-induced hepcidin response in highly trained endurance athletes.

    PubMed

    Dahlquist, Dylan T; Stellingwerff, Trent; Dieter, Brad P; McKenzie, Donald C; Koehle, Michael S

    2017-10-01

    Iron deficiency has ergolytic effects on athletic performance. Exercise-induced inflammation impedes iron absorption in the digestive tract by upregulating the expression of the iron regulatory protein, hepcidin. Limited research indicates the potential of specific macro- and micronutrients in blunting exercise-induced hepcidin. Therefore, we investigated the effects of postexercise supplementation with protein and carbohydrate (CHO) and vitamins D3 and K2 on the postexercise hepcidin response. Ten highly trained male cyclists (age: 26.9 ± 6.4 years; maximal oxygen uptake: 67.4 ± 4.4 mL·kg⁻¹·min⁻¹) completed 4 cycling sessions in a randomized, placebo-controlled, single-blinded, triple-crossover study. Experimental days consisted of an 8-min warm-up at 50% power output at maximal oxygen uptake, followed by 8 × 3-min intervals at 85% power output at maximal oxygen uptake with 1.5 min at 60% power output at maximal oxygen uptake between each interval. Blood samples were collected pre- and postexercise, and at 3 h postexercise. Three different drinks consisting of CHO (75 g) and protein (25 g) with (VPRO) or without (PRO) vitamins D3 (5000 IU) and K2 (1000 μg), or a zero-calorie control drink (PLA) were consumed immediately after the postexercise blood sample. Results showed that the postexercise drinks had no significant (p ≥ 0.05) effect on any biomarker measured. There was a significant (p < 0.05) increase in hepcidin and interleukin-6 following intense cycling intervals in the participants. Hepcidin increased significantly (p < 0.05) from baseline (nmol·L⁻¹: 9.94 ± 8.93, 14.18 ± 14.90, 10.44 ± 14.62) to 3 h postexercise (nmol·L⁻¹: 22.27 ± 13.41, 25.44 ± 11.91, 22.57 ± 15.57) in VPRO, PRO, and PLA, respectively. Contrary to our hypothesis, the drink compositions used did not blunt the postexercise hepcidin response in highly trained athletes.

  18. Neither fixed nor random: weighted least squares meta-analysis.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2015-06-15

    This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
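    The unrestricted weighted least squares average can be sketched directly: it shares the fixed-effect point estimate (inverse-variance weighted mean) but rescales the standard error by the estimated root mean square error instead of fixing it at 1. A minimal sketch, not the authors' exact implementation:

```python
def uwls(effects, ses):
    """Unrestricted WLS weighted average of k >= 2 effect sizes.

    effects: estimated effect sizes; ses: their standard errors.
    Returns (point estimate, standard error). The point estimate equals
    the fixed-effect weighted mean; the standard error is multiplied by
    the root mean square error, which fixed-effect analysis pins at 1.
    """
    k = len(effects)
    w = [1.0 / s ** 2 for s in ses]
    est = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Weighted residual mean square with k - 1 degrees of freedom.
    mse = sum(wi * (yi - est) ** 2 for wi, yi in zip(w, effects)) / (k - 1)
    se = (mse / sum(w)) ** 0.5
    return est, se

print(uwls([0.2, 0.5, 0.3, 0.4], [0.1, 0.2, 0.1, 0.15]))
```

When the effects are homogeneous the MSE estimate is near 1 and the result coincides with fixed-effect meta-analysis; excess heterogeneity inflates the MSE and hence the interval, without the separate between-study variance component of random effects.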

  19. Effects of desiccation on the recalcitrant seeds of Carapa guianensis Aubl. and Carapa procera DC

    Treesearch

    Kristina F. Connor; I. D. Kossmann Ferraz; F.T. Bonner; John A. Vozzo

    1998-01-01

    This study was undertaken to determine if the seeds of Carapa guianensis Aubl. and Carapa procera DC. undergo physiological, biochemical, and ultrastructural changes when they are desiccated; and to find if these changes can be used to monitor viability in Carapa. Seeds were air-dried at room temperature for 7 to 11 days. Samples were taken at frequent intervals and...

  20. INNOVATIVE TECHNOLOGY EVALUATION REPORT ...

    EPA Pesticide Factsheets

    The Russian Peat Borer designed and fabricated by Aquatic Research Instruments was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation Program in April and May 1999 at sites in EPA Regions 1 and 5, respectively. In addition to assessing ease of sampler operation, key objectives of the demonstration included evaluating the sampler's ability to (1) consistently collect a given volume of sediment, (2) consistently collect sediment in a given depth interval, (3) collect samples with consistent characteristics from a homogenous layer of sediment, and (4) collect samples under a variety of site conditions. This report describes the demonstration results for the Russian Peat Borer and two conventional samplers (the Hand Corer and Vibrocorer) used as reference samplers. During the demonstration, the Russian Peat Borer was the only sampler that collected samples in the deep depth interval (4 to 11 feet below sediment surface). It collected representative and relatively uncompressed core samples of consolidated sediment in discrete depth intervals. The reference samplers collected relatively compressed samples of both consolidated and unconsolidated sediments from the sediment surface downward; sample representativeness may be questionable because of core shortening and core compression. Sediment stratification was preserved only for consolidated sediment samples collected by the Russian Peat Borer but for bo

  1. Usability of Immunohistochemistry in Forensic Samples With Varying Decomposition.

    PubMed

    Lesnikova, Iana; Schreckenbach, Marc Niclas; Kristensen, Maria Pihlmann; Papanikolaou, Liv Lindegaard; Hamilton-Dutoit, Stephen

    2018-05-24

    Immunohistochemistry (IHC) is an important diagnostic tool in anatomic and surgical pathology but is used less frequently in forensic pathology. Degradation of tissue because of postmortem decomposition is believed to be a major limiting factor, although it is unclear what impact such degradation actually has on IHC staining validity. This study included 120 forensic autopsy samples of liver, lung, and brain tissues obtained for diagnostic purposes. The time from death to autopsy ranged between 1 and more than 14 days. Samples were prepared using the tissue microarray technique. The antibodies chosen for the study included KL1 (for staining bile duct epithelium), S100 (for staining glial cells and myelin), vimentin (for endothelial cells in cerebral blood vessels), and CD45 (for pulmonary lymphocytes). Slides were evaluated by light microscopy. Immunohistochemistry reactions were scored according to a system based on the extent and intensity of the positive stain. An overall correlation between the postmortem interval and the IHC score for all tissue samples was found. Samples from decedents with a postmortem interval of 1 to 3 days showed positive staining with all antibodies, whereas samples from decedents with a longer postmortem interval showed decreased staining rates. Our results suggest that IHC analysis can be successfully used for postmortem diagnosis in a range of autopsy samples showing lesser degrees of decomposition.

  2. Comparison of Techniques for Sampling Adult Necrophilous Insects From Pig Carcasses.

    PubMed

    Cruise, Angela; Hatano, Eduardo; Watson, David W; Schal, Coby

    2018-02-06

    Studies of the pre-colonization interval and mechanisms driving necrophilous insect ecological succession depend on effective sampling of adult insects and knowledge of their diel and successional activity patterns. The number of insects trapped, their diversity, and diel periodicity were compared with four sampling methods on neonate pigs. Sampling method, time of day, and decomposition age of the pigs significantly affected the number of insects sampled from pigs. We also found significant interactions between sampling method and decomposition day, and between time of sampling and decomposition day. No single method was superior to the other methods during all three decomposition days. Sampling times after noon yielded the largest samples during the first 2 d of decomposition. On day 3 of decomposition, however, all sampling times were equally effective. Therefore, to maximize insect collections from neonate pigs, the sampling method must vary by decomposition day. The suction trap collected the most species-rich samples, but sticky trap samples were the most diverse when both species richness and evenness were factored into a Shannon diversity index. Repeated sampling during the noon to 18:00 hours period was most effective for obtaining the maximum diversity of trapped insects. The integration of multiple sampling techniques would most effectively sample the necrophilous insect community. However, because all four tested methods were deficient at sampling beetle species, future work should focus on optimizing the most promising methods, alone or in combination, and incorporate hand-collections of beetles.

  3. Time interval measurement device based on surface acoustic wave filter excitation, providing 1 ps precision and stability.

    PubMed

    Panek, Petr; Prochazka, Ivan

    2007-09-01

    This article deals with a time interval measurement device based on a surface acoustic wave (SAW) filter as a time interpolator. The operating principle is based on the fact that a transversal SAW filter excited by a short pulse can generate a finite signal with highly suppressed spectra outside a narrow frequency band. If the responses to two excitations are sampled at clock ticks, they can be precisely reconstructed from a finite number of samples and then compared so as to determine the time interval between the two excitations. We have designed and constructed a two-channel time interval measurement device which allows independent timing of two events and evaluation of the time interval between them. The device has been constructed using commercially available components. The experimental results proved the concept. We have assessed a single-shot time interval measurement precision of 1.3 ps rms, which corresponds to a time-of-arrival precision of 0.9 ps rms in each channel. The temperature drift of the measured time interval is lower than 0.5 ps/K, and the long-term stability is better than +/-0.2 ps/h. These are, to our knowledge, the best values reported for a time interval measurement device. The results are in good agreement with the error budget based on the theoretical analysis.
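    The reconstruct-and-compare step can be illustrated with a cross-correlation delay estimator (a minimal Python sketch, not the authors' SAW instrument; the sample rate, center frequency, and Gaussian envelope are invented stand-ins for the band-limited filter response):

```python
import numpy as np

def estimate_delay(sig_a, sig_b, dt):
    """Delay of sig_b relative to sig_a with sub-sample resolution,
    via parabolic interpolation of the cross-correlation peak."""
    xc = np.correlate(sig_b, sig_a, mode="full")
    k = int(np.argmax(xc))
    y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)   # 3-point vertex offset
    return (k - (len(sig_a) - 1) + frac) * dt

# Band-limited test signal: a Gaussian-windowed tone standing in for the
# finite response of a SAW filter to a short excitation pulse.
fs = 1.0e9                        # 1 GS/s sampling clock (assumed)
dt = 1.0 / fs
t = np.arange(4096) * dt
f0, sigma = 70e6, 30e-9           # assumed center frequency / envelope width

def pulse(t0):
    return np.exp(-(((t - t0) / sigma) ** 2)) * np.cos(2 * np.pi * f0 * (t - t0))

true_delay = 123.4e-9             # 123.4 ns between the two excitations
est = estimate_delay(pulse(1.0e-6), pulse(1.0e-6 + true_delay), dt)
print(abs(est - true_delay))      # residual far below the 1 ns sample spacing
```

    Because both responses are band-limited, the correlation peak varies smoothly between clock ticks, which is what makes sub-sample (here picosecond-scale) interpolation possible at all.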

  4. Upfront dilution of ferritin samples to reduce hook effect, improve turnaround time and reduce costs.

    PubMed

    Wu, Shu Juan; Hayden, Joshua A

    2018-02-15

    Sandwich immunoassays offer advantages in the clinical laboratory but can yield erroneously low results due to hook (prozone) effect, especially with analytes whose concentrations span several orders of magnitude, such as ferritin. This study investigated a new approach to reduce the likelihood of hook effect in ferritin immunoassays by performing upfront, five-fold dilutions of all samples for ferritin analysis. The impact of this change on turnaround time and costs was also investigated. Ferritin concentrations were analysed in routine clinical practice with and without upfront dilutions on Siemens Centaur® XP (Siemens Healthineers, Erlangen, Germany) immunoanalysers. In addition, one month of baseline data (1026 results) were collected prior to implementing upfront dilutions and one month of data (1033 results) were collected after implementation. Without upfront dilutions, hook effect was observed in samples with ferritin concentrations as low as 86,028 µg/L. With upfront dilutions, samples with ferritin concentrations as high as 126,050 µg/L yielded values greater than the measurement interval and would have been diluted until an accurate value was obtained. The implementation of upfront dilution of ferritin samples led to a decrease in turnaround time from a median of 2 hours and 3 minutes to 1 hour and 18 minutes (P = 0.002). Implementation of upfront dilutions of all ferritin samples reduced the possibility of hook effect, improved turnaround time and saved the cost of performing additional dilutions.
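    The reporting logic behind upfront dilution can be sketched as follows (illustrative only; the function name and the measurement-interval limit of 1650 µg/L are hypothetical, and only the five-fold dilution factor comes from the abstract):

```python
def report_ferritin(raw_reading, dilution=5, upper_limit=1650.0):
    """Toy reporting rule for an upfront-diluted ferritin sample.

    raw_reading is the analyser's value for the diluted specimen (µg/L);
    the reportable result is that reading scaled back up by the dilution
    factor, unless the diluted specimen itself still exceeds the
    measurement interval, in which case a higher dilution is needed.
    The upper_limit figure is invented, not an instrument specification.
    """
    if raw_reading > upper_limit:
        return None          # flag: repeat at a higher dilution
    return raw_reading * dilution

print(report_ferritin(300.0))    # reportable: 300 * 5 = 1500.0 µg/L
print(report_ferritin(2000.0))   # still above the interval -> None
```

    The point of the scheme is that a five-fold dilution shifts the hook region five-fold higher, so fewer genuinely elevated samples fall into the falsely-low zone.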

  5. Stabilization for sampled-data neural-network-based control systems.

    PubMed

    Zhu, Xun-Lin; Wang, Youyi

    2011-02-01

    This paper studies the problem of stabilization for sampled-data neural-network-based control systems with an optimal guaranteed cost. Unlike previous works, the resulting closed-loop system with variable uncertain sampling cannot simply be regarded as an ordinary continuous-time system with a fast-varying delay in the state. By defining a novel piecewise Lyapunov functional and using a convex combination technique, the characteristic of sampled-data systems is captured. A new delay-dependent stabilization criterion is established in terms of linear matrix inequalities such that the maximal sampling interval and the minimal guaranteed cost control performance can be obtained. It is shown that the newly proposed approach can lead to less conservative and less complex results than the existing ones. Application examples are given to illustrate the effectiveness and the benefits of the proposed method.

  6. Work stress, caregiving, and allostatic load: prospective results from the Whitehall II cohort study.

    PubMed

    Dich, Nadya; Lange, Theis; Head, Jenny; Rod, Naja Hulvej

    2015-06-01

    Studies investigating health effects of work and family stress usually consider these factors in isolation. The present study investigated prospective interactive effects of job strain and informal caregiving on allostatic load (AL), a multisystem indicator of physiological dysregulation. Participants were 7007 British civil servants from the Whitehall II cohort study. Phase 3 (1991-1994) served as the baseline, and Phases 5 (1997-1999) and 7 (2002-2004) served as follow-ups. Job strain (high job demands combined with low control) and caregiving (providing care to aged or disabled relatives) were assessed at baseline. AL index (possible range, 0-9) was assessed at baseline and both follow-ups based on nine cardiovascular, metabolic, and immune biomarkers. Linear mixed-effect models were used to examine the association of job strain and caregiving with AL. High caregiving burden (above the sample median weekly hours of providing care) predicted higher AL levels, with the effect strongest in those also reporting job strain (b = 0.36, 95% confidence interval = 0.01-0.71); however, the interaction between job strain and caregiving was not significant (p = .56). Regardless of job strain, participants with low caregiving burden (below sample median) had lower subsequent AL levels than did non-caregivers (b = -0.22, 95% confidence interval = -0.37 to -0.06). The study provides some evidence for adverse effects of stress at work combined with family demands on physiological functioning. However, providing care to others may also have health protective effects if it does not involve excessive time commitment.

  7. Relationship research between meteorological disasters and stock markets based on a multifractal detrending moving average algorithm

    NASA Astrophysics Data System (ADS)

    Li, Qingchen; Cao, Guangxi; Xu, Wei

    2018-01-01

    Based on a multifractal detrending moving average algorithm (MFDMA), this study uses the fractionally autoregressive integrated moving average process (ARFIMA) to demonstrate the effectiveness of MFDMA in the detection of auto-correlation at different sample lengths and to simulate some artificial time series with the same length as the actual sample interval. We analyze the effect of predictable and unpredictable meteorological disasters on the US and Chinese stock markets and the degree of long memory in different sectors. Furthermore, we conduct a preliminary investigation to determine whether the fluctuations of financial markets caused by meteorological disasters are derived from the normal evolution of the financial system itself or not. We also propose several reasonable recommendations.

  8. Soil moisture determination study. [Guymon, Oklahoma

    NASA Technical Reports Server (NTRS)

    Blanchard, B. J.

    1979-01-01

    Soil moisture data collected in conjunction with aircraft sensor and SEASAT SAR data taken near Guymon, Oklahoma are summarized. In order to minimize the effects of vegetation and roughness three bare and uniformly smooth fields were sampled 6 times at three day intervals on the flight days from August 2 through 17. Two fields remained unirrigated and dry. A similar pair of fields was irrigated at different times during the sample period. In addition, eighteen other fields were sampled on the nonflight days with no field being sampled more than 24 hours from a flight time. The aircraft sensors used included either black and white or color infrared photography, L and C band passive microwave radiometers, the 13.3, 4.75, 1.6 and .4 GHz scatterometers, the 11 channel modular microwave scanner, and the PRT5.

  9. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR

    PubMed Central

    Mobli, Mehdi; Hoch, Jeffrey C.

    2017-01-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. PMID:25456315
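    The Nyquist condition for uniform sampling described above can be demonstrated numerically (a generic illustration, not NMR data; all signal parameters are invented):

```python
import numpy as np

def dft_peak_freq(x, dt):
    """Frequency (Hz) of the strongest bin in the real-input DFT."""
    spec = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(len(x), dt)[np.argmax(spec)]

f0, n = 400.0, 1024                  # a single 400 Hz spectral line

# Sampling interval small enough (fs = 2 kHz > 2*f0): line appears correctly.
dt_ok = 1 / 2000.0
f_ok = dft_peak_freq(np.cos(2 * np.pi * f0 * np.arange(n) * dt_ok), dt_ok)

# Interval too large (fs = 600 Hz < 2*f0): the line aliases to 600 - 400 = 200 Hz.
dt_bad = 1 / 600.0
f_bad = dft_peak_freq(np.cos(2 * np.pi * f0 * np.arange(n) * dt_bad), dt_bad)

print(f_ok, f_bad)                   # ~400 Hz vs ~200 Hz (aliased)
```

    The aliased line is indistinguishable from a genuine 200 Hz component, which is exactly why uniform sampling must respect the Nyquist interval while nonuniform schemes trade that constraint for different (incoherent) sampling artifacts.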

  10. EXACT DISTRIBUTIONS OF INTRACLASS CORRELATION AND CRONBACH'S ALPHA WITH GAUSSIAN DATA AND GENERAL COVARIANCE.

    PubMed

    Kistner, Emily O; Muller, Keith E

    2004-09-01

    Intraclass correlation and Cronbach's alpha are widely used to describe reliability of tests and measurements. Even with Gaussian data, exact distributions are known only for compound symmetric covariance (equal variances and equal correlations). Recently, large sample Gaussian approximations were derived for the distribution functions. New exact results allow calculating the exact distribution function and other properties of intraclass correlation and Cronbach's alpha, for Gaussian data with any covariance pattern, not just compound symmetry. Probabilities are computed in terms of the distribution function of a weighted sum of independent chi-square random variables. New F approximations for the distribution functions of intraclass correlation and Cronbach's alpha are much simpler and faster to compute than the exact forms. Assuming the covariance matrix is known, the approximations typically provide sufficient accuracy, even with as few as ten observations. Either the exact or approximate distributions may be used to create confidence intervals around an estimate of reliability. Monte Carlo simulations led to a number of conclusions. Correctly assuming that the covariance matrix is compound symmetric leads to accurate confidence intervals, as was expected from previously known results. However, assuming and estimating a general covariance matrix produces somewhat optimistically narrow confidence intervals with 10 observations. Increasing sample size to 100 gives essentially unbiased coverage. Incorrectly assuming compound symmetry leads to pessimistically large confidence intervals, with pessimism increasing with sample size. In contrast, incorrectly assuming general covariance introduces only a modest optimistic bias in small samples. Hence the new methods seem preferable for creating confidence intervals, except when compound symmetry definitely holds.
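    For context, here is a minimal sketch of Cronbach's alpha with the classical Feldt F-based confidence interval, which assumes compound symmetry (this is the older approach the abstract improves upon, not the authors' new exact or F approximations; the simulated data are invented):

```python
import numpy as np
from scipy.stats import f as f_dist

def cronbach_alpha(scores):
    """Cronbach's alpha for a (n_subjects, k_items) score matrix."""
    n, k = scores.shape
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def feldt_ci(scores, level=0.95):
    """Feldt's F-based CI for alpha (valid under compound symmetry)."""
    n, k = scores.shape
    a = cronbach_alpha(scores)
    df1, df2 = n - 1, (n - 1) * (k - 1)
    g = 1 - level
    lower = 1 - (1 - a) * f_dist.ppf(1 - g / 2, df1, df2)
    upper = 1 - (1 - a) * f_dist.ppf(g / 2, df1, df2)
    return a, lower, upper

# Simulated compound-symmetric data: one shared true score plus item noise.
rng = np.random.default_rng(0)
n, k = 100, 5
latent = rng.normal(size=(n, 1))
items = latent + rng.normal(scale=1.0, size=(n, k))

a, lo, hi = feldt_ci(items)
print(round(a, 3), round(lo, 3), round(hi, 3))
```

    With equal item variances and correlations of 0.5, the population alpha here is 5(0.5)/(1 + 4(0.5)) ≈ 0.83, and the F interval brackets it; under a general covariance pattern, the abstract's weighted-chi-square results would be needed instead.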

  11. Population-Based Pediatric Reference Intervals in General Clinical Chemistry: A Swedish Survey.

    PubMed

    Ridefelt, Peter

    2015-01-01

    Very few high quality studies on pediatric reference intervals for general clinical chemistry and hematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The Swedish survey included 701 healthy children. Reference intervals for general clinical chemistry and hematology were defined.

  12. Measurement of trained speech patterns in stuttering: interjudge and intrajudge agreement of experts by means of modified time-interval analysis.

    PubMed

    Alpermann, Anke; Huber, Walter; Natke, Ulrich; Willmes, Klaus

    2010-09-01

    Improved fluency after stuttering therapy is usually measured by the percentage of stuttered syllables. However, outcome studies rarely evaluate the use of trained speech patterns that speakers use to manage stuttering. This study investigated whether the modified time interval analysis can distinguish between trained speech patterns, fluent speech, and stuttered speech. Seventeen German experts on stuttering judged a speech sample on two occasions. Speakers of the sample were stuttering adults, who were not undergoing therapy, as well as participants in a fluency shaping and a stuttering modification therapy. Results showed satisfactory inter-judge and intra-judge agreement above 80%. Intervals with trained speech patterns were identified as consistently as stuttered and fluent intervals. We discuss limitations of the study, as well as implications of our findings for the development of training for identification of trained speech patterns and future outcome studies. The reader will be able to (a) explain different methods to measure the use of trained speech patterns, (b) evaluate whether German experts are able to discriminate intervals with trained speech patterns reliably from fluent and stuttered intervals and (c) describe how the measurement of trained speech patterns can contribute to outcome studies.

  13. a New Approach for Accuracy Improvement of Pulsed LIDAR Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, W.; Zhou, X.; He, C.; Li, X.; Huang, Y.; Zhang, L.

    2018-05-01

    In remote sensing applications, the accuracy of time interval measurement is one of the most important parameters that affect the quality of pulsed lidar data. The traditional time interval measurement technique has the disadvantages of low measurement accuracy, complicated circuit structure and large error. High-precision time interval data cannot be obtained with these traditional methods. In order to obtain higher-quality remote sensing cloud images based on the time interval measurement, a higher-accuracy time interval measurement method is proposed. The method is based on charging a capacitor while simultaneously sampling the change in capacitor voltage. Firstly, an approximate model of the capacitor voltage curve during the pulse time of flight is fitted to the sampled data. Then, the whole charging time is obtained from the fitting function. In this method, only a high-speed A/D sampler and a capacitor are required in a single receiving channel, and the collected data is processed directly in the main control unit. The experimental results show that the proposed method achieves an error of less than 3 ps. Compared with other methods, the proposed method improves the time interval accuracy by at least 20%.
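    The charge-sample-and-fit idea can be sketched as a simulation (illustrative assumptions throughout: the time constant, record length, and noise level are not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

RC = 50.0                        # assumed charging time constant, ns

def v_cap(t, t0, v_s):
    """Capacitor charging curve: 0 before t0, RC rise afterwards (t in ns)."""
    return np.where(t >= t0, v_s * (1.0 - np.exp(-(t - t0) / RC)), 0.0)

# Simulated 1 GS/s ADC record of the capacitor voltage, plus readout noise.
t = np.arange(200.0)             # one sample per ns
true_t0, true_vs = 37.3, 1.0     # charging starts 37.3 ns into the record
rng = np.random.default_rng(1)
v = v_cap(t, true_t0, true_vs) + rng.normal(scale=2e-3, size=t.size)

# Fit the model to the samples; the recovered start time t0 locates the
# pulse with resolution far finer than the 1 ns sampling grid.
popt, _ = curve_fit(v_cap, t, v, p0=[30.0, 0.9])
t0_err = abs(popt[0] - true_t0)
print(t0_err)                    # small fraction of a sample interval
```

    The design point is that the slowly varying analog charging curve, oversampled and fitted, interpolates the event time between ADC clock ticks, which is how picosecond-scale resolution can come from nanosecond-scale sampling.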

  14. Effect of age on survival benefit of adjuvant chemotherapy in elderly patients with Stage III colon cancer.

    PubMed

    Zuckerman, Ilene H; Rapp, Thomas; Onukwugha, Ebere; Davidoff, Amy; Choti, Michael A; Gardner, James; Seal, Brian; Mullins, C Daniel

    2009-08-01

    To estimate the modifying effect of age on the survival benefit associated with adjuvant chemotherapy receipt in elderly patients with a diagnosis of Stage III colon cancer. Observational, retrospective cohort study using two samples: an overall sample of 7,182 patients to provide externally valid analyses and a propensity score-matched sample of 3,016 patients to provide more internally valid analyses by reducing the presence of treatment endogeneity. An interval-censored survival model with a complementary log-log link was used. Hazard ratios and 95% confidence intervals were obtained for all regressions. Data from the National Cancer Institute's Surveillance, Epidemiology and End Results database and the linked Medicare enrollment and claims database were used. Selected patients were aged 66 and older and had a diagnosis of Stage III colon cancer. Patients were followed from surgery to time of death or censorship. The outcome was colon cancer-specific death during the follow-up period. Receipt of adjuvant chemotherapy was measured according to the presence of a claim for 5-fluorouracil or leucovorin within 6 months after surgery. All elderly patients had a significant survival benefit associated with adjuvant chemotherapy receipt, although the survival benefit of adjuvant chemotherapy was not uniform across all age groups. These findings have important clinical and policy implications for the risk-benefit calculation induced by treatment in older patients with Stage III colon cancer. The results suggest that there is a benefit from chemotherapy, but the benefit is lower with older age.

  15. Validation, residue analysis, and risk assessment of fipronil and flonicamid in cotton (Gossypium sp.) samples and soil.

    PubMed

    Chawla, Suchi; Gor, Hetal N; Patel, Hemlatta K; Parmar, Kaushik D; Patel, Anil R; Shukla, Varsha; Ilyas, Mohammad; Parsai, Satish K; Somashekar; Meena, Roop Singh; Shah, Paresh G

    2018-05-04

    Cotton crop is highly susceptible to attack by sucking pests. Because cotton is an important oilseed and feed crop, it is essential to monitor these pesticides and ensure health protection at the consumer level. Therefore, a method was validated to estimate fipronil and flonicamid in various cotton samples, and risk assessment was performed. Contamination of oil in the extracts from the various oil seeds and cake samples is a major problem, as this oil contaminates the column and interferes with the detection of pesticides. The present manuscript describes, for the first time, successful analysis of the pesticides from various cotton samples including cotton oil, seed, and cake. Quick, easy, cheap, effective, rugged, and safe (QuEChERS)-based methods were validated for estimation of fipronil and flonicamid in cotton samples and in soil by LC-MS/MS. Recoveries were within the acceptable range of 70-120% with relative standard deviation ≤ 20% and HorRat values of 0.3-1.3. R² was > 0.99. Matrix effects of 150 and 13.5% were observed for fipronil and flonicamid, respectively, in cotton leaves. Limits of quantitation (LOQs) were in the range of 0.0004 to 0.004 mg kg⁻¹ for fipronil and flonicamid. Cotton samples collected from a field study at different locations were analyzed. Half-lives ranged from 2.2 to 5.8 days for fipronil and 4.6 to 7.0 days for flonicamid. A pre-harvest interval (PHI) of 33 days is suggested. The risk assessment studies at maximum residue level values showed HQ < 1 at the PHI. Being short and easy, the methods can be extended to estimate other pesticides in different oilseeds. Following a PHI of 33 days, fipronil and flonicamid can be used on cotton at the standard dose. As the levels of fipronil and flonicamid were below the determination limit in all the soils, the environmental risk is negligible.

  16. ECCM Scheme against Interrupted Sampling Repeater Jammer Based on Parameter-Adjusted Waveform Design

    PubMed Central

    Wei, Zhenhua; Peng, Bo; Shen, Rui

    2018-01-01

    Interrupted sampling repeater jamming (ISRJ) is an effective way of deceiving coherent radar sensors, especially for linear frequency modulated (LFM) radar. In this paper, for a simplified scenario with a single jammer, we propose a dynamic electronic counter-counter measure (ECCM) scheme based on jammer parameter estimation and transmitted signal design. Firstly, the LFM waveform is transmitted to estimate the main jamming parameters by investigating the discontinuousness of the ISRJ’s time-frequency (TF) characteristics. Then, a parameter-adjusted intra-pulse frequency coded signal, whose ISRJ signal after matched filtering only forms a single false target, is designed adaptively according to the estimated parameters, i.e., sampling interval, sampling duration and repeater times. Finally, for typical jamming scenarios with different jamming-to-signal ratios (JSR) and duty cycles, we propose two particular ISRJ suppression approaches. Simulation results validate the effectiveness of the proposed scheme in countering ISRJ, and the trade-off relationship between the two approaches is demonstrated. PMID:29642508

  17. Effect of α-Amylase, Papain, and Spermfluid treatments on viscosity and semen parameters of dromedary camel ejaculates.

    PubMed

    Monaco, Davide; Fatnassi, Meriem; Padalino, Barbara; Hammadi, Mohamed; Khorchani, Touhami; Lacalandra, Giovanni Michele

    2016-04-01

    Ejaculates from five clinically healthy dromedary camels (Camelus dromedarius) were used to evaluate the effects of different enzymatic treatments (Amylase, Papain, Spermfluid) on liquefaction and seminal parameters. After collection, ejaculates were divided into 5 aliquots: (1) kept undiluted (control); or diluted 1:1 with: (2) Tris-Citrate-Fructose (TCF), (3) TCF containing Amylase, (4) TCF containing Papain or (5) Spermfluid containing Bromelain. For 120 min after dilution, each aliquot was evaluated at 20-min intervals for viscosity, motility, viability, and agglutination. Only the aliquots diluted with TCF containing Papain underwent complete liquefaction. Sperm motility decreased significantly during the observation times, except for the samples diluted with Spermfluid (P=0.005). Diluted samples showed different levels of agglutination, with the lowest being observed in the control and the highest in the Papain-treated samples. The viscosity of dromedary camel ejaculates could be effectively reduced by using the proteolytic enzyme Papain.

  18. Permutation-based inference for the AUC: A unified approach for continuous and discontinuous data.

    PubMed

    Pauly, Markus; Asendorf, Thomas; Konietschke, Frank

    2016-11-01

    We investigate rank-based studentized permutation methods for the nonparametric Behrens-Fisher problem, that is, inference methods for the area under the ROC curve. We prove that the studentized permutation distribution of the Brunner-Munzel rank statistic is asymptotically standard normal, even under the alternative, thereby providing the hitherto missing theoretical foundation for the Neubert and Brunner studentized permutation test. In particular, we not only show its consistency but also show that confidence intervals for the underlying treatment effects can be computed by inverting this permutation test. In addition, we derive permutation-based range-preserving confidence intervals. Extensive simulation studies show that the permutation-based confidence intervals appear to maintain the preassigned coverage probability quite accurately (even for rather small sample sizes). For a convenient application of the proposed methods, a freely available software package for the statistical software R has been developed. A real data example illustrates the application.
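    A minimal sketch of the studentized permutation approach, using SciPy's Brunner-Munzel statistic on random relabelings of the pooled sample (illustrative data; this is not the authors' R package):

```python
import numpy as np
from scipy.stats import brunnermunzel

def bm_permutation_pvalue(x, y, n_perm=1000, seed=0):
    """Two-sided p-value from the studentized permutation distribution of
    the Brunner-Munzel statistic (random permutations of the pooled
    sample; a sketch of the Neubert-Brunner idea)."""
    rng = np.random.default_rng(seed)
    t_obs = brunnermunzel(x, y).statistic
    pooled = np.concatenate([x, y])
    n = len(x)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # fresh random relabeling
        t_star = brunnermunzel(pooled[:n], pooled[n:]).statistic
        if abs(t_star) >= abs(t_obs):
            count += 1
    return (count + 1) / (n_perm + 1)             # add-one correction

# Invented two-sample data with unequal locations AND unequal spreads,
# the Behrens-Fisher situation the statistic is designed for.
rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, size=30)
y = rng.normal(0.8, 2.0, size=25)
p = bm_permutation_pvalue(x, y)
print(p)
```

    Because the statistic is studentized before permuting, the permutation distribution remains valid even though the two groups have different variances; a confidence interval for the treatment effect could then be obtained by inverting this test over a grid of shifts.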

  19. Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations

    NASA Technical Reports Server (NTRS)

    Christoffersen, Roy; Keller, L. P.

    2014-01-01

    Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects so far described have, however, included complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within a framework of a constant-deposited-energy model for amorphization. Inputs include experimentally measured amorphization fluences, a Pi steradian variable ion incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.

  20. The relationship between observational scale and explained variance in benthic communities

    PubMed Central

    Flood, Roger D.; Frisk, Michael G.; Garza, Corey D.; Lopez, Glenn R.; Maher, Nicole P.

    2018-01-01

    This study addresses the impact of spatial scale on explaining variance in benthic communities. In particular, the analysis estimated the fraction of community variation that occurred at a spatial scale smaller than the sampling interval (i.e., the geographic distance between samples). This estimate is important because it sets a limit on the amount of community variation that can be explained based on the spatial configuration of a study area and sampling design. Six benthic data sets were examined that consisted of faunal abundances, common environmental variables (water depth, grain size, and surficial percent cover), and sonar backscatter treated as a habitat proxy (categorical acoustic provinces). Redundancy analysis was coupled with spatial variograms generated by multiscale ordination to quantify the explained and residual variance at different spatial scales and within and between acoustic provinces. The amount of community variation below the sampling interval of the surveys (< 100 m) was estimated to be 36–59% of the total. Once adjusted for this small-scale variation, > 71% of the remaining variance was explained by the environmental and province variables. Furthermore, these variables effectively explained the spatial structure present in the infaunal community. Overall, no scale problems remained to compromise inferences, and unexplained infaunal community variation had no apparent spatial structure within the observational scale of the surveys (> 100 m), although small-scale gradients (< 100 m) below the observational scale may be present. PMID:29324746

  1. A new method for estimating the demographic history from DNA sequences: an importance sampling approach

    PubMed Central

    Ait Kaci Azzou, Sadoune; Larribe, Fabrice; Froda, Sorana

    2015-01-01

    The effective population size over time (demographic history) can be retraced from a sample of contemporary DNA sequences. In this paper, we propose a novel methodology based on importance sampling (IS) for exploring such demographic histories. Our starting point is the generalized skyline plot, with the main difference being that our procedure, the skywis plot, uses a large number of genealogies. The information provided by these genealogies is combined according to the IS weights. Thus, we compute a weighted average of the effective population sizes on specific time intervals (epochs), where the genealogies that agree more with the data are given more weight. We illustrate by a simulation study that the skywis plot correctly reconstructs the recent demographic history under the scenarios most commonly considered in the literature. In particular, our method can capture a change point in the effective population size, and its overall performance is comparable with that of the Bayesian skyline plot. We also introduce the case of serially sampled sequences and illustrate that it is possible to improve the performance of the skywis plot in the case of an exponential expansion of the effective population size. PMID:26300910
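    The epoch-wise importance-sampling average can be sketched as follows (all inputs are hypothetical and the function name is invented; real IS weights would come from the likelihood of each simulated genealogy):

```python
import numpy as np

def epoch_is_average(ne_draws, log_weights):
    """Importance-sampling average of per-genealogy effective-size
    estimates for one epoch: genealogies that agree more with the data
    (higher weight) contribute more to the epoch estimate."""
    w = np.exp(log_weights - np.max(log_weights))   # stabilise before exp
    w /= w.sum()                                    # self-normalised weights
    return float(np.sum(w * ne_draws))

# Invented example: 500 genealogy-based draws of Ne for one epoch, with
# log-weights that happen to favour genealogies near Ne ~ 12,000.
rng = np.random.default_rng(3)
ne_draws = rng.lognormal(mean=np.log(1e4), sigma=0.3, size=500)
log_weights = -0.5 * ((np.log(ne_draws) - np.log(1.2e4)) / 0.2) ** 2

est = epoch_is_average(ne_draws, log_weights)
print(est)
```

    Subtracting the maximum log-weight before exponentiating is the standard trick for self-normalised importance sampling; without it, likelihood-scale log-weights would underflow to zero.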

  2. Effect of Lowering the Dialysate Temperature in Chronic Hemodialysis: A Systematic Review and Meta-Analysis

    PubMed Central

    Bdair, Fadi; Akl, Elie A.; Garg, Amit X.; Thiessen-Philbrook, Heather; Salameh, Hassan; Kisra, Sood; Nesrallah, Gihad; Al-Jaishi, Ahmad; Patel, Parth; Patel, Payal; Mustafa, Ahmad A.; Schünemann, Holger J.

    2016-01-01

    Background and objectives Lowering the dialysate temperature may improve outcomes for patients undergoing chronic hemodialysis. We reviewed the reported benefits and harms of lower temperature dialysis. Design, setting, participants, & measurements We searched the Cochrane Central Register, OVID MEDLINE, EMBASE, and Pubmed until April 15, 2015. We reviewed the reference lists of relevant reviews, registered trials, and relevant conference proceedings. We included all randomized, controlled trials that evaluated the effect of reduced temperature dialysis versus standard temperature dialysis in adult patients receiving chronic hemodialysis. We followed the Grading of Recommendations Assessment, Development and Evaluation approach to assess confidence in the estimates of effect (i.e., the quality of evidence). We conducted meta-analyses using random effects models. Results Twenty-six trials were included, consisting of a total of 484 patients. Compared with standard temperature dialysis, reduced temperature dialysis significantly reduced the rate of intradialytic hypotension by 70% (95% confidence interval, 49% to 89%) and significantly increased intradialytic mean arterial pressure by 12 mmHg (95% confidence interval, 8 to 16 mmHg). Symptoms of discomfort occurred 2.95 (95% confidence interval, 0.88 to 9.82) times more often with reduced temperature compared with standard temperature dialysis. The effect on dialysis adequacy was not significantly different, with a Kt/V mean difference of −0.05 (95% confidence interval, −0.09 to 0.01). Small sample sizes, loss to follow-up, and a lack of appropriate blinding in some trials reduced confidence in the estimates of effect. None of the trials reported long-term outcomes. Conclusions In patients receiving chronic hemodialysis, reduced temperature dialysis may reduce the rate of intradialytic hypotension and increase intradialytic mean arterial pressure. 
High-quality, large, multicenter, randomized trials are needed to determine whether reduced temperature dialysis affects patient mortality and major adverse cardiovascular events. PMID:26712807

  3. Influences of the Tamarisk Leaf Beetle (Diorhabda carinulata) on the diet of insectivorous birds along the Dolores River in Southwestern Colorado

    USGS Publications Warehouse

    Puckett, Sarah L.; van Riper, Charles

    2014-01-01

    We examined the effects of a biologic control agent, the tamarisk leaf beetle (Diorhabda carinulata), on native avifauna in southwestern Colorado, specifically addressing whether and to what degree birds eat tamarisk leaf beetles. In 2010, we documented avian foraging behavior, characterized the arthropod community, sampled bird diets, and undertook an experiment to determine whether tamarisk leaf beetles are palatable to birds. We observed that tamarisk leaf beetles compose 24.0 percent (95-percent-confidence interval, 19.9-27.4 percent) and 35.4 percent (95-percent-confidence interval, 32.4-45.1 percent) of arthropod abundance and biomass in the study area, respectively. Birds ate few tamarisk leaf beetles, despite a superabundance of D. carinulata in the environment. The frequency of occurrence of tamarisk leaf beetles in bird diets was 2.1 percent (95-percent-confidence interval, 1.3-2.9 percent) by abundance and 3.4 percent (95-percent-confidence interval, 2.6-4.2 percent) by biomass. Thus, tamarisk leaf beetles probably do not contribute significantly to the diets of birds in areas where biologic control of tamarisk is being applied.

  4. Evaluation of an in-practice wet-chemistry analyzer using canine and feline serum samples.

    PubMed

    Irvine, Katherine L; Burt, Kay; Papasouliotis, Kostas

    2016-01-01

    A wet-chemistry biochemical analyzer was assessed for in-practice veterinary use. Its small size may make it a cost-effective option for low-throughput in-house biochemical analysis in first-opinion practice. The objectives of our study were to determine imprecision, total observed error, and acceptability of the analyzer for measurement of common canine and feline serum analytes, and to compare clinical sample results to those from a commercial reference analyzer. Imprecision was determined by within- and between-run repeatability for canine and feline pooled samples, and manufacturer-supplied quality control material (QCM). Total observed error (TEobs) was determined for pooled samples and QCM. Performance was assessed for canine and feline pooled samples by sigma metric determination. Agreement and errors between the in-practice and reference analyzers were determined for canine and feline clinical samples by Bland-Altman and Deming regression analyses. Within- and between-run precision was high for most analytes, and TEobs(%) was mostly lower than total allowable error. Performance based on sigma metrics was good (σ > 4) for many analytes and marginal (σ > 3) for most of the remainder. Correlation between the analyzers was very high for most canine analytes and high for most feline analytes. Between-analyzer bias was generally attributed to high constant error. The in-practice analyzer showed good overall performance, with only calcium and phosphate analyses identified as significantly problematic. Agreement for most analytes was insufficient for transposition of reference intervals, and we recommend that in-practice-specific reference intervals be established in the laboratory. © 2015 The Author(s).

  5. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    PubMed

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the underestimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of the parameters of interest, bootstrap confidence interval methods, and bootstrap hypothesis testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
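The bootstrap confidence interval methods mentioned above resample whole tables and re-estimate the parameter each time. A minimal percentile-bootstrap sketch for a complete-count 2×2 table (the paper's joint sampling distribution for incomplete margins is not reproduced here; the counts and the odds-ratio statistic are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(counts, stat, n_boot=2000, level=0.95):
    """Percentile bootstrap CI for a statistic of multinomial cell counts.

    counts : observed cell counts (flattened contingency table)
    stat   : function mapping a count vector to a scalar estimate
    """
    counts = np.asarray(counts)
    n = counts.sum()
    p_hat = counts / n
    # resample whole tables from the fitted multinomial, re-estimate each time
    boots = np.array([stat(rng.multinomial(n, p_hat)) for _ in range(n_boot)])
    alpha = 1.0 - level
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

# Hypothetical 2x2 table [[a, b], [c, d]] flattened to [a, b, c, d];
# the statistic is the odds ratio a*d / (b*c)
odds_ratio = lambda c: (c[0] * c[3]) / max(c[1] * c[2], 1)
lo, hi = bootstrap_ci([30, 10, 12, 28], odds_ratio)
```

The percentile interval brackets the point estimate (here 30×28/(10×12) = 7.0); the paper's valid joint sampling distribution would replace the simple multinomial resampling step.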

  6. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
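The sampling strategy described above, one random start followed by equidistant sites, is simple to sketch in code; the one-axis version below uses made-up extent and interval values:

```python
import random

def systematic_sample(extent, interval, rng=random):
    """Systematic random sampling along one axis: a single random start
    inside the first interval, then sites at equidistant steps until the
    extent of the structure is exhausted."""
    start = rng.uniform(0, interval)   # random offset in [0, interval)
    sites, x = [], start
    while x < extent:
        sites.append(x)
        x += interval
    return sites

# e.g. a 1000-um transect sampled every 150 um (values are illustrative)
sites = systematic_sample(extent=1000.0, interval=150.0)
```

A 2-D grid for a microscope stage is the same idea applied independently to each axis: one random offset per axis, then a regular lattice of fields.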

  7. Sample interval modulation for the simultaneous acquisition of displacement vector data in magnetic resonance elastography: theory and application

    NASA Astrophysics Data System (ADS)

    Klatt, Dieter; Yasar, Temel K.; Royston, Thomas J.; Magin, Richard L.

    2013-12-01

    SampLe Interval Modulation-magnetic resonance elastography (SLIM-MRE) is introduced for simultaneously encoding all three displacement projections of a monofrequency vibration into the MR signal phase. In SLIM-MRE, the individual displacement components are observed using different sample intervals. In doing so, the components are modulated with different apparent frequencies in the MR signal phase expressed as a harmonic function of the start time of the motion encoding gradients and can thus be decomposed by applying a Fourier transform to the sampled multidirectional MR phases. In this work, the theoretical foundations of SLIM-MRE are presented and the new idea is implemented using a high field (11.7 T) vertical bore magnetic resonance imaging system on an inhomogeneous agarose gel phantom sample. The local frequency estimation-derived stiffness values were the same within the error margins for both the new SLIM-MRE method and for conventional MRE, while the number of temporally-resolved MRE experiments needed for each study was reduced from three to one. In this work, we present for the first time, monofrequency displacement data along three sensitization directions that were acquired simultaneously and stored in the same k-space.
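The frequency-multiplexing idea can be illustrated numerically: if each displacement component appears at a distinct apparent harmonic of the sampled MR phase, a single FFT separates them. The amplitudes, phases, and sample count below are invented; this is a toy of the decomposition step only, not of the pulse sequence:

```python
import numpy as np

# Three displacement components of one mechanical frequency, encoded at
# apparent harmonics 1, 2 and 3 of the sampled MR phase (toy values).
N = 8                                    # phase samples over one apparent period
true_amp = {1: 0.9, 2: 0.5, 3: 0.2}      # harmonic -> amplitude (arbitrary units)
true_phi = {1: 0.3, 2: 1.1, 3: -0.7}     # harmonic -> phase offset (rad)

t = np.arange(N) / N                     # normalized gradient start times
phase = sum(a * np.cos(2 * np.pi * k * t + true_phi[k])
            for k, a in true_amp.items())

# One FFT over the sampled phases separates the three components:
spec = np.fft.fft(phase) / N
recovered_amp = {k: 2 * np.abs(spec[k]) for k in true_amp}
recovered_phi = {k: np.angle(spec[k]) for k in true_amp}
```

Because each component occupies its own FFT bin, both amplitude and phase come back exactly, which is why one temporally resolved experiment can replace three.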

  8. Sample interval modulation for the simultaneous acquisition of displacement vector data in magnetic resonance elastography: theory and application.

    PubMed

    Klatt, Dieter; Yasar, Temel K; Royston, Thomas J; Magin, Richard L

    2013-12-21

    SampLe Interval Modulation-magnetic resonance elastography (SLIM-MRE) is introduced for simultaneously encoding all three displacement projections of a monofrequency vibration into the MR signal phase. In SLIM-MRE, the individual displacement components are observed using different sample intervals. In doing so, the components are modulated with different apparent frequencies in the MR signal phase expressed as a harmonic function of the start time of the motion encoding gradients and can thus be decomposed by applying a Fourier transform to the sampled multidirectional MR phases. In this work, the theoretical foundations of SLIM-MRE are presented and the new idea is implemented using a high field (11.7 T) vertical bore magnetic resonance imaging system on an inhomogeneous agarose gel phantom sample. The local frequency estimation-derived stiffness values were the same within the error margins for both the new SLIM-MRE method and for conventional MRE, while the number of temporally-resolved MRE experiments needed for each study was reduced from three to one. In this work, we present for the first time, monofrequency displacement data along three sensitization directions that were acquired simultaneously and stored in the same k-space.

  9. 40 CFR Table F-4 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized Coarse Aerosol Size Distribution

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Estimated Mass Concentration... Equivalent Methods for PM2.5 Pt. 53, Subpt. F, Table F-4 Table F-4 to Subpart F of Part 53—Estimated Mass... (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass Concentration (µg/m3) Estimated Mass...

  10. 40 CFR Table F-4 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized Coarse Aerosol Size Distribution

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Estimated Mass Concentration... Equivalent Methods for PM2.5 Pt. 53, Subpt. F, Table F-4 Table F-4 to Subpart F of Part 53—Estimated Mass... (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass Concentration (µg/m3) Estimated Mass...

  11. 40 CFR Table F-6 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized Fine Aerosol Size Distribution

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Estimated Mass Concentration... Equivalent Methods for PM2.5 Pt. 53, Subpt. F, Table F-6 Table F-6 to Subpart F of Part 53—Estimated Mass... (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass Concentration (µg/m3) Estimated Mass...

  12. 40 CFR Table F-6 to Subpart F of... - Estimated Mass Concentration Measurement of PM2.5 for Idealized Fine Aerosol Size Distribution

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Estimated Mass Concentration... Equivalent Methods for PM2.5 Pt. 53, Subpt. F, Table F-6 Table F-6 to Subpart F of Part 53—Estimated Mass... (µm) Test Sampler Fractional Sampling Effectiveness Interval Mass Concentration (µg/m3) Estimated Mass...

  13. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with risk assessment of sigma PAH8 in surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of sigma PAH8 in surface waters of Taihu Lake. These distributions indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of sigma PAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval was (0.0015, 0.0163) at the 90% confidence level calculated using fuzzy set theory, and (0.00016, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of sigma PAH8 to aquatic organisms was low. Each method is based on a different theory and has its own advantages and limitations; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The probability-based approach was selected as the most appropriate method to assess the risk of sigma PAH8 in surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.
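The difference between plain Monte Carlo sampling and Latin hypercube sampling lies only in how the uniform inputs are drawn. A sketch with invented uniform exposure and toxicity ranges (not the Taihu Lake data) that estimates the probability of the hazard quotient exceeding 1 both ways:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical uniform input distributions (illustrative units only)
def to_exposure(u):                      # exposure concentration in [0.1, 2.0]
    return 0.1 + u * (2.0 - 0.1)

def to_toxicity(u):                      # effect threshold in [1.0, 10.0]
    return 1.0 + u * (10.0 - 1.0)

def lhs_uniform(n, rng):
    """1-D Latin hypercube draw: one uniform point per equal-probability
    stratum, then shuffled (a minimal sketch, not a library routine)."""
    return rng.permutation((np.arange(n) + rng.uniform(size=n)) / n)

# Monte Carlo sampling: independent uniform draws for each input
hq_mcs = to_exposure(rng.uniform(size=n)) / to_toxicity(rng.uniform(size=n))
# Latin hypercube sampling: stratified draws for each input
hq_lhs = to_exposure(lhs_uniform(n, rng)) / to_toxicity(lhs_uniform(n, rng))

exceed_mcs = (hq_mcs > 1).mean()         # P(hazard quotient > 1), MCS estimate
exceed_lhs = (hq_lhs > 1).mean()         # same tail probability, LHS estimate
```

Both estimators target the same exceedance probability; LHS merely stratifies each input's marginal so the estimate stabilizes with fewer draws, which is why the two approaches in the abstract agree closely (9.71% vs 9.68%).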

  14. Identifying Issues and Concerns with the Use of Interval-Based Systems in Single Case Research Using a Pilot Simulation Study

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Ayres, Kevin M.; Lane, Justin D.; Lam, Man Fung

    2015-01-01

    Momentary time sampling (MTS), whole interval recording (WIR), and partial interval recording (PIR) are commonly used in applied research. We discuss potential difficulties with analyzing data when these systems are used and present results from a pilot simulation study designed to determine the extent to which these issues are likely to be…

  15. A robust method of thin plate spline and its application to DEM construction

    NASA Astrophysics Data System (ADS)

    Chen, Chuanfa; Li, Yanyan

    2012-11-01

    In order to avoid the ill-conditioning problem of thin plate spline (TPS), the orthogonal least squares (OLS) method was introduced, and a modified OLS (MOLS) was developed. The MOLS of TPS (TPS-M) can not only select significant points, termed knots, from large and dense sampling data sets, but also easily compute the weights of the knots in terms of back-substitution. For interpolating large sets of sampling points, we developed a local TPS-M, where some neighboring sampling points around the point being estimated are selected for computation. Numerical tests indicate that irrespective of sampling noise level, the average performance of TPS-M compares favorably with smoothing TPS. Under the same simulation accuracy, the computational time of TPS-M decreases with the increase of the number of sampling points. The smooth fitting results on noisy lidar-derived data indicate that TPS-M has an obvious smoothing effect, which is on par with smoothing TPS. The example of constructing a series of large scale DEMs, located in Shandong province, China, was employed to comparatively analyze the estimation accuracies of the two versions of TPS and the classical interpolation methods including inverse distance weighting (IDW), ordinary kriging (OK) and universal kriging with the second-order drift function (UK). Results show that regardless of sampling interval and spatial resolution, TPS-M is more accurate than the classical interpolation methods, except for the smoothing TPS at the finest sampling interval of 20 m, and the two versions of kriging at the spatial resolution of 15 m. In conclusion, TPS-M, which avoids the ill-conditioning problem, is considered as a robust method for DEM construction.
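The dense linear system whose ill-conditioning TPS-M is designed to avoid can be written down directly. A minimal 2-D thin plate spline interpolant (no OLS knot selection and no local variant; the sample points and values below are synthetic):

```python
import numpy as np

def tps_fit(xy, z, smooth=0.0):
    """Fit a 2-D thin plate spline with basis phi(r) = r^2 log r plus an
    affine part, by solving the standard dense TPS system directly. No
    conditioning safeguards -- a sketch of the interpolant TPS-M builds on."""
    n = xy.shape[0]
    r = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(r > 0, r**2 * np.log(r), 0.0)   # radial basis matrix
    K += smooth * np.eye(n)                          # optional smoothing ridge
    P = np.hstack([np.ones((n, 1)), xy])             # affine terms 1, x, y
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])  # the (often ill-conditioned) system
    coef = np.linalg.solve(A, np.concatenate([z, np.zeros(3)]))
    w, a = coef[:n], coef[n:]

    def evaluate(q):
        rq = np.linalg.norm(q[:, None, :] - xy[None, :, :], axis=2)
        with np.errstate(divide="ignore", invalid="ignore"):
            Kq = np.where(rq > 0, rq**2 * np.log(rq), 0.0)
        return Kq @ w + a[0] + q @ a[1:]

    return evaluate

rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(40, 2))       # synthetic sample locations
vals = np.sin(pts[:, 0]) + 0.5 * pts[:, 1]   # synthetic elevations
spline = tps_fit(pts, vals)                  # exact interpolation when smooth == 0
```

With `smooth=0` the spline reproduces every sample value exactly; a positive `smooth` trades fidelity for the noise suppression that the smoothing TPS in the abstract provides.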

  16. Reassessment of the Access Testosterone chemiluminescence assay and comparison with LC-MS method.

    PubMed

    Dittadi, Ruggero; Matteucci, Mara; Meneghetti, Elisa; Ndreu, Rudina

    2018-03-01

    To reassess the imprecision and Limit of Quantitation, to evaluate the cross-reaction with dehydroepiandrosterone-sulfate (DHEAS), the accuracy toward liquid chromatography-mass spectrometry (LC-MS) and the reference interval of the Access Testosterone method, performed by DxI immunoassay platform (Beckman Coulter). Imprecision was evaluated by testing six pooled samples assayed in 20 different runs using two reagent lots. The cross-reaction with DHEAS was studied both by a displacement curve and by spiking DHEAS standard into two serum samples with known amounts of testosterone. The comparison with LC-MS was evaluated by Passing-Bablock analysis in 21 routine serum samples and 19 control samples from an External Quality Assurance (EQA) scheme. The reference interval was verified by an indirect estimation on 2445 male and 2838 female outpatients. The imprecision study showed a coefficient of variation (CV) between 2.7% and 34.7% for serum pools ranging from 16.3 to 0.27 nmol/L. The Limit of Quantitation at 20% CV was 0.53 nmol/L. DHEAS showed a cross-reaction of 0.0074%. The comparison with LC-MS showed a trend toward a slight underestimation of the immunoassay vs LC-MS (Passing-Bablock equations: DxI=-0.24+0.906 LCMS in serum samples and DxI=-0.299+0.981 LCMS in EQA samples). The verification of the reference interval showed a 2.5th-97.5th percentile distribution of 6.6-24.3 nmol/L for males over 14 years and <0.5-2.78 nmol/L for female subjects, in accord with the reference intervals reported by the manufacturer. The Access Testosterone method could be considered an adequately reliable tool for testosterone measurement. © 2017 Wiley Periodicals, Inc.

  17. Reversing the Course of Forgetting

    PubMed Central

    White, K. Geoffrey; Brown, Glenn S

    2011-01-01

    Forgetting functions were generated for pigeons in a delayed matching-to-sample task, in which accuracy decreased with increasing retention-interval duration. In baseline training with dark retention intervals, accuracy was high overall. Illumination of the experimental chamber by a houselight during the retention interval impaired performance accuracy by increasing the rate of forgetting. In novel conditions, the houselight was lit at the beginning of a retention interval and then turned off partway through the retention interval. Accuracy was low at the beginning of the retention interval and then increased later in the interval. Thus the course of forgetting was reversed. Such a dissociation of forgetting from the passage of time is consistent with an interference account in which attention or stimulus control switches between the remembering task and extraneous events. PMID:21909163

  18. Evaluating the Impact of Various Parameters on the Gamma Index Values of 2D Diode Array in IMRT Verification

    PubMed Central

    Jabbari, Keyvan; Pashaei, Fakhereh; Ay, Mohammad R.; Amouheidari, Alireza; Tavakoli, Mohammad B.

    2018-01-01

    Background: MapCHECK2 is a two-dimensional diode array planar dosimetry verification system. Dosimetric results are evaluated with the gamma index. This study aims to provide comprehensive information on the impact of various factors on the gamma index values of MapCHECK2, which is mostly used for IMRT dose verification. Methods: Seven fields were planned for 6 and 18 MV photons. The azimuthal angle is defined as any rotation of the collimators or the MapCHECK2 around the central axis, which was varied from 5 to −5°. The gantry angle was changed from −8 to 8°. Isodose sampling resolution was studied in the range of 0.5 to 4 mm. The effects of additional buildup on the gamma index were also assessed in three cases. Gamma test acceptance criteria were 3%/3 mm. Results: A 5° change in the azimuthal angle reduced the gamma index value by about 9%. Placing buildups of various thicknesses on the MapCHECK2 surface showed that the gamma index generally improved with thicker buildup, especially for 18 MV. Changing the sampling resolution from 4 to 2 mm increased the gamma index by about 3.7%. Deviation of the gantry in 8° intervals in either direction changed the gamma index by only about 1.6% for 6 MV and 2.1% for 18 MV. Conclusion: Among the studied parameters, the azimuthal angle is one of the most effective factors on the gamma index value. The gantry angle deviation and sampling resolution are less effective on gamma index value reduction. PMID:29535922
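The gamma index these comparisons rest on combines a dose-difference criterion with a distance-to-agreement criterion. A 1-D textbook sketch with 3%/3 mm criteria on a synthetic, slightly shifted profile (not MapCHECK2's implementation):

```python
import numpy as np

def gamma_index_1d(x, dose_eval, dose_ref, dta_mm=3.0, dd_frac=0.03):
    """1-D global gamma index: for each evaluated point, minimize the
    combined dose-difference / distance-to-agreement metric over all
    reference points. A textbook sketch, not a commercial algorithm."""
    d_max = dose_ref.max()                       # global normalization dose
    gammas = []
    for xi, di in zip(x, dose_eval):
        dist2 = ((x - xi) / dta_mm) ** 2         # spatial term
        dd2 = ((dose_ref - di) / (dd_frac * d_max)) ** 2   # dose term
        gammas.append(np.sqrt(np.min(dist2 + dd2)))
    return np.array(gammas)

x = np.linspace(0, 100, 201)                     # positions in mm
ref = np.exp(-((x - 50) / 15) ** 2)              # synthetic reference profile
meas = np.exp(-((x - 50.5) / 15) ** 2)           # measurement shifted 0.5 mm
pass_rate = (gamma_index_1d(x, meas, ref) <= 1).mean()
```

A 0.5 mm shift sits well inside the 3%/3 mm tolerance, so every point passes (gamma ≤ 1); larger geometric errors, such as the azimuthal rotations studied above, push points past gamma = 1 and lower the pass rate.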

  19. Effects of hemolysis and lipemia interference on kaolin-activated thromboelastography, and comparison with conventional coagulation tests.

    PubMed

    Tang, Ning; Jin, Xi; Sun, Ziyong; Jian, Cui

    2017-04-01

    The effects of hemolysis and lipemia on thromboelastography (TEG) analysis have been scarcely evaluated in human samples, and neglected in clinical practice. We aimed to investigate the effects of in vitro mechanical hemolysis and lipemia on TEG analysis and conventional coagulation tests. Twenty-four healthy volunteers were enrolled in the study. Besides the controls, three groups with slight, moderate and severe mechanical hemolysis were constituted according to free hemoglobin (Hb) concentrations of 0.5-1.0, 2.0-6.0 and 7.0-13.0 g/L, respectively; and three groups with mild, moderate and high lipemia were established according to triglyceride concentrations of ∼6.0, ∼12.0, and ∼18.0 mmol/L, respectively. Four TEG parameters, reaction time (R), coagulation time (K), angle (α), and maximum amplitude (MA), were measured alongside conventional plasma tests including prothrombin time (PT), activated partial thromboplastin time (APTT) and fibrinogen (FIB) by mechanical method, and platelet count by optical method. Results showed that the median R and MA values at moderate and severe hemolysis and K at severe hemolysis exceeded respective reference intervals, and were considered unacceptable. Median values of TEG parameters in lipemic samples were all within reference intervals. Bias values of conventional plasma tests PT, APTT and FIB in hemolyzed or lipemic samples were all lower than the Clinical Laboratory Improvement Amendments (CLIA) allowable limits. Bias values of platelet count at moderate to severe hemolysis and lipemia exceeded the CLIA allowable limits. In conclusion, the detection of TEG was in general more affected by mechanical hemolysis than plasma coagulation tests. Pre-analytical variables should be taken into account when unexpected TEG results are obtained.

  20. Effectiveness of child safety seats vs seat belts in reducing risk for death in children in passenger vehicle crashes.

    PubMed

    Elliott, Michael R; Kallan, Michael J; Durbin, Dennis R; Winston, Flaura K

    2006-06-01

    To provide an estimate of benefit, if any, of child restraint systems over seat belts alone for children aged from 2 through 6 years. Cohort study. A sample of children in US passenger vehicle crashes was obtained from the National Highway Traffic Safety Administration by combining cases involving a fatality from the US Department of Transportation Fatality Analysis Reporting System with a probability sample of cases without a fatality from the National Automotive Sampling System. Children in tow-away crashes occurring between 1998 and 2003. Use of child restraint systems (rear-facing and forward-facing car seats, and shield and belt-positioning booster seats) vs seat belts. Potentially confounding variables included seating position, vehicle type, model year, driver and passenger ages, and driver survival status. Death of child passengers from injuries incurred during the crash. Compared with seat belts, child restraints, when not seriously misused (eg, unattached restraint, child restraint system harness not used, 2 children restrained with 1 seat belt) were associated with a 28% reduction in risk for death (relative risk, 0.72; 95% confidence interval, 0.54-0.97) in children aged 2 through 6 years after adjusting for seating position, vehicle type, model year, driver and passenger ages, and driver survival status. When including cases of serious misuse, the effectiveness estimate was slightly lower (21%) (relative risk, 0.79; 95% confidence interval, 0.59-1.05). Based on these findings as well as previous epidemiological and biomechanical evidence for child restraint system effectiveness in reducing nonfatal injury risk, efforts should continue to promote use of child restraint systems through improved laws and with education and disbursement programs.

  1. Replication kinetics and shedding of very virulent Marek's disease virus and vaccinal Rispens/CVI988 virus during single and mixed infections varying in order and interval between infections.

    PubMed

    Islam, Tanzila; Walkden-Brown, Stephen W; Renz, Katrin G; Islam, A F M Fakhrul; Ralapanawe, Sithara

    2014-10-10

    Vaccination is thought to contribute to an evolution in virulence of the Marek's disease virus (MDV) as vaccines prevent disease but not infection. We investigated the effects of co-infections at various intervals between Rispens/CVI988 vaccine virus (Rispens) and very virulent MDV (vvMDV) on the replication and shedding of each virus. The experiment used 600 ISA Brown layer chickens in 24 isolators with all treatments replicated in two isolators. Chickens were vaccinated with Rispens and/or challenged with the vvMDV isolate 02LAR on days 0, 5, or 10 post hatching providing vaccination to challenge intervals (VCI) of -10, -5, 0, 5 or 10 days with the negative values indicating challenge prior to vaccination. Peripheral blood lymphocytes (PBL), feathers and isolator exhaust dust were sampled between 7 and 56 days post infection (dpi) and subjected to quantitative real-time polymerase chain reaction (qPCR) to differentiate the two viruses. Overall Rispens significantly reduced the viral load of vvMDV in PBL and feather cells and shedding in dust. Similarly vvMDV significantly reduced the viral load of Rispens in PBL and feather cells but not in dust. VCI significantly influenced these relationships having strong positive and negative associations with load of vvMDV and Rispens respectively. Differences between the two viruses and their effects on each other were greatest in PBL and feathers, and least in dust. This study expands our understanding of the interaction between pathogenic and vaccinal viruses following vaccination with imperfect vaccines and has implications for selection of appropriate samples to test for vaccination success. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Exposure to smoking depictions in movies: its association with established adolescent smoking.

    PubMed

    Sargent, James D; Stoolmiller, Mike; Worth, Keilah A; Dal Cin, Sonya; Wills, Thomas A; Gibbons, Frederick X; Gerrard, Meg; Tanski, Susanne

    2007-09-01

    To assess the association between exposure to movie smoking and established adolescent smoking. Longitudinal survey of a representative US adolescent sample. Adolescents were surveyed by telephone in their homes. Sixty-five hundred twenty-two US adolescents aged 10 to 14 years at baseline, resurveyed at 8 months (8M) (n = 5503), 16 months (16M) (n = 5019), and 24 months (24M) (n = 4575). Main Exposure: Exposure to smoking in 532 box-office hits released in the 5 years prior to the baseline survey. Outcome Measure: Established smoking (having smoked more than 100 cigarettes during lifetime). Of 108 incident established smokers with data at the 24M survey, 85% were current (30-day smokers) and 83% endorsed at least 1 addiction symptom. Established smoking incidence was 7.4, 15.8, and 19.7 per 1000 person-years of observation for the baseline-to-8M, 8M-to-16M, and 16M-to-24M observation periods, respectively. In a multivariate survival model, risk of established smoking was predicted by baseline exposure to smoking in movies with an adjusted overall hazard ratio of 2.04 (95% confidence interval, 1.01-4.12) for teens in the 95th percentile of movie-smoking exposure compared with the 5th percentile. This effect was independent of age; parent, sibling, or friend smoking; and sensation seeking. Teens low on sensation seeking were more responsive to the movie-smoking effect (hazard ratio, 12.7; 95% confidence interval, 2.0-80.6) compared with teens who were high on sensation seeking (hazard ratio, 1.01; 95% confidence interval, 0.4-2.6). In this national US adolescent sample, exposure to smoking in movies predicted risk of becoming an established smoker, an outcome linked with adult dependent smoking and its associated morbidity and mortality.

  3. Association between growth hormone receptor AluI polymorphism and fertility of Holstein cows.

    PubMed

    Schneider, A; Corrêa, M N; Butler, W R

    2013-12-01

    The aim of this work was to determine the effects of a growth hormone receptor (GHR) AluI polymorphism on the reproductive performance of Holstein cows. The cows (n = 94) were on the study from 3 weeks prepartum until 210 days in milk (DIM). Blood samples were collected at -21, 0, 7, 21, and 60 DIM. For GHR genotyping, DNA was extracted from blood and the presence of the alleles determined after polymerase chain reaction and digestion with the restriction enzyme AluI. Milk samples were collected for progesterone analysis and detection of ovulation until first breeding. Cows were submitted to an OvSynch-TAI protocol at 55 DIM that was repeated for cows diagnosed as not pregnant. Data were analyzed with SAS for polynomial effects of the presence of 0, 1, or 2 GHR AluI (-) alleles. Among the cows, 37% had the AluI(+/+) genotype, 51% had AluI(-/+), and 12% were AluI(-/-). Interval from calving to first ovulation was not different among genotypes (P > 0.05). Cows carrying at least one GHR AluI(-) allele required fewer services per conception (P = 0.02). In addition, there was a linear reduction (P = 0.02) in the calving to conception interval among genotypes, with the fewest days for GHR AluI(-/-) cows. GHR AluI(-/-) cows also had the highest serum IGF-I concentrations (P = 0.03). Milk production and composition were not different among genotypes (P > 0.05). The presence of one or two GHR AluI(-) alleles in Holstein cows was associated with a linear reduction in the calving to conception interval and a reduction in the number of AIs per conception. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. speed-ne: Software to simulate and estimate genetic effective population size (Ne ) from linkage disequilibrium observed in single samples.

    PubMed

    Hamilton, Matthew B; Tartakovsky, Maria; Battocletti, Amy

    2018-05-01

    The genetic effective population size, Ne, can be estimated from the average gametic disequilibrium (r²) between pairs of loci, but such estimates require evaluation of assumptions and currently have few methods to estimate confidence intervals. speed-ne is a suite of MATLAB functions to estimate Ne from r² with a graphical user interface and a rich set of outputs that aid in understanding data patterns and comparing multiple estimators. speed-ne includes functions to either generate or input simulated genotype data to facilitate comparative studies of Ne estimators under various population genetic scenarios. speed-ne was validated with data simulated under both time-forward and time-backward coalescent models of genetic drift. Three classes of estimators were compared with simulated data to examine several general questions: what are the impacts of microsatellite null alleles on Ne estimates, how should missing data be treated, and does disequilibrium contributed by reduced recombination among some loci in a sample impact Ne estimates. Estimators differed greatly in precision in the scenarios examined, and a widely employed Ne estimator exhibited the largest variances among replicate data sets. speed-ne implements several jackknife approaches to estimate confidence intervals, and simulated data showed that jackknifing over loci and jackknifing over individuals provided ~95% confidence interval coverage for some estimators and should be useful for empirical studies. speed-ne provides an open-source extensible tool for estimation of Ne from empirical genotype data and to conduct simulations of both microsatellite and single nucleotide polymorphism (SNP) data types to develop expectations and to compare Ne estimators. © 2018 John Wiley & Sons Ltd.
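The "jackknife over loci" approach mentioned above can be sketched in its simplest form, a delete-one jackknife confidence interval for the mean of per-locus statistics; the r² values below are invented:

```python
import numpy as np

def jackknife_ci_mean(values, z=1.96):
    """Delete-one jackknife ~95% CI for the mean of per-locus statistics,
    sketching the 'jackknife over loci' idea used to attach confidence
    intervals to LD-based Ne estimates (not speed-ne's own code)."""
    values = np.asarray(values, dtype=float)
    n = values.size
    loo = (values.sum() - values) / (n - 1)          # leave-one-out means
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    mean = values.mean()
    return mean - z * se, mean + z * se

# e.g. mean pairwise r-squared across 12 hypothetical loci
r2 = np.array([0.021, 0.034, 0.018, 0.040, 0.026, 0.031,
               0.019, 0.029, 0.037, 0.024, 0.022, 0.033])
lo, hi = jackknife_ci_mean(r2)
```

In practice the pseudovalues would be full Ne re-estimates with one locus (or one individual) removed, rather than leave-one-out means, but the standard-error arithmetic is the same.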

  5. Effect of supplemental oxygen on post-exercise inflammatory response and oxidative stress.

    PubMed

    White, Jodii; Dawson, Brian; Landers, Grant; Croft, Kevin; Peeling, Peter

    2013-04-01

    This investigation explored the influence of supplemental oxygen administered during the recovery periods of an interval-based running session on post-exercise markers of reactive oxygen species (ROS) and inflammation. Ten well-trained male endurance athletes completed two sessions of 10 × 3 min running intervals at 85% of the maximal oxygen consumption velocity (vVO2peak) on a motorised treadmill. A 90-s recovery period was given between each interval, during which the participants were administered either a hyperoxic (HYP; fraction of inspired oxygen (FIO2) 99.5%) or normoxic (NORM; FIO2 21%) gas in a randomized, single-blind fashion. Pulse oximetry (SpO2), heart rate (HR), blood lactate (BLa), perceived exertion (RPE), and perceived recovery (TQRper) were recorded during each trial. Venous blood samples were taken pre-exercise, post-exercise, and 1 h post-exercise to measure interleukin-6 (IL-6) and isoprostanes (F2-IsoP). The SpO2 was significantly lower than baseline following all interval repetitions in both experimental trials (p < 0.05). The SpO2 recovery time was significantly shorter in the HYP trial than in the NORM trial (p < 0.05), with a trend toward improved perceptual recovery. The IL-6 and F2-IsoP were significantly elevated immediately post-exercise but had significantly decreased by 1 h post-exercise in both trials (p < 0.05). There were no differences in IL-6 or F2-IsoP levels between trials. Supplemental oxygen provided during the recovery periods of interval-based exercise shortens the recovery time of SpO2 but has no effect on post-exercise ROS or inflammatory responses.

  6. Pharmacokinetic interactions and safety evaluations of coadministered tafenoquine and chloroquine in healthy subjects.

    PubMed

    Miller, Ann K; Harrell, Emma; Ye, Li; Baptiste-Brown, Sharon; Kleim, Jörg-Peter; Ohrt, Colin; Duparc, Stephan; Möhrle, Jörg J; Webster, Alison; Stinnett, Sandra; Hughes, Arlene; Griffith, Sandy; Beelen, Andrew P

    2013-12-01

    The long-acting 8-aminoquinoline tafenoquine (TQ) coadministered with chloroquine (CQ) may radically cure Plasmodium vivax malaria. Coadministration therapy was evaluated for a pharmacokinetic interaction and for pharmacodynamic, safety and tolerability characteristics. Healthy subjects, 18-55 years old, without documented glucose-6-phosphate dehydrogenase deficiency, received CQ alone (days 1-2, 600 mg; and day 3, 300 mg), TQ alone (days 2 and 3, 450 mg) or coadministration therapy (day 1, CQ 600 mg; day 2, CQ 600 mg + TQ 450 mg; and day 3, CQ 300 mg + TQ 450 mg) in a randomized, double-blind, parallel-group study. Blood samples for pharmacokinetic and pharmacodynamic analyses and safety data, including electrocardiograms, were collected for 56 days. The coadministration of CQ + TQ had no effect on TQ AUC0-t, AUC0-∞, Tmax, or t1/2. The 90% confidence intervals of CQ + TQ vs. TQ for AUC0-t, AUC0-∞, and t1/2 indicated no drug interaction. On day 2 of CQ + TQ coadministration, TQ Cmax and AUC0-24 increased by 38% (90% confidence interval 1.27, 1.64) and 24% (90% confidence interval 1.04, 1.46), respectively. The pharmacokinetics of CQ and its primary metabolite desethylchloroquine were not affected by TQ. Coadministration had no clinically significant effect on QT intervals and was well tolerated. No clinically significant safety or pharmacokinetic/pharmacodynamic interactions were observed with coadministered CQ and TQ in healthy subjects. © 2013 The British Pharmacological Society.

  7. Borehole geophysical logging and aquifer-isolation tests conducted in well MG-1693 at North Penn Area 5 Superfund Site near Colmar, Montgomery County, Pennsylvania

    USGS Publications Warehouse

    Bird, Philip H.

    2006-01-01

    Borehole geophysical logging and aquifer-isolation (packer) tests were conducted in well MG-1693 (NP-87) at the North Penn Area 5 Superfund Site near Colmar, Montgomery County, Pa. Objectives of the study were to identify the depth and yield of water-bearing zones, occurrence of vertical borehole flow, and effects of pumping on water levels in nearby wells. Caliper, natural-gamma, single-point-resistance, fluid-temperature, fluid-resistivity, heatpulse-flowmeter, and borehole-video logs were collected. Vertical borehole-fluid movement direction and rate were measured under nonpumping conditions. The suite of logs was used to locate water-bearing fractures, determine zones of vertical borehole-fluid movement, and select depths to set packers. Aquifer-isolation tests were conducted to sample discrete intervals and to determine specific capacities of water-bearing zones and effects of pumping individual zones on water levels in two nearby monitor wells. Specific capacities of isolated zones during aquifer-isolation tests ranged from 0.03 to 3.09 (gal/min)/ft (gallons per minute per foot). Fractures identified by borehole geophysical methods as water-producing or water-receiving zones produced water when isolated and pumped. Water enters the borehole primarily through high-angle fractures at 416 to 435 ft bls (feet below land surface) and 129 to 136 ft bls. Water exits the borehole through a high-angle fracture at 104 to 107 ft bls, a broken casing joint at 82 ft bls, and sometimes as artesian flow through the top of the well. Thirteen intervals were selected for aquifer-isolation testing, using a straddle-packer assembly. The specific capacity of interval 1 was 2.09 (gal/min)/ft. The specific capacities of intervals 2, 3, and 4 were similar—0.27, 0.30, and 0.29 (gal/min)/ft, respectively. The specific capacities of intervals 5, 6, 7, 8, and 10 were similar—0.03, 0.04, 0.09, 0.09, and 0.04 (gal/min)/ft, respectively.
Intervals 9, 11, and 12 each showed a strong hydraulic connection outside the borehole with intervals above and below the isolated interval. The specific capacities of intervals 9, 11, 12, and 13 were similar—2.12, 2.17, 3.09, and 3.08 (gal/min)/ft, respectively. The aquifer-isolation tests indicate that wells MG-1693 (NP-87) and MG-924 (NP-21) are connected primarily through the high-angle fracture from 416 to 435 ft bls. Pumping in either of these wells directly impacts the other well, allowing the pumped well to draw from water-bearing zones in the nonpumped well that are not present in or are not connected directly to the pumped well. The two boreholes act as a single, U-shaped well. The aquifer-isolation tests also show that the lower zones in well MG-1693 (NP-87) are a major source of hydraulic head in well MG-1661 (W-13) through the broken casing joint at 82 ft bls. Water moving upward from the lower intervals in well MG-1693 (NP-87) exits the borehole through the broken casing joint, moves upward outside the borehole, possibly around and (or) through a poor or damaged casing seal, and through the weathered zone above bedrock to well MG-1661 (W-13). Samples for volatile organic compounds (VOCs) were collected in nine isolated intervals. Six compounds were detected (1,1-dichloroethane, 1,1-dichloroethene, cis-1,2-dichloroethene, toluene, 1,1,1-trichloroethane, and trichloroethene (TCE)), and TCE was found in all nine isolated intervals. Intervals 4 (124-149 ft bls) and 6 (277-302 ft bls) had the highest total concentration of VOCs (6.66 and 6.2 micrograms per liter, respectively). Intervals 1 (68-93 ft bls) and 4 each had five compounds detected, the highest number of compounds detected in any interval. Interval 5 (252-277 ft bls) had the lowest total concentration of VOCs (0.08 microgram per liter) and the fewest VOCs detected (one). Detected compounds were not evenly distributed throughout the intervals.
Contaminants were found in shallow, intermediate, and deep intervals and were associated with high-angle fractures and rough areas that showed no distinct fractures.

  8. Effects of variability of practice in music: a pilot study on fast goal-directed movements in pianists

    PubMed Central

    Bangert, Marc; Wiedemann, Anna; Jabusch, Hans-Christian

    2014-01-01

    Variability of Practice (VOP) refers to the acquisition of a particular target movement by practicing a range of varying targets rather than by focusing on fixed repetitions of the target only. VOP has been demonstrated to have beneficial effects on transfer to a novel task and on skill consolidation. This study extends the line of research to musical practice. In a task resembling a barrier-knockdown paradigm, 36 music students trained to perform a wide left-hand interval leap on the piano. Performance at the target distance was tested before and after a 30-min standardized training session. The high-variability group (VAR) practiced four different intervals including the target. Another group (FIX) practiced the target interval only. A third group (SPA) performed spaced practice on the target only, interleaved with periods of not playing. Transfer was tested by introducing an interval novel to all groups. After a 24-h period with no further exposure to the instrument, performance was retested. All groups performed at comparable error levels before training, after training, and after the retention (RET) interval. At transfer, however, the FIX group, unlike the other groups, committed significantly more errors than in the target task. After the RET period, the effect had washed out for the FIX group but was then present for the VAR group. Thus, the results provide only partial support for the VOP hypothesis in the given setting. Additional exploratory observations suggest tentative benefits of VOP regarding execution speed, loudness, and performance confidence. We derive specific hypotheses and specific recommendations regarding sample selection and intervention duration for future investigations. Furthermore, the proposed leap task measurement is shown to be (a) robust enough to serve as a standard framework for studies in the music domain, yet (b) versatile enough to allow for a wide range of designs not previously investigated for music on a standardized basis.
PMID:25157223

  9. Deep Ion Torrent sequencing identifies soil fungal community shifts after frequent prescribed fires in a southeastern US forest ecosystem.

    PubMed

    Brown, Shawn P; Callaham, Mac A; Oliver, Alena K; Jumpponen, Ari

    2013-12-01

    Prescribed burning is a common management tool to control fuel loads and ground vegetation and to facilitate desirable game species. We evaluated soil fungal community responses to long-term prescribed fire treatments in a loblolly pine forest on the Piedmont of Georgia and utilized deep Internal Transcribed Spacer Region 1 (ITS1) amplicon sequencing afforded by the recent Ion Torrent Personal Genome Machine (PGM). These deep sequence data (19,000+ reads per sample after subsampling) indicate that frequent fires (3-year fire interval) shift soil fungal communities, whereas infrequent fires (6-year fire interval) permit system resetting to a state similar to that without prescribed fire. Furthermore, in nonmetric multidimensional scaling analyses, primarily ectomycorrhizal taxa were correlated with axes associated with long fire intervals, whereas soil saprobes tended to be correlated with frequent fire recurrence. We conclude that (1) multiplexed Ion Torrent PGM analyses allow deep, cost-effective sequencing of fungal communities but may suffer from short read lengths and inconsistent sequence quality adjacent to the sequencing adaptor; (2) frequent prescribed fires elicit a shift in soil fungal communities; and (3) such shifts do not occur when fire intervals are longer. Our results emphasize the general responsiveness of these forests to management, and the importance of fire return intervals in meeting management objectives. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  10. Uncertainty in Population Growth Rates: Determining Confidence Intervals from Point Estimates of Parameters

    PubMed Central

    Devenish Nelson, Eleanor S.; Harris, Stephen; Soulsbury, Carl D.; Richards, Shane A.; Stephens, Philip A.

    2010-01-01

    Background Demographic models are widely used in conservation and management, and their parameterisation often relies on data collected for other purposes. When underlying data lack clear indications of associated uncertainty, modellers often fail to account for that uncertainty in model outputs, such as estimates of population growth. Methodology/Principal Findings We applied a likelihood approach to infer uncertainty retrospectively from point estimates of vital rates. Combining this with resampling techniques and projection modelling, we show that confidence intervals for population growth estimates are easy to derive. We used similar techniques to examine the effects of sample size on uncertainty. Our approach is illustrated using data on the red fox, Vulpes vulpes, a predator of ecological and cultural importance, and the most widespread extant terrestrial mammal. We show that uncertainty surrounding estimated population growth rates can be high, even for relatively well-studied populations. Halving that uncertainty typically requires a quadrupling of sampling effort. Conclusions/Significance Our results compel caution when comparing demographic trends between populations without accounting for uncertainty. Our methods will be widely applicable to demographic studies of many species. PMID:21049049
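
    The resampling-and-projection approach described above can be sketched as follows. All vital rates and sample sizes here are illustrative placeholders, not the red fox data, and a simple two-stage projection matrix stands in for the paper's demographic model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical point estimates and the assumed sample sizes behind them:
    s_juv, s_ad, fec = 0.45, 0.70, 1.5   # stage survivals, per-adult fecundity
    n_juv, n_ad = 60, 80                 # animals monitored for each rate

    def lam(sj, sa, f):
        """Population growth rate: dominant eigenvalue of a 2-stage matrix."""
        A = np.array([[0.0, f], [sj, sa]])
        return max(abs(np.linalg.eigvals(A)))

    # Resample each vital rate from a likelihood-motivated distribution,
    # project, and read the CI off the spread of lambda across replicates.
    reps = [lam(rng.binomial(n_juv, s_juv) / n_juv,
                rng.binomial(n_ad, s_ad) / n_ad,
                rng.poisson(n_ad * fec) / n_ad)
            for _ in range(2000)]
    lo, hi = np.percentile(reps, [2.5, 97.5])
    print(f"lambda = {lam(s_juv, s_ad, fec):.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
    ```

    Because the standard error of each rate scales as 1/sqrt(n), halving the width of the resulting interval requires roughly quadrupling the sampling effort, as the abstract notes.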

  11. Robustness-Based Design Optimization Under Data Uncertainty

    NASA Technical Reports Server (NTRS)

    Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence

    2010-01-01

    This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed to un-nest the robustness-based design from the analysis of non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is only available as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to design solutions least sensitive to variations in the input random variables.

  12. A longitudinal investigation of repressive coping and ageing.

    PubMed

    Erskine, James; Kvavilashvili, Lia; Myers, Lynn; Leggett, Sarah; Davies, Steve; Hiskey, Syd; Hogg, Joanna; Yeo, Sophia; Georgiou, George

    2016-10-01

    Two studies investigated the possibility that repressive coping is more prevalent in older adults and that this represents a developmental progression rather than a cohort effect. Study 1 examined repressive coping and mental health cross-sectionally in young and old adults. Study 2 examined whether there was a developmental progression of repressive coping prevalence rates in a longitudinal sample of older adults. Study 1 compared younger adults (mean age 27.6 years) with older adults (mean age 74.2 years) on inventories of mental health and well-being and examined the prevalence of repressive coping in both samples. Study 2 re-tested a sample of older adults previously reported, following an interval of 7 years. Study 1: in line with previous research, older adults demonstrated greater psychological well-being and had a higher prevalence of repressive coping than younger adults (30% vs. 12%, respectively). Study 2: the data indicated that the prevalence of repressive coping rose from 41% at the first time of testing (2002) to 56.4% at the second testing interval (2009). These results suggest that repressive coping may increase across the lifespan in certain individuals and continue to increase throughout older adulthood. Furthermore, this increase in repressive coping with age appears to result in better well-being in those older adults who become repressive copers.

  13. Predicting Hydrologic Function With Aquatic Gene Fragments

    NASA Astrophysics Data System (ADS)

    Good, S. P.; URycki, D. R.; Crump, B. C.

    2018-03-01

    Recent advances in microbiology techniques, such as genetic sequencing, allow for rapid and cost-effective collection of large quantities of genetic information carried within water samples. Here we posit that the unique composition of aquatic DNA material within a water sample contains relevant information about hydrologic function at multiple temporal scales. In this study, machine learning was used to develop discharge prediction models trained on the relative abundance of bacterial taxa classified into operational taxonomic units (OTUs) based on 16S rRNA gene sequences from six large arctic rivers. We term this approach "genohydrology," and show that OTU relative abundances can be used to predict river discharge at monthly and longer timescales. Based on a single DNA sample from each river, the average Nash-Sutcliffe efficiency (NSE) for predicted mean monthly discharge values throughout the year was 0.84, while the NSE for predicted discharge values across different return intervals was 0.67. These are considerable improvements over predictions based only on the area-scaled mean specific discharge of five similar rivers, which had average NSE values of 0.64 and -0.32 for seasonal and recurrence interval discharge values, respectively. The genohydrology approach demonstrates that genetic diversity within the aquatic microbiome is a large and underutilized data resource with benefits for prediction of hydrologic function.
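
    The Nash-Sutcliffe efficiency (NSE) quoted above is a standard skill score for hydrologic predictions; a minimal sketch:

    ```python
    import numpy as np

    def nse(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
        1 is a perfect fit, 0 is no better than predicting the observed mean,
        and negative values are worse than the mean predictor."""
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        sse = np.sum((observed - simulated) ** 2)
        var = np.sum((observed - observed.mean()) ** 2)
        return 1.0 - sse / var
    ```

    This makes the comparison in the abstract concrete: an NSE of -0.32 for the area-scaled baseline means that prediction was worse than simply using the mean of the observed recurrence-interval discharges.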

  14. The size of a pilot study for a clinical trial should be calculated in relation to considerations of precision and efficiency.

    PubMed

    Sim, Julius; Lewis, Martyn

    2012-03-01

    To investigate methods to determine the size of a pilot study to inform a power calculation for a randomized controlled trial (RCT) using an interval/ratio outcome measure. Calculations based on confidence intervals (CIs) for the sample standard deviation (SD). Based on CIs for the sample SD, methods are demonstrated whereby (1) the observed SD can be adjusted to secure the desired level of statistical power in the main study with a specified level of confidence; (2) the sample for the main study, if calculated using the observed SD, can be adjusted, again to obtain the desired level of statistical power in the main study; (3) the power of the main study can be calculated for the situation in which the SD in the pilot study proves to be an underestimate of the true SD; and (4) an "efficient" pilot size can be determined to minimize the combined size of the pilot and main RCT. Trialists should calculate the appropriate size of a pilot study, just as they should the size of the main RCT, taking into account the twin needs to demonstrate efficiency in terms of recruitment and to produce precise estimates of treatment effect. Copyright © 2012 Elsevier Inc. All rights reserved.
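
    Method (1) above, inflating the observed pilot SD to an upper confidence limit before the power calculation, can be sketched as follows. The Wilson-Hilferty approximation stands in for an exact chi-square quantile so the sketch stays dependency-free, and the 80% confidence level is an illustrative choice, not the paper's.

    ```python
    from statistics import NormalDist

    def chi2_quantile(p, df):
        """Wilson-Hilferty approximation to the chi-square p-quantile."""
        z = NormalDist().inv_cdf(p)
        return df * (1 - 2 / (9 * df) + z * (2 / (9 * df)) ** 0.5) ** 3

    def inflated_sd(s_pilot, n_pilot, confidence=0.80):
        """Adjust the pilot SD upward so that, with the stated confidence,
        the true SD does not exceed the value used in the power calculation.
        Uses (n-1) * s^2 / sigma^2 ~ chi-square with n-1 df."""
        df = n_pilot - 1
        return s_pilot * (df / chi2_quantile(1 - confidence, df)) ** 0.5
    ```

    For example, a pilot SD of 10 from n = 20 inflates to roughly 11.8 at 80% confidence; larger pilots need less inflation, which is the precision-versus-efficiency trade-off the abstract describes.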

  15. Effects of land use and hydrogeology on the water quality of alluvial aquifers in eastern Iowa and southern Minnesota, 1997

    USGS Publications Warehouse

    Savoca, Mark E.; Sadorf, Eric M.; Linhart, S. Mike; Akers, Kim K.B.

    2000-01-01

    Factors other than land use may contribute to observed differences in water quality between and within agricultural and urban areas. Nitrate, atrazine, deethylatrazine, and deisopropylatrazine concentrations were significantly higher in shallow wells with sample intervals nearer the water table and in wells with thinner cumulative clay thickness above the sample intervals. These relations suggest that longer flow paths allow for greater residence time and increase opportunities for sorption, degradation, and dispersion, which may contribute to decreases in nutrient and pesticide concentrations with depth. Nitrogen speciation was influenced by redox conditions. Nitrate concentrations were significantly higher in ground water with dissolved-oxygen concentrations in excess of 0.5 milligram per liter. Ammonia concentrations were higher in ground water with dissolved-oxygen concentrations of 0.5 milligram per liter or less; however, this relation was not statistically significant. The amount of available organic matter may limit denitrification rates. Elevated nitrate concentrations (greater than 2.0 mg/L) were significantly related to lower dissolved organic carbon concentrations in water samples from both agricultural and urban areas. A similar relation between nitrate concentrations (in water) and organic carbon concentrations (in aquifer material) also was observed but was not statistically significant.

  16. The effect of preinjury sleep difficulties on neurocognitive impairment and symptoms after sport-related concussion.

    PubMed

    Sufrinko, Alicia; Pearce, Kelly; Elbin, R J; Covassin, Tracey; Johnson, Eric; Collins, Michael; Kontos, Anthony P

    2015-04-01

    Researchers have reported that sleep duration is positively related to baseline neurocognitive performance. However, researchers have yet to examine the effect of preinjury sleep difficulties on postconcussion impairments. To compare neurocognitive impairment and symptoms of athletes with preinjury sleep difficulties to those without after a sport-related concussion (SRC). Cohort study; Level of evidence, 3. The sample included 348 adolescent and adult athletes (age, mean ± SD, 17.43 ± 2.34 years) with a diagnosed SRC. The sample was divided into 2 groups: (1) 34 (10%) participants with preinjury sleep difficulties (sleeping less as well as having trouble falling asleep; SLEEP SX) and (2) 231 (66%) participants without preinjury sleep difficulties (CONTROL). The remaining 84 (24%) participants with minimal sleep difficulties (1 symptom) were excluded. Participants completed the Immediate Postconcussion Assessment and Cognitive Test (ImPACT) and Postconcussion Symptom Scale (PCSS) at baseline and 3 postinjury intervals (2, 5-7, and 10-14 days after injury). A series of repeated-measures analyses of covariance with Bonferroni correction, controlling for baseline non-sleep-related symptoms, was conducted to compare postinjury neurocognitive performance between groups. Follow-up exploratory t tests examined between-group differences at each time interval. A series of analyses of variance was used to examine total PCSS score, sleep-related, and non-sleep-related symptoms across time intervals between groups. Groups differed significantly in reaction time across postinjury intervals (P < .001), with the preinjury SLEEP SX group performing worse than controls at 5-7 days (mean ± SD, 0.70 ± 0.32 [SLEEP SX], 0.60 ± 0.14 [CONTROL]) and 10-14 days (0.61 ± 0.17 [SLEEP SX]; 0.57 ± 0.10 [CONTROL]) after injury.
Groups also differed significantly on verbal memory performance (P = .04), with the SLEEP SX (68.21 ± 18.64) group performing worse than the CONTROL group (76.76 ± 14.50) 2 days after injury. The SLEEP SX group reported higher total symptom (P = .02) and sleep-related symptom (P = .02) scores across postinjury time intervals. Preinjury sleep difficulties may exacerbate neurocognitive impairment and symptoms after concussion. The findings may help clinicians identify athletes who are at risk for worse impairments after a concussion due to preinjury sleep difficulties. © 2015 The Author(s).

  17. Comparative evaluation of human pulp tissue dissolution by different concentrations of chlorine dioxide, calcium hypochlorite and sodium hypochlorite: An in vitro study

    PubMed Central

    Taneja, Sonali; Mishra, Neha; Malik, Shubhra

    2014-01-01

    Introduction: Irrigation plays an indispensable role in the removal of tissue remnants and debris from the complicated root canal system. This study compared human pulp tissue dissolution by different concentrations of chlorine dioxide, calcium hypochlorite and sodium hypochlorite. Materials and Methods: Pulp tissue was standardized to a weight of 9 mg for each sample. In all, 60 samples were divided into 6 groups according to the irrigating solution used: 2.5% sodium hypochlorite (NaOCl), 5.25% NaOCl, 5% calcium hypochlorite (Ca(OCl)2), 10% Ca(OCl)2, 5% chlorine dioxide (ClO2) and 13% ClO2. Pulp tissue was placed in each test tube carrying irrigant of measured volume (5 ml) according to the specified subgroup time interval: 30 minutes (Subgroup A) and 60 minutes (Subgroup B). The solution from each sample test tube was filtered and left to dry overnight. The residual weight was calculated by the filtration method. Results: Mean tissue dissolution increased with time. Results showed 5.25% NaOCl to be most effective at both time intervals, followed by 2.5% NaOCl at 60 minutes, then 10% Ca(OCl)2 and 13% ClO2 at 60 minutes. The least tissue-dissolving ability was demonstrated by 5% Ca(OCl)2 and 5% ClO2 at 30 minutes. Distilled water showed no pulp tissue dissolution. Conclusion: Within the limitations of the study, NaOCl most efficiently dissolved the pulp tissue at both concentrations and at both time intervals. Mean tissue dissolution by Ca(OCl)2 and ClO2 gradually increased with time and with increasing concentration. PMID:25506141

  18. Terrestrial predator alarm vocalizations are a valid monitor of stress in captive brown capuchins (Cebus apella)

    USGS Publications Warehouse

    Boinski, S.; Gross, T.S.; Davis, J.K.

    1999-01-01

    The vocal behavior of captive animals is increasingly exploited as an index of well-being. Here we show that the terrestrial predator alarm (TPA) vocalization, a robust and acoustically distinctive anti-predation vocal response present in many mammal and bird species, offers useful information on the relative well-being and stress levels of captive animals. In a 16-week experiment evaluating the effects of varying levels of physical environmental enrichment (control < toys < foraging box < foraging box and toys) in the cages of eight singly housed adult male brown capuchins, we quantified the 1) emission rate of TPAs, 2) proportions of normal and abnormal behavior sample intervals, and 3) fecal and plasma cortisol levels. Variation in TPA emission across the experimental conditions was significant. We found significant reductions in the mean TPA production rate by the group in the enriched (toys, foraging box, and foraging box and toys) compared to the control condition; pre-and post-experimental conditions, however, did not differ from the control condition. Mean TPA production by the group was also significantly positively correlated to mean group levels of fecal cortisol and proportion of abnormal behavior sample intervals, and significantly negatively correlated to the average proportion of normal behavior sample intervals in the group. Based on group means, plasma cortisol levels were positively, but not significantly, related to increasing TPA rate. At the level of the responses of an individual subject, however, the covariation between the vocal and non-vocal behavioral measures and the cortisol assays seldom attained significance. Nevertheless, the direction of the relationships among these parameters within individual subjects typically mirrored those correlations based on group means. At both the group mean and individual levels, our results are consistent with the.

  19. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    PubMed

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
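
    The Nyquist condition for uniform sampling described above can be demonstrated with a toy complex FID; the dwell time and frequencies are illustrative, not from any particular experiment.

    ```python
    import numpy as np

    # Uniform sampling at interval dt only represents frequencies up to the
    # Nyquist limit 1/(2*dt); a component beyond it aliases to a lower one.
    dt = 0.001                      # 1 ms dwell time
    nyquist = 1 / (2 * dt)          # 500 Hz
    t = np.arange(256) * dt

    f_true = 700.0                  # Hz, deliberately above the Nyquist limit
    fid = np.exp(2j * np.pi * f_true * t)   # noiseless single-line "FID"

    spectrum = np.fft.fft(fid)
    freqs = np.fft.fftfreq(len(t), d=dt)
    f_observed = freqs[np.argmax(np.abs(spectrum))]
    print(f"Nyquist {nyquist:.0f} Hz; {f_true:.0f} Hz appears near {f_observed:.0f} Hz")
    ```

    The 700 Hz line folds back to about -300 Hz (700 - 1000 Hz): the DFT cannot distinguish the two. Nonuniform sampling schemes relax this rigid grid, which is what makes the non-Fourier reconstruction methods reviewed here necessary.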

  20. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    NASA Astrophysics Data System (ADS)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industries, a product may come from more than one production line, and comparative life tests across lines are then required. Sampling from the different production lines gives rise to a joint censoring scheme. In this article we consider the Pareto lifetime distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLEs) and the corresponding approximate confidence intervals, as well as bootstrap confidence intervals of the model parameters, are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo simulation results are presented to assess the performance of the proposed method.
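
    For a complete (uncensored) sample with known scale, the Pareto shape MLE and a percentile-bootstrap confidence interval can be sketched as follows; the jointly type-II censored likelihood treated in the paper is more elaborate, and all parameter values here are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def pareto_shape_mle(x, xm):
        """MLE of the Pareto shape alpha with known scale xm:
        alpha_hat = n / sum(log(x_i / xm))."""
        x = np.asarray(x, dtype=float)
        return len(x) / np.sum(np.log(x / xm))

    # Simulate a complete sample by inverse-CDF sampling: X = xm * U^(-1/alpha)
    xm, alpha = 1.0, 2.5
    data = xm * (1 - rng.random(200)) ** (-1 / alpha)

    # Percentile bootstrap: refit the MLE on resampled data sets.
    boot = [pareto_shape_mle(rng.choice(data, size=len(data), replace=True), xm)
            for _ in range(2000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"alpha_hat = {pareto_shape_mle(data, xm):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    ```

    The same resample-and-refit pattern carries over to the censored joint-sample setting once the appropriate likelihood is maximized in place of the closed-form estimator.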

  1. Reducing sampling error in faecal egg counts from black rhinoceros (Diceros bicornis).

    PubMed

    Stringer, Andrew P; Smith, Diane; Kerley, Graham I H; Linklater, Wayne L

    2014-04-01

    Faecal egg counts (FECs) are commonly used for the non-invasive assessment of parasite load within hosts. Sources of error, however, have been identified in laboratory techniques and sample storage. Here we focus on sampling error. We test whether a delay in sample collection can affect FECs, and estimate the number of samples needed to reliably assess mean parasite abundance within a host population. Two commonly found parasite eggs in black rhinoceros (Diceros bicornis) dung, strongyle-type nematodes and Anoplocephala gigantea, were used. We find that collection of dung from the centre of faecal boluses up to six hours after defecation does not affect FECs. More than nine samples were needed to greatly improve confidence intervals of the estimated mean parasite abundance within a host population. These results should improve the cost-effectiveness and efficiency of sampling regimes, and support the usefulness of FECs when used for the non-invasive assessment of parasite abundance in black rhinoceros populations.
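
    The sample-size finding can be illustrated with the usual 1/sqrt(n) scaling of a confidence-interval half-width; the SD value below is an assumed placeholder for between-sample variation in eggs-per-gram counts, not a figure from the study.

    ```python
    from statistics import NormalDist

    def ci_halfwidth(sd, n, confidence=0.95):
        """Normal-approximation half-width of a CI for a mean of n samples:
        z * sd / sqrt(n)."""
        z = NormalDist().inv_cdf(0.5 + confidence / 2)
        return z * sd / n ** 0.5

    # Assumed SD of 250 eggs per gram, purely for illustration:
    for n in (3, 9, 15):
        print(f"n = {n:2d}: half-width ~ {ci_halfwidth(250.0, n):.0f}")
    ```

    Because the half-width shrinks only as 1/sqrt(n), halving it requires roughly quadrupling the number of dung samples, which is why gains flatten out beyond the nine or so samples the study recommends.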

  2. Study of pentoxifylline effects on motility and viability of spermatozoa from infertile asthenozoospermic males

    PubMed Central

    Ghasemzadeh, Aliye; Karkon-Shayan, Farid; Yousefzadeh, Solmaz; Naghavi-Behzad, Mohammad; Hamdi, Kobra

    2016-01-01

    Background: The quality of semen is one of the major parameters in male infertility. Pentoxifylline, a methylxanthine derivative, is an agent primarily used in the treatment of intermittent claudication and other vascular disorders. Studies have shown that pentoxifylline enhances the quality and quantity of sperm. In this study, we investigated the in vitro effects of pentoxifylline on the viability and motility of spermatozoa in samples from infertile oligoasthenozoospermic males. Materials and Methods: In this observer-blinded clinical trial, semen samples from 25 infertile oligoasthenozoospermic males were collected at Alzahra Educational Medical Center of Tabriz University of Medical Sciences from August 2010 to August 2012. After isolation of spermatozoa by the swim-up method, they were randomized into four groups in ISM1 medium: controls, treated normally; Group 1, treated with pentoxifylline at a dose of 50 μg/ml; Group 2, treated with pentoxifylline at a dose of 100 μg/ml; and Group 3, treated with pentoxifylline at a dose of 200 μg/ml. Sperm viability and motility were compared among the groups at 45 min, 24 h, 36 h, and 48 h. Results: Mean percentages of live sperm were 98.40%, 51.40%, 20.60%, and 6.00% in the control group and 98.40%, 69.20%, 38.60%, and 14.60% in Group 3 at the respective intervals. The mean percentage decrease of live sperm was significantly lower in Group 3 compared with the other groups (P = 0.01). Mean percentages of motile sperm were 54%, 8.40%, 2.80%, and 0% in the control group, and 54%, 16%, 4.80%, and 1.40% in Group 3 at the respective intervals. There was no significant difference between the four groups in this regard (P = 0.19). Conclusion: Pentoxifylline can enhance the viability of sperm from infertile oligoasthenozoospermic males with no significant effect on motility. PMID:27942099

  3. Study of pentoxifylline effects on motility and viability of spermatozoa from infertile asthenozoospermic males.

    PubMed

    Ghasemzadeh, Aliye; Karkon-Shayan, Farid; Yousefzadeh, Solmaz; Naghavi-Behzad, Mohammad; Hamdi, Kobra

    2016-01-01

    The quality of semen is one of the major parameters in male infertility. Pentoxifylline, a methylxanthine derivative, is an agent primarily used in the treatment of intermittent claudication and other vascular disorders. Studies have shown that pentoxifylline enhances the quality and quantity of sperm. In this study, we investigated the in vitro effects of pentoxifylline on the viability and motility of spermatozoa in samples from infertile oligoasthenozoospermic males. In this observer-blinded clinical trial, semen samples from 25 infertile oligoasthenozoospermic males were collected at Alzahra Educational Medical Center of Tabriz University of Medical Sciences from August 2010 to August 2012. After isolation of spermatozoa by the swim-up method, they were randomized into four groups in ISM1 medium: controls, treated normally; Group 1, treated with pentoxifylline at a dose of 50 μg/ml; Group 2, treated with pentoxifylline at a dose of 100 μg/ml; and Group 3, treated with pentoxifylline at a dose of 200 μg/ml. Sperm viability and motility were compared among the groups at 45 min, 24 h, 36 h, and 48 h. Mean percentages of live sperm were 98.40%, 51.40%, 20.60%, and 6.00% in the control group and 98.40%, 69.20%, 38.60%, and 14.60% in Group 3 at the respective intervals. The mean percentage decrease of live sperm was significantly lower in Group 3 compared with the other groups (P = 0.01). Mean percentages of motile sperm were 54%, 8.40%, 2.80%, and 0% in the control group, and 54%, 16%, 4.80%, and 1.40% in Group 3 at the respective intervals. There was no significant difference between the four groups in this regard (P = 0.19). Pentoxifylline can enhance the viability of sperm from infertile oligoasthenozoospermic males with no significant effect on motility.

  4. The effects of time of disease occurrence, milk yield, and body condition on fertility of dairy cows.

    PubMed

    Loeffler, S H; de Vries, M J; Schukken, Y H

    1999-12-01

    The associations between occurrence of diseases, milk yield, and body condition score and conception risk after first artificial insemination (AI) were analyzed in an observational study on a convenience sample of 43 farms participating in a herd health program. Data were taken from 9369 lactations of 4382 cows inseminated between 20 and 180 d in milk from 1990 to 1996. Two logistic regression models, one containing data from all lactations and one containing a subset of 1762 lactations with body condition scoring, were used to determine pregnancy risk at first AI. Herd deviation in test-day milk yield, body condition score loss, and milk fat to protein ratio changes in early lactation were significant predictors of pregnancy risk, independent of disease, days in milk, farm, and seasonal factors. Three different methods of disease parameterization (incidence rates, binomial classes dependent on the interval in days between last occurrence and AI, and a linear variable weighted for this interval) produced similar results. Metritis, cystic ovarian disease, lameness, and mastitis gave odds ratios for pregnancy risk ranging from 0.35 to 1.15, largely dependent on the interval in days from final disease occurrence to first AI. Displaced abomasum, milk fever, and retained fetal membranes resulted in odds ratios for pregnancy risk of 0.25, 0.85, and 0.55, respectively. These diseases showed little relationship between fertility and the number of days since last occurrence. Results of this study confirm the negative effects of milk yield, body condition score loss, and disease on dairy cow fertility. The effects of some diseases on first-service conception were strongly dependent on the interval since last disease occurrence. This was especially true for clinical mastitis, which had an extremely weak effect on conception when it occurred prior to AI but was associated with a >50% reduction in pregnancy risk when it occurred in the 3 wk directly after AI.

  5. Chirp Scaling Algorithms for SAR Processing

    NASA Technical Reports Server (NTRS)

    Jin, M.; Cheng, T.; Chen, M.

    1993-01-01

    The chirp scaling SAR processing algorithm is both accurate and efficient. Successful implementation requires proper selection of the interval of output samples, which is a function of the chirp interval, signal sampling rate, and signal bandwidth. Analysis indicates that for both airborne and spaceborne SAR applications in the slant range domain, a linear chirp scaling is sufficient. To perform a nonlinear interpolation, such as outputting ground range SAR images, one can use the nonlinear chirp scaling interpolator presented in this paper.

  6. Estimating reliable paediatric reference intervals in clinical chemistry and haematology.

    PubMed

    Ridefelt, Peter; Hellberg, Dan; Aldrimer, Mattias; Gustafsson, Jan

    2014-01-01

    Very few high-quality studies on paediatric reference intervals for general clinical chemistry and haematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The present review summarises current reference interval studies for common clinical chemistry and haematology analyses. ©2013 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  7. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    USGS Publications Warehouse

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCPs from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
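    The key point, that the variance of a mean over uncorrelated samples falls as 1/n, can be illustrated numerically. The sketch below uses hypothetical discharge statistics (mean 100, standard deviation 10) and purely uncorrelated Gaussian noise, which is the assumption under which the paper's variance equation applies; quadrupling the number of samples (i.e., the exposure time) should cut the variance of the estimated mean by roughly a factor of four.

```python
import random

random.seed(0)
true_q, sigma = 100.0, 10.0  # hypothetical mean discharge and turbulence std dev

def mean_discharge(n):
    # time-averaged discharge estimate from n uncorrelated instantaneous samples
    return sum(random.gauss(true_q, sigma) for _ in range(n)) / n

def empirical_var(n, reps=2000):
    # Monte Carlo estimate of the variance of the n-sample mean
    means = [mean_discharge(n) for _ in range(reps)]
    m = sum(means) / reps
    return sum((x - m) ** 2 for x in means) / reps

# for uncorrelated samples Var(mean) = sigma^2 / n, so quadrupling the
# exposure time should cut the variance of the estimate roughly fourfold
v10, v40 = empirical_var(10), empirical_var(40)
print(v10, v40, v10 / v40)
```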

  8. Plasma biochemical and PCV ranges for healthy, wild, immature hawksbill (Eretmochelys imbricata) sea turtles.

    PubMed

    Whiting, S D; Guinea, M L; Fomiatti, K; Flint, M; Limpus, C J

    2014-06-14

    In recent years, the use of blood chemistry as a diagnostic tool for sea turtles has been demonstrated, but much of its effectiveness relies on reference intervals. The first comprehensive blood chemistry values for healthy wild hawksbill (Eretmochelys imbricata) sea turtles are presented. Nineteen blood chemistry analytes and packed cell volume were analysed for 40 clinically healthy juvenile hawksbill sea turtles captured from a rocky reef habitat in northern Australia. We used four statistical approaches to calculate reference intervals and to investigate their use with non-normal distributions and small sample sizes, and to compare upper and lower limits between methods. Eleven analytes were correlated with curved carapace length indicating that body size should be considered when designing future studies and interpreting analyte values. British Veterinary Association.

  9. Lead zirconate titanate-nickel zinc ferrite thick-film composites: preparation by the screen printing technique and magnetoelectric properties

    NASA Astrophysics Data System (ADS)

    Bush, A. A.; Shkuratov, V. Ya.; Chernykh, I. A.; Fetisov, Y. K.

    2010-03-01

    Layered thick-film composites containing one lead zirconate titanate (PZT) layer, one nickel zinc ferrite (NZF) layer, two PZT-NZF layers, or three PZT-NZF-PZT layers each 40-50 μm thick are prepared. The layers are applied by screen printing on a ceramic aluminum oxide substrate with a preformed contact (conducting) layer. The dielectric properties of the composites are studied in the temperature interval 80-900 K and the frequency interval 25 Hz-1 MHz. Polarized samples exhibit piezoelectric, pyroelectric, and magnetoelectric effects. In tangentially magnetized two- and three-layer composites, the magnetoelectric conversion factor equals 57 kV/(m T) at low frequencies and reaches 2000 kV/(m T) at the mechanical resonance frequency.

  10. Effect of dipping in pomegranate (Punica granatum) fruit juice phenolic solution on the shelf life of chicken meat under refrigerated storage (4°C).

    PubMed

    Vaithiyanathan, S; Naveena, B M; Muthukumar, M; Girish, P S; Kondaiah, N

    2011-07-01

    An experiment was conducted to evaluate the effect of dipping in pomegranate fruit juice phenolics (PFJP) solution on the shelf life of chicken meat held under refrigerated storage at 4°C. Breast muscle obtained from spent hens was dipped (1:2 w/v; muscle:liquid) in sterile water or in sterile water with 0.02% (v/v) PFJP, packed, and stored at 4°C for 28 days, and samples were analyzed at 2-day intervals. Thiobarbituric acid reactive substance values were lower in samples treated with PFJP. Total sulfhydryl and protein-bound sulfhydryl contents were higher in samples treated with PFJP. Microbial quality evaluation showed that aerobic and psychrotrophic counts were higher in samples treated without PFJP. Sensory evaluation revealed that the acceptability of samples treated without PFJP decreased by the 12th day of storage. It is concluded that spent hen breast meat samples dipped in 0.02% PFJP showed reduced protein oxidation, inhibited microbial growth, and remained sensorially acceptable for up to 12 days of refrigerated storage at 4°C. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Effect of exercise intensity on circulating microparticles in men and women.

    PubMed

    Shill, Daniel D; Lansford, Kasey A; Hempel, Hannah K; Call, Jarrod A; Murrow, Jonathan R; Jenkins, Nathan T

    2018-05-01

    What is the central question of this study? What is the effect of exercise intensity on circulating microparticle populations in young, healthy men and women? What is the main finding and its importance? Acute, moderate-intensity continuous exercise and high-intensity interval exercise altered distinct microparticle populations during and after exercise, in addition to a sex-specific response in CD62E+ microparticles. The microparticles studied contribute to cardiovascular disease progression, regulate vascular function and facilitate new blood vessel formation. Thus, characterizing the impact of intensity on exercise-induced microparticle responses advances our understanding of potential mechanisms underlying the beneficial vascular adaptations to exercise. Circulating microparticles (MPs) are biological vectors of information within the cardiovascular system that elicit both deleterious and beneficial effects on the vasculature. Acute exercise has been shown to alter MP concentrations, probably through a shear stress-dependent mechanism, but evidence is limited. Therefore, we investigated the effect of exercise intensity on plasma levels of CD34+ and CD62E+ MPs in young, healthy men and women. Blood samples were collected before, during and after two energy-matched bouts of acute treadmill exercise: interval exercise (10 × 1 min intervals at ∼95% of maximal oxygen uptake, V̇O2max) and continuous exercise (65% V̇O2max). Continuous exercise, but not interval exercise, reduced CD62E+ MP concentrations in men and women by 18% immediately after exercise (from 914.5 ± 589.6 to 754.4 ± 390.5 MPs μl−1; P < 0.05), suggesting that mechanisms underlying exercise-induced CD62E+ MP dynamics are intensity dependent. Furthermore, continuous exercise reduced CD62E+ MPs in women by 19% (from 1030.6 ± 688.1 to 829.9 ± 435.4 MPs μl−1; P < 0.05), but not in men. Although interval exercise did not alter CD62E+ MPs per se, the concentrations after interval exercise were higher than those observed after continuous exercise (P < 0.05). Conversely, CD34+ MPs did not fluctuate in response to short-duration acute continuous or interval exercise in men or women. Our results suggest that exercise-induced MP alterations are intensity dependent and sex specific and impact MP populations differentially. © 2018 The Authors. Experimental Physiology © 2018 The Physiological Society.

  12. A Three Month Comparative Evaluation of the Effect of Different Surface Treatment Agents on the Surface Integrity and Softness of Acrylic based Soft Liner: An In vivo Study

    PubMed Central

    Mahajan, Neerja; Naveen, Y. G.; Sethuraman, Rajesh

    2017-01-01

    Introduction: Acrylic based soft liners are cost effective, yet are inferior in durability as compared to silicone based liners. Hence, this study was conducted to evaluate if the softness and surface integrity of acrylic based soft liner can be maintained by using different surface treatment agents. Aim: To comparatively evaluate the effects of Varnish, Monopoly and Kregard surface treatment agents on the surface integrity and softness of acrylic based soft liner at baseline, at one month and after three months. Materials and Methods: A total of 37 participants who required conventional maxillary dentures were selected according to the determined inclusion and exclusion criteria of the study. On the denture bearing surface of the maxillary denture, eight palatal recesses (5 mm x 3 mm) were made and filled with acrylic based soft liner (Permasoft). The soft liners in these recesses were given surface treatment and divided into control (uncoated), Varnish, Monopoly and Kregard groups. Hardness and surface integrity were evaluated with a Shore A Durometer and Scanning Electron Microscope (SEM), respectively, at baseline, one month and three months. Surface integrity between groups was compared using the Kruskal-Wallis test. Intergroup comparison for hardness was done using ANOVA and Tukey's HSD post-hoc tests. Results: Amongst all the groups tested, surface integrity was maintained in the Kregard group, as compared to the control, Varnish and Monopoly groups, at all three time intervals (p < 0.001). Kregard-treated samples also demonstrated significantly higher softness at all the time intervals (p < 0.001). Conclusion: Surface treatment with Kregard demonstrated better surface integrity and softness at all the time intervals. PMID:29207842

  13. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
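    A minimal sketch of the nonparametric bootstrap inference described here, using hypothetical tumor-volume data: each group is resampled with replacement, the T/C ratio of group means is recomputed, and a percentile confidence interval is read off. The cutoff of 0.40 mentioned in the comment is one common arbitrary rating threshold, not a value taken from the paper.

```python
import random

random.seed(1)
# hypothetical final tumor volumes (mm^3) for treated (T) and control (C) groups
treated = [120, 95, 140, 110, 88, 132, 101, 117]
control = [300, 275, 320, 290, 310, 265, 295, 305]

def tc_ratio(t, c):
    # T/C ratio of group mean tumor volumes
    return (sum(t) / len(t)) / (sum(c) / len(c))

obs = tc_ratio(treated, control)

# nonparametric bootstrap: resample each group independently, recompute T/C
B = 2000
boot = sorted(
    tc_ratio([random.choice(treated) for _ in treated],
             [random.choice(control) for _ in control])
    for _ in range(B)
)
lo, hi = boot[int(0.025 * B)], boot[int(0.975 * B)]
# a T/C ratio <= 0.40 is a common (arbitrary) activity cutoff; the interval
# lets that call be made with a measure of uncertainty
print(round(obs, 3), (round(lo, 3), round(hi, 3)))
```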

  14. Hemolymph amino acid analysis of individual Drosophila larvae.

    PubMed

    Piyankarage, Sujeewa C; Augustin, Hrvoje; Grosjean, Yael; Featherstone, David E; Shippy, Scott A

    2008-02-15

    One of the most widely used transgenic animal models in biology is Drosophila melanogaster, the fruit fly. Chemical analysis of this exceedingly small organism is usually accomplished by studying populations to attain sample volumes suitable for standard analysis methods. This paper describes a direct sampling technique capable of obtaining 50-300 nL of hemolymph from individual Drosophila larvae. Hemolymph sampling performed under mineral oil and in air at 30 s intervals up to 120 s after piercing larvae revealed that the effect of evaporation on amino acid concentrations is insignificant when the sample is collected within 60 s. Qualitative and quantitative amino acid analyses of the obtained hemolymph were carried out in two optimized buffer conditions by capillary electrophoresis with laser-induced fluorescence detection after derivatizing with fluorescamine. Thirteen amino acids were identified from individual hemolymph samples of both wild-type (WT) control and genderblind (gb) mutant larvae. The levels of glutamine, glutamate, and taurine in the gb hemolymph were significantly lower, at 35%, 38%, and 57% of WT levels, respectively. The developed technique, which samples only the hemolymph fluid, is efficient and yields accurate organism-level chemical information while minimizing the errors associated with possible sample contamination, estimation, and evaporation in traditional hemolymph-sampling techniques.

  15. Combined influence of adjuvant therapy and interval after surgery on peripheral CD4+ T lymphocytes in patients with esophageal squamous cell carcinoma

    PubMed Central

    LING, YANG; FAN, LIEYING; DONG, CHUNLEI; ZHU, JING; LIU, YONGPING; NI, YAN; ZHU, CHANGTAI; ZHANG, CHANGSONG

    2010-01-01

    The aim of this study was to investigate possible differences in cellular immunity between chemo- and/or radiotherapy groups during a long interval after surgery in esophageal squamous cell carcinoma (ESCC) patients. Cellular immunity was assessed as peripheral lymphocyte subsets in response to chemotherapy (CT), radiotherapy (RT) and CT+RT by flow cytometric analysis. There were 139 blood samples obtained at different time points relative to surgery from 73 patients with ESCC. The changes in the absolute and relative proportions of lymphocyte phenotypes were significant among the adjuvant therapy groups. There were significant differences in the absolute counts of CD4+ and CD8+ T cells among the interval groups, and a lower CD4/CD8 ratio was found in patients following a prolonged interval. RT alone had a profound effect on the absolute counts of CD3+, CD4+ and CD8+ T cells compared with the other groups. CD4+ T cells exhibited a decreasing trend during a long interval, leading to a prolonged T-cell imbalance after surgery. Univariate analysis revealed that the interaction of the type of adjuvant therapy and the interval after surgery was correlated only with the percentage of CD4+ T cells. The percentage of CD4+ T cells can be used as an indicator of the cellular immunity after surgery in ESCC patients. However, natural killer cells consistently remained suppressed in ESCC patients following adjuvant therapy after surgery. These findings confirm an interaction between adjuvant therapy and the interval after surgery on peripheral CD4+ T cells, and implies that adjuvant therapy may have selective influence on the cellular immunity of ESCC patients after surgery. PMID:23136603

  16. Sampling theorem for geometric moment determination and its application to a laser beam position detector.

    PubMed

    Loce, R P; Jodoin, R E

    1990-09-10

    Using the tools of Fourier analysis, a sampling requirement is derived that assures that sufficient information is contained within the samples of a distribution to calculate accurately geometric moments of that distribution. The derivation follows the standard textbook derivation of the Whittaker-Shannon sampling theorem, which is used for reconstruction, but further insight leads to a coarser minimum sampling interval for moment determination. The need for fewer samples to determine moments agrees with intuition since less information should be required to determine a characteristic of a distribution compared with that required to construct the distribution. A formula for calculation of the moments from these samples is also derived. A numerical analysis is performed to quantify the accuracy of the calculated first moment for practical nonideal sampling conditions. The theory is applied to a high speed laser beam position detector, which uses the normalized first moment to measure raster line positional accuracy in a laser printer. The effects of the laser irradiance profile, sampling aperture, number of samples acquired, quantization, and noise are taken into account.
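    The normalized first moment from samples can be illustrated directly. The sketch below samples a hypothetical Gaussian beam profile (centre 3.7, 1/e² radius 1.0) at a 0.5 interval and recovers the centroid from the samples alone; the numbers are illustrative, not taken from the paper.

```python
import math

# hypothetical Gaussian beam irradiance profile centred at x0 with 1/e^2 radius w
x0, w = 3.7, 1.0

def irradiance(x):
    return math.exp(-2.0 * (x - x0) ** 2 / w ** 2)

# sample the profile at a uniform interval covering its support, then form the
# normalized first moment (centroid) from the samples
dx = 0.5
xs = [i * dx for i in range(-10, 30)]
samples = [irradiance(x) for x in xs]
centroid = sum(x * s for x, s in zip(xs, samples)) / sum(samples)
print(round(centroid, 4))
```

For a smooth profile sampled at an adequate interval, the discrete moment reproduces the continuous centroid to high accuracy, consistent with the paper's point that moment determination tolerates coarser sampling than reconstruction.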

  17. CLSI-based transference and verification of CALIPER pediatric reference intervals for 29 Ortho VITROS 5600 chemistry assays.

    PubMed

    Higgins, Victoria; Truong, Dorothy; Woroch, Amy; Chan, Man Khun; Tahmasebi, Houman; Adeli, Khosrow

    2018-03-01

    Evidence-based reference intervals (RIs) are essential to accurately interpret pediatric laboratory test results. To fill gaps in pediatric RIs, the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) project developed an age- and sex-specific pediatric RI database based on healthy pediatric subjects. Originally established for Abbott ARCHITECT assays, CALIPER RIs were transferred to assays on Beckman, Roche, Siemens, and Ortho analytical platforms. This study provides transferred reference intervals for 29 biochemical assays for the Ortho VITROS 5600 Chemistry System (Ortho). Based on Clinical Laboratory Standards Institute (CLSI) guidelines, a method comparison analysis was performed by measuring approximately 200 patient serum samples using Abbott and Ortho assays. The equation of the line of best fit was calculated and the appropriateness of the linear model was assessed. This equation was used to transfer RIs from Abbott to Ortho assays. Transferred RIs were verified using 84 healthy pediatric serum samples from the CALIPER cohort. RIs for most chemistry analytes transferred successfully from Abbott to Ortho assays. Calcium and CO2 did not meet statistical criteria for transference (r² < 0.70). Of the 32 transferred reference intervals, 29 were successfully verified, with approximately 90% of results from reference samples falling within the transferred confidence limits. Transferred RIs for total bilirubin, magnesium, and LDH did not meet verification criteria and are not reported. This study broadens the utility of the CALIPER pediatric RI database to laboratories using Ortho VITROS 5600 biochemical assays. Clinical laboratories should verify CALIPER reference intervals for their specific analytical platform and local population, as recommended by CLSI. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
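    The transference step can be sketched as a simple linear regression between paired measurements on the two platforms; the fitted slope and intercept then map the reference limits into the receiving platform's units. The data below are simulated with an assumed proportional and constant bias (slope 1.05, intercept 2.0), purely for illustration; a CLSI-style method comparison involves additional checks not shown here.

```python
import random

random.seed(7)
# hypothetical paired measurements of one analyte on the reference platform
# (x, "Abbott") and the receiving platform (y, "Ortho") for 200 patient samples
abbott = [random.uniform(10, 100) for _ in range(200)]
ortho = [1.05 * a + 2.0 + random.gauss(0, 1.5) for a in abbott]  # assumed bias

# ordinary least-squares fit: ortho = slope * abbott + intercept
n = len(abbott)
mx = sum(abbott) / n
my = sum(ortho) / n
sxx = sum((a - mx) ** 2 for a in abbott)
sxy = sum((a - mx) * (o - my) for a, o in zip(abbott, ortho))
slope = sxy / sxx
intercept = my - slope * mx

# transfer an existing reference interval from Abbott to Ortho units
ri_low, ri_high = 20.0, 80.0  # hypothetical reference limits
transferred = (slope * ri_low + intercept, slope * ri_high + intercept)
print(round(slope, 3), round(intercept, 3), transferred)
```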

  18. An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.

    ERIC Educational Resources Information Center

    Capraro, Mary Margaret

    This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…

  19. A combined approach of generalized additive model and bootstrap with small sample sets for fault diagnosis in fermentation process of glutamate.

    PubMed

    Liu, Chunbo; Pan, Feng; Li, Yun

    2016-07-29

    Glutamate is of great importance in the food and pharmaceutical industries, yet there is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, a statistical approach based on the generalized additive model (GAM) and the bootstrap has not been used for fault diagnosis in fermentation processes, much less in the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6% of the variance in glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95% confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95% confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of a fault in the fermentation process, but also the end of the fault when the fermentation conditions returned to normal. The proposed approach required only a small sample set from normal fermentation experiments to establish the model, and then only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
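    The fault-definition logic can be sketched with stdlib Python only. Below, a straight line stands in for the fitted GAM (an assumption made purely for brevity), residuals from simulated normal-run data are bootstrapped to form an empirical 95% band, and a fault is flagged when an observation falls outside the band. All data and parameters are hypothetical.

```python
import random

random.seed(3)

# hypothetical "normal" fermentation runs: glutamate concentration vs time,
# a smooth trend plus noise (a straight line stands in for the fitted GAM)
times = [0.5 * t for t in range(60)]
normal = [2.0 + 0.8 * t + random.gauss(0, 0.4) for t in times]

# least-squares line fitted to the normal-run data
n = len(times)
mx = sum(times) / n
my = sum(normal) / n
slope = sum((t - mx) * (y - my) for t, y in zip(times, normal)) / \
        sum((t - mx) ** 2 for t in times)
intercept = my - slope * mx

def fit(t):
    return slope * t + intercept

# bootstrap the residuals to get an empirical 95% band around the fit
residuals = [y - fit(t) for t, y in zip(times, normal)]
B = 1000
boot_lo, boot_hi = [], []
for _ in range(B):
    rs = sorted(random.choice(residuals) for _ in residuals)
    boot_lo.append(rs[int(0.025 * n)])
    boot_hi.append(rs[int(0.975 * n)])
lo = sum(boot_lo) / B
hi = sum(boot_hi) / B

def is_fault(t, observed):
    # a fault is an observation falling outside the 95% band around the fit
    return not (fit(t) + lo <= observed <= fit(t) + hi)

print(is_fault(10.0, fit(10.0)), is_fault(10.0, fit(10.0) - 5.0))
```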

  20. Sampling Development

    ERIC Educational Resources Information Center

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  1. Randomized comparison of oral misoprostol and oxytocin for labor induction in term prelabor membrane rupture.

    PubMed

    Butt, K D; Bennett, K A; Crane, J M; Hutchens, D; Young, D C

    1999-12-01

    To compare labor induction intervals between oral misoprostol and intravenous oxytocin in women who present at term with premature rupture of membranes. One hundred eight women were randomly assigned to misoprostol 50 microg orally every 4 hours as needed or intravenous oxytocin. The primary outcome measure was time from induction to vaginal delivery. Sample size was calculated using a two-tailed alpha of 0.05 and power of 80%. Baseline demographic data, including maternal age, gestation, parity, Bishop score, birth weight, and group B streptococcal status, were similar. The mean time ± standard deviation to vaginal birth was 720 ± 382 minutes with oral misoprostol compared with 501 ± 389 minutes with oxytocin (P = .007). The durations of the first, second, and third stages of labor were similar. There were no differences in maternal secondary outcomes, including cesarean birth (eight and seven, respectively), infection, maternal satisfaction with labor, epidural use, perineal trauma, manual placental removal, or gastrointestinal side effects. Neonatal outcomes, including cord pH, Apgar scores, infection, and admission to the neonatal intensive care unit, were not different. Although labor induction with oral misoprostol was effective, oxytocin resulted in a shorter induction-to-delivery interval. Active labor intervals and other maternal and neonatal outcomes were similar.
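    The sample-size calculation mentioned (two-tailed alpha of 0.05, power of 80%) can be sketched with the standard normal-approximation formula for comparing two means. The effect size and standard deviation below (a 180-minute difference, SD of 390 minutes) are illustrative assumptions, not the values used by the trial.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    # normal-approximation sample size for comparing two independent means:
    # n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2 per group
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)
    z_b = z.inv_cdf(power)
    return math.ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

# e.g., detecting a hypothetical 180-minute difference in the
# induction-to-delivery interval, assuming a standard deviation of 390 minutes
print(n_per_group(delta=180, sigma=390))  # → 74 per group
```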

  2. Surface modification of epoxy resin using He/CF4 atmospheric pressure plasma jet for flashover withstanding characteristics improvement in vacuum

    NASA Astrophysics Data System (ADS)

    Chen, Sile; Wang, Shuai; Wang, Yibo; Guo, Baohong; Li, Guoqiang; Chang, Zhengshi; Zhang, Guan-Jun

    2017-08-01

    To enhance the surface electric withstanding strength of insulating materials, epoxy resin (EP) samples were treated by an atmospheric pressure plasma jet (APPJ) for different treatment times from 0 to 300 s. Helium (He) and tetrafluoromethane (CF4) mixtures were used as working gases, with the CF4 concentration ranging from 0% to 5%; when the CF4 concentration is ∼3%, the APPJ exhibits an optimal steady state. The flashover withstanding characteristics of the modified EP in vacuum are greatly improved under appropriate APPJ treatment conditions. The surface properties of the EP samples were evaluated by surface roughness, scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS) and water contact angle measurements. It is considered that both physical and chemical effects lead to the enhancement of flashover strength: the physical effect is reflected in the increase of surface roughness, while the chemical effect is reflected in the grafting of fluorine groups.

  3. Detection in fixed and random noise in foveal and parafoveal vision explained by template learning

    NASA Technical Reports Server (NTRS)

    Beard, B. L.; Ahumada, A. J. Jr; Watson, A. B. (Principal Investigator)

    1999-01-01

    Foveal and parafoveal contrast detection thresholds for Gabor and checkerboard targets were measured in white noise by means of a two-interval forced-choice paradigm. Two white-noise conditions were used: fixed and twin. In the fixed noise condition a single noise sample was presented in both intervals of all the trials. In the twin noise condition the same noise sample was used in the two intervals of a trial, but a new sample was generated for each trial. Fixed noise conditions usually resulted in lower thresholds than twin noise. Template learning models are presented that attribute this advantage of fixed over twin noise either to fixed memory templates' reducing uncertainty by incorporation of the noise or to the introduction, by the learning process itself, of more variability in the twin noise condition. Quantitative predictions of the template learning process show that it contributes to the accelerating nonlinear increase in performance with signal amplitude at low signal-to-noise ratios.

  4. Confidence limit calculation for antidotal potency ratio derived from lethal dose 50

    PubMed Central

    Manage, Ananda; Petrikovics, Ilona

    2013-01-01

    AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method, introduced by Efron, can easily be adapted to construct confidence intervals in situations like this. The bootstrap is a resampling method in which bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require the sampling distribution of the antidotal potency ratio. This can be a substantial help for toxicologists, who are directed to employ the Dixon up-and-down method, with its lower number of animals, to determine lethal dose 50 values for characterizing the investigated toxic molecules and, eventually, the antidotal protection afforded by test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes the calculation easy to perform in most programming software packages. PMID:25237618
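The percentile-bootstrap calculation described above can be sketched in a few lines. The per-animal lethal-dose values below are hypothetical placeholders, and the ratio of group means is used as one simple estimator of the antidotal potency ratio:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lethal-dose estimates (mg/kg), for illustration only.
ld50_control = np.array([2.1, 2.4, 1.9, 2.2, 2.0, 2.3])
ld50_antidote = np.array([5.8, 6.1, 5.5, 6.4, 5.9, 6.0])

def bootstrap_ci_ratio(a, b, n_boot=10_000, alpha=0.05):
    """Percentile-bootstrap CI for the ratio mean(a)/mean(b)."""
    ratios = np.empty(n_boot)
    for i in range(n_boot):
        ra = rng.choice(a, size=a.size, replace=True)  # resample group a
        rb = rng.choice(b, size=b.size, replace=True)  # resample group b
        ratios[i] = ra.mean() / rb.mean()
    return tuple(np.quantile(ratios, [alpha / 2, 1 - alpha / 2]))

lo, hi = bootstrap_ci_ratio(ld50_antidote, ld50_control)
```

No assumption about the sampling distribution of the ratio is needed; the quantiles of the resampled ratios serve directly as the interval endpoints.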

  5. The effect of triazole induced photosynthetic pigments and biochemical constituents of Zea mays L. (Maize) under drought stress

    NASA Astrophysics Data System (ADS)

    Rajasekar, Mahalingam; Rabert, Gabriel Amalan; Manivannan, Paramasivam

    2016-06-01

    In this investigation, a pot culture experiment was carried out to estimate the ameliorating effect of the triazole compounds Triadimefon (TDM), Tebuconazole (TBZ), and Propiconazole (PCZ) on drought stress, photosynthetic pigments, and biochemical constituents of Zea mays L. (Maize). From 30 days after sowing (DAS), the plants were subjected to 4-day interval drought (DID) stress, alone or combined with TDM at 15 mg l-1, TBZ at 10 mg l-1, or PCZ at 15 mg l-1. Irrigation at a 1-day interval was kept as the control. Irrigation was performed on alternate days. The plant samples were collected on 40, 50, and 60 DAS and separated into root, stem, and leaf for estimating the photosynthetic pigments and biochemical constituents. Drought and drought-with-triazole treatments increased the glycine betaine content, whereas the protein and pigment contents (chlorophyll-a, chlorophyll-b, total chlorophyll, carotenoid, and anthocyanin) decreased compared to the control. The triazole treatments mitigated the adverse effects of drought stress by increasing the biochemical potentials and paved the way to overcome drought stress in the corn plant.

  6. Estimation of the latent mediated effect with ordinal data using the limited-information and Bayesian full-information approaches.

    PubMed

    Chen, Jinsong; Zhang, Dake; Choi, Jaehwa

    2015-12-01

    It is common to encounter latent variables with ordinal data in social or behavioral research. Although a mediated effect of latent variables (latent mediated effect, or LME) with ordinal data may appear to be a straightforward combination of LME with continuous data and latent variables with ordinal data, the methodological challenges to combine the two are not trivial. This research covers model structures as complex as LME and formulates both point and interval estimates of LME for ordinal data using the Bayesian full-information approach. We also combine weighted least squares (WLS) estimation with the bias-corrected bootstrapping (BCB; Efron Journal of the American Statistical Association, 82, 171-185, 1987) method or the traditional delta method as the limited-information approach. We evaluated the viability of these different approaches across various conditions through simulation studies, and provide an empirical example to illustrate the approaches. We found that the Bayesian approach with reasonably informative priors is preferred when both point and interval estimates are of interest and the sample size is 200 or above.

  7. Maggot development during morgue storage and its effect on estimating the post-mortem interval.

    PubMed

    Huntington, Timothy E; Higley, Leon G; Baxendale, Frederick P

    2007-03-01

    When insect evidence is obtained during autopsy, forensic entomologists make decisions regarding the effects of low-temperature (-1 degrees C to 4 degrees C) storage of the body and associated insects when estimating the post-mortem interval (PMI). To determine the effects of storage in a morgue cooler on the temperature of maggot masses, temperatures inside and outside of body bags containing a human cadaver and porcine cadavers (seven replicates) were measured during storage. Temperatures remained significantly higher (p<0.05) inside of the body bags relative to the cooler, and remained at levels sufficient for maggot feeding and development. If the assumption that no insect development takes place during preautopsy refrigeration is made, potential error rates in PMI estimation of 8.6-12.8% occur. The potential for blow fly larvae to undergo significant development while being stored in the morgue is a possibility that forensic entomologists should consider during an investigation involving samples collected from autopsy. Case and experimental evidence also demonstrate that substantial tissue loss can occur from maggot feeding during morgue storage.

  8. The effect of sleep on item recognition and source memory recollection among shift-workers and permanent day-workers.

    PubMed

    Mawdsley, Matthew; Grasby, Katrina; Talk, Andrew

    2014-10-01

    We studied the effect of sleep versus wakefulness on item recognition and source memory recollection in a sample of shift-workers and permanent day-workers. Recognition of words that were previously viewed arrayed in quadrants of a page, and recollection of the original source location of the words on the page were assessed after a 12-h retention interval that was filled with wakefulness incorporating the subjects' work-shift, or an equal period that included sleep. Both shift-workers and permanent day-workers had poorer item recognition and source memory recollection when the retention interval was spent awake rather than including sleep. Shift-workers expressed larger deficits in performance than day-workers after wakefulness. This effect was not mediated by whether the shift-workers were on a day- or night-shift at the time of the study. These results indicate that sleep is an important contributor to successful item recognition and source recollection, and that mnemonic processing in shift-workers may be especially sensitive across their work-shift. © 2014 European Sleep Research Society.

  9. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  10. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  11. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  12. Synchronizing data from irregularly sampled sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uluyol, Onder

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
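A minimal sketch of this idea, assuming linear interpolation onto a common time grid finer than either sensor's native rate (the sensor timestamps and values below are hypothetical):

```python
import numpy as np

# Hypothetical irregular samples from two sensors (times in seconds).
t1 = np.array([0.0, 0.9, 2.1, 3.0, 4.2])
v1 = np.array([10.0, 11.0, 12.5, 13.0, 14.2])
t2 = np.array([0.0, 1.5, 2.5, 4.0])
v2 = np.array([5.0, 5.6, 6.1, 6.9])

# Re-sample both onto a common grid finer than either sensor's rate;
# afterwards the two streams are synchronized sample-for-sample.
t_common = np.arange(0.0, 4.0 + 1e-9, 0.25)
v1_sync = np.interp(t_common, t1, v1)
v2_sync = np.interp(t_common, t2, v2)
```

Linear interpolation is only one choice of re-sampler; any method that evaluates each stream on the shared, higher-rate grid achieves the synchronization the record describes.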

  13. Sampled-Data Consensus of Linear Multi-agent Systems With Packet Losses.

    PubMed

    Zhang, Wenbing; Tang, Yang; Huang, Tingwen; Kurths, Jurgen

    In this paper, the consensus problem is studied for a class of multi-agent systems with sampled data and packet losses, where random and deterministic packet losses are considered, respectively. For random packet losses, a Bernoulli-distributed white sequence is used to describe packet dropouts among agents in a stochastic way. For deterministic packet losses, a switched system with stable and unstable subsystems is employed to model packet dropouts in a deterministic way. The purpose of this paper is to derive consensus criteria, such that linear multi-agent systems with sampled data and packet losses can reach consensus. By means of the Lyapunov function approach and the decomposition method, the design problem of a distributed controller is solved in terms of convex optimization. The interplay among the allowable bound of the sampling interval, the probability of random packet losses, and the rate of deterministic packet losses is explicitly derived to characterize consensus conditions. The obtained criteria are closely related to the maximum eigenvalue of the Laplacian matrix versus the second minimum eigenvalue of the Laplacian matrix, which reveals the intrinsic effect of communication topologies on consensus performance. Finally, simulations are given to show the effectiveness of the proposed results.
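As a toy illustration of sampled-data consensus under random packet losses (not the paper's Lyapunov-based controller design), a three-agent network on a path graph can be simulated with a Bernoulli drop process; the Laplacian, gain, sampling interval, and loss probability below are arbitrary choices:

```python
import numpy as np

# Laplacian of a 3-node path graph (agents 1-2-3).
L = np.array([[1.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 1.0]])
h, eps = 0.1, 0.3               # sampling interval and control gain
p_loss = 0.2                    # Bernoulli packet-loss probability
x = np.array([1.0, 5.0, 9.0])   # initial agent states

rng = np.random.default_rng(2)
for _ in range(400):
    if rng.random() >= p_loss:      # update only when the packet arrives
        x = x - eps * h * (L @ x)   # sampled-data consensus protocol

spread = x.max() - x.min()          # disagreement among agents
```

With a symmetric Laplacian the state average is preserved at every update, so the agents converge to the initial mean; choosing eps*h too large relative to the largest Laplacian eigenvalue would destabilize the update, mirroring the sampling-interval bounds the paper derives.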

  14. A pharmacometric case study regarding the sensitivity of structural model parameter estimation to error in patient reported dosing times.

    PubMed

    Knights, Jonathan; Rohatagi, Shashank

    2015-12-01

    Although there is a body of literature focused on minimizing the effect of dosing inaccuracies on pharmacokinetic (PK) parameter estimation, most of the work centers on missing doses. No attempt has been made to specifically characterize the effect of error in reported dosing times. Additionally, existing work has largely dealt with cases in which the compound of interest is dosed at an interval no less than its terminal half-life. This work provides a case study investigating how error in patient reported dosing times might affect the accuracy of structural model parameter estimation under sparse sampling conditions when the dosing interval is less than the terminal half-life of the compound, and the underlying kinetics are monoexponential. Additional effects due to noncompliance with dosing events are not explored and it is assumed that the structural model and reasonable initial estimates of the model parameters are known. Under the conditions of our simulations, with structural model CV % ranging from ~20 to 60 %, parameter estimation inaccuracy derived from error in reported dosing times was largely controlled around 10 % on average. Given that no observed dosing was included in the design and sparse sampling was utilized, we believe these error results represent a practical ceiling given the variability and parameter estimates for the one-compartment model. The findings suggest additional investigations may be of interest and are noteworthy given the inability of current PK software platforms to accommodate error in dosing times.
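The scenario can be sketched with a monoexponential (one-compartment, bolus-input) model dosed at an interval shorter than its half-life; all parameter values and the dosing-time error magnitude below are hypothetical:

```python
import numpy as np

ke = 0.05               # elimination rate (1/h); half-life ~13.9 h
dose, V = 100.0, 50.0   # dose (mg) and volume of distribution (L)
tau = 8.0               # dosing interval (h), shorter than the half-life

def conc(t_obs, dose_times):
    """Superpose monoexponential decays from each prior dose."""
    c = np.zeros_like(t_obs, dtype=float)
    for td in dose_times:
        mask = t_obs >= td
        c[mask] += (dose / V) * np.exp(-ke * (t_obs[mask] - td))
    return c

rng = np.random.default_rng(1)
nominal = np.arange(0.0, 72.0, tau)                       # scheduled doses
actual = nominal + rng.normal(0.0, 0.5, nominal.size)     # ~±30 min errors
t_obs = np.array([70.0, 71.0, 72.0])                      # sparse sampling

c_true = conc(t_obs, actual)    # concentrations the patient produced
c_fit = conc(t_obs, nominal)    # what the reported nominal times predict
bias_pct = 100.0 * (c_fit - c_true) / c_true
```

Because each dosing-time error of dt hours rescales that dose's contribution by only exp(ke*dt), modest reporting errors translate into bounded concentration bias, consistent with the limited parameter-estimation inaccuracy the study reports.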

  15. Effort-reward imbalance at school and depressive symptoms in Chinese adolescents: the role of family socioeconomic status.

    PubMed

    Guo, Hongxiang; Yang, Wenjie; Cao, Ying; Li, Jian; Siegrist, Johannes

    2014-06-10

    Depression is a major mental health problem during adolescence. This study, using a sample of Chinese adolescents, examined the separate and combined effects of perceived school-related stress and of family socioeconomic status (SES) on the prevalence of depressive symptoms. A total of 1774 Chinese students from Grades 7-12 were recruited into our questionnaire survey. School-related stress was measured by the Effort-Reward Imbalance Questionnaire-School Version, family SES was assessed by a standardized question, and depressive symptoms were evaluated by the Center for Epidemiological Studies Depression Scale for Children. Multivariate logistic regression was applied, adjusting for age, gender, grade, smoking, alcohol drinking and physical activity. It was found that high school-related stress and low family SES were associated with elevated odds of depressive symptoms, respectively. The effect of school-related stress was particularly strong in low SES group. In adolescents with both high stress at school and low SES, the odds ratio was 9.18 (95% confidence interval = 6.53-12.89) compared to the reference group (low stress at school and high SES). A significant synergistic interaction effect was observed (synergy index = 2.28, 95% confidence interval = 1.56-3.32). The findings indicated that perceived school-related stress, in terms of effort-reward imbalance, was related to depressive symptoms in this sample of Chinese adolescents. The strong interaction with family SES suggests that health promoting efforts in school settings should be targeted specifically at these socially deprived groups.
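The synergy index quoted above follows Rothman's standard formula on the odds-ratio scale. A minimal sketch, using the joint odds ratio from the abstract (9.18) with hypothetical single-exposure odds ratios:

```python
def synergy_index(or_both, or_a_only, or_b_only):
    """Rothman's synergy index: > 1 indicates super-additive interaction."""
    return (or_both - 1.0) / ((or_a_only - 1.0) + (or_b_only - 1.0))

# Joint OR from the abstract; the single-exposure ORs are hypothetical.
si = synergy_index(9.18, 2.5, 2.5)
```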

  16. DISCRETE-LEVEL GROUND-WATER MONITORING SYSTEM FOR CONTAINMENT AND REMEDIAL PERFORMANCE ASSESSMENT OBJECTIVES

    EPA Science Inventory

    A passive discrete-level multilayer ground-water sampler was evaluated to determine its capability to obtain representative discrete-interval samples within the screen intervals of traditional monitoring wells without purging. Results indicate that the device is able to provide ...

  17. Revealing stellar brightness profiles by means of microlensing fold caustics

    NASA Astrophysics Data System (ADS)

    Dominik, M.

    2004-09-01

    With a handful of measurements of limb-darkening coefficients, galactic microlensing has already proven to be a powerful technique for studying atmospheres of distant stars. Survey campaigns such as OGLE-III are capable of providing ~10 suitable target stars per year that undergo microlensing events involving passages over the caustic created by a binary lens, which last from a few hours to a few days and allow us to resolve the stellar atmosphere by frequent broad-band photometry. For a caustic exit lasting 12 h and a photometric precision of 1.5 per cent, a moderate sampling interval of 30 min (corresponding to ~25-30 data points) is sufficient for providing a reliable measurement of the linear limb-darkening coefficient Γ with an uncertainty of ~8 per cent, which reduces to ~3 per cent for a reduced sampling interval of 6 min for the surroundings of the end of the caustic exit. While some additional points over the remaining parts of the light curve are highly valuable, a denser sampling in these regions provides little improvement. Unless an accuracy of less than 5 per cent is desired, limb-darkening coefficients for several filters can be obtained or observing time can be spent on other targets during the same night. The adoption of an inappropriate stellar brightness profile as well as the effect of acceleration between source and caustic yield distinguishable characteristic systematics in the model residuals. Acceleration effects are unlikely to affect the light curve significantly for most events, although a free acceleration parameter blurs the limb-darkening measurement if the passage duration cannot be accurately determined.

  18. Hybrid Model Predictive Control for Sequential Decision Policies in Adaptive Behavioral Interventions.

    PubMed

    Dong, Yuwen; Deshpande, Sunil; Rivera, Daniel E; Downs, Danielle S; Savage, Jennifer S

    2014-06-01

    Control engineering offers a systematic and efficient method to optimize the effectiveness of individually tailored treatment and prevention policies known as adaptive or "just-in-time" behavioral interventions. The nature of these interventions requires assigning dosages at categorical levels, which has been addressed in prior work using Mixed Logical Dynamical (MLD)-based hybrid model predictive control (HMPC) schemes. However, certain requirements of adaptive behavioral interventions that involve sequential decision making have not been comprehensively explored in the literature. This paper presents an extension of the traditional MLD framework for HMPC by representing the requirements of sequential decision policies as mixed-integer linear constraints. This is accomplished with user-specified dosage sequence tables, manipulation of one input at a time, and a switching time strategy for assigning dosages at time intervals less frequent than the measurement sampling interval. A model developed for a gestational weight gain (GWG) intervention is used to illustrate the generation of these sequential decision policies and their effectiveness for implementing adaptive behavioral interventions involving multiple components.

  19. Effects of time and sampling location on concentrations of β-hydroxybutyric acid in dairy cows.

    PubMed

    Mahrt, A; Burfeind, O; Heuwieser, W

    2014-01-01

    Two trials were conducted to examine factors potentially influencing the measurement of blood β-hydroxybutyric acid (BHBA) in dairy cows. The objective of the first trial was to study effects of sampling time on BHBA concentration in continuously fed dairy cows. Furthermore, we determined test characteristics of a single BHBA measurement at a random time of the day to diagnose subclinical ketosis considering commonly used cut-points (1.2 and 1.4 mmol/L). Finally, we set out to evaluate if test characteristics could be enhanced by repeating measurements after different time intervals. During 4 herd visits, a total of 128 cows (8 to 28 d in milk) fed 10 times daily were screened at 0900 h and preselected by BHBA concentration. Blood samples were drawn from the tail vessels and BHBA concentrations were measured using an electronic BHBA meter (Precision Xceed, Abbott Diabetes Care Ltd., Witney, UK). Cows with BHBA concentrations ≥0.8 mmol/L at this time were enrolled in the trial (n=92). Subsequent BHBA measurements took place every 3 h for a total of 8 measurements during 24 h. The effect of sampling time on BHBA concentrations was tested in a repeated-measures ANOVA with sampling time as the repeated measure. Sampling time did not affect BHBA concentrations in continuously fed dairy cows. Defining the average daily BHBA concentration calculated from the 8 measurements as the gold standard, a single measurement at a random time of the day to diagnose subclinical ketosis had a sensitivity of 0.90 or 0.89 at the 2 BHBA cut-points (1.2 and 1.4 mmol/L). Specificity was 0.88 or 0.90 using the same cut-points. Repeating measurements after different time intervals improved test characteristics only slightly. In the second experiment, we compared BHBA concentrations of samples drawn from 3 different blood sampling locations (tail vessels, jugular vein, and mammary vein) of 116 lactating dairy cows. Concentrations of BHBA differed in samples from the 3 sampling locations. 
Mean BHBA concentration was 0.3 mmol/L lower when measured in the mammary vein compared with the jugular vein and 0.4 mmol/L lower in the mammary vein compared with the tail vessels. We conclude that to measure BHBA, blood samples of continuously fed dairy cows can be drawn at any time of the day. A single measurement provides very good test characteristics for on-farm conditions. Blood samples for BHBA measurement should be drawn from the jugular vein or tail vessels; the mammary vein should not be used for this purpose. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
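The test-characteristic calculation against the daily-mean gold standard reduces to a 2x2 comparison at the chosen cut-point; the paired BHBA values below are hypothetical:

```python
import numpy as np

# Hypothetical paired values (mmol/L): daily mean (gold standard) vs. a
# single random-time measurement per cow.
gold   = np.array([0.9, 1.3, 1.6, 1.1, 1.5, 0.8, 1.25, 1.45])
single = np.array([0.95, 1.15, 1.7, 1.25, 1.35, 0.85, 1.3, 1.5])

def sens_spec(test, truth, cut=1.2):
    """Sensitivity and specificity of `test` against `truth` at `cut`."""
    test_pos, true_pos = test >= cut, truth >= cut
    sens = (test_pos & true_pos).sum() / true_pos.sum()
    spec = (~test_pos & ~true_pos).sum() / (~true_pos).sum()
    return sens, spec

sens, spec = sens_spec(single, gold)
```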

  20. Changes in crash risk following re-timing of traffic signal change intervals.

    PubMed

    Retting, Richard A; Chapline, Janella F; Williams, Allan F

    2002-03-01

    More than 1 million motor vehicle crashes occur annually at signalized intersections in the USA. The principal method used to prevent crashes associated with routine changes in signal indications is employment of a traffic signal change interval--a brief yellow and all-red period that follows the green indication. No universal practice exists for selecting the duration of change intervals, and little is known about the influence of the duration of the change interval on crash risk. The purpose of this study was to estimate potential crash effects of modifying the duration of traffic signal change intervals to conform with values associated with a proposed recommended practice published by the Institute of Transportation Engineers. A sample of 122 intersections was identified and randomly assigned to experimental and control groups. Of 51 eligible experimental sites, 40 (78%) needed signal timing changes. For the 3-year period following implementation of signal timing changes, there was an 8% reduction in reportable crashes at experimental sites relative to those occurring at control sites (P = 0.08). For injury crashes, a 12% reduction at experimental sites relative to those occurring at control sites was found (P = 0.03). Pedestrian and bicycle crashes at experimental sites decreased 37% (P = 0.03) relative to controls. Given these results and the relatively low cost of re-timing traffic signals, modifying the duration of traffic signal change intervals to conform with values associated with the Institute of Transportation Engineers' proposed recommended practice should be strongly considered by transportation agencies to reduce the frequency of urban motor vehicle crashes.

  1. Enhanced DNA Profiling of the Semen Donor in Late Reported Sexual Assaults: Use of Y-Chromosome-Targeted Pre-amplification and Next Generation Y-STR Amplification Systems.

    PubMed

    Hanson, Erin K; Ballantyne, Jack

    2016-01-01

    In some cases of sexual assault the victim may not report the assault for several days after the incident due to various factors. The ability to obtain an autosomal STR profile of the semen donor from a living victim rapidly diminishes as the post-coital interval is extended due to the presence of only a small amount of male DNA amidst an overwhelming amount of female DNA. Previously, we have utilized various technological tools to overcome the limitations of male DNA profiling in extended interval post-coital samples, including Y-chromosome STR profiling, cervical sampling, and post-PCR purification, permitting the recovery of Y-STR profiles of the male DNA from samples collected 5-6 days after intercourse. Despite this success, the reproductive biology literature reports the presence of spermatozoa in the human cervix up to 7-10 days post-coitus. Therefore, novel and improved methods for recovery of male profiles in extended interval post-coital samples were required. Here, we describe enhanced strategies, including Y-chromosome-targeted pre-amplification and next generation Y-STR amplification kits, that have resulted in the ability to obtain probative male profiles from samples collected 6-9 days after intercourse.

  2. Sampling factors influencing accuracy of sperm kinematic analysis.

    PubMed

    Owen, D H; Katz, D F

    1993-01-01

    Sampling conditions that influence the accuracy of experimental measurement of sperm head kinematics were studied by computer simulation methods. Several archetypal sperm trajectories were studied. First, mathematical models of typical flagellar beats were input to hydrodynamic equations of sperm motion. The instantaneous swimming velocities of such sperm were computed over sequences of flagellar beat cycles, from which the resulting trajectories were determined. In a second, idealized approach, direct mathematical models of trajectories were utilized, based upon similarities to the previous hydrodynamic constructs. In general, it was found that analyses of sampling factors produced similar results for the hydrodynamic and idealized trajectories. A number of experimental sampling factors were studied, including the number of sperm head positions measured per flagellar beat, and the time interval over which these measurements are taken. It was found that when one flagellar beat is sampled, values of amplitude of lateral head displacement (ALH) and linearity (LIN) approached their actual values when five or more sample points per beat were taken. Mean angular displacement (MAD) values, however, remained sensitive to sampling rate even when large sampling rates were used. Values of MAD were also much more sensitive to the initial starting point of the sampling procedure than were ALH or LIN. On the basis of these analyses of measurement accuracy for individual sperm, simulations were then performed of cumulative effects when studying entire populations of motile cells. It was found that substantial (double digit) errors occurred in the mean values of curvilinear velocity (VCL), LIN, and MAD under the conditions of 30 video frames per second and 0.5 seconds of analysis time. Increasing the analysis interval to 1 second did not appreciably improve the results. However, increasing the analysis rate to 60 frames per second significantly reduced the errors. 
These findings thus suggest that computer-aided sperm analysis (CASA) application at 60 frames per second will significantly improve the accuracy of kinematic analysis in most applications to human and other mammalian sperm.
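The undersampling effect on amplitude of lateral head displacement (ALH) can be reproduced with an idealized sinusoidal trajectory. The beat frequency and true amplitude below are hypothetical, and peak-to-peak displacement over one beat is used as a simple ALH estimator:

```python
import numpy as np

f_beat = 10.0    # flagellar beats per second (hypothetical)
alh_true = 4.0   # true peak-to-peak lateral displacement, micrometres

def measured_alh(samples_per_beat):
    """Peak-to-peak lateral displacement estimated over one beat."""
    t = np.arange(0.0, 1.0 / f_beat, 1.0 / (f_beat * samples_per_beat))
    y = (alh_true / 2.0) * np.sin(2.0 * np.pi * f_beat * t)
    return y.max() - y.min()

coarse = measured_alh(3)   # sparse sampling underestimates ALH
fine = measured_alh(20)    # dense sampling recovers the true value
```

The sparse estimate misses the waveform's extrema and so biases ALH low, matching the finding that roughly five or more samples per beat are needed for accurate kinematic values.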

  3. The P Value Problem in Otolaryngology: Shifting to Effect Sizes and Confidence Intervals.

    PubMed

    Vila, Peter M; Townsend, Melanie Elizabeth; Bhatt, Neel K; Kao, W Katherine; Sinha, Parul; Neely, J Gail

    2017-06-01

    There is a lack of reporting of effect sizes and confidence intervals in the current biomedical literature. The objective of this article is to present a discussion of the recent paradigm shift encouraging the reporting of effect sizes and confidence intervals. Although P values help to inform us about whether an observed effect could have arisen by chance, effect sizes inform us about the magnitude of the effect (clinical significance), and confidence intervals inform us about the range of plausible estimates for the general population mean (precision). Reporting effect sizes and confidence intervals is a necessary addition to the biomedical literature, and these concepts are reviewed in this article.
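As a concrete illustration of the shift the authors advocate, a standardized effect size (Cohen's d) with an approximate large-sample confidence interval can be computed directly; the two groups of scores below are hypothetical:

```python
import math

# Hypothetical outcome scores for two groups.
a = [12.1, 14.3, 13.5, 15.0, 13.2, 14.8]
b = [10.2, 11.5, 12.0, 10.8, 11.1, 12.3]

def cohens_d_ci(x, y, z=1.96):
    """Cohen's d with an approximate 95% CI (large-sample SE of d)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    d = (mx - my) / sp                     # standardized mean difference
    se = math.sqrt((nx + ny) / (nx * ny) + d * d / (2 * (nx + ny)))
    return d, d - z * se, d + z * se

d, lo, hi = cohens_d_ci(a, b)
```

Unlike a bare P value, the pair (d, CI) conveys both the magnitude of the effect and the precision of the estimate.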

  4. Technical note: A device for obtaining time-integrated samples of ruminal fluid

    USGS Publications Warehouse

    Corley, R. N.; Murphy, M.R.; Lucena, J.; Panno, S.V.

    1999-01-01

    A device was adapted to allow for time-integrated sampling of fluid from the rumen via a cannula. The sampler consisted of a cup-shaped ceramic filter positioned in the ventral rumen of a cannulated cow and attached to a tube through which fluid entering the filter was removed continuously using a peristaltic pump. Rate of ruminal fluid removal using the device was monitored over two 36-h periods (at 6-h intervals) and was not affected (P > .05) by time, indicating that the system was not susceptible to clogging during this period. Two cows having ad libitum access to a total mixed ration were used in a split-block design to evaluate the utility of the system for obtaining time-integrated samples of ruminal fluid. Ruminal fluid VFA concentration and pattern in samples collected in two replicated 8-h periods by the time-integrated sampler (at 1-h intervals) were compared with composite samples collected using a conventional suction-strainer device (at 30-min intervals). Each 8-h collection period started 2 h before or 6 h after feeding. Results indicated that total VFA concentration was not affected (P > .05) by the sampling method. Volatile fatty acid patterns were likewise unaffected (P > .05) except that acetate was 2.5% higher (P < .05) in samples collected 2 h before feeding and valerate was 5% higher (P < .05) in samples collected 6 h after feeding by the suction-strainer device. Although significant, these differences were not considered physiologically important. We concluded that use of the ceramic filter improved the sampling of ruminal fluid by simplifying the technique and allowing time-integrated samples to be obtained.

  5. Circulating intact and cleaved forms of the urokinase-type plasminogen activator receptor: biological variation, reference intervals and clinical useful cut-points.

    PubMed

    Thurison, Tine; Christensen, Ib J; Lund, Ida K; Nielsen, Hans J; Høyer-Hansen, Gunilla

    2015-01-15

    High levels of circulating forms of the urokinase-type plasminogen activator receptor (uPAR) are significantly associated to poor prognosis in cancer patients. Our aim was to determine biological variations and reference intervals of the uPAR forms in blood, and in addition, to test the clinical relevance of using these as cut-points in colorectal cancer (CRC) prognosis. uPAR forms were measured in citrated and EDTA plasma samples using time-resolved fluorescence immunoassays. Diurnal, intra- and inter-individual variations were assessed in plasma samples from cohorts of healthy individuals. Reference intervals were determined in plasma from healthy individuals randomly selected from a Danish multi-center cross-sectional study. A cohort of CRC patients was selected from the same cross-sectional study. The reference intervals showed a slight increase with age and women had ~20% higher levels. The intra- and inter-individual variations were ~10% and ~20-30%, respectively and the measured levels of the uPAR forms were within the determined 95% reference intervals. No diurnal variation was found. Applying the normal upper limit of the reference intervals as cut-point for dichotomizing CRC patients revealed significantly decreased overall survival of patients with levels above this cut-point of any uPAR form. The reference intervals for the different uPAR forms are valid and the upper normal limits are clinically relevant cut-points for CRC prognosis. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Lack of effect of oral cabotegravir on the pharmacokinetics of a levonorgestrel/ethinyl oestradiol-containing oral contraceptive in healthy adult women.

    PubMed

    Trezza, Christine; Ford, Susan L; Gould, Elizabeth; Lou, Yu; Huang, Chuyun; Ritter, James M; Buchanan, Ann M; Spreen, William; Patel, Parul

    2017-07-01

    This study aimed to investigate whether cabotegravir (CAB), an integrase inhibitor in development for treatment and prevention of human immunodeficiency virus-1, influences the pharmacokinetics (PK) of a levonorgestrel (LNG) and ethinyl oestradiol (EO)-containing oral contraceptive (OC) in healthy women. In this open-label, fixed-sequence crossover study, healthy female subjects received LNG 0.15 mg/EO 0.03 mg tablet once daily Days 1-10 alone and with oral CAB 30 mg once daily Days 11-21. At the end of each treatment period, subjects underwent predose sampling for concentrations of follicle-stimulating hormone, luteinizing hormone, and progesterone and serial PK sampling for plasma LNG, EO, and CAB concentrations. Twenty women were enrolled, and 19 completed the study. One subject was withdrawn due to an adverse event unrelated to study medications. Geometric least squares mean ratios (90% confidence interval) of LNG + CAB vs. LNG alone for LNG area under the plasma concentration-time curve over the dosing interval of duration τ and maximum observed plasma concentration were 1.12 (1.07-1.18) and 1.05 (0.96-1.15), respectively. Geometric least squares mean ratio (90% confidence interval) of EO + CAB vs. EO alone for EO area under the plasma concentration-time curve over the dosing interval of duration τ and maximum observed plasma concentration were 1.02 (0.97-1.08) and 0.92 (0.83-1.03), respectively. Steady-state CAB PK parameters were comparable to historical values. There was no apparent difference in mean luteinizing hormone, follicle-stimulating hormone, and progesterone concentrations between periods. No clinically significant trends in laboratory values, vital signs, or electrocardiography values were observed. Repeat doses of oral CAB had no significant effect on LNG/EO PK or pharmacodynamics, which supports CAB coadministration with LNG/EO OCs in clinical practice. © 2017 ViiV Healthcare. 
British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.

  7. 40 CFR 1065.1107 - Sample media and sample system preparation; sample system assembly.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) For capturing PM, we recommend using pure quartz filters with no binder. Select the filter diameter to minimize filter change intervals, accounting for the expected PM emission rate, sample flow rate, and... filter without replacing the sorbent or otherwise disassembling the batch sampler. In those cases...

  8. Analysis of serum and cerebrospinal fluid in clinically normal adult miniature donkeys.

    PubMed

    Mozaffari, A A; Samadieh, H

    2013-09-01

    To establish reference intervals for serum and cerebrospinal fluid (CSF) parameters in clinically healthy adult miniature donkeys. Experiments were conducted on 10 female and 10 male clinically normal adult miniature donkeys, randomly selected from five herds. Lumbosacral CSF collection was performed with the sedated donkey in the standing position. Cell analysis was performed immediately after the samples were collected. Blood samples were obtained from the jugular vein immediately after CSF sample collection. Sodium, potassium, glucose, urea nitrogen, total protein, calcium, chloride, phosphorous and magnesium concentrations were measured in CSF and serum samples. A paired t-test was used to compare mean values between female and male donkeys. The CSF was uniformly clear, colourless and free from flocculent material, with a specific gravity of 1.002. The range of total nucleated cell counts was 2-4 cells/μL. The differential white cell count comprised only small lymphocytes. No erythrocytes or polymorphonuclear cells were observed on cytological examination. Reference values were obtained for biochemical analysis of serum and CSF. Gender had no effect on any variables measured in serum or CSF (p>0.05). CSF analysis can provide important information in addition to that gained by clinical examination. CSF analysis has not previously been performed in miniature donkeys; this is the first report on the subject. In the present study, reference intervals for total nucleated cell count, total protein, glucose, urea nitrogen, sodium, potassium, chloride, calcium, phosphorous and magnesium concentrations of serum and CSF were determined for male and female miniature donkeys.

  9. A 3.9 ps Time-Interval RMS Precision Time-to-Digital Converter Using a Dual-Sampling Method in an UltraScale FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2016-10-01

    Field programmable gate arrays (FPGAs) manufactured with more advanced processing technology have faster carry chains and smaller delay elements, which are favorable for the design of tapped delay line (TDL)-style time-to-digital converters (TDCs) in FPGA. However, new challenges are posed in using them to implement TDCs with a high time precision. In this paper, we propose a bin realignment method and a dual-sampling method for TDC implementation in a Xilinx UltraScale FPGA. The former realigns the disordered time delay taps so that the TDC precision can approach the limit of its delay granularity, while the latter doubles the number of taps in the delay line so that the TDC precision beyond the cell delay limitation can be expected. Two TDC channels were implemented in a Kintex UltraScale FPGA, and the effectiveness of the new methods was evaluated. For fixed time intervals in the range from 0 to 440 ns, the average RMS precision measured by the two TDC channels reaches 5.8 ps using the bin realignment, and it further improves to 3.9 ps by using the dual-sampling method. The time precision has a 5.6% variation in the measured temperature range. Every part of the TDC, including dual-sampling, encoding, and on-line calibration, could run at a 500 MHz clock frequency. The system measurement dead time is only 4 ns.
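
    The on-line calibration mentioned above is typically done with the code-density (statistical) method, in which uniformly distributed input hits are histogrammed per delay tap and each tap's hit share gives its width. The abstract does not detail the authors' implementation; the sketch below is an illustrative reconstruction under that assumption, with `calibrate_bins` and all numbers invented for the example.

```python
# Code-density calibration sketch for a tapped-delay-line TDC (illustrative,
# not the authors' implementation): with uniformly distributed hits, each
# bin's hit count is proportional to its physical width.
import numpy as np

def calibrate_bins(hit_counts, clock_period_ps):
    """Convert per-bin hit counts (from uniformly distributed input hits)
    into bin widths and bin-center timestamps in picoseconds."""
    counts = np.asarray(hit_counts, dtype=float)
    widths = counts / counts.sum() * clock_period_ps  # width proportional to hit share
    edges = np.concatenate(([0.0], np.cumsum(widths)))
    centers = (edges[:-1] + edges[1:]) / 2.0          # mid-bin timestamps
    return widths, centers

# Toy delay line of 4 taps spanning a 2 ns (2000 ps) clock period:
widths, centers = calibrate_bins([100, 300, 400, 200], clock_period_ps=2000.0)
# widths  ~ [200, 600, 800, 400] ps; centers ~ [100, 500, 1200, 1800] ps
```

In a real FPGA TDC the histogram is accumulated continuously in hardware, so the bin-center lookup table tracks temperature-induced delay drift, which is why the abstract can quote only a 5.6% precision variation over temperature.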

  10. Intergenerational Transmission of Childhood Conduct Problems

    PubMed Central

    D’Onofrio, Brian M.; Slutske, Wendy S.; Turkheimer, Eric; Emery, Robert E.; Paige Harden, K.; Heath, Andrew C.; Madden, Pamela A. F.; Martin, Nicholas G.

    2010-01-01

    Context The familial nature of childhood conduct problems has been well documented, but few genetically informed studies have explicitly explored the processes through which parental conduct problems influence an offspring’s behavior problems. Objective To delineate the genetic and environmental processes underlying the intergenerational transmission of childhood conduct problems. Design We used hierarchical linear models to analyze data from a Children of Twins Study, a quasiexperimental design, to explore the extent to which genetic factors common to both generations, unmeasured environmental factors that are shared by twins, or measured characteristics of both parents confound the intergenerational association. Setting Participants were recruited from the community and completed a semistructured diagnostic telephone interview. Participants The research used a high-risk sample of twins, their spouses, and their young adult offspring (n=2554) from 889 twin families in the Australian Twin Registry, but the analyses used sample weights to produce parameter estimates for the community-based volunteer sample of twins. Main Outcome Measure Number of conduct disorder symptoms. Results The magnitude of the intergenerational transmission was significant for all offspring, though it was stronger for males (effect size [Cohen d]=0.21; 95% confidence interval, 0.15–0.17) than females (d=0.09; 95% confidence interval, 0.05–0.14). The use of the Children of Twins design and measured covariates indicated that the intergenerational transmission of conduct problems for male offspring was largely mediated by environmental variables specifically related to parental conduct disorder (d=0.13; 95% confidence interval, 0.02–0.23). 
In contrast, the intergenerational transmission of conduct problems was not because of environmentally mediated causal processes for female offspring (d=−0.09; 95% confidence interval, −0.20 to 0.03); a common genetic liability accounted for the intergenerational relations. Conclusions The mechanisms underlying the inter-generational transmission of conduct problems depend on the sex of the offspring. The results are consistent with an environmentally mediated causal role of parental conduct problems on behavior problems in males. Common genetic risk, however, confounds the entire inter-generational transmission in female offspring. PMID:17606816

  11. A spreadsheet template compatible with Microsoft Excel and iWork Numbers that returns the simultaneous confidence intervals for all pairwise differences between multiple sample means.

    PubMed

    Brown, Angus M

    2010-04-01

    The objective of the method described in this paper is to develop a spreadsheet template for comparing multiple sample means. An initial analysis of variance (ANOVA) test on the data returns the test statistic F. If F is larger than the critical F value drawn from the F distribution at the appropriate degrees of freedom, convention dictates rejection of the null hypothesis and permits subsequent multiple comparison testing to determine where the inequalities between the sample means lie. A variety of multiple comparison methods are described that return the 95% confidence intervals for differences between means using an inclusive pairwise comparison of the sample means. 2009 Elsevier Ireland Ltd. All rights reserved.
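
    The paper implements this workflow in a spreadsheet; as an illustration only, the sketch below shows one of the simpler multiple-comparison procedures of this kind, Bonferroni-adjusted simultaneous confidence intervals built on the pooled ANOVA mean square error. The function name and data are invented, and this is not necessarily the specific method the template uses.

```python
# Bonferroni-adjusted simultaneous 95% CIs for all pairwise mean differences,
# using the pooled within-group variance (the one-way ANOVA mean square error).
# Illustrative sketch; not the paper's spreadsheet implementation.
from itertools import combinations
import numpy as np
from scipy import stats

def pairwise_cis(groups, alpha=0.05):
    k = len(groups)
    n = [len(g) for g in groups]
    means = [np.mean(g) for g in groups]
    df = sum(n) - k
    # Pooled within-group variance (MSE from one-way ANOVA)
    mse = sum((len(g) - 1) * np.var(g, ddof=1) for g in groups) / df
    m = k * (k - 1) // 2                          # number of pairwise comparisons
    tcrit = stats.t.ppf(1 - alpha / (2 * m), df)  # Bonferroni-adjusted critical t
    cis = {}
    for i, j in combinations(range(k), 2):
        diff = means[i] - means[j]
        half = tcrit * np.sqrt(mse * (1 / n[i] + 1 / n[j]))
        cis[(i, j)] = (diff - half, diff + half)
    return cis

cis = pairwise_cis([[5.1, 4.9, 5.3], [5.0, 5.2, 5.1], [7.9, 8.1, 8.0]])
# Intervals for pairs involving group 2 exclude zero; pair (0, 1) does not.
```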

  12. Is the placebo powerless? An analysis of clinical trials comparing placebo with no treatment.

    PubMed

    Hróbjartsson, A; Gøtzsche, P C

    2001-05-24

    Placebo treatments have been reported to help patients with many diseases, but the quality of the evidence supporting this finding has not been rigorously evaluated. We conducted a systematic review of clinical trials in which patients were randomly assigned to either placebo or no treatment. A placebo could be pharmacologic (e.g., a tablet), physical (e.g., a manipulation), or psychological (e.g., a conversation). We identified 130 trials that met our inclusion criteria. After the exclusion of 16 trials without relevant data on outcomes, there were 32 with binary outcomes (involving 3795 patients, with a median of 51 patients per trial) and 82 with continuous outcomes (involving 4730 patients, with a median of 27 patients per trial). As compared with no treatment, placebo had no significant effect on binary outcomes (pooled relative risk of an unwanted outcome with placebo, 0.95; 95 percent confidence interval, 0.88 to 1.02), regardless of whether these outcomes were subjective or objective. For the trials with continuous outcomes, placebo had a beneficial effect (pooled standardized mean difference in the value for an unwanted outcome between the placebo and untreated groups, -0.28; 95 percent confidence interval, -0.38 to -0.19), but the effect decreased with increasing sample size, indicating a possible bias related to the effects of small trials. The pooled standardized mean difference was significant for the trials with subjective outcomes (-0.36; 95 percent confidence interval, -0.47 to -0.25) but not for those with objective outcomes. In 27 trials involving the treatment of pain, placebo had a beneficial effect (-0.27; 95 percent confidence interval, -0.40 to -0.15). This corresponded to a reduction in the intensity of pain of 6.5 mm on a 100-mm visual-analogue scale. We found little evidence in general that placebos had powerful clinical effects. 
Although placebos had no significant effects on objective or binary outcomes, they had possible small benefits in studies with continuous subjective outcomes and for the treatment of pain. Outside the setting of clinical trials, there is no justification for the use of placebos.
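
    Pooled estimates like the -0.28 standardized mean difference above come from inverse-variance weighting of the per-trial effects. A minimal fixed-effect sketch follows; the review's actual model may differ, and the effect sizes and standard errors below are invented for illustration.

```python
# Inverse-variance (fixed-effect) pooling of standardized mean differences.
# Toy inputs; illustrative of the pooling arithmetic, not the review's data.
import math

def pool_fixed(effects, ses):
    weights = [1 / se**2 for se in ses]          # weight = 1 / variance
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

smd, ci = pool_fixed(effects=[-0.4, -0.2, -0.3], ses=[0.15, 0.10, 0.20])
# smd lies between the most and least extreme inputs, pulled toward the
# most precise (smallest-SE) trial.
```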

  13. Retest effects in working memory capacity tests: A meta-analysis.

    PubMed

    Scharfen, Jana; Jansen, Katrin; Holling, Heinz

    2018-06-15

    The repeated administration of working memory capacity tests is common in clinical and research settings. For cognitive ability tests and different neuropsychological tests, meta-analyses have shown that they are prone to retest effects, which have to be accounted for when interpreting retest scores. Using a multilevel approach, this meta-analysis aims at showing the reproducibility of retest effects in working memory capacity tests for up to seven test administrations, and examines the impact of the length of the test-retest interval, test modality, equivalence of test forms and participant age on the size of retest effects. Furthermore, it is assessed whether the size of retest effects depends on the test paradigm. An extensive literature search revealed 234 effect sizes from 95 samples and 68 studies, in which healthy participants between 12 and 70 years repeatedly performed a working memory capacity test. Results yield a weighted average of g = 0.28 for retest effects from the first to the second test administration, and a significant increase in effect sizes was observed up to the fourth test administration. The length of the test-retest interval and publication year were found to moderate the size of retest effects. Retest effects differed between the paradigms of working memory capacity tests. These findings call for the development and use of appropriate experimental or statistical methods to address retest effects in working memory capacity tests.

  14. Exposure to Radiofrequency Radiation Emitted from Common Mobile Phone Jammers Alters the Pattern of Muscle Contractions: an Animal Model Study.

    PubMed

    Rafati, A; Rahimi, S; Talebi, A; Soleimani, A; Haghani, M; Mortazavi, S M J

    2015-09-01

    The rapid growth of wireless communication technologies has caused public concern regarding the biological effects of electromagnetic radiation on human health. Some early reports indicated a wide variety of non-thermal effects of electromagnetic radiation on amphibians, such as alterations in the pattern of muscle contractions. This study aimed to investigate the effects of exposure to radiofrequency (RF) radiation emitted from mobile phone jammers on the pulse height of contractions, the time interval between two subsequent contractions, and the latency period of the frog's isolated gastrocnemius muscle after stimulation with single square pulses of 1 V (1 Hz). Frogs were kept in plastic containers in a room. Animals in the jammer group were exposed to RF radiation emitted from a common jammer at a distance of 1 m from the jammer's antenna for 2 hours, while the control frogs were only sham exposed. The animals were then sacrificed, and isolated gastrocnemius muscles were exposed to on/off jammer radiation for three subsequent 10-minute intervals. Isolated gastrocnemius muscles were attached to the force transducer with a string. Using a PowerLab device (26-T), the pattern of muscular contractions was monitored after applying single square pulses of 1 V (1 Hz) as stimuli. The findings of this study showed that the pulse height of muscle contractions was not affected by exposure to electromagnetic fields. The latency period, however, was altered in RF-exposed samples. None of the experiments showed an alteration in the time interval between two subsequent contractions after exposure to electromagnetic fields. These findings support early reports of a wide variety of non-thermal effects of electromagnetic radiation on amphibians, including effects on the pattern of muscle contractions.

  15. The extent to which tobacco marketing and tobacco use in films contribute to children's use of tobacco: a meta-analysis.

    PubMed

    Wellman, Robert J; Sugarman, David B; DiFranza, Joseph R; Winickoff, Jonathan P

    2006-12-01

    To quantify the effect of exposure to pro-tobacco marketing and media on initiation of tobacco use among adolescents. A systematic literature search of MEDLINE, PsychINFO, ABI/INFORM, and Business Source Premier through October/November 2005 was conducted. Unpublished studies were solicited from researchers. Of 401 citations initially identified, 51 (n = 141 949 participants) met the inclusion criteria: reporting on exposure and tobacco use outcomes and participants younger than 18 years. The included studies reported 146 effects, of which 89 were conceptually independent. Data were extracted independently by 3 of us using a standardized tool. Weighted averages were calculated using a linear mixed-effects model. Heterogeneity and publication bias were assessed. Exposures (tobacco advertising, promotions, and samples, and pro-tobacco depictions in films, television, and videos) were categorized as low or high engagement based on the degree of psychological involvement required. Outcomes were categorized as cognitive (attitudes or intentions) or behavioral (initiation, tobacco use status, or progression of use). Exposure to pro-tobacco marketing and media increases the odds of youth holding positive attitudes toward tobacco use (odds ratio, 1.51; 95% confidence interval, 1.08-2.13) and more than doubles the odds of initiating tobacco use (odds ratio, 2.23; 95% confidence interval, 1.79-2.77). Highly engaging marketing and media are more effective at promoting use (odds ratio, 2.67; 95% confidence interval, 2.19-3.25). These effects are observed across time, in different countries, and with different study designs and measures of exposure and outcome. Pro-tobacco marketing and media stimulate tobacco use among youth. A ban on all tobacco promotions is warranted to protect children.

  16. Biodegradation and attenuation of steroidal hormones and alkylphenols by stream biofilms and sediments

    USGS Publications Warehouse

    Writer, Jeffrey; Barber, Larry B.; Ryan, Joseph N.; Bradley, Paul M.

    2011-01-01

    Biodegradation of select endocrine-disrupting compounds (17β-estradiol, estrone, 17α-ethynylestradiol, 4-nonylphenol, 4-nonylphenolmonoethoxylate, and 4-nonylphenoldiethoxylate) was evaluated in stream biofilm, sediment, and water matrices collected from locations upstream and downstream from a wastewater treatment plant effluent discharge. Both biologically mediated transformation to intermediate metabolites and biologically mediated mineralization were evaluated in separate time interval experiments. Initial time intervals (0–7 d) evaluated biodegradation by the microbial community dominant at the time of sampling. Later time intervals (70 and 185 d) evaluated the biodegradation potential as the microbial community adapted to the absence of outside energy sources. The sediment matrix was more effective than the biofilm and water matrices at biodegrading 4-nonylphenol and 17β-estradiol. Biodegradation of 17α-ethynylestradiol by the sediment matrix occurred at later time intervals (70 and 185 d) and was not observed in the biofilm or water matrices. Stream biofilms play an important role in the attenuation of endocrine-disrupting compounds in surface waters due to both biodegradation and sorption processes. Because sorption to stream biofilms and bed sediments occurs on a faster temporal scale (<1 h) than the potential to biodegrade the target compounds (50% mineralization at >185 d), these compounds can accumulate in stream biofilms and sediments.

  17. How Hot Are Drosophila Hotspots? Examining Recombination Rate Variation and Associations with Nucleotide Diversity, Divergence, and Maternal Age in Drosophila pseudoobscura

    PubMed Central

    Manzano-Winkler, Brenda; McGaugh, Suzanne E.; Noor, Mohamed A. F.

    2013-01-01

    Fine-scale meiotic recombination maps have uncovered a large amount of variation in crossover rate across the genomes of many species, and such variation in mammalian and yeast genomes is concentrated in <5 kb regions of highly elevated recombination rates (10–100x the background rate) called "hotspots." Drosophila exhibit substantial recombination rate heterogeneity across their genome, but evidence for these highly localized hotspots is lacking. We assayed recombination across a 40 kb region of Drosophila pseudoobscura chromosome 2, with one 20 kb interval assayed in 5 kb sections and the adjacent 20 kb interval bisected into 10 kb pieces. We found that recombination events across the 40 kb stretch were relatively evenly distributed across each of the 5 kb and 10 kb intervals, rather than concentrated in a single 5 kb region. This, in combination with other recent work, indicates that the recombination landscape of Drosophila may differ from the punctate recombination pattern observed in many mammals and yeast. Additionally, we found no correlation of average pairwise nucleotide diversity and divergence with recombination rate across the 20 kb intervals, nor any effect of maternal age in weeks on recombination rate in our sample. PMID:23967224

  18. Using Stochastic Approximation Techniques to Efficiently Construct Confidence Intervals for Heritability.

    PubMed

    Schweiger, Regev; Fisher, Eyal; Rahmani, Elior; Shenhav, Liat; Rosset, Saharon; Halperin, Eran

    2018-06-22

    Estimation of heritability is an important task in genetics. The use of linear mixed models (LMMs) to determine narrow-sense single-nucleotide polymorphism (SNP)-heritability and related quantities has received much recent attention, due to its ability to account for variants with small effect sizes. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. The common way to report the uncertainty in REML estimation uses standard errors (SEs), which rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals (CIs). In addition, for larger data sets (e.g., tens of thousands of individuals), the construction of SEs itself may require considerable time, as it requires expensive matrix inversions and multiplications. Here, we present FIESTA (Fast confidence IntErvals using STochastic Approximation), a method for constructing accurate CIs. FIESTA is based on parametric bootstrap sampling and therefore avoids unjustified assumptions on the distribution of the heritability estimator. FIESTA uses stochastic approximation techniques, which accelerate the construction of CIs by several orders of magnitude, compared with previous approaches as well as with the analytical approximation used by SEs. FIESTA builds accurate CIs rapidly, for example, requiring only several seconds for data sets of tens of thousands of individuals, making FIESTA a very fast solution to the problem of building accurate CIs for heritability for all data set sizes.
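
    The core idea FIESTA builds on, parametric bootstrap confidence intervals, can be sketched on a toy problem. The example below estimates an exponential rate rather than heritability; it shows the generic recipe (fit the model, resample from the fitted model, re-estimate, take percentiles), not FIESTA's REML-based and stochastic-approximation-accelerated implementation.

```python
# Generic parametric bootstrap CI: simulate from the *fitted* model rather
# than assuming asymptotic normality of the estimator. Toy exponential-rate
# example; illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def parametric_bootstrap_ci(data, n_boot=2000, alpha=0.05):
    rate_hat = 1.0 / np.mean(data)            # MLE of an exponential rate
    boot = np.empty(n_boot)
    for b in range(n_boot):
        # Resample from the fitted model, not from the raw data
        sim = rng.exponential(scale=1.0 / rate_hat, size=len(data))
        boot[b] = 1.0 / np.mean(sim)
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return rate_hat, (lo, hi)

data = rng.exponential(scale=0.5, size=200)   # true rate = 2.0
rate_hat, (lo, hi) = parametric_bootstrap_ci(data)
```

Because the bootstrap distribution is generated from the model itself, the interval respects constraints such as a bounded parameter space, which is the failure mode of SE-based intervals that the abstract describes.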

  19. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    PubMed

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
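
    The eigenvalue overdispersion described above is easy to reproduce by simulation: for a true covariance whose eigenvalues are all exactly 1, the sample covariance's leading eigenvalue comes out above 1 and its smallest below 1. This is a generic illustration of the sampling-error phenomenon, not the authors' REML analysis.

```python
# Demonstration of sampling-error overdispersion of covariance eigenvalues.
# True covariance = identity (all eigenvalues 1); sample eigenvalues spread out.
import numpy as np

rng = np.random.default_rng(42)
p, n, reps = 10, 50, 200        # traits, observations, replicate data sets
top, bottom = [], []
for _ in range(reps):
    x = rng.standard_normal((n, p))           # draws with identity covariance
    eig = np.linalg.eigvalsh(np.cov(x, rowvar=False))  # sorted ascending
    top.append(eig[-1])                       # largest sample eigenvalue
    bottom.append(eig[0])                     # smallest sample eigenvalue

# On average the largest eigenvalue exceeds 1 and the smallest falls below 1,
# even though every true eigenvalue is exactly 1.
```

Random matrix theory predicts the edges of this spread: for p/n = 0.2 the extreme eigenvalues concentrate near (1 ± sqrt(p/n))², which is exactly the regime where the TW distribution applies.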

  20. Discrimination training facilitates pigeons' performance on one-trial-per-day delayed matching of key location

    PubMed Central

    Willson, Robert J.; Wilkie, Donald M.

    1991-01-01

    Six pigeons were tested on a one-trial-per-day variant of delayed matching of key location. In one condition, a trial began with the illumination of a pair of quasi-randomly selected pecking keys in a large 10-key test box. Pigeons' pecks to one key (the sample) were reinforced with 8-second access to grain on a variable-interval 30-second schedule, whereas pecks to the other key (the distractor) had no scheduled consequences. In the second condition, the nonreinforced distractor was not presented. In both conditions, subjects were removed from the apparatus after 15 minutes and placed in a holding cage. Subjects were subsequently replaced in the box after a delay (retention interval) of 30 seconds and were reexposed to the illuminated sample and distractor keys for 1 minute. If a pigeon made more pecks to the sample during this interval, the distractor was extinguished and subsequent pecks to the sample were reinforced on the previous schedule for an additional 15 minutes. If, however, a pigeon made more pecks to the distractor, both keys were extinguished and the subject was returned to its home cage. For all subjects, matching-to-sample accuracy was higher in the first condition. In a second experiment, the retention interval was increased to 5, 15, and 30 minutes, and then to 1, 2, 4, 8, 12, and 24 hours. Most subjects remembered the correct key location for up to 4 hours, and in one case, up to 24 hours, demonstrating a spatial-memory proficiency far better than previously reported in this species on delayed matching tasks. The results are discussed in terms of the commonly held distinction between working and reference memory. PMID:16812633

  1. A post hoc evaluation of a sample size re-estimation in the Secondary Prevention of Small Subcortical Strokes study.

    PubMed

    McClure, Leslie A; Szychowski, Jeff M; Benavente, Oscar; Hart, Robert G; Coffey, Christopher S

    2016-10-01

    The use of adaptive designs has been increasing in randomized clinical trials. Sample size re-estimation is a type of adaptation in which nuisance parameters are estimated at an interim point in the trial and the sample size is re-computed based on these estimates. The Secondary Prevention of Small Subcortical Strokes study was a randomized clinical trial assessing the impact of single- versus dual-antiplatelet therapy and control of systolic blood pressure to a higher (130-149 mmHg) versus lower (<130 mmHg) target on recurrent stroke risk in a two-by-two factorial design. A sample size re-estimation was performed during the Secondary Prevention of Small Subcortical Strokes study, resulting in an increase from the planned sample size of 2500 to 3020, and we sought to determine the impact of the sample size re-estimation on the study results. We assessed the results of the primary efficacy and safety analyses with the full 3020 patients and compared them to the results that would have been observed had randomization ended with 2500 patients. The primary efficacy outcome considered was recurrent stroke, and the primary safety outcomes were major bleeds and death. We computed incidence rates for the efficacy and safety outcomes and used Cox proportional hazards models to examine the hazard ratios for each of the two treatment interventions (i.e. the antiplatelet and blood pressure interventions). In the antiplatelet intervention, the hazard ratio was not materially modified by increasing the sample size, nor did the conclusions regarding the efficacy of mono- versus dual-therapy change: there was no difference in the effect of dual- versus monotherapy on the risk of recurrent stroke (n = 3020 HR (95% confidence interval): 0.92 (0.72, 1.2), p = 0.48; n = 2500 HR (95% confidence interval): 1.0 (0.78, 1.3), p = 0.85).
With respect to the blood pressure intervention, increasing the sample size resulted in less certainty in the results, as the hazard ratio for the higher versus lower systolic blood pressure target approached, but did not achieve, statistical significance with the larger sample (n = 3020 HR (95% confidence interval): 0.81 (0.63, 1.0), p = 0.089; n = 2500 HR (95% confidence interval): 0.89 (0.68, 1.17), p = 0.40). The results from the safety analyses were similar with 3020 and 2500 patients for both study interventions. Other trial-related factors, such as contracts, finances, and study management, were affected as well. Adaptive designs can have benefits in randomized clinical trials, but do not always result in significant findings. The impact of adaptive designs should be measured in terms of both trial results and practical issues related to trial management. More post hoc analyses of study adaptations will lead to better understanding of the balance between the benefits and the costs. © The Author(s) 2016.
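
    As a generic illustration of nuisance-parameter-based sample size re-estimation (not the SPS3 calculation, which used a time-to-event design), the sketch below re-computes the per-group sample size for a two-proportion comparison after an interim update of the control event rate. All rates and the function name are invented for the example.

```python
# Nuisance-parameter sample size re-estimation sketch: the treatment effect
# assumption (relative risk reduction) is held fixed while the nuisance
# parameter (control event rate) is replaced by its interim estimate.
from scipy.stats import norm

def n_per_group(p_ctrl, rel_risk_reduction, alpha=0.05, power=0.80):
    p_trt = p_ctrl * (1 - rel_risk_reduction)
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    var = p_ctrl * (1 - p_ctrl) + p_trt * (1 - p_trt)
    return int(round((za + zb) ** 2 * var / (p_ctrl - p_trt) ** 2))

planned = n_per_group(p_ctrl=0.10, rel_risk_reduction=0.25)
# Interim data suggest a lower control event rate, so a larger trial is needed
# to observe the same number of events:
revised = n_per_group(p_ctrl=0.08, rel_risk_reduction=0.25)
```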

  2. Predicted bulk composition of petroleum generated by Lower Cretaceous Wealden black shales, Lower Saxony Basin, Germany

    NASA Astrophysics Data System (ADS)

    Ziegs, Volker; Mahlstedt, Nicolaj; Bruns, Benjamin; Horsfield, Brian

    2015-09-01

    The Berriasian Wealden Shale provides the favourable situation of possessing immature to overmature source rock intervals due to differential subsidence within the Lower Saxony Basin. Hydrocarbon generation kinetics and petroleum physical properties were investigated on four immature Wealden Shale samples from different depth intervals, following the PhaseKinetics approach of di Primio and Horsfield (AAPG Bull 90(7):1031-1058, 2006). Kinetic parameters and phase prediction were applied to a thermally calibrated 1D model of the geodynamic evolution at the location of an overmature well. The immature source rocks of all depth intervals comprise kerogen type I derived from the lacustrine alga Botryococcus braunii. Bulk kinetics of the lower three depth intervals (samples 2-4) can be described by a single activation energy Ea, typical of homogeneous, lacustrine organic matter (OM), whereas sample 1 from the uppermost interval shows a slightly broader Ea distribution, which hints at more heterogeneous, less stable OM, but still of lacustrine origin. Predicted physical properties of the generated petroleum fluids are characteristic of variably waxy black oil, with GORs below 100 Sm3/Sm3 and saturation pressures below 150 bar. Petroleum fluids from the more heterogeneous OM-containing sample 1 are consistently described by slightly higher values. Based on the occurrence of paraffinic free hydrocarbons in the uppermost horizon of the overmature well and gas/condensate in the lower three depth intervals, two scenarios are discussed. From the first and least realistic scenario, which assumes no expulsion from the source rock, it can be deduced that phase separation in the course of uplift can only have occurred in the uppermost interval, containing the slightly less stable OM, and not in the lower intervals, which are composed of more stable OM.
Therefore, taking secondary cracking into account, all depth intervals should contain gas/condensate, and the free hydrocarbons in the upper horizon are interpreted as impregnation from migrated hydrocarbons. The second scenario assumes nearly complete expulsion due to fracturing by so-called generation overpressure (Mann et al. in Petroleum and basin evolution. Springer, Berlin, 1997). The expelled petroleum might migrate into lower-pressured source rock horizons and reach bubble-point pressures, leading to the exsolution of gas and "precipitation" of very high molecular weight bitumen unable to migrate. Subsequent burial of the latter in the course of basin evolution would lead to secondary cracking, with the remaining pyrobitumen explaining the high amounts of pyrobitumen in the overmature well Ex-B and the relatively high TOC contents at such high maturity levels.

  3. Hematology and biochemistry reference intervals for Ontario commercial nursing pigs close to the time of weaning

    PubMed Central

    Perri, Amanda M.; O’Sullivan, Terri L.; Harding, John C.S.; Wood, R. Darren; Friendship, Robert M.

    2017-01-01

    The evaluation of pig hematology and biochemistry parameters is rarely performed, largely because of the costs of laboratory testing and labor and the limited availability of the reference intervals needed for interpretation. Within-herd and between-herd biological variation of these values also makes it difficult to establish reference intervals. Regardless, baseline reference intervals are important to aid veterinarians in interpreting blood parameters for the diagnosis and treatment of diseased swine. The objective of this research was to provide reference intervals for hematology and biochemistry parameters of 3-week-old commercial nursing piglets in Ontario. A total of 1032 pigs lacking clinical signs of disease from 20 swine farms were sampled for hematology and iron panel evaluation, with biochemistry analysis performed on a subset of 189 randomly selected pigs. The 95% reference interval, mean, median, range, and 90% confidence intervals were calculated for each parameter. PMID:28373729
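
    For context, the common nonparametric route to such estimates (the central 95% of values as the reference interval, with bootstrap confidence intervals around each limit) can be sketched as follows. The function name and bootstrap approach are illustrative assumptions; the abstract does not state the paper's exact statistical procedure.

```python
import numpy as np

def reference_interval(values, n_boot=2000, ci=0.90, seed=0):
    """Nonparametric 95% reference interval (2.5th-97.5th percentiles)
    with bootstrap 90% confidence intervals on each limit.
    Illustrative sketch; not the paper's stated method."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    lower, upper = np.percentile(values, [2.5, 97.5])
    # Resample with replacement to put a CI around each reference limit
    boot = rng.choice(values, size=(n_boot, values.size), replace=True)
    blo = np.percentile(boot, 2.5, axis=1)
    bhi = np.percentile(boot, 97.5, axis=1)
    tail = (1.0 - ci) / 2.0 * 100.0
    return {
        "interval": (lower, upper),
        "mean": float(values.mean()),
        "median": float(np.median(values)),
        "lower_limit_ci": tuple(np.percentile(blo, [tail, 100 - tail])),
        "upper_limit_ci": tuple(np.percentile(bhi, [tail, 100 - tail])),
    }
```

    With 1032 sampled pigs, a nonparametric approach is well supported; commonly cited guidance (e.g. CLSI EP28) suggests at least 120 observations for nonparametric reference limits.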

  4. Application and assessment of a regular environmental monitoring of the antineoplastic drug contamination level in pharmacies - the MEWIP project.

    PubMed

    Kiffmeyer, Thekla K; Tuerk, Jochen; Hahn, Moritz; Stuetzer, Hartmut; Hadtstein, Claudia; Heinemann, André; Eickmann, Udo

    2013-05-01

    A large-scale study was carried out to determine the contamination level of antineoplastic drugs in pharmacies and to investigate the suitability and effects of wipe sample monitoring at regular intervals. A specific study design was developed. The 130 participating pharmacies were divided into a study and a control group, carrying out five and two wipe sampling cycles, respectively. Work practice was analyzed using questionnaires to identify factors that influence the contamination level. Of 1269 wipe samples, 774 (61%) were contaminated with at least one of the analyzed cytotoxic drugs: cyclophosphamide, docetaxel, etoposide, 5-fluorouracil, gemcitabine, ifosfamide, methotrexate, and paclitaxel. A significant decrease in contamination with cyclophosphamide and 5-fluorouracil was observed in the study group. The Monitoring-Effect Study of Wipe Sampling in Pharmacies method has proven to be a reliable and affordable tool for contamination control. Based on the 90th percentile of the contamination values, a substance-independent, performance-based guidance value of 0.1 ng/cm² has been derived.

  5. Preliminary Observations of Noise Spectra at the SRO and ASRO Stations

    USGS Publications Warehouse

    Peterson, Jon

    1980-01-01

    Introduction The seismic noise spectra presented in this report were derived from SRO and ASRO station data for the purpose of evaluating the performance of the seismic instruments. They are also useful for constructing a spectral estimate of earth noise at a quiet site based on noise samples obtained from a network of globally distributed sites. It is hoped that the spectra will be useful for other purposes as well. The term 'noise' is used here to describe the ambient signals recorded during a quiet period when earthquake signals have not been detected by visual inspection of the analog seismogram. The total recorded noise is the sum of instrumental noise, environmental noise (such as effects of temperature, pressure, and wind), earth background noise from both natural and cultural sources, and very possibly low-level signals from earthquakes that cannot be visually identified. It is not possible to separate and quantify the signals generated by these independent noise sources using a single sample of station data, although instrumental problems may be indicated by gross changes of noise levels, if the changes are not in the microseismic bands. Since seismic data at the SRO and ASRO stations are recorded in a digital format, spectral computations can be automated so that station noise levels can be monitored as part of data-review procedures. The noise spectra presented in this study are intended to serve as an initial baseline against which relative changes in noise levels can be measured. Total noise power was computed separately for the short- and long-period bands, which are recorded separately at the stations. Power spectral densities were derived by averaging the spectral estimates of a number of contiguous data segments. The mean value and slope were removed from each segment, cosine-tapered windows were applied, and the estimates were obtained using a fast Fourier transform.
In the short-period analyses 16 segments were used, each segment being 1024 samples in length. Because the sampling interval is 0.05 seconds, the total record length is nearly 13.7 minutes. Normally, the short-period SRO and ASRO data are recorded in an event-only mode. However, several days of continuous short-period data were acquired from most stations for the purpose of this study. Where there was appreciable diurnal variation in short-period noise, spectral data were computed for both day and night intervals. In most cases the long-period spectral densities were obtained by averaging the estimates from 16 data segments, each segment having a length of 2048 samples. Since the long-period sampling interval is 1 second, the total record length used was nearly 9.1 hours. In a few instances, a smaller number of segments was averaged. Spectral data were computed from the vertical-component short-period signals and all three components of long-period signals. All of the spectral plots have been corrected for known instrument response and are presented in units of earth displacement. With a few exceptions, the samples of noise data used were acquired during the early months of 1980, winter at some of the stations and summer at others. The starting times for the intervals analyzed are listed in Table 1. A seasonal variation of noise levels in microseismic bands is to be expected. However, none of the stations were experiencing a noticeably high level of microseisms during the intervals analyzed. Weltman and others (1979) have studied and reported daily and seasonal RMS (root-mean-square) noise trends at the SRO and ASRO stations.
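
    The procedure described above (detrend each contiguous segment, apply a cosine taper, FFT, and average the squared spectra) is essentially a Welch-type averaged periodogram. A minimal sketch follows; the function name, taper choice (Hann), and normalization are illustrative assumptions, not the report's exact implementation.

```python
import numpy as np

def averaged_psd(x, fs, nseg=16, seglen=1024):
    """Averaged-periodogram PSD estimate: remove mean and slope from
    each contiguous segment, apply a cosine taper, FFT, and average."""
    freqs = np.fft.rfftfreq(seglen, d=1.0 / fs)
    window = np.hanning(seglen)            # cosine taper
    norm = fs * np.sum(window ** 2)        # periodogram normalization
    psd = np.zeros(freqs.size)
    t = np.arange(seglen)
    for i in range(nseg):
        seg = x[i * seglen:(i + 1) * seglen].astype(float)
        slope, intercept = np.polyfit(t, seg, 1)
        seg -= slope * t + intercept       # remove mean value and slope
        psd += np.abs(np.fft.rfft(seg * window)) ** 2 / norm
    return freqs, psd / nseg

# Short-period case: 16 segments of 1024 samples at a 0.05 s interval
# span 16 * 1024 * 0.05 s = 819.2 s, i.e. nearly 13.7 minutes.
```

    The same arithmetic with the long-period parameters (16 segments of 2048 samples at a 1 s interval) gives 32768 s, nearly the 9.1 hours quoted above.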

  6. Care coordination, the family-centered medical home, and functional disability among children with special health care needs.

    PubMed

    Litt, Jonathan S; McCormick, Marie C

    2015-01-01

    Children with special health care needs (CSHCN) are at increased risk for functional disabilities. Care coordination has been shown to decrease unmet health service needs but has not yet been shown to improve functional status. We hypothesized that care coordination services lower the odds of functional disability for CSHCN and that this effect is greater within the context of a family-centered medical home. A secondary objective was to test the mediating effect of unmet care needs on functional disability. Our sample included children ages 0 to 17 years participating in the 2009-2010 National Survey of Children with Special Health Care Needs. Care coordination, unmet needs, and disability were measured by parent report. We used logistic regression models with covariate adjustment for confounding and a mediation analysis approach for binary outcomes to assess the effect of unmet needs. There were 34,459 children in our sample. Care coordination was associated with lower odds of having a functional disability (adjusted odds ratio 0.82, 95% confidence interval 0.77, 0.88). This effect was greater for care coordination in the context of a medical home (adjusted odds ratio 0.71, 95% confidence interval 0.66, 0.76). The relationship between care coordination and functional disability was mediated by reducing unmet services. Care coordination is associated with lower odds of functional disability among CSHCN, especially when delivered in the setting of a family-centered medical home. Reducing unmet service needs mediates this effect. Our findings support a central role for coordination services in improving outcomes for vulnerable children. Copyright © 2015 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  7. Secretaries, depression and absenteeism.

    PubMed

    Garrison, R; Eaton, W W

    1992-01-01

    This study examines the prevalence of Major Depressive Disorder, missed work, and mental health service use among secretaries and other women employed full-time. In a random sample of 3,484 women employed full-time, women employed as secretaries were significantly more likely to be depressed than other women, even after controlling for socio-demographic characteristics (odds ratio = 1.69, 95% confidence interval = 1.05, 2.73). Secretaries were significantly more likely to report missing work in the last three months (odds ratio = 1.77, confidence interval = 1.01, 3.11), a finding not attributable to depression. Secretaries were also more likely to seek mental health services, but this finding was not significant (odds ratio = 1.78, confidence interval = 0.55, 5.78). It is possible that these findings are attributable to a selection effect whereby depressed women, and women who are likely to miss work, become secretaries. A second possibility is that women employed as secretaries have more "nonwork role stress" than other employed women. Alternatively, job conditions that result in dissatisfaction and stress may lead to depression and absenteeism. We believe our findings warrant further investigation into the work environment of secretaries.
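
    The crude form of the odds ratios and confidence intervals quoted above comes from a 2x2 exposure-outcome table with a Wald interval on the log-odds scale. The counts below are hypothetical, and the study itself adjusted for socio-demographic covariates via regression rather than using this crude calculation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table:
        exposed:   a cases, b non-cases
        unexposed: c cases, d non-cases
    with a Wald 95% confidence interval on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```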

  8. Efficacy of Miswak toothpaste and mouthwash on cariogenic bacteria

    PubMed Central

    Al-Dabbagh, Samim A.; Qasim, Huda J.; Al-Derzi, Nadia A.

    2016-01-01

    Objectives: To evaluate the efficacy of Salvadora persica (Miswak) products on cariogenic bacteria in comparison with ordinary toothpaste. Methods: The study was conducted in Zakho city, Kurdistan region, Iraq, during the period from October 2013 to January 2014. A randomized controlled clinical trial of 40 students randomly allocated into 4 groups. They were instructed to use Miswak toothpaste, Miswak mouthwash, or ordinary toothpaste with water or with normal saline. Salivary samples were collected at three time intervals: before use, immediately after use, and after 2 weeks of use. The effect of each method on Streptococcus mutans and Lactobacilli was evaluated using a caries risk test. Results: One-way repeated measures analysis of variance (ANOVA), one-way ANOVA, and least significant difference tests were used. Miswak mouthwash had a significant reduction effect on both bacteria immediately and after 2 weeks of use. Miswak toothpaste had a similar effect on Lactobacilli, while Streptococcus mutans showed a significant decrease only after 2 weeks of use. Ordinary toothpaste showed a nonsignificant effect on both bacteria at both time intervals, while the addition of normal saline showed a significant effect on both bacteria only after 2 weeks of use. Conclusion: Miswak products, especially the mouthwash, were more effective in reducing the growth of cariogenic bacteria than ordinary toothpaste. PMID:27570858

  9. Effects of Birth Month on Child Health and Survival in Sub-Saharan Africa

    PubMed Central

    Dorélien, Audrey M.

    2015-01-01

    Birth month is broadly predictive of both under-five mortality rates and stunting throughout most of sub-Saharan Africa (SSA). Observed factors, such as mother's age at birth and educational status, are correlated with birth month but are not the main factors underlying the relationship between birth month and child health. Accounting for maternal selection via a fixed-effects model attenuates the relationship between birth month and health in many SSA countries. In the remaining countries, the effect of birth month may be mediated by environmental factors. Birth month effects on mortality typically do not vary across age intervals; the differential mortality rates by birth month were evident in the neonatal period and continued across age intervals. The male-to-female sex ratio at birth did not vary by birth month, which suggests that in utero exposures are not influencing fetal loss and, therefore, that the birth month effects are not likely due to selective survival during the in utero period. In one-third of the sample, the birth month effects on stunting diminished after the age of two years; therefore, some children were able to catch up. Policies to improve child health should target pregnant women and infants and must take seasonality into account. PMID:26266973

  10. A Self-Contained Pole Syringe Array for Closed-Interval Water Sampling.

    DTIC Science & Technology

    1982-10-19

    Interim report on one phase of an NRL problem (Naval Research Laboratory, Washington, DC; R. E. Pellenbarg et al., 19 October 1982) describing a self-contained pole syringe array for closed-interval water sampling.

  11. A Two-Stage Method to Determine Optimal Product Sampling considering Dynamic Potential Market

    PubMed Central

    Hu, Zhineng; Lu, Wei; Han, Bing

    2015-01-01

    This paper develops an optimization model for the diffusion effects of free samples under dynamic changes in potential market based on the characteristics of independent product and presents a two-stage method to figure out the sampling level. The impact analysis of the key factors on the sampling level shows that the increase of the external coefficient or internal coefficient has a negative influence on the sampling level. And the changing rate of the potential market has no significant influence on the sampling level whereas the repeat purchase has a positive one. Using logistic analysis and regression analysis, the global sensitivity analysis gives a whole analysis of the interaction of all parameters, which provides a two-stage method to estimate the impact of the relevant parameters in the case of inaccuracy of the parameters and to be able to construct a 95% confidence interval for the predicted sampling level. Finally, the paper provides the operational steps to improve the accuracy of the parameter estimation and an innovational way to estimate the sampling level. PMID:25821847
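
    The "external coefficient" and "internal coefficient" in such diffusion models are conventionally the p and q parameters of the Bass model. A minimal discrete-time sketch follows, in which free samples are assumed to seed the initial adopter base; this is an illustrative simplification, not the paper's exact two-stage formulation.

```python
def bass_adoption(p, q, m, samples=0, periods=20):
    """Discrete-time Bass diffusion: new adopters each period are driven
    by external influence p and internal (word-of-mouth) influence q
    acting on the remaining potential market m - cum. Free samples are
    treated as seeding the initial adopter base (hypothetical)."""
    cum = min(samples, m)          # sampled customers counted as adopters
    path = [cum]
    for _ in range(periods):
        new = (p + q * cum / m) * (m - cum)
        cum += new
        path.append(cum)
    return path
```

    With p + q < 1 the cumulative path rises monotonically toward the potential market m, and seeding more free samples shifts adoption earlier, which is the diffusion effect the optimization trades off against sampling cost.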

  12. The Walter Reed performance assessment battery.

    PubMed

    Thorne, D R; Genser, S G; Sing, H C; Hegge, F W

    1985-01-01

    This paper describes technical details of a computerized psychological test battery designed for examining the effects of various state-variables on a representative sample of normal psychomotor, perceptual and cognitive tasks. The duration, number and type of tasks can be customized to different experimental needs, and then administered and analyzed automatically, at intervals as short as one hour. The battery can be run on either the Apple-II family of computers or on machines compatible with the IBM-PC.

  13. Filtering Drifter Trajectories Sampled at Submesoscale Resolution

    DTIC Science & Technology

    2015-07-10

    With a sampling interval of 5 min and a positioning error of 1.5 m, the acceleration error is comparable with the typical Coriolis acceleration experienced by a water parcel traveling at a speed of 2.2 m/s. Synthetic trajectories were computed by integrating the NCOM velocity field contaminated by a random walk process whose effective dispersion coefficient (150 m²/s) was specified.

  14. The Effect of Pixel Size on the Accuracy of Orthophoto Production

    NASA Astrophysics Data System (ADS)

    Kulur, S.; Yildiz, F.; Selcuk, O.; Yildiz, M. A.

    2016-06-01

    In our country, orthophoto products are used by the public and private sectors for engineering services and infrastructure projects. Orthophotos are particularly preferred because their production is faster and more economical than vector-based digital photogrammetric production. Today, digital orthophotos provide the accuracy expected for engineering and infrastructure projects. In this study, the accuracy of orthophotos produced from pixel sizes with different sampling intervals is tested against the expectations of engineering and infrastructure projects.

  15. Evoked Potential Studies of the Effects of Impact Acceleration on the Motor Nervous System,

    DTIC Science & Technology

    1983-01-01

    Stimuli of sufficient amperage were applied to obtain good afferent evoked responses in experimental animals subjected to -Y (lateral impact) acceleration. Evoked potentials recorded from these animals before and after impact in the NBDL -Y impact experiments were processed at the Texas Research Institute of Mental Sciences; for each animal subjected to a lateral (-Y) collision, adjustments were made to the playback discriminators and the sampling interval.

  16. Pharmacokinetic interactions and safety evaluations of coadministered tafenoquine and chloroquine in healthy subjects

    PubMed Central

    Miller, Ann K; Harrell, Emma; Ye, Li; Baptiste-Brown, Sharon; Kleim, Jörg-Peter; Ohrt, Colin; Duparc, Stephan; Möhrle, Jörg J; Webster, Alison; Stinnett, Sandra; Hughes, Arlene; Griffith, Sandy; Beelen, Andrew P

    2013-01-01

    Aims The long-acting 8-aminoquinoline tafenoquine (TQ) coadministered with chloroquine (CQ) may radically cure Plasmodium vivax malaria. Coadministration therapy was evaluated for a pharmacokinetic interaction and for pharmacodynamic, safety and tolerability characteristics. Methods Healthy subjects, 18–55 years old, without documented glucose-6-phosphate dehydrogenase deficiency, received CQ alone (days 1–2, 600 mg; and day 3, 300 mg), TQ alone (days 2 and 3, 450 mg) or coadministration therapy (day 1, CQ 600 mg; day 2, CQ 600 mg + TQ 450 mg; and day 3, CQ 300 mg + TQ 450 mg) in a randomized, double-blind, parallel-group study. Blood samples for pharmacokinetic and pharmacodynamic analyses and safety data, including electrocardiograms, were collected for 56 days. Results The coadministration of CQ + TQ had no effect on TQ AUC0–t, AUC0–∞, Tmax or t1/2. The 90% confidence intervals of CQ + TQ vs. TQ for AUC0–t, AUC0–∞ and t1/2 indicated no drug interaction. On day 2 of CQ + TQ coadministration, TQ Cmax and AUC0–24 increased by 38% (90% confidence interval 1.27, 1.64) and 24% (90% confidence interval 1.04, 1.46), respectively. The pharmacokinetics of CQ and its primary metabolite desethylchloroquine were not affected by TQ. Coadministration had no clinically significant effect on QT intervals and was well tolerated. Conclusions No clinically significant safety or pharmacokinetic/pharmacodynamic interactions were observed with coadministered CQ and TQ in healthy subjects. PMID:23701202
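
    The pharmacokinetic endpoints compared above (AUC0–t, Cmax, Tmax) are standard noncompartmental quantities. A minimal sketch using the linear trapezoidal rule on a hypothetical concentration-time profile; the units and values are illustrative, not study data.

```python
import numpy as np

def nca_summary(times_h, conc):
    """Noncompartmental summary: AUC0-t by the linear trapezoidal rule,
    plus Cmax and Tmax read directly from the observed profile."""
    times_h = np.asarray(times_h, dtype=float)
    conc = np.asarray(conc, dtype=float)
    # Linear trapezoidal AUC0-t: sum of mean segment heights * widths
    auc = float(np.sum((conc[1:] + conc[:-1]) / 2.0 * np.diff(times_h)))
    i = int(np.argmax(conc))
    return {"auc0_t": auc, "cmax": float(conc[i]), "tmax": float(times_h[i])}

# Hypothetical profile: time (h) vs. plasma concentration (ng/mL)
profile = nca_summary([0, 1, 2, 4], [0, 10, 8, 2])
```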

  17. Climate change and the selective signature of the Late Ordovician mass extinction.

    PubMed

    Finnegan, Seth; Heim, Noel A; Peters, Shanan E; Fischer, Woodward W

    2012-05-01

    Selectivity patterns provide insights into the causes of ancient extinction events. The Late Ordovician mass extinction was related to Gondwanan glaciation; however, it is still unclear whether elevated extinction rates were attributable to record failure, habitat loss, or climatic cooling. We examined Middle Ordovician-Early Silurian North American fossil occurrences within a spatiotemporally explicit stratigraphic framework that allowed us to quantify rock record effects on a per-taxon basis and assay the interplay of macrostratigraphic and macroecological variables in determining extinction risk. Genera that had large proportions of their observed geographic ranges affected by stratigraphic truncation or environmental shifts at the end of the Katian stage were particularly hard hit. The duration of the subsequent sampling gaps had little effect on extinction risk, suggesting that this extinction pulse cannot be entirely attributed to rock record failure; rather, it was caused, in part, by habitat loss. Extinction risk at this time was also strongly influenced by the maximum paleolatitude at which a genus had previously been sampled, a macroecological trait linked to thermal tolerance. A model trained on the relationship between 16 explanatory variables and extinction patterns during the early Katian interval substantially underestimates the extinction of exclusively tropical taxa during the late Katian interval. These results indicate that glacioeustatic sea-level fall and tropical ocean cooling played important roles in the first pulse of the Late Ordovician mass extinction in Laurentia.

  18. Effects of fluid, electrolyte and substrate ingestion on endurance capacity.

    PubMed

    Maughan, R J; Fenn, C E; Leiper, J B

    1989-01-01

    The availability of carbohydrate (CHO) as a substrate for the exercising muscles is known to be a limiting factor in the performance of prolonged cycle exercise, and provision of exogenous CHO in the form of glucose can increase endurance capacity. The present study examined the effects of ingestion of fluids and of CHO in different forms on exercise performance. Six male volunteers exercised to exhaustion on a cycle ergometer at a workload which required approximately 70% of VO2max. After one preliminary trial, subjects performed this exercise test on six occasions, one week apart. Immediately before exercise, and at 10-min intervals throughout, subjects ingested 100 ml of one of the following: control (no drink), water, glucose syrup, fructose syrup, glucose-fructose syrup or a dilute glucose-electrolyte solution. Each of the syrup solutions contained approximately 36 g CHO per 100 ml; the isotonic glucose-electrolyte solution contained 4 g glucose per 100 ml. A randomised Latin square order of administration of trials was employed. Expired air samples for determination of VO2, respiratory exchange ratio and rate of CHO oxidation were collected at 15-min intervals. Venous blood samples were obtained before and after exercise. Subjects drinking the isotonic glucose-electrolyte solution exercised longer (90.8 (12.4) min, mean (SEM)) than on the control test (70.2 (8.3) min; p less than 0.05). (ABSTRACT TRUNCATED AT 250 WORDS)

  19. The Effects of Kangaroo Mother Care and Swaddling on Venipuncture Pain in Premature Neonates: A Randomized Clinical Trial.

    PubMed

    Dezhdar, Shahin; Jahanpour, Faezeh; Firouz Bakht, Saeedeh; Ostovar, Afshin

    2016-04-01

    Hospitalized premature babies often undergo various painful procedures. Kangaroo mother care (KMC) and swaddling are two pain reduction methods. This study was undertaken to compare the effects of swaddling and KMC on pain during venous sampling in premature neonates. This study was performed as a randomized clinical trial on 90 premature neonates. The neonates were divided into three groups using a random allocation block. The three groups were group A (swaddling), group B (KMC), and group C (control). In all three groups, the heart rate and arterial oxygen saturation were measured and recorded in time intervals of 30 seconds before, during, and 30, 60, 90, and 120 seconds after blood sampling. The neonate's face was video recorded and assessed using the premature infant pain profile (PIPP) at time intervals of 30 seconds. The data was analyzed using the t-test, chi-square test, Repeated Measure analysis of variance (ANOVA), Kruskal-Wallis, Post-hoc, and Bonferroni test. The findings revealed that pain was reduced to a great extent in the swaddling and KMC methods compared to the control group. However, there was no significant difference between KMC and swaddling (P ≥ 0.05). The results of this study indicate that there is no meaningful difference between swaddling and KMC on physiological indexes and pain in neonates. Therefore, the swaddling method may be a good substitute for KMC.

  20. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbertson, Robert D.; Patterson, Brian M.; Smith, Zachary

    An accelerated aging study of BKC 44306-10 rigid polyurethane foam was carried out. Foam samples were aged in a nitrogen atmosphere at three different temperatures: 50 °C, 65 °C, and 80 °C. Foam samples were periodically removed from the aging canisters at 1, 3, 6, 9, 12, and 15 month intervals, when FT-IR spectroscopy, dimensional analysis, and mechanical testing experiments were performed. Micro computed tomography imaging was also employed to study the morphology of the foams. Over the course of the aging study the foams decreased in size by about 0.001 inches per inch of foam. Micro-CT showed the heterogeneous nature of the foam structure, likely resulting from flow effects during the molding process. The effect of aging on the compression and tensile strength of the foam was minor and no cause for concern. FT-IR spectroscopy was used to follow the foam chemistry; however, it was difficult to draw definitive conclusions about changes in the chemical nature of the materials due to large variability throughout the samples.

  2. Reference intervals for serum biochemistries of molting Pacific Black Brant (Branta bernicla nigricans) in Northern Alaska, USA

    USGS Publications Warehouse

    Franson, J. Christian; Flint, Paul L.; Schmutz, Joel A.

    2017-01-01

    We determined reference intervals for nine serum biochemistries in samples from 329 molting, after-hatch-year, Pacific Black Brant (Branta bernicla nigricans) in Alaska, US. Cholesterol and nonesterified fatty acids differed by sex, but no other differences were noted.

  3. Impact of wildfire return interval on the ectomycorrhizal resistant propagules communities of a Mediterranean open forest.

    PubMed

    Buscardo, Erika; Rodríguez-Echeverría, Susana; Martín, María P; De Angelis, Paolo; Pereira, João Santos; Freitas, Helena

    2010-08-01

    Ectomycorrhizal (ECM) fungi, in particular their spores and other resistant propagules, play an important role in secondary succession processes that facilitate regeneration after disturbance events. In this study, the effects of high and low wildfire frequencies (respectively short and long fire return intervals) on the resistant propagules communities (RPCs) of a Mediterranean open pine forest were compared. Soil samples were collected in four mountain sites with different fire return intervals and used to test ectomycorrhiza development in two hosts, Pinus pinaster and Quercus suber. RPCs were characterized by direct sequencing of fungal internal transcribed spacer (ITS) regions from individual ECM root tips. Eighteen ECM species were detected in the bioassay. The most frequently found fungi were Cenococcum geophilum, Inocybe jacobi, Thelephora terrestris, Tomentella ellisii on both hosts and Rhizopogon luteolus and R. roseolus on maritime pine. A short fire return interval reduced the species richness of the ECM community found on Q. suber, promoted species like R. roseolus and reduced the abundance of other species (e.g. R. luteolus). The abundance of I. jacobi was positively affected by long fire return interval, but decreased significantly with recurrent fires. These results indicate that changes in fire frequency can alter the structure, composition and diversity of ECM communities, which could compromise the resilience of the ecosystem in highly disturbed areas. Copyright © 2010 The British Mycological Society. Published by Elsevier Ltd. All rights reserved.

  4. Ecosystem services from converted land: the importance of tree cover in Amazonian pastures

    USGS Publications Warehouse

    Barrett, Kirsten; Valentim, Judson; Turner, B. L.

    2013-01-01

    Deforestation is responsible for a substantial fraction of global carbon emissions and changes in surface energy budgets that affect climate. Deforestation losses include wildlife and human habitat, and myriad forest products on which rural and urban societies depend for food, fiber, fuel, fresh water, medicine, and recreation. Ecosystem services gained in the transition from forests to pasture and croplands, however, are often ignored in assessments of the impact of land cover change. The role of converted lands in tropical areas in terms of carbon uptake and storage is largely unknown. Pastures represent the fastest-growing form of converted land use in the tropics, even in some areas of rapid urban expansion. Tree biomass stored in these areas spans a broad range, depending on tree cover. Trees in pasture increase carbon storage, provide shade for cattle, and increase productivity of forage material. As a result, increasing fractional tree cover can provide benefits to land managers as well as important ecosystem services such as reducing conversion pressure on forests adjacent to pastures. This study presents an estimation of fractional tree cover in pasture in a dynamic region on the verge of large-scale land use change. An appropriate sampling interval is established for similar studies, one that balances spatial independence against the number of samples needed to characterize a pasture in terms of fractional tree cover. This information represents a useful policy tool for government organizations and NGOs interested in encouraging ecosystem services on converted lands. Using high spatial resolution remotely sensed imagery, fractional tree cover in pasture is quantified for the municipality of Rio Branco, Brazil. A semivariogram and progressively coarsened spatial resolution are employed to determine the coarsest sampling interval that may be used, minimizing effects of spatial autocorrelation. The coarsest sampling interval that minimizes spatial dependence was about 22 m. 
The area-weighted fractional tree cover for the study area was 1.85 %, corrected for a slight bias associated with the coarser sampling resolution. The pastures sampled for fractional tree cover were divided between ‘high’ and ‘low’ tree cover, which may be the result of intentional incorporation of arboreal species in pasture. Further research involving those ranchers that have a higher fractional tree cover may indicate ways to promote the practice on a broader scale in the region.
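
    The semivariogram step can be sketched on a one-dimensional transect: empirical semivariance rises with lag until spatial dependence dies out, and the lag at which it plateaus (the range) suggests the coarsest sampling interval that still yields independent samples. The function names and the plateau criterion below are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def semivariance(z, h):
    """Empirical semivariance at integer lag h on a regular transect:
    gamma(h) = mean of 0.5 * (z[i+h] - z[i])^2 over all pairs."""
    d = z[h:] - z[:-h]
    return 0.5 * float(np.mean(d ** 2))

def coarsest_interval(z, max_lag=50, tol=0.05):
    """First lag at which the semivariogram flattens (successive change
    below tol relative to the sill), a crude range estimate."""
    gammas = [semivariance(z, h) for h in range(1, max_lag + 1)]
    sill = max(gammas)
    for h in range(1, max_lag):
        if abs(gammas[h] - gammas[h - 1]) < tol * sill:
            return h + 1
    return max_lag
```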

  5. Clinical Evaluation of the BD FACSPresto™ Near-Patient CD4 Counter in Kenya

    PubMed Central

    Angira, Francis; Akoth, Benta; Omolo, Paul; Opollo, Valarie; Bornheimer, Scott; Judge, Kevin; Tilahun, Henok; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2016-01-01

    Background The BD FACSPresto™ Near-Patient CD4 Counter was developed to expand HIV/AIDS management in resource-limited settings. It measures absolute CD4 counts (AbsCD4), percent CD4 (%CD4), and hemoglobin (Hb) from a single drop of capillary or venous blood in approximately 23 minutes, with a throughput of 10 samples per hour. We assessed the performance of the BD FACSPresto system, evaluating accuracy, stability, linearity, precision, and reference intervals using capillary and venous blood at the KEMRI/CDC HIV-research laboratory, Kisumu, Kenya, and precision and linearity at BD Biosciences, California, USA. Methods For accuracy, venous samples were tested using the BD FACSCalibur™ instrument with BD Tritest™ CD3/CD4/CD45 reagent, BD Trucount™ tubes, and BD Multiset™ software for AbsCD4 and %CD4, and the Sysmex™ KX-21N for Hb. Stability studies evaluated the duration of staining (18–120-minute incubation) and the effects of venous blood storage <6–24 hours post-draw. A normal cohort was tested for reference intervals. Precision covered multiple days, operators, and instruments. Linearity required mixing two pools of samples to obtain evenly spaced concentrations for AbsCD4, total lymphocytes, and Hb. Results AbsCD4 and %CD4 venous/capillary (N = 189/N = 162) accuracy results gave Deming regression slopes within 0.97–1.03 and R2 ≥0.96. For Hb, Deming regression results were R2 ≥0.94 and slope ≥0.94 for both venous and capillary samples. Stability was within 10% up to 2 hours after staining and for venous blood stored less than 24 hours. Reference interval results showed that gender differences, but not age differences, were statistically significant (p<0.05). Precision results had <3.5% coefficient of variation for AbsCD4, %CD4, and Hb, except for low AbsCD4 samples (<6.8%). Linearity was 42–4,897 cells/μL for AbsCD4, 182–11,704 cells/μL for total lymphocytes, and 2–24 g/dL for Hb. 
Conclusions The BD FACSPresto system provides accurate, precise clinical results for capillary or venous blood samples and is suitable for near-patient CD4 testing. Trial Registration ClinicalTrials.gov NCT02396355 PMID:27483008
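The accuracy and agreement statistics reported above (Deming regression slopes and 95% limits of agreement) can be reproduced with a short script. The sketch below assumes an error-variance ratio of 1 for the Deming fit and uses the standard Bland-Altman formula for the limits of agreement; the paired AbsCD4 values and all names are illustrative, not data from the study.

```python
import math

def deming(x, y, delta=1.0):
    """Deming regression slope/intercept; delta is the ratio of error variances."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    return slope, my - slope * mx

def limits_of_agreement(x, y):
    """Bland-Altman 95% limits of agreement: mean difference +/- 1.96 SD."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md - 1.96 * sd, md + 1.96 * sd

# hypothetical paired AbsCD4 counts (reference method vs. test method)
ref = [350, 410, 520, 680, 150, 90, 1200, 760]
new = [340, 425, 505, 700, 160, 85, 1180, 770]
slope, intercept = deming(ref, new)
lo, hi = limits_of_agreement(ref, new)
```

A slope near 1 with limits of agreement straddling zero is the pattern the study reports for a method in good agreement with its reference.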

  6. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    PubMed

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
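The comparison criterion used in this study, whether simulated point estimates fall within the confidence interval of the original prevalence, can be illustrated with a toy binomial simulation. This is not the RDS simulation itself: it only assumes simple random draws from the large component (n = 249) at the study's Hepatitis C prevalence (87.5%), with a Wald 95% CI.

```python
import math
import random

def wald_ci(p, n, z=1.96):
    # normal-approximation 95% confidence interval for a sample prevalence
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

random.seed(1)
n, prev = 249, 0.875          # large-component size and HCV prevalence from the abstract
lo, hi = wald_ci(prev, n)
inside = sum(
    lo <= sum(random.random() < prev for _ in range(n)) / n <= hi
    for _ in range(360)        # 360 simulated point estimates, as in the paper
)
coverage = inside / 360
```

Under these toy assumptions, roughly 95% of simulated estimates land inside the original CI, which is the kind of agreement the "strudel effect" describes.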

  7. Test Statistics and Confidence Intervals to Establish Noninferiority between Treatments with Ordinal Categorical Data.

    PubMed

    Zhang, Fanghong; Miyaoka, Etsuo; Huang, Fuping; Tanaka, Yutaka

    2015-01-01

    The problem of establishing noninferiority between a new treatment and a standard (control) treatment is discussed for ordinal categorical data. A measure of treatment effect is used, and a method for specifying the noninferiority margin for this measure is provided. Two Z-type test statistics are proposed in which the variance estimate is constructed under the shifted null hypothesis using U-statistics. Furthermore, a confidence interval and a sample-size formula are given based on the proposed test statistics. The proposed procedure is applied to a dataset from a clinical trial. A simulation study is conducted to compare the performance of the proposed test statistics with that of existing ones; the results show that the proposed test statistics perform better in terms of deviation from the nominal level and power.
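The shifted-null logic behind such tests can be sketched with the familiar binary-endpoint analogue (the paper itself works with an ordinal effect measure and U-statistic variance estimates, which this simplified version does not implement). All numbers below are hypothetical.

```python
import math
from statistics import NormalDist

def noninferiority_z(p_new, p_std, n_new, n_std, margin):
    """Z statistic for H0: p_new - p_std <= -margin (new treatment inferior)
    against H1: p_new - p_std > -margin (noninferior), with a Wald variance."""
    se = math.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
    return (p_new - p_std + margin) / se

# hypothetical trial: response rates 80% vs. 82%, margin 10%, 200 per arm
z = noninferiority_z(0.80, 0.82, 200, 200, margin=0.10)
reject = z > NormalDist().inv_cdf(0.95)   # one-sided alpha = 0.05
```

Here the observed difference (-2%) is well inside the -10% margin, so the shifted-null test rejects inferiority even though the new treatment is slightly worse in absolute terms.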

  8. Sets of spectral lines for spectrographic thermometry and manometry in d.c. arcs of geologic materials

    USGS Publications Warehouse

    Golightly, D.W.; Dorrzapf, A.F.; Thomas, C.P.

    1977-01-01

    Sets of 5 Fe(I) lines and 3 Ti(I)/Ti(II) line pairs have been characterized for precise spectrographic thermometry and manometry, respectively, in d.c. arcs of geologic materials. The recommended lines are free of spectral interferences, exhibit minimal self-absorption within defined concentration intervals, and are useful for chemically unaltered silicate rocks arced in an argon-oxygen stream. The functional character of these lines in thermometry and manometry of d.c. arcs for evaluations of electrical-parameter effects, for temporal studies, and for matrix-effect investigations on real samples is illustrated. © 1977.

  9. Food and Insulin Effect on QT/QTC Interval of ECG

    ClinicalTrials.gov

    2014-08-19

    Effects of Different Meals on the QT/QTc Interval; Insulin and Oral Hypoglycemic [Antidiabetic] Drugs Causing Adverse Effects in Therapeutic Use; C-Peptide Effects on the QT/QTc Interval; Moxifloxacin ECG Profile in Fed and Fasted State; Japanese vs. Caucasian TQT Comparison

  10. CLSI-based transference of the CALIPER database of pediatric reference intervals from Abbott to Beckman, Ortho, Roche and Siemens Clinical Chemistry Assays: direct validation using reference samples from the CALIPER cohort.

    PubMed

    Estey, Mathew P; Cohen, Ashley H; Colantonio, David A; Chan, Man Khun; Marvasti, Tina Binesh; Randell, Edward; Delvin, Edgard; Cousineau, Jocelyne; Grey, Vijaylaxmi; Greenway, Donald; Meng, Qing H; Jung, Benjamin; Bhuiyan, Jalaluddin; Seccombe, David; Adeli, Khosrow

    2013-09-01

    The CALIPER program recently established a comprehensive database of age- and sex-stratified pediatric reference intervals for 40 biochemical markers. However, this database was only directly applicable for Abbott ARCHITECT assays. We therefore sought to expand the scope of this database to biochemical assays from other major manufacturers, allowing for a much wider application of the CALIPER database. Based on CLSI C28-A3 and EP9-A2 guidelines, CALIPER reference intervals were transferred (using specific statistical criteria) to assays performed on four other commonly used clinical chemistry platforms including Beckman Coulter DxC800, Ortho Vitros 5600, Roche Cobas 6000, and Siemens Vista 1500. The resulting reference intervals were subjected to a thorough validation using 100 reference specimens (healthy community children and adolescents) from the CALIPER bio-bank, and all testing centers participated in an external quality assessment (EQA) evaluation. In general, the transferred pediatric reference intervals were similar to those established in our previous study. However, assay-specific differences in reference limits were observed for many analytes, and in some instances were considerable. The results of the EQA evaluation generally mimicked the similarities and differences in reference limits among the five manufacturers' assays. In addition, the majority of transferred reference intervals were validated through the analysis of CALIPER reference samples. This study greatly extends the utility of the CALIPER reference interval database which is now directly applicable for assays performed on five major analytical platforms in clinical use, and should permit the worldwide application of CALIPER pediatric reference intervals. Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  11. Reference intervals for 24 laboratory parameters determined in 24-hour urine collections.

    PubMed

    Curcio, Raffaele; Stettler, Helen; Suter, Paolo M; Aksözen, Jasmin Barman; Saleh, Lanja; Spanaus, Katharina; Bochud, Murielle; Minder, Elisabeth; von Eckardstein, Arnold

    2016-01-01

    Reference intervals for many laboratory parameters determined in 24-h urine collections are either not publicly available or are based on small numbers, not sex specific, or not from a representative sample. Osmolality and concentrations or enzymatic activities of sodium, potassium, chloride, glucose, creatinine, citrate, cortisol, pancreatic α-amylase, total protein, albumin, transferrin, immunoglobulin G, α1-microglobulin, α2-macroglobulin, as well as porphyrins and their precursors (δ-aminolevulinic acid and porphobilinogen) were determined in 241 24-h urine samples of a population-based cohort of asymptomatic adults (121 men and 120 women). For 16 of these 24 parameters creatinine-normalized ratios were calculated based on 24-h urine creatinine. The reference intervals for these parameters were calculated according to the CLSI C28-A3 statistical guidelines. In contrast to most published reference intervals, which do not stratify for sex, reference intervals of 12 of 24 laboratory parameters in 24-h urine collections and of eight of 16 parameters as creatinine-normalized ratios differed significantly between men and women. For six parameters calculated as 24-h urine excretion and four parameters calculated as creatinine-normalized ratios no reference intervals had been published before. For some parameters we found significant and relevant deviations from previously reported reference intervals, most notably for 24-h urine cortisol in women. Ten 24-h urine parameters showed weak or moderate sex-specific correlations with age. By applying up-to-date analytical methods and clinical chemistry analyzers to 24-h urine collections from a large population-based cohort, we provide the most comprehensive set to date of sex-specific reference intervals calculated according to CLSI guidelines for parameters determined in 24-h urine collections.
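CLSI C28-A3 recommends a nonparametric, rank-based estimate of the 2.5th and 97.5th percentiles when enough reference subjects are available. A minimal sketch of that calculation, with linear interpolation between order statistics and placeholder data (real use would pass measured values and stratify by sex, as this study does):

```python
def reference_interval(values, lower=0.025, upper=0.975):
    """Nonparametric reference interval via rank-based percentile estimates."""
    xs = sorted(values)
    n = len(xs)
    def pct(p):
        # rank r = p * (n + 1), interpolating between adjacent order statistics
        r = p * (n + 1)
        k = max(1, min(int(r), n - 1))
        frac = r - k
        return xs[k - 1] + frac * (xs[k] - xs[k - 1])
    return pct(lower), pct(upper)

# placeholder cohort of 120 values (e.g., one sex stratum of the study)
cohort = [float(v) for v in range(1, 121)]
low, high = reference_interval(cohort)
```

With n = 120 the interpolated ranks are 3.025 and 117.975, so the interval is driven by the extreme order statistics, which is why CLSI requires a minimum number of reference subjects.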

  12. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
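The MCMC credible intervals described here can be illustrated with a minimal random-walk Metropolis sampler. This is a generic sketch of the idea, not the DREAM algorithm or the UCODE_2014 implementation (which uses adaptive differential-evolution proposals and multiple parallel chains); the toy posterior is a standard normal.

```python
import math
import random

def metropolis(log_post, x0, steps, proposal_sd):
    """Random-walk Metropolis sampler; returns the full chain."""
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        cand = x + random.gauss(0.0, proposal_sd)
        lp_cand = log_post(cand)
        # accept with probability min(1, post(cand) / post(x))
        if math.log(random.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        chain.append(x)
    return chain

random.seed(0)
# toy posterior: standard normal log-density (up to an additive constant)
chain = metropolis(lambda t: -0.5 * t * t, 0.0, 20000, 1.0)
kept = sorted(chain[5000:])              # discard burn-in
lo = kept[int(0.025 * len(kept))]        # 95% Bayesian credible interval
hi = kept[int(0.975 * len(kept))]
```

For this posterior the interval should approach (-1.96, 1.96); the many model runs the abstract mentions correspond to the `log_post` evaluations inside the loop.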

  13. Longitudinal progesterone profiles in baleen from female North Atlantic right whales (Eubalaena glacialis) match known calving history

    PubMed Central

    Hunt, Kathleen E.; Lysiak, Nadine S.; Moore, Michael J.; Rolland, Rosalind M.

    2016-01-01

    Reproduction of mysticete whales is difficult to monitor, and basic parameters, such as pregnancy rate and inter-calving interval, remain unknown for many populations. We hypothesized that baleen plates (keratinous strips that grow downward from the palate of mysticete whales) might record previous pregnancies, in the form of high-progesterone regions in the sections of baleen that grew while the whale was pregnant. To test this hypothesis, longitudinal baleen progesterone profiles from two adult female North Atlantic right whales (Eubalaena glacialis) that died as a result of ship strike were compared with dates of known pregnancies inferred from calf sightings and post-mortem data. We sampled a full-length baleen plate from each female at 4 cm intervals from base (newest baleen) to tip (oldest baleen), each interval representing ∼60 days of baleen growth, with high-progesterone areas then sampled at 2 or 1 cm intervals. Pulverized baleen powder was assayed for progesterone using enzyme immunoassay. The date of growth of each sampling location on the baleen plate was estimated based on the distance from the base of the plate and baleen growth rates derived from annual cycles of stable isotope ratios. Baleen progesterone profiles from both whales showed dramatic elevations (two orders of magnitude higher than baseline) in areas corresponding to known pregnancies. Baleen hormone analysis shows great potential for estimation of recent reproductive history, inter-calving interval and general reproductive biology in this species and, possibly, in other mysticete whales. PMID:27293762
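The dating step, converting a sampling position on the plate into a calendar date via the baleen growth rate, is simple arithmetic. A sketch under the abstract's stated rate (4 cm of baleen per roughly 60 days of growth); the death date and sampling distance below are hypothetical:

```python
from datetime import date, timedelta

def sample_date(distance_cm, death, cm_per_day=4 / 60):
    # baleen grows downward from the palate, so tissue at distance d from
    # the base was deposited d / growth_rate days before death
    return death - timedelta(days=round(distance_cm / cm_per_day))

# hypothetical: a sample 48 cm from the base of a whale that died 2015-06-01
d = sample_date(48.0, date(2015, 6, 1))   # ~720 days before death
```

In the study the growth rate was refined per animal from annual stable-isotope cycles rather than assumed constant, so this is only the skeleton of the calculation.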

  14. Effects of urbanization on water quality in the Kansas River, Shunganunga Creek Basin, and Soldier Creek, Topeka, Kansas, October 1993 through September 1995

    USGS Publications Warehouse

    Pope, L.M.; Putnam, J.E.

    1997-01-01

    A study of urban-related water-quality effects in the Kansas River, Shunganunga Creek Basin, and Soldier Creek in Topeka, Kansas, was conducted from October 1993 through September 1995. The purpose of this report is to assess the effects of urbanization on instream concentrations of selected physical and chemical constituents within the city of Topeka. A network of seven sampling sites was established in the study area. Samples principally were collected at monthly intervals from the Kansas River and from the Shunganunga Creek Basin, and at quarterly intervals from Soldier Creek. The effects of urbanization were statistically evaluated from differences in constituent concentrations between sites on the same stream. No significant differences in median concentrations of dissolved solids, nutrients, or metals and trace elements, or median densities of fecal bacteria were documented between sampling sites upstream and downstream from the major urbanized length of the Kansas River in Topeka. Discharge from the city's primary wastewater-treatment plant is the largest potential source of contamination to the Kansas River. This discharge increased concentrations of dissolved ammonia, total phosphorus, and densities of fecal bacteria. Calculated dissolved ammonia as nitrogen concentrations in water from the Kansas River ranged from 0.03 to 1.1 milligrams per liter after receiving treatment-plant discharge. However, most of the calculated concentrations were considerably less than 50 percent of Kansas Department of Health and Environment water-quality criteria, with a median value of 20 percent. Generally, treatment-plant discharge increased calculated total phosphorus concentrations in water from the Kansas River by 0.01 to 0.04 milligrams per liter, with a median percentage increase of 7.6 percent. 
The calculated median densities of fecal coliform and fecal streptococci bacteria in water from the Kansas River increased from 120 and 150 colonies per 100 milliliters of water, respectively, before treatment-plant discharge to a calculated 4,900 and 4,700 colonies per 100 milliliters of water, respectively, after discharge. Median concentrations of dissolved solids were not significantly different between three sampling sites in the Shunganunga Creek Basin. Median concentrations of dissolved nitrate as nitrogen, total phosphorus, and dissolved orthophosphate were significantly larger in water from the upstream-most Shunganunga Creek sampling site than in water from either of the other sampling sites in the Shunganunga Creek Basin, probably because of the site's proximity to a wastewater-treatment plant. Median concentrations of dissolved nitrate as nitrogen and total phosphorus during 1993-95 at upstream sampling sites were either significantly larger than during 1979-81, in response to increased wastewater-treatment plant discharge, or smaller, because of the elimination of wastewater-treatment plant discharge. Median concentrations of dissolved ammonia as nitrogen were significantly less during 1993-95 than during 1979-81. Median concentrations of total aluminum, iron, manganese, and molybdenum were significantly larger in water from the downstream-most Shunganunga Creek sampling site than in water from the upstream-most sampling site. This probably reflects their widespread use in the urban environment between the upstream and downstream Shunganunga Creek sampling sites. Little water-quality effect from urbanization was indicated by results from the Soldier Creek sampling site. Median concentrations of most water-quality constituents in water from this sampling site were the smallest in water from any sampling site in the study area. Herbicides were detected in water from all sampling sites. 
Some of the more frequently detected herbicides included acetochlor, alachlor, atrazine, cyanazine, EPTC, metolachlor, prometon, simazine, and tebuthiuron. Detected insecticides including chlordane,

  15. Gravity separation of pericardial fat in cardiotomy suction blood: an in vitro model.

    PubMed

    Kinard, M Rhett; Shackelford, Anthony G; Sistino, Joseph J

    2009-06-01

    Fat emboli generated during cardiac surgery have been shown to cause neurologic complications in patients postoperatively. Cardiotomy suction has been known to be a large generator of emboli. This study examines the efficacy of a separation technique in which the cardiotomy suction blood is stored in a cardiotomy reservoir for various time intervals to allow spontaneous separation of fat from blood by density. Soybean oil was added to heparinized porcine blood to simulate the blood of a patient with hypertriglyceridemia (>150 mg/dL). Roller pump suction was used to transfer the room-temperature blood into the cardiotomy reservoir. Blood was removed from the reservoir in 200-mL aliquots at 0, 15, 30, 45, and 60 minutes. Samples were taken at each interval and centrifuged to facilitate further separation of liquid fat. Fat content in each sample was determined by a point-of-care triglyceride analyzer. Three trials were conducted for a total of 30 samples. The 0-minute group was considered a baseline and was compared to the other four times. Fat concentration was reduced significantly in the 45- and 60-minute groups compared to the 0-, 15-, and 30-minute groups (p < .05). Gravity separation of cardiotomy suction blood is effective; however, it may require retention of blood for more time than is clinically acceptable during a routine coronary artery bypass graft surgery.

  16. Insufficient filling of vacuum tubes as a cause of microhemolysis and elevated serum lactate dehydrogenase levels. Use of a data-mining technique in evaluation of questionable laboratory test results.

    PubMed

    Tamechika, Yoshie; Iwatani, Yoshinori; Tohyama, Kaoru; Ichihara, Kiyoshi

    2006-01-01

    Experienced physicians noted unexpectedly elevated concentrations of lactate dehydrogenase in some patient samples, but quality control specimens showed no bias. To evaluate this problem, we used a "latent reference individual extraction method", designed to obtain reference intervals from a laboratory database by excluding individuals who have abnormal results for basic analytes other than the analyte in question, in this case lactate dehydrogenase. The reference interval derived for the suspected year was 264-530 U/L, while that of the previous year was 248-495 U/L. The only change we found was the introduction of an order entry system, which requests precise sampling volumes rather than complete filling of vacuum tubes. The effect of vacuum persistence was tested using ten freshly drawn blood samples. Compared with complete filling, 1/5 filling resulted in average elevations of lactate dehydrogenase, aspartate aminotransferase, and potassium levels of 8.0%, 3.8%, and 3.4%, respectively (all p<0.01). Microhemolysis was confirmed using a urine stick method. The length of time before centrifugation determined the degree of hemolysis, while vacuum during centrifugation did not affect it. Microhemolysis is the probable cause of the suspected pseudo-elevation noted by the physicians. Data-mining methodology represents a valuable tool for monitoring long-term bias in laboratory results.

  17. Water-quality response to a high-elevation wildfire in the Colorado Front Range

    USGS Publications Warehouse

    Mast, M. Alisa; Murphy, Sheila F.; Clow, David W.; Penn, Colin A.; Sexstone, Graham A.

    2016-01-01

    Water quality of the Big Thompson River in the Front Range of Colorado was studied for 2 years following a high-elevation wildfire that started in October 2012 and burned 15% of the watershed. A combination of fixed-interval sampling and continuous water-quality monitors was used to examine the timing and magnitude of water-quality changes caused by the wildfire. Prefire water quality was well characterized because the site has been monitored at least monthly since the early 2000s. Major ions and nitrate showed the largest changes in concentrations; major ion increases were greatest in the first postfire snowmelt period, but nitrate increases were greatest in the second snowmelt period. The delay in nitrate release until the second snowmelt season likely reflected a combination of factors including fire timing, hydrologic regime, and rates of nitrogen transformations. Despite the small size of the fire, annual yields of dissolved constituents from the watershed increased 20–52% in the first 2 years following the fire. Turbidity data from the continuous sensor indicated high-intensity summer rain storms had a much greater effect on sediment transport compared to snowmelt. High-frequency sensor data also revealed that weekly sampling missed the concentration peak during snowmelt and short-duration spikes during rain events, underscoring the challenge of characterizing postfire water-quality response with fixed-interval sampling.
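The sampling-interval point made here, that fixed-interval (e.g., weekly) sampling can miss short-duration peaks that continuous sensors capture, is easy to demonstrate with synthetic data. The numbers below are illustrative only:

```python
# hourly "sensor" record covering about 3 weeks, with a 12-hour storm spike
hourly = [1.0] * 500
for i in range(240, 252):      # short turbidity spike during a rain event
    hourly[i] = 10.0

weekly = hourly[::168]         # fixed-interval sampling: one grab per 168 h
peak_sensor = max(hourly)
peak_weekly = max(weekly)
```

The weekly subsample never lands on the spike, so its maximum equals baseline; this is exactly the characterization problem the high-frequency sensor data revealed.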

  18. The effect of ion irradiation on the dissolution of UO₂ and UO₂-based simulant fuel

    DOE PAGES

    Popel, Aleksej J.; Wietsma, Thomas W.; Engelhard, Mark H.; ...

    2017-11-21

    Our aim is to study the separate effect of fission-fragment damage on the dissolution of simulant UK advanced gas-cooled reactor nuclear fuel in water. Plain UO₂ samples and UO₂ samples doped with inactive fission products to simulate 43 GWd/tU of burn-up were fabricated. A set of these samples was then irradiated with 92 MeV ¹²⁹Xe²³⁺ ions to a fluence of 4.8 × 10¹⁵ ions/cm² to simulate the fission damage that occurs within nuclear fuels. The primary effect of the irradiation on the UO₂ samples, observed by scanning electron microscopy, was to induce a smoothening of the surface features and the formation of hollow blisters, which was attributed to multiple overlap of ion tracks. Dissolution experiments were conducted in single-pass flow-through (SPFT) mode under anoxic conditions (<0.1 ppm O₂ in Ar) to study the effect of the induced irradiation damage on the dissolution of the UO₂ matrix, with data collected at six-minute intervals for several hours. These time-resolved data showed that the irradiated samples had a higher initial release of uranium than unirradiated samples, but the uranium concentrations converged towards ~10⁻⁹ mol/L after a few hours. Apart from the initial spike in uranium concentration, attributed to irradiation-induced surficial micro-structural changes, no noticeable difference in uranium chemistry as measured by X-ray photoelectron spectroscopy or 'effective solubility' was observed between the irradiated, doped, and undoped samples in this work. Some secondary-phase formation was observed on the surface of UO₂ samples after the dissolution experiment.

  20. The effect of parturition induction treatment on interval to calving, calving ease, postpartum uterine health, and resumption of ovarian cyclicity in beef heifers.

    PubMed

    Šavc, Miha; Kenny, David A; Beltman, Marijke E

    2016-05-01

    The aim of this study was to compare the effects of two parturition induction protocols with a nontreated control group, on interval to calving, calving ease, postpartum uterine health, and ovarian cyclicity in beef heifers. At Day 285 of gestation, 81 crossbred recipient beef heifers carrying purebred Simmental fetuses, were blocked by live-weight, body condition score, expected calving date and fetal sex, and assigned to one of three groups: (1) control (CON; no induction treatment, n = 29); (2) induction with corticosteroids (CORT; n = 27); or (3) induction with corticosteroids plus prostaglandin (CORT + PG; n = 25). Interval from induction to calving in hours and calving ease on a scale of 1 to 5 were recorded. Vaginal mucus samples were collected on Day 21 and Day 42 after calving (Day 0) by means of a Metricheck and scored on a scale of 0 to 3. Reproductive tract examinations were conducted on Day 21 and Day 42 after calving, and uterine cytology samples were obtained on Day 21. A positive cytologic sample was defined as greater than 18% neutrophils in the sample obtained via a cytobrush technique. Cows were considered to have resumed ovarian cyclicity if the presence of the CL was confirmed. Data were analyzed using the Mixed (normally distributed data) and Genmod (nonparametric data) procedures of SAS (v. 9.3). The interval from treatment to calving was longer (P < 0.0001) for CON (161.9 ± 15.12 hours) animals compared with CORT (39.7 ± 11.64 hours) or CORT + PG (32.6 ± 12.10 hours), which did not differ. Treatment did not affect calving difficulty score. There was also no difference in incidence of retained placenta between the three groups. At Day 21 postpartum, cytology score tended to be higher for both induced groups (48%) compared with the control animals (24%), but this was not the case for vaginal mucus score (CON 52%, CORT 70%, and CORT + PG 52%). 
A higher proportion of CON animals had an involuted uterus by Day 21 postpartum (69%) compared with both induced groups (CORT 48%, CORT + PG 32%). Day 21 ovarian cyclicity was higher in both CON (52%) and CORT (59%) than in CORT + PG (29%). By Day 42, there was no difference in ovarian cyclicity or uterine involution between CON and CORT; however, a positive relationship was observed between uterine involution score on Day 21 and return to cyclicity on Day 42 in these two groups. There was a negative relationship between uterine involution score and return to cyclicity in the CORT + PG group, and these animals were slower (P < 0.05) to resume cyclicity by Day 42, with a larger proportion of animals showing evidence of resumed postpartum ovarian cyclicity in both CON (P = 0.03) and CORT compared with CORT + PG on Day 42. In conclusion, the use of corticosteroid-based treatments is an effective strategy to advance parturition in full-term dams and does not have a negative effect on calving progress or dam health. However, when prostaglandin is also included in the protocol, treatment may lead to a greater delay in uterine involution, an increased chance of uterine infection, and slower resumption of ovarian cyclicity. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
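The maximum sampling error of a mean property, read as the half-width of a confidence interval, can be sketched as follows. This uses a normal approximation (a t quantile would be more appropriate for small n) and hypothetical ash-content replicates; it is not the paper's exact methodology.

```python
from statistics import NormalDist, mean, stdev

def max_sampling_error(values, confidence=0.95):
    """Half-width of the confidence interval for the mean
    (normal approximation to the sampling distribution)."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return z * stdev(values) / len(values) ** 0.5

# hypothetical ash-content replicates from repeated TG runs (%)
ash = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]
e = max_sampling_error(ash)
ci = (mean(ash) - e, mean(ash) + e)
```

Comparing `e` against an acceptability threshold for each property (moisture, volatiles, fixed carbon, ash) is the kind of check a sampling map would be built from.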

  2. Soil Carbon Variability and Change Detection in the Forest Inventory Analysis Database of the United States

    NASA Astrophysics Data System (ADS)

    Wu, A. M.; Nater, E. A.; Dalzell, B. J.; Perry, C. H.

    2014-12-01

    The USDA Forest Service's Forest Inventory Analysis (FIA) program is a national effort assessing current forest resources to ensure sustainable management practices, to assist planning activities, and to report critical status and trends. For example, estimates of carbon stocks and stock change in FIA are reported as the official United States submission to the United Nations Framework Convention on Climate Change. While the main effort in FIA has been focused on aboveground biomass, soil is a critical component of this system. FIA sampled forest soils in the early 2000s and has remeasurement now underway. However, soil sampling is repeated on a 10-year interval (or longer), and it is uncertain what magnitude of changes in soil organic carbon (SOC) may be detectable with the current sampling protocol. We aim to identify the sensitivity and variability of SOC in the FIA database, and to determine the amount of SOC change that can be detected with the current sampling scheme. For this analysis, we attempt to answer the following questions: 1) What is the sensitivity (power) of SOC data in the current FIA database? 2) How does the minimum detectable change in forest SOC respond to changes in sampling intervals and/or sample point density? Soil samples in the FIA database represent 0-10 cm and 10-20 cm depth increments with a 10-year sampling interval. We are investigating the variability of SOC and its change over time for composite soil data in each FIA region (Pacific Northwest, Interior West, Northern, and Southern). To guide future sampling efforts, we are employing statistical power analysis to examine the minimum detectable change in SOC storage. We are also investigating the sensitivity of SOC storage changes under various scenarios of sample size and/or sample frequency. 
This research will inform the design of future FIA soil sampling schemes and improve the information available to international policy makers, university and industry partners, and the public.
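    The minimum-detectable-change question posed above can be sketched with a standard two-sample power calculation. The sketch below uses a normal approximation and a two-sided test; the standard deviation and plot counts are illustrative placeholders, not FIA data:

```python
from math import sqrt
from scipy.stats import norm

def minimum_detectable_change(sd_change, n_plots, alpha=0.05, power=0.8):
    """Smallest true change detectable at the given two-sided
    significance level and power (normal approximation).

    sd_change : standard deviation of plot-level SOC change (e.g., Mg C/ha)
    n_plots   : number of remeasured plots
    """
    z_alpha = norm.ppf(1.0 - alpha / 2.0)
    z_beta = norm.ppf(power)
    return (z_alpha + z_beta) * sd_change / sqrt(n_plots)

# Illustrative values only (not FIA data): sd of SOC change = 10 Mg C/ha
for n in (50, 200, 1000):
    print(n, round(minimum_detectable_change(10.0, n), 2))
```

    The minimum detectable change shrinks with the square root of the sample size, which is why the paper examines scenarios of both sample size and sampling frequency.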

  3. Chemical Characteristics, Water Sources and Pathways, and Age Distribution of Ground Water in the Contributing Recharge Area of a Public-Supply Well near Tampa, Florida, 2002-05

    USGS Publications Warehouse

    Katz, Brian G.; Crandall, Christy A.; Metz, Patricia A.; McBride, W. Scott; Berndt, Marian P.

    2007-01-01

    In 2001, the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey began a series of studies on the transport of anthropogenic and natural contaminants (TANC) to public-supply wells. The main goal of the TANC program was to better understand the source, transport, and receptor factors that control contaminant movement to public-supply wells in representative aquifers of the United States. Studies were first conducted at regional scales at four of the eight TANC study areas during 2002-03 and at small (local) scales during 2003-05 in California, Nebraska, Connecticut, and Florida. In the Temple Terrace study area near Tampa, Florida, multiple chemical indicators and geochemical and ground-water flow modeling techniques were used to assess the vulnerability of a public-supply well in the karstic Upper Floridan aquifer to contamination from anthropogenic and naturally occurring contaminants. During 2003-05, water samples were collected from the public-supply well and 13 surrounding monitoring wells that all tap the Upper Floridan aquifer, and from 15 monitoring wells in the overlying surficial aquifer system and the intermediate confining unit that are located within the modeled ground-water contributing recharge area of the public-supply well. Six volatile organic compounds and four pesticides were detected in trace concentrations (well below drinking-water standards) in water from the public-supply well, which had an open interval from 36 to 53 meters below land surface. These contaminants were detected more frequently in water samples from monitoring wells in the overlying clastic surficial aquifer system than in water from monitoring wells in the Upper Floridan aquifer in the study area. 
Likewise, nitrate-N concentrations in the public-supply well (0.72-1.4 milligrams per liter) were more similar to median concentrations in the oxic surficial aquifer system (2.1 milligrams per liter) than to median nitrate-N concentrations in the anoxic Upper Floridan aquifer (0.06 milligram per liter) under sulfate-reducing conditions. High concentrations of radon-222 and uranium in the public-supply well compared to those in monitoring wells in the Upper Floridan aquifer appear to originate from water moving downward through sands and discontinuous clay lenses that overlie the aquifer. Water samples also were collected from three overlapping depth intervals (38-53, 43-53, and 49-53 meters below land surface) in the public-supply well. The 49- to 53-meter interval was identified as a high-flow zone during geophysical logging of the wellbore. Water samples were collected from these depth intervals at a low pumping rate by placing a low-capacity submersible pump (less than 0.02 cubic meter per minute) at the top of each interval. To represent higher pumping conditions, a large-capacity portable submersible pump (1.6 cubic meters per minute) was placed near the top of the open interval; water-chemistry samples were collected using the low-capacity submersible pump. The 49- to 53-meter depth interval had distinctly different chemistry than the other two sampled intervals. Higher concentrations of nitrate-N, atrazine, radon, trichloromethane (chloroform), and arsenic (and high arsenic (V)/arsenic (III) ratios); lower concentrations of dissolved solids, strontium, iron, manganese, and lower nitrogen and sulfur isotope ratios were found in this highly transmissive zone in the limestone than in water from the two other depth intervals. 
Movement of water likely occurs from the overlying sands and clays of the oxic surficial aquifer system and intermediate confining unit (that contains high radon-222 and nitrate-N concentrations) into the anoxic Upper Floridan aquifer (that contains low radon-222 and nitrate-N concentrations). Differences in arsenic concentrations in water from the various depth intervals in the public-supply well (3.2-19.0 micrograms per liter) were related to pumping conditions. The high arsenic

  4. Modeling of frequency-domain scalar wave equation with the average-derivative optimal scheme based on a multigrid-preconditioned iterative solver

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Chen, Jing-Bo; Dai, Meng-Xue

    2018-01-01

    An efficient finite-difference frequency-domain modeling of seismic wave propagation relies on the discrete schemes and appropriate solving methods. The average-derivative optimal scheme for scalar wave modeling is advantageous in terms of storage savings for the system of linear equations and its flexibility for arbitrary directional sampling intervals. However, using an LU-decomposition-based direct solver to solve the resulting system of linear equations is very costly in both memory and computation. To address this issue, we establish a multigrid-preconditioned BiCGSTAB iterative solver suited to the average-derivative optimal scheme. The choice of preconditioning matrix and its corresponding multigrid components is made with the help of Fourier spectral analysis and local mode analysis, respectively, which is important for convergence. Furthermore, we find that for computations with unequal directional sampling intervals, the anisotropic smoothing in the multigrid preconditioner may affect the convergence rate of this iterative solver. Successful numerical applications of this iterative solver to homogeneous and heterogeneous models in 2D and 3D are presented, where a significant reduction in computer memory and an improvement in computational efficiency are demonstrated by comparison with the direct solver. In the numerical experiments, we also show that unequal directional sampling intervals weaken the advantage of this multigrid-preconditioned iterative solver in computing speed or, even worse, can reduce its accuracy in some cases, which implies the need for reasonable control of the directional sampling intervals in the discretization.
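    As a toy illustration of the solver strategy (not the paper's average-derivative scheme), the sketch below solves a small 2-D frequency-domain Helmholtz-type system with preconditioned BiCGSTAB in SciPy. An incomplete-LU factorization stands in for the multigrid preconditioner, and a low wavenumber keeps the toy system well conditioned; both are simplifying assumptions:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

n, k = 50, 4.0                      # grid points per side, wavenumber (kept low)
h = 1.0 / n
I = sp.identity(n)
T = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2
L = sp.kron(I, T) + sp.kron(T, I)   # 2-D 5-point Laplacian
A = (-L - k**2 * sp.identity(n * n)).tocsc()   # Helmholtz-type operator

b = np.zeros(n * n)
b[n * n // 2 + n // 2] = 1.0        # point source at the grid center

ilu = spilu(A, drop_tol=1e-4, fill_factor=10)  # stand-in preconditioner
M = LinearOperator(A.shape, ilu.solve)

x, info = bicgstab(A, b, M=M, maxiter=500)
print("converged:", info == 0, "residual:", np.linalg.norm(A @ x - b))
```

    For realistic (high-wavenumber, indefinite) Helmholtz problems the choice of preconditioner is the crux, which is exactly the design question the paper addresses with Fourier spectral and local mode analysis.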

  5. Window period donations during primary cytomegalovirus infection and risk of transfusion-transmitted infections.

    PubMed

    Ziemann, Malte; Heuft, Hans-Gert; Frank, Kerstin; Kraas, Sabine; Görg, Siegfried; Hennig, Holger

    2013-05-01

    Donors with short interdonation intervals (e.g., apheresis donors) have an increased risk of window period donations. The frequency of cytomegalovirus (CMV) window period donations is important information for deciding whether selection of seronegative donors might be advantageous for patients at risk for transfusion-transmitted CMV infections (TT-CMV). CMV seroconversion in 93 donors with positive results in routine CMV antibody testing within at most 35 days after the last seronegative sample was evaluated by Western blot and/or a second antibody test. In donors with unconfirmed seroconversion, an additional later sample was tested. Concentration of CMV DNA was determined in pre- and postseroconversion samples. CMV seroconversion was confirmed in 12 donors (13%). Among these, the last seronegative sample was CMV DNA positive in three donors (25%, below 30 IU/mL). The first seropositive sample was CMV DNA positive in 10 donors (83%, maximum 1600 IU/mL). Both prevalence and median concentration of CMV DNA were higher in the first seropositive sample (p = 0.004 and p = 0.02), with maximum concentrations being reached about 2 weeks after seroconversion. No CMV DNA was detected in samples from donors with unconfirmed seroconversion. At least in donors with short interdonation intervals, most suspected CMV seroconversions are due to false-positive results of the screening test. As window period donations are rare and contain less CMV DNA than the first seropositive donation, avoidance of blood products from primarily seropositive donors is especially helpful for avoiding TT-CMV when donors with short interdonation intervals are involved. © 2013 American Association of Blood Banks.

  6. A new variable interval schedule with constant hazard rate and finite time range.

    PubMed

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2 T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2 T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
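    One reinforcement rule consistent with this description (an assumption for illustration; the paper's exact rule may differ) is p(t) = λ(2T − t) with λ·2T ≤ 1: sampling trial durations from U(0, 2T) and reinforcing with that linearly decreasing probability yields a constant hazard λ. A Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(1)
T, lam, n = 1.0, 0.25, 200_000        # horizon 2T = 2 s; requires lam * 2T <= 1

durations = rng.uniform(0.0, 2 * T, n)       # trial lengths ~ U(0, 2T)
p_reinforce = lam * (2 * T - durations)      # assumed linear reinforcement rule
reinforced = rng.random(n) < p_reinforce

def empirical_hazard(lo, hi):
    """Reinforced trial endings per unit time at risk within [lo, hi)."""
    time_at_risk = np.clip(durations, lo, hi) - lo  # 0 for trials ending before lo
    events = np.sum(reinforced & (durations >= lo) & (durations < hi))
    return events / time_at_risk.sum()

for lo in (0.0, 0.5, 1.0, 1.5):
    print(lo, round(empirical_hazard(lo, lo + 0.5), 3))  # all close to lam = 0.25
```

    The estimator divides reinforced endings in each bin by the total trial time at risk in that bin, so a flat profile across bins is exactly the constant-hazard property the schedule is designed to have.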

  7. Mid-Atlantic Microtidal Barrier Coast Classification.

    DTIC Science & Technology

    1983-05-01

    subregions A through F. APPENDIX A. BIGDAT data file for the 800 sample sites along the coast, and strike-parallel plots of this data. LIST OF FIGURES...from this data set as follows: 1) BIGDAT - the entire coast at 1-km intervals, including areas peripheral to inlets and capes (n = 800); 2) INLETR2 - the...in Table 5. The Entire Coast at 1-km Intervals (BIGDAT and INLETR2). Correlation analysis of the 15 variables for the entire coast at 1-km intervals

  8. Retrospective analysis of mercury content in feathers of birds collected from the state of Michigan (1895-2007).

    PubMed

    Head, Jessica A; DeBofsky, Abigail; Hinshaw, Janet; Basu, Niladri

    2011-10-01

    Museum specimens were used to analyze temporal trends in feather mercury (Hg) concentrations in birds collected from the state of Michigan between the years 1895 and 2007. Hg was measured in flank and secondary feathers from three species of birds that breed in the Great Lakes region; common terns (n = 32), great blue herons (n = 35), and herring gulls (n = 35). More than 90% of the Hg in feathers should be organic, but some of the heron and gull feathers collected prior to 1936 showed evidence of contamination with inorganic Hg, likely from museum preservatives. The data presented here therefore consist of organic Hg in pre-1936 samples and total Hg in post-1936 samples. Insufficient tissue was available from terns to assess organic Hg content. Mean Hg concentrations ranged from 2.9 ± 2.5 μg/g Hg in tern flank feathers to 12.4 ± 10.6 μg/g Hg in gull flank feathers. No linear trend of Hg contamination over time was detected in herons and gulls. Though a significant decrease was noted for terns, these data are presented with caution given the strong likelihood that earlier samples were preserved with inorganic mercury. When data were separated into 30-year intervals, Hg content in heron and gull feathers collected from birds sampled between 1920 and 1949 was consistently the highest, though not to a level of statistical significance. For example, Hg concentrations in gull secondary feathers collected in the second time interval (1920-1949) were 11.5 ± 7.8 μg/g. This value was 67% higher than the first time interval (1890-1919), 44% higher than the third interval (1950-1979), and 187% higher than the fourth interval (1980-2009). Studies on Great Lakes sediments also showed greatest Hg accumulations in the mid-twentieth century. Through the use of museum specimens, these results present a unique snapshot of Hg concentrations in Great Lakes biota in the early part of the twentieth century.

  9. Improving the analysis of composite endpoints in rare disease trials.

    PubMed

    McMenamin, Martina; Berglind, Anna; Wason, James M S

    2018-05-22

    Composite endpoints are recommended in rare diseases to increase power and/or to sufficiently capture complexity. Often, they are in the form of responder indices which contain a mixture of continuous and binary components. Analyses of these outcomes typically treat them as binary, thus only using the dichotomisations of continuous components. The augmented binary method offers a more efficient alternative and is therefore especially useful for rare diseases. Previous work has indicated the method may have poorer statistical properties when the sample size is small. Here we investigate small sample properties and implement small sample corrections. We re-sample from a previous trial with sample sizes varying from 30 to 80. We apply the standard binary and augmented binary methods and determine the power, type I error rate, coverage and average confidence interval width for each of the estimators. We implement Firth's adjustment for the binary component models and a small sample variance correction for the generalized estimating equations, applying the small sample adjusted methods to each sub-sample as before for comparison. For the log-odds treatment effect the power of the augmented binary method is 20-55% compared to 12-20% for the standard binary method. Both methods have approximately nominal type I error rates. The difference in response probabilities exhibits similar power, but both unadjusted methods demonstrate type I error rates of 6-8%. The small sample corrected methods have approximately nominal type I error rates. On both scales, the reduction in average confidence interval width when using the adjusted augmented binary method is 17-18%. This is equivalent to requiring a 32% smaller sample size to achieve the same statistical power. The augmented binary method with small sample corrections provides a substantial improvement for rare disease trials using composite endpoints. 
We recommend the use of the method for the primary analysis in relevant rare disease trials. We emphasise that the method should be used alongside other efforts in improving the quality of evidence generated from rare disease trials rather than replace them.
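    Firth's adjustment, used above for the binary component models, penalizes the logistic log-likelihood with half the log-determinant of the Fisher information. A minimal sketch (not the paper's implementation) on a perfectly separated toy dataset, where the ordinary maximum likelihood estimate is infinite but the penalized estimate stays finite:

```python
import numpy as np
from scipy.optimize import minimize

def firth_neg_loglik(beta, X, y):
    """Negative of the Firth-penalized logistic log-likelihood:
    l(beta) + 0.5 * log|X' W X|, with W = diag(p * (1 - p))."""
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    p = np.clip(p, 1e-10, 1.0 - 1e-10)       # numerical guard
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    W = p * (1 - p)
    _, logdet = np.linalg.slogdet(X.T @ (W[:, None] * X))
    return -(loglik + 0.5 * logdet)

# Perfectly separated toy data: ordinary logistic MLE diverges here
X = np.column_stack([np.ones(6), [-2.0, -1.5, -1.0, 1.0, 1.5, 2.0]])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

res = minimize(firth_neg_loglik, np.zeros(2), args=(X, y), method="BFGS")
print("Firth estimates (intercept, slope):", res.x)
```

    The penalty shrinks estimates toward zero just enough to keep them finite under separation, which is why the correction helps in the very small samples typical of rare disease trials.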

  10. Sampled-data chain-observer design for a class of delayed nonlinear systems

    NASA Astrophysics Data System (ADS)

    Kahelras, M.; Ahmed-Ali, T.; Giri, F.; Lamnabhi-Lagarrigue, F.

    2018-05-01

    The problem of observer design is addressed for a class of triangular nonlinear systems with not-necessarily-small delay and sampled output measurements. A further difficulty is that the system state matrix depends on the un-delayed output signal, which is not accessible to measurement, making existing observers inapplicable. A new chain observer, composed of m elementary observers in series, is designed to compensate for output sampling and arbitrarily large delays. The larger the time delay, the larger the number m. Each elementary observer includes an output predictor that is conceived to compensate for the effects of output sampling and a fractional delay. The predictors are defined by first-order ordinary differential equations (ODEs), much simpler than those of existing predictors, which involve both output and state predictors. Using a small-gain type analysis, sufficient conditions for the observer to be exponentially convergent are established in terms of the minimal number m of elementary observers and the maximum sampling interval.

  11. Effect of Xylopia aethiopica aqueous extract on antioxidant properties of refrigerated Roma tomato variety packaged in low density polyethylene bags.

    PubMed

    Babarinde, Grace Oluwakemi; Adegoke, Gabriel O

    2015-03-01

    Effects of Xylopia aethiopica (Dunal) A. Richard aqueous extract on the antioxidants of mature tomato fruits at the red stage were investigated at 13 ± 2 °C and 80 ± 5 % relative humidity. A sample treated with sodium bicarbonate and untreated samples were included. Samples packaged in low density polyethylene (30 μm thickness) bags were analysed at intervals of 5 days. The treatments revealed statistically significant differences in the ascorbic acid content of stored tomato fruits. Fruits treated with 5 % X. aethiopica on day 5 of storage had 21.0 mg/100 g, which was significantly (p < 0.05) higher than the 18.2 mg/100 g in untreated control samples. On the 15th day of storage, ascorbic acid was 10.0 and 14.2 mg/100 g in tomato fruits treated with sodium bicarbonate and 5 % X. aethiopica, respectively. The carotenoid and lycopene contents were lower in the sodium bicarbonate-treated and untreated control samples than in the X. aethiopica-treated sample. The total phenolic contents were better retained in X. aethiopica-treated tomato than in the control. Treatment of tomato fruits with X. aethiopica at the 4 and 5 % levels significantly retained the qualities evaluated.

  12. Quality of major ion and total dissolved solids data from groundwater sampled by the National Water-Quality Assessment Program, 1992–2010

    USGS Publications Warehouse

    Gross, Eliza L.; Lindsey, Bruce D.; Rupert, Michael G.

    2012-01-01

    Field blank samples help determine the frequency and magnitude of contamination bias, and replicate samples help determine the sampling variability (error) of measured analyte concentrations. Quality control data were evaluated for calcium, magnesium, sodium, potassium, chloride, sulfate, fluoride, silica, and total dissolved solids. A 99-percent upper confidence limit is calculated from field blanks to assess the potential for contamination bias. For magnesium, potassium, chloride, sulfate, and fluoride, potential contamination in more than 95 percent of environmental samples is less than or equal to the common maximum reporting level. Contamination bias has little effect on measured concentrations greater than 4.74 mg/L (milligrams per liter) for calcium, 14.98 mg/L for silica, 4.9 mg/L for sodium, and 120 mg/L for total dissolved solids. Estimates of sampling variability are calculated for high and low ranges of concentration for major ions and total dissolved solids. Examples showing the calculation of confidence intervals and how to determine whether measured differences between two water samples are significant are presented.
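    The two quality-control calculations described above can be sketched as follows. The numbers are made up, and a simple normal-theory upper confidence limit on the mean blank concentration stands in for the report's 99-percent upper confidence limit (which may be computed differently); the replicate calculation uses the standard pooled standard deviation of duplicate pairs:

```python
import numpy as np
from scipy import stats

# Illustrative numbers only (mg/L), not the NAWQA data
blanks = np.array([0.01, 0.02, 0.00, 0.03, 0.01, 0.02, 0.05, 0.01])
reps = np.array([[4.10, 4.15], [2.50, 2.48], [7.90, 8.05], [3.30, 3.28]])

# One-sided 99-percent upper confidence limit on the mean blank concentration
n = blanks.size
ucl99 = blanks.mean() + stats.t.ppf(0.99, n - 1) * blanks.std(ddof=1) / np.sqrt(n)

# Sampling variability: pooled standard deviation of replicate pairs
d = reps[:, 0] - reps[:, 1]
s_rep = float(np.sqrt(np.sum(d**2) / (2 * len(d))))

def significantly_different(c1, c2, s=s_rep, z=1.96):
    """Is the difference between two measured concentrations larger than
    sampling variability alone would explain (~95% confidence)?"""
    return abs(c1 - c2) > z * np.sqrt(2.0) * s

print(round(float(ucl99), 4), round(s_rep, 4), significantly_different(4.0, 4.1))
```

    The last function mirrors the report's worked examples: a measured difference between two water samples is only meaningful once it exceeds the interval implied by replicate variability.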

  13. Impact of chlortetracycline and sulfapyridine antibiotics on soil enzyme activities

    NASA Astrophysics Data System (ADS)

    Molaei, Ali; Lakzian, Amir; Datta, Rahul; Haghnia, Gholamhosain; Astaraei, Alireza; Rasouli-Sadaghiani, MirHassan; Ceccherini, Maria T.

    2017-10-01

    Pharmaceutical antibiotics are frequently used in the livestock and poultry industries to control infectious diseases. Due to the lack of proper guidance for use, the majority of administered antibiotics and their metabolites are excreted into the soil environment through urine and feces. In the present study, we used chlortetracycline and sulfapyridine antibiotics to screen their effects on dehydrogenase, alkaline phosphatase and urease activity. Factorial experiments were conducted with different concentrations of antibiotic (0, 10, 25 and 100 mg kg-1 of soil) mixed with soil samples, and the enzyme activity was measured at intervals of 1, 4 and 21 days. The results show that chlortetracycline and sulfapyridine negatively affect the dehydrogenase activity, but the effect of sulfapyridine decreases with incubation time. Sulfapyridine significantly affects the alkaline phosphatase activity at all three time intervals, while chlortetracycline appears to inhibit its activity only on days 1 and 4 of incubation. The effects of chlortetracycline and sulfapyridine on urease activity appear similar, as both significantly affect the urease activity on day 1 of incubation. The present study concludes that chlortetracycline and sulfapyridine antibiotics have harmful effects on soil microbes, with the extent of the effects varying with the duration of incubation and the type of antibiotic used.

  14. Pharmacokinetics and effect on the corrected QT interval of single-dose escitalopram in healthy elderly compared with younger adults.

    PubMed

    Chung, Hyewon; Kim, Anhye; Lim, Kyoung Soo; Park, Sang-In; Yu, Kyung-Sang; Yoon, Seo Hyun; Cho, Joo-Youn; Chung, Jae-Yong

    2017-01-01

    Escitalopram is the (S)-enantiomer of citalopram that has a potential QT prolonging effect. In this study, 12 healthy elderly individuals received a single oral dose of escitalopram (20 mg), and their pharmacokinetics and QT effect data were compared with data from 33 younger adults obtained in a previous study. Serial blood samples for pharmacokinetic analysis were collected and ECG was performed up to 48 h postdose. The elderly and younger adults showed similar pharmacokinetic profiles. The geometric mean ratios (90% confidence interval) of the elderly compared with the younger adults were 1.02 (0.89-1.17) and 1.01 (0.86-1.17) for the maximum plasma concentration and area under the concentration-time curve, respectively. The mean baseline-adjusted QT (dQT) time profiles were similar and the mean values of maximum dQT were not significantly different between the elderly and the younger adults. The linear mixed-effect model indicated a weak but positive relationship between the escitalopram concentration and dQT, with an estimated coefficient of concentration of 0.43-0.54. In conclusion, the pharmacokinetics and QT effect of a single dose of escitalopram observed in the elderly without comorbidities and younger adults were generally similar.
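    The geometric mean ratios and 90% confidence intervals reported above are conventionally computed on log-transformed exposure metrics. A sketch with simulated AUC values (not the study's data) and a simple unpaired, equal-variance t interval (the study's actual model may differ):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Simulated AUC values (arbitrary units) for the two groups
auc_elderly = rng.lognormal(np.log(1000.0), 0.25, 12)
auc_young = rng.lognormal(np.log(1000.0), 0.25, 33)

la, lb = np.log(auc_elderly), np.log(auc_young)
diff = la.mean() - lb.mean()
se = np.sqrt(la.var(ddof=1) / la.size + lb.var(ddof=1) / lb.size)
df = la.size + lb.size - 2                  # simple pooled df
t90 = stats.t.ppf(0.95, df)                 # two-sided 90% interval

gmr = float(np.exp(diff))
ci = (float(np.exp(diff - t90 * se)), float(np.exp(diff + t90 * se)))
print("GMR:", round(gmr, 3), "90% CI:", tuple(round(c, 3) for c in ci))
```

    Exponentiating the difference of log means is what turns an additive confidence interval into the multiplicative ratio scale on which bioequivalence-style comparisons are reported.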

  15. Effect of precursor concentration and film thickness deposited by layer on nanostructured TiO2 thin films

    NASA Astrophysics Data System (ADS)

    Affendi, I. H. H.; Sarah, M. S. P.; Alrokayan, Salman A. H.; Khan, Haseeb A.; Rusop, M.

    2018-05-01

    The sol-gel spin-coating method was used to produce nanostructured TiO2 thin films. The surface topology and morphology were observed using Atomic Force Microscopy (AFM) and Field Emission Scanning Electron Microscopy (FESEM). The electrical properties were investigated using two-probe current-voltage (I-V) measurements to study the electrical resistivity and hence the conductivity of the thin films. The solution concentration was varied from 14.0 to 0.01 wt% in 0.02 wt% steps (with a 0.01 wt% step between the final concentrations of 0.02 and 0.01 wt%) to find which concentration had the highest conductivity; the optimized concentration was then carried forward to the thickness study, based on layer-by-layer deposition from 1 to 6 layers. The results show that at the lowest TiO2 concentration the surface becomes more uniform and the conductivity increases: the 0.01 wt% sample had a conductivity of 1.77E-10 S/m and was used for the thickness study. In the thickness study, the 3-layer deposition was chosen, as its conductivity was the highest at 3.9098E9 S/m.

  16. Influences of two high intensity interval exercise protocols on the main determinants of blood fluidity in overweight men.

    PubMed

    Ahmadizad, Sajad; Bassami, Minoo; Hadian, Mohsen; Eslami, Maryam

    2016-01-01

    Acute effects of continuous exercise on the markers of blood fluidity have been addressed in different populations, and the changes are intensity related. However, the effect of different high intensity interval exercise (HIIE) protocols on these variables is unclear. This study was designed to determine the effects of two HIIE protocols with different work/rest ratios but the same energy expenditure on the main determinants of blood fluidity. Ten overweight men (age, 26.3±1.7 yrs) completed two HIIE protocols on two separate occasions, one week apart. The two protocols comprised: 1) 6 intervals of 2 min activity at 85% of VO2max interspersed by 2 min active recovery at 30% of VO2max (ratio 1 to 1, HIIE1/1), and 2) 6 intervals of 30 s activity at 110% of VO2max interspersed by 4 min active recovery at 40% of VO2max (ratio 1 to 8, HIIE1/8). Each exercise trial was followed by 30 min rest. Venous blood samples were obtained before exercise, immediately after exercise and after recovery and analyzed for blood and plasma viscosity, fibrinogen and red blood cell indices. The HIIE1/1 protocol led to a greater reduction (P < 0.01) in plasma volume compared to HIIE1/8 (9.9% vs 5.7%). Moreover, the increases in blood viscosity, plasma viscosity, hematocrit, RBC count and mean arterial blood pressure observed following HIIE1/1 were significantly (P < 0.05) higher than those following HIIE1/8; whereas the changes in fibrinogen concentration were neither significant in either trial nor significantly different between the two protocols (P > 0.05). However, the changes in all variables during exercise were transient and returned to baseline levels after 30 min of recovery. 
It is concluded that the HIIE protocol with lower intensity and shorter rest intervals (higher work to rest ratio) clearly results in more physiological strain than HIIE with higher intensity but longer rest intervals (lower work to rest ratio) in overweight individuals, and that the work to rest ratio could be as important as exercise intensity when considering the hemorheological variables during HIIE.

  17. A new time calibration method for switched-capacitor-array-based waveform samplers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H.; Chen, C. -T.; Eclov, N.

    2014-08-24

    Here we have developed a new time calibration method for the DRS4 waveform sampler that enables us to precisely measure the non-uniform sampling interval inherent in the switched-capacitor cells of the DRS4. The method uses the proportionality between the differential amplitude and sampling interval of adjacent switched-capacitor cells responding to a sawtooth-shape pulse. In the experiment, a sawtooth-shape pulse with a 40 ns period generated by a Tektronix AWG7102 is fed to a DRS4 evaluation board for calibrating the sampling intervals of all 1024 cells individually. The electronic time resolution of the DRS4 evaluation board with the new time calibration is measured to be ~2.4 ps RMS by using two simultaneous Gaussian pulses with 2.35 ns full-width at half-maximum and applying a Gaussian fit. The time resolution dependencies on the time difference with the new time calibration are measured and compared to results obtained by another method. Ultimately, the new method could be applicable for other switched-capacitor-array technology-based waveform samplers for precise time calibration.

  18. A new time calibration method for switched-capacitor-array-based waveform samplers

    NASA Astrophysics Data System (ADS)

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Moses, W.; Choong, W.-S.; Kao, C.-M.

    2014-12-01

    We have developed a new time calibration method for the DRS4 waveform sampler that enables us to precisely measure the non-uniform sampling interval inherent in the switched-capacitor cells of the DRS4. The method uses the proportionality between the differential amplitude and sampling interval of adjacent switched-capacitor cells responding to a sawtooth-shape pulse. In the experiment, a sawtooth-shape pulse with a 40 ns period generated by a Tektronix AWG7102 is fed to a DRS4 evaluation board for calibrating the sampling intervals of all 1024 cells individually. The electronic time resolution of the DRS4 evaluation board with the new time calibration is measured to be 2.4 ps RMS by using two simultaneous Gaussian pulses with 2.35 ns full-width at half-maximum and applying a Gaussian fit. The time resolution dependencies on the time difference with the new time calibration are measured and compared to results obtained by another method. The new method could be applicable for other switched-capacitor-array technology-based waveform samplers for precise time calibration.
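    The core of the method is that, for a ramp of known slope, the amplitude step between adjacent cells encodes that cell's sampling interval: Δt_i = Δv_i / slope. A toy reconstruction with simulated values (not DRS4 readout; interval spread and noise levels are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 1024

# Simulated "true" cell-to-cell sampling intervals (ns): non-uniform by design
true_dt = np.clip(rng.normal(1.0, 0.1, n_cells), 0.5, 1.5)

slope = 0.1   # sawtooth ramp slope in V/ns (assumed known)
# On a linear ramp, the amplitude difference between adjacent cells is
# proportional to that cell's sampling interval (plus readout noise).
dv = slope * true_dt + rng.normal(0.0, 1e-4, n_cells)

est_dt = dv / slope   # calibrated per-cell intervals
print("max calibration residual (ns):", np.max(np.abs(est_dt - true_dt)))
```

    In practice one would average over many sawtooth periods to suppress readout noise and handle samples that straddle the ramp turnaround.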

  19. Interpretable functional principal component analysis.

    PubMed

    Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo

    2016-09-01

    Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
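    The idea of FPCs that are exactly zero outside the intervals of major variation can be illustrated with a generic sparse-PCA sketch in the same penalty-plus-deflation spirit (this is not the authors' algorithm; the thresholded power iteration and all toy data below are assumptions for illustration):

```python
import numpy as np

def sparse_pc(X, lam, n_iter=200):
    """First sparse principal component via soft-thresholded power
    iteration on the covariance-like matrix X'X.
    X : (n_samples, n_points) centered, discretized curves."""
    C = X.T @ X
    v = np.linalg.svd(C)[0][:, 0]          # warm start: leading eigenvector
    for _ in range(n_iter):
        u = C @ v
        u = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft-threshold
        norm = np.linalg.norm(u)
        if norm == 0.0:
            break
        v = u / norm
    return v

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 100)
# Toy curves that vary only on the sub-interval (0.3, 0.6)
bump = np.where((t > 0.3) & (t < 0.6), np.sin((t - 0.3) / 0.3 * np.pi), 0.0)
scores = rng.normal(0.0, 1.0, (50, 1))
X = scores * bump + rng.normal(0.0, 0.1, (50, 100))
X = X - X.mean(axis=0)

v = sparse_pc(X, lam=15.0)
print("nonzero points:", np.count_nonzero(v))   # confined to (0.3, 0.6)
```

    The thresholding forces the component to be exactly zero where the curves show no systematic variation, which is the interpretability property the paper formalizes; subsequent components would be obtained by deflating C and repeating.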

  20. A New Time Calibration Method for Switched-capacitor-array-based Waveform Samplers.

    PubMed

    Kim, H; Chen, C-T; Eclov, N; Ronzhin, A; Murat, P; Ramberg, E; Los, S; Moses, W; Choong, W-S; Kao, C-M

    2014-12-11

    We have developed a new time calibration method for the DRS4 waveform sampler that enables us to precisely measure the non-uniform sampling interval inherent in the switched-capacitor cells of the DRS4. The method uses the proportionality between the differential amplitude and sampling interval of adjacent switched-capacitor cells responding to a sawtooth-shape pulse. In the experiment, a sawtooth-shape pulse with a 40 ns period generated by a Tektronix AWG7102 is fed to a DRS4 evaluation board for calibrating the sampling intervals of all 1024 cells individually. The electronic time resolution of the DRS4 evaluation board with the new time calibration is measured to be ~2.4 ps RMS by using two simultaneous Gaussian pulses with 2.35 ns full-width at half-maximum and applying a Gaussian fit. The time resolution dependencies on the time difference with the new time calibration are measured and compared to results obtained by another method. The new method could be applicable for other switched-capacitor-array technology-based waveform samplers for precise time calibration.
