Sample records for systematic sampling method

  1. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    PubMed

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long-term follow-up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether sample size, besides sampling method, also had an impact on prognostic value, the SRS method was additionally tested with a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, evidently due to (unconscious) exclusion of small and large nuclei. When a series of cut-off points was tested for prognostic value, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.
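
    To make the contrast between the two designs concrete, here is a minimal Python sketch with simulated nuclear areas rather than the study's data; the 10% trimming used to caricature 'at convenience' selection is an invented assumption:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated nuclear areas (um^2) for one tumour; values are illustrative only.
    areas = rng.lognormal(mean=4.0, sigma=0.35, size=2000)

    def systematic_random_sample(values, n):
        """SRS: one random start, then every k-th nucleus across the whole section."""
        k = len(values) // n
        start = rng.integers(0, k)
        return values[start::k][:n]

    def at_convenience_sample(values, n):
        """ACS caricature: the observer (unconsciously) avoids extreme nuclei."""
        trimmed = np.sort(values)[len(values) // 10 : -len(values) // 10]
        return rng.choice(trimmed, size=n, replace=False)

    for name, sample in [("SRS-50", systematic_random_sample(areas, 50)),
                         ("ACS-50", at_convenience_sample(areas, 50))]:
        print(f"{name}: MNA = {sample.mean():.1f}, SDNA = {sample.std(ddof=1):.1f}")
    ```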

  2. Comparative analysis of whole mount processing and systematic sampling of radical prostatectomy specimens: pathological outcomes and risk of biochemical recurrence.

    PubMed

    Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A

    2010-10-01

    Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group, and whole mount detected more multiple surgical margins (each p < 0.01). There were no significant differences in the likelihood of biochemical recurrence between the pathological methods when patients were stratified by pathological outcome. Except for estimated tumor volume and multiple margins, whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication.

  3. Dark Energy Survey Year 1 results: cross-correlation redshifts - methods and systematics characterization

    NASA Astrophysics Data System (ADS)

    Gatti, M.; Vielzeuf, P.; Davis, C.; Cawthon, R.; Rau, M. M.; DeRose, J.; De Vicente, J.; Alarcon, A.; Rozo, E.; Gaztanaga, E.; Hoyle, B.; Miquel, R.; Bernstein, G. M.; Bonnett, C.; Carnero Rosell, A.; Castander, F. J.; Chang, C.; da Costa, L. N.; Gruen, D.; Gschwend, J.; Hartley, W. G.; Lin, H.; MacCrann, N.; Maia, M. A. G.; Ogando, R. L. C.; Roodman, A.; Sevilla-Noarbe, I.; Troxel, M. A.; Wechsler, R. H.; Asorey, J.; Davis, T. M.; Glazebrook, K.; Hinton, S. R.; Lewis, G.; Lidman, C.; Macaulay, E.; Möller, A.; O'Neill, C. R.; Sommer, N. E.; Uddin, S. A.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Allam, S.; Annis, J.; Bechtol, K.; Brooks, D.; Burke, D. L.; Carollo, D.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; DePoy, D. L.; Desai, S.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Hoormann, J. K.; Jain, B.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Li, T. S.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Rykoff, E. S.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sheldon, E.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, B. E.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.

    2018-06-01

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing source galaxies from the Dark Energy Survey Year 1 sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We apply the method to two photo-z codes run in our simulated data: Bayesian Photometric Redshift and Directional Neighbourhood Fitting. We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering versus photo-zs. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. We discuss possible ways to mitigate the impact of our dominant systematics in future analyses.
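
    As a toy illustration of the calibration step described here (not the DES pipeline itself), the sketch below estimates the mean-redshift bias of a photo-z n(z) against a clustering-based n(z) and shifts the former to match; the Gaussian shapes and redshift grid are invented for the example:

    ```python
    import numpy as np

    # Toy redshift grid and two n(z) estimates (invented shapes, not DES data).
    z = np.linspace(0.0, 2.0, 401)

    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    n_photoz = gaussian(z, 0.62, 0.15)      # n(z) from a photo-z code
    n_clustering = gaussian(z, 0.60, 0.17)  # n(z) from cross-correlations

    def mean_z(nz):
        return np.average(z, weights=nz)

    # The calibration treats the offset of the means as the photo-z bias and
    # shifts the photo-z n(z) to remove it.
    delta_z = mean_z(n_photoz) - mean_z(n_clustering)
    n_calibrated = np.interp(z + delta_z, z, n_photoz)

    print(f"estimated mean-redshift bias: {delta_z:+.4f}")
    print(f"calibrated mean redshift: {mean_z(n_calibrated):.4f}")
    ```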

  4. Efficiently estimating salmon escapement uncertainty using systematically sampled data

    USGS Publications Warehouse

    Reynolds, Joel H.; Woody, Carol Ann; Gove, Nancy E.; Fair, Lowell F.

    2007-01-01

    Fish escapement is generally monitored using nonreplicated systematic sampling designs (e.g., via visual counts from towers or hydroacoustic counts). These sampling designs support a variety of methods for estimating the variance of the total escapement. Unfortunately, all the methods give biased results, with the magnitude of the bias being determined by the underlying process patterns. Fish escapement commonly exhibits positive autocorrelation and nonlinear patterns, such as diurnal and seasonal patterns. For these patterns, a poor choice of variance estimator can needlessly increase the uncertainty managers have to deal with in sustaining fish populations. We illustrate the effect of sampling design and variance estimator choice on variance estimates of total escapement for anadromous salmonids from systematic samples of fish passage. Using simulated tower counts of sockeye salmon Oncorhynchus nerka escapement on the Kvichak River, Alaska, five variance estimators for nonreplicated systematic samples were compared to determine the least biased. Using the least biased variance estimator, four confidence interval estimators were compared for expected coverage and mean interval width. Finally, five systematic sampling designs were compared to determine the design giving the smallest average variance estimate for total annual escapement. For nonreplicated systematic samples of fish escapement, all variance estimators were positively biased. Compared to the other estimators, the least biased estimator reduced bias by between 12% and 98% on average. All confidence intervals gave effectively identical results. Replicated systematic sampling designs consistently provided the smallest average estimated variance among those compared.
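
    The abstract does not name the five estimators compared; as background, a hedged sketch contrasting the naive simple-random-sampling variance formula with the successive-difference estimator commonly recommended for a single systematic sample (simulated diurnal passage counts, not Kvichak River data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated hourly passage counts with a diurnal cycle plus noise (illustrative).
    hours = np.arange(24 * 30)                       # a 30-day run of tower counts
    passage = 500 + 400 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 50, hours.size)
    passage = np.clip(passage, 0, None)

    k = 6                                            # count 1 hour in every 6
    sample = passage[rng.integers(0, k)::k]
    n, N = sample.size, passage.size
    total_hat = N * sample.mean()                    # expansion estimate of the total

    # Naive estimator: treats the systematic sample as a simple random sample,
    # so smooth diurnal/seasonal structure inflates the apparent variance.
    var_srs = N**2 * (1 - n / N) * sample.var(ddof=1) / n

    # Successive-difference estimator: built from differences of neighbouring
    # sampled hours, a common choice for a nonreplicated systematic sample.
    var_sd = N**2 * (1 - n / N) * (np.diff(sample) ** 2).sum() / (2 * n * (n - 1))

    print(f"estimated total passage: {total_hat:,.0f}")
    print(f"variance (SRS formula): {var_srs:.3g}")
    print(f"variance (successive differences): {var_sd:.3g}")
    ```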

  5. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    PubMed

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. The method significantly boosts sample economy, yielding several minutes of MS signal duration from a sample volume of merely picoliters. The elongated MS signal duration enables abundant MS2 information to be collected on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics, yielding 2-D profiles of metabolites (including exact mass and MS2 data) from single plant and mammalian cells: 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  6. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    PubMed

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    Objective: To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. Methods: A 50 m × 50 m quadrat experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. Simple random sampling, systematic sampling and stratified random sampling were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. Results: The minimum sample sizes for the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.2217, 0.3024 and 0.0478, respectively. Conclusion: Spatial stratified sampling with altitude as the stratum variable is an efficient approach of lower cost and higher precision for the snail survey.
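
    A minimal sketch of the three designs compared, run on a synthetic 50 × 50 grid of snail counts with an altitude gradient (all numbers invented; the stratification by altitude quintile is an illustrative stand-in for the paper's strata):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic 50 x 50 grid of snail counts with an altitude gradient (invented).
    altitude = np.linspace(0, 1, 50)[:, None] * np.ones((50, 50))
    counts = rng.poisson(lam=1 + 4 * altitude)
    flat, alt_flat = counts.ravel(), altitude.ravel()
    true_mean, n = counts.mean(), 225

    def simple_random():
        return flat[rng.choice(flat.size, n, replace=False)].mean()

    def systematic():
        k = flat.size // n
        return flat[rng.integers(0, k)::k][:n].mean()

    def stratified():
        # Five equal-area altitude strata with equal allocation (45 quadrats each).
        strata = np.digitize(alt_flat, np.quantile(alt_flat, [0.2, 0.4, 0.6, 0.8]))
        means = [flat[strata == s][rng.choice((strata == s).sum(), n // 5, replace=False)].mean()
                 for s in range(5)]
        return np.mean(means)   # equal-sized strata, so the plain average is unbiased

    for name, estimator in [("simple random", simple_random),
                            ("systematic", systematic),
                            ("stratified", stratified)]:
        print(f"{name}: absolute error {abs(estimator() - true_mean):.4f}")
    ```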

  7. Dark Energy Survey Year 1 Results: Cross-Correlation Redshifts - Methods and Systematics Characterization

    DOE PAGES

    Gatti, M.

    2018-02-22

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing (WL) source galaxies from the Dark Energy Survey Year 1 (DES Y1) sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We also apply the method to three photo-z codes run in our simulated data: Bayesian Photometric Redshift (BPZ), Directional Neighbourhood Fitting (DNF), and Random Forest-based photo-z (RF). We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering versus photo-z's. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. We discuss possible ways to mitigate the impact of our dominant systematics in future analyses.

  8. Systematic study of inorganic functionalization of ZnO nanorods by Sol-Gel method

    NASA Astrophysics Data System (ADS)

    Gamarra, J. K.; Solano, C.; Piñeres, I.; Gómez, H.; Mass, J.; Montenegro, D. N.

    2017-01-01

    A systematic study of the inorganic surface functionalization of ZnO nanostructures by the sol-gel method is presented. We emphasize the evolution of the morphological properties of the samples as a function of the functionalization parameters. In addition, the effects on thermal stability and some optical properties of the samples are discussed.

  9. Systematic Evaluation of Aggressive Air Sampling for Bacillus ...

    EPA Pesticide Factsheets

    The primary objectives of this project were to evaluate the Aggressive Air Sampling (AAS) method against currently used surface sampling methods and to determine whether AAS is a viable option for sampling Bacillus anthracis spores.

  10. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method for quantifying a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
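
    The core of such a software-based approach, generating a systematic random grid of sampling sites, fits in a few lines; the field dimensions and step sizes below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def srs_sites(width_um, height_um, step_x_um, step_y_um):
        """Systematic random sampling: one random offset, then equidistant sites."""
        x0 = rng.uniform(0, step_x_um)   # random start inside the first interval
        y0 = rng.uniform(0, step_y_um)
        xs = np.arange(x0, width_um, step_x_um)
        ys = np.arange(y0, height_um, step_y_um)
        return [(round(x), round(y)) for y in ys for x in xs]

    # Hypothetical 5 mm x 3 mm region of interest sampled every 500 um in x and y.
    sites = srs_sites(5000, 3000, 500, 500)
    print(f"{len(sites)} sampling sites; first site: {sites[0]}")
    ```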

  11. Hard-to-reach populations of men who have sex with men and sex workers: a systematic review on sampling methods.

    PubMed

    Barros, Ana B; Dias, Sonia F; Martins, Maria Rosario O

    2015-10-30

    In public health, hard-to-reach populations are often recruited by non-probabilistic sampling methods that produce biased results. To overcome this, several sampling methods have been improved and developed in recent years. The aim of this systematic review was to identify all current methods used to survey most-at-risk populations of men who have sex with men and sex workers. The review also aimed to assess whether there were any relations between the study populations and the sampling methods used to recruit them. Lastly, we wanted to assess whether the number of publications originating in middle and low human development (MLHD) countries had been increasing in recent years. A systematic review was conducted using electronic databases, and a total of 268 published studies were included in the analysis. In this review, 11 recruitment methods were identified. Semi-probabilistic methods were used most commonly to survey men who have sex with men, and the Internet was the method that gathered the most respondents. We found that female sex workers were more frequently recruited through non-probabilistic methods than men who have sex with men (odds = 2.2; p < 0.05; confidence interval (CI) [1.1-4.2]). In the last 6 years, the number of studies based in middle and low human development countries increased more than the number of studies based in very high and high human development countries (odds = 2.5; p < 0.05; CI [1.3-4.9]). This systematic literature review identified 11 methods used to sample men who have sex with men and female sex workers. There is an association between the type of sampling method and the population being studied. The number of studies based in middle and low human development countries increased over the last 6 years of the study period.

  12. Including mixed methods research in systematic reviews: examples from qualitative syntheses in TB and malaria control.

    PubMed

    Atkins, Salla; Launiala, Annika; Kagaha, Alexander; Smith, Helen

    2012-04-30

    Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research.

  13. Sampling bee communities using pan traps: alternative methods increase sample size

    USDA-ARS?s Scientific Manuscript database

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation have encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  14. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods

    Treesearch

    Cynthia D. Huebner

    2007-01-01

    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  15. Including mixed methods research in systematic reviews: Examples from qualitative syntheses in TB and malaria control

    PubMed Central

    2012-01-01

    Background: Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. Methods: We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Results: Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Conclusions: Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research. PMID:22545681

  16. Reviewing the research methods literature: principles and strategies illustrated by a systematic overview of sampling in qualitative research.

    PubMed

    Gentles, Stephen J; Charles, Cathy; Nicholas, David B; Ploeg, Jenny; McKibbon, K Ann

    2016-10-11

    Overviews of methods are potentially useful means to increase clarity and enhance collective understanding of specific methods topics that may be characterized by ambiguity, inconsistency, or a lack of comprehensiveness. This type of review represents a distinct literature synthesis method, although to date, its methodology remains relatively undeveloped despite several aspects that demand unique review procedures. The purpose of this paper is to initiate discussion about what a rigorous systematic approach to reviews of methods, referred to here as systematic methods overviews, might look like by providing tentative suggestions for approaching specific challenges likely to be encountered. The guidance offered here was derived from experience conducting a systematic methods overview on the topic of sampling in qualitative research. The guidance is organized into several principles that highlight specific objectives for this type of review given the common challenges that must be overcome to achieve them. Optional strategies for achieving each principle are also proposed, along with discussion of how they were successfully implemented in the overview on sampling. We describe seven paired principles and strategies that address the following aspects: delimiting the initial set of publications to consider, searching beyond standard bibliographic databases, searching without the availability of relevant metadata, selecting publications on purposeful conceptual grounds, defining concepts and other information to abstract iteratively, accounting for inconsistent terminology used to describe specific methods topics, and generating rigorous verifiable analytic interpretations. Since a broad aim in systematic methods overviews is to describe and interpret the relevant literature in qualitative terms, we suggest that iterative decision making at various stages of the review process and a rigorous qualitative approach to analysis are necessary features of this review type. We believe that the principles and strategies provided here will be useful to anyone choosing to undertake a systematic methods overview. This paper represents an initial effort to promote high-quality critical evaluations of the literature regarding problematic methods topics, which have the potential to promote clearer, shared understandings and accelerate advances in research methods. Further work is warranted to develop more definitive guidance.

  17. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much.

    PubMed

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance.
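
    A minimal sketch of the two scan orders on a standard bivariate normal target (the correlation value is illustrative); both samplers share the same conditional updates and differ only in how coordinates are visited:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    rho = 0.9  # target: standard bivariate normal with this correlation (illustrative)

    def gibbs(n_iter, scan):
        x = np.zeros(2)
        trace = np.empty((n_iter, 2))
        for t in range(n_iter):
            # Systematic scan sweeps both coordinates in a fixed order;
            # random scan updates a single uniformly chosen coordinate.
            order = (0, 1) if scan == "systematic" else (rng.integers(2),)
            for i in order:
                j = 1 - i
                # Exact conditional: x_i | x_j ~ N(rho * x_j, 1 - rho^2)
                x[i] = rng.normal(rho * x[j], np.sqrt(1 - rho**2))
            trace[t] = x
        return trace

    for scan in ("systematic", "random"):
        tr = gibbs(20000, scan)[5000:]          # discard burn-in
        print(f"{scan} scan: sample correlation {np.corrcoef(tr.T)[0, 1]:.3f} (target {rho})")
    ```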

  18. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much

    PubMed Central

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance. PMID:28344429

  19. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    PubMed

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), taking plant abundance as the auxiliary variable, was explored in an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into five layers by the plant abundance data; then, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method (SOPA) proposed in this study had the smallest absolute error, 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
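
    The Hammond-McCullagh allocation itself is not given in the abstract; as an illustrative stand-in, here is the more familiar Neyman allocation, which likewise sizes each stratum's sample from its size and variability (all stratum figures invented):

    ```python
    # Hypothetical strata from plant abundance: stratum -> (size N_h, within-stratum SD S_h).
    strata = {"very low": (900, 0.2), "low": (700, 0.5), "medium": (500, 0.9),
              "high": (300, 1.4), "very high": (100, 2.0)}

    n_total = 225  # overall sample size (the figure reported for stratified sampling above)

    # Neyman allocation: n_h proportional to N_h * S_h minimises the variance of the
    # estimated mean; rounding can shift the total by one or two points.
    weights = {name: N * S for name, (N, S) in strata.items()}
    total_weight = sum(weights.values())
    allocation = {name: round(n_total * w / total_weight) for name, w in weights.items()}
    print(allocation)
    ```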

  20. A Comparison of Two Sampling Strategies to Assess Discomycete Diversity in Wet Tropical Forests

    Treesearch

    SHARON A. CANTRELL

    2004-01-01

    Most fungal diversity studies that have used a systematic collecting scheme have not included the discomycetes, so optimal sampling methods are not available for this group. In this study, I tested two sampling methods at sites in the Caribbean National Forest, Puerto Rico, and the Ebano Verde Reserve, Dominican Republic. For a plot-based sampling method, 10 ×...

  21. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For methods of analyzing the stability of a system using statistics, it is difficult to resolve the problems of unknown probability distributions and small samples. A novel method is therefore proposed in this paper to resolve these problems. The method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound and the upper bound of the system using fuzzy-set theory. The empirical distribution function is then investigated to ensure a confidence level above 95%, and the degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems. The method of analysis for systematic stability is thereby validated.

  22. Obtaining Self-Samples to Diagnose Curable Sexually Transmitted Infections: A Systematic Review of Patients’ Experiences

    PubMed Central

    Paudyal, Priyamvada; Llewellyn, Carrie; Lau, Jason; Mahmud, Mohammad; Smith, Helen

    2015-01-01

    Background: Routine screening is key to sexually transmitted infection (STI) prevention and control. Previous studies suggest that clinic-based screening programmes capture only a small proportion of people with STIs. Self-sampling using non- or minimally invasive techniques may be beneficial for those reluctant to actively engage with conventional sampling methods. We systematically reviewed studies of patients’ experiences of obtaining self-samples to diagnose curable STIs. Methods: We conducted an electronic search of MEDLINE, EMBASE, CINAHL, PsychINFO, BNI, and the Cochrane Database of Systematic Reviews to identify relevant articles published in English between January 1980 and March 2014. Studies were included if participants self-sampled for the diagnosis of a curable STI and had specifically been asked for their opinions of their experience, acceptability, preferences, or willingness to self-sample. Results: The initial search yielded 558 references. Of these, 45 studies met the inclusion criteria. Thirty-six studies assessed patients’ acceptability and experiences of self-sampling. Pooled results from these studies show that self-sampling is a highly acceptable method, with 85% of patients reporting it to be well received and acceptable. Twenty-eight studies reported on ease of self-sampling; the majority of patients (88%) in these studies found self-sampling an “easy” procedure. Self-sampling was favoured over clinician sampling, and home sampling was preferred to clinic-based sampling. Female and older participants were more accepting of self-sampling. Only a small minority of participants (13%) reported pain during self-sampling. Participants were willing to undergo self-sampling and to recommend it to others. Privacy and safety were the most common concerns. Conclusion: Self-sampling for diagnostic testing is well accepted, with the majority having a positive experience and willingness to use it again. Standardization of self-sampling procedures and rigorous validation of outcome measurement will lead to better comparability across studies. Future studies need to conduct rigorous economic evaluations of self-sampling to inform policy development for the management of STIs. PMID:25909508

  23. Point Intercept (PO)

    Treesearch

    John F. Caratti

    2006-01-01

    The FIREMON Point Intercept (PO) method is used to assess changes in plant species cover or ground cover for a macroplot. This method uses a narrow-diameter sampling pole or sampling pins, placed at systematic intervals along line transects, to sample within-plot variation and quantify statistically valid changes in plant species cover and height over time. Plant...

  24. Evaluation of three sampling methods to monitor outcomes of antiretroviral treatment programmes in low- and middle-income countries.

    PubMed

    Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François

    2010-11-10

    Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral Therapy in Lower Income Countries of the International Databases to Evaluate AIDS, and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and the proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5th and 97.5th percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates at some sites.
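
    A hedged sketch of the three sampling approaches applied to a synthetic cohort with the abstract's overall retention rate of 76.5% (cohort size, sample size, and the consecutive-block reading of "consecutive sampling" are assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic cohort: 1 = retained on ART at 12 months (rate from the abstract).
    cohort = (rng.random(1000) < 0.765).astype(float)
    n = 200

    random_sample = rng.choice(cohort, n, replace=False)
    k = cohort.size // n
    systematic_sample = cohort[rng.integers(0, k)::k][:n]
    start = rng.integers(0, cohort.size - n)            # one consecutive block of enrolees
    consecutive_sample = cohort[start:start + n]

    print(f"full cohort retention: {100 * cohort.mean():.1f}%")
    for name, s in [("random", random_sample), ("systematic", systematic_sample),
                    ("consecutive", consecutive_sample)]:
        print(f"{name} sample estimate: {100 * s.mean():.1f}%")
    ```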

  25. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  26. U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf systematics of returned Mars samples

    NASA Technical Reports Server (NTRS)

    Tatsumoto, M.; Premo, W. R.

    1988-01-01

    The advantage of studying returned planetary samples cannot be overstated. A wider range of analytical techniques with higher sensitivities and accuracies can be applied to returned samples. Measurement of U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf isotopic systematics for chronology and isotopic tracer studies of planetary specimens cannot be done in situ with desirable precision. Returned Mars samples will be examined using all the physical, chemical, and geologic methods necessary to gain information on the origin and evolution of Mars. A returned Martian sample would provide ample information regarding the accretionary and evolutionary history of the Martian planetary body and possibly other planets of our solar system.

  27. Random vs. systematic sampling from administrative databases involving human subjects.

    PubMed

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes (n = 50, 100, 150, 200, 250, 300, 500, 800). From the profile summaries of the four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-method χ² tests and unpaired t-tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise-comparison method). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient; it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.

  28. Changing Heterosexuals' Attitudes toward Homosexuals: A Systematic Review of the Empirical Literature

    ERIC Educational Resources Information Center

    Tucker, Edmon W.; Potocky-Tripodi, Miriam

    2006-01-01

    Objective: This article systematically reviews evidence for interventions that change attitudes toward homosexuals. Method: In all, 17 empirical studies using college and/or university student samples and interventions intended to improve heterosexuals' attitudes toward lesbian, gay, or bisexual individuals are reviewed. Characteristics of the…

  29. Minimally-invasive Sampling of Interleukin-1α and Interleukin-1 Receptor Antagonist from the Skin: A Systematic Review of In vivo Studies in Humans.

    PubMed

    Falcone, Denise; Spee, Pieter; van de Kerkhof, Peter C M; van Erp, Piet E J

    2017-10-02

    Interleukin-1α (IL-1α) and its receptor antagonist IL-1RA play a pivotal role in skin homeostasis and disease. Although the use of biopsies to sample these cytokines from human skin is widely employed in dermatological practice, knowledge about less invasive, in vivo sampling methods is scarce. The aim of this study was to provide an overview of such methods by systematically reviewing studies in Medline, EMBASE, Web of Science and the Cochrane Library using combinations of the terms "IL-1α", "IL-1RA", "skin" and "human", including all possible synonyms. Quality was assessed using the STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist. The search, performed on 14 October 2016, revealed 10 different sampling methods with varying degrees of invasiveness and a wide application spectrum, including assessment of both normal and diseased skin from several body sites. The possibility of sampling quantifiable amounts of cytokines from human skin with no or minimal discomfort holds promise for linking clinical outcomes to molecular profiles of skin inflammation.

  30. Sampling methods for titica vine (Heteropsis spp.) inventory in a tropical forest

    Treesearch

    Carine Klauberg; Edson Vidal; Carlos Alberto Silva; Michelliny de M. Bentes; Andrew Thomas Hudak

    2016-01-01

    Titica vine provides useful raw fiber material. Using sampling schemes that reduce sampling error can provide direction for sustainable forest management of this vine. Sampling systematically with rectangular plots (10 × 25 m) promoted lower error and greater accuracy in the inventory of titica vines in tropical rainforest.

  31. Control of a Clonal Outbreak of Multidrug-Resistant Acinetobacter baumannii in a Hospital of the Basque Country after the Introduction of Environmental Cleaning Led by the Systematic Sampling from Environmental Objects.

    PubMed

    Delgado Naranjo, Jesús; Villate Navarro, José Ignacio; Sota Busselo, Mercedes; Martínez Ruíz, Alberto; Hernández Hernández, José María; Torres Garmendia, María Pilar; Urcelay López, María Isabel

    2013-01-01

    Background. Between July 2009 and September 2010, an outbreak of multidrug-resistant (MDR) Acinetobacter baumannii was detected in one critical care unit of a tertiary hospital in the Basque Country, involving 49 infected and 16 colonized patients. The aim was to evaluate the impact of environmental cleaning and systematic sampling from environmental objects on the risk of infection by MDR A. baumannii. Methods. After systematic sampling from environmental objects and molecular typing of all new MDR A. baumannii strains from patients and environmental isolates, we analyzed the correlation (Pearson's r) between newly infected cases and positive environmental samples. The risk ratio (RR) of infection was estimated with Poisson regression. Results. The risk increased with the number of positive samples in common areas (RR = 1.40; 95% CI = 0.99-1.94) and with positive samples in boxes (RR = 1.19; 95% CI = 1.01-1.40). The number of cases also correlated positively with positive samples in boxes (r = 0.50; P < 0.05) and common areas (r = 0.29; P = 0.18). Conclusion. Once conventional measures have failed, environmental cleaning, guided by systematic sampling from environmental objects, produced an objective reduction in the risk of new cases and enabled full control of the outbreak.

  32. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    PubMed

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. Review limitations: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by the author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
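
    A minimal sketch of the randomization-test alternative mentioned here: the group labels are repeatedly shuffled to build a reference distribution for the observed difference, so no appeal to random sampling is needed (the scores are synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Synthetic outcome scores for two convenience-sampled groups (illustrative).
    group_a = rng.normal(52, 10, 40)
    group_b = rng.normal(47, 10, 35)

    observed = group_a.mean() - group_b.mean()
    pooled = np.concatenate([group_a, group_b])

    # Build the null distribution by shuffling group labels, not by assuming
    # the groups were randomly sampled from a population.
    n_perm, extreme = 10000, 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:group_a.size].mean() - pooled[group_a.size:].mean()
        extreme += abs(diff) >= abs(observed)

    print(f"observed difference: {observed:.2f}")
    print(f"randomization-test p-value: {(extreme + 1) / (n_perm + 1):.4f}")
    ```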

  33. The Reporting Quality of Systematic Reviews and Meta-Analyses in Industrial and Organizational Psychology: A Systematic Review

    PubMed Central

    Schalken, Naomi; Rietbergen, Charlotte

    2017-01-01

    Objective: The goal of this systematic review was to examine the reporting quality of the method sections of quantitative systematic reviews and meta-analyses from 2009 to 2016 in the field of industrial and organizational psychology with the help of the Meta-Analysis Reporting Standards (MARS), and to update previous research, such as the studies of Aytug et al. (2012) and Dieckmann et al. (2009). Methods: A systematic search for quantitative systematic reviews and meta-analyses was conducted in the top 10 journals in the field of industrial and organizational psychology between January 2009 and April 2016. Data were extracted on study characteristics and on the items of the method section of MARS. A cross-classified multilevel model was analyzed to test whether publication year and journal impact factor (JIF) were associated with the reporting quality scores of articles. Results: Compliance with MARS in the method section was generally inadequate in the random sample of 120 articles. Variation existed in the reporting of items. There were no significant effects of publication year or JIF on the reporting quality scores of articles. Conclusions: The reporting quality in the method sections of systematic reviews and meta-analyses was still insufficient; we therefore recommend that researchers improve the reporting in their articles by using reporting standards such as MARS. PMID:28878704

  34. Assessment of left ventricular function and mass by MR imaging: a stereological study based on the systematic slice sampling procedure.

    PubMed

    Mazonakis, Michalis; Sahin, Bunyamin; Pagonidis, Konstantin; Damilakis, John

    2011-06-01

    The aim of this study was to combine the stereological technique with magnetic resonance (MR) imaging data for the volumetric and functional analysis of the left ventricle (LV). Cardiac MR examinations were performed in 13 consecutive subjects with known or suspected coronary artery disease. The end-diastolic volume (EDV), end-systolic volume, ejection fraction (EF), and mass were estimated by stereology using the entire slice set depicting the LV and systematic sampling intensities of 1/2 and 1/3, which provided samples with every second and third slice, respectively. The repeatability of stereology was evaluated. Stereological assessments were compared with the reference values derived by manually tracing the endocardial and epicardial contours on MR images. Stereological EDV and EF estimations obtained by the 1/3 systematic sampling scheme were significantly different from those by manual delineation (P < .05). No difference was observed between the reference values and the LV parameters estimated by the entire slice set or a sampling intensity of 1/2 (P > .05). For these stereological approaches, a high correlation (r² = 0.80-0.93) and clinically acceptable limits of agreement were found with the reference method. Stereological estimations obtained by both sample sizes presented comparable coefficient of variation values of 2.9-5.8%. The mean time for stereological measurements on the entire slice set was 3.4 ± 0.6 minutes, and it was reduced to 2.5 ± 0.5 minutes with the 1/2 systematic sampling scheme. Stereological analysis on systematic samples of MR slices generated by the 1/2 sampling intensity provided efficient and quick assessment of LV volumes, function, and mass.
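
    The stereological estimate rests on Cavalieri's principle: volume is the sum of sampled slice areas times the distance each sampled slice represents. A sketch with invented short-axis areas and spacing (not the study's measurements):

    ```python
    import numpy as np

    slice_spacing_mm = 8.0  # hypothetical short-axis slice spacing
    lv_areas_mm2 = np.array([410, 820, 1150, 1320, 1370, 1290, 1100, 780, 350])

    def cavalieri_volume_ml(areas, spacing, every=1, start=0):
        """Volume from every `every`-th slice; each sampled slice represents `every` slices."""
        return areas[start::every].sum() * spacing * every / 1000.0

    full = cavalieri_volume_ml(lv_areas_mm2, slice_spacing_mm)
    half = cavalieri_volume_ml(lv_areas_mm2, slice_spacing_mm, every=2)
    print(f"EDV from all slices: {full:.0f} mL; from 1/2 systematic sampling: {half:.0f} mL")
    ```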

  35. Evaluation of Laboratory Procedures to Quantify the Neutral Detergent Fiber Content in Forage, Concentrate, and Ruminant Feces.

    PubMed

    Barbosa, Marcília Medrado; Detmann, Edenio; Rocha, Gabriel Cipriano; de Oliveira Franco, Marcia; de Campos Valadares Filho, Sebastião

    2015-01-01

    A comparison was made of measurements of neutral detergent fiber concentrations obtained with AOAC Method 2002.04 and with modified methods using pressurized environments or the direct use of industrial heat-stable α-amylase in samples of forage (n=37), concentrate (n=30), and ruminant feces (n=39). The following method modifications were tested: AOAC Method 2002.04 with replacement of the reflux apparatus with an autoclave or an Ankom220® extractor and F57 filter bags, and AOAC Method 2002.04 with replacement of the standardization procedures for α-amylase by a single addition of industrial α-amylase [250 μL of Termamyl 2X, 240 Kilo Novo Units (KNU)-T/g] prior to heating the neutral detergent solution. For the feces and forage samples, the results obtained with the modified methods using an autoclave or the modified use of α-amylase were similar to those obtained using AOAC Method 2002.04, but the use of the Ankom220 extractor resulted in overestimated values. For the concentrate samples, the modified methods using an autoclave or Ankom220 extractor resulted in positive systematic errors. However, the method using industrial α-amylase resulted in systematic error and slope bias, even though the obtained values were close to those obtained with AOAC Method 2002.04.

  36. Forest statistics for Southwest Arkansas counties

    Treesearch

    T. Richard Quick; Mary S. Hedlund

    1979-01-01

    These tables were derived from data obtained during a 1978 inventory of 20 counties comprising the Southwest Unit of Arkansas (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...
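
    Designs like this are commonly implemented as double sampling for stratification: a large photo-interpreted sample fixes the stratum weights, and ground plots correct photo misclassification. A hedged sketch of the basic area estimator with invented counts (not the Arkansas figures):

    ```python
    # Phase 1: many photo points classified forest / nonforest (invented counts).
    photo_points = {"forest": 7200, "nonforest": 2800}

    # Phase 2: ground plots in a subsample of each photo class record true cover,
    # correcting photo misclassification (invented proportions found to be forest).
    ground_forest_rate = {"forest": 0.95, "nonforest": 0.08}

    total_points = sum(photo_points.values())
    forest_proportion = sum(photo_points[c] / total_points * ground_forest_rate[c]
                            for c in photo_points)

    region_acres = 5_000_000  # hypothetical land area represented by the photo grid
    print(f"estimated forest area: {forest_proportion * region_acres:,.0f} acres")
    ```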

  37. Forest statistics for plateau Tennessee counties

    Treesearch

    Renewable Resources Evaluation Research Work Unit

    1982-01-01

    These tables were derived from data obtained during a 1980 inventory of 16 counties comprising the Plateau Unit of Tennessee (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  38. Forest statistics for Northwest Louisiana Parishes

    Treesearch

    James F. Rosson; Daniel F. Bertelson

    1985-01-01

    These tables were derived from data obtained during a 1984 inventory of 13 parishes comprising the Northwest unit of Louisiana (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  39. Forest statistics for Southwest Louisiana parishes

    Treesearch

    James F. Rosson; Daniel F. Bertelson

    1985-01-01

    These tables were derived from data obtained during a 1984 inventory of 11 parishes comprising the Southwest Unit of Louisiana (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  40. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    ERIC Educational Resources Information Center

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  41. Estimating the sample mean and standard deviation from the sample size, median, range and/or interquartile range.

    PubMed

    Wan, Xiang; Wang, Wenqian; Liu, Jiming; Tong, Tiejun

    2014-12-19

    In systematic reviews and meta-analyses, researchers often pool the results of the sample mean and standard deviation from a set of similar clinical trials. A number of the trials, however, reported the study using the median, the minimum and maximum values, and/or the first and third quartiles. Hence, in order to combine results, one may have to estimate the sample mean and standard deviation for such trials. In this paper, we propose to improve the existing literature in several directions. First, we show that the sample standard deviation estimation in Hozo et al.'s method (BMC Med Res Methodol 5:13, 2005) has some serious limitations and is always less satisfactory in practice. Inspired by this, we propose a new estimation method by incorporating the sample size. Second, we systematically study the sample mean and standard deviation estimation problem under several other interesting settings where the interquartile range is also available for the trials. We demonstrate the performance of the proposed methods through simulation studies for the three frequently encountered scenarios, respectively. For the first two scenarios, our method greatly improves existing methods and provides a nearly unbiased estimate of the true sample standard deviation for normal data and a slightly biased estimate for skewed data. For the third scenario, our method still performs very well for both normal data and skewed data. Furthermore, we compare the estimators of the sample mean and standard deviation under all three scenarios and present some suggestions on which scenario is preferred in real-world applications. In this paper, we discuss different approximation methods in the estimation of the sample mean and standard deviation and propose some new estimation methods to improve the existing literature. We conclude our work with a summary table (an Excel spreadsheet including all formulas) that serves as a comprehensive guidance for performing meta-analysis in different situations.
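
    For the first scenario (minimum, median, maximum, and sample size reported), the estimators can be written compactly. A sketch under the normality assumption the paper makes; the Blom-type order-statistic constant below is my reading of the Wan et al. formula, so verify against their published spreadsheet before use:

    ```python
    from statistics import NormalDist

    def mean_sd_from_min_median_max(a, m, b, n):
        """Estimate (mean, SD) from (min, median, max, n) for a roughly normal sample.

        Mean uses the (a + 2m + b)/4 rule; SD divides the range by the expected
        span of n standard-normal draws (a Blom-type order-statistic constant).
        """
        mean = (a + 2 * m + b) / 4
        xi = 2 * NormalDist().inv_cdf((n - 0.375) / (n + 0.25))
        return mean, (b - a) / xi

    # A trial reporting median 12, range 4-31, n = 50 (made-up numbers):
    mean, sd = mean_sd_from_min_median_max(4, 12, 31, 50)
    print(f"estimated mean = {mean:.2f}, estimated SD = {sd:.2f}")
    ```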

  3. Systematic bias in genomic classification due to contaminating non-neoplastic tissue in breast tumor samples.

    PubMed

    Elloumi, Fathi; Hu, Zhiyuan; Li, Yan; Parker, Joel S; Gulley, Margaret L; Amos, Keith D; Troester, Melissa A

    2011-06-30

    Genomic tests are available to predict breast cancer recurrence and to guide clinical decision making. These predictors provide recurrence risk scores along with a measure of uncertainty, usually a confidence interval. The confidence interval conveys random error and not systematic bias. Standard tumor sampling methods make this problematic, as it is common to have a substantial proportion (typically 30-50%) of a tumor sample comprised of histologically benign tissue. This "normal" tissue could represent a source of non-random error or systematic bias in genomic classification. To assess the sensitivity of genomic classification to systematic error from normal contamination, we collected 55 tumor samples and paired tumor-adjacent normal tissue. Using genomic signatures from the tumor and paired normal, we evaluated how increasing normal contamination altered recurrence risk scores for various genomic predictors. Simulations of normal tissue contamination caused misclassification of tumors in all predictors evaluated, but different breast cancer predictors showed different types of vulnerability to normal tissue bias. While two predictors had unpredictable direction of bias (either higher or lower risk of relapse resulted from normal contamination), one signature showed a predictable direction of normal tissue effects. Due to this predictable direction of effect, this signature (the PAM50) was adjusted for normal tissue contamination, and these corrections improved sensitivity and negative predictive value. For all three assays, quality control standards and/or appropriate bias adjustment strategies can be used to improve assay reliability. Normal tissue sampled concurrently with tumor is an important source of bias in breast genomic predictors. All genomic predictors show some sensitivity to normal tissue contamination, and ideal strategies for mitigating this bias vary depending upon the particular genes and computational methods used in the predictor.

  4. Forest statistics for West-Central Tennessee counties

    Treesearch

    Renewable Resource Evaluation Research Work Unit

    1982-01-01

    These tables were derived from data obtained during a 1980 inventory of 11 counties comprising the West Central Unit of Tennessee (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  5. Forest statistics for Southwest-South Alabama counties

    Treesearch

    Forest Inventory and Analysis Research Work Unit

    1983-01-01

    These tables were derived from data obtained during a 1982 inventory of 21 counties comprising the Southeast Unit of Alabama (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  6. Forest statistics for North-Central Alabama counties

    Treesearch

    Forest Inventory and Analysis Research Work Unit

    1983-01-01

    These tables were derived from data obtained during a 1982 inventory of 15 counties comprising the North Central Unit of Alabama (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  7. Forest statistics for West Tennessee counties

    Treesearch

    No Author Named

    1982-01-01

    These tables were derived from data obtained during a 1980 inventory of 18 counties comprising the West Unit of Tennessee (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample locations...

  8. Forest statistics for North Alabama counties

    Treesearch

    Forest Inventory and Analysis Work Unit

    1983-01-01

    These tables were derived from data obtained during a 1982 inventory of 10 counties comprising the North Unit of Alabama (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample locations...

  9. Forest statistics for Central Tennessee counties

    Treesearch

    Renewable Resources Evaluation Research Work Unit

    1981-01-01

    These tables were derived from data obtained during a 1980 inventory of 23 counties comprising the Central Unit of Tennessee (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-non-forest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  10. Forest statistics for Southwest Alabama counties

    Treesearch

    SO Southern Experiment Sta

    1983-01-01

    These tables were derived from data obtained during a 1982 inventory of 21 counties comprising the Southeast Unit of Alabama (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  11. Forest statistics for West Central Alabama counties

    Treesearch

    SO Southern Experiment Sta

    1983-01-01

    These tables were derived from data obtained during a 1982 inventory of 21 counties comprising the Southeast Unit of Alabama (fig. 1). The data on forest acreage and timber volume were secured by a systematic sampling method involving a forest-nonforest classification on aerial photographs and on-the-ground measurements of trees at sample locations. The sample...

  12. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    PubMed

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and this may affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  13. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  14. A Systematic Review of Store Audit Methods for Assessing Tobacco Marketing and Products at the Point of Sale

    PubMed Central

    Lee, Joseph G. L.; Henriksen, Lisa; Myers, Allison E.; Dauphinee, Amanda L.; Ribisl, Kurt M.

    2013-01-01

    Objective Over four-fifths of reported expenditures for marketing tobacco products occur at the retail point of sale (POS). To date, no systematic review has synthesized the methods used for surveillance of POS marketing. This review sought to describe the audit objectives, methods, and measures used to study retail tobacco environments. Methods We systematically searched 11 academic databases for papers indexed on or before March 14, 2012, identifying 2,906 papers. Two coders independently reviewed each abstract or full text to identify papers with the following criteria: (1) data collectors visited and assessed (2) retail environments using (3) a data collection instrument for (4) tobacco products or marketing. We excluded papers where limited measures of products and/or marketing were incidental. Two abstractors independently coded included papers for research aims, locale, methods, measures used, and measurement properties. We calculated descriptive statistics regarding the use of 4 P’s of marketing (product, price, placement, promotion) and for measures of study design, sampling strategy, and sample size. Results We identified 88 store audit studies. Most studies focus on enumerating the number of signs or other promotions. Several strengths, particularly in sampling, are noted, but substantial improvements are indicated in the reporting of reliability, validity, and audit procedures. Conclusions Audits of POS tobacco marketing have made important contributions to understanding industry behaviour, the uses of marketing, and resulting health behaviours. Increased emphasis on standardization and the use of theory are needed in the field. We propose key components of audit methodology that should be routinely reported. PMID:23322313

  15. Evaluating test-retest reliability in patient-reported outcome measures for older people: A systematic review.

    PubMed

    Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju

    2018-03-01

    This study aimed to evaluate the components of test-retest reliability including time interval, sample size, and statistical methods used in patient-reported outcome measures in older people and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. This systematic review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic review published by the National Evidence-based Healthcare Collaborating Agency in Korea. The methodological quality was assessed by the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 studies presented models for ICC calculations and 30 studies reported 95% confidence intervals of the ICCs. Additional analyses using 17 studies that reported a strong ICC (>0.9) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items. Particularly, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the studies that report the results of test-retest reliability. Copyright © 2017 Elsevier Ltd. All rights reserved.
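
    Because the review singles out the ICC as the dominant statistic and urges authors to state the model used, a short sketch of one common choice may help: ICC(2,1), the two-way random-effects, absolute-agreement, single-measure form of Shrout and Fleiss. The review does not prescribe this particular model; it is shown purely as an illustration, with invented scores.

        import numpy as np

        def icc_2_1(x):
            """ICC(2,1): two-way random effects, absolute agreement, single
            measure, for an (n subjects, k occasions) array of scores."""
            n, k = x.shape
            grand = x.mean()
            row_means = x.mean(axis=1)   # per-subject means
            col_means = x.mean(axis=0)   # per-occasion means
            msr = k * np.sum((row_means - grand) ** 2) / (n - 1)  # subjects
            msc = n * np.sum((col_means - grand) ** 2) / (k - 1)  # occasions
            sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
            mse = sse / ((n - 1) * (k - 1))
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Test-retest example: six patients scored twice, 13 days apart
        scores = np.array([[7, 8], [5, 5], [9, 8], [4, 5], [6, 6], [8, 9]])
        print(round(icc_2_1(scores), 3))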

  16. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reactions analysis technique is mainly based on the relative method or the use of activation cross sections. In order to validate nuclear data for the calculated cross section evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, and is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows measurements as accurate as those of the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
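
    The fundamental activation equation referred to above is, in its simplest form, A = N σ φ (1 − e^(−λ t_irr)) e^(−λ t_d). The sketch below solves it for the element mass without a standard sample; detection efficiency, gamma emission probability and self-shielding corrections are deliberately omitted, and all numerical inputs are illustrative rather than evaluated nuclear data.

        import math

        N_A = 6.02214076e23   # Avogadro's number, atoms/mol

        def atoms_from_activity(activity_bq, sigma_cm2, flux, lam, t_irr, t_dec):
            """Invert A = N*sigma*phi*(1 - exp(-lam*t_irr))*exp(-lam*t_dec)."""
            saturation = 1.0 - math.exp(-lam * t_irr)
            decay = math.exp(-lam * t_dec)
            return activity_bq / (sigma_cm2 * flux * saturation * decay)

        def element_mass_g(n_atoms, molar_mass, isotopic_abundance):
            return n_atoms * molar_mass / (N_A * isotopic_abundance)

        # Illustrative, roughly sodium-like inputs: 100 Bq measured, 1 barn
        # cross section, 1e13 n/cm^2/s flux, 15 h half-life, 6 h irradiation,
        # 2 h cooling before counting
        lam = math.log(2) / (15 * 3600)
        n = atoms_from_activity(100.0, 1e-24, 1e13, lam, 6 * 3600, 2 * 3600)
        print(f"{element_mass_g(n, 23.0, 1.0):.3e} g")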

  17. Characteristics of Qualitative Descriptive Studies: A Systematic Review

    PubMed Central

    Kim, Hyejin; Sefcik, Justine S.; Bradway, Christine

    2016-01-01

    Qualitative description (QD) is a term that is widely used to describe qualitative studies of health care and nursing-related phenomena. However, limited discussions regarding QD are found in the existing literature. In this systematic review, we identified characteristics of methods and findings reported in research articles published in 2014 whose authors identified the work as QD. After searching and screening, data were extracted from the sample of 55 QD articles and examined to characterize research objectives, design justification, theoretical/philosophical frameworks, sampling and sample size, data collection and sources, data analysis, and presentation of findings. In this review, three primary findings were identified. First, despite inconsistencies, most articles included characteristics consistent with the limited available QD definitions and descriptions. Next, flexibility or variability of methods was common and desirable for obtaining rich data and achieving understanding of a phenomenon. Finally, justification for how a QD approach was chosen and why it would be an appropriate fit for a particular study was limited in the sample and, therefore, in need of increased attention. Based on these findings, recommendations include encouragement to researchers to provide as many details as possible regarding the methods of their QD study so that readers can determine whether the methods used were reasonable and effective in producing useful findings. PMID:27686751

  18. Systematic Assessment of Seven Solvent and Solid-Phase Extraction Methods for Metabolomics Analysis of Human Plasma by LC-MS

    NASA Astrophysics Data System (ADS)

    Sitnikov, Dmitri G.; Monnin, Cian S.; Vuckovic, Dajana

    2016-12-01

    The comparison of extraction methods for global metabolomics is usually executed in biofluids only and focuses on metabolite coverage and method repeatability. This limits our detailed understanding of extraction parameters such as recovery and matrix effects and prevents side-by-side comparison of different sample preparation strategies. To address this gap in knowledge, seven solvent-based and solid-phase extraction methods were systematically evaluated using standard analytes spiked into both buffer and human plasma. We compared recovery, coverage, repeatability, matrix effects, selectivity and orthogonality of all methods tested for the non-lipid metabolome in combination with reversed-phase and mixed-mode liquid chromatography mass spectrometry analysis (LC-MS). Our results confirmed wide selectivity and excellent precision of solvent precipitations, but revealed their high susceptibility to matrix effects. The use of all seven methods showed high overlap and redundancy, which resulted in metabolite coverage increases of 34-80% depending on the LC-MS method employed as compared to the best single extraction protocol (methanol/ethanol precipitation), despite a 7x increase in MS analysis time and sample consumption. The most orthogonal methods to methanol-based precipitation were ion-exchange solid-phase extraction and liquid-liquid extraction using methyl tert-butyl ether. Our results help facilitate rational design and selection of sample preparation methods and internal standards for global metabolomics.

  19. Systematic Assessment of Seven Solvent and Solid-Phase Extraction Methods for Metabolomics Analysis of Human Plasma by LC-MS

    PubMed Central

    Sitnikov, Dmitri G.; Monnin, Cian S.; Vuckovic, Dajana

    2016-01-01

    The comparison of extraction methods for global metabolomics is usually executed in biofluids only and focuses on metabolite coverage and method repeatability. This limits our detailed understanding of extraction parameters such as recovery and matrix effects and prevents side-by-side comparison of different sample preparation strategies. To address this gap in knowledge, seven solvent-based and solid-phase extraction methods were systematically evaluated using standard analytes spiked into both buffer and human plasma. We compared recovery, coverage, repeatability, matrix effects, selectivity and orthogonality of all methods tested for the non-lipid metabolome in combination with reversed-phase and mixed-mode liquid chromatography mass spectrometry analysis (LC-MS). Our results confirmed wide selectivity and excellent precision of solvent precipitations, but revealed their high susceptibility to matrix effects. The use of all seven methods showed high overlap and redundancy, which resulted in metabolite coverage increases of 34–80% depending on the LC-MS method employed as compared to the best single extraction protocol (methanol/ethanol precipitation), despite a 7x increase in MS analysis time and sample consumption. The most orthogonal methods to methanol-based precipitation were ion-exchange solid-phase extraction and liquid-liquid extraction using methyl tert-butyl ether. Our results help facilitate rational design and selection of sample preparation methods and internal standards for global metabolomics. PMID:28000704

  20. IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.

    PubMed

    Bayard, David S; Schumitzky, Alan

    2010-03-01

    This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
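
    As a minimal illustration of the particle-filter half of the scheme (the forward dynamic-programming half is omitted), the sketch below runs a bootstrap filter on a toy scalar system whose dc gain has unknown sign, loosely echoing the pendulum example. The dynamics, noise levels and inputs are invented for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1000
        x = rng.normal(0.0, 1.0, N)                    # state particles
        b = rng.choice([-1.0, 1.0], N) * rng.uniform(0.5, 1.5, N)  # unknown gain
        w = np.full(N, 1.0 / N)                        # particle weights

        def step(x, b, w, u, y, q=0.1, r=0.2):
            x = x + b * u + rng.normal(0.0, q, x.size)   # predict through dynamics
            w = w * np.exp(-0.5 * ((y - x) / r) ** 2)    # weight by obs. likelihood
            w = w / w.sum()
            if 1.0 / np.sum(w ** 2) < x.size / 2:        # resample if degenerate
                i = rng.choice(x.size, x.size, p=w)
                x, b, w = x[i], b[i], np.full(x.size, 1.0 / x.size)
            return x, b, w

        # A few probing inputs let the filter resolve the sign of the gain
        for u, y in [(1.0, 0.9), (1.0, 1.8), (-1.0, 0.8)]:
            x, b, w = step(x, b, w, u, y)
        print("P(gain > 0) ~", w[b > 0].sum())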

  1. Sequence Capture versus Restriction Site Associated DNA Sequencing for Shallow Systematics.

    PubMed

    Harvey, Michael G; Smith, Brian Tilston; Glenn, Travis C; Faircloth, Brant C; Brumfield, Robb T

    2016-09-01

    Sequence capture and restriction site associated DNA sequencing (RAD-Seq) are two genomic enrichment strategies for applying next-generation sequencing technologies to systematics studies. At shallow timescales, such as within species, RAD-Seq has been widely adopted among researchers, although there has been little discussion of the potential limitations and benefits of RAD-Seq and sequence capture. We discuss a series of issues that may impact the utility of sequence capture and RAD-Seq data for shallow systematics in non-model species. We review prior studies that used both methods, and investigate differences between the methods by re-analyzing existing RAD-Seq and sequence capture data sets from a Neotropical bird (Xenops minutus). We suggest that the strengths of RAD-Seq data sets for shallow systematics are the wide dispersion of markers across the genome, the relative ease and cost of laboratory work, the deep coverage and read overlap at recovered loci, and the high overall information that results. Sequence capture's benefits include flexibility and repeatability in the genomic regions targeted, success using low-quality samples, more straightforward read orthology assessment, and higher per-locus information content. The utility of a method in systematics, however, rests not only on its performance within a study, but on the comparability of data sets and inferences with those of prior work. In RAD-Seq data sets, comparability is compromised by low overlap of orthologous markers across species and the sensitivity of genetic diversity in a data set to an interaction between the level of natural heterozygosity in the samples examined and the parameters used for orthology assessment. In contrast, sequence capture of conserved genomic regions permits interrogation of the same loci across divergent species, which is preferable for maintaining comparability among data sets and studies for the purpose of drawing general conclusions about the impact of historical processes across biotas. We argue that sequence capture should be given greater attention as a method of obtaining data for studies in shallow systematics and comparative phylogeography. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Human body mass estimation: a comparison of "morphometric" and "mechanical" methods.

    PubMed

    Auerbach, Benjamin M; Ruff, Christopher B

    2004-12-01

    In the past, body mass was reconstructed from hominin skeletal remains using both "mechanical" methods which rely on the support of body mass by weight-bearing skeletal elements, and "morphometric" methods which reconstruct body mass through direct assessment of body size and shape. A previous comparison of two such techniques, using femoral head breadth (mechanical) and stature and bi-iliac breadth (morphometric), indicated a good general correspondence between them (Ruff et al. [1997] Nature 387:173-176). However, the two techniques were never systematically compared across a large group of modern humans of diverse body form. This study incorporates skeletal measures taken from 1,173 Holocene adult individuals, representing diverse geographic origins, body sizes, and body shapes. Femoral head breadth, bi-iliac breadth (after pelvic rearticulation), and long bone lengths were measured on each individual. Statures were estimated from long bone lengths using appropriate reference samples. Body masses were calculated using three available femoral head breadth (FH) formulae and the stature/bi-iliac breadth (STBIB) formula, and compared. All methods yielded similar results. Correlations between FH estimates and STBIB estimates are 0.74-0.81. Slight differences in results between the three FH estimates can be attributed to sampling differences in the original reference samples, and in particular, the body-size ranges included in those samples. There is no evidence for systematic differences in results due to differences in body proportions. Since the STBIB method was validated on other samples, and the FH methods produced similar estimates, this argues that either may be applied to skeletal remains with some confidence. 2004 Wiley-Liss, Inc.

  3. A systematic review of store audit methods for assessing tobacco marketing and products at the point of sale.

    PubMed

    Lee, Joseph G L; Henriksen, Lisa; Myers, Allison E; Dauphinee, Amanda L; Ribisl, Kurt M

    2014-03-01

    Over four-fifths of reported expenditures for marketing tobacco products occur at the retail point of sale (POS). To date, no systematic review has synthesised the methods used for surveillance of POS marketing. This review sought to describe the audit objectives, methods and measures used to study retail tobacco environments. We systematically searched 11 academic databases for papers indexed on or before 14 March 2012, identifying 2906 papers. Two coders independently reviewed each abstract or full text to identify papers with the following criteria: (1) data collectors visited and assessed (2) retail environments using (3) a data collection instrument for (4) tobacco products or marketing. We excluded papers where limited measures of products and/or marketing were incidental. Two abstractors independently coded included papers for research aims, locale, methods, measures used and measurement properties. We calculated descriptive statistics regarding the use of four P's of marketing (product, price, placement, promotion) and for measures of study design, sampling strategy and sample size. We identified 88 store audit studies. Most studies focus on enumerating the number of signs or other promotions. Several strengths, particularly in sampling, are noted, but substantial improvements are indicated in the reporting of reliability, validity and audit procedures. Audits of POS tobacco marketing have made important contributions to understanding industry behaviour, the uses of marketing and resulting health behaviours. Increased emphasis on standardisation and the use of theory are needed in the field. We propose key components of audit methodology that should be routinely reported.

  4. A Systematic Evaluation of ADHD and Comorbid Psychopathology in a Population-Based Twin Sample

    ERIC Educational Resources Information Center

    Volk, Heather E.; Neuman, Rosalind J.; Todd, Richard D.

    2005-01-01

    Objective: Clinical and population samples demonstrate that attention-deficit/hyperactivity disorder (ADHD) occurs with other disorders. Comorbid disorder clustering within ADHD subtypes is not well studied. Method: Latent class analysis (LCA) examined the co-occurrence of DSM-IV ADHD, oppositional defiant disorder (ODD), conduct disorder (CD),…

  5. Quantification of proteins in urine samples using targeted mass spectrometry methods.

    PubMed

    Khristenko, Nina; Domon, Bruno

    2015-01-01

    Numerous clinical proteomics studies focus on the development of biomarkers to improve either diagnostics for early disease detection or the monitoring of the response to treatment. Although a wealth of biomarker candidates is available, their evaluation and validation in a true clinical setting remain challenging. In biomarker evaluation studies, a panel of proteins of interest is systematically analyzed in a large cohort of samples. However, despite the latest progress in mass spectrometry, the consistent detection of pertinent proteins in highly complex biological samples is still a challenging task. Thus, targeted LC-MS/MS methods are better suited than shotgun approaches for the systematic analysis of biomarkers. This chapter describes the workflow used to perform targeted quantitative analyses of proteins in urinary samples. The peptides, as surrogates of the proteins of interest, are commonly measured using a triple quadrupole mass spectrometer operated in selected reaction monitoring (SRM) mode. More recently, advances in targeted LC-MS/MS analysis based on parallel reaction monitoring (PRM) performed on a quadrupole-orbitrap instrument have made it possible to increase the specificity and selectivity of the measurements.

  6. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey.

    PubMed

    Kahale, Lara A; Diab, Batoul; Brignardello-Petersen, Romina; Agarwal, Arnav; Mustafa, Reem A; Kwong, Joey; Neumann, Ignacio; Li, Ling; Lopes, Luciane Cruz; Briel, Matthias; Busse, Jason W; Iorio, Alfonso; Vandvik, Per Olav; Alexander, Paul Elias; Guyatt, Gordon; Akl, Elie A

    2018-07-01

    To describe how systematic review authors report and address categories of trial participants with potential missing outcome data. Methodological survey of systematic reviews reporting a group-level meta-analysis. We included a random sample of 50 Cochrane and 50 non-Cochrane systematic reviews. Of these, 25 reported in their methods section a plan to consider at least one of the 10 categories of missing outcome data; 42 reported in their results data for at least one category of missing data. The most reported category in the methods and results sections was "unexplained loss to follow-up" (n = 34 in the methods section and n = 6 in the results section). Only 19 reported a method to handle missing data in their primary analyses, which was most often complete case analysis. Few reviews (n = 9) reported in the methods section conducting sensitivity analysis to judge risk of bias associated with missing outcome data at the level of the meta-analysis, and only five of them presented the results of these analyses in the results section. Most systematic reviews do not explicitly report sufficient information on categories of trial participants with potential missing outcome data or address missing data in their primary analyses. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Comprehensive comparative analysis of 5'-end RNA-sequencing methods.

    PubMed

    Adiconis, Xian; Haber, Adam L; Simmons, Sean K; Levy Moonshine, Ami; Ji, Zhe; Busby, Michele A; Shi, Xi; Jacques, Justin; Lancaster, Madeline A; Pan, Jen Q; Regev, Aviv; Levin, Joshua Z

    2018-06-04

    Specialized RNA-seq methods are required to identify the 5' ends of transcripts, which are critical for studies of gene regulation, but these methods have not been systematically benchmarked. We directly compared six such methods, including the performance of five methods on a single human cellular RNA sample and a new spike-in RNA assay that helps circumvent challenges resulting from uncertainties in annotation and RNA processing. We found that the 'cap analysis of gene expression' (CAGE) method performed best for mRNA and that most of its unannotated peaks were supported by evidence from other genomic methods. We applied CAGE to eight brain-related samples and determined sample-specific transcription start site (TSS) usage, as well as a transcriptome-wide shift in TSS usage between fetal and adult brain.

  8. Effects of sampling strategy, detection probability, and independence of counts on the use of point counts

    USGS Publications Warehouse

    Pendleton, G.W.; Ralph, C. John; Sauer, John R.; Droege, Sam

    1995-01-01

    Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation in detection probabilities and lack of independence among sample points can bias estimates and measures of precision. All of these factors should be considered when using point count methods.

  9. A method of lead determination in human teeth by energy dispersive X-ray fluorescence (EDXRF).

    PubMed

    Sargentini-Maier, M L; Frank, R M; Leroy, M J; Turlot, J C

    1988-12-01

    A systematic sampling procedure was combined with energy dispersive X-ray fluorescence (EDXRF) to study lead content and its variation in human teeth. On serial ground sections cut from unembedded permanent teeth of inhabitants of Strasbourg with a special rotating diamond disk, two series of 500-micron punch biopsies were taken systematically in 5 directions from the tooth surface to the inner pulpal dentine with a micro-punching unit. In addition, pooled fragments of enamel and dentine were prepared for each tooth. For each punched fragment or pooled sample, lead content was determined after dissolution in ultrapure nitric acid, deposition on a 4-micron-thick polypropylene film, and irradiation with a Siemens EDXRF prototype featuring direct sample excitation by a high-power X-ray tube with a molybdenum anode. Fluorescence was detected by a Si(Li) detector, and calcium was used as an internal standard. This technique allowed rapid, automatic, multielement, and non-destructive analysis of microsamples with good detection limits.

  10. Rhetorical Structure of Education Research Article Methods Sections

    ERIC Educational Resources Information Center

    Zhang, Baoya; Wannaruk, Anchalee

    2016-01-01

    This study investigated the rhetorical move structure of the education research article genre within the framework of Swales' (1981, 1990, 2004) move analysis. A corpus of 120 systematically sampled empirical education research articles served as data input for the analysis. The results indicate that the education research article methods section…

  11. Vitamin concentrations in human milk vary with time within feed, circadian rhythm, and single-dose supplementation

    USDA-ARS?s Scientific Manuscript database

    Importance: Human milk is the subject of many nutrition studies but methods for representative sample collection are not established. Our recently improved, validated methods for analyzing micronutrients in human milk now enable systematic study of factors affecting their concentration. Objective...

  12. Fat- and water-soluble vitamin concentrations in human milk: effects of collection protocol, circadian variation and acute maternal supplementation

    USDA-ARS?s Scientific Manuscript database

    Importance: Human milk is the subject of many nutrition studies but methods for representative sample collection are not established. Our recently improved, validated methods for analyzing micronutrients in human milk now enable systematic study of factors affecting their concentration. Objective:...

  13. Characteristics of Qualitative Descriptive Studies: A Systematic Review.

    PubMed

    Kim, Hyejin; Sefcik, Justine S; Bradway, Christine

    2017-02-01

    Qualitative description (QD) is a term that is widely used to describe qualitative studies of health care and nursing-related phenomena. However, limited discussions regarding QD are found in the existing literature. In this systematic review, we identified characteristics of methods and findings reported in research articles published in 2014 whose authors identified the work as QD. After searching and screening, data were extracted from the sample of 55 QD articles and examined to characterize research objectives, design justification, theoretical/philosophical frameworks, sampling and sample size, data collection and sources, data analysis, and presentation of findings. In this review, three primary findings were identified. First, although there were some inconsistencies, most articles included characteristics consistent with the limited available QD definitions and descriptions. Next, flexibility or variability of methods was common and effective for obtaining rich data and achieving understanding of a phenomenon. Finally, justification for how a QD approach was chosen and why it would be an appropriate fit for a particular study was limited in the sample and, therefore, in need of increased attention. Based on these findings, recommendations include encouragement to researchers to provide as many details as possible regarding the methods of their QD studies so that readers can determine whether the methods used were reasonable and effective in producing useful findings. © 2016 Wiley Periodicals, Inc.

  14. A survey method for characterizing daily life experience: the day reconstruction method.

    PubMed

    Kahneman, Daniel; Krueger, Alan B; Schkade, David A; Schwarz, Norbert; Stone, Arthur A

    2004-12-03

    The Day Reconstruction Method (DRM) assesses how people spend their time and how they experience the various activities and settings of their lives, combining features of time-budget measurement and experience sampling. Participants systematically reconstruct their activities and experiences of the preceding day with procedures designed to reduce recall biases. The DRM's utility is shown by documenting close correspondences between the DRM reports of 909 employed women and established results from experience sampling. An analysis of the hedonic treadmill shows the DRM's potential for well-being research.

  15. A novel approach to non-biased systematic random sampling: a stereologic estimate of Purkinje cells in the human cerebellum.

    PubMed

    Agashiwala, Rajiv M; Louis, Elan D; Hof, Patrick R; Perl, Daniel P

    2008-10-21

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm3, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well.
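
    In outline, the extrapolation described above multiplies a Cavalieri-style volume estimate from the systematically sampled sections by a numerical density obtained from counting frames. The sketch below uses invented figures, chosen only to land near the densities and totals reported in the abstract.

        # Hedged sketch of the density-times-volume extrapolation; all
        # numbers are illustrative, not the study's data.
        section_spacing_cm = 1.0                            # systematic sections
        section_areas_cm2 = [22.5, 25.1, 24.0, 23.2, 21.7]  # traced per section

        volume_cm3 = section_spacing_cm * sum(section_areas_cm2)  # Cavalieri

        cells_counted = 412                  # cells seen in all counting frames
        frames_volume_cm3 = 2.2e-3           # summed volume of those frames
        density = cells_counted / frames_volume_cm3         # cells per cm^3

        total_cells = density * volume_cm3
        print(f"{density:,.0f} cells/cm^3 -> {total_cells / 1e6:.1f} million")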

  16. A novel approach to non-biased systematic random sampling: A stereologic estimate of Purkinje cells in the human cerebellum

    PubMed Central

    Agashiwala, Rajiv M.; Louis, Elan D.; Hof, Patrick R.; Perl, Daniel P.

    2010-01-01

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm3, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well. PMID:18725208

  17. Sequential sampling of ribes populations in the control of white pine blister rust (Cronartium ribicola Fischer) in California

    Treesearch

    Harold R. Offord

    1966-01-01

    Sequential sampling based on a negative binomial distribution of ribes populations required less than half the time taken by regular systematic line transect sampling in a comparison test. It gave the same control decision as the regular method in 9 of 13 field trials. A computer program that permits sequential plans to be built readily for other white pine regions is...

  18. International recommendation for a comprehensive neuropathologic workup of epilepsy surgery brain tissue: A consensus Task Force report from the ILAE Commission on Diagnostic Methods.

    PubMed

    Blümcke, Ingmar; Aronica, Eleonora; Miyata, Hajime; Sarnat, Harvey B; Thom, Maria; Roessler, Karl; Rydenhag, Bertil; Jehi, Lara; Krsek, Pavel; Wiebe, Samuel; Spreafico, Roberto

    2016-03-01

    Epilepsy surgery is an effective treatment in many patients with drug-resistant focal epilepsies. An early decision for surgical therapy is facilitated by a magnetic resonance imaging (MRI)-visible brain lesion congruent with the electrophysiologically abnormal brain region. Recent advances in the pathologic diagnosis and classification of epileptogenic brain lesions are helpful for clinical correlation, outcome stratification, and patient management. However, application of international consensus classification systems to common epileptic pathologies (e.g., focal cortical dysplasia [FCD] and hippocampal sclerosis [HS]) necessitates standardized protocols for neuropathologic workup of epilepsy surgery specimens. To this end, the Task Force of Neuropathology from the International League Against Epilepsy (ILAE) Commission on Diagnostic Methods developed a consensus standard operational procedure for tissue inspection, distribution, and processing. The aims are to provide a systematic framework for histopathologic workup, meeting minimal standards and maximizing current and future opportunities for morphofunctional correlations and molecular studies for both clinical care and research. Whenever feasible, anatomically intact surgical specimens are desirable to enable systematic analysis in selective hippocampectomies, temporal lobe resections, and lesional or nonlesional neocortical samples. Correct orientation of sample and the sample's relation to neurophysiologically aberrant sites requires good communication between pathology and neurosurgical teams. Systematic tissue sampling of 5-mm slabs along a defined anatomic axis and application of a limited immunohistochemical panel will ensure a reliable differential diagnosis of main pathologies encountered in epilepsy surgery. Wiley Periodicals, Inc. © 2016 International League Against Epilepsy.

  19. Laser Induced Rotation of a Levitated Sample in Vacuum

    NASA Technical Reports Server (NTRS)

    Rhim, W. K.; Paradis, P. F.

    1999-01-01

    A method of systematically controlling the rotational state of a sample levitated in high vacuum using photon pressure is described. A zirconium sphere was levitated in the high-temperature electrostatic levitator and rotated by irradiating it with a narrow, high-power laser beam aimed at a spot off its center of mass.

  20. Sampling methods for the study of pneumococcal carriage: a systematic review.

    PubMed

    Gladstone, R A; Jefferies, J M; Faust, S N; Clarke, S C

    2012-11-06

    Streptococcus pneumoniae is an important pathogen worldwide. Accurate sampling of S. pneumoniae carriage is central to surveillance studies before and following conjugate vaccination programmes to combat pneumococcal disease. Any bias introduced during sampling will affect downstream recovery and typing. Many variables exist for the method of collection and initial processing, which can make inter-laboratory or international comparisons of data complex. In February 2003, a World Health Organisation working group published a standard method for the detection of pneumococcal carriage for vaccine trials to reduce or eliminate variability. We sought to describe the variables associated with the sampling of S. pneumoniae from collection to storage in the context of the methods recommended by the WHO and those used in pneumococcal carriage studies since its publication. A search of published literature in the online PubMed database was performed on 1 June 2012 to identify published studies that collected pneumococcal carriage isolates, conducted after the publication of the WHO standard method. After undertaking a systematic analysis of the literature, we show that a number of differences in pneumococcal sampling protocol continue to exist between studies since the WHO publication. The majority of studies sample from the nasopharynx, but the choice of swab and swab transport media is more variable between studies. At present there is insufficient experimental data to support the optimal sensitivity of any standard method. This may have contributed to incomplete adoption of the primary stages of the WHO detection protocol, alongside pragmatic or logistical issues associated with study design. Consequently, studies may not provide a true estimate of pneumococcal carriage. Optimal sampling of carriage could lead to improvements in downstream analysis and in the evaluation of pneumococcal vaccine impact and extrapolation to pneumococcal disease control; further in-depth comparisons would therefore be of value. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
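
    For readers who want to see the main probability designs side by side, here is a small sketch of simple random, systematic, and stratified sampling over a toy patient register; the register and its fields are invented for illustration.

        import random

        random.seed(42)
        population = [{"id": i, "unit": random.choice(["cardiology", "surgery"])}
                      for i in range(1000)]

        # Simple random sampling: equal, independent chance for every element
        srs = random.sample(population, 50)

        # Systematic sampling: a random start, then every k-th element
        k = len(population) // 50                # sampling interval
        start = random.randrange(k)
        systematic = population[start::k][:50]

        # Stratified sampling: proportional random draws within each stratum
        strata = {}
        for p in population:
            strata.setdefault(p["unit"], []).append(p)
        stratified = [p for members in strata.values()
                      for p in random.sample(members, round(50 * len(members) / 1000))]

        print(len(srs), len(systematic), len(stratified))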

  2. Comparability of HbA1c and lipids measured with dried blood spot versus venous samples: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background Levels of haemoglobin A1c (HbA1c) and blood lipids are important determinants of risk in patients with diabetes. Standard analysis methods based upon venous blood samples can be logistically challenging in resource-poor settings where much of the diabetes epidemic is occurring. Dried blood spots (DBS) provide a simple alternative method for sample collection but the comparability of data from analyses based on DBS is not well established. Methods We conducted a systematic review and meta-analysis to define the association of findings for HbA1c and blood lipids for analyses based upon standard methods compared to DBS. The Cochrane, Embase and Medline databases were searched for relevant reports and summary regression lines were estimated. Results 705 abstracts were found by the initial electronic search with 6 further reports identified by manual review of the full papers. 16 studies provided data for one or more outcomes of interest. There was a close agreement between the results for HbA1c assays based on venous and DBS samples (DBS = 0.9858 × venous + 0.3809), except for assays based upon affinity chromatography. Significant adjustment was required for assays of total cholesterol (DBS = 0.6807 × venous + 1.151) but results for triglycerides (DBS = 0.9557 × venous + 0.1427) were directly comparable. Conclusions For HbA1c and selected blood lipids, assays based on DBS samples are clearly associated with assays based on standard venous samples. There are, however, significant uncertainties about the nature of these associations and there is a need for standardisation of the sample collection, transportation, storage and analysis methods before the technique can be considered mainstream. This should be a research priority because better elucidation of metabolic risks in resource poor settings, where venous sampling is infeasible, will be key to addressing the global epidemic of cardiovascular diseases. PMID:25045323
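
    As a worked illustration, the summary regression lines quoted above can be inverted to place a DBS result on the venous scale. The sketch uses the coefficients exactly as reported in the abstract; units, assay-specific exceptions (e.g., affinity chromatography for HbA1c) and the paper's caveats about residual uncertainty still apply.

        # Invert the reported summary lines DBS = a * venous + b to convert a
        # dried-blood-spot value to its venous equivalent.
        COEFFS = {                     # analyte: (slope a, intercept b)
            "hba1c": (0.9858, 0.3809),
            "total_cholesterol": (0.6807, 1.151),
            "triglycerides": (0.9557, 0.1427),
        }

        def dbs_to_venous(analyte, dbs_value):
            a, b = COEFFS[analyte]
            return (dbs_value - b) / a

        print(round(dbs_to_venous("hba1c", 7.2), 2))
        print(round(dbs_to_venous("total_cholesterol", 4.5), 2))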

  3. The inference of vector magnetic fields from polarization measurements with limited spectral resolution

    NASA Technical Reports Server (NTRS)

    Lites, B. W.; Skumanich, A.

    1985-01-01

    A method is presented for recovery of the vector magnetic field and thermodynamic parameters from polarization measurements of photospheric line profiles obtained with filtergraphs. The method includes magneto-optic effects and may be utilized on data sampled at arbitrary wavelengths within the line profile. The accuracy of this method is explored through inversion of synthetic Stokes profiles subjected to varying levels of random noise, instrumental wavelength resolution, and line profile sampling. The level of error introduced by the systematic effect of profile sampling over a finite fraction of the 5 minute oscillation cycle is also investigated. The results presented here are intended to guide instrumental design and observational procedure.

  4. A comparative study of Conroy and Monte Carlo methods applied to multiple quadratures and multiple scattering

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Fluellen, A.

    1978-01-01

    An efficient numerical method of multiple quadratures, the Conroy method, is applied to the problem of computing multiple scattering contributions in the radiative transfer through realistic planetary atmospheres. A brief error analysis of the method is given and comparisons are drawn with the more familiar Monte Carlo method. Both methods are stochastic problem-solving models of a physical or mathematical process and utilize a sampling scheme for points distributed over a definite region. In the Monte Carlo scheme the sample points are distributed randomly over the integration region. In the Conroy method, the sample points are distributed systematically, such that the point distribution forms a unique, closed, symmetrical pattern which effectively fills the region of the multidimensional integration. The methods are illustrated by two simple examples: one, of multidimensional integration involving two independent variables, and the other, of computing the second order scattering contribution to the sky radiance.
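
    The contrast between the two schemes can be seen on a toy two-dimensional integral. In the sketch below a Fibonacci lattice stands in for Conroy's closed symmetric pattern; it is a generic systematic point set, not his actual construction, and the integrand is chosen only for convenience.

        import math
        import numpy as np

        f = lambda x, y: np.exp(-(x ** 2 + y ** 2))
        n = 987                          # Fibonacci number F_16
        rng = np.random.default_rng(1)

        # Monte Carlo: points distributed randomly over the unit square
        mc = f(rng.random(n), rng.random(n)).mean()

        # Systematic: a rank-1 Fibonacci lattice (generator F_15 = 610)
        i = np.arange(n)
        lat = f(i / n, (i * 610 % n) / n).mean()

        exact = (math.sqrt(math.pi) / 2 * math.erf(1.0)) ** 2
        print(f"MC error: {abs(mc - exact):.1e}  lattice error: {abs(lat - exact):.1e}")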

  5. Exploring Self-Perceived Growth in a Clinical Sample of Severely Traumatized Youth

    ERIC Educational Resources Information Center

    Glad, Kristin Alve; Jensen, Tine K.; Holt, Tonje; Ormhaug, Silje Morup

    2013-01-01

    Objective: The aims of this study were threefold: (1) examine the prevalence of Posttraumatic Growth (PTG) among severely traumatized youth, (2) systematically describe the PTG reported, and (3) study the course of PTG from pre- to post-treatment. Method: The sample consisted of 148 severely traumatized Norwegian youth (M age = 15, SD = 2.2, 79.1%…

  6. Chemical separation of Nd from geological samples for chronological studies using (146)Sm-(142)Nd and (147)Sm-(143)Nd systematics.

    PubMed

    Kagami, Saya; Yokoyama, Tetsuya

    2016-09-21

    Sm-Nd dating, which involves long-lived (147)Sm-(143)Nd and short-lived (146)Sm-(142)Nd systematics, has been widely used in the field of geosciences. To obtain precise and accurate ages of geological samples, the determination of highly precise Nd isotope ratios with nearly complete removal of Ce and Sm is indispensable to avoid mass spectral interference. In this study, we developed a three-step column chemistry procedure for separating Nd from geological samples that includes cation exchange chromatography for separating major elements from rare earth elements (REEs), oxidative extraction chromatography using Ln Resin coupled with HNO3 + KBrO3 for separating tetravalent Ce from the remaining REEs, and final purification of Nd using Ln Resin. This method enables high recovery of Nd (>91%) with effective separation of Nd from Ce and Sm (Ce/Nd < 1.2 × 10(-5) and Sm/Nd < 5.2 × 10(-6)). In addition, we devised a new method for determining Sm/Nd ratios with the isotope dilution inductively coupled plasma mass spectrometry method using (145)Nd- and (149)Sm-enriched spikes coupled with a group separation of REEs using TRU Resin. Applying the techniques developed in this study, we determined the Sm-Nd whole-rock isochron age of basaltic eucrites, yielding 4577 (+55/−88) Ma and 4558 ± 300 Ma for (146)Sm-(142)Nd and (147)Sm-(143)Nd systematics, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Non-destructive identification of unknown minor phases in polycrystalline bulk alloys using three-dimensional X-ray diffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yiming, E-mail: yangyiming1988@outlook.com

    Minor phases make considerable contributions to the mechanical and physical properties of metals and alloys. Unfortunately, it is difficult to identify unknown minor phases in a bulk polycrystalline material using conventional metallographic methods. Here, a non-destructive method based on three-dimensional X-ray diffraction (3DXRD) is developed to solve this problem. Simulation results demonstrate that this method is simultaneously able to identify minor phase grains and reveal their positions, orientations and sizes within bulk alloys. According to systematic simulations, the 3DXRD method is practicable for an extensive sample set, including polycrystalline alloys with hexagonal, orthorhombic and cubic minor phases. Experiments were also conducted to confirm the simulation results. The results for a bulk sample of aluminum alloy AA6061 show that the crystal grains of an unexpected γ-Fe (austenite) phase can be identified, three-dimensionally and non-destructively. Therefore, we conclude that the 3DXRD method is a powerful tool for the identification of unknown minor phases in bulk alloys belonging to a variety of crystal systems. This method also has the potential to be used for in situ observations of the effects of minor phases on the crystallographic behaviors of alloys. Highlights: • A method based on 3DXRD is developed for the identification of unknown minor phases. • Grain position, orientation and size are acquired simultaneously. • A systematic simulation demonstrated the applicability of the proposed method. • Experimental results on an AA6061 sample confirmed the practicability of the method.

  8. A comparative study of clock rate and drift estimation

    NASA Technical Reports Server (NTRS)

    Breakiron, Lee A.

    1994-01-01

    Five different methods of drift determination and four different methods of rate determination were compared using months of hourly phase and frequency data from a sample of cesium clocks and active hydrogen masers. Linear least squares on frequency is selected as the optimal method of determining both drift and rate, more on the basis of parameter parsimony and confidence measures than on random and systematic errors.
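    A minimal sketch of the selected estimator, linear least squares on frequency, where the fitted intercept estimates the rate (frequency offset) and the fitted slope estimates the drift (all numbers below are hypothetical, not values from the study):

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(720.0)                       # one month of hourly frequency samples
        rate_true, drift_true = 2e-13, 1e-16       # hypothetical offset and per-hour drift
        y = rate_true + drift_true * t + 5e-15 * rng.standard_normal(t.size)

        drift_est, rate_est = np.polyfit(t, y, 1)  # slope = drift, intercept = rate
        print(drift_est, rate_est)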

  9. The adjusting factor method for weight-scaling truckloads of mixed hardwood sawlogs

    Treesearch

    Edward L. Adams

    1976-01-01

    A new method of weight-scaling truckloads of mixed hardwood sawlogs systematically adjusts for changes in the weight/volume ratio of logs coming into a sawmill. It uses a conversion factor based on the running average of weight/volume ratios of randomly selected sample loads. A test of the method indicated that over a period of time the weight-scaled volume should...
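    A minimal sketch of the running-average idea described above; the window size, units, and numbers are illustrative assumptions, not parameters from the study:

        class AdjustingFactor:
            """Weight-scaling with a conversion factor updated from sample loads."""

            def __init__(self, window=10):
                self.ratios = []                   # weight/volume ratios of sampled loads
                self.window = window

            def add_sample_load(self, weight_lb, volume_bf):
                self.ratios.append(weight_lb / volume_bf)
                self.ratios = self.ratios[-self.window:]

            def volume_from_weight(self, truck_weight_lb):
                factor = sum(self.ratios) / len(self.ratios)   # running average, lb per board foot
                return truck_weight_lb / factor

        scaler = AdjustingFactor()
        scaler.add_sample_load(42_000, 3_800)      # randomly selected sample loads
        scaler.add_sample_load(39_500, 3_500)
        print(scaler.volume_from_weight(40_000))   # estimated board feet for a weighed load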

  10. Analysis of black carbon molecular markers by two chromatographic methods (GC-FID and HPLC-DAD)

    NASA Astrophysics Data System (ADS)

    Schneider, Maximilian P. W.; Smittenberg, Rienk H.; Dittmar, Thorsten; Schmidt, Michael W. I.

    2010-05-01

    The analysis of benzenepolycarboxylic acids (BPCA) as a quantitative measure for black carbon (BC) in soil and sediment samples is a well-established method [1, 2]. Briefly, the oxidation of polycondensed BC molecules forms seven molecular markers, which can be assigned to BC and subsequently quantified by GC-FID (gas chromatography with flame ionization detection). Recently this method has been refined for BC quantification in seawater samples by measuring BPCA with HPLC-DAD (high performance liquid chromatography with diode array detection) [3]. However, a systematic comparison of BC as determined by the two analytical techniques, essential for the calculation of global BC budgets, is lacking. Here we present data for a systematic comparison of the two BPCA methods, with respect to both quantity and quality. We prepared chars under well-defined laboratory conditions: chestnut hardwood chips and rice straw were pyrolysed at temperatures between 200 and 1000°C under a constant N₂ stream. The BC contents of the chars were analysed using the BPCA extraction method followed by either GC-FID or HPLC-DAD quantification [4]. It appears that the GC-FID method yields systematically lower concentrations of BPCA in the chars than the HPLC-DAD method. Possible reasons for the observed difference are i) higher losses of sample material during preparation for GC-FID; ii) different quality of the linear regression used for quantification; iii) incomplete derivatisation of B5CA and B6CA, which is needed for GC-FID analysis. In a next step, we will test different derivatisation procedures (methylation with dimethyl sulfate or diazomethane, and silylation) for their influence on the GC-FID results. The aim of this study is to test whether black carbon can be quantified in soil, sediment and water samples using one single method - a crucial step when attempting a global BC budget. References: [1] Brodowski, S., Rodionov, A., Haumeier, L., Glaser, B., Amelung, W. (2005) Org. Geochem. 36, 1299-1310. [2] Glaser, B., Haumeier, L., Guggenberger, G., Zech, W. (1998) Org. Geochem. 29, 811-819. [3] Dittmar, T. (2008) Org. Geochem. 39, 396-407. [4] Schneider, M.P.W., Hilf, M., Vogt, U.F., Schmidt, M.W.I., Org. Geochem. (submitted)

  11. Methods Used and Topics Addressed in Quantitative Health Research on Gay, Bisexual and Other Men Who Have Sex With Men: A Systematic Review of the Literature.

    PubMed

    Brennan, David J; Bauer, Greta R; Bradley, Kaitlin; Tran, Oth Vilaythong

    2017-01-01

    Research on sexual minority men (gay, bisexual, and other men who have sex with men) was examined with regard to the measures of sexual orientation used, the methods of research, and the main health outcomes under study. A systematic review of English-language quantitative studies was conducted focused on the health of sexual minority men published in 2010 (n = 250). The results provide a snapshot of the literature and revealed that research on sexual minority men overwhelmingly focused on HIV, STIs, and sexual health for which sexual orientation was most commonly defined behaviorally. For topics of mental health or body/fitness outcomes, sexual orientation was most commonly defined by identity. Most study samples were venue-based, and only 8.8% of published papers drew data from population-based samples. The findings suggest that there exists a need for research on sexual minority men's health beyond STIs and HIV that will examine mental and physical health outcomes beyond sexual risk, uses probability-based samples, and addresses intersectional concerns related to race/ethnicity and age.

  12. A new flexible forest inventory in France

    Treesearch

    C. Vidal; T. Belouard; J.-C. Herve; N. Robert; J. Wolsack

    2007-01-01

    The French National Forest Inventory was created in 1958 to assess metropolitan forest resources. To stick to new national and international requirements as well as to enhance reactivity, a new inventory method was implemented in 2004. This new method is based on a systematic sampling grid covering the whole territory every year. The size of the mesh is variable,...

  13. Factors Affecting the Unemployment (Rate) of Female Art Graduates in Iran

    ERIC Educational Resources Information Center

    Hedayat, Mina; Kahn, Sabzali Musa; Hanafi, Jaffri

    2013-01-01

    The aim of this study is to explore the relationship between the opportunities of female artist graduates in Tehran Province and the current employment market. Mixed method was employed in this study. The population of the current study consisted of 240 female artist graduates selected using a systematic random sampling method from both public and…

  14. Propensity-score matching in the cardiovascular surgery literature from 2004 to 2006: a systematic review and suggestions for improvement.

    PubMed

    Austin, Peter C

    2007-11-01

    I conducted a systematic review of the use of propensity score matching in the cardiovascular surgery literature. I examined the adequacy of reporting and whether appropriate statistical methods were used. I examined 60 articles published in the Annals of Thoracic Surgery, European Journal of Cardio-thoracic Surgery, Journal of Cardiovascular Surgery, and the Journal of Thoracic and Cardiovascular Surgery between January 1, 2004, and December 31, 2006. Thirty-one of the 60 studies did not provide adequate information on how the propensity score-matched pairs were formed. Eleven (18%) of the studies did not report on whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. No studies used appropriate methods to compare baseline characteristics between treated and untreated subjects in the propensity score-matched sample. Eight (13%) of the 60 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Two studies used appropriate methods for some outcomes, but not for all outcomes. Thirty-nine (65%) studies explicitly used statistical methods that were inappropriate for matched-pairs data when estimating the effect of treatment on outcomes. Eleven studies did not report the statistical tests that were used to assess the statistical significance of the treatment effect. Statistical analysis of propensity score-matched samples tended to be poor in the cardiovascular surgery literature, with most analyses ignoring the matched nature of the sample. I provide suggestions for improving the reporting and analysis of studies that use propensity score matching.
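    A minimal sketch of the central statistical point, that matched pairs call for paired tests, on simulated data (effect sizes and the within-pair correlation are assumptions for illustration):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 200                                     # matched pairs
        pair = rng.normal(0.0, 2.0, n)              # shared within-pair component
        treated = pair + rng.normal(0.5, 1.0, n)
        control = pair + rng.normal(0.0, 1.0, n)

        # inappropriate for matched data: treats the arms as independent samples
        print(stats.ttest_ind(treated, control).pvalue)
        # appropriate: respects the pairing created by propensity-score matching
        print(stats.ttest_rel(treated, control).pvalue)

    Because matching induces correlation within pairs, the paired test is both valid and markedly more powerful here.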

  15. Sampling for Patient Exit Interviews: Assessment of Methods Using Mathematical Derivation and Computer Simulations.

    PubMed

    Geldsetzer, Pascal; Fink, Günther; Vaikath, Maria; Bärnighausen, Till

    2018-02-01

    (1) To evaluate the operational efficiency of various sampling methods for patient exit interviews; (2) to discuss under what circumstances each method yields an unbiased sample; and (3) to propose a new, operationally efficient, and unbiased sampling method. Literature review, mathematical derivation, and Monte Carlo simulations. Our simulations show that in patient exit interviews it is most operationally efficient if the interviewer, after completing an interview, selects the next patient exiting the clinical consultation. We demonstrate mathematically that this method yields a biased sample: patients who spend a longer time with the clinician are overrepresented. This bias can be removed by selecting the next patient who enters, rather than exits, the consultation room. We show that this sampling method is operationally more efficient than alternative methods (systematic and simple random sampling) in most primary health care settings. Under the assumption that the order in which patients enter the consultation room is unrelated to the length of time spent with the clinician and the interviewer, selecting the next patient entering the consultation room tends to be the operationally most efficient unbiased sampling method for patient exit interviews. © 2016 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
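    A minimal simulation of the paper's argument, with back-to-back consultations for a single clinician and hypothetical exponential consultation times: sampling the next patient to exit overrepresents long consultations, while sampling the next patient to enter does not.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000
        dur = rng.exponential(10.0, n)              # minutes with the clinician (hypothetical)
        ends = np.cumsum(dur)                       # back-to-back consultations
        starts = ends - dur
        INTERVIEW = 15.0                            # minutes per exit interview (hypothetical)

        def sample_mean(select_by_entry):
            t, picked = 0.0, []
            while True:
                if select_by_entry:
                    i = np.searchsorted(starts, t, side="left")   # next patient to enter
                else:
                    i = np.searchsorted(ends, t, side="right")    # next patient to exit
                if i >= n:
                    return np.mean(picked)
                picked.append(dur[i])
                t = ends[i] + INTERVIEW             # interviewer busy until then

        print(dur.mean())                           # ~10: true mean consultation time
        print(sample_mean(False))                   # ~20: next-to-exit is length-biased
        print(sample_mean(True))                    # ~10: next-to-enter is unbiased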

  16. Systematic evaluation of a targeted gene capture sequencing panel for molecular diagnosis of retinitis pigmentosa

    PubMed Central

    Ma, Yuanyuan; Chiang, Pei-Wen; Zhong, Jing; Liu, Xuyang; Asan; Wu, Jing; Su, Yan; Li, Xin; Deng, Jianlian; Huang, Yingping; Zhang, Xinxin; Li, Yang; Fan, Ning; Wang, Ying; Tang, Lihui; Shen, Jinting; Chen, Meiyan; Zhang, Xiuqing; Te, Deng; Banerjee, Santasree; Liu, Hui; Qi, Ming; Yi, Xin

    2018-01-01

    Background Inherited eye diseases are major causes of vision loss in both children and adults, and are characterized by clinical variability and pronounced genetic heterogeneity. Genetic testing may provide an accurate diagnosis for ophthalmic genetic disorders and allow gene therapy for specific diseases. Methods A targeted gene capture panel was designed to capture exons of 283 inherited eye disease genes, including 58 known causative retinitis pigmentosa (RP) genes. 180 samples were tested with this panel, 68 of which had previously been tested by Sanger sequencing. Systematic evaluation of our method and comprehensive molecular diagnosis were carried out on 99 RP patients. Results 96.85% of targeted regions were covered at a depth of at least 20-fold, and the accuracy of variant detection was 99.994%. In 4 of the 68 samples previously tested by Sanger sequencing, mutations associated with other diseases, inconsistent with the clinical diagnosis, were detected by next-generation sequencing (NGS) but not by Sanger sequencing. Among the 99 RP patients, 64 (64.6%) had detectable pathogenic mutations, while in 3 patients the molecular diagnosis was inconsistent with the initial clinical diagnosis. After revisiting, one patient's clinical diagnosis was reclassified. In addition, 3 patients were found to carry large deletions. Conclusions We have systematically evaluated our method, compared it with Sanger sequencing, and identified a large number of novel mutations in a cohort of 99 RP patients. The results show sufficient accuracy of our method and underline the importance of molecular diagnosis in clinical practice. PMID:29641573

  17. Comparative Evaluation of Small Molecular Additives and Their Effects on Peptide/Protein Identification.

    PubMed

    Gao, Jing; Zhong, Shaoyun; Zhou, Yanting; He, Han; Peng, Shuying; Zhu, Zhenyun; Liu, Xing; Zheng, Jing; Xu, Bin; Zhou, Hu

    2017-06-06

    Detergents and salts are widely used in lysis buffers to enhance protein extraction from biological samples, facilitating in-depth proteomic analysis. However, these detergent and salt additives must be efficiently removed from the digested samples prior to LC-MS/MS analysis to obtain high-quality mass spectra. Although filter-aided sample preparation (FASP), acetone precipitation (AP) followed by in-solution digestion, and strong cation exchange-based centrifugal proteomic reactors (CPRs) are commonly used for proteomic sample processing, little is known about their efficiency at removing detergents and salt additives. In this study, we (i) developed an integrative workflow for the quantification of small molecular additives in proteomic samples, based on a multiple reaction monitoring (MRM) LC-MS approach for six additives (Tris, urea, CHAPS, SDS, SDC, and Triton X-100), and (ii) systematically evaluated the relationship between the level of additive remaining after sample processing and the number of peptides/proteins identified by mass spectrometry. Although FASP outperformed the other two methods, the results were complementary in terms of peptide/protein identification, as well as the GRAVY index and amino acid distributions. This is the first systematic and quantitative study of the effect of detergents and salt additives on protein identification. This MRM-based approach can be used for an unbiased evaluation of the performance of new sample preparation methods. Data are available via ProteomeXchange under identifier PXD005405.

  18. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.

  19. Procedure for the systematic orientation of digitised cranial models. Design and validation.

    PubMed

    Bailo, M; Baena, S; Marín, J J; Arredondo, J M; Auría, J M; Sánchez, B; Tardío, E; Falcón, L

    2015-12-01

    Comparison of bony pieces requires that they are oriented systematically to ensure that homologous regions are compared. Few orientation methods are highly accurate; this is particularly true for methods applied to three-dimensional models obtained by surface scanning, a technique whose special features make it a powerful tool in forensic contexts. The aim of this study was to develop and evaluate a systematic, assisted orientation method for aligning three-dimensional cranial models relative to the Frankfurt Plane, which would produce accurate orientations independent of the operator and of anthropological expertise. The study sample comprised four crania of known age and sex. All the crania were scanned and reconstructed using an Eva Artec™ portable 3D surface scanner, and the positions of certain characteristic landmarks were then determined by three different operators using the Rhinoceros 3D surface modelling software. Intra-observer analysis showed a tendency for orientation to be more accurate when using the assisted method than when using conventional manual orientation. Inter-observer analysis showed that experienced evaluators achieved results at least as accurate, if not more accurate, with the assisted method as with manual orientation, while inexperienced evaluators achieved more accurate orientations using the assisted method. The method tested is an innovative system capable of providing very precise, systematic and automated spatial orientation of virtual cranial models relative to standardised anatomical planes, independent of the operator and of operator experience. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Treesearch

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...
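    A minimal sketch of systematic selection with a single random start, the simplest randomization restriction under which the sample mean is design-unbiased (the population values and interval below are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(3)
        population = rng.gamma(2.0, 5.0, 1000)     # e.g., plot volumes in a forest inventory
        k = 20                                     # sampling interval
        start = rng.integers(k)                    # the only random element
        sample = population[start::k]              # every k-th unit thereafter
        print(population.mean(), sample.mean())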

  1. Potential, velocity, and density fields from sparse and noisy redshift-distance samples - Method

    NASA Technical Reports Server (NTRS)

    Dekel, Avishai; Bertschinger, Edmund; Faber, Sandra M.

    1990-01-01

    A method for recovering the three-dimensional potential, velocity, and density fields from large-scale redshift-distance samples is described. Galaxies are taken as tracers of the velocity field, not of the mass. The density field and the initial conditions are calculated using an iterative procedure that applies the no-vorticity assumption at an initial time and uses the Zel'dovich approximation to relate initial and final positions of particles on a grid. The method is tested using a cosmological N-body simulation 'observed' at the positions of real galaxies in a redshift-distance sample, taking into account their distance measurement errors. Malmquist bias and other systematic and statistical errors are extensively explored using both analytical techniques and Monte Carlo simulations.

  2. Modulation infrared thermometry of caloric effects at up to kHz frequencies

    NASA Astrophysics Data System (ADS)

    Döntgen, Jago; Rudolph, Jörg; Waske, Anja; Hägele, Daniel

    2018-03-01

    We present a novel non-contact method for the direct measurement of caloric effects in low-volume samples. The adiabatic temperature change ΔT of a magnetocaloric sample is determined very sensitively from its thermal radiation. Rapid modulation of ΔT is induced by an oscillating external magnetic field. Detection of the thermal radiation with a mercury-cadmium-telluride detector allows for measurements at field frequencies exceeding 1 kHz. In contrast to thermoacoustic methods, our method can be employed in vacuum, which enhances adiabatic conditions, especially in the case of small-volume samples. Systematic measurements of the magnetocaloric effect as a function of temperature, magnetic field amplitude, and modulation frequency give a detailed picture of the thermal behavior of the sample. Highly sensitive measurements of the magnetocaloric effect are demonstrated on a 2 mm thick sample of gadolinium and a 60 μm thick Fe₈₀B₁₂Nb₈ ribbon.
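    A minimal sketch of recovering a small periodic ΔT from a noisy detector trace by demodulating at the field frequency, in the spirit of the modulation technique (sampling rate, frequency, noise level, and ΔT below are hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)
        fs, f0 = 100_000.0, 1_000.0                 # sampling and field frequency, Hz
        t = np.arange(0.0, 1.0, 1.0 / fs)
        dT = 0.02                                   # modulated temperature change, K
        signal = dT * np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)

        # software lock-in: demodulate at the modulation frequency
        ref = np.exp(-2j * np.pi * f0 * t)
        print(2 * np.abs(np.mean(signal * ref)))    # ~0.02 K despite much larger per-sample noise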

  3. Fitting and Phenomenology in Type IA Supernova Cosmology: Generalized Likelihood Analyses for Multiple Evolving Populations and Observations of Near-Infrared Lightcurves Including Host Galaxy Properties

    NASA Astrophysics Data System (ADS)

    Ponder, Kara A.

    In the late 1990s, Type Ia supernovae (SNeIa) led to the discovery that the Universe is expanding at an accelerating rate due to dark energy. Since then, many different tracers of acceleration have been used to characterize dark energy, but the source of cosmic acceleration has remained a mystery. To better understand dark energy, future surveys such as the ground-based Large Synoptic Survey Telescope and the space-based Wide-Field Infrared Survey Telescope will collect thousands of SNeIa to use as a primary dark energy probe. These large surveys will be systematics limited, which makes it imperative for our insight regarding systematics to dramatically increase over the next decade for SNeIa to continue to contribute to precision cosmology. I approach this problem by improving statistical methods in the likelihood analysis and collecting near infrared (NIR) SNeIa with their host galaxies to improve the nearby data set and search for additional systematics. Using more statistically robust methods to account for systematics within the likelihood function can increase accuracy in cosmological parameters with a minimal precision loss. Though a sample of at least 10,000 SNeIa is necessary to confirm multiple populations of SNeIa, the bias in cosmology is ~2 sigma with only 2,500 SNeIa. This work focused on an example systematic (host galaxy correlations), but it can be generalized for any systematic that can be represented by a distribution of multiple Gaussians. The SweetSpot survey gathered 114 low-redshift, NIR SNeIa that will act as a crucial anchor sample for the future high redshift surveys. NIR observations are not as affected by dust contamination, which may lead to increased understanding of systematics seen in optical wavelengths. We obtained spatially resolved spectra for 32 SweetSpot host galaxies to test for local host galaxy correlations. For the first time, we probe global host galaxy correlations with NIR brightnesses from the current literature sample of SNeIa with host galaxy data from publicly available catalogs. We find inconclusive evidence that more massive galaxies host SNeIa that are brighter in the NIR than SNeIa hosted in less massive galaxies.

  4. The Effects of Magnesium Supplementation on Subjective Anxiety and Stress—A Systematic Review

    PubMed Central

    Boyle, Neil Bernard; Lawton, Clare; Dye, Louise

    2017-01-01

    Background: Anxiety-related conditions are the most common affective disorders present in the general population, with a lifetime prevalence of over 15%. Magnesium (Mg) status is associated with subjective anxiety, leading to the proposition that Mg supplementation may attenuate anxiety symptoms. This systematic review examines the available evidence for the efficacy of Mg supplementation in the alleviation of subjective measures of anxiety and stress. Methods: A systematic search of interventions with Mg alone or in combination (up to 5 additional ingredients) was performed in May 2016. Ovid Medline, PsychInfo, Embase, CINAHL and Cochrane databases were searched using equivalent search terms. A grey literature review of relevant sources was also undertaken. Results: 18 studies were included in the review. All reviewed studies recruited samples based upon an existing vulnerability to anxiety: mildly anxious, premenstrual syndrome (PMS), postpartum status, and hypertension. Four of eight studies in anxious samples, four of seven studies in PMS samples, and one of two studies in hypertensive samples reported positive effects of Mg on subjective anxiety outcomes. Mg had no effect on postpartum anxiety. No study administered a validated measure of subjective stress as an outcome. Conclusions: Existing evidence is suggestive of a beneficial effect of Mg on subjective anxiety in anxiety-vulnerable samples. However, the quality of the existing evidence is poor. Well-designed randomised controlled trials are required to further confirm the efficacy of Mg supplementation. PMID:28445426

  5. Shortleaf Pine Seedling Inventory Methods On Broadcast-Seeded Areas in the Missouri Ozarks

    Treesearch

    Kenneth W. Seidel; Nelson F. Rogers

    1966-01-01

    The success of broadcast-seeding of shortleaf pine (Pinus echinata Mill.) after one or several years can be determined with specified precision by a systematic sampling procedure. Seedling results often are expressed as the total number of seedlings per acre, but good distribution is equally important. The total stocking and the stocked milacre methods described here...

  6. Making chaotic behavior in a damped linear harmonic oscillator

    NASA Astrophysics Data System (ADS)

    Konishi, Keiji

    2001-06-01

    The present Letter proposes a simple control method which creates chaotic behavior in a damped linear harmonic oscillator. The method is a modified version of the scheme proposed by Wang and Chen (IEEE CAS-I 47 (2000) 410), which presents an anti-control method for creating chaotic behavior in discrete-time linear systems. We provide a systematic procedure for designing the parameters and sampling period of a feedback controller. Furthermore, we show that our method works well in numerical simulations.

  7. Evaluation of Commercial-off-the-Shelf Materials for the Preservation of Bacillus anthracis Vegetative Cells for Forensic Analysis.

    PubMed

    Angelini, Daniel J; Harris, Jacquelyn V; Burton, Laura L; Rastogi, Pooja R; Smith, Lisa S; Rastogi, Vipin K

    2018-03-01

    Environmental surface sampling is crucial for determining zones of contamination and for overall threat assessment, and viability retention of sampled material is central to such assessments. A systematic study was completed to determine the viability of vegetative cells under nonpermissive storage conditions. Despite major gains in nucleic acid sequencing technologies, initial positive identification of threats must be made through direct culture of the sampled material using classical microbiological methods. Solutions have been developed to preserve the viability of pathogens contained within clinical samples, but many have not been examined for their ability to preserve biological agents. The purpose of this study was to systematically examine existing preservation materials that can retain the viability of Bacillus anthracis vegetative cells stored at nonpermissive temperatures. The results show that five of seventeen solutions are capable of retaining the viability of a sporulation-deficient strain of B. anthracis Sterne stored under nonrefrigerated conditions. © 2017 American Academy of Forensic Sciences.

  8. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    PubMed

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation.
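    A minimal Monte Carlo illustration of the systematic error of the sample covariance spectrum referred to above: even when the true covariance is the identity, the top sample eigenvalues are biased upward and the bottom ones downward (dimensions and trial counts are arbitrary choices; this illustrates the phenomenon, not the DVA algorithm itself):

        import numpy as np

        rng = np.random.default_rng(0)
        p, n, trials = 50, 100, 200
        spectrum = np.zeros(p)
        for _ in range(trials):
            X = rng.standard_normal((n, p))             # true covariance = identity
            S = np.cov(X, rowvar=False)
            spectrum += np.sort(np.linalg.eigvalsh(S))[::-1]
        spectrum /= trials
        print(spectrum[:3])    # > 1: systematically overestimated
        print(spectrum[-3:])   # < 1: systematically underestimated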

  10. Reporting guidance considerations from a statistical perspective: overview of tools to enhance the rigour of reporting of randomised trials and systematic reviews.

    PubMed

    Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa

    2017-05-01

    Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring that analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach was taken to summarise features regarding statistical methods and findings from reporting guidelines for trials and reviews. We aim to enhance familiarity with the statistical details that should be reported in biomedical research among statisticians and their collaborators. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews, and PRISMA for Protocols (PRISMA-P). Considerations regarding sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute to improving the quality of biomedical publications. Authors should employ these tools for planning and reporting of their research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  11. SKATE: a docking program that decouples systematic sampling from scoring.

    PubMed

    Feng, Jianwen A; Marshall, Garland R

    2010-11-15

    SKATE is a docking prototype that decouples systematic sampling from scoring. This novel approach removes any interdependence between sampling and scoring functions to achieve better sampling and, thus, improves docking accuracy. SKATE systematically samples a ligand's conformational, rotational and translational degrees of freedom, as constrained by a receptor pocket, to find sterically allowed poses. Efficient systematic sampling is achieved by pruning the combinatorial tree using aggregate assembly, discriminant analysis, adaptive sampling, radial sampling, and clustering. Because systematic sampling is decoupled from scoring, the poses generated by SKATE can be ranked by any published, or in-house, scoring function. To test the performance of SKATE, ligands from the Astex/CCDC set, the Surflex set, and the Vertex set, a total of 266 complexes, were redocked to their respective receptors. The results show that SKATE was able to sample poses within 2 Å RMSD of the native structure for 98, 95, and 98% of the cases in the Astex/CCDC, Surflex, and Vertex sets, respectively. Cross-docking accuracy of SKATE was also assessed by docking 10 ligands to thymidine kinase and 73 ligands to cyclin-dependent kinase. 2010 Wiley Periodicals, Inc.

  12. Self-collected versus clinician-collected sampling for sexually transmitted infections: a systematic review and meta-analysis protocol.

    PubMed

    Taylor, Darlene; Lunny, Carole; Wong, Tom; Gilbert, Mark; Li, Neville; Lester, Richard; Krajden, Mel; Hoang, Linda; Ogilvie, Gina

    2013-10-10

    Three meta-analyses and one systematic review have been conducted on the question of whether self-collected specimens are as accurate as clinician-collected specimens for STI screening. However, these reviews predate 2007 and did not analyze rectal or pharyngeal collection sites. Currently, there is no consensus on which sampling method is the most effective for the diagnosis of genital chlamydia (CT), gonorrhea (GC) or human papillomavirus (HPV) infection. Our meta-analysis aims to be comprehensive in that it will examine the evidence of whether self-collected vaginal, urine, pharyngeal and rectal specimens provide as accurate a clinical diagnosis as clinician-collected samples (reference standard). Eligible studies include both randomized and non-randomized controlled trials, pre- and post-test designs, and controlled observational studies. The databases that will be searched include the Cochrane Database of Systematic Reviews, Web of Science, Database of Abstracts of Reviews of Effects (DARE), EMBASE and PubMed/Medline. Data will be abstracted independently by two reviewers using a standardized pre-tested data abstraction form. Heterogeneity will be assessed using the Q2 test. Sensitivity and specificity estimates with 95% confidence intervals as well as negative and positive likelihood ratios will be pooled and weighted using random effects meta-analysis, if appropriate. A hierarchical summary receiver operating characteristic curve for self-collected specimens will be generated. This synthesis involves a meta-analysis of self-collected samples (urine, vaginal, pharyngeal and rectal swabs) versus clinician-collected samples for the diagnosis of CT, GC and HPV, the most prevalent STIs. Our systematic review will allow patients, clinicians and researchers to determine the diagnostic accuracy of specimens collected by patients compared to those collected by clinicians in the detection of chlamydia, gonorrhea and HPV.
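    A minimal sketch of one pooling step such a protocol implies, DerSimonian-Laird random-effects pooling of logit-transformed sensitivities; the study values below are invented for illustration, and the protocol itself additionally specifies hierarchical SROC modelling:

        import numpy as np

        def dl_pool(y, v):
            """DerSimonian-Laird random-effects pooled estimate with a 95% CI."""
            w = 1.0 / v
            ybar = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - ybar) ** 2)
            tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
            ws = 1.0 / (v + tau2)
            est = np.sum(ws * y) / np.sum(ws)
            se = np.sqrt(1.0 / np.sum(ws))
            return est, est - 1.96 * se, est + 1.96 * se

        sens = np.array([0.92, 0.88, 0.95, 0.90])      # hypothetical study sensitivities
        n = np.array([120, 80, 200, 150])              # hypothetical positives per study
        y = np.log(sens / (1 - sens))                  # logit transform
        v = 1.0 / (n * sens * (1 - sens))              # delta-method variance of the logit
        expit = lambda x: 1.0 / (1.0 + np.exp(-x))
        print([round(expit(z), 3) for z in dl_pool(y, v)])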

  13. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

    NASA Astrophysics Data System (ADS)

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-01

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.
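    A minimal sketch of the general idea, fitting a smooth surrogate map on top of a nonlinear manifold-learning embedding; Isomap and RBF interpolation are used here as stand-ins, and this is not the authors' SandCV implementation:

        import numpy as np
        from sklearn.datasets import make_swiss_roll
        from sklearn.manifold import Isomap
        from scipy.interpolate import RBFInterpolator

        # stand-in ensemble: swiss-roll points play the role of molecular configurations
        X, _ = make_swiss_roll(n_samples=1000, random_state=0)

        # manifold learning yields low-dimensional CVs, but no differentiable map
        cv = Isomap(n_components=2).fit_transform(X)

        # smooth, differentiable surrogate map from configurations to CVs
        smooth_cv = RBFInterpolator(X, cv, smoothing=1e-3)

        x_new = X[:5] + 0.01 * np.random.default_rng(0).standard_normal((5, 3))
        print(smooth_cv(x_new))                    # CVs for unseen configurations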

  15. Coupling detergent lysis/clean-up methodology with intact protein fractionation for enhanced proteome characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Ritin; Dill, Brian; Chourey, Karuna

    2012-01-01

    The expanding use of surfactants for proteome sample preparation has prompted the need to systematically optimize the application and removal of these MS-deleterious agents prior to proteome measurements. Here we compare four different detergent clean-up methods (trichloroacetic acid (TCA) precipitation, chloroform/methanol/water (CMW) extraction, a commercial detergent removal spin column method (DRS), and filter-aided sample preparation (FASP)) with respect to varying amounts of protein biomass in the samples, and provide efficiency benchmarks with respect to protein, peptide, and spectral identifications for each method. Our results show that for protein-limited samples FASP outperforms the other three clean-up methods, while at high protein amounts all the methods are comparable. This information was used in a dual strategy of comparing molecular-weight-fractionated and unfractionated lysates from three increasingly complex samples (Escherichia coli, a five-microbial-isolate mixture, and a natural microbial community groundwater sample), all of which were lysed with SDS and cleaned up using FASP. The two approaches complemented each other, enhancing the number of protein identifications by 8%-25% across the three samples and providing broad pathway coverage.

  16. Accuracy and differential bias in copy number measurement of CCL3L1 in association studies with three auto-immune disorders.

    PubMed

    Carpenter, Danielle; Walker, Susan; Prescott, Natalie; Schalkwijk, Joost; Armour, John Al

    2011-08-18

    Copy number variation (CNV) contributes to the variation observed between individuals and can influence human disease progression, but the accurate measurement of individual copy numbers is technically challenging. In the work presented here we describe a modification to a previously described paralogue ratio test (PRT) method for genotyping the CCL3L1/CCL4L1 copy variable region, which we use to ascertain CCL3L1/CCL4L1 copy number in 1581 European samples. As the products of CCL3L1 and CCL4L1 potentially play a role in autoimmunity we performed case control association studies with Crohn's disease, rheumatoid arthritis and psoriasis clinical cohorts. We evaluate the PRT methodology used, paying particular attention to accuracy and precision, and highlight the problems of differential bias in copy number measurements. Our PRT methods for measuring copy number were of sufficient precision to detect very slight but systematic differential bias between results from case and control DNA samples in one study. We find no evidence for an association between CCL3L1 copy number and Crohn's disease, rheumatoid arthritis or psoriasis. Differential bias of this small magnitude, but applied systematically across large numbers of samples, would create a serious risk of false positive associations in copy number, if measured using methods of lower precision, or methods relying on single uncorroborated measurements. In this study the small differential bias detected by PRT in one sample set was resolved by a simple pre-treatment by restriction enzyme digestion.

  18. Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid: a systematic review

    PubMed Central

    Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda

    2015-01-01

    This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy and cost-effective collection methods prompt an interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades point towards the possible utility of saliva for monitoring overall health, diagnosing and treating various oral or systemic disorders, and for drug monitoring. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to standardization of pre-analytical and analytical variables, such as collection and storage methods, analyte circadian variation, sample recovery, prevention of sample contamination and analytical procedures. In spite of all these challenges, there is an escalating evolution of knowledge with the use of this biological matrix. PMID:26110030

  19. Systematic random sampling of the comet assay.

    PubMed

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away, leaving only the DNA trapped in an agarose cavity, which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Comet selection during this acquisition phase is expected to be both objective and random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers objective and unbiased comet selection. To achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be applied either manually or in automated form. By making use of an unbiased sampling frame and microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled-variation experiment showed that the SRS technique attained a lower variability than the traditional approach. The analysis of a single user with repetition showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
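    A minimal sketch of the stereological idea, a systematic random grid of microscope fields: one random offset fixes the whole sampling frame, so field selection is unbiased yet reproducible (slide dimensions and step size below are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(7)

        def srs_fields(width_um, height_um, step_um):
            """Field centres on a regular grid with a single random start."""
            x0, y0 = rng.uniform(0.0, step_um, size=2)     # the only random element
            xs = np.arange(x0, width_um, step_um)
            ys = np.arange(y0, height_um, step_um)
            return [(x, y) for y in ys for x in xs]

        for x, y in srs_fields(20_000, 20_000, 2_500)[:5]:
            print(f"move stage to x={x:.0f} um, y={y:.0f} um")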

  20. Evidence-based practice guidelines for instructing individuals with neurogenic memory impairments: what have we learned in the past 20 years?

    PubMed

    Ehlhardt, Laurie A; Sohlberg, McKay Moore; Kennedy, Mary; Coelho, Carl; Ylvisaker, Mark; Turkstra, Lyn; Yorkston, Kathryn

    2008-06-01

    This article examines the instructional research literature pertinent to teaching procedures or information to individuals with acquired memory impairments due to brain injury or related conditions. The purpose is to evaluate the available evidence in order to generate practice guidelines for clinicians working in the field of cognitive rehabilitation. A systematic review of the instructional literature from 1986 to 2006 revealed 51 studies meeting search criteria. Studies were analysed and coded within the following four key domains: Population Sample, Intervention, Study Design, and Treatment Outcomes. Coding included 17 characteristics of the population sample; seven intervention parameters; five study design features; and five treatment outcome parameters. Interventions that were evaluated included systematic instructional techniques such as method of vanishing cues and errorless learning. The majority of the studies reported positive outcomes in favour of systematic instruction. However, issues related to the design and execution of effective instruction lack clarity and require further study. The interaction between the target learning objective and the individual learner profile is not well understood. The evidence review concludes with clinical recommendations based on the instructional literature and a call to clinicians to incorporate these methods into their practice to maximise patient outcomes.

  1. Comparison between Thermal Desorption Tubes and Stainless Steel Canisters Used for Measuring Volatile Organic Compounds in Petrochemical Factories

    PubMed Central

    Chang, Cheng-Ping; Lin, Tser-Cheng; Lin, Yu-Wen; Hua, Yi-Chun; Chu, Wei-Ming; Lin, Tzu-Yu; Lin, Yi-Wen; Wu, Jyun-De

    2016-01-01

    Objective: The purpose of this study was to compare thermal desorption tubes and stainless steel canisters for measuring volatile organic compounds (VOCs) emitted from petrochemical factories. Methods: Twelve petrochemical factories in the Mailiao Industrial Complex were recruited for the measurement of VOCs. Thermal desorption tubes and 6-l specially prepared stainless steel canisters were used simultaneously for active sampling of environmental air. The sampling time was set at 6 h, close to a full work shift of the workers. A total of 94 pairwise air samples were collected using the thermal desorption tubes and stainless steel canisters in these 12 factories. To maximize the number of comparative data points, all the measurements from all the factories at different sampling times were pooled to perform a linear regression analysis for each selected VOC. The Pearson product-moment correlation coefficient was used to examine the correlation between the pairwise measurements of the two sampling methods. A paired t-test was also performed to examine whether the difference in the concentrations of each selected VOC measured by the two methods was statistically significant. Results: The correlation coefficients of seven compounds, including acetone, n-hexane, benzene, toluene, 1,2-dichloroethane, 1,3-butadiene, and styrene, were >0.80, indicating that the two sampling methods were highly consistent for these VOCs. The paired t-tests for the measurements of n-hexane, benzene, m/p-xylene, o-xylene, 1,2-dichloroethane, and 1,3-butadiene showed statistically significant differences (P < 0.05), indicating that the two sampling methods had various degrees of systematic error, probably resulting from differences in the detection limits of the two methods for these VOCs. Conclusions: The comparison between the concentrations of each of the 10 selected VOCs measured by the two sampling methods indicated that the thermal desorption tubes provided high accuracy and precision for acetone, benzene, and 1,3-butadiene. The accuracy and precision of thermal desorption tubes for measuring VOCs can be improved further owing to new developments in sorbent materials, multi-sorbent designs, and thermal desorption instrumentation. More applications of thermal desorption tubes for measuring occupational and environmental hazardous agents can be anticipated. PMID:26585828
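    A minimal sketch of the two analyses reported, Pearson correlation for consistency and a paired t-test for systematic difference, on invented pairwise measurements:

        import numpy as np
        from scipy import stats

        # hypothetical pairwise benzene measurements (ppb) from the two samplers
        tubes = np.array([1.2, 0.8, 2.5, 3.1, 0.9, 1.7, 2.2, 1.4])
        canisters = np.array([1.3, 0.9, 2.4, 3.4, 1.0, 1.9, 2.3, 1.6])

        r, _ = stats.pearsonr(tubes, canisters)        # consistency between methods
        paired = stats.ttest_rel(tubes, canisters)     # systematic difference between methods
        print(f"r = {r:.2f}, paired-t p = {paired.pvalue:.3f}")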

  2. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    PubMed

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of the absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated via power, false-positive rate, and receiver operating characteristic curves for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.
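    A minimal sketch of a gene-sampling gene-set test with and without the absolute statistic; the per-gene statistics and set size are simulated, and the published methods differ in detail:

        import numpy as np

        rng = np.random.default_rng(0)

        def gene_sampling_pvalue(gene_stats, set_idx, n_perm=10_000, absolute=True):
            stats_ = np.abs(gene_stats) if absolute else gene_stats
            observed = stats_[set_idx].mean()
            m = len(set_idx)
            null = np.array([stats_[rng.choice(stats_.size, m, replace=False)].mean()
                             for _ in range(n_perm)])
            return (np.sum(null >= observed) + 1) / (n_perm + 1)

        gene_stats = rng.standard_normal(5_000)        # simulated per-gene t-statistics
        my_set = rng.choice(5_000, 50, replace=False)  # a random (null) gene set
        print(gene_sampling_pvalue(gene_stats, my_set, absolute=True))
        print(gene_sampling_pvalue(gene_stats, my_set, absolute=False))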

  3. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, the six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While the static sampling techniques exhibited extraction yields (approx. 10-20%) sufficient for reliable use down to approx. 100 ng L⁻¹, the enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27%. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
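    A minimal sketch of how an MDL and an RSD can be computed from replicate low-level spikes, using the common Student-t definition; the replicate values are invented, and the paper may define the MDL differently:

        import numpy as np
        from scipy import stats

        reps = np.array([102.0, 96.0, 110.0, 99.0, 105.0, 93.0, 101.0])  # spike replicates, ng/L
        s = reps.std(ddof=1)
        mdl = stats.t.ppf(0.99, df=reps.size - 1) * s      # one-sided 99% Student-t multiplier
        rsd = 100.0 * s / reps.mean()
        print(f"MDL = {mdl:.0f} ng/L, RSD = {rsd:.1f}%")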

  4. A Systematic Literature Review: Workplace Violence Against Emergency Medical Services Personnel.

    PubMed

    Pourshaikhian, Majid; Abolghasem Gorji, Hassan; Aryankhesal, Aidin; Khorasani-Zavareh, Davood; Barati, Ahmad

    2016-03-01

    Despite the high prevalence and serious consequences of workplace violence against emergency medical services personnel, this phenomenon has received insufficient attention. A systematic review can aid the development of guidelines to reduce violence. The research question addressed by this paper is, "What are the characteristics and findings of studies on workplace violence against emergency medical services personnel?" A systematic literature review was conducted using online databases (PubMed, Scopus, Google Scholar, and Magiran) with the help of experienced librarians. Inclusion criteria comprised studies in the English or Persian language and the researchers' access to the full text. No restriction was placed on study design. Exclusion criteria included lack of access to the full text of the article, studies published in unreliable journals or conferences, and studies in which the results were pooled with those of other medical or relief groups and could not be disaggregated. A data extraction form was designed by the researchers based on the goals of the study; it included the title and author(s), study method (type, place of study, sample size, sampling method, and data collection/analysis tool), place of publication, information related to the frequency of types of violence, characteristics of victims/perpetrators, and related factors. The reviewed papers covered a variety of settings, methods, sampling strategies, and instruments. The majority of the studies used quantitative methods; no intervention study was found. Most studies focused on the prevalence of violence, and their results indicated that exposure to violence was high. The results are presented in six major themes. Workplace violence, and the injuries incurred from it, is extensive throughout the world. Important causes of violence include the shortage of training programs dealing with violence, the lack of violence management protocols, and delays in response times. Attention to this problem is therefore more crucial than ever. Workplace violence reduction strategies and suggestions for future studies are also discussed.

  5. A "three-in-one" sample preparation method for simultaneous determination of B-group water-soluble vitamins in infant formula using VitaFast(®) kits.

    PubMed

    Zhang, Heng; Lan, Fang; Shi, Yupeng; Wan, Zhi-Gang; Yue, Zhen-Feng; Fan, Fang; Lin, Yan-Kui; Tang, Mu-Jin; Lv, Jing-Zhang; Xiao, Tan; Yi, Changqing

    2014-06-15

    VitaFast® test kits, designed for microbiological assays in microtiter plate format, can be applied to the quantitative determination of B-group water-soluble vitamins such as vitamin B12, folic acid, and biotin. Compared to traditional microbiological methods, VitaFast® kits significantly reduce sample processing time and provide greater reliability, higher productivity, and better accuracy. Simultaneous determination of vitamin B12, folic acid, and biotin in one sample is now urgently required when evaluating the quality of infant formulae in our practical work. However, the present sample preparation protocols, which were developed for individual test systems, are incompatible with simultaneous determination of several analytes. To solve this problem, a novel "three-in-one" sample preparation method is herein developed for simultaneous determination of B-group water-soluble vitamins using VitaFast® kits. The performance of this novel "three-in-one" sample preparation method was systematically evaluated through comparison with the individual sample preparation protocols. The experimental results of the assays which employed the "three-in-one" sample preparation method were in good agreement with those obtained from the conventional VitaFast® extraction methods, indicating that the proposed "three-in-one" sample preparation method is applicable to the present three VitaFast® vitamin test systems, thus offering a promising alternative to the three independent sample preparation methods. The proposed new sample preparation method will significantly improve the efficiency of infant formulae inspection. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool

    PubMed Central

    Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH

    2015-01-01

    Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions, and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflicts of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflicts of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783

  7. Systematic evaluation of a targeted gene capture sequencing panel for molecular diagnosis of retinitis pigmentosa.

    PubMed

    Huang, Hui; Chen, Yanhua; Chen, Huishuang; Ma, Yuanyuan; Chiang, Pei-Wen; Zhong, Jing; Liu, Xuyang; Asan; Wu, Jing; Su, Yan; Li, Xin; Deng, Jianlian; Huang, Yingping; Zhang, Xinxin; Li, Yang; Fan, Ning; Wang, Ying; Tang, Lihui; Shen, Jinting; Chen, Meiyan; Zhang, Xiuqing; Te, Deng; Banerjee, Santasree; Liu, Hui; Qi, Ming; Yi, Xin

    2018-01-01

    Inherited eye diseases are major causes of vision loss in both children and adults and are characterized by clinical variability and pronounced genetic heterogeneity. Genetic testing may provide an accurate diagnosis for ophthalmic genetic disorders and allow gene therapy for specific diseases. A targeted gene capture panel was designed to capture the exons of 283 inherited eye disease genes, including 58 known causative retinitis pigmentosa (RP) genes. 180 samples were tested with this panel, 68 of which had previously been tested by Sanger sequencing. Systematic evaluation of our method and comprehensive molecular diagnosis were carried out on 99 RP patients. 96.85% of targeted regions were covered at least 20-fold, and the accuracy of variant detection was 99.994%. In 4 of the 68 samples previously tested by Sanger sequencing, mutations associated with other diseases, not consistent with the clinical diagnosis, were detected by next-generation sequencing (NGS) but not by Sanger sequencing. Among the 99 RP patients, 64 (64.6%) were found to carry pathogenic mutations, while in 3 patients the molecular diagnosis was inconsistent with the initial clinical diagnosis. After clinical re-evaluation, one patient's diagnosis was reclassified. In addition, 3 patients were found to carry large deletions. We have systematically evaluated our method, compared it with Sanger sequencing, and identified a large number of novel mutations in a cohort of 99 RP patients. The results demonstrated sufficient accuracy of our method and underscored the importance of molecular diagnosis in clinical practice.

  8. Observation procedure, observer gender, and behavior valence as determinants of sampling error in a behavior assessment analogue

    PubMed Central

    Farkas, Gary M.; Tharp, Roland G.

    1980-01-01

    Several factors thought to influence the representativeness of behavioral assessment data were examined in an analogue study using a multifactorial design. Systematic and unsystematic methods of observing group behavior were investigated using 18 male and 18 female observers. Additionally, valence properties of the observed behaviors were inspected. Observers' assessments of a videotape were compared to a criterion code that defined the population of behaviors. Results indicated that systematic observation procedures were more accurate than unsystematic procedures, though this factor interacted with gender of observer and valence of behavior. Additionally, males tended to sample more representatively than females. A third finding indicated that the negatively valenced behavior was overestimated, whereas the neutral and positively valenced behaviors were accurately assessed. PMID:16795631

  9. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased for reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for this bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different normalization strategies are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization also performed systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
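
    A toy version of the replicate-variation criterion used in this evaluation, with simple global median normalization standing in for the more sophisticated methods (Vsn, linear and local regression) actually compared; all intensities are synthetic:

    ```python
    # Global median normalization of synthetic log-intensities, judged by the
    # mean per-protein standard deviation across three technical replicates.
    import numpy as np

    rng = np.random.default_rng(1)
    true = rng.normal(20, 2, size=(500, 1))                  # 500 proteins
    bias = np.array([0.0, 0.7, -0.4])                        # per-replicate bias
    data = true + bias + rng.normal(0, 0.1, size=(500, 3))   # 3 technical replicates

    normalized = data - np.median(data, axis=0) + np.median(data)

    def replicate_sd(x):
        """Mean per-protein standard deviation across replicates."""
        return x.std(axis=1, ddof=1).mean()

    print("before:", round(replicate_sd(data), 3),
          "after:", round(replicate_sd(normalized), 3))
    ```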

  10. Is it really theoretical? A review of sampling in grounded theory studies in nursing journals.

    PubMed

    McCrae, Niall; Purssell, Edward

    2016-10-01

    Grounded theory is a distinct method of qualitative research whose core features are theoretical sampling and constant comparative analysis. However, inconsistent application of these activities has been observed in published studies. This review assessed the use of theoretical sampling in grounded theory studies in nursing journals. An adapted systematic review was conducted. Three leading nursing journals (2010-2014) were searched for studies stating grounded theory as the method. Sampling was assessed using a concise rating tool. A high proportion (86%) of the 134 articles described an iterative process of data collection and analysis. However, half of the studies did not demonstrate theoretical sampling, with many declaring or indicating a purposive sampling approach throughout. Specific reporting guidelines for grounded theory studies should be developed to ensure that study reports describe an iterative process of fieldwork and theoretical development. © 2016 John Wiley & Sons Ltd.

  11. The influence of sampling interval on the accuracy of trail impact assessment

    USGS Publications Warehouse

    Leung, Y.-F.; Marion, J.L.

    1999-01-01

    Trail impact assessment and monitoring (IA&M) programs have been growing in importance and application in recreation resource management at protected areas. Census-based and sampling-based approaches have been developed in such programs, with systematic point sampling being the most common survey design. This paper examines the influence of sampling interval on the accuracy of estimates for selected trail impact problems. A complete census of four impact types on 70 trails in Great Smoky Mountains National Park was utilized as the base data set for the analyses. The census data were resampled at increasing intervals to create a series of simulated point data sets. Estimates of frequency of occurrence and lineal extent for the four impact types were compared with the census data set. The response of accuracy loss in lineal extent estimates to increasing sampling intervals varied across impact types, while the response for frequency-of-occurrence estimates was consistent, approximating an inverse asymptotic curve. These findings suggest that systematic point sampling may be an appropriate method for estimating the lineal extent but not the frequency of trail impacts. Sampling intervals of less than 100 m appear to yield an excellent level of accuracy for the four impact types evaluated. Multiple regression analysis results suggest that appropriate sampling intervals are more likely to be determined by the type of impact in question than by the length of the trail. The census-based trail survey and the resampling-simulation method developed in this study can be a valuable first step in establishing long-term trail IA&M programs, in which an optimal sampling interval range with acceptable accuracy is determined before investing effort in data collection.
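
    The resampling-simulation idea translates directly into code: resample a complete census at increasing systematic intervals and compare the estimates with the census values. A sketch on a synthetic 7-km trail census (frequency of occurrence only):

    ```python
    # Resample a synthetic 1-m-resolution census of one impact type at
    # increasing systematic point intervals and compare frequency estimates.
    import numpy as np

    rng = np.random.default_rng(2)
    trail = rng.random(7000) < 0.03        # 7 km census; True = impact present

    census_freq = trail.mean()
    for interval in (20, 50, 100, 200, 500):   # sampling interval in metres
        start = rng.integers(interval)         # random start, systematic thereafter
        estimate = trail[start::interval].mean()
        print(f"{interval:>4} m: estimate {estimate:.4f} vs census {census_freq:.4f}")
    ```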

  12. Investigation of internal friction in fused quartz, steel, Plexiglass, and Westerly granite from 0.01 to 1.00 Hertz at 10^-8 to 10^-7 strain amplitude.

    USGS Publications Warehouse

    Hsi-Ping, Liu; Peselnick, L.

    1983-01-01

    A detailed evaluation of internal friction measurement by the stress-strain hysteresis loop method from 0.01 to 1 Hz at 10^-8 to 10^-7 strain amplitude and 23.9°C is presented. Significant systematic errors in relative phase measurement can result from convex end surfaces of the sample and stress sensor and from end surface irregularities such as nicks and asperities. Preparation of concave end surfaces polished to optical smoothness, having a radius of curvature >3.6×10^4 cm, reduces the systematic error in relative phase measurements to <(5.5 ± 2.2)×10^-4 radians. -from Authors

  13. Solutions to decrease a systematic error related to AAPH addition in the fluorescence-based ORAC assay.

    PubMed

    Mellado-Ortega, Elena; Zabalgogeazcoa, Iñigo; Vázquez de Aldana, Beatriz R; Arellano, Juan B

    2017-02-15

    The oxygen radical absorbance capacity (ORAC) assay in 96-well multi-detection plate readers is a rapid method to determine the total antioxidant capacity (TAC) of biological samples. A disadvantage of this method is that the antioxidant inhibition reaction does not start in all 96 wells at the same time, owing to technical limitations when dispensing the free radical-generating azo initiator 2,2'-azobis(2-methyl-propanimidamide) dihydrochloride (AAPH). The time delay between wells yields a systematic error that causes statistically significant differences in the TAC determined for antioxidant solutions depending on their plate position. We propose two alternative solutions to avoid this AAPH-dependent error in ORAC assays. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Comparison between Thermal Desorption Tubes and Stainless Steel Canisters Used for Measuring Volatile Organic Compounds in Petrochemical Factories.

    PubMed

    Chang, Cheng-Ping; Lin, Tser-Cheng; Lin, Yu-Wen; Hua, Yi-Chun; Chu, Wei-Ming; Lin, Tzu-Yu; Lin, Yi-Wen; Wu, Jyun-De

    2016-04-01

    The purpose of this study was to compare thermal desorption tubes and stainless steel canisters for measuring volatile organic compounds (VOCs) emitted from petrochemical factories. Twelve petrochemical factories in the Mailiao Industrial Complex were recruited for conducting the measurements of VOCs. Thermal desorption tubes and 6-l specially prepared stainless steel canisters were used to simultaneously perform active sampling of environmental air samples. The sampling time for the environmental air samples was set at 6 h, close to a full work shift of the workers. A total of 94 pairwise air samples were collected using the thermal desorption tubes and stainless steel canisters in these 12 factories in the petrochemical industrial complex. To maximize the number of comparative data points, the measurements from all the factories at different sampling times were pooled to perform a linear regression analysis for each selected VOC. The Pearson product-moment correlation coefficient was used to examine the correlation between the pairwise measurements of the two sampling methods. A paired t-test was also performed to examine whether the difference in the concentrations of each selected VOC measured by the two methods was statistically significant. The correlation coefficients of seven compounds, including acetone, n-hexane, benzene, toluene, 1,2-dichloroethane, 1,3-butadiene, and styrene, were >0.80, indicating that the two sampling methods had high consistency for these VOCs. The paired t-tests for the measurements of n-hexane, benzene, m/p-xylene, o-xylene, 1,2-dichloroethane, and 1,3-butadiene showed statistically significant differences (P-value < 0.05), indicating that the two sampling methods were subject to varying degrees of systematic error. For these six chemicals, the systematic errors probably resulted from differences between the detection limits of the two sampling methods. The comparison between the concentrations of each of the 10 selected VOCs measured by the two sampling methods indicated that the thermal desorption tubes provided high-accuracy and high-precision measurements for acetone, benzene, and 1,3-butadiene. The accuracy and precision of the thermal desorption tubes for measuring VOCs can be expected to improve with new developments in sorbent materials, multi-sorbent designs, and thermal desorption instrumentation. More applications of thermal desorption tubes for measuring occupational and environmental hazardous agents can be anticipated. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  15. Alcohol and Sexual Consent Scale: Development and Validation

    ERIC Educational Resources Information Center

    Ward, Rose Marie; Matthews, Molly R.; Weiner, Judith; Hogan, Kathryn M.; Popson, Halle C.

    2012-01-01

    Objective: To establish a short measure of attitudes toward sexual consent in the context of alcohol consumption. Methods: Using a multistage and systematic measurement development process, the investigators developed the Alcohol and Sexual Consent Scale using a sample of college students. Results: The resulting 12-item scale, the Alcohol and…

  16. Conducting a wildland visual resources inventory

    Treesearch

    James F. Palmer

    1979-01-01

    This paper describes a procedure for systematically inventorying the visual resources of wildland environments. Visual attributes are recorded photographically using two separate sampling methods: one based on professional judgment and the other on random selection. The location and description of each inventoried scene are recorded on U.S. Geological Survey...

  17. Improving the collection of knowledge, attitude and practice data with community surveys: a comparison of two second-stage sampling methods.

    PubMed

    Davis, Rosemary H; Valadez, Joseph J

    2014-12-01

    Second-stage sampling techniques, including spatial segmentation, are widely used in community health surveys when reliable household sampling frames are not available. In India, an unresearched technique for household selection, which uses the house with the most recent marriage or birth as the starting point, is applied in eight states. Users question whether this last-birth or last-marriage (LBLM) approach introduces bias affecting survey results. We conducted two simultaneous population-based surveys. One used segmentation sampling; the other used LBLM. LBLM sampling required modification before assessment was possible, and a more systematic approach was tested using last birth only. We compared coverage proportions produced by the two independent samples for six malaria indicators and demographic variables (education, wealth and caste). We then measured the level of agreement between the caste of the selected participant and the caste of the health worker making the selection. No significant difference between methods was found for the point estimates of the six malaria indicators, education, caste or wealth of the survey participants (range of P: 0.06 to >0.99). A poor level of agreement occurred between the caste of the health worker used in household selection and the caste of the final participant (κ = 0.185), revealing little association between the two and thereby indicating that caste was not a source of bias. Although LBLM as practiced was not testable, a systematic last-birth approach was tested. If documented concerns about last-birth sampling are addressed, this new method could offer an acceptable alternative to segmentation in India. However, inter-state caste variation could affect this result. Therefore, additional assessment of last birth is required before wider implementation is recommended. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
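
    The agreement statistic quoted above is Cohen's kappa; a minimal sketch of its computation on hypothetical caste labels for the selecting health worker and the selected participant:

    ```python
    # Cohen's kappa for agreement between two categorical ratings; the caste
    # labels below are hypothetical.
    import numpy as np

    worker      = np.array(["A", "B", "A", "C", "B", "A", "C", "B"])
    participant = np.array(["B", "B", "C", "C", "A", "A", "B", "C"])

    po = np.mean(worker == participant)                    # observed agreement
    labels = np.unique(np.concatenate([worker, participant]))
    pe = sum(np.mean(worker == c) * np.mean(participant == c) for c in labels)
    kappa = (po - pe) / (1 - pe)                           # chance-corrected
    print(f"kappa = {kappa:.3f}")                          # near 0 -> little association
    ```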

  18. Amino acid distribution in meteorites: diagenesis, extraction methods, and standard metrics in the search for extraterrestrial biosignatures.

    PubMed

    McDonald, Gene D; Storrie-Lombardi, Michael C

    2006-02-01

    The relative abundance of the protein amino acids has previously been investigated as a potential marker for biogenicity in meteoritic samples. However, these investigations were executed without a quantitative metric to evaluate distribution variations, and they did not account for the possibility of interdisciplinary systematic error arising from inter-laboratory differences in extraction and detection techniques. Principal component analysis (PCA), hierarchical cluster analysis (HCA), and stochastic probabilistic artificial neural networks (ANNs) were used to compare the distributions of nine protein amino acids previously reported for the Murchison carbonaceous chondrite, Mars meteorites (ALH84001, Nakhla, and EETA79001), prebiotic synthesis experiments, and terrestrial biota and sediments. These techniques allowed us (1) to identify a shift in terrestrial amino acid distributions secondary to diagenesis; (2) to detect differences in terrestrial distributions that may reflect systematic differences between extraction and analysis techniques in biological and geological laboratories; and (3) to determine that distributions in meteoritic samples appear more similar to prebiotic chemistry samples than to the terrestrial unaltered or diagenetic samples. Both diagenesis and putative interdisciplinary differences in analysis complicate interpretation of meteoritic amino acid distributions. We propose that the analysis of future samples from such diverse sources as meteoritic influx, sample return missions, and in situ exploration of Mars would be less ambiguous with the adoption of standardized assay techniques, the systematic inclusion of assay standards, and the use of a quantitative, probabilistic metric. We present here one such metric determined by sequential feature extraction and normalization (PCA), information-driven automated exploration of classification possibilities (HCA), and prediction of classification accuracy (ANNs).
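
    A sketch of the first two stages of such a pipeline (PCA for feature extraction, then hierarchical clustering), using a synthetic matrix of nine-amino-acid relative abundances in place of the published compositions:

    ```python
    # PCA feature extraction followed by hierarchical clustering on a synthetic
    # matrix of nine-amino-acid relative abundances for 12 samples.
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    X = rng.dirichlet(alpha=np.ones(9), size=12)   # 12 samples x 9 amino acids

    scores = PCA(n_components=2).fit_transform(X)  # sequential feature extraction
    tree = linkage(scores, method="ward")          # hierarchical cluster analysis
    groups = fcluster(tree, t=3, criterion="maxclust")
    print(groups)                                  # cluster label for each sample
    ```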

  19. Impact of parasitic thermal effects on thermoelectric property measurements by Harman method.

    PubMed

    Kwon, Beomjin; Baek, Seung-Hyub; Kim, Seong Keun; Kim, Jin-Sang

    2014-04-01

    The Harman method is a rapid and simple technique for measuring thermoelectric properties. However, its validity has often been questioned due to the over-simplified assumptions that the method relies on. Here, we quantitatively investigate the influence of previously ignored parasitic thermal effects on the Harman method and develop a method to determine the intrinsic ZT. We expand the original Harman relation with three extra terms: heat losses via the lead wires, heat losses via radiation, and Joule heating within the sample. Based on the expanded Harman relation, we use differential measurements of the sample geometry to obtain the intrinsic ZT. To evaluate the parasitic terms separately, ZTs measured with systematically varied sample geometries and lead wire types are fitted to the expanded relation. A large discrepancy (∼28%) in the measured ZTs, depending on the measurement configuration, is observed, and the parasitic terms are evaluated separately. This work will help in evaluating intrinsic thermoelectric properties with the Harman method by eliminating ambiguities arising from extrinsic effects.

  20. Chemical analysis of acoustically levitated drops by Raman spectroscopy.

    PubMed

    Tuckermann, Rudolf; Puskar, Ljiljana; Zavabeti, Mahta; Sekine, Ryo; McNaughton, Don

    2009-07-01

    An experimental apparatus combining Raman spectroscopy with acoustic levitation, Raman acoustic levitation spectroscopy (RALS), is investigated for physical and chemical analysis. Whereas acoustic levitation enables the contactless handling of microsized samples, Raman spectroscopy offers the advantage of a noninvasive method without complex sample preparation. After systematic tests probing the sensitivity of the technique to drop size, shape, and position, RALS was successfully applied to monitoring sample dilution and preconcentration, evaporation, crystallization, an acid-base reaction, and analytes in a surface-enhanced Raman spectroscopy colloidal suspension.

  1. A Stellar Dynamical Black Hole Mass for the Reverberation Mapped AGN NGC 5273

    NASA Astrophysics Data System (ADS)

    Batiste, Merida; Bentz, Misty C.; Valluri, Monica; Onken, Christopher A.

    2018-01-01

    We present preliminary results from stellar dynamical modeling of the mass of the central supermassive black hole (MBH) in the active galaxy NGC 5273. NGC 5273 is one of the few AGN with a secure MBH measurement from reverberation mapping that is also nearby enough to measure MBH with stellar dynamical modeling. Dynamical modeling and reverberation mapping are the two most heavily favored methods of direct MBH determination in the literature; however, the specific limitations of each method mean that there are very few galaxies for which both can be used. To date only two such galaxies, NGC 3227 and NGC 4151, have MBH determinations from both methods. Given this small sample size, it is not yet clear that the two methods give consistent results. Moreover, given the inherent uncertainties and potential systematic biases in each method, it is likewise unclear whether one method should be preferred over the other. This study is part of an ongoing project to increase the sample of galaxies with secure MBH measurements from both methods, so that a direct comparison may be made. NGC 5273 provides a particularly valuable comparison because it is free of kinematic substructure (e.g. the presence of a bar, as is the case for NGC 4151) that can complicate and potentially bias results from stellar dynamical modeling. I will discuss our current results as well as the advantages and limitations of each method, and the potential sources of systematic bias that may affect comparison between the results.

  2. Hydration of Atmospheric Molecular Clusters: Systematic Configurational Sampling.

    PubMed

    Kildgaard, Jens; Mikkelsen, Kurt V; Bilde, Merete; Elm, Jonas

    2018-05-09

    We present a new systematic configurational sampling algorithm for investigating the potential energy surface of hydrated atmospheric molecular clusters. The algorithm is based on creating a Fibonacci sphere around each atom in the cluster and adding water molecules at each point in 9 different orientations. To allow sampling of water molecules into existing hydrogen bonds, the cluster is displaced along the hydrogen bond and a water molecule is placed in between in three different orientations. Redundant generated structures are eliminated based on the root mean square distance (RMSD) between conformers. Initially, the clusters are sampled using the semiempirical PM6 method and subsequently using density functional theory (M06-2X and ωB97X-D) with the 6-31++G(d,p) basis set. Applying the developed algorithm, we study the hydration of sulfuric acid with up to 15 water molecules. We find that the addition of the first four water molecules "saturates" the sulfuric acid molecule and is more thermodynamically favourable than the addition of water molecules 5-15. Using the large generated set of conformers, we assess the performance of approximate methods (ωB97X-D, M06-2X, PW91 and PW6B95-D3) in calculating the binding energies and assigning the global minimum conformation compared to high-level CCSD(T)-F12a/VDZ-F12 reference calculations. The tested DFT functionals systematically overestimate the binding energies compared to coupled cluster calculations, and we find that this deficiency can be corrected by a simple scaling factor.
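
    The Fibonacci-sphere construction at the core of the algorithm can be sketched compactly; this generates the candidate points around one atom, with the 9 water orientations per point omitted:

    ```python
    # Near-uniform points on a Fibonacci sphere of given radius around an atom;
    # trial water molecules would be placed at these points.
    import numpy as np

    def fibonacci_sphere(n, center, radius):
        """Return n near-uniform points on a sphere around `center`."""
        i = np.arange(n)
        golden_angle = np.pi * (3.0 - np.sqrt(5.0))
        z = 1.0 - 2.0 * (i + 0.5) / n          # uniform spacing in z
        rho = np.sqrt(1.0 - z * z)             # ring radius at height z
        theta = golden_angle * i
        pts = np.stack([rho * np.cos(theta), rho * np.sin(theta), z], axis=1)
        return center + radius * pts

    print(fibonacci_sphere(16, center=np.zeros(3), radius=3.0))
    ```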

  3. Galaxy Cluster Mass Reconstruction Project – III. The impact of dynamical substructure on cluster mass estimates

    DOE PAGES

    Old, L.; Wojtak, R.; Pearce, F. R.; ...

    2017-12-20

    With the advent of wide-field cosmological surveys, we are approaching samples of hundreds of thousands of galaxy clusters. While such large numbers will help reduce statistical uncertainties, the control of systematics in cluster masses is crucial. Here we examine the effects of an important source of systematic uncertainty in galaxy-based cluster mass estimation techniques: the presence of significant dynamical substructure. Dynamical substructure manifests as dynamically distinct subgroups in phase-space, indicating an ‘unrelaxed’ state. This issue affects around a quarter of clusters in a generally selected sample. We employ a set of mock clusters whose masses have been measured homogeneously with commonly used galaxy-based mass estimation techniques (kinematic, richness, caustic, radial methods). We use these to study how the relation between observationally estimated and true cluster mass depends on the presence of substructure, as identified by various popular diagnostics. We find that the scatter for an ensemble of clusters does not increase dramatically for clusters with dynamical substructure. However, we find a systematic bias for all methods, such that clusters with significant substructure have higher measured masses than their relaxed counterparts. This bias depends on cluster mass: the most massive clusters are largely unaffected by the presence of significant substructure, but masses are significantly overestimated for lower mass clusters, by ~10 percent at 10^14 and ≳20 percent for ≲10^13.5. The use of cluster samples with different levels of substructure can therefore bias certain cosmological parameters up to a level comparable to the typical uncertainties in current cosmological studies.

  5. Improving compound-protein interaction prediction by building up highly credible negative samples.

    PubMed

    Liu, Hui; Sun, Jianjiang; Guan, Jihong; Zheng, Jie; Zhou, Shuigeng

    2015-06-15

    Computational prediction of compound-protein interactions (CPIs) is of great importance for drug design and development, as genome-scale experimental validation of CPIs is not only time-consuming but also prohibitively expensive. With the availability of an increasing number of validated interactions, the performance of computational prediction approaches is severely impeded by the lack of reliable negative CPI samples. A systematic method for screening reliable negative samples is therefore critical to improving the performance of in silico prediction methods. This article aims at building up a set of highly credible negative samples of CPIs via an in silico screening method. As most existing computational models assume that similar compounds are likely to interact with similar target proteins and achieve remarkable performance on that basis, it is rational to identify potential negative samples using the converse proposition: proteins dissimilar to every known or predicted target of a compound are unlikely to be targeted by that compound, and vice versa. We integrated various resources, including chemical structures, chemical expression profiles and side effects of compounds, amino acid sequences, protein-protein interaction networks and functional annotations of proteins, into a systematic screening framework. We first tested the screened negative samples on six classical classifiers, and all these classifiers achieved remarkably higher performance on our negative samples than on randomly generated negative samples for both human and Caenorhabditis elegans. We then verified the negative samples on three existing prediction models, including the bipartite local model, Gaussian kernel profile and Bayesian matrix factorization, and found that the performances of these models were also significantly improved on the screened negative samples. Moreover, we validated the screened negative samples on a drug bioactivity dataset. Finally, we derived two sets of new interactions by training a support vector machine classifier on the positive interactions annotated in DrugBank and our screened negative interactions. The screened negative samples and the predicted interactions provide the research community with a useful resource for identifying new drug targets and a helpful supplement to the current curated compound-protein databases. Supplementary files are available at: http://admis.fudan.edu.cn/negative-cpi/. © The Author 2015. Published by Oxford University Press.

  6. A green analytical method using ultrasound in sample preparation for the flow injection determination of iron, manganese, and zinc in soluble solid samples by flame atomic absorption spectrometry.

    PubMed

    Yebra, M Carmen

    2012-01-01

    A simple and rapid analytical method was developed for the determination of iron, manganese, and zinc in soluble solid samples. The method is based on continuous ultrasonic water dissolution of the sample (5-30 mg) at room temperature followed by flow injection flame atomic absorption spectrometric determination. A good precision of the whole procedure (1.2-4.6%) and a sample throughput of ca. 25 samples h^-1 were obtained. The proposed green analytical method has been successfully applied for the determination of iron, manganese, and zinc in soluble solid food samples (soluble cocoa and soluble coffee) and pharmaceutical preparations (multivitamin tablets). The ranges of concentrations found were 21.4-25.61 μg g^-1 for iron, 5.74-18.30 μg g^-1 for manganese, and 33.27-57.90 μg g^-1 for zinc in soluble solid food samples and 3.75-9.90 μg g^-1 for iron, 0.47-5.05 μg g^-1 for manganese, and 1.55-15.12 μg g^-1 for zinc in multivitamin tablets. The accuracy of the proposed method was established by comparison with the conventional wet acid digestion method using a paired t-test, indicating the absence of systematic errors.

  7. Comparison of no-purge and pumped sampling methods for monitoring concentrations of ordnance-related compounds in groundwater, Camp Edwards, Massachusetts Military Reservation, Cape Cod, Massachusetts, 2009-2010

    USGS Publications Warehouse

    Savoie, Jennifer G.; LeBlanc, Denis R.

    2012-01-01

    Field tests were conducted near the Impact Area at Camp Edwards on the Massachusetts Military Reservation (MMR), Cape Cod, Massachusetts, to determine the utility of no-purge groundwater sampling for monitoring concentrations of ordnance-related explosive compounds and perchlorate in the sand and gravel aquifer. The no-purge methods included (1) a diffusion sampler constructed of rigid porous polyethylene, (2) a diffusion sampler constructed of regenerated-cellulose membrane, and (3) a tubular grab sampler (bailer) constructed of polyethylene film. In samples from 36 monitoring wells, concentrations of perchlorate (ClO4-), hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), and octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), the major contaminants of concern in the Impact Area, in the no-purge samples were compared to concentrations of these compounds in samples collected by low-flow pumped sampling with dedicated bladder pumps. The monitoring wells are constructed of 2- and 2.5-inch-diameter polyvinyl chloride pipe and have approximately 5- to 10-foot-long slotted screens. The no-purge samplers were left in place for 13-64 days to ensure that ambient groundwater flow had flushed the well screen and that concentrations in the screen represented water in the adjacent formation. The sampling methods were compared first in six monitoring wells. Concentrations of ClO4-, RDX, and HMX in water samples collected by the three no-purge sampling methods and by low-flow pumped sampling were in close agreement for all six monitoring wells. There was no evidence of a systematic bias in the concentration differences among the methods on the basis of the type of sampling device, the type of contaminant, or the order in which the no-purge samplers were tested. A subsequent examination of vertical variations in concentrations of ClO4- in the 10-foot-long screens of six wells, made by using rigid porous polyethylene diffusion samplers, indicated that concentrations in a given well varied by less than 15 percent and that the small variations were unlikely to affect the utility of the various sampling methods. The grab sampler was selected for additional tests in 29 of the 36 monitoring wells used during the study. Concentrations of ClO4-, RDX, HMX, and other minor explosive compounds in water samples collected by using a 1-liter grab sampler and by low-flow pumped sampling were in close agreement in field tests in the 29 wells. A statistical analysis based on the sign test indicated that there was no bias in the concentration differences between the methods. There was also no evidence of a systematic bias in concentration differences between the methods related to the location of the monitoring wells laterally or vertically in the groundwater-flow system. Field tests in five wells also demonstrated that sample collection using a 2-liter grab sampler and sequential bailing with the 1-liter grab sampler were options for obtaining sufficient sample volume for replicate and spiked quality assurance and control samples. The evidence from the field tests supports the conclusion that diffusion sampling with the rigid porous polyethylene and regenerated-cellulose membranes and grab sampling with the polyethylene-film samplers provide data on concentrations of ordnance-related compounds in groundwater at the MMR comparable to those obtained by low-flow pumped sampling. These sampling methods are therefore useful for monitoring these compounds at the MMR and in similar hydrogeologic environments.
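
    The sign test used to check for bias between paired methods can be sketched as follows, with hypothetical paired perchlorate concentrations in place of the field data:

    ```python
    # Sign test for systematic bias between two paired methods; the paired
    # perchlorate concentrations (ug/L) are hypothetical.
    import numpy as np
    from scipy.stats import binomtest

    grab = np.array([12.1, 8.4, 15.0, 3.2, 7.7, 9.9, 11.4, 6.3])
    pumped = np.array([11.8, 8.9, 14.6, 3.3, 7.5, 10.2, 11.0, 6.5])

    diff = grab - pumped
    n_pos = int(np.sum(diff > 0))
    n = int(np.sum(diff != 0))              # ties are dropped
    result = binomtest(n_pos, n, p=0.5)     # H0: positive/negative equally likely
    print(f"{n_pos}/{n} positive differences, p = {result.pvalue:.3f}")
    ```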

  8. A low proportion of systematic reviews in physical therapy are registered: a survey of 150 published systematic reviews.

    PubMed

    Oliveira, Crystian B; Elkins, Mark R; Lemes, Ítalo Ribeiro; de Oliveira Silva, Danilo; Briani, Ronaldo V; Monteiro, Henrique Luiz; Azevedo, Fábio Mícolis de; Pinto, Rafael Zambelli

    Systematic reviews provide the best evidence about the effectiveness of healthcare interventions. Although systematic reviews are conducted with explicit and transparent methods, discrepancies may occur between the protocol and the publication. The aims were to estimate the proportion of systematic reviews of physical therapy interventions that are registered, to compare the methodological quality of registered and unregistered systematic reviews, and to determine the prevalence of outcome reporting bias in registered systematic reviews. A random sample of 150 systematic reviews published in 2015 and indexed in the PEDro database was drawn; we included systematic reviews written in English, Italian, Portuguese and Spanish. A validated checklist for assessing the methodological quality of systematic reviews was used. Relative risk was calculated to explore the association between meta-analysis results and changes in the outcomes. Twenty-nine (19%) systematic reviews were registered. Funding and publication in a journal with an impact factor higher than 5.0 were associated with registration. Registered systematic reviews demonstrated significantly higher methodological quality (median=8) than unregistered systematic reviews (median=5). Nine (31%) registered systematic reviews demonstrated discrepancies between protocol and publication, with no evidence that such discrepancies were applied to favor the statistical significance of the intervention (RR=1.16; 95% CI: 0.63-2.12). A low proportion of systematic reviews in the physical therapy field are registered. The registered systematic reviews showed high methodological quality without evidence of outcome reporting bias. Further strategies should be implemented to encourage registration. Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Publicado por Elsevier Editora Ltda. All rights reserved.

  9. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226
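
    As a toy illustration of the idea (not the authors' likelihood-based method): at sites an array genotypes as homozygous reference, the alternate-allele read fraction grows roughly linearly with the population allele frequency when a second sample contaminates the data, so a regression slope gives a rough contamination estimate. All inputs are simulated:

    ```python
    # Toy contamination estimate (illustrative only): simulate read counts at
    # sites genotyped homozygous-reference, then recover the contamination
    # fraction as the slope of alt-read fraction vs. population allele frequency.
    import numpy as np

    rng = np.random.default_rng(4)
    alpha_true, error = 0.03, 0.002             # 3% contamination, base error rate
    freq = rng.uniform(0.05, 0.95, size=5000)   # population alt allele frequencies
    depth = rng.poisson(40, size=5000) + 1      # read depth per site
    alt_frac = rng.binomial(depth, error + alpha_true * freq) / depth

    slope = np.polyfit(freq, alt_frac, 1)[0]    # expected alt fraction ~ e + alpha*f
    print(f"estimated contamination fraction ~ {slope:.3f}")
    ```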

  10. Health effects of indebtedness: a systematic review

    PubMed Central

    2014-01-01

    Background In the aftermath of the global financial crisis, millions of households have been left with debts that they are unable to manage. Indebtedness may impair the wellbeing of those affected by it for years to come. This systematic review focuses on the long-term consequences of indebtedness on health. Methods The method used in the paper is a systematic review. First, bibliographic databases were searched for peer-reviewed articles. Second, the references and citations of the included articles were searched for additional articles. Results The results from our sample of 33 peer-reviewed studies demonstrate serious health effects related to indebtedness. Individuals with unmet loan payments had suicidal ideation and suffered from depression more often than those without such financial problems. Unpaid financial obligations were also related to poorer subjective health and health-related behaviour. Debt counselling and other programmes to mitigate debt-related stress are needed to alleviate the adverse effects of indebtedness on health. Conclusions The results demonstrate that indebtedness has serious effects on health. PMID:24885280

  11. Analysis of the torsional storage modulus of human hair and its relation to hair morphology and cosmetic processing.

    PubMed

    Wortmann, Franz J; Wortmann, Gabriele; Haake, Hans-Martin; Eisfeld, Wolf

    2014-01-01

    Through measurements of three different hair samples (virgin and treated) by the torsional pendulum method (22°C, 22% RH), a systematic decrease of the torsional storage modulus G' with increasing fiber diameter, i.e., polar moment of inertia, is observed. G' is therefore not a material constant for hair. This change of G' implies a systematic component of data variance, which significantly contributes to the limitations of the torsional method for cosmetic claim support. Fitting the data on the basis of a core/shell model for cortex and cuticle makes it possible to separate this systematic component of variance and to greatly enhance the discriminative power of the test. The fitting procedure also provides values for the torsional storage moduli of the morphological components, confirming that the cuticle modulus is substantially higher than that of the cortex. The results give consistent insight into the changes imparted to the morphological components by the cosmetic treatments.

  12. Accuracy of LightCycler® SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol.

    PubMed

    Dark, Paul; Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Eligible studies: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Review methods: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. The bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered. Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration: PROSPERO-NIHR Prospective Register of Systematic Reviews (CRD42011001289).

  13. A Critical Assessment of Bias in Survey Studies Using Location-Based Sampling to Recruit Patrons in Bars

    PubMed Central

    Morrison, Christopher; Lee, Juliet P.; Gruenewald, Paul J.; Marzell, Miesha

    2015-01-01

    Location-based sampling is a method for obtaining samples of people within ecological contexts relevant to specific public health outcomes. Random selection increases generalizability; however, in some circumstances (such as surveying bar patrons) recruitment conditions increase the risk of sample bias. We attempted to recruit representative samples of bars and patrons in six California cities, but low response rates precluded meaningful analysis. A systematic review of 24 similar studies revealed that none addressed the key shortcomings of our study. We recommend steps to improve studies that use location-based sampling: (i) purposively sample places of interest, (ii) utilize recruitment strategies appropriate to the environment, and (iii) provide full information on response rates at all levels of sampling. PMID:26574657

  14. Systematic on-site monitoring of compliance dust samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grayson, R.L.; Gandy, J.R.

    1996-12-31

    Maintaining compliance with U.S. respirable coal mine dust standards can be difficult on high-productivity longwall panels. Comprehensive and systematic analysis of compliance dust sample data, coupled with access to the U.S. Bureau of Mines (USBM) DUSTPRO expert advisory system, can yield important information for use in maintaining compliance. The objective of this study was to develop and apply customized software for the collection, storage, modification, and analysis of respirable dust data while providing for flexible export of data and linking with the USBM's expert advisory system on dust control. An executable, IBM-compatible software package was created and customized for use by the person in charge of collecting, submitting, analyzing, and monitoring respirable dust compliance samples. Both descriptive statistics and multiple regression analysis were incorporated. The software allows ASCII files to be exported and links directly with DUSTPRO. After development and validation of the software, longwall compliance data from two different mines were analyzed to evaluate the value of the software. Data included variables on respirable dust concentration, tons produced, the existence of roof/floor rock (dummy variable), and the sampling cycle (dummy variables). Because of confidentiality, specific data are not presented, only the equations and ANOVA tables. The final regression models explained 83.8% and 61.1% of the variation in the data for the two panels. Important correlations among variables within sampling cycles showed the value of using dummy variables for sampling cycles. The software proved flexible and fast for its intended use. The insights obtained from its use improved the systematic monitoring of respirable dust compliance data, especially for pinpointing the most effective dust control methods during specific sampling cycles.

  15. Experimental assessment of the purity of α-cellulose produced by variations of the Brendel method: Implications for stable isotope (δ13C, δ18O) dendroclimatology

    NASA Astrophysics Data System (ADS)

    Brookman, Tom; Whittaker, Thomas

    2012-09-01

    Stable isotope dendroclimatology using α-cellulose has unique potential to deliver multimillennial-scale, sub-annually resolved, terrestrial climate records. However, lengthy processing and analytical methods often preclude such reconstructions. Variants of the Brendel extraction method have reduced these limitations, providing fast, easy methods of isolating α-cellulose in some species. Here, we investigate application of Standard Brendel (SBrendel) variants to resinous softwoods by treating samples of kauri (Agathis australis), ponderosa pine (Pinus ponderosa) and huon pine (Lagarostrobos franklinii), varying reaction vessel, temperature, boiling time and reagent volume. Numerous samples were visibly 'under-processed', and Fourier transform infrared (FTIR) spectroscopic investigation showed absorption peaks at 1520 cm^-1 and ~1600 cm^-1 in those fibers, suggesting residual lignin and retained resin, respectively. Replicate analyses of all samples processed at high temperature yielded consistent δ13C and δ18O despite color and spectral variations. Spectra and isotopic data revealed that α-cellulose δ13C can be altered during processing, most likely due to chemical contamination from insufficient acetone removal, but is not systematically affected by methodological variation. Reagent amount, temperature and extraction time all influence δ18O, however, and our results demonstrate that different species may require different processing methods. FTIR prior to isotopic analysis is a fast and cost-effective way to determine α-cellulose extract purity. Furthermore, a systematic isotopic test such as the one we present here can also determine the sensitivity of isotopic values to methodological variables. Without these tests, isotopic variability introduced by the method could obscure or 'create' climatic signals within a data set.

  16. A direct immersion solid-phase microextraction gas chromatography/mass spectrometry method for the simultaneous detection of levamisole and minor cocaine congeners in hair samples from chronic abusers.

    PubMed

    Fucci, Nadia; Gambelunghe, Cristiana; Aroni, Kyriaki; Rossi, Riccardo

    2014-12-01

    Because levamisole has been increasingly found as a component of illicit drugs, a robust method to detect its presence in hair samples is needed. However, no systematic research on the detection of levamisole in hair samples has been published. The method presented here uses direct immersion solid-phase microextraction coupled with gas chromatography and mass spectrometry (DI-SPME-GC/MS) to detect levamisole and minor cocaine congeners in hair samples using a single-extraction method. Fifty hair samples taken in the last 4 years were obtained from cocaine abusers, along with controls taken from drug-free volunteers. Sampling was performed using direct immersion with a 30-μm polydimethylsiloxane fused silica/stainless steel fiber. Calibration curves were prepared by adding known amounts of analytes and deuterated internal standards to the hair samples taken from drug-free volunteers. This study focused on the adulterant levamisole and some minor cocaine congeners (tropococaine, norcocaine, and cocaethylene). Levamisole was detected in 38% of the hair samples analyzed; its concentration ranged from 0.2 to 0.8 ng/mg. The limit of quantification and limit of detection for levamisole, tropococaine, norcocaine, and cocaine were 0.2 and 0.1 ng/mg, respectively. DI-SPME-GC/MS is a sensitive and specific method to detect the presence of levamisole and cocaine congeners in hair samples.
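
    Quantification in methods like this one typically runs through an internal-standard calibration curve; a sketch with hypothetical peak-area ratios and spiking levels (not the paper's data):

    ```python
    # Internal-standard calibration: fit analyte/IS peak-area ratios of spiked
    # calibrators against concentration, then quantify an unknown. Values are
    # hypothetical.
    import numpy as np

    conc = np.array([0.2, 0.4, 0.8, 1.6, 3.2])              # levamisole, ng/mg hair
    area_ratio = np.array([0.11, 0.21, 0.43, 0.85, 1.70])   # analyte / deuterated IS

    slope, intercept = np.polyfit(conc, area_ratio, 1)

    unknown_ratio = 0.35
    estimate = (unknown_ratio - intercept) / slope
    print(f"estimated concentration = {estimate:.2f} ng/mg")
    ```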

  17. Metadynamics for training neural network model chemistries: A competitive assessment

    NASA Astrophysics Data System (ADS)

    Herr, John E.; Yao, Kun; McIntyre, Ryker; Toth, David W.; Parkhill, John

    2018-06-01

    Neural network model chemistries (NNMCs) promise to facilitate the accurate exploration of chemical space and simulation of large reactive systems. One important path to improving these models is to add layers of physical detail, especially long-range forces. At short range, however, these models are data driven and data limited. Little is systematically known about how data should be sampled, and "test data" chosen randomly from some sampling techniques can provide poor information about generality. If the sampling method is narrow, "test error" can appear encouragingly tiny while the model fails catastrophically elsewhere. In this manuscript, we competitively evaluate two common sampling methods, molecular dynamics (MD) and normal-mode sampling, against one uncommon alternative, metadynamics (MetaMD), for preparing training geometries. We show that MD is an inefficient sampling method in the sense that additional samples do not improve generality. We also show that MetaMD is easily implemented in any NNMC software package with a cost that scales linearly with the number of atoms in a sample molecule. MetaMD is a black-box way to ensure samples always reach out to new regions of chemical space while remaining relevant to chemistry near k_BT. It is a cheap tool to address the issue of generalization.
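
    To illustrate the core of the MetaMD idea in miniature, here is a hedged, self-contained 1D sketch: overdamped Langevin dynamics on a double-well potential, with Gaussian hills deposited along the trajectory so the accumulating bias pushes the walker into unvisited regions. The potential, hill parameters, and time step are illustrative choices, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(0)

      def grad_U(x):                        # double-well potential U(x) = (x**2 - 1)**2
          return 4.0 * x * (x**2 - 1.0)

      hills = []                            # centers of deposited Gaussian hills
      height, width = 0.1, 0.2              # hill height and width (illustrative)

      def grad_bias(x):                     # gradient of the history-dependent bias
          g = 0.0
          for c in hills:
              g -= height * (x - c) / width**2 * np.exp(-(x - c)**2 / (2 * width**2))
          return g

      x, dt, kT = -1.0, 1e-3, 0.2
      for step in range(50_000):            # overdamped Langevin dynamics
          force = -grad_U(x) - grad_bias(x)
          x += force * dt + np.sqrt(2.0 * kT * dt) * rng.standard_normal()
          if step % 250 == 0:
              hills.append(x)               # deposit a hill at the current position

      # The accumulated bias fills visited basins, so the walker crosses barriers
      # that plain MD at this kT would rarely surmount.
      print(f"hills deposited: {len(hills)}, final position: {x:.2f}")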

  18. Quantifying and Mitigating the Effect of Preferential Sampling on Phylodynamic Inference

    PubMed Central

    Karcher, Michael D.; Palacios, Julia A.; Bedford, Trevor; Suchard, Marc A.; Minin, Vladimir N.

    2016-01-01

    Phylodynamics seeks to estimate effective population size fluctuations from molecular sequences of individuals sampled from a population of interest. One way to accomplish this task formulates an observed sequence data likelihood by exploiting a coalescent model for the sampled individuals’ genealogy and then integrating over all possible genealogies via Monte Carlo or, less efficiently, by conditioning on one genealogy estimated from the sequence data. However, when analyzing sequences sampled serially through time, current methods implicitly assume either that sampling times are fixed deterministically by the data collection protocol or that their distribution does not depend on the size of the population. Through simulation, we first show that, when sampling times do probabilistically depend on effective population size, estimation methods may be systematically biased. To correct for this deficiency, we propose a new model that explicitly accounts for preferential sampling by modeling the sampling times as an inhomogeneous Poisson process dependent on effective population size. We demonstrate that in the presence of preferential sampling our new model not only reduces bias, but also improves estimation precision. Finally, we compare the performance of the currently used phylodynamic methods with our proposed model through clinically relevant, seasonal human influenza examples. PMID:26938243
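
    The proposed model treats sampling times as an inhomogeneous Poisson process whose intensity tracks effective population size. A minimal simulation of such preferential sampling, using the standard thinning algorithm with an assumed seasonal N_e(t) and proportionality constant beta (both hypothetical, not the paper's), looks like this:

      import numpy as np

      rng = np.random.default_rng(1)

      def N_e(t):                           # hypothetical seasonal effective population size
          return 50.0 + 40.0 * np.sin(2.0 * np.pi * t)

      beta = 0.5                            # sampling intensity per unit N_e -- hypothetical
      lam = lambda t: beta * N_e(t)         # Poisson intensity lambda(t)
      lam_max = beta * 90.0                 # upper bound on lambda(t) over the window

      T, t, times = 5.0, 0.0, []
      while True:
          t += rng.exponential(1.0 / lam_max)      # candidate event from bounding process
          if t > T:
              break
          if rng.uniform() < lam(t) / lam_max:     # keep with probability lambda(t)/lam_max
              times.append(t)

      # Sampling times pile up when N_e(t) is large -- exactly the dependence
      # the preferential-sampling model accounts for.
      print(len(times), np.round(times[:5], 3))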

  19. Errors in causal inference: an organizational schema for systematic error and random error.

    PubMed

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    Our aim is to provide an organizational schema for systematic error and random error in estimating causal measures, clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events, and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic error result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Evidence-based practice: extending the search to find material for the systematic review

    PubMed Central

    Helmer, Diane; Savoie, Isabelle; Green, Carolyn; Kazanjian, Arminée

    2001-01-01

    Background: Cochrane-style systematic reviews increasingly require the participation of librarians. Guidelines on the appropriate search strategy to use for systematic reviews have been proposed. However, research evidence supporting these recommendations is limited. Objective: This study investigates the effectiveness of various systematic search methods used to uncover randomized controlled trials (RCTs) for systematic reviews. Effectiveness is defined as the proportion of relevant material uncovered for the systematic review using extended systematic review search methods. The following extended systematic search methods are evaluated: searching subject-specific or specialized databases (including trial registries), hand searching, scanning reference lists, and communicating personally. Methods: Two systematic review projects were prospectively monitored regarding the method used to identify items as well as the type of items retrieved. The proportion of RCTs identified by each systematic search method was calculated. Results: The extended systematic search methods uncovered 29.2% of all items retrieved for the systematic reviews. The search of specialized databases was the most effective method, followed by scanning of reference lists, communicating personally, and hand searching. Although the number of items identified through hand searching was small, these unique items would otherwise have been missed. Conclusions: Extended systematic search methods are effective tools for uncovering material for the systematic review. The quality of the items uncovered has yet to be assessed and will be key in evaluating the value of the systematic search methods. PMID:11837256

  1. MAPPING THE GALAXY COLOR–REDSHIFT RELATION: OPTIMAL PHOTOMETRIC REDSHIFT CALIBRATION STRATEGIES FOR COSMOLOGY SURVEYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, Daniel; Steinhardt, Charles; Faisst, Andreas

    2015-11-01

    Calibrating the photometric redshifts of ≳10⁹ galaxies for upcoming weak lensing cosmology experiments is a major challenge for the astrophysics community. The path to obtaining the required spectroscopic redshifts for training and calibration is daunting, given the anticipated depths of the surveys and the difficulty in obtaining secure redshifts for some faint galaxy populations. Here we present an analysis of the problem based on the self-organizing map, a method of mapping the distribution of data in a high-dimensional space and projecting it onto a lower-dimensional representation. We apply this method to existing photometric data from the COSMOS survey selected to approximate the anticipated Euclid weak lensing sample, enabling us to robustly map the empirical distribution of galaxies in the multidimensional color space defined by the expected Euclid filters. Mapping this multicolor distribution lets us determine where—in galaxy color space—redshifts from current spectroscopic surveys exist and where they are systematically missing. Crucially, the method lets us determine whether a spectroscopic training sample is representative of the full photometric space occupied by the galaxies in a survey. We explore optimal sampling techniques and estimate the additional spectroscopy needed to map out the color–redshift relation, finding that sampling the galaxy distribution in color space in a systematic way can efficiently meet the calibration requirements. While the analysis presented here focuses on the Euclid survey, similar analysis can be applied to other surveys facing the same calibration challenge, such as DES, LSST, and WFIRST.
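
    For readers unfamiliar with the technique, the following is a minimal self-organizing map training loop on mock multi-band "colors". The map size, decay schedules, and data are illustrative stand-ins, not the COSMOS/Euclid pipeline.

      import numpy as np

      rng = np.random.default_rng(2)
      colors = rng.normal(size=(5000, 8))           # mock 8-band color vectors

      m, n = 20, 20                                 # map dimensions
      weights = rng.normal(size=(m * n, colors.shape[1]))
      grid = np.array([(i, j) for i in range(m) for j in range(n)], dtype=float)

      n_iter = 20_000
      for it in range(n_iter):
          x = colors[rng.integers(len(colors))]
          bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
          frac = it / n_iter
          lr = 0.5 * (1.0 - frac)                             # decaying learning rate
          sigma = max(0.5, 5.0 * (1.0 - frac))                # shrinking neighborhood
          d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
          h = np.exp(-d2 / (2.0 * sigma**2))                  # neighborhood kernel on the grid
          weights += lr * h[:, None] * (x - weights)

      # Each cell now summarizes a region of color space; mapping a spectroscopic
      # sample onto the cells reveals which regions lack training redshifts.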

  2. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment:. NuSOnG

    NASA Astrophysics Data System (ADS)

    Adams, T.; Batra, P.; Bugel, L.; Camilleri, L.; Conrad, J. M.; de Gouvêa, A.; Fisher, P. H.; Formaggio, J. A.; Jenkins, J.; Karagiorgi, G.; Kobilarcik, T. R.; Kopp, S.; Kyle, G.; Loinaz, W. A.; Mason, D. A.; Milner, R.; Moore, R.; Morfín, J. G.; Nakamura, M.; Naples, D.; Nienaber, P.; Olness, F. I.; Owens, J. F.; Pate, S. F.; Pronin, A.; Seligman, W. G.; Shaevitz, M. H.; Schellman, H.; Schienbein, I.; Syphers, M. J.; Tait, T. M. P.; Takeuchi, T.; Tan, C. Y.; van de Water, R. G.; Yamamoto, R. K.; Yu, J. Y.

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parametrized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at "Beyond the Standard Model" physics.

  3. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. Here, the statistical selection method is extended to take into consideration variations in the amount of time used to evaluate heuristics on a problem instance. The resulting performance improvement is presented under this more realistic assumption, along with methods that alleviate the additional complexity.

  4. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

    The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore this sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the ADNI data to evaluate the effects of the case-control sampling scheme on GWAS results obtained with standard statistical methods, such as linear regression, and compare them with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    PubMed

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, separately and in combination. The standard one-start aligned systematic survey design, a uniform 10 × 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying that transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators that corrected for inter-transect correlation (ν₈ and ν_W) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with a second, differently generated set of spatial point populations, ν₈ and ν_W again being the best performers in the longer-range autocorrelated populations. However, none of the systematic variance estimators tested was free from bias. On balance, systematic designs give narrower confidence intervals in clustered populations, while random designs permit unbiased estimates of (often wider) confidence intervals. The search continues for better estimators of sampling variance for the systematic survey mean.
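
    The paper's central comparison is easy to reproduce in miniature: simulate a clustered 1D population, then compare the variance of the sample mean under simple random sampling and one-start aligned systematic sampling. Everything below (patch model, transect counts, sample sizes) is an illustrative construction, not the authors' simulation.

      import numpy as np

      rng = np.random.default_rng(3)

      N = 1000                                      # transect positions around the region
      idx = np.arange(N)
      density = np.zeros(N)
      for c in rng.choice(N, size=10, replace=False):   # 10 habitat-patch centers
          d = np.abs(idx - c)
          d = np.minimum(d, N - d)                  # wrap-around distance
          density += 20.0 * np.exp(-d**2 / (2.0 * 15.0**2))
      counts = rng.poisson(density)                 # organism counts per position

      n = 100                                       # transects per survey
      k = N // n                                    # systematic sampling interval
      rand_means, sys_means = [], []
      for _ in range(10_000):
          rand_means.append(counts[rng.choice(N, n, replace=False)].mean())
          start = rng.integers(k)                   # one random start, aligned grid
          sys_means.append(counts[start::k].mean())

      print(np.var(rand_means), np.var(sys_means))  # systematic variance is typically smaller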

  6. Two-Year Systematic Study To Assess Norovirus Contamination in Oysters from Commercial Harvesting Areas in the United Kingdom

    PubMed Central

    Gustar, Nicole E.; Powell, Andrew L.; Hartnell, Rachel E.; Lees, David N.

    2012-01-01

    The contamination of bivalve shellfish with norovirus from human fecal sources is recognized as an important human health risk. Standardized quantitative methods for the detection of norovirus in molluscan shellfish are now available, and viral standards are being considered in the European Union and internationally. This 2-year systematic study aimed to investigate the impact of the application of these methods to the monitoring of norovirus contamination in oyster production areas in the United Kingdom. Twenty-four monthly samples of oysters from 39 United Kingdom production areas, chosen to represent a range of potential contamination risk, were tested for norovirus genogroups I and II by using a quantitative real-time reverse transcription (RT)-PCR method. Norovirus was detected in 76.2% (643/844) of samples, with all sites returning at least one positive result. Both prevalences (presence or absence) and norovirus levels varied markedly between sites. However, overall, a marked winter seasonality of contamination by both prevalence and quantity was observed. Correlations were found between norovirus contamination and potential risk indicators, including harvesting area classifications, Escherichia coli scores, and environmental temperatures. A predictive risk score for norovirus contamination was developed by using a combination of these factors. In summary, this study, the largest of its type undertaken to date, provides a systematic analysis of norovirus contamination in commercial oyster production areas in the United Kingdom. The data should assist risk managers to develop control strategies to reduce the risk of human illness resulting from norovirus contamination of bivalve molluscs. PMID:22685151

  7. Family Risk Factors and Prevalence of Dissociative Symptoms among Homeless and Runaway Youth

    ERIC Educational Resources Information Center

    Tyler, Kimberly A.; Cauce, Ana Mari; Whitbeck, Les

    2004-01-01

    Objective: To examine family risk factors associated with dissociative symptoms among homeless and runaway youth. Method: Three hundred and twenty-eight homeless and runaway youth were interviewed using a systematic sampling strategy in metropolitan Seattle. Homeless young people were interviewed on the streets and in shelters by outreach workers…

  8. Rater Perceptions of Bias Using the Multiple Mini-Interview Format: A Qualitative Study

    ERIC Educational Resources Information Center

    Alweis, Richard L.; Fitzpatrick, Caroline; Donato, Anthony A.

    2015-01-01

    Introduction: The Multiple Mini-Interview (MMI) format appears to mitigate individual rater biases. However, the format itself may introduce structural systematic bias, favoring extroverted personality types. This study aimed to gain a better understanding of these biases from the perspective of the interviewer. Methods: A sample of MMI…

  9. ADHD with Comorbid Oppositional Defiant Disorder or Conduct Disorder: Discrete or Nondistinct Disruptive Behavior Disorders?

    ERIC Educational Resources Information Center

    Connor, Daniel F.; Doerfler, Leonard A.

    2008-01-01

    Objective: In children with ADHD who have comorbid disruptive behavior diagnoses distinctions between oppositional defiant disorder (ODD) and conduct disorder (CD) remain unclear. The authors investigate differences between ODD and CD in a large clinical sample of children with ADHD. Method: Consecutively referred and systematically assessed male…

  10. Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling

    PubMed Central

    Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.

    2012-01-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055

  11. Prevalence of peptic ulcer in Iran: Systematic review and meta-analysis methods

    PubMed Central

    Sayehmiri, Kourosh; Abangah, Ghobad; Kalvandi, Gholamreza; Tavan, Hamed; Aazami, Sanaz

    2018-01-01

    Background: Peptic ulcer is a prevalent problem whose symptoms include epigastric pain and heartburn. This study aimed at investigating the prevalence and causes of peptic ulcer in Iran using systematic review and meta-analysis. Materials and Methods: Eleven Iranian papers published from 2002 to 2016 were selected using valid keywords in the SID, Google Scholar, PubMed, and Elsevier databases. Results of the studies were pooled using a random-effects model in the meta-analysis. Heterogeneity of the sample was checked using the Q test and the I² index. Results: The total sample in this study consisted of 1335 individuals with peptic ulcer (121 samples per article on average). The prevalence of peptic ulcer was estimated at 34% (95% CI = 0.25-0.43). The prevalence of peptic ulcer was 30% in women and 60% in men. The most frequently reported environmental factor, cigarette smoking, was addressed in 30% (95% CI = 0.23-0.37) of patients. The prevalence of Helicobacter pylori infection was estimated at 62% (95% CI = 0.49-0.75) of patients. Conclusion: The results of this study show that the prevalence of peptic ulcer in Iran (34%) is higher than the worldwide rate (6% to 15%). There was an increasing trend in the prevalence of peptic ulcer over the decade from 2002 to 2016. PMID:29456565
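
    As a sketch of the pooling step described in the methods, the following computes a DerSimonian-Laird random-effects pooled prevalence together with Cochran's Q and the I² index. The study proportions and sample sizes are hypothetical placeholders, not the 11 included papers.

      import numpy as np

      p = np.array([0.22, 0.31, 0.45, 0.28, 0.38])   # study prevalences -- hypothetical
      n = np.array([150, 90, 200, 120, 110])         # study sample sizes -- hypothetical

      var = p * (1 - p) / n                          # within-study variance of a proportion
      w = 1.0 / var                                  # fixed-effect weights

      p_fe = np.sum(w * p) / np.sum(w)
      Q = np.sum(w * (p - p_fe) ** 2)                # Cochran's Q heterogeneity statistic
      df = len(p) - 1
      I2 = max(0.0, (Q - df) / Q) * 100              # I^2 index

      C = np.sum(w) - np.sum(w**2) / np.sum(w)       # DerSimonian-Laird tau^2
      tau2 = max(0.0, (Q - df) / C)

      w_re = 1.0 / (var + tau2)                      # random-effects weights
      p_re = np.sum(w_re * p) / np.sum(w_re)
      se = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled {p_re:.2f} (95% CI {p_re - 1.96*se:.2f}-{p_re + 1.96*se:.2f}), I2 {I2:.0f}%")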

  12. CHEERS: The chemical evolution RGS sample

    NASA Astrophysics Data System (ADS)

    de Plaa, J.; Kaastra, J. S.; Werner, N.; Pinto, C.; Kosec, P.; Zhang, Y.-Y.; Mernier, F.; Lovisari, L.; Akamatsu, H.; Schellenberger, G.; Hofmann, F.; Reiprich, T. H.; Finoguenov, A.; Ahoranta, J.; Sanders, J. S.; Fabian, A. C.; Pols, O.; Simionescu, A.; Vink, J.; Böhringer, H.

    2017-11-01

    Context. The chemical yields of supernovae and the metal enrichment of the intracluster medium (ICM) are not well understood. The hot gas in clusters of galaxies has been enriched with metals originating from billions of supernovae and provides a fair sample of large-scale metal enrichment in the Universe. High-resolution X-ray spectra of clusters of galaxies provide a unique way of measuring abundances in the hot ICM. The abundance measurements can provide constraints on the supernova explosion mechanism and the initial-mass function of the stellar population. This paper introduces the CHEmical Enrichment RGS Sample (CHEERS), which is a sample of 44 bright local giant ellipticals, groups, and clusters of galaxies observed with XMM-Newton. Aims: The CHEERS project aims to provide the most accurate set of cluster abundances measured in X-rays using this sample. This paper focuses specifically on the abundance measurements of O and Fe using the reflection grating spectrometer (RGS) on board XMM-Newton. We aim to thoroughly discuss the cluster-to-cluster abundance variations and the robustness of the measurements. Methods: We have selected the CHEERS sample such that the oxygen abundance in each cluster is detected at a level of at least 5σ in the RGS. The dispersive nature of the RGS limits the sample to clusters with sharp surface brightness peaks. The deep exposures and the size of the sample allow us to quantify the intrinsic scatter and the systematic uncertainties in the abundances using spectral modeling techniques. Results: We report the oxygen and iron abundances as measured with RGS in the core regions of all 44 clusters in the sample. We do not find a significant trend of O/Fe as a function of cluster temperature, but we do find an intrinsic scatter in the O and Fe abundances from cluster to cluster. The level of systematic uncertainties in the O/Fe ratio is estimated to be around 20-30%, while the systematic uncertainties in the absolute O and Fe abundances can be as high as 50% in extreme cases. Thanks to the high statistics of the observations, we were able to identify and correct a systematic bias in the oxygen abundance determination that was due to an inaccuracy in the spectral model. Conclusions: The lack of dependence of O/Fe on temperature suggests that the enrichment of the ICM does not depend on cluster mass and that most of the enrichment likely took place before the ICM was formed. We find that the observed scatter in the O/Fe ratio is due to a combination of intrinsic scatter in the source and systematic uncertainties in the spectral fitting, which we are unable to separate. The astrophysical source of intrinsic scatter could be due to differences in active galactic nucleus activity and ongoing star formation in the brightest cluster galaxy. The systematic scatter is due to uncertainties in the spatial line broadening, absorption column, multi-temperature structure, and the thermal plasma models.

  13. A novel synthesis of a new thorium (IV) metal organic framework nanostructure with well controllable procedure through ultrasound assisted reverse micelle method.

    PubMed

    Sargazi, Ghasem; Afzali, Daryoush; Mostafavi, Ali

    2018-03-01

    Reverse micelle (RM) and ultrasound-assisted reverse micelle (UARM) methods were applied to the synthesis of novel thorium nanostructures as metal organic frameworks (MOFs). Characterization with different techniques showed that the Th-MOF sample synthesized by the UARM method had higher thermal stability (354°C), smaller mean particle size (27 nm), and larger surface area (2.02×10³ m²/g). Besides, in this novel approach, the nucleation of crystals was found to occur in a shorter time. The synthesis parameters of the UARM method were designed by a 2^(k−1) factorial design, and the process control was systematically studied using analysis of variance (ANOVA) and response surface methodology (RSM). ANOVA showed that various factors, including surfactant content, ultrasound duration, temperature, ultrasound power, and interactions between these factors, considerably affected different properties of the Th-MOF samples. According to the 2^(k−1) factorial design, the determination coefficient (R²) of the model is 0.999, with no significant lack of fit. The F value of 5432 implied that the model was highly significant and adequate to represent the relationship between the responses and the independent variables; the large adjusted R² value also indicates a good agreement between the experimental data and the fitted model. RSM predicted that it would be possible to produce Th-MOF samples with a thermal stability of 407°C, a mean particle size of 13 nm, and a surface area of 2.20×10³ m²/g. The mechanism controlling the Th-MOF properties was considerably different from the conventional mechanisms. Moreover, the MOF sample synthesized using UARM exhibited a higher capacity for nitrogen adsorption as a result of larger pore sizes. It is believed that the UARM method and the systematic studies developed in the present work can be considered as a new strategy for application to other nanoscale MOF samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Journalists and substance use: A systematic literature review.

    PubMed

    MacDonald, Jasmine B; Saliba, Anthony J; Hodgins, Gene

    2016-01-01

    Journalists' exposure to potentially traumatic events (PTEs), high levels of job stress, and anecdotal reports within the industry seem to suggest that journalists are at greater risk than the general population to experience substance use disorders. The present systematic literature review (SLR) aims to provide a concise, comprehensive, and systematic review of the quantitative literature relating to journalists' experience of substance use. The systematic review method adopted within the present study was based on that prescribed by Fink in the 2010 book, Conducting systematic literature reviews: From the internet to paper, 3rd ed., which contains three main elements: sampling the literature, screening the literature, and extracting data. Alcohol consumption is the most widely studied substance in journalist samples and is discussed in relation to quantity, level of risk, and potential alcoholism. The review also considers journalists' use of substances, including cigarettes, cannabis, and other illicit substances. In particular, comparisons are made between journalistic roles and gender. The research is piecemeal in nature, in that more recent research does not build upon the research that has come before it. Much of what has been reported does not reflect the progress that has taken place in recent years within the alcohol consumption and substance use field in terms of theory, assessment, scale development, practice, and interventions with those who use or are addicted to various substances. This SLR raises a number of methodological and theoretical issues to be explored and addressed in future research.

  15. Mapping Vineyard Leaf Area Using Mobile Terrestrial Laser Scanners: Should Rows be Scanned On-the-Go or Discontinuously Sampled?

    PubMed Central

    del-Moral-Martínez, Ignacio; Rosell-Polo, Joan R.; Company, Joaquim; Sanz, Ricardo; Escolà, Alexandre; Masip, Joan; Martínez-Casasnovas, José A.; Arnó, Jaume

    2016-01-01

    The leaf area index (LAI) is defined as the one-sided leaf area per unit ground area, and is probably the most widely used index to characterize grapevine vigor. However, LAI varies spatially within vineyard plots. Mapping and quantifying this variability is very important for improving management decisions and agricultural practices. In this study, a mobile terrestrial laser scanner (MTLS) was used to map the LAI of a vineyard, and then to examine how different scanning methods (on-the-go or discontinuous systematic sampling) may affect the reliability of the resulting raster maps. The use of the MTLS allows calculating the enveloping vegetative area of the canopy, which is the sum of the leaf wall areas for both sides of the row (excluding gaps) and the projected upper area. Obtaining the enveloping areas requires scanning a one-meter-long section of the row from both sides at each systematic sampling point. By converting the enveloping areas into LAI values, a raster map of the latter can be obtained by spatial interpolation (kriging). However, the user can opt for scanning on-the-go in a continuous way and computing 1-m LAI values along the rows, or instead performing the scanning at discontinuous systematic sampling points within the plot. An analysis of correlation between maps indicated that the MTLS can be used discontinuously in specific sampling sections separated by up to 15 m along the rows. This capability significantly reduces the amount of data to be acquired at field level, as well as the required data storage capacity and computer processing power. PMID:26797618

  16. Mapping Vineyard Leaf Area Using Mobile Terrestrial Laser Scanners: Should Rows be Scanned On-the-Go or Discontinuously Sampled?

    PubMed

    del-Moral-Martínez, Ignacio; Rosell-Polo, Joan R; Company, Joaquim; Sanz, Ricardo; Escolà, Alexandre; Masip, Joan; Martínez-Casasnovas, José A; Arnó, Jaume

    2016-01-19

    The leaf area index (LAI) is defined as the one-sided leaf area per unit ground area, and is probably the most widely used index to characterize grapevine vigor. However, LAI varies spatially within vineyard plots. Mapping and quantifying this variability is very important for improving management decisions and agricultural practices. In this study, a mobile terrestrial laser scanner (MTLS) was used to map the LAI of a vineyard, and then to examine how different scanning methods (on-the-go or discontinuous systematic sampling) may affect the reliability of the resulting raster maps. The use of the MTLS allows calculating the enveloping vegetative area of the canopy, which is the sum of the leaf wall areas for both sides of the row (excluding gaps) and the projected upper area. Obtaining the enveloping areas requires scanning a one-meter-long section of the row from both sides at each systematic sampling point. By converting the enveloping areas into LAI values, a raster map of the latter can be obtained by spatial interpolation (kriging). However, the user can opt for scanning on-the-go in a continuous way and computing 1-m LAI values along the rows, or instead performing the scanning at discontinuous systematic sampling points within the plot. An analysis of correlation between maps indicated that the MTLS can be used discontinuously in specific sampling sections separated by up to 15 m along the rows. This capability significantly reduces the amount of data to be acquired at field level, as well as the required data storage capacity and computer processing power.

  17. Data quality and feasibility of the Experience Sampling Method across the spectrum of severe psychiatric disorders: a protocol for a systematic review and meta-analysis.

    PubMed

    Vachon, Hugo; Rintala, Aki; Viechtbauer, Wolfgang; Myin-Germeys, Inez

    2018-01-18

    Due to a number of methodological advantages and theoretical considerations, more and more studies in clinical psychology research employ the Experience Sampling Method (ESM) as a data collection technique. Despite this growing interest, the absence of methodological guidelines related to the use of ESM has resulted in a large heterogeneity of designs while the potential effects of the design itself on the response behavior of the participants remain unknown. The objectives of this systematic review are to investigate the associations between the design characteristics and the data quality and feasibility of studies relying on ESM in severe psychiatric disorders. We will search for all published studies using ambulatory assessment with patients suffering from major depressive disorder, bipolar disorder, and psychotic disorder or individuals at high risk for these disorders. Electronic database searches will be performed in PubMed and Web of Science with no restriction on the publication date. Two reviewers will independently screen original studies in a title/abstract phase and a full-text phase based on the inclusion criteria. The information related to the design and sample characteristics, data quality, and feasibility will be extracted. We will provide results in terms of a descriptive synthesis, and when applicable, a meta-analysis of the findings will be conducted. Our results will attempt to highlight how the feasibility and data quality of ambulatory assessment might be related to the methodological characteristics of the study designs in severe psychiatric disorders. We will discuss these associations in different subsamples if sufficient data are available and will examine limitations in the reporting of the methods of ambulatory studies in the current literature. The protocol for this systematic review was registered on PROSPERO (PROSPERO 2017: CRD42017060322 ) and is available in full on the University of York website ( http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42017060322 ).

  18. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of systematic effects is performed using new simulation-based methods, including a Monte Carlo sampling of the selection function of the survey.

  19. On-line concentration and determination of all-trans- and 13-cis- retinoic acids in rabbit serum by application of sweeping technique in micellar electrokinetic chromatography.

    PubMed

    Zhao, Yongxi; Kong, Yu; Wang, Bo; Wu, Yayan; Wu, Hong

    2007-03-30

    A simple and rapid micellar electrokinetic chromatography (MEKC) method with UV detection was developed for the simultaneous separation and determination of all-trans- and 13-cis-retinoic acids in rabbit serum using an on-line sweeping concentration technique. The serum sample was simply deproteinized and centrifuged. Various parameters affecting sample enrichment and separation were systematically investigated. Under optimal conditions, the analytes could be well separated within 17 min, and the relative standard deviations (RSD) of migration times and peak areas were less than 3.4%. Compared with the conventional MEKC injection method, 18- and 19-fold improvements in sensitivity were achieved for the two analytes, respectively. The proposed method has been successfully applied to the determination of all-trans- and 13-cis-retinoic acids in serum samples from rabbits and could be feasible for further pharmacokinetic studies of all-trans-retinoic acid.

  20. Novel polymeric monolith materials with a β-cyclodextrin-graphene composite for the highly selective extraction of methyl jasmonate.

    PubMed

    Yu, Xinhong; Ling, Xu; Zou, Li; Chen, Zilin

    2017-04-01

    A novel polymeric monolith column with a β-cyclodextrin-graphene composite was prepared for the extraction of methyl jasmonate. A simple, sensitive, and effective polymeric monolith microextraction method coupled with high-performance liquid chromatography is presented for its determination. To achieve the best microextraction efficiency, several parameters such as sample flow rate, sample volume, and sample pH value were systematically optimized. In addition, method validation showed a wide linear range of 5-2000 ng/mL, with good linearity and low limits of detection for methyl jasmonate. The proposed method was successfully applied to the determination of methyl jasmonate in wintersweet flowers with a recovery of 90.67%. The result was confirmed by high-performance liquid chromatography with mass spectrometry. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Simultaneous determination of 20 pharmacologically active substances in cow's milk, goat's milk, and human breast milk by gas chromatography-mass spectrometry.

    PubMed

    Azzouz, Abdelmonaim; Jurado-Sánchez, Beatriz; Souhail, Badredine; Ballesteros, Evaristo

    2011-05-11

    This paper reports a systematic approach to the development of a method that combines continuous solid-phase extraction and gas chromatography-mass spectrometry for the simultaneous determination of 20 pharmacologically active substances including antibacterials (chloramphenicol, florfenicol, pyrimethamine, thiamphenicol), nonsteroidal anti-inflammatories (diclofenac, flunixin, ibuprofen, ketoprofen, naproxen, mefenamic acid, niflumic acid, phenylbutazone), an antiseptic (triclosan), an antiepileptic (carbamazepine), a lipid regulator (clofibric acid), β-blockers (metoprolol, propranolol), and hormones (17α-ethinylestradiol, estrone, 17β-estradiol) in milk samples. The sample preparation procedure involves deproteination of the milk, followed by sample enrichment and cleanup by continuous solid-phase extraction. The proposed method provides a linear response over the range of 0.6-5000 ng/kg and features limits of detection from 0.2 to 1.2 ng/kg depending on the particular analyte. The method was successfully applied to the determination of pharmacologically active substance residues in food samples including whole, raw, half-skim, skim, and powdered milk from different sources (cow, goat, and human breast).

  2. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    NASA Astrophysics Data System (ADS)

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-12-01

    A simple, accurate, and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental, and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work aims to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interactions, π-π stacking, hydrogen bonding, and electrostatic interactions) for the first time. In addition, application guidelines for supporting materials, surfactants, and sample matrices were also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient, theory-guided pretreatment of trace analytes from environmental, biological, and clinical samples.

  3. Mapping species distributions with MAXENT using a geographically biased sample of presence data: a performance assessment of methods for correcting sampling bias.

    PubMed

    Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, the systematic sampling of records seems to be the most efficient method for correcting sampling bias and can be recommended in most cases.

  4. Mapping Species Distributions with MAXENT Using a Geographically Biased Sample of Presence Data: A Performance Assessment of Methods for Correcting Sampling Bias

    PubMed Central

    Fourcade, Yoan; Engler, Jan O.; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one “virtual” derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, the systematic sampling of records seems to be the most efficient method for correcting sampling bias and can be recommended in most cases. PMID:24818607
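
    The "systematic sampling of records" correction can be as simple as thinning occurrences to at most one record per grid cell, so that heavily surveyed areas no longer dominate model training. A hedged sketch with hypothetical coordinates and an arbitrary 0.5° grid:

      import numpy as np

      rng = np.random.default_rng(4)
      # Hypothetical occurrence records (longitude, latitude), spatially biased
      records = rng.normal(loc=(2.0, 48.0), scale=(1.5, 0.8), size=(500, 2))

      cell = 0.5                                   # grid resolution in degrees
      keys = np.floor(records / cell).astype(int)  # grid-cell index of each record

      kept, seen = [], set()
      for rec, key in zip(records, map(tuple, keys)):
          if key not in seen:                      # first record per cell wins
              seen.add(key)
              kept.append(rec)

      thinned = np.array(kept)
      print(len(records), "->", len(thinned), "records after systematic thinning")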

  5. Comparison of prevalence estimation of Mycobacterium avium subsp. paratuberculosis infection by sampling slaughtered cattle with macroscopic lesions vs. systematic sampling.

    PubMed

    Elze, J; Liebler-Tenorio, E; Ziller, M; Köhler, H

    2013-07-01

    The objective of this study was to identify the most reliable approach for prevalence estimation of Mycobacterium avium ssp. paratuberculosis (MAP) infection in clinically healthy slaughtered cattle. Sampling of macroscopically suspect tissue was compared to systematic sampling. Specimens of ileum, jejunum, and mesenteric and caecal lymph nodes were examined for MAP infection using bacterial microscopy, culture, histopathology and immunohistochemistry. MAP was found most frequently in caecal lymph nodes, but sampling more tissues optimized the detection rate. Examination by culture was most efficient, while combination with histopathology increased the detection rate slightly. MAP was detected in 49/50 animals with macroscopic lesions, representing 1.35% of the slaughtered cattle examined. Of 150 systematically sampled, macroscopically non-suspect cows, 28.7% were infected with MAP. This indicates that the majority of MAP-positive cattle are slaughtered without evidence of macroscopic lesions and before clinical signs occur. For reliable prevalence estimation of MAP infection in slaughtered cattle, systematic random sampling is essential.

  6. GET electronics samples data analysis

    NASA Astrophysics Data System (ADS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G. F.; Pancin, J.; Pedroza, J. L.; Pibernat, J.; Pollacco, E.; Rebii, A.; Roger, T.; Sizun, P.

    2016-12-01

    The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of its time distribution, from the registered output samples.

  7. Novel measuring strategies in neutron interferometry

    NASA Astrophysics Data System (ADS)

    Bonse, Ulrich; Wroblewski, Thomas

    1985-04-01

    Angular misalignment of a sample in a single-crystal neutron interferometer leads to systematic errors in the effective sample thickness and thereby to errors in the determination of the coherent scattering length. The misalignment can be determined, and the errors corrected, by a second measurement at a different angular sample position. Furthermore, a method has been developed that allows the wavelength to be monitored during the measurements. These two techniques were tested by determining the scattering length of copper. A value of bc = 7.66(4) fm was obtained, which is in excellent agreement with previous measurements.

  8. A Green Analytical Method Using Ultrasound in Sample Preparation for the Flow Injection Determination of Iron, Manganese, and Zinc in Soluble Solid Samples by Flame Atomic Absorption Spectrometry

    PubMed Central

    Yebra, M. Carmen

    2012-01-01

    A simple and rapid analytical method was developed for the determination of iron, manganese, and zinc in soluble solid samples. The method is based on continuous ultrasonic water dissolution of the sample (5–30 mg) at room temperature followed by flow injection flame atomic absorption spectrometric determination. Good precision of the whole procedure (1.2–4.6%) and a sample throughput of ca. 25 samples h−1 were obtained. The proposed green analytical method has been successfully applied to the determination of iron, manganese, and zinc in soluble solid food samples (soluble cocoa and soluble coffee) and pharmaceutical preparations (multivitamin tablets). The ranges of concentrations found were 21.4–25.61 μg g−1 for iron, 5.74–18.30 μg g−1 for manganese, and 33.27–57.90 μg g−1 for zinc in soluble solid food samples, and 3.75–9.90 μg g−1 for iron, 0.47–5.05 μg g−1 for manganese, and 1.55–15.12 μg g−1 for zinc in multivitamin tablets. The accuracy of the proposed method was established by comparison with the conventional wet acid digestion method using a paired t-test, indicating the absence of systematic errors. PMID:22567553
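
    The accuracy check described above rests on a paired t-test between the two methods applied to the same samples. A minimal sketch with hypothetical paired concentrations:

      import numpy as np
      from scipy import stats

      proposed = np.array([21.4, 23.0, 25.6, 9.9, 15.1])    # μg/g -- hypothetical
      digestion = np.array([21.1, 23.4, 25.2, 10.1, 14.8])  # μg/g, same samples

      t, p = stats.ttest_rel(proposed, digestion)
      print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05 -> no evidence of systematic error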

  9. A correction method for systematic error in ¹H-NMR time-course data validated through stochastic cell culture simulation.

    PubMed

    Sokolenko, Stanislav; Aucoin, Marc G

    2015-09-04

    The growing ubiquity of metabolomic techniques has facilitated high frequency time-course data collection for an increasing number of applications. While the concentration trends of individual metabolites can be modeled with common curve fitting techniques, a more accurate representation of the data needs to consider effects that act on more than one metabolite in a given sample. To this end, we present a simple algorithm that uses nonparametric smoothing carried out on all observed metabolites at once to identify and correct systematic error from dilution effects. In addition, we develop a simulation of metabolite concentration time-course trends to supplement available data and explore algorithm performance. Although we focus on nuclear magnetic resonance (NMR) analysis in the context of cell culture, a number of possible extensions are discussed. Realistic metabolic data were successfully simulated using a 4-step process. Starting with a set of metabolite concentration time-courses from a metabolomic experiment, each time-course was classified as either increasing, decreasing, concave, or approximately constant. Trend shapes were simulated from generic functions corresponding to each classification. The resulting shapes were then scaled to simulated compound concentrations. Finally, the scaled trends were perturbed using a combination of random and systematic errors. To detect systematic errors, a nonparametric fit was applied to each trend and percent deviations calculated at every time point. Systematic errors could be identified at time points where the median percent deviation exceeded a threshold value, determined by the choice of smoothing model and the number of observed trends. Regardless of model, increasing the number of observations over a time-course resulted in more accurate error estimates, although the improvement was not particularly large between 10 and 20 samples per trend. The presented algorithm was able to identify systematic errors as small as 2.5% under a wide range of conditions. Both the simulation framework and error correction method represent examples of time-course analysis that can be applied to further developments in ¹H-NMR methodology and the more general application of quantitative metabolomics.
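
    A minimal sketch of the flagging logic described above, with a moving average standing in for the nonparametric smoother and synthetic trends carrying a simulated 5% dilution error at one time point; the threshold and all data are illustrative:

      import numpy as np

      rng = np.random.default_rng(5)
      t = np.linspace(0.0, 10.0, 21)
      trends = np.vstack([50 + 5 * t,                   # mock metabolite time-courses
                          100 - 4 * t,
                          80 + 10 * np.sin(t / 3)])
      data = trends * (1 + 0.01 * rng.standard_normal(trends.shape))
      data[:, 12] *= 0.95        # simulated dilution error hits all metabolites at once

      def smooth(y, k=5):        # moving average standing in for a nonparametric fit
          pad = k // 2
          yp = np.pad(y, pad, mode="edge")
          return np.convolve(yp, np.ones(k) / k, mode="valid")

      pct_dev = np.array([(y - smooth(y)) / smooth(y) * 100.0 for y in data])
      median_dev = np.median(pct_dev, axis=0)           # across metabolites, per time point

      threshold = 2.5                                   # percent; depends on noise level
      flagged = np.where(np.abs(median_dev) > threshold)[0]
      print("flagged time points:", flagged)            # should single out index 12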

  10. Development of a Method for the Determination of Chromium and Cadmium in Tannery Wastewater Using Laser-Induced Breakdown Spectroscopy

    PubMed Central

    Bukhari, Mahwish; Awan, M. Ali; Qazi, Ishtiaq A.; Baig, M. Anwar

    2012-01-01

    This paper illustrates the systematic development of a convenient analytical method for the determination of chromium and cadmium in tannery wastewater using laser-induced breakdown spectroscopy (LIBS). A new approach was developed by which the liquid was converted into a solid-phase sample surface using absorption paper for subsequent LIBS analysis. The optimized values of the LIBS parameters were 146.7 mJ for chromium and 89.5 mJ for cadmium (laser pulse energy), 4.5 μs (delay time), 70 mm (lens-to-sample-surface distance), and 7 mm (light-collection-system-to-sample-surface distance). The optimized parameters yielded strong spectral lines for each metal while keeping the background noise at a minimum. The new method of preparing metal standards on absorption papers exhibited calibration curves with good linearity, with correlation coefficients (R²) in the range of 0.992 to 0.998. The developed method was tested on real tannery wastewater samples for the determination of chromium and cadmium. PMID:22567570

  11. Minority carrier diffusion length extraction in Cu2ZnSn(Se,S)4 solar cells

    NASA Astrophysics Data System (ADS)

    Gokmen, Tayfun; Gunawan, Oki; Mitzi, David B.

    2013-09-01

    We report measurements of the minority carrier diffusion length (Ld) for high performance Cu2ZnSn(S,Se)4 (CZTSSe) solar cells in comparison with analogous Cu(In,Ga)(S,Se)2 (CIGSSe) devices. Our Ld extraction method involves systematic measurements of the internal quantum efficiency combined with a separate capacitance-voltage measurement. This method also enables the measurement of the absorption coefficient of the absorber material as a function of wavelength in a finished device. The extracted values of Ld for the CZTSSe samples are at least a factor of 2 smaller than those for the CIGSSe samples. Combined with minority carrier lifetime (τ) data measured by time-resolved photoluminescence, we deduce the minority carrier mobility (μe), which is also relatively low for the CZTSSe samples.
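
    The abstract does not spell out the fitting function, so the sketch below uses one common choice to illustrate the idea: the Gärtner expression IQE = 1 − exp(−αW)/(1 + αLd), with the depletion width W assumed known from the capacitance-voltage measurement. The α and IQE arrays are synthetic, and the authors' actual procedure may differ in detail.

      import numpy as np
      from scipy.optimize import curve_fit

      W = 0.3e-4                                 # depletion width in cm, from C-V (assumed)
      alpha = np.logspace(2, 4, 30)              # absorption coefficient in 1/cm (synthetic)

      def gartner_iqe(a, L_d):                   # Gartner model: 1 - exp(-a*W)/(1 + a*L_d)
          return 1.0 - np.exp(-a * W) / (1.0 + a * L_d)

      rng = np.random.default_rng(6)
      L_true = 0.5e-4                            # 0.5 μm "true" value for this demo
      iqe = gartner_iqe(alpha, L_true) * (1 + 0.01 * rng.standard_normal(alpha.size))

      (L_fit,), _ = curve_fit(gartner_iqe, alpha, iqe, p0=[1e-4])
      print(f"extracted L_d = {L_fit * 1e4:.2f} μm")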

  12. Decomposition of diverse solid inorganic matrices with molten ammonium bifluoride salt for constituent elemental analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Hara, Matthew J.; Kellogg, Cyndi M.; Parker, Cyrena M.

    Ammonium bifluoride (ABF, NH4F·HF) is a well-known reagent for converting metal oxides to fluorides and for its applications in breaking down minerals and ores in order to extract useful components. It has been more recently applied to the decomposition of inorganic matrices prior to elemental analysis. Herein, a sample decomposition method that employs molten ABF sample treatment in the initial step is systematically evaluated across a range of inorganic sample types: glass, quartz, zircon, soil, and pitchblende ore. Method performance is evaluated across two variables: the duration of molten ABF treatment and the ratio of ABF reagent mass to sample mass. The degree of solubilization of these sample classes is compared to the fluoride stoichiometry that is theoretically necessary to enact complete fluorination of the sample types. Finally, the sample decomposition method is performed on several soil and pitchblende ore standard reference materials, after which elemental constituent analysis is performed by ICP-OES and ICP-MS. Elemental recoveries are compared to the certified values; results indicate good to excellent recoveries across a range of alkaline earth, rare earth, transition metal, and actinide elements.

  13. Comparing generalized ensemble methods for sampling of systems with many degrees of freedom

    DOE PAGES

    Lincoff, James; Sasmal, Sukanya; Head-Gordon, Teresa

    2016-11-03

    Here, we compare two standard replica exchange methods using temperature and dielectric constant as the scaling variables for independent replicas against two new corresponding enhanced sampling methods based on non-equilibrium statistical cooling (temperature) or descreening (dielectric). We test the four methods on a rough 1D potential as well as on alanine dipeptide in water, whose relatively small phase space makes it possible to define quantitative convergence metrics. We show that both dielectric methods are inferior to the temperature enhanced sampling methods, and in turn that temperature cool walking (TCW) systematically outperforms the standard temperature replica exchange (TREx) method. We extend our comparison of the TCW and TREx methods to the 5-residue met-enkephalin peptide, for which we evaluate the Kullback-Leibler divergence metric to show that the rate of convergence between two independent trajectories is faster for TCW than for TREx. Finally, we apply the temperature methods to the 42-residue amyloid-β peptide, in which we find non-negligible differences in the disordered ensemble using TCW compared to the standard TREx. All four methods have been made available as software through the OpenMM Omnia software consortium.

  14. Comparing generalized ensemble methods for sampling of systems with many degrees of freedom.

    PubMed

    Lincoff, James; Sasmal, Sukanya; Head-Gordon, Teresa

    2016-11-07

    We compare two standard replica exchange methods using temperature and dielectric constant as the scaling variables for independent replicas against two new corresponding enhanced sampling methods based on non-equilibrium statistical cooling (temperature) or descreening (dielectric). We test the four methods on a rough 1D potential as well as on alanine dipeptide in water, whose relatively small phase space makes it possible to define quantitative convergence metrics. We show that both dielectric methods are inferior to the temperature enhanced sampling methods, and in turn that temperature cool walking (TCW) systematically outperforms the standard temperature replica exchange (TREx) method. We extend our comparison of the TCW and TREx methods to the 5-residue met-enkephalin peptide, for which we evaluate the Kullback-Leibler divergence metric to show that the rate of convergence between two independent trajectories is faster for TCW than for TREx. Finally, we apply the temperature methods to the 42-residue amyloid-β peptide, in which we find non-negligible differences in the disordered ensemble using TCW compared to the standard TREx. All four methods have been made available as software through the OpenMM Omnia software consortium (http://www.omnia.md/).
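    The Kullback-Leibler divergence used above as a convergence metric can be estimated by histogramming the same observable from two independent trajectories onto a common grid. A minimal sketch with toy Gaussian data standing in for simulation output:

        import numpy as np

        def kl_divergence(p_samples, q_samples, bins=50, eps=1e-12):
            # Estimate D_KL(P || Q) from two sets of scalar samples, e.g. a
            # dihedral angle collected along two independent trajectories.
            lo = min(p_samples.min(), q_samples.min())
            hi = max(p_samples.max(), q_samples.max())
            p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
            q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
            p = p / p.sum()
            q = q / q.sum()
            return float(np.sum(p * np.log((p + eps) / (q + eps))))

        rng = np.random.default_rng(0)
        traj_a = rng.normal(0.0, 1.0, 10_000)  # stand-in for trajectory 1
        traj_b = rng.normal(0.1, 1.1, 10_000)  # stand-in for trajectory 2
        print(f"D_KL ~ {kl_divergence(traj_a, traj_b):.4f} "
              "(tends to 0 as the two ensembles converge)")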

  15. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    PubMed

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be identified morphologically by electron microscopy, and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of the synapses in the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), under the strong constraint of doing so in a reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses even when they represented only 0.2% of all the synapses in the neuropil.
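    Simple binomial statistics show why so many sampling sites are needed: estimating a proportion near 0.2% with useful precision pushes the total synapse count into the thousands. A back-of-the-envelope sketch (the per-site synapse count is a made-up illustrative value):

        import math

        p = 0.002               # expected proportion of labeled (rare) synapses
        n_sites = 1000          # disector sampling sites, as in the study
        synapses_per_site = 10  # hypothetical mean synapse count per site

        n = n_sites * synapses_per_site  # total synapses examined
        expected_hits = p * n            # expected number of labeled synapses
        se = math.sqrt(p * (1 - p) / n)  # binomial standard error of p-hat
        half_width = 1.96 * se           # ~95% confidence half-width

        print(f"examined {n} synapses, expect {expected_hits:.0f} labeled")
        print(f"p = {p:.4f} +/- {half_width:.4f} (95% CI)")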

  16. Impact of parasitic thermal effects on thermoelectric property measurements by Harman method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwon, Beomjin, E-mail: bkwon@kist.re.kr; Baek, Seung-Hyub; Keun Kim, Seong

    2014-04-15

    The Harman method is a rapid and simple technique for measuring thermoelectric properties. However, its validity has often been questioned because of the over-simplified assumptions the method relies on. Here, we quantitatively investigate the influence of previously ignored parasitic thermal effects on the Harman method and develop a procedure to determine the intrinsic ZT. We expand the original Harman relation with three extra terms: heat losses via the lead wires, heat losses via radiation, and Joule heating within the sample. Based on the expanded Harman relation, we use differential measurement of the sample geometry to obtain the intrinsic ZT. To evaluate the parasitic terms separately, measured ZTs with systematically varied sample geometries and lead wire types are fitted to the expanded relation. A large discrepancy (∼28%) in the measured ZTs, depending on the measurement configuration, is observed, and we are able to evaluate the parasitic terms separately. This work will help in evaluating intrinsic thermoelectric properties with the Harman method by eliminating ambiguities arising from extrinsic effects.

  17. A Systematic Literature Review: Workplace Violence Against Emergency Medical Services Personnel

    PubMed Central

    Pourshaikhian, Majid; Abolghasem Gorji, Hassan; Aryankhesal, Aidin; Khorasani-Zavareh, Davood; Barati, Ahmad

    2016-01-01

    Context Despite its high prevalence and serious consequences, workplace violence against emergency medical services personnel has been given insufficient attention. A systematic review can aid the development of guidelines to reduce such violence. Objectives The research question addressed by this paper is, "What are the characteristics and findings of studies on workplace violence against emergency medical services personnel?" Data Sources A systematic literature review was conducted using online databases (PubMed, Scopus, Google Scholar, and Magiran) with the help of experienced librarians. Study Selection Inclusion criteria comprised studies in the English or Persian language with the full text accessible to the researchers; there was no restriction on study design. Exclusion criteria included lack of access to the full text of the article, studies published in unreliable journals or conference proceedings, and studies in which the results were pooled with those of other medical or relief groups and could not be broken down. Data Extraction A data extraction form was designed by the researchers based on the goals of the study; it covered the title and author(s), study method (type, place of study, sample size, sampling method, and data collection/analysis tool), place of publication, information on the frequency of types of violence, characteristics of victims/perpetrators, and related factors. Results The reviewed papers covered a variety of locations and settings and used diverse methods and sampling instruments. The majority of the studies were quantitative; no intervention study was found. Most studies focused on the prevalence of violence, and their results indicated that exposure to violence was high. The results are presented in six major themes. Conclusions Workplace violence, and the injuries incurred from it, is extensive throughout the world. Important causes of violence include the shortage of training programs dealing with violence, the lack of violence management protocols, and delays in response times. Therefore, deliberate attention and resolve are more crucial than ever. Workplace violence reduction strategies and suggestions for future studies are also discussed. PMID:27169096

  18. [Developments in preparation and experimental method of solid phase microextraction fibers].

    PubMed

    Yi, Xu; Fu, Yujie

    2004-09-01

    Solid phase microextraction (SPME) is a simple and effective adsorption/desorption technique that concentrates volatile or nonvolatile compounds from liquid samples or from the headspace above samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It offers many advantages, such as a wide linear range, low solvent and sample consumption, short analysis times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium and non-equilibrium treatments. Recent developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, newly developed fabrication techniques such as sol-gel coating, electrodeposition, carbon-based adsorption, and high-temperature epoxy immobilization are presented. The choice of extraction mode, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. A brief outlook for SPME is given at the end.

  19. An efficient probe of the cosmological CPT violation

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Wang, Yuting; Xia, Jun-Qing; Li, Mingzhe; Zhang, Xinmin

    2015-07-01

    We develop an efficient method based on the linear regression algorithm to probe cosmological CPT violation using CMB polarisation data. We validate this method using simulated CMB data and apply it to recent CMB observations. We find that a combined data sample of BICEP1 and BOOMERanG 2003 favours a nonzero isotropic rotation angle at the 2.3σ confidence level, i.e., ᾱ = −3.3° ± 1.4° (68% CL) with systematics included.

  20. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

    We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random sampling, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours" sampling. For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables, although the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables; however, the potential for bias in these variables appears small. Copyright © 2012. Published by Mosby, Inc.
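    The core of this design, drawing repeated samples and testing each against the population with a χ² test, fits in a few lines. The toy population below uses a single invented categorical variable rather than the eight study variables, and assumes scipy is available:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # Toy ED population: one categorical variable (say, 4 triage levels).
        population = rng.choice(4, size=21_662, p=[0.10, 0.40, 0.35, 0.15])
        pop_props = np.bincount(population, minlength=4) / population.size

        n_reps, n_sample, alpha = 1000, 400, 0.05
        n_diff = 0
        for _ in range(n_reps):
            sample = rng.choice(population, size=n_sample, replace=False)
            obs = np.bincount(sample, minlength=4)
            # Chi-square goodness of fit of the sample vs. population proportions
            _, p_val = stats.chisquare(obs, f_exp=pop_props * n_sample)
            n_diff += p_val < alpha

        print(f"{n_diff / n_reps:.1%} of true random samples differed "
              f"from the population (expected about {alpha:.0%})")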

  1. Harsh Corporal Punishment of Yemeni Children: Occurrence, Type and Associations

    ERIC Educational Resources Information Center

    Alyahri, Abdullah; Goodman, Robert

    2008-01-01

    Objective: To examine the occurrence, type and associations of harsh corporal punishment in Yemen. Methods: Caregiver and teacher reports were obtained on 1,196 Yemeni 7-10-year olds obtained by systematic random sampling of children in the 1st to 4th grades of urban and rural schools. Caregivers (86% mothers) reported on disciplinary practices,…

  2. The effects of festival attributes upon perceptions of crowding

    Treesearch

    Matthew Anderson; Deborah Kerstette; Alan Graefe

    1998-01-01

    The primary purpose of this study was to explore the relationship between festival attributes and perceived crowding at a festival site. Visitors to the Northwest Folklife Festival in Seattle, Washington, were chosen by a systematic sampling method to complete an on-site and follow-up survey. These surveys included questions which addressed the determinate attributes...

  3. Colour Doppler and microbubble contrast agent ultrasonography do not improve cancer detection rate in transrectal systematic prostate biopsy sampling.

    PubMed

    Taverna, Gianluigi; Morandi, Giovanni; Seveso, Mauro; Giusti, Guido; Benetti, Alessio; Colombo, Piergiuseppe; Minuti, Francesco; Grizzi, Fabio; Graziotti, Pierpaolo

    2011-12-01

    What's known on the subject? and What does the study add? Transrectal grey-scale ultrasonography-guided prostate biopsy sampling is the standard method for diagnosing prostate cancer (PC) in patients with an increased prostate specific antigen level and/or an abnormal digital rectal examination. Several imaging strategies have been proposed to optimize the diagnostic value of biopsy sampling, although at first biopsy nearly 10-30% of PCs still remain undiagnosed. This study compares the PC detection rate when employing colour Doppler ultrasonography, with or without the injection of SonoVue™ microbubble contrast agent, versus transrectal ultrasonography-guided systematic biopsy sampling. The limited accuracy, sensitivity and specificity, and the additional cost of using the contrast agent do not justify its routine application in PC detection. • To compare the prostate cancer (PC) detection rate employing colour Doppler ultrasonography with or without SonoVue™ contrast agent with that of transrectal ultrasonography-guided systematic biopsy sampling. • A total of 300 patients with negative digital rectal examination and transrectal grey-scale ultrasonography, with PSA values ranging between 2.5 and 9.9 ng/mL, were randomized into three groups: 100 patients (group A) underwent transrectal ultrasonography-guided systematic biopsy sampling; 100 patients (group B) underwent colour Doppler ultrasonography; and 100 patients (group C) underwent colour Doppler ultrasonography before and during the injection of SonoVue™. • Contrast-enhanced targeted biopsies were taken from hypervascularized areas of the peripheral, transitional, apical or anterior prostate zones. • All patients in groups B and C underwent a further 13 systematic prostate biopsies. The cancer detection rate was calculated for each group. • In 88 (29.3%) patients a histological diagnosis of PC was made, whereas 22 (7.4%) patients were diagnosed with high-grade prostatic intraepithelial neoplasia or atypical small acinar proliferation. • No significant differences were found among the three groups in cancer detection rate (P= 0.329). • Additionally, low sensitivity, specificity and accuracy of colour Doppler with or without SonoVue™ contrast agent were found. • The prostate cancer detection rate does not significantly improve with the use of colour Doppler ultrasonography with or without SonoVue™. • Although no adverse effects were noted, the combined use of colour Doppler ultrasonography and SonoVue™ adds cost and increases the mean time needed to take a single prostate biopsy. © 2011 THE AUTHORS. BJU INTERNATIONAL © 2011 BJU INTERNATIONAL.

  4. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review.

    PubMed

    Booth, Andrew

    2016-05-04

    Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real-world setting and thus contribute to the design and further refinement of future interventions. Producing valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies, and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. It concludes that current practice in the information retrieval of qualitative research is underpinned by weak empirical evidence. A trend towards improved transparency of search methods and further evaluation of key search procedures offers the prospect of rapid development of search methods.

  5. Quality Evaluation and Chemical Markers Screening of Salvia miltiorrhiza Bge. (Danshen) Based on HPLC Fingerprints and HPLC-MSn Coupled with Chemometrics.

    PubMed

    Liang, Wenyi; Chen, Wenjing; Wu, Lingfang; Li, Shi; Qi, Qi; Cui, Yaping; Liang, Linjin; Ye, Ting; Zhang, Lanzhen

    2017-03-17

    Danshen, the dried root of Salvia miltiorrhiza Bge., is a widely used, commercially available herbal drug, and the unstable quality of different samples is a current issue. This study focused on a comprehensive and systematic method combining fingerprints and chemical identification with chemometrics for the discrimination and quality assessment of Danshen samples. Twenty-five samples were analyzed by HPLC-PAD and HPLC-MSn. Forty-nine components were identified, and characteristic fragmentation regularities were summarized for further interpretation of bioactive components. Chemometric analyses, including hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, were employed to differentiate samples and clarify the quality differences of Danshen. All three methods consistently divided the samples into three categories, reflecting differences in the quality of the Danshen samples. Analysis of the reasons for this classification revealed that the processing method had a more obvious impact on sample classification than geographical origin: it induced different contents of bioactive compounds and finally led to different qualities. Cryptotanshinone, trijuganone B, and 15,16-dihydrotanshinone I were screened out as markers to distinguish samples prepared by different processing methods. The developed strategy could serve as a reference for the evaluation and discrimination of other traditional herbal medicines.
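    As a rough sketch of the chemometric step, the code below autoscales a peak-area matrix and projects it onto two principal components; sample groupings (e.g. by processing method) would then be read off the score plot. The data are randomly generated placeholders, and scikit-learn is assumed to be available:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Placeholder peak-area matrix: 25 samples x 49 identified components.
        rng = np.random.default_rng(1)
        X = rng.lognormal(mean=0.0, sigma=0.5, size=(25, 49))

        # Autoscale each component, then project onto two principal components.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for i, (pc1, pc2) in enumerate(scores[:5], start=1):
            print(f"sample {i}: PC1 = {pc1:+.2f}, PC2 = {pc2:+.2f}")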

  6. Methodological integrative review of the work sampling technique used in nursing workload research.

    PubMed

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities; however, the work sampling methods used are diverse, making comparison of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in English between 2002 and 2012 reporting on research that used work sampling to examine nursing workload were included. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and of subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods, and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  7. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    PubMed

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 years) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore the cost, of generating DNA sequences from mushrooms and other fungi compared with traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  8. Metasynthesis findings: potential versus reality.

    PubMed

    Finfgeld-Connett, Deborah

    2014-11-01

    Early on, qualitative researchers predicted that metasynthesis research had the potential to significantly push knowledge development forward. More recently, scholars have questioned whether this is actually occurring. To examine this concern, a randomly selected sample of metasynthesis articles was systematically reviewed to identify the types of findings that have been produced. Based on this systematic examination, it appears that findings from metasynthesis investigations might not be reaching their full potential. Metasynthesis investigations frequently result in isolated findings rather than findings in relationship, and opportunities to generate research hypotheses and theoretical models are not always fully realized. With this in mind, methods for moving metasynthesis findings into relationship are discussed. © The Author(s) 2014.

  9. Systematic versus random sampling in stereological studies.

    PubMed

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways to make a random sample of the sections and of the positions to be probed on the sections. Using a card-sampling analogy, one can pick any cards at all out of a deck. This is referred to as independent random sampling, because each card is sampled without reference to the position of the other cards. The other approach is to pick one card at random from within the first interval of a set number of cards, and then take further cards at equal intervals through the rest of the deck; this is systematic random sampling. Systematic sampling along one axis of many biological structures is more efficient than independent random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
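    The card analogy translates directly into code. A minimal sketch contrasting the two schemes, with the deck standing in for sections along one axis of a structure:

        import random

        deck = list(range(200))  # positions along one axis of the structure
        n = 10                   # sample size

        # Independent random sampling: each position drawn without reference
        # to the others (any 10 "cards" from the deck).
        independent = sorted(random.sample(deck, n))

        # Systematic random sampling: one random start within the first
        # interval, then every k-th position thereafter.
        k = len(deck) // n
        start = random.randrange(k)
        systematic = deck[start::k]

        print("independent:", independent)
        print("systematic: ", systematic)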

  10. Immunoassay and antibody microarray analysis of the HUPO Plasma Proteome Project reference specimens: Systematic variation between sample types and calibration of mass spectrometry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haab, Brian B.; Geierstanger, Bernhard H.; Michailidis, George

    2005-08-01

    Four different immunoassay and antibody microarray methods performed at four different sites were used to measure the levels of a broad range of proteins (N = 323 assays; 39, 88, 168, and 28 assays at the respective sites; 237 unique analytes) in the human serum and plasma reference specimens distributed by the Plasma Proteome Project (PPP) of the HUPO. The methods provided a means to (1) assess the level of systematic variation in protein abundances associated with blood preparation methods (serum, citrate-anticoagulated plasma, EDTA-anticoagulated plasma, or heparin-anticoagulated plasma) and (2) evaluate the concentration dependence of MS-based protein identifications from data sets using the HUPO specimens. Some proteins, particularly cytokines, had highly variable concentrations between the different sample preparations, suggesting specific effects of certain anticoagulants on the stability or availability of these proteins. The linkage of antibody-based measurements from 66 different analytes with the combined MS/MS data from 18 different laboratories showed that protein detection and the quality of MS data increased with analyte concentration. The conclusions from these initial analyses are that the optimal blood preparation method varies between analytes and that the discovery of blood proteins by MS can be extended to concentrations below the ng/mL range under certain circumstances. Continued developments in antibody-based methods will further advance the scientific goals of the PPP.

  11. Costing 'healthy' food baskets in Australia - a systematic review of food price and affordability monitoring tools, protocols and methods.

    PubMed

    Lewis, Meron; Lee, Amanda

    2016-11-01

    To undertake a systematic review to determine similarities and differences in metrics and results between recently and/or currently used tools, protocols and methods for monitoring Australian healthy food prices and affordability. Electronic databases of peer-reviewed literature and online grey literature were systematically searched using the PRISMA approach for articles and reports relating to healthy food and diet price assessment tools, protocols, methods and results that utilised retail pricing. National, state, regional and local areas of Australia from 1995 to 2015. Assessment tools, protocols and methods to measure the price of 'healthy' foods and diets. The search identified fifty-nine discrete surveys of 'healthy' food pricing incorporating six major food pricing tools (those used in multiple areas and time periods) and five minor food pricing tools (those used in a single survey area or time period). Analysis demonstrated methodological differences regarding: included foods; reference households; use of availability and/or quality measures; household income sources; store sampling methods; data collection protocols; analysis methods; and results. 'Healthy' food price assessment methods used in Australia lack comparability across all metrics and most do not fully align with a 'healthy' diet as recommended by the current Australian Dietary Guidelines. None have been applied nationally. Assessment of the price, price differential and affordability of healthy (recommended) and current (unhealthy) diets would provide more robust and meaningful data to inform health and fiscal policy in Australia. The INFORMAS 'optimal' approach provides a potential framework for development of these methods.

  12. The Effect of Storage and Extraction Methods on Amplification of Plasmodium falciparum DNA from Dried Blood Spots.

    PubMed

    Schwartz, Alanna; Baidjoe, Amrish; Rosenthal, Philip J; Dorsey, Grant; Bousema, Teun; Greenhouse, Bryan

    2015-05-01

    Extraction and amplification of DNA from dried blood spots (DBS) collected in field studies is commonly used for detection of Plasmodium falciparum. However, there have been few systematic efforts to determine the effects of storage and extraction methods on the sensitivity of DNA amplification. We investigated the effects of storage conditions, length of storage, and DNA extraction methods on amplification via three PCR-based assays using field samples and laboratory controls. Samples stored as DBS for 2 or more years at ambient temperature showed a significant loss of sensitivity that increased with time; after 10 years, only 10% of samples with parasite densities > 1,000 parasites/μL were detectable by nested polymerase chain reaction (PCR). Conversely, DBS and extracted DNA stored at -20°C showed no loss of sensitivity with time. Samples with low parasite densities amplified more successfully with saponin/Chelex than with spin-column-based extraction, though the latter method performed better on samples with higher parasite densities stored for 2 years at ambient temperature. DNA extracted via both methods was stable after 20 freeze-thaw cycles. Our results suggest that DBS should be stored at -20°C or extracted immediately, especially if 2 or more years of storage are anticipated. © The American Society of Tropical Medicine and Hygiene.

  13. The Effect of Storage and Extraction Methods on Amplification of Plasmodium falciparum DNA from Dried Blood Spots

    PubMed Central

    Schwartz, Alanna; Baidjoe, Amrish; Rosenthal, Philip J.; Dorsey, Grant; Bousema, Teun; Greenhouse, Bryan

    2015-01-01

    Extraction and amplification of DNA from dried blood spots (DBS) collected in field studies is commonly used for detection of Plasmodium falciparum. However, there have been few systematic efforts to determine the effects of storage and extraction methods on the sensitivity of DNA amplification. We investigated the effects of storage conditions, length of storage, and DNA extraction methods on amplification via three PCR-based assays using field samples and laboratory controls. Samples stored as DBS for 2 or more years at ambient temperature showed a significant loss of sensitivity that increased with time; after 10 years, only 10% of samples with parasite densities > 1,000 parasites/μL were detectable by nested polymerase chain reaction (PCR). Conversely, DBS and extracted DNA stored at −20°C showed no loss of sensitivity with time. Samples with low parasite densities amplified more successfully with saponin/Chelex than with spin-column-based extraction, though the latter method performed better on samples with higher parasite densities stored for 2 years at ambient temperature. DNA extracted via both methods was stable after 20 freeze-thaw cycles. Our results suggest that DBS should be stored at −20°C or extracted immediately, especially if 2 or more years of storage are anticipated. PMID:25758652

  14. Systematic Methodological Evaluation of a Multiplex Bead-Based Flow Cytometry Assay for Detection of Extracellular Vesicle Surface Signatures

    PubMed Central

    Wiklander, Oscar P. B.; Bostancioglu, R. Beklem; Welsh, Joshua A.; Zickler, Antje M.; Murke, Florian; Corso, Giulia; Felldin, Ulrika; Hagey, Daniel W.; Evertsson, Björn; Liang, Xiu-Ming; Gustafsson, Manuela O.; Mohammad, Dara K.; Wiek, Constanze; Hanenberg, Helmut; Bremer, Michel; Gupta, Dhanu; Björnstedt, Mikael; Giebel, Bernd; Nordin, Joel Z.; Jones, Jennifer C.; EL Andaloussi, Samir; Görgens, André

    2018-01-01

    Extracellular vesicles (EVs) can be harvested from cell culture supernatants and from all body fluids. EVs can be conceptually classified, based on their size and biogenesis, as exosomes and microvesicles. It is now commonly accepted in the field, however, that there is a much higher degree of heterogeneity within these two subgroups than previously thought. For instance, the surface marker profile of EVs is likely dependent on the cell source, the cell’s activation status, and multiple other parameters. In recent years, several new methods and assays to study EV heterogeneity in terms of surface markers have been described, most of them based on flow cytometry. Unfortunately, such methods generally require dedicated instrumentation, are time-consuming, and demand extensive operator expertise for sample preparation, acquisition, and data analysis. In this study, we have systematically evaluated and explored the use of a multiplex bead-based flow cytometric assay which is compatible with most standard flow cytometers and facilitates robust semi-quantitative detection of 37 different potential EV surface markers in one sample simultaneously. First, assay variability, sample stability over time, and dynamic range were assessed, together with the limitations of the assay in terms of the EV input quantity required for detection of differently abundant surface markers. Next, the potential effects of EV origin, sample preparation, and quality of the EV sample on the assay were evaluated. The findings indicate that this multiplex bead-based assay is generally suitable for detecting, quantifying, and comparing EV surface signatures in various sample types, including unprocessed cell culture supernatants, cell culture-derived EVs isolated by different methods, and biological fluids. Furthermore, the use and limitations of this assay to assess heterogeneities in EV surface signatures were explored by combining different sets of detection antibodies in EV samples derived from different cell lines and subsets of rare cells. Taken together, this validated multiplex bead-based flow cytometric assay allows robust, sensitive, and reproducible detection of EV surface marker expression in various sample types in a semi-quantitative way and will be highly valuable for many researchers in the EV field in different experimental contexts.

  15. Systematic Methodological Evaluation of a Multiplex Bead-Based Flow Cytometry Assay for Detection of Extracellular Vesicle Surface Signatures.

    PubMed

    Wiklander, Oscar P B; Bostancioglu, R Beklem; Welsh, Joshua A; Zickler, Antje M; Murke, Florian; Corso, Giulia; Felldin, Ulrika; Hagey, Daniel W; Evertsson, Björn; Liang, Xiu-Ming; Gustafsson, Manuela O; Mohammad, Dara K; Wiek, Constanze; Hanenberg, Helmut; Bremer, Michel; Gupta, Dhanu; Björnstedt, Mikael; Giebel, Bernd; Nordin, Joel Z; Jones, Jennifer C; El Andaloussi, Samir; Görgens, André

    2018-01-01

    Extracellular vesicles (EVs) can be harvested from cell culture supernatants and from all body fluids. EVs can be conceptually classified, based on their size and biogenesis, as exosomes and microvesicles. It is now commonly accepted in the field, however, that there is a much higher degree of heterogeneity within these two subgroups than previously thought. For instance, the surface marker profile of EVs is likely dependent on the cell source, the cell's activation status, and multiple other parameters. In recent years, several new methods and assays to study EV heterogeneity in terms of surface markers have been described, most of them based on flow cytometry. Unfortunately, such methods generally require dedicated instrumentation, are time-consuming, and demand extensive operator expertise for sample preparation, acquisition, and data analysis. In this study, we have systematically evaluated and explored the use of a multiplex bead-based flow cytometric assay which is compatible with most standard flow cytometers and facilitates robust semi-quantitative detection of 37 different potential EV surface markers in one sample simultaneously. First, assay variability, sample stability over time, and dynamic range were assessed, together with the limitations of the assay in terms of the EV input quantity required for detection of differently abundant surface markers. Next, the potential effects of EV origin, sample preparation, and quality of the EV sample on the assay were evaluated. The findings indicate that this multiplex bead-based assay is generally suitable for detecting, quantifying, and comparing EV surface signatures in various sample types, including unprocessed cell culture supernatants, cell culture-derived EVs isolated by different methods, and biological fluids. Furthermore, the use and limitations of this assay to assess heterogeneities in EV surface signatures were explored by combining different sets of detection antibodies in EV samples derived from different cell lines and subsets of rare cells. Taken together, this validated multiplex bead-based flow cytometric assay allows robust, sensitive, and reproducible detection of EV surface marker expression in various sample types in a semi-quantitative way and will be highly valuable for many researchers in the EV field in different experimental contexts.

  16. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool.

    PubMed

    Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H

    2015-01-08

    Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of their conclusions, subsequently impairing the quality of decision making. The aim of this cross-sectional study was to assess the methodological quality of MAs on COPD treatments. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of the primary studies when formulating conclusions, and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflicts of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on the reporting of conflicts of interest and harms, the assessment of publication bias, the prevention of language bias, and the use of appropriate meta-analytic methods.

  17. Endorsement of the New Ecological Paradigm in Systematic and E-Mail Samples of College Students

    ERIC Educational Resources Information Center

    Rideout, Bruce E.; Hushen, Katherine; McGinty, Dawn; Perkins, Stephanie; Tate, Jennifer

    2005-01-01

    As the initial phase of a longitudinal study of environmental perspective in college students, resident student opinion was sampled using the New Ecological Paradigm (NEP) scale administered through systematic alphabetical sampling. Sampling was also carried out by a blanket e-mail distribution of surveys for voluntary response. Results showed…

  18. Hybrid Gibbs Sampling and MCMC for CMB Analysis at Small Angular Scales

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; Wandelt, B. D.; Gorski, K. M.; Huey, G.; O'Dwyer, I. J.; Dickinson, C.; Banday, A. J.; Lawrence, C. R.

    2008-01-01

    A) Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for the "low-L" regime (as demonstrated on WMAP temperature and polarization data). B) We are extending Gibbs sampling to directly propagate uncertainties in both foreground and instrument models into the total uncertainty in cosmological parameters, for the entire range of angular scales relevant for Planck. C) This is made possible by the inclusion of foreground model parameters in the Gibbs sampling and by hybrid MCMC and Gibbs sampling for the low signal-to-noise (high-L) regime. D) Future items to be included in the Bayesian framework: 1) integration with a hybrid likelihood (or posterior) code for cosmological parameters; 2) other uncertainties in instrumental systematics (e.g., beam uncertainties, noise estimation, calibration errors, and others).
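    For readers unfamiliar with the core idea, a Gibbs sampler draws each variable in turn from its conditional distribution given all the others; the CMB application does this at vastly larger scale over sky signal, foreground, and instrument parameters. A minimal two-variable sketch (not the authors' code):

        import numpy as np

        # Gibbs sampling of a bivariate normal with correlation rho: alternate
        # draws from the exact conditionals x|y ~ N(rho*y, 1-rho^2) and y|x.
        rho, n_steps = 0.8, 5000
        rng = np.random.default_rng(7)
        x = y = 0.0
        samples = np.empty((n_steps, 2))
        for i in range(n_steps):
            x = rng.normal(rho * y, np.sqrt(1 - rho**2))  # draw x given y
            y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # draw y given x
            samples[i] = x, y

        print("empirical correlation:",
              round(float(np.corrcoef(samples.T)[0, 1]), 3))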

  19. Characterisation of signal enhancements achieved when utilizing a photon diode in deep Raman spectroscopy of tissue

    PubMed Central

    Vardaki, Martha Z.; Matousek, Pavel; Stone, Nicholas

    2016-01-01

    We characterise the performance of a beam enhancing element ('photon diode') for use in deep Raman spectroscopy (DRS) of biological tissues. The optical component enhances the number of laser photons coupled into a tissue sample by returning escaping photons back into it at the illumination zone. The method is compatible with transmission Raman spectroscopy, a deep Raman spectroscopy concept, and its implementation leads to considerable enhancement of detected Raman photon rates. In the past, the enhancement concept was demonstrated with a variety of samples (pharmaceutical tablets, tissue, etc.), but it was not systematically characterised with biological tissues. In this study, we investigate the enhancing properties of the photon diode in the transmission Raman geometry as a function of a) the depth and b) the optical properties of tissue samples. Liquid tissue phantoms were employed to facilitate systematic variation of the optical properties; these were chosen to mimic the optical properties of human tissues, including breast and prostate. The results show that a photon diode can enhance the Raman signals of tissues by up to a factor of 2.4, although it can also decrease the signals created towards the back of samples that exhibit high scattering or absorption. PMID:27375932

  20. Systematic Evaluation of Non-Uniform Sampling Parameters in the Targeted Analysis of Urine Metabolites by ¹H,¹H 2D NMR Spectroscopy.

    PubMed

    Schlippenbach, Trixi von; Oefner, Peter J; Gronwald, Wolfram

    2018-03-09

    Non-uniform sampling (NUS) allows the accelerated acquisition of multidimensional NMR spectra. The aim of this contribution was the systematic evaluation of the impact of various quantitative NUS parameters on the accuracy and precision of 2D NMR measurements of urinary metabolites. Urine aliquots spiked with varying concentrations (15.6-500.0 µM) of tryptophan, tyrosine, glutamine, glutamic acid, lactic acid, and threonine, which can only be resolved fully by 2D NMR, were used to assess the influence of the sampling scheme, reconstruction algorithm, amount of omitted data points, and seed value on the quantitative performance of NUS in ¹H,¹H-TOCSY and ¹H,¹H-COSY45 NMR spectroscopy. Sinusoidal Poisson-gap sampling and a compressed sensing approach employing the iterative re-weighted least squares method for spectral reconstruction allowed a 50% reduction in measurement time while maintaining sufficient quantitative accuracy and precision for both types of homonuclear 2D NMR spectroscopy. Together with other advances in instrument design, such as state-of-the-art cryogenic probes, the use of 2D NMR spectroscopy in large biomedical cohort studies seems feasible.
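    A simplified sketch of sinusoidally weighted Poisson-gap scheduling is shown below. It follows the general idea (random Poisson-distributed gaps, weighted so that sampling is denser early in the decaying signal) rather than the exact published algorithm, and the tuning loop is an ad hoc device for hitting the requested 50% density:

        import numpy as np

        def poisson_gap_schedule(n_total, n_keep, seed=0):
            # Gaps between sampled increments are Poisson distributed; the
            # sinusoidal weight keeps gaps small early and larger later.
            rng = np.random.default_rng(seed)
            lam = n_total / n_keep - 1.0
            for _ in range(200):  # crude tuning of the gap scale
                points, i = [], 0
                while i < n_total:
                    points.append(i)
                    w = np.sin(np.pi * (i + 0.5) / (2 * n_total))
                    i += 1 + rng.poisson(lam * w)
                if len(points) == n_keep:
                    break
                lam *= 1.02 if len(points) > n_keep else 0.98
            return np.array(points)

        schedule = poisson_gap_schedule(256, 128)  # keep ~50% of 256 increments
        print(len(schedule), "of 256 increments; first ten:", schedule[:10])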

  1. The effectiveness of hydrotherapy in the treatment of social and behavioral aspects of children with autism spectrum disorders: a systematic review

    PubMed Central

    Mortimer, Rachel; Privopoulos, Melinda; Kumar, Saravana

    2014-01-01

    Background Autism spectrum disorders (ASDs) are increasing in prevalence. Children with ASDs present with impairments in social interactions; communication; restricted, repetitive, and stereotyped patterns of behavior, interests, or activities; as well as motor delays. Hydrotherapy is used as a treatment for children with disabilities and motor delays. There have been no systematic reviews conducted on the effectiveness of hydrotherapy in children with ASDs. Aim We aimed to examine the effectiveness of hydrotherapy on social interactions and behaviors in the treatment of children with ASDs. Methods A systematic search of Cochrane, CINAHL, PsycINFO, Embase, MEDLINE®, and Academic Search Premier was conducted. Studies of participants, aged 3–18 years, with ASDs at a high-functioning level were included if they utilized outcome measures assessing social interactions and behaviors through questionnaire or observation. A critical appraisal, using the McMaster Critical Review Form for Quantitative Studies, was performed to assess methodological quality. Results Four studies of varying research design and quality met the inclusion criteria. The participants in these studies were aged between 3–12 years of age. The duration of the intervention ranged from 10–14 weeks, and each study used varied measures of outcome. Overall, all the studies showed some improvements in social interactions or behaviors following a Halliwick-based hydrotherapy intervention. Interpretation Few studies have investigated the effect of hydrotherapy on the social interactions and behaviors of children with ASDs. While there is an increasing body of evidence for hydrotherapy for children with ASDs, this is constrained by small sample size, lack of comparator, crude sampling methods, and the lack of standardized outcome measures. Hydrotherapy shows potential as a treatment method for social interactions and behaviors in children with ASDs. PMID:24520196

  2. MRI-determined liver proton density fat fraction, with MRS validation: Comparison of regions of interest sampling methods in patients with type 2 diabetes.

    PubMed

    Vu, Kim-Nhien; Gilbert, Guillaume; Chalut, Marianne; Chagnon, Miguel; Chartrand, Gabriel; Tang, An

    2016-05-01

    To assess the agreement between published magnetic resonance imaging (MRI)-based regions of interest (ROI) sampling methods, using liver mean proton density fat fraction (PDFF) as the reference standard. This retrospective, institutional review board-approved study was conducted in 35 patients with type 2 diabetes. Liver PDFF was measured by magnetic resonance spectroscopy (MRS) using a stimulated-echo acquisition mode sequence and by MRI using a multiecho spoiled gradient-recalled echo sequence at 3.0T. ROI sampling methods reported in the literature were reproduced, and liver mean PDFF obtained by whole-liver segmentation was used as the reference standard. Intraclass correlation coefficients (ICCs), Bland-Altman analysis, repeated-measures analysis of variance (ANOVA), and paired t-tests were performed. The ICC between MRS and MRI-PDFF was 0.916. Bland-Altman analysis showed excellent intermethod agreement, with a bias of -1.5 ± 2.8%. The repeated-measures ANOVA found no systematic variation of PDFF among the nine liver segments. The correlation between liver mean PDFF and the ROI sampling methods was very good to excellent (0.873 to 0.975). Paired t-tests revealed significant differences (P < 0.05) for ROI sampling methods that exclusively or predominantly sampled the right lobe. Significant correlations with mean PDFF were found for sampling methods that included a higher number of segments, a total area equal to or larger than 5 cm², or both lobes (P = 0.001, 0.023, and 0.002, respectively). MRI-PDFF quantification methods should sample each liver segment in both lobes and include a total surface area equal to or larger than 5 cm² to provide a close estimate of the liver mean PDFF. © 2015 Wiley Periodicals, Inc.
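    The Bland-Altman statistics quoted above (bias and 95% limits of agreement) reduce to a few lines. A sketch with invented paired PDFF values, not patient data:

        import numpy as np

        def bland_altman(a, b):
            # Bias (mean difference) and 95% limits of agreement between
            # two measurement methods applied to the same subjects.
            diff = np.asarray(a) - np.asarray(b)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        mrs = [4.2, 7.8, 15.1, 22.4, 9.6, 12.3]   # MRS PDFF, %
        mri = [5.0, 9.1, 16.8, 24.0, 11.2, 13.9]  # MRI PDFF, %
        bias, lo, hi = bland_altman(mrs, mri)
        print(f"bias = {bias:.1f}%, limits of agreement = [{lo:.1f}%, {hi:.1f}%]")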

  3. Systematic Review of the Use of Dried Blood Spots for Monitoring HIV Viral Load and for Early Infant Diagnosis

    PubMed Central

    Smit, Pieter W.; Sollis, Kimberly A.; Fiscus, Susan; Ford, Nathan; Vitoria, Marco; Essajee, Shaffiq; Barnett, David; Cheng, Ben; Crowe, Suzanne M.; Denny, Thomas; Landay, Alan; Stevens, Wendy; Habiyambere, Vincent; Perriens, Joseph H.; Peeling, Rosanna W.

    2014-01-01

    Background Dried blood spots (DBS) have been used as alternative specimens to plasma to increase access to HIV viral load (VL) monitoring and early infant diagnosis (EID) in remote settings. We systematically reviewed the evidence on the performance of DBS compared to plasma for VL monitoring and EID. Methods and Findings Thirteen peer-reviewed HIV VL publications and five HIV EID papers were included. Depending on the technology and the viral load distribution in the study population, the percentage of DBS samples within 0.5 log of the plasma VL ranged from 52-100%. Because the input sample volume is much smaller in a blood spot, there is a risk of false negatives with DBS. The sensitivity of DBS VL was found to be 78-100% compared to plasma at VLs below 1000 copies/ml, but this increased to 100% at a threshold of 5000 copies/ml. Unlike a plasma VL test, which measures only cell-free HIV RNA, a DBS VL also measures proviral DNA as well as cell-associated RNA, potentially leading to false positive results when using DBS. The systematic review showed that specificity was close to 100% at DBS VLs above 5000 copies/ml, and this threshold would be the most reliable for predicting true virologic failure using DBS. For early infant diagnosis, DBS had a sensitivity of 100% compared to fresh whole blood or plasma in all studies. Conclusions Although limited data are available for EID, DBS offer a highly sensitive and specific sampling strategy to make viral load monitoring and early infant diagnosis more accessible in remote settings. A standardized approach to sampling, storing, and processing DBS samples would be essential to allow successful implementation. Trial Registration PROSPERO Registration #: CRD42013003621. PMID:24603442

  4. Structure of sunspot light bridges in the chromosphere and transition region

    NASA Astrophysics Data System (ADS)

    Rezaei, R.

    2018-01-01

    Context. Light bridges (LBs) are elongated structures with enhanced intensity embedded in sunspot umbra and pores. Aims: We studied the properties of a sample of 60 LBs observed with the Interface Region Imaging Spectrograph (IRIS). Methods: Using IRIS near- and far-ultraviolet spectra, we measured the line intensity, width, and Doppler shift; followed traces of LBs in the chromosphere and transition region (TR); and compared LB parameters with umbra and quiet Sun. Results: There is a systematic emission enhancement in LBs compared to nearby umbra from the photosphere up to the TR. Light bridges are systematically displaced toward the solar limb at higher layers: the amount of the displacement at one solar radius compares well with the typical height of the chromosphere and TR. The intensity of the LB sample compared to the umbra sample peaks at the middle/upper chromosphere where they are almost permanently bright. Spectral lines emerging from the LBs are broader than the nearby umbra. The systematic redshift of the Si IV line in the LB sample is reduced compared to the quiet Sun sample. We found a significant correlation between the line width of ions arising at temperatures from 3 × 104 to 1.5 × 105 K as there is also a strong spatial correlation among the line and continuum intensities. In addition, the intensity-line width relation holds for all spectral lines in this study. The correlations indicate that the cool and hot plasma in LBs are coupled. Conclusions: Light bridges comprise multi-temperature and multi-disciplinary structures extending up to the TR. Diverse heating sources supply the energy and momentum to different layers, resulting in distinct dynamics in the photosphere, chromosphere, and TR.

  5. An automated online turboflow cleanup LC/MS/MS method for the determination of 11 plasticizers in beverages and milk.

    PubMed

    Ates, Ebru; Mittendorf, Klaus; Senyuva, Hamide

    2013-01-01

    An automated sample preparation technique involving cleanup and analytical separation in a single operation, using an online-coupled TurboFlow reversed-phase LC system, is reported. This method eliminates time-consuming sample preparation steps that can be potential sources of cross-contamination in the analysis of plasticizers. Using TurboFlow chromatography, liquid samples were injected directly into the automated system without prior extraction or cleanup. Special cleanup columns enabled specific binding of the target compounds; higher molecular weight compounds, i.e., fats and proteins, and other matrix interferences with different chemical properties were diverted to waste prior to LC/MS/MS. Systematic stepwise method development using this new technology in the food safety area is described. Selection of optimal columns and mobile phases for loading onto the cleanup column, followed by transfer onto the analytical column and MS detection, are critical method parameters. The method was optimized for the assay of 10 phthalates (dimethyl, diethyl, dipropyl, butyl benzyl, diisobutyl, dicyclohexyl, dihexyl, diethylhexyl, diisononyl, and diisododecyl) and one adipate (diethylhexyl) in beverages and milk.

  6. Methods for specifying the target difference in a randomised controlled trial: the Difference ELicitation in TriAls (DELTA) systematic review.

    PubMed

    Hislop, Jenni; Adewuyi, Temitope E; Vale, Luke D; Harrild, Kirsten; Fraser, Cynthia; Gurung, Tara; Altman, Douglas G; Briggs, Andrew H; Fayers, Peter; Ramsay, Craig R; Norrie, John D; Harvey, Ian M; Buckley, Brian; Cook, Jonathan A

    2014-05-01

    Randomised controlled trials (RCTs) are widely accepted as the preferred study design for evaluating healthcare interventions. When the sample size is determined, a (target) difference is typically specified that the RCT is designed to detect. This provides reassurance that the study will be informative, i.e., should such a difference exist, it is likely to be detected with the required statistical precision. The aim of this review was to identify potential methods for specifying the target difference in an RCT sample size calculation. A comprehensive systematic review of the medical and non-medical literature was carried out for methods that could be used to specify the target difference for an RCT sample size calculation. The databases searched were MEDLINE, MEDLINE In-Process, EMBASE, the Cochrane Central Register of Controlled Trials, the Cochrane Methodology Register, PsycINFO, Science Citation Index, EconLit, the Education Resources Information Center (ERIC), and Scopus (for in-press publications); the search period ran from 1966, or the earliest date covered, to between November 2010 and January 2011. Additionally, textbooks addressing the methodology of clinical trials and the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) tripartite guidelines for clinical trials were consulted. A narrative synthesis of methods was produced. Studies that described a method that could be used for specifying an important and/or realistic difference were included. The search identified 11,485 potentially relevant articles from the databases searched. Of these, 1,434 were selected for full-text assessment, and a further nine were identified from other sources. Fifteen clinical trial textbooks and the ICH tripartite guidelines were also reviewed. In total, 777 studies were included, and within them seven methods were identified: anchor, distribution, health economic, opinion-seeking, pilot study, review of the evidence base, and standardised effect size. A variety of methods are therefore available that researchers can use to specify the target difference in an RCT sample size calculation. Appropriate methods may vary depending on the aim (e.g., specifying an important difference versus a realistic difference), the context (e.g., the research question and availability of data), and the underlying framework adopted (e.g., a Bayesian versus a conventional statistical approach). Guidance on the use of each method is given. No single method provides a perfect solution for all contexts.
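    For the standardised effect size method, the familiar normal-approximation formula for a two-arm comparison of means shows how the chosen target difference drives the sample size. A minimal sketch, assuming scipy is available:

        import math
        from scipy import stats

        def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
            # n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * (sigma / delta)^2
            z_a = stats.norm.ppf(1 - alpha / 2)
            z_b = stats.norm.ppf(power)
            return math.ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

        # Target difference of half a standard deviation (effect size 0.5):
        print(n_per_arm(delta=0.5, sigma=1.0))  # about 63 participants per arm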

  7. New Approach for Investigating Reaction Dynamics and Rates with Ab Initio Calculations.

    PubMed

    Fleming, Kelly L; Tiwary, Pratyush; Pfaendtner, Jim

    2016-01-21

    Herein, we demonstrate a convenient approach to systematically investigate chemical reaction dynamics using the metadynamics (MetaD) family of enhanced sampling methods. Using a symmetric SN2 reaction as a model system, we applied infrequent metadynamics, a theoretical framework based on acceleration factors, to quantitatively estimate the rate of reaction from biased and unbiased simulations. A systematic study of the algorithm and its application to chemical reactions was performed by sampling over 5000 independent reaction events. Additionally, we quantitatively reweighted exhaustive free-energy calculations to obtain the reaction potential-energy surface and showed that infrequent metadynamics effectively determines Arrhenius-like activation energies. Exact agreement with unbiased high-temperature kinetics is also shown. The feasibility of using the approach in actual ab initio molecular dynamics calculations is then demonstrated by using Car-Parrinello MD+MetaD to sample the same reaction using only 10-20 calculations of the rare event. Owing to its ease of use and comparatively low computational cost, the approach has extensive potential applications in catalysis, combustion, pyrolysis, and enzymology.
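
    The abstract does not reproduce the rate expression, so the following is a hedged sketch of the acceleration-factor bookkeeping that infrequent-metadynamics-style rate estimates rest on: biased simulation time is rescaled by exp(beta*V_bias) evaluated along the trajectory. The units, the bias time series, and the step size below are illustrative assumptions, not values from the study.

    ```python
    """Rescale biased MetaD time onto unbiased ('real') time."""
    import numpy as np

    kB = 0.0019872041  # kcal/(mol*K), assuming the bias is in kcal/mol

    def rescaled_time(bias_kcal_mol, dt_ps, T=300.0):
        """Sum of dt * exp(beta * V_bias) over the biased trajectory."""
        beta = 1.0 / (kB * T)
        return np.sum(dt_ps * np.exp(beta * np.asarray(bias_kcal_mol)))

    bias = np.random.uniform(0.0, 5.0, size=10000)  # fake V(s,t) samples
    acc = rescaled_time(bias, 1.0) / 10000          # acceleration factor
    print(f"acceleration factor: {acc:.1f}x")
    ```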

  8. Quality by Design: Multidimensional exploration of the design space in high performance liquid chromatography method development for better robustness before validation.

    PubMed

    Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E

    2012-04-06

    Robust HPLC separations lead to fewer analysis failures and better method transfer, as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs in an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Methylxanthines: properties and determination in various objects

    NASA Astrophysics Data System (ADS)

    Andreeva, Elena Yu; Dmitrienko, Stanislava G.; Zolotov, Yurii A.

    2012-05-01

    Published data on the properties and determination of caffeine, theophylline, theobromine and some other methylxanthines in various objects are surveyed and described systematically. Different sample preparation procedures such as liquid extraction from solid matrices and liquid-liquid, supercritical fluid and solid-phase extraction are compared. The key methods of analysis including chromatography, electrophoresis, spectrometry and electrochemical methods are discussed. Examples of methylxanthine determination in plants, food products, energy beverages, pharmaceuticals, biological fluids and natural and waste waters are given. The bibliography includes 393 references.

  10. Experimental study of the continuous casting slab solidification microstructure by the dendrite etching method

    NASA Astrophysics Data System (ADS)

    Yang, X. G.; Xu, Q. T.; Wu, C. L.; Chen, Y. S.

    2017-12-01

    The relationship between the microstructure of the continuous casting slab (CCS) and quality defects of steel products, as well as the evolution and characteristics of the fine equiaxed, columnar and equiaxed zones and the crossed dendrites of the CCS, was systematically investigated in this study. The different microstructures of various CCS samples were revealed. The dendrite etching method proved to be efficient for the analysis of solidified morphologies, which are essential for estimating material characteristics, especially CCS microstructure defects.

  11. Comparison of normalization methods for the analysis of metagenomic gene abundance data.

    PubMed

    Pereira, Mariana Buongermino; Wallroth, Mikael; Jonsson, Viktor; Kristiansson, Erik

    2018-04-20

    In shotgun metagenomics, microbial communities are studied through direct sequencing of DNA without any prior cultivation. By comparing gene abundances estimated from the generated sequencing reads, functional differences between the communities can be identified. However, gene abundance data are affected by high levels of systematic variability, which can greatly reduce statistical power and introduce false positives. Normalization, the process by which systematic variability is identified and removed, is therefore a vital part of the data analysis. A wide range of normalization methods for high-dimensional count data has been proposed, but their performance on the analysis of shotgun metagenomic data has not been evaluated. Here, we present a systematic evaluation of nine normalization methods for gene abundance data. The methods were evaluated through resampling of three comprehensive datasets, creating a realistic setting that preserved the unique characteristics of metagenomic data. Performance was measured in terms of the methods' ability to identify differentially abundant genes (DAGs), correctly calculate unbiased p-values, and control the false discovery rate (FDR). Our results showed that the choice of normalization method has a large impact on the end results. When the DAGs were asymmetrically present between the experimental conditions, many normalization methods had a reduced true positive rate (TPR) and a high false positive rate (FPR). The methods trimmed mean of M-values (TMM) and relative log expression (RLE) had the overall highest performance and are therefore recommended for the analysis of gene abundance data. For larger sample sizes, cumulative sum scaling (CSS) also showed satisfactory performance. This study emphasizes the importance of selecting a suitable normalization method in the analysis of data from shotgun metagenomics. Our results also demonstrate that improper methods may result in unacceptably high levels of false positives, which in turn may lead to incorrect or obfuscated biological interpretation.
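
    For a concrete handle on one of the recommended methods, below is a minimal sketch of RLE ("median-of-ratios") scaling; the count matrix is invented, and this simplified implementation is an illustration of the idea rather than the evaluated software packages.

    ```python
    """RLE size factors: median ratio of each sample to a reference gene profile."""
    import numpy as np

    def rle_size_factors(counts):
        """counts: genes x samples matrix of non-negative counts."""
        counts = np.asarray(counts, dtype=float)
        positive = (counts > 0).all(axis=1)          # genes seen in every sample
        log_geo_mean = np.log(counts[positive]).mean(axis=1)
        ratios = np.log(counts[positive]) - log_geo_mean[:, None]
        return np.exp(np.median(ratios, axis=0))     # one factor per sample

    counts = np.array([[10, 20, 40], [5, 12, 18], [100, 190, 410]])
    print(rle_size_factors(counts))  # divide each sample's counts by its factor
    ```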

  12. [Systematic umbilical cord blood analysis at birth: feasibility and reliability in a French labour ward].

    PubMed

    Ernst, D; Clerc, J; Decullier, E; Gavanier, G; Dupuis, O

    2012-10-01

    At birth, evaluation of neonatal well-being is crucial. It is therefore important to perform umbilical cord blood gas sampling and then to analyze the samples. We wanted to establish the feasibility and reliability of systematic umbilical cord blood sampling in a French labour ward. A study of systematic umbilical cord blood gas analysis was performed retrospectively on 1000 consecutive deliveries. We first established the feasibility of the samples, defined as the ratio of complete cord acid-base data to the number of deliveries of live newborns. We then established the reliability of the remaining cord samples, defined as the ratio of samples fulfilling the quality criteria defined by Westgate et al. and revised by Kro et al. to the number of complete samples from live newborns. Finally, we looked for factors that would influence these results. The feasibility of systematic umbilical cord blood sampling reached 91.6%, and the reliability reached 80.7%. Regarding delivery mode, 38.6% of emergency caesareans (95% CI [30.8-46.3]; P<0.0001) led to non-valid samples, whereas only 11.3% of planned caesareans (95% CI [4.3-18.2]; P<0.0001) did. Umbilical cord blood analyses were thus significantly less often valid during emergency caesareans. Systematic cord blood gas analysis yielded 8.4% incomplete samples and 19.3% uninterpretable ones. Training sessions should be organized to improve feasibility and reliability, especially during emergency caesareans. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  13. Complementary and Alternative Medicine for Cancer Pain: An Overview of Systematic Reviews

    PubMed Central

    Bao, Yanju; Kong, Xiangying; Yang, Liping; Liu, Rui; Shi, Zhan; Li, Weidong; Hua, Baojin; Hou, Wei

    2014-01-01

    Background and Objective. With more and more systematic reviews of Complementary and Alternative Medicine (CAM) for adult cancer pain being published, it is necessary to use overview-of-systematic-reviews methods to summarize the available evidence, appraise its level, and give suggestions for future research and practice. Methods. A comprehensive search (the Cochrane Library, PubMed, Embase, and ISI Web of Knowledge) was conducted to identify all systematic reviews or meta-analyses of CAM for adult cancer pain. The evidence levels were evaluated using the GRADE approach. Results. 27 systematic reviews were included. Based on the available evidence, psychoeducational interventions, music interventions, acupuncture plus drug therapy, Chinese herbal medicine plus cancer therapy, compound kushen injection, reflexology, lycopene, TENS, qigong, cupping, cannabis, Reiki, homeopathy (Traumeel), and creative arts therapies might have beneficial effects on adult cancer pain. No benefits were found for acupuncture (versus drug therapy or sham acupuncture), and the results were inconsistent for massage therapy, transcutaneous electric nerve stimulation (TENS), and Viscum album L plus cancer treatment. However, the evidence levels for these interventions were low or moderate due to high risk of bias and/or small sample size of the primary studies. Conclusion. CAM may be beneficial for alleviating cancer pain, but the evidence levels were found to be low or moderate. Future large, rigorous randomized controlled studies are needed to confirm the benefits of CAM for adult cancer pain. PMID:24817897

  14. Complementary and alternative medicine for cancer pain: an overview of systematic reviews.

    PubMed

    Bao, Yanju; Kong, Xiangying; Yang, Liping; Liu, Rui; Shi, Zhan; Li, Weidong; Hua, Baojin; Hou, Wei

    2014-01-01

    Background and Objective. With more and more systematic reviews of Complementary and Alternative Medicine (CAM) for adult cancer pain being published, it is necessary to use overview-of-systematic-reviews methods to summarize the available evidence, appraise its level, and give suggestions for future research and practice. Methods. A comprehensive search (the Cochrane Library, PubMed, Embase, and ISI Web of Knowledge) was conducted to identify all systematic reviews or meta-analyses of CAM for adult cancer pain. The evidence levels were evaluated using the GRADE approach. Results. 27 systematic reviews were included. Based on the available evidence, psychoeducational interventions, music interventions, acupuncture plus drug therapy, Chinese herbal medicine plus cancer therapy, compound kushen injection, reflexology, lycopene, TENS, qigong, cupping, cannabis, Reiki, homeopathy (Traumeel), and creative arts therapies might have beneficial effects on adult cancer pain. No benefits were found for acupuncture (versus drug therapy or sham acupuncture), and the results were inconsistent for massage therapy, transcutaneous electric nerve stimulation (TENS), and Viscum album L plus cancer treatment. However, the evidence levels for these interventions were low or moderate due to high risk of bias and/or small sample size of the primary studies. Conclusion. CAM may be beneficial for alleviating cancer pain, but the evidence levels were found to be low or moderate. Future large, rigorous randomized controlled studies are needed to confirm the benefits of CAM for adult cancer pain.

  15. Accuracy of LightCycler® SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol

    PubMed Central

    Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system that targets DNA sequences of bacteria and fungi present in blood samples, with results available within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, the methodological quality, and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. A bivariate model will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children, and infection setting (hospital vs community) if sufficient data are uncovered. Dissemination Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration PROSPERO—NIHR Prospective Register of Systematic Reviews (CRD42011001289). PMID:22240646

  16. A Pilot Systematic Review and Meta-Analysis on the Effectiveness of Problem Based Learning.

    ERIC Educational Resources Information Center

    Newman, Mark

    This paper reports on the development and piloting of a systematic review and meta-analysis of research on the effectiveness of problem based learning (PBL). The systematic review protocol was pilot tested with a sample of studies cited as providing "evidence" about the effectiveness of PBL. From the 5 studies mentioned in the sample of reviews,…

  17. Interplay of structural, optical and magnetic properties in Gd doped CeO{sub 2}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soni, S.; Dalela, S., E-mail: sdphysics@rediffmail.com; Kumar, Sudish

    In this research work, a systematic investigation of the synthesis, characterization, and optical and magnetic properties of Ce{sub 1-x}Gd{sub x}O{sub 2} (where x=0.02, 0.04, 0.06, and 0.10) synthesized using the solid-state method is reported. The structural, optical and magnetic properties of the samples were investigated by X-ray diffraction (XRD), UV-VIS-NIR spectroscopy and VSM. The fluorite structure is confirmed by the XRD measurements on the Gd doped CeO{sub 2} samples. Magnetic studies showed that the Gd doped polycrystalline samples display room temperature ferromagnetism and that the ferromagnetic ordering strengthens with the Gd concentration.

  18. Contraceptive use and method choice among women with opioid and other substance use disorders: A systematic review.

    PubMed

    Terplan, Mishka; Hand, Dennis J; Hutchinson, Melissa; Salisbury-Afshar, Elizabeth; Heil, Sarah H

    2015-11-01

    To systematically review the literature on contraceptive use by women with opioid and other substance use disorders in order to estimate overall contraceptive use and to examine method choice, given the alarmingly high rate of unintended pregnancy in this population. PubMed (1948-2014) and PsycINFO (1806-2014) databases were searched for peer-reviewed journal articles using a systematic search strategy. Only articles published in English and reporting contraceptive use within samples of women with opioid and other substance use disorders were eligible for inclusion. Out of 580 abstracts reviewed, 105 articles were given a full-text review, and 24 studies met the inclusion criteria. The majority (51%) of women in these studies reported using opioids, with much smaller percentages reporting alcohol and cocaine use. Across studies, contraceptive prevalence ranged widely, from 6% to 77%, with a median of 55%. Results from a small subset of studies (N=6) suggest that women with opioid and other substance use disorders used contraception less often than non-drug-using comparison populations (56% vs. 81%, respectively). Regarding method choice, condoms were the most prevalent method, accounting for a median of 62% of contraceptives used, while use of more effective methods, especially implants and intrauterine devices (IUDs), was far less prevalent (8%). Women with opioid and other substance use disorders have an unmet need for contraception, especially for the most effective methods. Offering contraception services in conjunction with substance use treatment and promoting use of more effective methods could help meet this need and reduce unintended pregnancy in this population. Copyright © 2015. Published by Elsevier Inc.

  19. Errors in the estimation method for the rejection of vibrations in adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Kania, Dariusz

    2017-06-01

    In recent years, the impact of mechanical vibrations in adaptive optics (AO) systems has attracted renewed attention. These signals are damped sinusoidal signals and have a deleterious effect on the system. One software solution for rejecting the vibrations is an adaptive method called AVC (Adaptive Vibration Cancellation), whose procedure has three steps: estimation of the perturbation parameters, estimation of the frequency response of the plant, and updating of the reference signal to reject or minimize the vibration. In the first step, the choice of estimation method is very important. A very accurate and fast (below 10 ms) method for estimating these three parameters has been presented in several publications in recent years. The method is based on spectrum interpolation and MSD time windows, and it can be used to estimate multifrequency signals. In this paper, the estimation method is used within the AVC method to increase system performance. Several parameters affect the accuracy of the obtained results, e.g. CiR - the number of signal periods in a measurement window, N - the number of samples in the FFT procedure, H - the time window order, SNR, b - the number of ADC bits, and γ - the damping ratio of the tested signal. Systematic errors increase when N, CiR and H decrease and when γ increases. The systematic error is approximately 10^-10 Hz/Hz for N = 2048 and CiR = 0.1. This paper presents equations that can be used to estimate the maximum systematic errors for given values of H, CiR and N before the start of the estimation process.

  20. Removing damped sinusoidal vibrations in adaptive optics systems using a DFT-based estimation method

    NASA Astrophysics Data System (ADS)

    Kania, Dariusz

    2017-06-01

    The problem of vibration rejection in adaptive optics systems is still present in the literature. These undesirable signals emerge because of shaking of the system structure, the tracking process, etc., and they are usually damped sinusoidal signals. There are some mechanical solutions to reduce these signals, but they are not very effective. Adaptive methods are a very popular software solution. An AVC (Adaptive Vibration Cancellation) method has been presented and developed in recent years. The method is based on the estimation of three vibration parameters; the values of frequency, amplitude and phase are essential to produce and adjust a proper signal to reduce or eliminate the vibration signals. This paper presents a fast (below 10 ms) and accurate method of estimating the frequency, amplitude and phase of a multifrequency signal that can be used in the AVC method to increase AO system performance. The method's accuracy depends on several parameters: CiR - the number of signal periods in a measurement window, N - the number of samples in the FFT procedure, H - the time window order, SNR, THD, b - the number of A/D converter bits in a real-time system, γ - the damping ratio of the tested signal, and φ - the phase of the tested signal. Systematic errors increase when N, CiR and H decrease and when γ increases. The systematic error for γ = 0.1%, CiR = 1.1 and N = 32 is approximately 10^-4 Hz/Hz. This paper focuses on systematic errors and on the effects of the signal phase and the value of γ on the results.
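
    Neither of the two records above spells out the full estimator, so the sketch below shows only the generic core that such DFT-based methods build on: a windowed FFT followed by parabolic interpolation of the log-magnitude peak to push the frequency estimate below one bin width. This is a hedged stand-in, not the authors' exact MSD-window spectrum-interpolation algorithm, and all signal parameters are illustrative.

    ```python
    """Sub-bin frequency estimation of a damped sinusoid via FFT peak interpolation."""
    import numpy as np

    def estimate_freq(x, fs):
        x = np.asarray(x) * np.hanning(len(x))       # window to reduce leakage
        spec = np.abs(np.fft.rfft(x))
        k = int(np.argmax(spec[1:-1])) + 1           # integer-bin peak
        a, b, c = np.log(spec[k - 1:k + 2])          # 3-point parabolic refinement
        delta = 0.5 * (a - c) / (a - 2 * b + c)
        return (k + delta) * fs / len(x)

    fs, f0, gamma = 1000.0, 123.4, 0.001             # illustrative values
    t = np.arange(2048) / fs
    x = np.exp(-gamma * 2 * np.pi * f0 * t) * np.sin(2 * np.pi * f0 * t)
    print(estimate_freq(x, fs))                      # close to 123.4 Hz
    ```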

  1. A systematic review and secondary data analysis of the interactions between the serotonin transporter 5-HTTLPR polymorphism and environmental and psychological factors in eating disorders

    PubMed Central

    Rozenblat, Vanja; Ong, Deborah; Fuller-Tyszkiewicz, Matthew; Akkermann, Kirsti; Collier, David; Engels, Rutger C.M.E; Fernandez-Aranda, Fernando; Harro, Jaanus; Homberg, Judith R.; Karwautz, Andreas; Kiive, Evelyn; Klump, Kelly L.; Larson, Christine L.; Racine, Sarah E.; Richardson, Jodie; Steiger, Howard; Stoltenberg, Scott F.; van Strien, Tatjana; Wagner, Gudrun; Treasure, Janet; Krug, Isabel

    2016-01-01

    Objectives To summarize and synthesize the growing gene x environment (GxE) research investigating the promoter region of the serotonin transporter gene (5-HTTLPR) in the eating disorders (ED) field, and overcome the common limitation of low sample size, by undertaking a systematic review followed by a secondary data meta-analysis of studies identified by the review. Method A systematic review of articles using PsycINFO, PubMed, and EMBASE was undertaken to identify studies investigating the interaction between 5-HTTLPR and an environmental or psychological factor, with an ED-related outcome variable. Seven studies were identified by the systematic review, with complete data sets of five community (n=1750, 64.5% female) and two clinical (n=426, 100% female) samples combined to perform four secondary-data analyses: 5-HTTLPR x Traumatic Life Events to predict ED status (n=909), 5-HTTLPR x Sexual and Physical Abuse to predict bulimic symptoms (n=1097), 5-HTTLPR x Depression to predict bulimic symptoms (n=1256), and 5-HTTLPR x Impulsiveness to predict disordered eating (n=1149). Results Under a multiplicative model, the low function (s) allele of 5-HTTLPR interacted with traumatic life events and experiencing both sexual and physical abuse (but not only one) to predict increased likelihood of an ED and bulimic symptoms, respectively. However, under an additive model there was also an interaction between sexual and physical abuse considered independently and 5-HTTLPR, and no interaction with traumatic life events. No other GxE interactions were significant. Conclusion Early promising results should be followed up with continued cross-institutional collaboration in order to achieve the large sample sizes necessary for genetic research. PMID:27701012

  2. Impact of Family Abuse on Running Away, Deviance, and Street Victimization among Homeless Rural and Urban Youth

    ERIC Educational Resources Information Center

    Thrane, Lisa E.; Hoyt, Danny R.; Whitbeck, Les B.; Yoder, Kevin A.

    2006-01-01

    Problem: Various demographic and familial risk factors have been linked to runaway behavior. To date, there has not been a systematic investigation of the impact of size of community on runaway behavior. This study will compare runaways from smaller cities and rural areas to their urban counterparts. Methods: A convenience sample of 602…

  3. Simulation of meteorological satellite (METSAT) data using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Austin, W. W.; Ryland, W. E.

    1983-01-01

    The information content which can be expected from the advanced very high resolution radiometer system, AVHRR, on the NOAA-6 satellite was assessed, and systematic techniques of data interpretation for use with meteorological satellite data were defined. In-house data from LANDSAT 2 and 3 were used to simulate the spatial, spectral, and sampling methods of the NOAA-6 satellite data.

  4. Analysis of four toxic metals in a single rice seed by matrix solid phase dispersion -inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    He, Xiufen; Chen, Lixia; Chen, Xin; Yu, Huamei; Peng, Lixu; Han, Bingjun

    2016-12-01

    Toxic metals in rice pose great risks to human health. Metal bioaccumulation in rice grains is a criterion in breeding. Rice breeding requires a sensitive method for determining the metal content of single rice grains to assist variety selection. In the present study, four toxic metals, arsenic (As), cadmium (Cd), chromium (Cr) and lead (Pb), were determined in single rice grains by a simple and rapid method. The developed method is based on matrix solid phase dispersion using multi-wall carbon nanotubes (MWCNTs) as the dispersing agent, with analysis by inductively coupled plasma mass spectrometry. The experimental parameters were systematically investigated. The limits of detection (LOD) were 5.0, 0.6, 10 and 2.1 ng g⁻¹ for As, Cd, Cr and Pb, respectively, with relative standard deviations (n = 6) of <7.7%, demonstrating the good sensitivity and precision of the method. The results for 30 real-world rice samples analyzed by this method agreed well with those obtained by standard microwave digestion, while the amount of sample required was reduced approximately 100-fold. The method has high application potential for other sample matrices and elements, offering high sensitivity and sample throughput.

  5. Analysis of four toxic metals in a single rice seed by matrix solid phase dispersion -inductively coupled plasma mass spectrometry.

    PubMed

    He, Xiufen; Chen, Lixia; Chen, Xin; Yu, Huamei; Peng, Lixu; Han, Bingjun

    2016-12-06

    Toxic metals in rice pose great risks to human health. Metal bioaccumulation in rice grains is a criterion in breeding. Rice breeding requires a sensitive method for determining the metal content of single rice grains to assist variety selection. In the present study, four toxic metals, arsenic (As), cadmium (Cd), chromium (Cr) and lead (Pb), were determined in single rice grains by a simple and rapid method. The developed method is based on matrix solid phase dispersion using multi-wall carbon nanotubes (MWCNTs) as the dispersing agent, with analysis by inductively coupled plasma mass spectrometry. The experimental parameters were systematically investigated. The limits of detection (LOD) were 5.0, 0.6, 10 and 2.1 ng g⁻¹ for As, Cd, Cr and Pb, respectively, with relative standard deviations (n = 6) of <7.7%, demonstrating the good sensitivity and precision of the method. The results for 30 real-world rice samples analyzed by this method agreed well with those obtained by standard microwave digestion, while the amount of sample required was reduced approximately 100-fold. The method has high application potential for other sample matrices and elements, offering high sensitivity and sample throughput.

  6. Sample preparation methods for scanning electron microscopy of homogenized Al-Mg-Si billets: A comparative study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Österreicher, Johannes Albert; Kumar, Manoj

    Characterization of Mg-Si precipitates is crucial for optimizing the homogenization heat treatment of Al-Mg-Si alloys. Although sample preparation is key for high quality scanning electron microscopy imaging, most common methods lead to dealloying of Mg-Si precipitates. In this article we systematically evaluate different sample preparation methods: mechanical polishing, etching with various reagents, and electropolishing using different electrolytes. We demonstrate that the use of a nitric acid and methanol electrolyte for electropolishing a homogenized Al-Mg-Si alloy prevents the dissolution of Mg-Si precipitates, resulting in micrographs of higher quality. This preparation method is investigated in depth and the obtained scanning electron microscopy images are compared with transmission electron micrographs: the shape and size of Mg-Si precipitates appear very similar in either method. The scanning electron micrographs allow proper identification and measurement of the Mg-Si phases including needles with lengths of roughly 200 nm. These needles are β″ precipitates as confirmed by high resolution transmission electron microscopy. - Highlights: •Secondary precipitation in homogenized 6xxx Al alloys is crucial for extrudability. •Existing sample preparation methods for SEM are improvable. •Electropolishing with nitric acid/methanol yields superior quality in SEM. •The obtained micrographs are compared to TEM micrographs.

  7. Accelerating Convergence in Molecular Dynamics Simulations of Solutes in Lipid Membranes by Conducting a Random Walk along the Bilayer Normal.

    PubMed

    Neale, Chris; Madill, Chris; Rauscher, Sarah; Pomès, Régis

    2013-08-13

    All molecular dynamics simulations are susceptible to sampling errors, which degrade the accuracy and precision of observed values. The statistical convergence of simulations containing atomistic lipid bilayers is limited by the slow relaxation of the lipid phase, which can exceed hundreds of nanoseconds. These long conformational autocorrelation times are exacerbated in the presence of charged solutes, which can induce significant distortions of the bilayer structure. Such long relaxation times represent hidden barriers that induce systematic sampling errors in simulations of solute insertion. To identify optimal methods for enhancing sampling efficiency, we quantitatively evaluate convergence rates using generalized ensemble sampling algorithms in calculations of the potential of mean force for the insertion of the ionic side chain analog of arginine in a lipid bilayer. Umbrella sampling (US) is used to restrain solute insertion depth along the bilayer normal, the order parameter commonly used in simulations of molecular solutes in lipid bilayers. When US simulations are modified to conduct random walks along the bilayer normal using a Hamiltonian exchange algorithm, systematic sampling errors are eliminated more rapidly and the rate of statistical convergence of the standard free energy of binding of the solute to the lipid bilayer is increased 3-fold. We compute the ratio of the replica flux transmitted across a defined region of the order parameter to the replica flux that entered that region in Hamiltonian exchange simulations. We show that this quantity, the transmission factor, identifies sampling barriers in degrees of freedom orthogonal to the order parameter. The transmission factor is used to estimate the depth-dependent conformational autocorrelation times of the simulation system, some of which exceed the simulation time, and thereby identify solute insertion depths that are prone to systematic sampling errors and estimate the lower bound of the amount of sampling that is required to resolve these sampling errors. Finally, we extend our simulations and verify that the conformational autocorrelation times estimated by the transmission factor accurately predict correlation times that exceed the simulation time scale, which, to our knowledge, has never before been achieved.
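
    As a rough illustration of the transmission-factor diagnostic described above, the sketch below counts, for a single trace along the order parameter, how often the trace enters a defined region and how often it exits on the far side. The region bounds and the toy trajectory are invented, and the paper's exact flux bookkeeping across replicas may differ.

    ```python
    """Transmission factor: fraction of region entries that exit on the far side."""
    import numpy as np

    def transmission_factor(z, lo, hi):
        entered = transmitted = 0
        side = None                      # last side ('below'/'above') before entry
        inside = False
        for v in np.asarray(z):
            if not inside and lo <= v <= hi:
                inside, entry_side = True, side
                entered += side is not None
            elif inside and (v < lo or v > hi):
                inside = False
                exit_side = 'below' if v < lo else 'above'
                if entry_side is not None and exit_side != entry_side:
                    transmitted += 1
            if v < lo:
                side = 'below'
            elif v > hi:
                side = 'above'
        return transmitted / entered if entered else float('nan')

    z = np.cumsum(np.random.randn(100000)) * 0.01    # toy random walk
    print(transmission_factor(z, -0.5, 0.5))         # ~0.5 for free diffusion
    ```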

  8. Path integral approach to the Wigner representation of canonical density operators for discrete systems coupled to harmonic baths.

    PubMed

    Montoya-Castillo, Andrés; Reichman, David R

    2017-01-14

    We derive a semi-analytical form for the Wigner transform for the canonical density operator of a discrete system coupled to a harmonic bath based on the path integral expansion of the Boltzmann factor. The introduction of this simple and controllable approach allows for the exact rendering of the canonical distribution and permits systematic convergence of static properties with respect to the number of path integral steps. In addition, the expressions derived here provide an exact and facile interface with quasi- and semi-classical dynamical methods, which enables the direct calculation of equilibrium time correlation functions within a wide array of approaches. We demonstrate that the present method represents a practical path for the calculation of thermodynamic data for the spin-boson and related systems. We illustrate the power of the present approach by detailing the improvement of the quality of Ehrenfest theory for the correlation function C_zz(t) = Re⟨σ_z(0)σ_z(t)⟩ for the spin-boson model with systematic convergence to the exact sampling function. Importantly, the numerically exact nature of the scheme presented here and its compatibility with semiclassical methods allows for the systematic testing of commonly used approximations for the Wigner-transformed canonical density.

  9. A sampling design framework for monitoring secretive marshbirds

    USGS Publications Warehouse

    Johnson, D.H.; Gibbs, J.P.; Herzog, M.; Lor, S.; Niemuth, N.D.; Ribic, C.A.; Seamans, M.; Shaffer, T.L.; Shriver, W.G.; Stehman, S.V.; Thompson, W.L.

    2009-01-01

    A framework for a sampling plan for monitoring marshbird populations in the contiguous 48 states is proposed here. The sampling universe is the breeding habitat (i.e. wetlands) potentially used by marshbirds. Selection protocols would be implemented within each of several large geographical strata, such as Bird Conservation Regions. Site selection will be done using a two-stage cluster sample. Primary sampling units (PSUs) would be land areas, such as legal townships, and would be selected by a procedure such as systematic sampling. Secondary sampling units (SSUs) will be wetlands or portions of wetlands within the PSUs, selected by a randomized, spatially balanced procedure. For analysis, the use of a variety of methods is encouraged as a means of increasing confidence in the conclusions that may be reached. Additional effort will be required to work out details and implement the plan.
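
    A hedged sketch of the two-stage selection just described: systematic sampling of PSUs from an ordered frame, followed by a simple random draw of SSUs within each selected PSU. A real implementation would use a spatially balanced SSU selection (e.g. GRTS), which is beyond this toy; all names and counts are invented.

    ```python
    """Two-stage cluster sample: systematic PSU draw, random SSU draw."""
    import random

    def systematic_sample(frame, n):
        """Every k-th unit after a random start; frame should be ordered."""
        k = len(frame) / n
        start = random.random() * k
        return [frame[int(start + i * k)] for i in range(n)]

    psus = [f"township_{i:03d}" for i in range(240)]     # hypothetical PSU frame
    wetlands = {p: [f"{p}_wetland_{j}" for j in range(random.randint(2, 9))]
                for p in psus}                           # hypothetical SSU lists

    selected_psus = systematic_sample(psus, 12)
    sample = {p: random.sample(wetlands[p], min(3, len(wetlands[p])))
              for p in selected_psus}
    print(sample)
    ```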

  10. Method development in high-performance liquid chromatography for high-throughput profiling and metabonomic studies of biofluid samples.

    PubMed

    Pham-Tuan, Hai; Kaskavelis, Lefteris; Daykin, Clare A; Janssen, Hans-Gerd

    2003-06-15

    "Metabonomics" has in the past decade demonstrated enormous potential in furthering the understanding of, for example, disease processes, toxicological mechanisms, and biomarker discovery. The same principles can also provide a systematic and comprehensive approach to the study of food ingredient impact on consumer health. However, "metabonomic" methodology requires the development of rapid, advanced analytical tools to comprehensively profile biofluid metabolites within consumers. Until now, NMR spectroscopy has been used for this purpose almost exclusively. Chromatographic techniques and in particular HPLC, have not been exploited accordingly. The main drawbacks of chromatography are the long analysis time, instabilities in the sample fingerprint and the rigorous sample preparation required. This contribution addresses these problems in the quest to develop generic methods for high-throughput profiling using HPLC. After a careful optimization process, stable fingerprints of biofluid samples can be obtained using standard HPLC equipment. A method using a short monolithic column and a rapid gradient with a high flow-rate has been developed that allowed rapid and detailed profiling of larger numbers of urine samples. The method can be easily translated into a slow, shallow-gradient high-resolution method for identification of interesting peaks by LC-MS/NMR. A similar approach has been applied for cell culture media samples. Due to the much higher protein content of such samples non-porous polymer-based small particle columns yielded the best results. The study clearly shows that HPLC can be used in metabonomic fingerprinting studies.

  11. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so high that it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to an incorrect pipetting volume caused by air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
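
    As a schematic reading of the definition above, the sketch below flags a result as an irregular analytical error when its deviation from the reference measurement procedure exceeds a simple linear combination of routine measurement uncertainty and method bias. The coverage factor k and all values are illustrative assumptions, not thresholds from the article.

    ```python
    """Flag deviations that the routine assay's uncertainty budget cannot explain."""

    def is_irregular_error(routine, reference, u_routine, bias, k=3.0):
        """u_routine: standard measurement uncertainty of the routine assay;
        bias: known method bias versus the reference measurement system."""
        allowed = k * u_routine + abs(bias)      # simple linear combination
        return abs(routine - reference) > allowed

    # e.g. a glucose result (mmol/L) far outside its uncertainty budget
    print(is_irregular_error(routine=9.8, reference=5.2,
                             u_routine=0.15, bias=0.1))   # True
    ```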

  12. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    PubMed Central

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-01-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, the lack of experimental predictability and the unclear extraction mechanism limit the development of this promising method. This work therefore aims to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking, hydrogen bonding, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theory-based experimental design, of trace analytes from environmental, biological and clinical samples. PMID:27924944

  13. Systematic Compared With Targeted Staging with Endobronchial Ultrasound in Patients with Lung Cancer.

    PubMed

    Sanz-Santos, José; Serra, Pere; Torky, Mohamed; Andreo, Felipe; Centeno, Carmen; Mendiluce, Leire; Martínez-Barenys, Carlos; López de Castro, Pedro; Ruiz-Manzano, Juan

    2018-04-06

    To evaluate the accuracy of systematic mediastinal staging by endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) (sampling of all visible nodes measuring ≥5 mm, from stations N3 to N1, regardless of their positron emission tomography/computed tomography (PET/CT) features) and to compare this staging approach with targeted EBUS-TBNA staging (sampling only 18F-fluorodeoxyglucose (FDG)-avid nodes) in patients with N2 non-small cell lung cancer (NSCLC) on PET/CT. Retrospective study of 107 patients who underwent systematic EBUS-TBNA mediastinal staging. The results were compared with those of a hypothetical scenario in which only FDG-avid nodes on PET/CT would be sampled. Systematic EBUS-TBNA sampling demonstrated N3 disease in 3 patients, N2 disease in 60 (42 single-station or N2a, 18 multiple-station or N2b) and N0/N1 disease in 44. Of these 44, seven underwent mediastinoscopy, which did not show mediastinal disease; six of the seven proceeded to lung resection, which also showed no mediastinal disease. Thirty-four N0/N1 patients after EBUS-TBNA underwent lung resection directly: N0/N1 was found in 30 and N2 in four (one N2b with a PET/CT showing N2a disease, three N2a). Sensitivity, specificity, negative predictive value, positive predictive value, and overall accuracy of systematic EBUS-TBNA were 94%, 100%, 90%, 100% and 96%, respectively. Compared to targeted EBUS-TBNA, systematic EBUS-TBNA sampling provided additional important clinical information in 14 cases (13%): three N3 cases would have passed unnoticed, and 11 N2b cases would have been staged as N2a. In clinical practice, systematic sampling of the mediastinum by EBUS-TBNA, regardless of PET/CT features, is to be recommended over targeted sampling. Copyright © 2018. Published by Elsevier Inc.
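
    For reference, the reported accuracy metrics follow from a standard 2×2 table; the sketch below reproduces the arithmetic with illustrative counts chosen to be consistent with the percentages quoted above, not taken from the study's raw data.

    ```python
    """Diagnostic accuracy metrics from a 2x2 confusion matrix."""

    def diagnostic_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    # illustrative counts for 107 patients (63 detected, 4 missed, 40 true N0/N1)
    print(diagnostic_metrics(tp=63, fp=0, fn=4, tn=40))
    ```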

  14. Inferring Temperature Inversions in Hot Jupiters Via Spitzer Emission Spectroscopy

    NASA Astrophysics Data System (ADS)

    Garhart, Emily; Deming, Drake; Mandell, Avi

    2016-10-01

    We present a systematic study of 35 hot Jupiter secondary eclipses, including 16 hot Jupiters never before characterized via emission, observed in the 3.6 μm and 4.5 μm bandpasses of Warm Spitzer, in order to classify their atmospheric structure, namely the existence of temperature inversions. This is a robust study in that these planets orbit stars with a wide range of compositions, temperatures, and activity levels. This diverse sample allows us to investigate the source of planetary temperature inversions, specifically their correlation with stellar irradiance and magnetic activity. We correct for systematic and intra-pixel sensitivity effects with a pixel level decorrelation (PLD) method described in Deming et al. (2015). Relating eclipse depths and a best-fit blackbody function to stellar activity, a method described in Knutson et al. (2010), will ultimately enable us to appraise current hypotheses of temperature inversions.

  15. Cross-cultural dataset for the evolution of religion and morality project.

    PubMed

    Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph

    2016-11-08

    A considerable body of research cross-culturally examines the evolution of religious traditions, beliefs and behaviors. The bulk of this research, however, draws from coded qualitative ethnographies rather than from standardized methods specifically designed to measure religious beliefs and behaviors. Psychological data sets that examine religious thought and behavior in controlled conditions tend to be disproportionately sampled from student populations. Some cross-national databases employ standardized methods at the individual level, but are primarily focused on fully market integrated, state-level societies. The Evolution of Religion and Morality Project sought to generate a data set that systematically probed individual level measures sampling across a wider range of human populations. The set includes data from behavioral economic experiments and detailed surveys of demographics, religious beliefs and practices, material security, and intergroup perceptions. This paper describes the methods and variables, briefly introduces the sites and sampling techniques, notes inconsistencies across sites, and provides some basic reporting for the data set.

  16. AutoTag and AutoSnap: Standardized, semi-automatic capture of regions of interest from whole slide images

    PubMed Central

    Marien, Koen M.; Andries, Luc; De Schepper, Stefanie; Kockx, Mark M.; De Meyer, Guido R.Y.

    2015-01-01

    Tumor angiogenesis is measured by counting microvessels in tissue sections at high power magnification as a potential prognostic or predictive biomarker. Until now, regions of interest (ROIs) were selected by manual operations within a tumor by using a systematic uniform random sampling (SURS) approach. Although SURS is the most reliable sampling method, it implies a high workload. However, SURS can be semi-automated and in this way contribute to the development of a validated quantification method for microvessel counting in the clinical setting. Here, we report a method to use semi-automated SURS for microvessel counting: • Whole slide imaging with Pannoramic SCAN (3DHISTECH) • Computer-assisted sampling in Pannoramic Viewer (3DHISTECH) extended by two self-written AutoHotkey applications (AutoTag and AutoSnap) • The use of digital grids in Photoshop® and Bridge® (Adobe Systems) This rapid procedure allows traceability essential for high throughput protein analysis of immunohistochemically stained tissue. PMID:26150998
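
    The SURS step itself is simple to express in code; a minimal sketch follows (one uniform random offset, then a regular grid of sampling fields). All dimensions and step sizes are illustrative rather than taken from the paper.

    ```python
    """Systematic uniform random sampling of field positions over an ROI."""
    import random

    def surs_positions(width, height, step_x, step_y):
        """Top-left corners of sampling fields covering a width x height ROI."""
        x0 = random.uniform(0, step_x)      # one random start, then a fixed grid
        y0 = random.uniform(0, step_y)
        xs = [x0 + i * step_x for i in range(int((width - x0) // step_x) + 1)]
        ys = [y0 + j * step_y for j in range(int((height - y0) // step_y) + 1)]
        return [(x, y) for y in ys for x in xs]

    # e.g. a 20 x 15 mm tumour section sampled every 2 mm
    print(len(surs_positions(20.0, 15.0, 2.0, 2.0)))
    ```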

  17. Cross-cultural dataset for the evolution of religion and morality project

    PubMed Central

    Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D.; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K.; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph

    2016-01-01

    A considerable body of research cross-culturally examines the evolution of religious traditions, beliefs and behaviors. The bulk of this research, however, draws from coded qualitative ethnographies rather than from standardized methods specifically designed to measure religious beliefs and behaviors. Psychological data sets that examine religious thought and behavior in controlled conditions tend to be disproportionately sampled from student populations. Some cross-national databases employ standardized methods at the individual level, but are primarily focused on fully market integrated, state-level societies. The Evolution of Religion and Morality Project sought to generate a data set that systematically probed individual level measures sampling across a wider range of human populations. The set includes data from behavioral economic experiments and detailed surveys of demographics, religious beliefs and practices, material security, and intergroup perceptions. This paper describes the methods and variables, briefly introduces the sites and sampling techniques, notes inconsistencies across sites, and provides some basic reporting for the data set. PMID:27824332

  18. Multivariate regression methods for estimating velocity of ictal discharges from human microelectrode recordings

    NASA Astrophysics Data System (ADS)

    Liou, Jyun-you; Smith, Elliot H.; Bateman, Lisa M.; McKhann, Guy M., II; Goodman, Robert R.; Greger, Bradley; Davis, Tyler S.; Kellis, Spencer S.; House, Paul A.; Schevon, Catherine A.

    2017-08-01

    Objective. Epileptiform discharges, an electrophysiological hallmark of seizures, can propagate across cortical tissue in a manner similar to traveling waves. Recent work has focused attention on the origination and propagation patterns of these discharges, yielding important clues to their source location and mechanism of travel. However, systematic studies of methods for measuring propagation are lacking. Approach. We analyzed epileptiform discharges in microelectrode array recordings of human seizures. The array records multiunit activity and local field potentials at 400 micron spatial resolution, from a small cortical site free of obstructions. We evaluated several computationally efficient statistical methods for calculating traveling wave velocity, benchmarking them to analyses of associated neuronal burst firing. Main results. Over 90% of discharges met statistical criteria for propagation across the sampled cortical territory. Detection rate, direction and speed estimates derived from a multiunit estimator were compared to four field potential-based estimators: negative peak, maximum descent, high gamma power, and cross-correlation. Interestingly, the methods that were computationally simplest and most efficient (negative peak and maximal descent) offer non-inferior results in predicting neuronal traveling wave velocities compared to the other two, more complex methods. Moreover, the negative peak and maximal descent methods proved to be more robust against reduced spatial sampling challenges. Using least absolute deviation in place of least squares error minimized the impact of outliers, and reduced the discrepancies between local field potential-based and multiunit estimators. Significance. Our findings suggest that ictal epileptiform discharges typically take the form of exceptionally strong, rapidly traveling waves, with propagation detectable across millimeter distances. The sequential activation of neurons in space can be inferred from clinically-observable EEG data, with a variety of straightforward computation methods available. This opens possibilities for systematic assessments of ictal discharge propagation in clinical and research settings.
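
    A hedged sketch of the regression idea follows: per-electrode discharge times (e.g. negative-peak times) are regressed on electrode coordinates, and the fitted slope is a slowness vector whose inverse norm is the wave speed. Ordinary least squares is used here for brevity, whereas the study found least absolute deviation more robust to outliers; the data are synthetic.

    ```python
    """Traveling-wave velocity from a linear fit of event times to positions."""
    import numpy as np

    def wave_velocity(xy_mm, t_s):
        """xy_mm: (n, 2) electrode coordinates in mm; t_s: (n,) event times in s."""
        A = np.column_stack([np.ones(len(t_s)), xy_mm])
        coef, *_ = np.linalg.lstsq(A, t_s, rcond=None)
        slowness = coef[1:]                      # s/mm along x and y
        speed = 1.0 / np.linalg.norm(slowness)   # mm/s
        direction = np.degrees(np.arctan2(slowness[1], slowness[0]))
        return speed, direction

    # synthetic plane wave at 300 mm/s moving along +x across a 4 mm array
    xy = np.random.uniform(0, 4, size=(96, 2))
    t = xy[:, 0] / 300.0 + np.random.normal(0, 1e-4, 96)
    print(wave_velocity(xy, t))                  # ~ (300.0, 0.0)
    ```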

  19. Evaluation of a segment-based LANDSAT full-frame approach to crop area estimation

    NASA Technical Reports Server (NTRS)

    Bauer, M. E. (Principal Investigator); Hixson, M. M.; Davis, S. M.

    1981-01-01

    As the registration of LANDSAT full frames enters the realm of current technology, sampling methods should be examined which utilize data other than the segment data used for LACIE. The effect of separating the functions of sampling for training and sampling for area estimation was examined. The frame selected for analysis was acquired over north central Iowa on August 9, 1978. A stratification of the full frame was defined. Training data came from segments within the frame. Two classification and estimation procedures were compared: statistics developed on one segment were used to classify that segment, and pooled statistics from the segments were used to classify a systematic sample of pixels. Comparisons to USDA/ESCS estimates illustrate that the full-frame sampling approach can provide accurate and precise area estimates.

  20. Efficient Solar Scene Wavefront Estimation with Reduced Systematic and RMS Errors: Summary

    NASA Astrophysics Data System (ADS)

    Anugu, N.; Garcia, P.

    2016-04-01

    Wavefront sensing for solar telescopes is commonly implemented with Shack-Hartmann sensors. Correlation algorithms are usually used to estimate the extended-scene Shack-Hartmann sub-aperture image shifts or slopes. The image shift is computed by correlating a reference sub-aperture image with the target distorted sub-aperture image. The pixel position where the maximum correlation is located gives the image shift in integer pixel coordinates. Sub-pixel precision image shifts are computed by applying a peak-finding algorithm to the correlation peak Poyneer (2003); Löfdahl (2010). However, the peak-finding results are usually biased towards integer pixels; these errors are called systematic bias errors Sjödahl (1994). They are caused by the low pixel sampling of the images, and their amplitude depends on the type of correlation algorithm and the type of peak-finding algorithm being used. To study the systematic errors in detail, solar sub-aperture synthetic images were constructed using a Swedish Solar Telescope solar granulation image. The performance of the cross-correlation algorithm in combination with different peak-finding algorithms was investigated. The studied peak-finding algorithms are: parabola Poyneer (2003); quadratic polynomial Löfdahl (2010); threshold center of gravity Bailey (2003); Gaussian Nobach & Honkanen (2005); and pyramid Bailey (2003). The systematic error study reveals that the pyramid fit is the most robust to pixel-locking effects. The RMS error analysis reveals that the threshold center of gravity behaves better at low SNR, although the systematic errors in the measurement are large. No single algorithm is best for both systematic and RMS error reduction. To overcome this problem, a new solution is proposed in which the image sampling is increased prior to the actual correlation matching. The method is realized in two steps to improve its computational efficiency. In the first step, the cross-correlation is implemented at the original image spatial resolution grid (1 pixel). In the second step, the cross-correlation is performed on a sub-pixel grid, limiting the field of search to 4 × 4 pixels centered at the initial position delivered by the first step. The sub-pixel-grid region-of-interest images are generated with bi-cubic interpolation. Correlation matching on a sub-pixel grid was previously reported in electronic speckle photography Sjödahl (1994); this technique is applied here to solar wavefront sensing. A large dynamic range and better accuracy are achieved by combining original-pixel-grid correlation matching over a large field of view with sub-pixel interpolated-grid correlation matching within a small field of view. The results reveal that the proposed method outperforms all the peak-finding algorithms studied in the first approach: it reduces both the systematic error and the RMS error by a factor of 5 (i.e., 75% systematic error reduction) when 5 times improved image sampling is used, at the expense of twice the computational cost. With the 5 times improved image sampling, the wavefront accuracy is increased by a factor of 5. The proposed solution is strongly recommended for wavefront sensing in solar telescopes, particularly for measuring the large dynamic image shifts involved in open-loop adaptive optics. Also, by choosing the increment of image sampling as a trade-off between the computational speed limitation and the desired sub-pixel image shift accuracy, it can be employed in closed-loop adaptive optics. The study is extended to three other classes of sub-aperture images (a point source, a laser guide star, and a Galactic Center extended scene). The results are planned to be submitted to the Optical Express journal.
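
    Both ingredients of the scheme, integer-pixel correlation matching and sub-pixel peak refinement, can be sketched compactly. The code below shows FFT-based cross-correlation with 3-point parabolic refinement (one of the peak-finding algorithms studied) on synthetic data; it omits the paper's bi-cubic upsampling stage and assumes the correlation peak is not on the array border.

    ```python
    """Sub-pixel image-shift estimation via FFT cross-correlation."""
    import numpy as np

    def subpixel_shift(ref, img):
        """Estimate the (dy, dx) shift of img relative to ref (circular)."""
        corr = np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))))
        corr = np.fft.fftshift(corr)
        py, px = np.unravel_index(np.argmax(corr), corr.shape)

        def parabola(c, p):               # 3-point sub-pixel refinement
            a, b, d = c[p - 1], c[p], c[p + 1]
            return p + 0.5 * (a - d) / (a - 2 * b + d)

        return (parabola(corr[:, px], py) - ref.shape[0] // 2,
                parabola(corr[py, :], px) - ref.shape[1] // 2)

    ref = np.random.rand(64, 64)
    img = np.roll(ref, 3, axis=0)         # known integer test shift
    print(subpixel_shift(ref, img))       # approximately (3.0, 0.0)
    ```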

  1. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
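
    The abstract stops short of the propagation formula, so a minimal sketch of the final step is given below: first-order propagation of Seebeck and resistivity uncertainties into the power factor S²/ρ, with bias and statistical terms assumed already combined per input. The numbers are illustrative, not ZEM-3 data.

    ```python
    """First-order uncertainty propagation for the power factor S^2 / rho."""
    import math

    def power_factor_uncertainty(S, u_S, rho, u_rho):
        pf = S ** 2 / rho
        rel = math.hypot(2 * u_S / S, u_rho / rho)   # relative uncertainties in quadrature
        return pf, pf * rel

    S, u_S = 180e-6, 4e-6          # Seebeck coefficient, V/K (bias + statistical)
    rho, u_rho = 1.2e-5, 0.3e-6    # resistivity, ohm*m
    pf, u_pf = power_factor_uncertainty(S, u_S, rho, u_rho)
    print(f"PF = {pf*1e3:.2f} +/- {u_pf*1e3:.2f} mW/(m*K^2)")
    ```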

  2. Advances for the Topographic Characterisation of SMC Materials

    PubMed Central

    Calvimontes, Alfredo; Grundke, Karina; Müller, Anett; Stamm, Manfred

    2009-01-01

    For a comprehensive study of Sheet Moulding Compound (SMC) surfaces, topographical data obtained by a contact-free optical method (chromatic aberration confocal imaging) were systematically acquired to characterise these surfaces with regard to their statistical, functional and volumetric properties. Optimal sampling conditions (cut-off length and resolution) were obtained by the topographical-statistical procedure proposed in the present work. By using different length scales, specific morphologies due to the influence of moulding conditions, metallic mould topography, glass fibre content and glass fibre orientation can be characterised. The aim of this study is to suggest a systematic topographical characterisation procedure for composite materials in order to study and recognise the influence of production conditions on their surface quality.

  3. Improving the spectral measurement accuracy based on temperature distribution and spectra-temperature relationship

    NASA Astrophysics Data System (ADS)

    Li, Zhe; Feng, Jinchao; Liu, Pengyu; Sun, Zhonghua; Li, Gang; Jia, Kebin

    2018-05-01

    Temperature is usually treated as a source of fluctuation in near-infrared spectral measurement, and chemometric methods have been extensively studied to correct for the effect of temperature variations. However, temperature can also be considered a constructive parameter that provides detailed chemical information when systematically changed during the measurement. Our group has researched the relationship between temperature-induced spectral variation (TSVC) and normalized squared temperature. In this study, we focused on the influence of the temperature distribution of the calibration set. A multi-temperature calibration set selection (MTCS) method is proposed to improve prediction accuracy by considering the temperature distribution of the calibration samples. Furthermore, a double-temperature calibration set selection (DTCS) method is proposed based on the MTCS method and the relationship between TSVC and normalized squared temperature. We compared the prediction performance of PLS models based on the random sampling method and the proposed methods. The results of experimental studies showed that prediction performance was improved by the proposed methods. Therefore, the MTCS and DTCS methods are alternative methods for improving prediction accuracy in near-infrared spectral measurement.
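
    A minimal sketch of the stratification idea behind temperature-aware calibration-set selection (hypothetical data; the actual MTCS/DTCS selection rules are more involved):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def mtcs_select(temperatures, n_cal):
            # Spread calibration samples evenly over the observed temperature
            # range instead of drawing them at random.
            order = np.argsort(temperatures)
            picks = np.linspace(0, len(order) - 1, n_cal).round().astype(int)
            return order[picks]

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 50))            # stand-in NIR spectra
        T = rng.uniform(20, 40, size=100)         # sample temperatures (deg C)
        y = X[:, 0] + 0.01 * (T - 30.0) ** 2      # response with a temperature term
        cal = mtcs_select(T, 30)
        model = PLSRegression(n_components=5).fit(X[cal], y[cal])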

  4. A High-Throughput Method for Direct Detection of Therapeutic Oligonucleotide-Induced Gene Silencing In Vivo

    PubMed Central

    Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil

    2016-01-01

    Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method to achieve this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we describe the QuantiGene® branched DNA (bDNA) assay, linked to a 96-well Qiagen TissueLyser II, as a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based, luminescence technique capable of directly measuring mRNA levels in tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing. We show that similar-quality data are obtained from purified RNA and tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples) and high intermediate precision. This allows minimization of sample size for evaluation of oligonucleotide efficacy in vivo. PMID:26595721

  5. Remote control missile model test

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric body/tail fin data base was gathered through tests of an innovative missile model design, which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but they can also be used as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analyses of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Comparisons between these data and calculations from the SWINT Euler code are also presented.

  6. Contraceptive use and method choice among women with opioid and other substance use disorders: A systematic review

    PubMed Central

    Terplan, Mishka; Hand, Dennis J.; Hutchinson, Melissa; Salisbury-Afshar, Elizabeth; Heil, Sarah H.

    2016-01-01

    Aim To systematically review the literature on contraceptive use by women with opioid and other substance use disorders in order to estimate overall contraceptive use and to examine method choice, given the alarmingly high rate of unintended pregnancy in this population. Method PubMed (1948–2014) and PsycINFO (1806–2014) databases were searched for peer-reviewed journal articles using a systematic search strategy. Only articles published in English and reporting contraceptive use within samples of women with opioid and other substance use disorders were eligible for inclusion. Results Out of 580 abstracts reviewed, 105 articles were given a full-text review, and 24 studies met the inclusion criteria. The majority (51%) of women in these studies reported using opioids, with much smaller percentages reporting alcohol and cocaine use. Across studies, contraceptive prevalence ranged widely, from 6% to 77%, with a median of 55%. Results from a small subset of studies (N = 6) suggest that women with opioid and other substance use disorders used contraception less often than non-drug-using comparison populations (56% vs. 81%, respectively). Regarding method choice, condoms were the most prevalent method, accounting for a median of 62% of contraceptives used, while use of more effective methods, especially implants and intrauterine devices (IUDs), was far less prevalent (8%). Conclusions Women with opioid and other substance use disorders have an unmet need for contraception, especially for the most effective methods. Offering contraception services in conjunction with substance use treatment and promoting use of more effective methods could help meet this need and reduce unintended pregnancy in this population. PMID:25900803

  7. Systematic review of the use of online questionnaires of older adults.

    PubMed

    Remillard, Meegan L; Mazor, Kathleen M; Cutrona, Sarah L; Gurwitz, Jerry H; Tjia, Jennifer

    2014-04-01

    To describe methodological approaches to population targeting and sampling and to summarize limitations of Internet-based questionnaires in older adults. Systematic literature review. Studies using online questionnaires in older adult populations. English-language articles using search terms for geriatric, age 65 and over, Internet survey, online survey, Internet questionnaire, and online questionnaire in PubMed and EBSCO host between 1984 and July 2012. Inclusion criteria were study population mean age 65 and older and use of an online questionnaire for research. Review of 336 abstracts yielded 14 articles for full review by two investigators; 11 articles met inclusion criteria. Articles were extracted for study design and setting, participant characteristics, recruitment strategy, country, and study limitations. Eleven articles were published after 2001. Studies had populations with a mean age of 65 to 78, included descriptive and analytical designs, and were conducted in the United States, Australia, and Japan. Recruiting methods varied widely from paper fliers and personal e-mails to use of consumer marketing panels. Investigator-reported study limitations included the use of small convenience samples and limited generalizability. Online questionnaires are a feasible method of surveying older adults in some geographic regions and for some subsets of older adults, but limited Internet access constrains recruiting methods and often limits study generalizability. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.

  8. Errors in radial velocity variance from Doppler wind lidar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, H.; Barthelmie, R. J.; Doubrawa, P.

    A high-fidelity lidar turbulence measurement technique relies on accurate estimates of radial velocity variance that are subject to both systematic and random errors determined by the autocorrelation function of radial velocity, the sampling rate, and the sampling duration. Our paper quantifies the effect of the volumetric averaging in lidar radial velocity measurements on the autocorrelation function and the dependence of the systematic and random errors on the sampling duration, using both statistically simulated and observed data. For current-generation scanning lidars and sampling durations of about 30 min and longer, during which the stationarity assumption is valid for atmospheric flows, the systematic error is negligible but the random error exceeds about 10%.

  9. Errors in radial velocity variance from Doppler wind lidar

    DOE PAGES

    Wang, H.; Barthelmie, R. J.; Doubrawa, P.; ...

    2016-08-29

    A high-fidelity lidar turbulence measurement technique relies on accurate estimates of radial velocity variance that are subject to both systematic and random errors determined by the autocorrelation function of radial velocity, the sampling rate, and the sampling duration. Our paper quantifies the effect of the volumetric averaging in lidar radial velocity measurements on the autocorrelation function and the dependence of the systematic and random errors on the sampling duration, using both statistically simulated and observed data. For current-generation scanning lidars and sampling durations of about 30 min and longer, during which the stationarity assumption is valid for atmospheric flows, the systematic error is negligible but the random error exceeds about 10%.

  10. Sm-Nd Isotopic Systematics of Troctolite 76335

    NASA Technical Reports Server (NTRS)

    Edmunson, J.; Nyquist, L. E.; Borg, L. E.

    2007-01-01

    A study of the Sm-Nd isotopic systematics of lunar Mg-suite troctolite 76335 was undertaken to further establish the early chronology of lunar magmatism. Because the Rb-Sr isotopic systematics of similar sample 76535 yielded an age of 4570 ± 70 Ma [2, λ = 1.402 × 10^-11], 76335 was expected to yield an old age. In contrast, the Sm-Nd and K-Ar ages of 76535 indicate that the sample is approximately 4260 Ma old, one of the youngest ages obtained for a Mg-suite rock. This study establishes the age of 76335 and discusses the constraints placed on its petrogenesis by its Sm-Nd isotope systematics. The Sm-Nd isotopic system of lunar Mg-suite troctolite 76335 indicates an age of 4278 ± 60 Ma with an initial ε¹⁴³Nd value of 0.06 ± 0.39. These values are consistent with the Sm-Nd isotopic systematics of similar sample 76535. Thus, it appears that a robust Sm-Nd age can be determined from a highly brecciated lunar sample. The Sm-Nd isotopic systematics of troctolites 76335 and 76535 appear to be different from those dominating the Mg-suite norites and KREEP basalts. Further analysis of the Mg-suite must be completed to reveal the isotopic relationships of these early lunar rocks.

  11. A method for inferring the rate of evolution of homologous characters that can potentially improve phylogenetic inference, resolve deep divergence and correct systematic biases.

    PubMed

    Cummins, Carla A; McInerney, James O

    2011-12-01

    Current phylogenetic methods attempt to account for evolutionary rate variation across characters in a matrix. This is generally achieved by the use of sophisticated evolutionary models, combined with dense sampling of large numbers of characters. However, systematic biases and superimposed substitutions make this task very difficult. Model adequacy can sometimes be achieved at the cost of adding large numbers of free parameters, with each parameter being optimized according to some criterion, resulting in increased computation times and large variances in the model estimates. In this study, we develop a simple approach that estimates the relative evolutionary rate of each homologous character. The method that we describe uses the similarity between characters as a proxy for evolutionary rate. In this article, we work on the premise that if the character-state distribution of a homologous character is similar to many other characters, then this character is likely to be relatively slowly evolving. If the character-state distribution of a homologous character is not similar to many or any of the rest of the characters in a data set, then it is likely to be the result of rapid evolution. We show that in some test cases, at least, the premise can hold and the inferences are robust. Importantly, the method does not use a "starting tree" to make the inference and therefore is tree independent. We demonstrate that this approach can work as well as a maximum likelihood (ML) approach, though the ML method needs to have a known phylogeny, or at least a very good estimate of that phylogeny. We then demonstrate some uses for this method of analysis, including the improvement in phylogeny reconstruction for both deep-level and recent relationships and overcoming systematic biases such as base composition bias. Furthermore, we compare this approach to two well-established methods for reweighting or removing characters. These other methods are tree-based and we show that they can be systematically biased. We feel this method can be useful for phylogeny reconstruction, understanding evolutionary rate variation, and for understanding selection variation on different characters.
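
    One simple way to turn the stated premise into numbers is sketched below (a hypothetical construction; the published method's similarity measure differs in detail). Each column of a toy alignment is scored by how often it partitions taxon pairs the same way the other columns do; columns unlike the rest are flagged as fast-evolving:

        import numpy as np
        from itertools import combinations

        def pair_profile(col):
            # For each taxon pair: does this character give them the same state?
            return np.array([a == b for a, b in combinations(col, 2)])

        def relative_rates(alignment):
            profiles = [pair_profile(col) for col in zip(*alignment)]
            n = len(profiles)
            sim = np.array([np.mean([(profiles[i] == profiles[j]).mean()
                                     for j in range(n) if j != i])
                            for i in range(n)])
            return 1.0 - sim   # high score ~ fast-evolving character

        aln = ["ACGTACGT", "ACGTACGA", "ACGAACGT", "TCGAACGT"]  # toy: 4 taxa, 8 sites
        print(relative_rates(aln))

    Note that, like the method described above, this scoring needs no starting tree: only the character-state distributions enter the calculation.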

  12. Sampling maternal care behaviour in domestic dogs: What's the best approach?

    PubMed

    Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J

    2017-07-01

    Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish which sampling method represents these behaviours best, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent maternal behaviours (anogenital licking duration and frequency) were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), one hour during the night (1800-0600 h), one hour during the day (0600-1800 h), and one hour at any time. Each one-hour time sampling method consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods, averaged over the three 24-h periods, best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one-hour) sampling periods; however, this was not the case for the infrequent behaviour. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as to all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.
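
    A toy simulation makes the point about frequent versus infrequent behaviours concrete (all rates and window placements below are hypothetical, not the study's data):

        import numpy as np

        rng = np.random.default_rng(1)
        minutes = 24 * 60
        frequent = rng.random(minutes) < 0.60     # e.g. dam-pup contact
        rare = rng.random(minutes) < 0.02         # e.g. anogenital licking

        # One-hour time sampling: four randomly placed 15-min blocks per day.
        starts = rng.choice(minutes - 15, size=4, replace=False)
        take = lambda rec: np.concatenate([rec[s:s + 15] for s in starts]).mean()

        print("frequent:", frequent.mean(), "vs sampled", take(frequent))
        print("rare:    ", rare.mean(), "vs sampled", take(rare))

    Re-running with different seeds shows the sampled estimate of the frequent behaviour staying close to the continuous value while the rare-behaviour estimate swings widely, which is the paper's caution in miniature.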

  13. Are synesthetes exceptional beyond their synesthetic associations? A systematic comparison of creativity, personality, cognition, and mental imagery in synesthetes and controls.

    PubMed

    Chun, Charlotte A; Hupé, Jean-Michel

    2016-08-01

    Synesthesia has historically been linked with enhanced creativity, but this had never been demonstrated in a systematically recruited sample. The current study offers a broad examination of creativity, personality, cognition, and mental imagery in a small sample of systematically recruited synesthetes and controls (n = 65). Synesthetes scored higher on some measures of creativity, personality traits of absorption and openness, and cognitive abilities of verbal comprehension and mental imagery. The differences were smaller than those reported in the literature, indicating that previous studies may have overestimated group differences, perhaps due to biased recruitment procedures. Nonetheless, most of our results replicated literature findings, yielding two possibilities: (1) our study was influenced by similar biases, or (2) differences between synesthetes and controls, though modest, are robust across recruitment methods. The covariance among our measures warrants interpretation of these differences as a pattern of associations with synesthesia, leaving open the possibility that this pattern could be explained by differences on a single measured trait, or even a hidden, untested trait. More generally, this study highlights the difficulty of comparing groups of people in psychology, not to mention neuropsychology and neuroimaging studies. The requirements discussed here - systematic recruitment procedures, large battery of tests, and large cohorts - are best fulfilled through collaborative efforts and cumulative science. © 2015 The Authors. British Journal of Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  14. Suspended particulate matter collection methods influence the quantification of polycyclic aromatic compounds in the river system.

    PubMed

    Abuhelou, Fayez; Mansuy-Huault, Laurence; Lorgeoux, Catherine; Catteloin, Delphine; Collin, Valéry; Bauer, Allan; Kanbar, Hussein Jaafar; Gley, Renaud; Manceau, Luc; Thomas, Fabien; Montargès-Pelletier, Emmanuelle

    2017-10-01

    In this study, we compared the influence of two different collection methods, filtration (FT) and continuous-flow field centrifugation (CFC), on the concentration and distribution of polycyclic aromatic compounds (PACs) in suspended particulate matter (SPM) occurring in river waters. SPM samples were collected simultaneously with FT and CFC from a river during six sampling campaigns over two years, covering different hydrological contexts. SPM samples were analyzed to determine the concentrations of PACs, including 16 polycyclic aromatic hydrocarbons (PAHs), 11 oxygenated PACs (O-PACs), and 5 nitrogen PACs (N-PACs). Results showed significant differences between the two separation methods. In half of the sampling campaigns, PAC concentrations differed by a factor of 2 to 30 between FT- and CFC-collected SPM. The PAC distributions were also affected by the separation method: FT-collected SPM was enriched in 2-3-ring PACs, whereas CFC-collected SPM had PAC distributions dominated by medium- to high-molecular-weight compounds typical of combustion processes. This could be explained by the distinct cut-off thresholds of the two separation methods and strongly suggests the retention on glass-fiber filters of colloidal and/or fine matter particularly enriched in low-molecular-weight PACs. These differences between FT and CFC were not systematic but rather were enhanced by high water flow rates.

  15. Damage evolution analysis of coal samples under cyclic loading based on single-link cluster method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhibo; Wang, Enyuan; Li, Nan; Li, Xuelong; Wang, Xiaoran; Li, Zhonghui

    2018-05-01

    In this paper, the acoustic emission (AE) response of coal samples under cyclic loading is measured. The results show that there is a good positive relation between AE parameters and stress. The AE signal of coal samples under cyclic loading exhibits an obvious Kaiser effect. The single-link cluster (SLC) method is applied to analyze the spatial evolution characteristics of AE events and the damage evolution process of coal samples. It is found that the subset scale of the SLC structure becomes smaller and smaller as the number of loading cycles increases, and there is a negative linear relationship between the subset scale and the degree of damage. The spatial correlation length ξ of the SLC structure is calculated. The results show that ξ fluctuates around a certain value from the second to the fifth loading cycle, but increases clearly in the sixth. Based on the criterion of microcrack density, the coal sample failure process is a transformation from small-scale to large-scale damage, which explains the change in the spatial correlation length. This systematic analysis shows that the SLC method is an effective way to study the damage evolution of coal samples under cyclic loading, and it provides useful reference values for studying coal bursts.
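
    The SLC computation itself is standard single-link (nearest-neighbour) hierarchical clustering; a minimal sketch on stand-in AE event coordinates (the event cloud and the 10 mm link cutoff are assumptions for illustration):

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(2)
        events = rng.random((200, 3)) * 100.0      # stand-in AE hypocentres (mm)

        Z = linkage(events, method='single')       # single-link clustering
        labels = fcluster(Z, t=10.0, criterion='distance')

        # "Subset scale": mean number of events per linked subset at this cutoff.
        _, counts = np.unique(labels, return_counts=True)
        print(counts.mean())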

  16. The gas chromatographic determination of volatile fatty acids in wastewater samples: evaluation of experimental biases in direct injection method against thermal desorption method.

    PubMed

    Ullah, Md Ahsan; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo

    2014-04-11

    The production of short-chain volatile fatty acids (VFAs) by the anaerobic bacterial digestion of sewage (wastewater) affords an excellent opportunity for greener alternative bio-energy applications (e.g., microbial fuel cells). VFAs in wastewater (sewage) samples are commonly quantified through direct injection (DI) into a gas chromatograph with a flame ionization detector (GC-FID). In this study, the reliability of VFA analysis by the DI-GC method was examined against a thermal desorption (TD-GC) method. The results indicate that the VFA concentrations determined from aliquots of each wastewater sample by the DI-GC method were generally underestimated, with reductions of 7% (acetic acid) to 93.4% (hexanoic acid) relative to the TD-GC method. The observed differences between the two methods suggest a possibly important role of matrix effects in giving rise to the negative biases of the DI-GC analysis. To explore this possibility further, an ancillary experiment was performed to examine the bias patterns of three DI-GC approaches. The results of the standard addition (SA) method, for instance, confirm the definite role of the matrix effect when analyzing wastewater samples by DI-GC. More importantly, the biases tend to increase systematically with increasing molecular weight and decreasing VFA concentration. As such, the DI-GC method, if applied to the analysis of samples with a complicated matrix, needs thorough validation to improve the reliability of data acquisition. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. A systematic comparison of different object-based classification techniques using high spatial resolution imagery in agricultural environments

    NASA Astrophysics Data System (ADS)

    Li, Manchun; Ma, Lei; Blaschke, Thomas; Cheng, Liang; Tiede, Dirk

    2016-07-01

    Geographic Object-Based Image Analysis (GEOBIA) is becoming more prevalent in remote sensing classification, especially for high-resolution imagery. Many supervised classification approaches are applied to objects rather than pixels, and several studies have evaluated the performance of such supervised classification techniques in GEOBIA. However, these studies did not systematically investigate all relevant factors affecting the classification (segmentation scale, training set size, feature selection and mixed objects). In this study, statistical methods and visual inspection were used to compare these factors systematically in two agricultural case studies in China. The results indicate that Random Forest (RF) and Support Vector Machines (SVM) are highly suitable for GEOBIA classifications in agricultural areas and confirm the expected general tendency that overall accuracies decline with increasing segmentation scale. All other investigated methods except RF and SVM are more prone to lower accuracy due to broken objects at fine scales. In contrast to some previous studies, the RF classifiers yielded the best results and the k-nearest neighbour classifier the worst, in most cases. Likewise, the RF and Decision Tree classifiers are the most robust with or without feature selection. The training sample analyses indicated that RF and AdaBoost.M1 possess superior generalization capability, except when dealing with small training sample sizes. Furthermore, the classification accuracies were directly related to the homogeneity/heterogeneity of the segmented objects for all classifiers. Finally, it is suggested that RF should be considered in most cases for agricultural mapping.
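
    A minimal sketch of this kind of classifier comparison on synthetic per-object features (stand-ins for segment statistics; not the study's data, settings, or accuracy figures):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        # Stand-in object features (e.g., per-segment spectral means, textures).
        X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                                   n_classes=4, random_state=0)

        for name, clf in [("RF", RandomForestClassifier(n_estimators=200,
                                                        random_state=0)),
                          ("SVM", SVC(kernel='rbf', C=10.0)),
                          ("kNN", KNeighborsClassifier(n_neighbors=5))]:
            print(name, cross_val_score(clf, X, y, cv=5).mean())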

  18. Identification and selection of cases and controls in the Pneumonia Etiology Research for Child Health project.

    PubMed

    Deloria-Knoll, Maria; Feikin, Daniel R; Scott, J Anthony G; O'Brien, Katherine L; DeLuca, Andrea N; Driscoll, Amanda J; Levine, Orin S

    2012-04-01

    Methods for the identification and selection of patients (cases) with severe or very severe pneumonia and controls for the Pneumonia Etiology Research for Child Health (PERCH) project were needed. Issues considered include eligibility criteria and sampling strategies, whether to enroll hospital or community controls, whether to exclude controls with upper respiratory tract infection (URTI) or nonsevere pneumonia, and matching criteria, among others. PERCH ultimately decided to enroll community controls and an additional human immunodeficiency virus (HIV)-infected control group at high HIV-prevalence sites matched on age and enrollment date of cases; controls with symptoms of URTI or nonsevere pneumonia will not be excluded. Systematic sampling of cases (when necessary) and random sampling of controls will be implemented. For each issue, we present the options that were considered, the advantages and disadvantages of each, the rationale for the methods selected for PERCH, and remaining implications and limitations.
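
    The two sampling plans named above are straightforward to express in code; a minimal sketch (roster sizes and the sampling interval are hypothetical):

        import numpy as np

        rng = np.random.default_rng(3)

        def systematic_sample(ids, k):
            # Every k-th eligible case after a random start.
            return ids[rng.integers(k)::k]

        cases = np.arange(1, 1201)                 # hypothetical eligible cases
        sampled_cases = systematic_sample(cases, k=4)

        community = np.arange(1, 10001)            # hypothetical community roster
        controls = rng.choice(community, size=len(sampled_cases), replace=False)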

  19. Complex magnetic properties of TbMn₁₋ₓFeₓO₃ (x = 0.1 and 0.2) nanoparticles prepared by the sol-gel method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, A.; Chatterjee, S.; Das, D., E-mail: ddas@alpha.iuc.res.in

    2016-05-23

    TbMn₁₋ₓFeₓO₃ nanoparticles (NPs) with x = 0, 0.1 and 0.2 have been prepared by the chemical sol-gel method. Phase identification and particle size estimation were done by XRD analysis. M-H measurements at 5 K indicate complete ferromagnetic behaviour with large coercivity in the Fe-doped samples, whereas the pristine sample shows the presence of both ferromagnetic and antiferromagnetic order. ZFC and FC magnetization curves of all samples show signatures of antiferromagnetic ordering of both terbium and manganese magnetic moments, along with a systematic shift of the ordering temperatures with Fe substitution. ⁵⁷Fe Mössbauer spectroscopic measurements of the Fe-doped samples at room temperature confirm the paramagnetic behaviour and a reduction of the electric field gradient around the Fe probe atoms with increasing Fe concentration.

  20. Prediction of cyclohexane-water distribution coefficient for SAMPL5 drug-like compounds with the QMPFF3 and ARROW polarizable force fields.

    PubMed

    Kamath, Ganesh; Kurnikov, Igor; Fain, Boris; Leontyev, Igor; Illarionov, Alexey; Butin, Oleg; Olevanov, Michael; Pereyaslavets, Leonid

    2016-11-01

    We present the performance of blind predictions of water-cyclohexane distribution coefficients for 53 drug-like compounds in the SAMPL5 challenge by three methods currently in use within our group. Two of them utilize QMPFF3 and ARROW, polarizable force fields of varying complexity, and the third uses the General AMBER Force Field (GAFF). The polarizable force fields are implemented in an in-house MD package, Arbalest. We find that when we had time to parametrize the functional groups with care (batch 0), the polarizable force fields outperformed the non-polarizable one. Conversely, on the full set of 53 compounds, GAFF performed better than both QMPFF3 and ARROW. We also describe the torsion-restraint method we used to improve sampling of molecular conformational space and thus the overall accuracy of prediction. The SAMPL5 challenge highlighted several drawbacks of our force fields, such as a significant systematic over-estimation of hydrophobic interactions, specifically for alkanes and aromatic rings.

  1. Applicability of solid-phase microextraction combined with gas chromatography atomic emission detection (GC-MIP AED) for the determination of butyltin compounds in sediment samples.

    PubMed

    Carpinteiro, J; Rodríguez, I; Cela, R

    2004-11-01

    The performance of solid-phase microextraction (SPME) applied to the determination of butyltin compounds in sediment samples is systematically evaluated. Matrix effects and influence of blank signals on the detection limits of the method are studied in detail. The interval of linear response is also evaluated in order to assess the applicability of the method to sediments polluted with butyltin compounds over a large range of concentrations. Advantages and drawbacks of including an SPME step, instead of the classic liquid-liquid extraction of the derivatized analytes, in the determination of butyltin compounds in sediment samples are considered in terms of achieved detection limits and experimental effort. Analytes were extracted from the samples by sonication using glacial acetic acid. An aliquot of the centrifuged extract was placed on a vial where compounds were ethylated and concentrated on a PDMS fiber using the headspace mode. Determinations were carried out using GC-MIP AED.

  2. A Systematic Review of Published Respondent-Driven Sampling Surveys Collecting Behavioral and Biologic Data.

    PubMed

    Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G

    2016-08-01

    Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and an analytic method and, as such, it is important to adequately describe both aspects in publications. We extracted data from peer-reviewed literature published through September 2013 that reported collecting biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including the number of recruitment sites, seeds at start and end, maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.

  3. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    PubMed

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. © 2015 by John Wiley & Sons, Inc. Copyright © 2015 John Wiley & Sons, Inc.

  4. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    PubMed

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection requires at least one resistant individual to be present in a sample in order to initiate management strategies. Resistance documentation, on the other hand, attempts to capture most (≥90%) of the resistant individuals in the population. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans in detecting resistant individuals and documenting their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans, while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, the sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1% resistance frequency) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed and systematic sampling was used, the sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at frequencies of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed and systematic sampling was used, sample sizes of 3000 and 1500, respectively, were necessary. Small sample sizes either underestimated or overestimated the resistance frequency. A simple random sampling plan is, therefore, recommended for insecticide resistance detection and subsequent documentation.
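
    For the random-dispersion, random-sampling case, the detection sample size has the well-known closed form 1 - (1 - f)^n >= 0.95, which reproduces the ~300 figure at a 1% frequency; a minimal sketch (the study's other figures come from its simulation of patchy dispersion and systematic plans, not from this formula):

        import math

        def n_detect(freq, prob=0.95):
            # Smallest n with P(at least one resistant) = 1-(1-freq)**n >= prob.
            return math.ceil(math.log(1.0 - prob) / math.log(1.0 - freq))

        for f in (0.01, 0.10, 0.20):
            print(f, n_detect(f))   # 299, 29, 14 under these idealised assumptions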

  5. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    NASA Astrophysics Data System (ADS)

    Xiao, T.

    2012-12-01

    One of the most important components of urban land cover mapping is accuracy assessment. Many statistical models have been developed to help design simple sampling schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment; understanding cost and sample size is therefore crucial for implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers for Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
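
    A minimal sketch of the cost-benefit pairing (the sample-size expression is the standard one for estimating a proportion; every cost figure is hypothetical):

        import math

        def n_for_accuracy(p=0.85, margin=0.05, z=1.96):
            # Samples needed to estimate overall accuracy p to +/-margin (~95% conf.).
            return math.ceil(z**2 * p * (1.0 - p) / margin**2)

        def survey_cost(n, transport=200.0, field=15.0, lab=25.0):
            # Fixed transportation cost plus per-sample field and lab costs.
            return transport + n * (field + lab)

        n = n_for_accuracy()
        print(n, survey_cost(n))   # ~196 samples and the implied budget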

  6. Method and reporting quality in health professions education research: a systematic review.

    PubMed

    Cook, David A; Levinson, Anthony J; Garside, Sarah

    2011-03-01

    Studies evaluating reporting quality in health professions education (HPE) research have demonstrated deficiencies, but none have used comprehensive reporting standards. Additionally, the relationship between study methods and effect size (ES) in HPE research is unknown. This review aimed to evaluate, in a sample of experimental studies of Internet-based instruction, the quality of reporting, the relationship between reporting and methodological quality, and associations between ES and study methods. We conducted a systematic search of databases including MEDLINE, Scopus, CINAHL, EMBASE and ERIC, for articles published during 1990-2008. Studies (in any language) quantifying the effect of Internet-based instruction in HPE compared with no intervention or other instruction were included. Working independently and in duplicate, we coded reporting quality using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement, and coded study methods using a modified Newcastle-Ottawa Scale (m-NOS), the Medical Education Research Study Quality Instrument (MERSQI), and the Best Evidence in Medical Education (BEME) global scale. For reporting quality, articles scored a mean±standard deviation (SD) of 51±25% of STROBE elements for the Introduction, 58±20% for the Methods, 50±18% for the Results and 41±26% for the Discussion sections. We found positive associations (all p<0.0001) between reporting quality and MERSQI (ρ=0.64), m-NOS (ρ=0.57) and BEME (ρ=0.58) scores. We explored associations between study methods and knowledge ES by subtracting each study's ES from the pooled ES for studies using that method and comparing these differences between subgroups. Effect sizes in single-group pretest/post-test studies differed from the pooled estimate more than ESs in two-group studies (p=0.013). No difference was found between other study methods (yes/no: representative sample, comparison group from same community, randomised, allocation concealed, participants blinded, assessor blinded, objective assessment, high follow-up). Information is missing from all sections of reports of HPE experiments. Single-group pre-/post-test studies may overestimate ES compared with two-group designs. Other methodological variations did not bias study results in this sample. © Blackwell Publishing Ltd 2011.

  7. Computerized Cognitive Rehabilitation of Attention and Executive Function in Acquired Brain Injury: A Systematic Review.

    PubMed

    Bogdanova, Yelena; Yee, Megan K; Ho, Vivian T; Cicerone, Keith D

    Comprehensive review of the use of computerized treatment as a rehabilitation tool for attention and executive function in adults (aged 18 years or older) who suffered an acquired brain injury. Systematic review of empirical research. Two reviewers independently assessed articles using the methodological quality criteria of Cicerone et al. Data extracted included sample size, diagnosis, intervention information, treatment schedule, assessment methods, and outcome measures. A literature review (PubMed, EMBASE, Ovid, Cochrane, PsychINFO, CINAHL) generated a total of 4931 publications. Twenty-eight studies using computerized cognitive interventions targeting attention and executive functions were included in this review. In 23 studies, significant improvements in attention and executive function subsequent to training were reported; in the remaining 5, promising trends were observed. Preliminary evidence suggests improvements in cognitive function following computerized rehabilitation for acquired brain injury populations including traumatic brain injury and stroke. Further studies are needed to address methodological issues (eg, small sample size, inadequate control groups) and to inform development of guidelines and standardized protocols.

  8. [An attempt for standardization of serum CA19-9 levels, in order to dissolve the gap between three different methods].

    PubMed

    Hayashi, Kuniki; Hoshino, Tadashi; Yanai, Mitsuru; Tsuchiya, Tatsuyuki; Kumasaka, Kazunari; Kawano, Kinya

    2004-06-01

    It is well known that serious method-related differences exist in serum CA19-9 results, and the necessity of standardization has been pointed out. In this study, differences in serum tumor marker CA19-9 levels obtained with various immunoassay kits (CLEIA, FEIA, LPIA and RIA) were evaluated in sixty-seven clinical samples and five calibrators, and the possibility of reducing the inter-method differences was examined, not only for the clinical samples but also for the calibrators. We posited an assumed standard material based on one of the calibrators and recalculated the serum CA19-9 levels against it for the three different measurement methods. The results suggest that CA19-9 values recalculated with the assumed standard material could correct between-method and between-laboratory discrepancies, in particular systematic errors.

  9. Information fusion methods based on physical laws.

    PubMed

    Rao, Nageswara S V; Reister, David B; Barhen, Jacob

    2005-01-01

    We consider systems whose parameters satisfy certain easily computable physical laws. Each parameter is directly measured by a number of sensors, or estimated using measurements, or both. The measurement process may introduce both systematic and random errors which may then propagate into the estimates. Furthermore, the actual parameter values are not known since every parameter is measured or estimated, which makes the existing sample-based fusion methods inapplicable. We propose a fusion method for combining the measurements and estimators based on the least violation of physical laws that relate the parameters. Under fairly general smoothness and nonsmoothness conditions on the physical laws, we show the asymptotic convergence of our method and also derive distribution-free performance bounds based on finite samples. For suitable choices of the fuser classes, we show that for each parameter the fused estimate is probabilistically at least as good as its best measurement as well as best estimate. We illustrate the effectiveness of this method for a practical problem of fusing well-log data in methane hydrate exploration.
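
    One common way to operationalise "least violation of physical laws" is a penalised least-squares fit; a minimal sketch for three measured parameters assumed to satisfy z = x·y (the law, readings, spreads, and penalty weight are all hypothetical, not the paper's fuser):

        import numpy as np
        from scipy.optimize import least_squares

        meas = np.array([2.1, 2.9, 6.5])       # sensor readings for x, y, z
        sigma = np.array([0.1, 0.1, 0.3])      # assumed measurement spreads

        def residuals(p, w_law=10.0):
            x, y, z = p
            # Weighted distance to the measurements plus a law-violation penalty.
            return np.append((p - meas) / sigma, w_law * (z - x * y))

        fused = least_squares(residuals, x0=meas).x

    Raising w_law forces the fused estimate toward exact consistency with the law; lowering it keeps the estimate closer to the raw measurements.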

  10. Quantification and characterisation of fatty acid methyl esters in microalgae: Comparison of pretreatment and purification methods.

    PubMed

    Lage, Sandra; Gentili, Francesco G

    2018-06-01

    A systematic qualitative and quantitative analysis of fatty acid methyl esters (FAMEs) is crucial for microalgae species selection for biodiesel production. The aim of this study is to identify the best method to assess microalgae FAMEs composition and content. A single-step method, was tested with and without purification steps-that is, separation of lipid classes by thin-layer chromatography (TLC) or solid-phase extraction (SPE). The efficiency of a direct transesterification method was also evaluated. Additionally, the yield of the FAMEs and the profiles of the microalgae samples with different pretreatments (boiled in isopropanol, freezing, oven-dried and freeze-dried) were compared. The application of a purification step after lipid extraction proved to be essential for an accurate FAMEs characterisation. The purification methods, which included TLC and SPE, provided superior results compared to not purifying the samples. Freeze-dried microalgae produced the lowest FAMEs yield. However, FAMEs profiles were generally equivalent among the pretreatments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Land Boundary Delineation to Supporting of Program Systematic Complete Land Registration (PTSL) Using Multicopter-RTF Data (Case study: Wotan Village, Panceng Sub District, Gresik district)

    NASA Astrophysics Data System (ADS)

    Cahyono, A. B.; Deviantari, U. W.

    2017-12-01

    According to statutory regulation issued by Ministry of Land and Spatial Planning/Head of National Land Agency (BPN) number 35/2016, Comprehensive Systematic land registration is a sequential activity of which continuously and systematically carried out by the government ranging from collecting, processing, recording and presenting, as well as maintaining the physical and juridical data in the form of map and list of land-plots and flats, including the transfer of legal title for land plots and flats with their inherent rights. Delineation is one method to identify land plots by utilizing map image or high resolution photo and defining the boundaries by drawing lines to determine the valid and recognizable boundaries. A guideline to delineate the unregistered land plots may be determined from this two methods’ accuracy result, using general boundary applied to aerial photo taken by multicopter RTF. Data taken from a height of 70 meter on an area obtained a number of 156 photos with 5 GCP resulting in an photo map with GSD 2.14 cm. The 11 samples parcels are selected in the sites of ± 7 ha. There are 11 samples of land parcels are tested. The area difference test for every parcel using a average standard deviation of 17,043 indicates that there are three land parcels which have significant area difference and 8 others do not have significant area difference. Based on the tolerance of National Land Agency, among 11 parcels studied, there are 8 parcels that fullfill the tolerances and three others do not fullfill tolerances. The percentage of area difference average between land registration map and orthophoto is 4,72%. The result shows that the differences in boundaries and areas that may be caused by a systematic error of method in describing the boundaries of the ground.

  12. Comparison of projection skills of deterministic ensemble methods using pseudo-simulation data generated from multivariate Gaussian distribution

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2017-07-01

    The projection skills of five ensemble methods were analyzed according to simulation skill, training period, and number of ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation mimicking the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were: equal-weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods were generally better than that of the best member of each category. However, their projection skills are significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than the non-weighted methods, in particular for the PSD categories with systematic biases and various correlation coefficients. EWA_NBC showed considerably lower projection skills than the other methods, in particular for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skills, it was strongly sensitive to the PSD categories, training periods, and number of members. On the other hand, WEA_Tay and WEA_RAC showed relatively superior skills in both accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
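
    A minimal sketch of the difference between equal and skill-based weighting (inverse-RMSE weights over a training period serve as a simple stand-in for the paper's WEA_RAC/WEA_Tay scores; all series are synthetic):

        import numpy as np

        rng = np.random.default_rng(4)
        truth = rng.normal(15.0, 5.0, size=120)                     # pseudo truth
        members = truth + rng.normal(0.0, [[1.0], [2.0], [4.0]],
                                     size=(3, 120))                 # 3 members

        rmse = np.sqrt(((members - truth) ** 2).mean(axis=1))
        w = (1.0 / rmse) / (1.0 / rmse).sum()   # skill-based weights

        ewa = members.mean(axis=0)              # EWA_NBC-like equal weighting
        wea = w @ members                       # weighted ensemble average

    Adding a systematic bias to one member and comparing the two averages against the truth series reproduces, in miniature, the advantage of the weighted methods reported above.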

  13. Methods for Specifying the Target Difference in a Randomised Controlled Trial: The Difference ELicitation in TriAls (DELTA) Systematic Review

    PubMed Central

    Hislop, Jenni; Adewuyi, Temitope E.; Vale, Luke D.; Harrild, Kirsten; Fraser, Cynthia; Gurung, Tara; Altman, Douglas G.; Briggs, Andrew H.; Fayers, Peter; Ramsay, Craig R.; Norrie, John D.; Harvey, Ian M.; Buckley, Brian; Cook, Jonathan A.

    2014-01-01

    Background Randomised controlled trials (RCTs) are widely accepted as the preferred study design for evaluating healthcare interventions. When the sample size is determined, a (target) difference is typically specified that the RCT is designed to detect. This provides reassurance that the study will be informative, i.e., should such a difference exist, it is likely to be detected with the required statistical precision. The aim of this review was to identify potential methods for specifying the target difference in an RCT sample size calculation. Methods and Findings A comprehensive systematic review of medical and non-medical literature was carried out for methods that could be used to specify the target difference for an RCT sample size calculation. The databases searched were MEDLINE, MEDLINE In-Process, EMBASE, the Cochrane Central Register of Controlled Trials, the Cochrane Methodology Register, PsycINFO, Science Citation Index, EconLit, the Education Resources Information Center (ERIC), and Scopus (for in-press publications); the search period was from 1966 or the earliest date covered, to between November 2010 and January 2011. Additionally, textbooks addressing the methodology of clinical trials and International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) tripartite guidelines for clinical trials were also consulted. A narrative synthesis of methods was produced. Studies that described a method that could be used for specifying an important and/or realistic difference were included. The search identified 11,485 potentially relevant articles from the databases searched. Of these, 1,434 were selected for full-text assessment, and a further nine were identified from other sources. Fifteen clinical trial textbooks and the ICH tripartite guidelines were also reviewed. In total, 777 studies were included, and within them, seven methods were identified—anchor, distribution, health economic, opinion-seeking, pilot study, review of the evidence base, and standardised effect size. Conclusions A variety of methods are available that researchers can use for specifying the target difference in an RCT sample size calculation. Appropriate methods may vary depending on the aim (e.g., specifying an important difference versus a realistic difference), context (e.g., research question and availability of data), and underlying framework adopted (e.g., Bayesian versus conventional statistical approach). Guidance on the use of each method is given. No single method provides a perfect solution for all contexts. Please see later in the article for the Editors' Summary PMID:24824338
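
    Once a target difference has been specified by one of these methods, it enters the familiar two-arm calculation; a minimal sketch for a continuous outcome (the delta and sd values are placeholders):

        import math
        from scipy.stats import norm

        def n_per_arm(delta, sd, alpha=0.05, power=0.90):
            # Standard normal-approximation sample size per arm for detecting
            # a target difference `delta` in an outcome with std dev `sd`.
            z = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)
            return math.ceil(2.0 * (z * sd / delta) ** 2)

        print(n_per_arm(delta=5.0, sd=10.0))   # ~85 participants per arm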

  14. Case Series Investigations in Cognitive Neuropsychology

    PubMed Central

    Schwartz, Myrna F.; Dell, Gary S.

    2011-01-01

    Case series methodology involves the systematic assessment of a sample of related patients, with the goal of understanding how and why they differ from one another. This method has become increasingly important in cognitive neuropsychology, which has long been identified with single-subject research. We review case series studies dealing with impaired semantic memory, reading, and language production, and draw attention to the affinity of this methodology for testing theories that are expressed as computational models and for addressing questions about neuroanatomy. It is concluded that case series methods usefully complement single-subject techniques. PMID:21714756

  15. Hepatitis C bio-behavioural surveys in people who inject drugs-a systematic review of sensitivity to the theoretical assumptions of respondent driven sampling.

    PubMed

    Buchanan, Ryan; Khakoo, Salim I; Coad, Jonathan; Grellier, Leonie; Parkes, Julie

    2017-07-11

    New, more effective and better-tolerated therapies for hepatitis C (HCV) have made the elimination of HCV a feasible objective. However, for this to be achieved, it is necessary to have a detailed understanding of HCV epidemiology in people who inject drugs (PWID). Respondent-driven sampling (RDS) can provide prevalence estimates in hidden populations such as PWID. The aims of this systematic review are to identify published studies that use RDS in PWID to measure the prevalence of HCV, and compare each study against the STROBE-RDS checklist to assess their sensitivity to the theoretical assumptions underlying RDS. Searches were undertaken in accordance with PRISMA systematic review guidelines. Included studies were English language publications in peer-reviewed journals, which reported the use of RDS to recruit PWID to an HCV bio-behavioural survey. Data was extracted under three headings: (1) survey overview, (2) survey outcomes, and (3) reporting against selected STROBE-RDS criteria. Thirty-one studies met the inclusion criteria. They varied in scale (range 1-15 survey sites) and the sample sizes achieved (range 81-1000 per survey site) but were consistent in describing the use of standard RDS methods including: seeds, coupons and recruitment incentives. Twenty-seven studies (87%) either calculated or reported the intention to calculate population prevalence estimates for HCV and two used RDS data to calculate the total population size of PWID. Detailed operational and analytical procedures and reporting against selected criteria from the STROBE-RDS checklist varied between studies. There were widespread indications that sampling did not meet the assumptions underlying RDS, which led to two studies being unable to report an estimated HCV population prevalence in at least one survey location. RDS can be used to estimate a population prevalence of HCV in PWID and estimate the PWID population size. Accordingly, as a single instrument, it is a useful tool for guiding HCV elimination. However, future studies should report the operational conduct of each survey in accordance with the STROBE-RDS checklist to indicate sensitivity to the theoretical assumptions underlying the method. PROSPERO CRD42015019245.

  16. Effects of Pilates exercise programs in people with chronic low back pain: a systematic review.

    PubMed

    Patti, Antonino; Bianco, Antonino; Paoli, Antonio; Messina, Giuseppe; Montalto, Maria Alessandra; Bellafiore, Marianna; Battaglia, Giuseppe; Iovane, Angelo; Palma, Antonio

    2015-01-01

    The Pilates method has recently become a fast-growing, popular form of exercise recommended for healthy individuals and those engaged in rehabilitation. Several published studies have examined the effects of the Pilates method in people with chronic low back pain (LBP). The objective of this study is to describe and provide an extensive overview of the scientific literature comparing the effectiveness of the Pilates method on pain and disability in patients with chronic nonspecific LBP. The study is based on data from the following sources: MEDLINE-NLM, MEDLINE-EBSCO, Scopus Elsevier, Cochrane, DOAJ, SciELO, and PLOSONE. Original articles and systematic reviews of adults with chronic nonspecific LBP that evaluated pain and/or disability were included, provided the primary treatment was based on Pilates method exercises compared with no treatment, minimal intervention, other types of intervention, or other types of exercise. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) were adopted. The literature search covered 7 electronic databases and the reference lists of relevant systematic reviews and original articles to July 2014. Two independent investigators conducted the literature search and performed the synthesis as follows: study design; sample (n); disability measure; intervention; and main results. The searches identified a total of 128 articles. Of these, 29 were considered eligible and were included in the analysis. The items were stratified as follows: Pilates method versus other kinds of exercise (n = 6 trials) and Pilates method versus no treatment group or minimal intervention for short-term pain (n = 9 trials); the therapeutic effect of the Pilates method in randomized cohorts (n = 5); and analysis of reviews (n = 9). We found a dearth of studies that clearly demonstrate the efficacy of one specific Pilates exercise program over another in the treatment of chronic pain. However, the consensus in the field suggests that the Pilates method is more effective than minimal physical exercise intervention in reducing pain. These conclusions need to be supported by further proper investigations.

  17. Atmospheric vs. anaerobic processing of metabolome samples for the metabolite profiling of a strict anaerobic bacterium, Clostridium acetobutylicum.

    PubMed

    Lee, Sang-Hyun; Kim, Sooah; Kwon, Min-A; Jung, Young Hoon; Shin, Yong-An; Kim, Kyoung Heon

    2014-12-01

    Well-established metabolome sample preparation is a prerequisite for reliable metabolomic data. For metabolome sampling of a Gram-positive strict anaerobe, Clostridium acetobutylicum, fast filtration and metabolite extraction with acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C under anaerobic conditions have been commonly used. This anaerobic processing method is laborious and time-consuming since it is conducted in an anaerobic chamber. Also, there has been no systematic evaluation or development of metabolome sample preparation methods for strict anaerobes and Gram-positive bacteria. In this study, metabolome sampling and extraction methods were rigorously evaluated and optimized for C. acetobutylicum by using gas chromatography/time-of-flight mass spectrometry-based metabolomics, in which a total of 116 metabolites were identified. When comparing atmospheric (i.e., in air) and anaerobic (i.e., in an anaerobic chamber) processing of metabolome samples, there was no significant difference in the quality or quantity of the metabolomic data. For metabolite extraction, pure methanol at -20°C was a better solvent than the acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C that is frequently used for C. acetobutylicum, and metabolite profiles differed significantly depending on the extraction solvent. This is the first evaluation of metabolite sample preparation under aerobic processing conditions for a strict anaerobe. The method can be applied conveniently, efficiently, and reliably to metabolome analysis of strict anaerobes in air. © 2014 Wiley Periodicals, Inc.

  18. Crop identification and area estimation over large geographic areas using LANDSAT MSS data

    NASA Technical Reports Server (NTRS)

    Bauer, M. E. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. LANDSAT MSS data were adequate to accurately identify wheat in Kansas; corn and soybean estimates in Indiana were less accurate. Computer-aided analysis techniques were effectively used to extract crop identification information from LANDSAT data. Systematic sampling of entire counties, made possible by computer classification methods, resulted in very precise area estimates at the county, district, and state levels. Training statistics were successfully extended from one county to other counties having similar crops and soils if the training areas sampled the total variation of the area to be classified.

  19. Signal recognition efficiencies of artificial neural-network pulse-shape discrimination in HPGe 0νββ-decay searches

    NASA Astrophysics Data System (ADS)

    Caldwell, A.; Cossavella, F.; Majorovits, B.; Palioselitis, D.; Volynets, O.

    2015-07-01

    A pulse-shape discrimination method based on artificial neural networks was applied to pulses simulated for different background, signal and signal-like interactions inside a germanium detector. The simulated pulses were used to investigate how the efficiencies vary as a function of the training set used. It is verified that neural networks are well suited to identifying background pulses in true-coaxial high-purity germanium detectors. The systematic uncertainty on the signal recognition efficiency, derived using signal-like evaluation samples from calibration measurements, is estimated to be 5%. This uncertainty is due to differences between signal and calibration samples.
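
    As a toy illustration of the technique named in this record (not the authors' detector simulation or network), the sketch below trains a small scikit-learn MLP to separate synthetic single-peak ("signal-like") pulses from two-peak ("background-like") pulses; all pulse shapes, noise levels and labels are invented.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 50)

    def pulse(centers):
        # Gaussian current peak(s) standing in for single/multi-site events.
        return sum(np.exp(-((t - c) ** 2) / 0.005) for c in centers)

    X, y = [], []
    for _ in range(1000):
        if rng.random() < 0.5:                        # "signal-like": one site
            X.append(pulse([rng.uniform(0.3, 0.7)])); y.append(1)
        else:                                         # "background-like": two sites
            c = rng.uniform(0.2, 0.5)
            X.append(pulse([c, c + rng.uniform(0.2, 0.3)])); y.append(0)
    X = np.array(X) + rng.normal(0.0, 0.05, (1000, t.size))   # detector noise

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
    clf.fit(Xtr, ytr)
    print("held-out classification accuracy:", clf.score(Xte, yte))
    ```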

  20. Ring artifact reduction in synchrotron x-ray tomography through helical acquisition

    NASA Astrophysics Data System (ADS)

    Pelt, Daniël M.; Parkinson, Dilworth Y.

    2018-03-01

    In synchrotron x-ray tomography, systematic defects in certain detector elements can result in arc-shaped artifacts in the final reconstructed image of the scanned sample. These ring artifacts are commonly found in many applications of synchrotron tomography, and can make it difficult or impossible to use the reconstructed image in further analyses. The severity of ring artifacts is often reduced in practice by applying pre-processing on the acquired data, or post-processing on the reconstructed image. However, such additional processing steps can introduce additional artifacts as well, and rely on specific choices of hyperparameter values. In this paper, a different approach to reducing the severity of ring artifacts is introduced: a helical acquisition mode. By moving the sample parallel to the rotation axis during the experiment, the sample is detected at different detector positions in each projection, reducing the effect of systematic errors in detector elements. Alternatively, helical acquisition can be viewed as a way to transform ring artifacts to helix-like artifacts in the reconstructed volume, reducing their severity. We show that data acquired with the proposed mode can be transformed to data acquired with a virtual circular trajectory, enabling further processing of the data with existing software packages for circular data. Results for both simulated data and experimental data show that the proposed method is able to significantly reduce ring artifacts in practice, even compared with popular existing methods, without introducing additional artifacts.
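
    The core of the helical-to-circular transformation can be sketched as follows: because the sample translates along the rotation axis at a known rate, each projection can be shifted back along the detector-row direction, yielding data a virtual circular scan would have produced. This is a minimal sketch under an assumed linear translation model; function and parameter names are invented, not the paper's implementation.

    ```python
    import numpy as np
    from scipy.ndimage import shift

    def helical_to_circular(projs, pitch_px_per_proj):
        """projs: (n_angles, n_rows, n_cols) helical radiographs;
        pitch_px_per_proj: sample translation per projection, in detector pixels."""
        aligned = np.empty_like(projs)
        for i, p in enumerate(projs):
            dz = i * pitch_px_per_proj        # translation accumulated so far
            # Shift back along the rotation-axis (row) direction; each detector
            # row's systematic error now lands on a different sample height.
            aligned[i] = shift(p, (-dz, 0.0), order=1, mode='nearest')
        return aligned

    # Toy usage with random data standing in for acquired projections.
    projs = np.random.rand(180, 64, 96)
    virtual_circular = helical_to_circular(projs, pitch_px_per_proj=0.1)
    ```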

  1. A systematic approach to designing statistically powerful heteroscedastic 2 × 2 factorial studies while minimizing financial costs.

    PubMed

    Jan, Show-Li; Shieh, Gwowen

    2016-08-31

    The 2 × 2 factorial design is widely used for assessing the existence of interaction and the extent of generalizability of two factors, where each factor has only two levels. Accordingly, research problems associated with the main effects and interaction effects can be analyzed with selected linear contrasts. To correct for potential heterogeneity of the variance structure, the Welch-Satterthwaite test is commonly used as an alternative to the t test for detecting the substantive significance of a linear combination of mean effects. This study concerns the optimal allocation of group sizes for the Welch-Satterthwaite test in order to minimize the total cost while maintaining adequate power. The existing method suggests that the optimal ratio of sample sizes is proportional to the ratio of the population standard deviations divided by the square root of the ratio of the unit sampling costs. Instead, a systematic approach using optimization techniques and a screening search is presented to find the optimal solution. Numerical assessments revealed that the current allocation scheme generally does not give the optimal solution. The suggested approaches to power and sample size calculation give accurate and superior results under various treatment and cost configurations. The proposed approach improves upon the current method in both methodological soundness and overall performance. Supplementary algorithms are also developed to aid the implementation of the recommended technique in planning 2 × 2 factorial designs.
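
    To make the allocation problem concrete, the sketch below runs a coarse screening search for the cheapest group sizes that achieve a target power for the 2 × 2 interaction contrast, using a simple normal approximation to the power of a Welch-type test. It is a simplified stand-in for the authors' exact Welch-Satterthwaite computations; all standard deviations, costs, and effect sizes are invented.

    ```python
    import itertools, math
    from scipy.stats import norm

    sigma = [4.0, 6.0, 5.0, 8.0]      # group standard deviations (assumed)
    cost  = [1.0, 1.0, 2.0, 2.0]      # unit sampling cost per group (assumed)
    delta, alpha, target = 6.0, 0.05, 0.80   # contrast effect, level, power

    def approx_power(n):
        # Variance of the interaction contrast mu11 - mu12 - mu21 + mu22.
        v = sum(s * s / ni for s, ni in zip(sigma, n))
        z = delta / math.sqrt(v)
        return norm.sf(norm.ppf(1 - alpha / 2) - z)   # normal approximation

    best = None
    for n in itertools.product(range(5, 41, 5), repeat=4):   # screening grid
        if approx_power(n) >= target:
            c = sum(ci * ni for ci, ni in zip(cost, n))
            if best is None or c < best[0]:
                best = (c, n)
    print("cheapest qualifying allocation (total cost, group sizes):", best)
    ```

    A finer grid (or a continuous optimizer) around the best coarse solution would refine the allocation; the classical heuristic instead takes each group size proportional to its standard deviation divided by the square root of its unit cost.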

  2. Delayed reward discounting and addictive behavior: a meta-analysis

    PubMed Central

    Amlung, Michael T.; Few, Lauren R.; Ray, Lara A.; Sweet, Lawrence H.; Munafò, Marcus R.

    2011-01-01

    Rationale Delayed reward discounting (DRD) is a behavioral economic index of impulsivity and numerous studies have examined DRD in relation to addictive behavior. To synthesize the findings across the literature, the current review is a meta-analysis of studies comparing DRD between criterion groups exhibiting addictive behavior and control groups. Objectives The meta-analysis sought to characterize the overall patterns of findings, systematic variability by sample and study type, and possible small study (publication) bias. Methods Literature reviews identified 310 candidate articles from which 46 studies reporting 64 comparisons were identified (total N=56,013). Results From the total comparisons identified, a small magnitude effect was evident (d=.15; p<.00001) with very high heterogeneity of effect size. Based on systematic observed differences, large studies assessing DRD with a small number of self-report items were removed and an analysis of 57 comparisons (n=3,329) using equivalent methods and exhibiting acceptable heterogeneity revealed a medium magnitude effect (d=.58; p<.00001). Further analyses revealed significantly larger effect sizes for studies using clinical samples (d=.61) compared with studies using nonclinical samples (d=.45). Indices of small study bias among the various comparisons suggested varying levels of influence by unpublished findings, ranging from minimal to moderate. Conclusions These results provide strong evidence of greater DRD in individuals exhibiting addictive behavior in general and particularly in individuals who meet criteria for an addictive disorder. Implications for the assessment of DRD and research priorities are discussed. PMID:21373791

  3. Temperature Dependent Electron Transport Properties of Gold Nanoparticles and Composites: Scanning Tunneling Spectroscopy Investigations.

    PubMed

    Patil, Sumati; Datar, Suwarna; Dharmadhikari, C V

    2018-03-01

    Scanning tunneling spectroscopy (STS) is used to investigate variations in the electronic properties of gold nanoparticles (AuNPs) and their composite with a urethane-methacrylate comb polymer (UMCP) as a function of temperature. Films were prepared by drop-casting AuNPs and UMCP onto silicon substrates in the desired manner. The samples were further analyzed for morphology by scanning electron microscopy (SEM) and atomic force microscopy (AFM). STS measurements performed in the temperature range of 33 °C to 142 °C show systematic variation in the current versus voltage (I-V) curves, exhibiting a semiconducting-to-metallic transition/Schottky behavior for different samples, depending upon the preparation method and as a function of temperature. During current versus time (I-t) measurements on the AuNPs, random telegraphic noise was observed at room temperature, with the tunneling current switching randomly between two discrete levels. Power spectra derived from the I-t data show a 1/f² dependence. Statistical analysis of the fluctuations shows exponential behavior with a time width τ ≈ 7 ms. Local density of states (LDOS) plots derived from the I-V curves of each sample show a systematic shift of the valence/conduction band edge towards/away from the Fermi level with increasing temperature. Schottky emission is the best-fitting electron emission mechanism for all samples over a certain range of bias voltage. Schottky plots were used to calculate barrier heights, and the temperature-dependent measurements enabled estimation of the activation energies for electron transport in all samples.

  4. Reproductive tract infections: prevalence and risk factors in rural Bangladesh.

    PubMed Central

    Hawkes, Sarah; Morison, Linda; Chakraborty, Jyotsnamoy; Gausia, Kaniz; Ahmed, Farid; Islam, Shamim Sufia; Alam, Nazmul; Brown, David; Mabey, David

    2002-01-01

    OBJECTIVE: To determine the prevalence of and risk factors for reproductive tract infections among men and women in a rural community in Bangladesh. METHODS: In the Matlab area a systematic sample of married non-pregnant women aged 15-50 years was drawn from a comprehensive household registration system for married women. A systematic sample of married and unmarried men in the same age group was drawn from a census-derived demographic surveillance list. Private interviews were conducted with 804 women in a clinic, and cervical, vaginal, urinary and serological samples were collected. Urine and blood specimens were obtained from 969 men who were interviewed at home. FINDINGS: The prevalence of bacterial and viral reproductive tract infections was low to moderate. For example, fewer than 1% of the women had a cervical infection. No cases of human immunodeficiency virus (HIV) infection were found. However, among men there was a high level of reported risk behaviour and a low level of protection against infection. CONCLUSION: A low prevalence of reproductive tract infections, coupled with a high level of reported risk behaviour, indicated a need for primary programmes that would prevent an increase in the incidence of reproductive tract infections, sexually transmitted infections and HIV infection. PMID:11984603
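
    Systematic sampling of the kind used to draw these respondents can be sketched in a few lines: pick a random starting point, then take every k-th record from the ordered frame. The frame and sizes below are invented placeholders, not the Matlab surveillance data.

    ```python
    import random

    def systematic_sample(frame, n):
        k = len(frame) // n                  # sampling interval
        start = random.randrange(k)          # random start in [0, k)
        return [frame[start + i * k] for i in range(n)]

    frame = list(range(10_000))              # stand-in for a registration list
    sample = systematic_sample(frame, n=804) # e.g., 804 women interviewed
    print(sample[:5], "... interval k =", len(frame) // 804)
    ```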

  5. Prevalence of peptic ulcer in Iran: Systematic review and meta-analysis methods.

    PubMed

    Sayehmiri, Kourosh; Abangah, Ghobad; Kalvandi, Gholamreza; Tavan, Hamed; Aazami, Sanaz

    2018-01-01

    Peptic ulcer is a prevalent problem whose symptoms include epigastric pain and heartburn. This study aimed to investigate the prevalence and causes of peptic ulcers in Iran using systematic review and meta-analysis. Eleven Iranian papers published from 2002 to 2016 were selected using valid keywords in the SID, Google Scholar, PubMed and Elsevier databases. Results of the studies were pooled using a random-effects model in the meta-analysis. Heterogeneity was checked using the Q test and the I² index. The total sample size in this study consisted of 1335 individuals with peptic ulcer (about 121 per article). The prevalence of peptic ulcers was estimated at 34% (95% CI: 0.25-0.43). The prevalence of peptic ulcers was 30% in women and 60% in men. The most frequently reported environmental factor (cigarette smoking) was addressed in 30% (95% CI: 0.23-0.37) of patients. The prevalence of Helicobacter pylori was estimated at 62% (95% CI: 0.49-0.75) of patients. The results of this study show that the prevalence of peptic ulcers in Iran (34%) is higher than the worldwide rate (6% to 15%). There was an increasing trend in the prevalence of peptic ulcer over the decade from 2002 to 2016.
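
    The pooling machinery named in this record (random-effects model, Q test, I² index) can be illustrated with a short DerSimonian-Laird sketch; the study proportions and sample sizes below are invented, not the paper's data.

    ```python
    import math

    # (prevalence, sample size) for hypothetical studies
    studies = [(0.28, 150), (0.41, 90), (0.33, 200), (0.25, 120)]
    p = [x for x, _ in studies]
    v = [x * (1 - x) / n for x, n in studies]   # variance of each proportion
    w = [1 / vi for vi in v]                    # fixed-effect weights

    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    Q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
    df = len(p) - 1
    I2 = max(0.0, (Q - df) / Q) * 100           # heterogeneity index (%)
    tau2 = max(0.0, (Q - df) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))

    w_re = [1 / (vi + tau2) for vi in v]        # random-effects weights
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    print(f"pooled prevalence {p_re:.3f} "
          f"(95% CI {p_re - 1.96 * se:.3f}-{p_re + 1.96 * se:.3f}), I2 = {I2:.0f}%")
    ```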

  6. [Systematic review of the validity of urine cultures collected by sterile perineal bags].

    PubMed

    Ochoa Sangrador, C; Pascual Terrazas, A

    2016-02-01

    The perineal adhesive bag is the most widely used method in our country for urine culture collection in infants, despite having a high risk of contamination and false-positive results. We aimed to quantify both types of risk through a systematic review. The search was updated in May 2014 in PUBMED, SCOPUS (which includes EMBASE), IBECS, CINAHL, LILACS and CUIDEN, without language or time limits. Percentages of contaminated urines, false positives, sensitivity and specificity (with respect to catheterization or bladder puncture) were recorded. A total of 21 studies of medium quality (7,659 samples) were selected. The pooled percentage of contaminated urines was 46.6% (15 studies; 6,856 samples; 95% confidence interval [95% CI]: 35.6 to 57.8%; I²: 97.3%). The pooled percentage of false positives was 61.1% (12 studies; 575 samples; 95% CI: 37.9 to 82.2%; I²: 96.2%). Sensitivity (88%; 95% CI: 81-93%; I²: 55.2%) and specificity (82%; 95% CI: 75-89%; I²: 41.3%) were estimated in five studies, but without including contaminated urines. The perineal adhesive bag is not a sufficiently valid method for urine culture collection, because almost half of the samples are contaminated and, among positives, two out of three are false. Although these estimates are imprecise because of their great heterogeneity, they should be considered when choosing the method of urine collection. The estimates of sensitivity and specificity are not applicable because they do not take into account the risk of contamination. Copyright © 2015 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  7. Systematic optimization of ethyl glucuronide extraction conditions from scalp hair by design of experiments and its potential effect on cut-off values appraisal.

    PubMed

    Alladio, Eugenio; Biosa, Giulia; Seganti, Fabrizio; Di Corcia, Daniele; Salomone, Alberto; Vincenti, Marco; Baumgartner, Markus R

    2018-05-11

    The quantitative determination of ethyl glucuronide (EtG) in hair samples is consistently used throughout the world to assess chronic excessive alcohol consumption. For administrative and legal purposes, the analytical results are compared with cut-off values recognized by regulatory authorities and scientific societies. However, it has recently been recognized that the analytical results depend on the hair sample pretreatment procedures, including the crumbling and extraction conditions. A systematic evaluation of the EtG extraction conditions from pulverized scalp hair was conducted by design of experiments (DoE), considering the extraction time, temperature, pH, and solvent composition as potential influencing factors. It was concluded that an overnight extraction at 60°C with pure water at neutral pH represents the most effective condition for achieving high extraction yields. The absence of differential degradation of the internal standard (isotopically labeled EtG) under these conditions was confirmed, and the overall analytical method was validated according to SWGTOX and ISO 17025 criteria. Twenty real hair samples with different EtG content were analyzed with three commonly accepted procedures: (a) hair manually cut into snippets and extracted at room temperature; (b) pulverized hair extracted at room temperature; (c) hair treated with the optimized method. Average increments in EtG concentration of around 69% (from a to c) and 29% (from b to c) were recorded. In light of these results, the authors urge the scientific community to undertake an inter-laboratory study with the aim of defining the optimal hair EtG detection method in more detail and verifying the corresponding cut-off level for legal enforcement. This article is protected by copyright. All rights reserved.
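
    The design-of-experiments idea in this record can be sketched as a two-level full factorial over the four factors named (time, temperature, pH, solvent composition), with main effects estimated from the run results. Factor levels and the simulated response below are illustrative assumptions, not the study's data.

    ```python
    import itertools, random

    factors = {"time_h": (2, 16), "temp_C": (25, 60),
               "pH": (4, 7), "water_frac": (0.5, 1.0)}
    design = list(itertools.product(*factors.values()))   # 2^4 = 16 runs

    def measured_yield(time_h, temp_C, pH, water_frac):
        # Stand-in for the assay; pH deliberately has no effect in this toy.
        return 0.3 * time_h + 0.5 * temp_C + 2 * water_frac + random.gauss(0, 1)

    runs = [(x, measured_yield(*x)) for x in design]

    for i, name in enumerate(factors):                    # main effect per factor
        hi = [y for x, y in runs if x[i] == max(factors[name])]
        lo = [y for x, y in runs if x[i] == min(factors[name])]
        print(f"main effect of {name}: {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")
    ```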

  8. Manual search approaches used by systematic reviewers in dermatology.

    PubMed

    Vassar, Matt; Atakpo, Paul; Kash, Melissa J

    2016-10-01

    Manual searches are supplemental approaches to database searches to identify additional primary studies for systematic reviews. The authors argue that these manual approaches, in particular hand-searching and perusing reference lists, are often considered the same yet lead to different outcomes. We conducted a PubMed search for systematic reviews in the top 10 dermatology journals (January 2006-January 2016). After screening, the final sample comprised 292 reviews. Statements related to manual searches were extracted from each review and categorized by the primary and secondary authors. Each statement was categorized as either "Search of Reference List," "Hand Search," "Both," or "Unclear." Of the 292 systematic reviews included in our sample, 143 reviews (48.97%) did not report a hand-search or scan of reference lists. One-hundred thirty-six reviews (46.58%) reported searches of reference lists, while 4 reviews (1.37%) reported systematic hand-searches. Three reviews (1.03%) reported use of both hand-searches and scanning reference lists. Six reviews (2.05%) were classified as unclear due to vague wording. Authors of systematic reviews published in dermatology journals in our study sample scanned reference lists more frequently than they conducted hand-searches, possibly contributing to biased search outcomes. We encourage systematic reviewers to routinely practice hand-searching in order to minimize bias.

  9. Diverse Applications of Environmental DNA Methods in Parasitology.

    PubMed

    Bass, David; Stentiford, Grant D; Littlewood, D T J; Hartikainen, Hanna

    2015-10-01

    Nucleic acid extraction and sequencing of genes from organisms within environmental samples encompasses a variety of techniques collectively referred to as environmental DNA or 'eDNA'. The key advantages of eDNA analysis include the detection of cryptic or otherwise elusive organisms, large-scale sampling with fewer biases than specimen-based methods, and generation of data for molecular systematics. These are particularly relevant for parasitology because parasites can be difficult to locate and are morphologically intractable and genetically divergent. However, parasites have rarely been the focus of eDNA studies. Focusing on eukaryote parasites, we review the increasing diversity of the 'eDNA toolbox'. Combining eDNA methods with complementary tools offers much potential to understand parasite communities, disease risk, and parasite roles in broader ecosystem processes such as food web structuring and community assembly. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  10. Effects of forcefield and sampling method in all-atom simulations of inherently disordered proteins: Application to conformational preferences of human amylin

    PubMed Central

    Peng, Enxi; Todorova, Nevena

    2017-01-01

    Although several computational modelling studies have investigated the conformational behaviour of the inherently disordered protein (IDP) amylin, discrepancies in identifying its preferred solution conformations still exist between the various forcefields and sampling methods used. Human islet amyloid polypeptide has long been a subject of research, both experimental and theoretical, as the aggregation of this protein is believed to be the leading cause of type-II diabetes. In this work, we present a systematic forcefield assessment using one of the most advanced non-biased sampling techniques, Replica Exchange with Solute Tempering (REST2), by comparing the secondary structure preferences of monomeric amylin in solution. This study also aims to determine the ability of common forcefields to sample a transition of the protein from a helical membrane-bound conformation into the disordered solution state of amylin. Our results demonstrated that the CHARMM22* forcefield showed the best ability to sample the multiple conformational states inherent to amylin. REST2 yielded results qualitatively consistent with experiments and in quantitative agreement with other sampling methods, while being far more computationally efficient and without any bias. Therefore, combining an unbiased sampling technique such as REST2 with rigorous forcefield testing is suggested as an important step in developing an efficient and robust strategy for simulating IDPs. PMID:29023509

  11. Effects of forcefield and sampling method in all-atom simulations of inherently disordered proteins: Application to conformational preferences of human amylin.

    PubMed

    Peng, Enxi; Todorova, Nevena; Yarovsky, Irene

    2017-01-01

    Although several computational modelling studies have investigated the conformational behaviour of the inherently disordered protein (IDP) amylin, discrepancies in identifying its preferred solution conformations still exist between the various forcefields and sampling methods used. Human islet amyloid polypeptide has long been a subject of research, both experimental and theoretical, as the aggregation of this protein is believed to be the leading cause of type-II diabetes. In this work, we present a systematic forcefield assessment using one of the most advanced non-biased sampling techniques, Replica Exchange with Solute Tempering (REST2), by comparing the secondary structure preferences of monomeric amylin in solution. This study also aims to determine the ability of common forcefields to sample a transition of the protein from a helical membrane-bound conformation into the disordered solution state of amylin. Our results demonstrated that the CHARMM22* forcefield showed the best ability to sample the multiple conformational states inherent to amylin. REST2 yielded results qualitatively consistent with experiments and in quantitative agreement with other sampling methods, while being far more computationally efficient and without any bias. Therefore, combining an unbiased sampling technique such as REST2 with rigorous forcefield testing is suggested as an important step in developing an efficient and robust strategy for simulating IDPs.

  12. A Rubric for Extracting Idea Density from Oral Language Samples

    PubMed Central

    Chand, Vineeta; Baynes, Kathleen; Bonnici, Lisa M.; Farias, Sarah Tomaszewski

    2012-01-01

    While past research has demonstrated that low idea density (ID) scores from natural language samples correlate with late life risk for cognitive decline and Alzheimer’s disease pathology, there are no published rubrics for collecting and analyzing language samples for idea density to verify or extend these findings into new settings. This paper outlines the history of ID research and findings, discusses issues with past rubrics, and then presents an operationalized method for the systematic measurement of ID in language samples, with an extensive manual available as a supplement to this article (Analysis of Idea Density, AID). Finally, reliability statistics for this rubric in the context of dementia research on aging populations and verification that AID can replicate the significant association between ID and late life cognition are presented. PMID:23042498

  13. Correction of elevation offsets in multiple co-located lidar datasets

    USGS Publications Warehouse

    Thompson, David M.; Dalyander, P. Soupy; Long, Joseph W.; Plant, Nathaniel G.

    2017-04-07

    Topographic elevation data collected with airborne light detection and ranging (lidar) can be used to analyze short- and long-term changes to beach and dune systems. Analysis of multiple lidar datasets at Dauphin Island, Alabama, revealed systematic, island-wide elevation differences on the order of tens of centimeters that were not attributable to real-world change and, therefore, were likely to represent systematic sampling offsets. These offsets vary between the datasets but appear spatially consistent within a given survey. This report describes a method that was developed to identify and correct offsets between lidar datasets collected over the same site at different times, so that true elevation changes over time, associated with sediment accumulation or erosion, can be analyzed.
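
    One simple way to realize such a correction, sketched below under stated assumptions (a robust median offset over cells presumed stable; not necessarily the report's exact procedure): estimate the systematic inter-survey offset from stable ground and subtract it before differencing.

    ```python
    import numpy as np

    def remove_offset(dem_a, dem_b, stable_mask):
        """Return dem_b shifted so that stable areas agree with dem_a."""
        diff = dem_b[stable_mask] - dem_a[stable_mask]
        offset = np.nanmedian(diff)          # robust to real change in a minority
        return dem_b - offset, offset

    # Toy DEMs: survey B equals survey A plus a 0.3 m offset plus noise.
    a = np.random.rand(100, 100)
    b = a + 0.3 + np.random.normal(0.0, 0.02, a.shape)
    mask = np.ones_like(a, dtype=bool)       # pretend all cells are stable ground
    b_adj, est = remove_offset(a, b, mask)
    print(f"estimated systematic offset: {est:.3f} m")
    ```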

  14. Analysis of selected data from the triservice missile data base

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric-body/tail-fin data base has been gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but these data are also valuable as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analyses of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Flow-visualization photographs are examined to provide physical insight into the cause of these effects.

  15. Relative impact of key sources of systematic noise in Affymetrix and Illumina gene-expression microarray experiments.

    PubMed

    Kitchen, Robert R; Sabine, Vicky S; Simen, Arthur A; Dixon, J Michael; Bartlett, John M S; Sims, Andrew H

    2011-12-01

    Systematic processing noise, which includes batch effects, is very common in microarray experiments but is often ignored, despite its potential to confound or compromise experimental results. Compromised results are most likely when re-analysing or integrating datasets from public repositories, due to the different conditions under which each dataset is generated. To better understand the relative noise contributions of various factors in experimental design, we assessed several Illumina and Affymetrix datasets for technical variation between replicate hybridisations of Universal Human Reference (UHRR) and individual or pooled breast-tumour RNA. A varying degree of systematic noise was observed in each of the datasets; however, in all cases the relative amount of variation between standard control RNA replicates was found to be greatest at earlier points in the sample-preparation workflow. For example, 40.6% of the total variation in reported expression was attributed to replicate extractions, compared with 13.9% due to amplification/labelling and 10.8% between replicate hybridisations. Deliberate probe-wise batch-correction methods were effective in reducing the magnitude of this variation, although the level of improvement was dependent on the sources of noise included in the model. Systematic noise introduced at the chip, run, and experiment levels of a combined Illumina dataset was found to be highly dependent upon the experimental design. Both UHRR and pools of RNA, which were derived from the samples of interest, modelled technical variation well, although the pools were significantly better correlated (4% average improvement) and better emulated the effects of systematic noise, over all probes, than the UHRRs. The effect of this noise was not uniform over all probes, with low GC-content probes found to be more vulnerable to batch variation than probes with a higher GC-content. The magnitude of systematic processing noise in a microarray experiment is variable across probes and experiments; however, it is generally the case that procedures earlier in the sample-preparation workflow are liable to introduce the most noise. Careful experimental design is important to protect against noise, detailed meta-data should always be provided, and diagnostic procedures should be routinely performed prior to downstream analyses for the detection of bias in microarray studies.
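
    A minimal sketch of probe-wise batch correction by mean-centering follows: for each probe, each batch is re-centered on the probe's grand mean. Real analyses typically use richer models (e.g., empirical-Bayes approaches); the data and names here are invented.

    ```python
    import numpy as np

    def mean_center_batches(expr, batch):
        """expr: (n_probes, n_samples) log-expression; batch: label per sample."""
        corrected = expr.copy()
        grand = expr.mean(axis=1, keepdims=True)
        for b in np.unique(batch):
            cols = batch == b
            corrected[:, cols] += grand - expr[:, cols].mean(axis=1, keepdims=True)
        return corrected

    expr = np.random.normal(8.0, 1.0, (500, 12))
    expr[:, 6:] += 0.7                              # simulated batch effect
    batch = np.array(["run1"] * 6 + ["run2"] * 6)
    corrected = mean_center_batches(expr, batch)
    ```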

  16. Relative impact of key sources of systematic noise in Affymetrix and Illumina gene-expression microarray experiments

    PubMed Central

    2011-01-01

    Background Systematic processing noise, which includes batch effects, is very common in microarray experiments but is often ignored, despite its potential to confound or compromise experimental results. Compromised results are most likely when re-analysing or integrating datasets from public repositories, due to the different conditions under which each dataset is generated. To better understand the relative noise contributions of various factors in experimental design, we assessed several Illumina and Affymetrix datasets for technical variation between replicate hybridisations of Universal Human Reference (UHRR) and individual or pooled breast-tumour RNA. Results A varying degree of systematic noise was observed in each of the datasets; however, in all cases the relative amount of variation between standard control RNA replicates was found to be greatest at earlier points in the sample-preparation workflow. For example, 40.6% of the total variation in reported expression was attributed to replicate extractions, compared with 13.9% due to amplification/labelling and 10.8% between replicate hybridisations. Deliberate probe-wise batch-correction methods were effective in reducing the magnitude of this variation, although the level of improvement was dependent on the sources of noise included in the model. Systematic noise introduced at the chip, run, and experiment levels of a combined Illumina dataset was found to be highly dependent upon the experimental design. Both UHRR and pools of RNA, which were derived from the samples of interest, modelled technical variation well, although the pools were significantly better correlated (4% average improvement) and better emulated the effects of systematic noise, over all probes, than the UHRRs. The effect of this noise was not uniform over all probes, with low GC-content probes found to be more vulnerable to batch variation than probes with a higher GC-content. Conclusions The magnitude of systematic processing noise in a microarray experiment is variable across probes and experiments; however, it is generally the case that procedures earlier in the sample-preparation workflow are liable to introduce the most noise. Careful experimental design is important to protect against noise, detailed meta-data should always be provided, and diagnostic procedures should be routinely performed prior to downstream analyses for the detection of bias in microarray studies. PMID:22133085

  17. Calibration of mass spectrometric peptide mass fingerprint data without specific external or internal calibrants

    PubMed Central

    Wolski, Witold E; Lalowski, Maciej; Jungblut, Peter; Reinert, Knut

    2005-01-01

    Background Peptide Mass Fingerprinting (PMF) is a widely used mass spectrometry (MS) method for the analysis of proteins and peptides. It relies on the comparison between experimentally determined and theoretical mass spectra. The PMF process requires calibration, usually performed with external or internal calibrants of known molecular mass. Results We have introduced two novel MS calibration methods. The first method utilises the local similarity of peptide maps generated after separation of complex protein samples by two-dimensional gel electrophoresis. It computes a multiple peak-list alignment of the data set using a modified Minimum Spanning Tree (MST) algorithm. The second method exploits the fact that hundreds of MS samples are measured in parallel on one sample support. It improves the calibration coefficients by applying a two-dimensional Thin Plate Splines (TPS) smoothing algorithm. We studied the novel calibration methods using data generated by three different MALDI-TOF-MS instruments. We demonstrate that a PMF data set can be calibrated without resorting to external, or relying on widely occurring internal, calibrants. The methods developed here were implemented in R and are part of the BioConductor package mscalib. Conclusion The MST calibration algorithm is well suited to calibrating MS spectra of protein samples resulting from two-dimensional gel electrophoretic separation. The TPS-based calibration algorithm can be used to correct systematic mass measurement errors observed for large MS sample supports. Compared with other methods, our combined MS spectra calibration strategy increases the peptide/protein identification rate by an additional 5-15%. PMID:16102175
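
    The second idea, TPS smoothing of calibration coefficients across the two-dimensional sample support, can be sketched with SciPy's thin-plate radial basis function as a stand-in; spot coordinates and per-spot coefficients below are simulated, not real instrument data.

    ```python
    import numpy as np
    from scipy.interpolate import Rbf

    rng = np.random.default_rng(1)
    x, y = rng.uniform(0, 10, 200), rng.uniform(0, 10, 200)   # spot positions
    true = 1e-4 * (x - 5) + 2e-4 * (y - 5)       # smooth spatial drift
    coef = true + rng.normal(0, 5e-5, 200)       # noisy per-spot estimates

    tps = Rbf(x, y, coef, function='thin_plate', smooth=1.0)  # TPS smoother
    smoothed = tps(x, y)                         # borrow strength from neighbours
    print("residual std before/after:",
          np.std(coef - true), np.std(smoothed - true))
    ```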

  18. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
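
    The PCA summarization step can be sketched directly: given an ensemble of plausible calibration curves, decompose the deviations from the mean and draw new plausible curves from a few leading components. The synthetic effective-area model below is an invented placeholder, not Chandra calibration data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    energy = np.linspace(0.3, 8.0, 200)                       # keV grid
    base = 400 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)   # nominal area
    samples = base * (1 + 0.05 * rng.normal(size=(500, 1))    # overall scale
                      + 0.03 * rng.normal(size=(500, 1)) * np.sin(energy))

    mean = samples.mean(axis=0)
    U, s, Vt = np.linalg.svd(samples - mean, full_matrices=False)
    k = 3                                                     # leading components

    def draw_area():
        """Draw a new plausible calibration curve from the PCA summary."""
        coeffs = rng.normal(size=k) * s[:k] / np.sqrt(len(samples))
        return mean + coeffs @ Vt[:k]

    new_curve = draw_area()
    print("variance captured by 3 PCs:", (s[:k] ** 2).sum() / (s ** 2).sum())
    ```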

  19. Methodological issues and recommendations for systematic reviews of prognostic studies: an example from cardiovascular disease.

    PubMed

    Dretzke, Janine; Ensor, Joie; Bayliss, Sue; Hodgkinson, James; Lordkipanidzé, Marie; Riley, Richard D; Fitzmaurice, David; Moore, David

    2014-12-03

    Prognostic factors are associated with the risk of future health outcomes in individuals with a particular health condition. The prognostic ability of such factors is increasingly being assessed in both primary research and systematic reviews. Systematic review methodology in this area is continuing to evolve, reflected in variable approaches to key methodological aspects. The aims of this article were to (i) explore and compare the methodology of systematic reviews of prognostic factors undertaken for the same clinical question, (ii) discuss the implications for review findings, and (iii) present recommendations on what might be considered 'good practice' approaches. The sample comprised eight systematic reviews addressing the same clinical question, namely whether 'aspirin resistance' (a potential prognostic factor) has prognostic utility relative to future vascular events in patients on aspirin therapy for secondary prevention. A detailed comparison of methods around study identification, study selection, quality assessment, approaches to analysis, and reporting of findings was undertaken and the implications discussed. These were summarised into key considerations that may be transferable to future systematic reviews of prognostic factors. Across systematic reviews addressing the same clinical question, there were considerable differences in the numbers of studies identified and in the overlap between included studies, which could only partially be explained by different study eligibility criteria. Incomplete reporting and differences in terminology within primary studies hampered the study identification and selection process across reviews. Quality assessment was highly variable, and only one systematic review used a checklist designed for studies of prognostic questions. There was inconsistency between reviews in approaches towards analysis, synthesis, addressing heterogeneity and reporting of results. Different methodological approaches may ultimately affect the findings and interpretation of systematic reviews of prognostic research, with implications for clinical decision-making.

  20. Salting-out assisted liquid-liquid extraction combined with gas chromatography-mass spectrometry for the determination of pyrethroid insecticides in high salinity and biological samples.

    PubMed

    Niu, Zongliang; Yu, Chunwei; He, Xiaowen; Zhang, Jun; Wen, Yingying

    2017-09-05

    A salting-out assisted liquid-liquid extraction (SALLE) method combined with gas chromatography-mass spectrometry (GC-MS) was developed for the determination of four pyrethroid insecticides (PYRs) in high-salinity and biological samples. Several parameters influencing the extraction efficiency, including sample pH, salting-out solution volume and salting-out solution pH, were systematically investigated with the aid of an orthogonal design. The optimal SALLE conditions were 4 mL of salting-out solution at pH 4 and a sample pH of 3. Under the optimum extraction and determination conditions, good responses for the four PYRs were obtained over a range of 5-5000 ng/mL, with linear coefficients greater than 0.998. The recoveries of the four PYRs ranged from 74% to 110%, with standard deviations ranging from 1.8% to 9.8%. The limits of detection, based on a signal-to-noise ratio of 3, were between 1.5 and 60.6 ng/mL. The method was applied to the determination of PYRs in urine, seawater and wastewater samples with satisfactory results. These results demonstrate that the SALLE-GC-MS method can successfully determine PYRs in high-salinity and biological samples. SALLE avoids the need to eliminate salinity and protein from the sample matrix, as well as clean-up of the extractant. Moreover, no centrifugation or special apparatus is required, making this a promising method for rapid sample preparation. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. A combination strategy for extraction and isolation of multi-component natural products by systematic two-phase solvent extraction-(13)C nuclear magnetic resonance pattern recognition and following conical counter-current chromatography separation: Podophyllotoxins and flavonoids from Dysosma versipellis (Hance) as examples.

    PubMed

    Yang, Zhi; Wu, Youqian; Wu, Shihua

    2016-01-29

    Despite substantial developments in extraction and separation techniques, the isolation of natural products from natural resources is still a challenging task. In this work, an efficient strategy for the extraction and isolation of multi-component natural products has been developed by combining systematic two-phase liquid-liquid extraction with (13)C NMR pattern recognition and subsequent conical counter-current chromatography separation. A small-scale crude sample was first distributed into 9 systematic hexane-ethyl acetate-methanol-water (HEMWat) two-phase solvent systems to determine the optimum extraction solvents and the partition coefficients of the prominent components. The optimized solvent systems were then used in succession to enrich the hydrophilic and lipophilic components from the large-scale crude sample. Finally, the enriched samples were further purified by a new conical counter-current chromatography (CCC). Because (13)C NMR pattern recognition predicts the kinds and structures of the major components in the solvent extracts, the method simultaneously collects the partition coefficients and the structural information of components in the selected two-phase solvents. As an example, a cytotoxic extract of podophyllotoxins and flavonoids from Dysosma versipellis (Hance) was selected. After the systematic HEMWat solvent extraction and (13)C NMR pattern recognition analyses, the crude extract of D. versipellis was first degreased with the upper phase of the HEMWat system (9:1:9:1, v/v), and then distributed in the two phases of the HEMWat system (2:8:2:8, v/v) to obtain a hydrophilic lower-phase extract and a lipophilic upper-phase extract, respectively. These extracts were further separated by conical CCC with the HEMWat systems (1:9:1:9 and 4:6:4:6, v/v). In total, 17 cytotoxic compounds were isolated and identified. Overall, the results suggest that the strategy is very efficient for the systematic extraction and isolation of biologically active components from complex biomaterials. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Evaluation of methods for measuring particulate matter emissions from gas turbines.

    PubMed

    Petzold, Andreas; Marsh, Richard; Johnson, Mark; Miller, Michael; Sevcenco, Yura; Delhaye, David; Ibrahim, Amir; Williams, Paul; Bauer, Heidi; Crayford, Andrew; Bachalo, William D; Raper, David

    2011-04-15

    The SAMPLE project evaluated methods for measuring particle properties in the exhaust of aircraft engines, with a view to developing standardized operating procedures for particulate matter measurement in the aviation industry. Filter-based off-line mass methods included gravimetry and chemical analysis of carbonaceous species by combustion methods. Online mass methods were based on light absorption measurement or used size distribution measurements obtained from an electrical mobility analyzer approach. Number concentrations were determined using different condensation particle counters (CPC). Total mass from filter-based methods balanced gravimetric mass to within 8% error. Carbonaceous matter accounted for 70% of the gravimetric mass, while the remaining 30% was attributed to hydrated sulfate and noncarbonaceous organic matter fractions. Online methods were closely correlated over the entire range of emission levels studied in the tests. Elemental carbon from combustion methods and black carbon from optical methods deviated by at most 5% with respect to mass for low to medium emission levels, whereas for high emission levels a systematic deviation between online and filter-based methods was found, which is attributed to sampling effects. CPC-based instruments proved highly reproducible for number concentration measurements, with a maximum inter-instrument standard deviation of 7.5%.

  3. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
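
    The Monte Carlo approach described here is easy to sketch: represent each uncertain input as a distribution, sample all of them repeatedly, and summarize the resulting output distribution. The foodborne-illness-style model and every distribution below are invented placeholders, not the paper's inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000
    cases_reported = rng.normal(50_000, 5_000, N)   # surveillance count
    underreport = rng.uniform(10, 40, N)            # cases per reported case
    attribution = rng.beta(8, 12, N)                # fraction attributed to food

    incidence = cases_reported * underreport * attribution
    lo, mid, hi = np.percentile(incidence, [2.5, 50, 97.5])
    print(f"estimated annual incidence: {mid:,.0f} "
          f"(95% interval {lo:,.0f}-{hi:,.0f})")
    ```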

  4. Chromatographic-ICPMS methods for trace element and isotope analysis of water and biogenic calcite

    NASA Astrophysics Data System (ADS)

    Klinkhammer, G. P.; Haley, B. A.; McManus, J.; Palmer, M. R.

    2003-04-01

    ICP-MS is a powerful technique because of its sensitivity and speed of analysis. This is especially true for refractory elements that are notoriously difficult to measure using TIMS and less energetic techniques. However, as ICP-MS instruments become more sensitive to elements of interest, they also become more sensitive to interference. This becomes a pressing issue when analyzing samples with high total dissolved solids. This paper describes two trace element methods that overcome these problems by using chromatographic techniques to precondition samples prior to analysis by ICP-MS: separation of rare earth elements (REEs) from seawater using HPLC-ICPMS, and flow-through dissolution of foraminiferal calcite. Using HPLC in combination with ICP-MS, it is possible to isolate the REEs from the matrix, from other transition elements, and from each other. This method has been developed for small-volume samples (5 mL), making it possible to analyze sediment pore waters. As another example, subjecting foram shells to flow-through reagent addition followed by time-resolved analysis in the ICP-MS allows systematic cleaning and dissolution of the shells. This method provides information about the relationship between dissolution tendency and elemental composition. Flow-through is also amenable to automation, yielding the high sample throughput required for paleoceanography, and produces a highly resolved elemental matrix that can be statistically analyzed.

  5. astroABC: An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jennings, E.; Madigan, M.

    Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler, a method for iteratively adapting tolerance levels, local covariance estimation using scikit-learn's KDTree, modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel, output and restart files that are backed up every iteration, user defined metric and simulation methods, a module for specifying heterogeneous parameter priors including non-standard prior PDFs, a module for specifying a constant, linear, log or exponential tolerance level, and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
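
    The "likelihood-free" idea can be conveyed with a minimal ABC rejection sampler (the simplest relative of the SMC scheme astroABC implements): draw parameters from the prior, forward-simulate data, and keep draws whose simulated summary falls within a tolerance of the observed one. This toy is not the astroABC API; the model, summary statistic and tolerance are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    observed = rng.normal(1.0, 0.5, 200)        # "data" with true mean 1.0
    obs_summary = observed.mean()

    def forward_model(theta):                   # simulator, noise included
        return rng.normal(theta, 0.5, observed.size).mean()

    accepted = []
    while len(accepted) < 500:
        theta = rng.uniform(-3, 3)              # prior draw
        if abs(forward_model(theta) - obs_summary) < 0.05:   # tolerance epsilon
            accepted.append(theta)
    print("ABC posterior mean ~", np.mean(accepted))
    ```

    An SMC sampler such as astroABC improves on this by evolving a weighted particle population through a decreasing sequence of tolerances instead of rejecting blindly at a single tolerance.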

  6. Equilibrium sampling by reweighting nonequilibrium simulation trajectories

    NASA Astrophysics Data System (ADS)

    Yang, Cheng; Wan, Biao; Xu, Shun; Wang, Yanting; Zhou, Xin

    2016-03-01

    In equilibrium molecular simulations, it is usually difficult to efficiently visit the whole conformational space of a complex system, which is separated into metastable regions by high free energy barriers. Nonequilibrium simulations can enhance transitions among these metastable regions and can then be applied to sample equilibrium distributions in complex systems, since the associated nonequilibrium effects can be removed by employing the Jarzynski equality (JE). Here we present such a systematic method, named reweighted nonequilibrium ensemble dynamics (RNED), to efficiently sample equilibrium conformations. The RNED is a combination of the JE and our previous reweighted ensemble dynamics (RED) method. The original JE recovers equilibrium from many nonequilibrium trajectories but requires that the initial distribution of these trajectories be the equilibrium one. The RED reweights many equilibrium trajectories from an arbitrary initial distribution to obtain the equilibrium distribution, whereas the RNED combines the advantages of both methods, recovering equilibrium from many nonequilibrium simulation trajectories with an arbitrary initial conformational distribution. We illustrate the application of the RNED in a toy model and in a Lennard-Jones fluid to detect its liquid-solid phase coexistence. The results indicate that the RNED substantially extends the applicability of both the original JE and the RED to equilibrium sampling of complex systems.
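
    The reweighting at the heart of this construction can be sketched in a few lines: each nonequilibrium trajectory contributes with weight exp(-βW), where W is its accumulated work, so equilibrium averages are recovered as weighted averages (and the Jarzynski equality gives the free energy difference). The trajectory data below are simulated placeholders, not output of the RNED method itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    beta = 1.0
    work = rng.normal(2.0, 1.0, 5000)        # work W along each trajectory
    obs = rng.normal(0.0, 1.0, 5000)         # observable at each endpoint

    w = np.exp(-beta * (work - work.min()))  # shifted for numerical stability
    equilibrium_avg = np.sum(w * obs) / np.sum(w)
    delta_F = -np.log(np.mean(np.exp(-beta * work))) / beta   # Jarzynski equality
    print(equilibrium_avg, delta_F)
    ```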

  7. Equilibrium sampling by reweighting nonequilibrium simulation trajectories.

    PubMed

    Yang, Cheng; Wan, Biao; Xu, Shun; Wang, Yanting; Zhou, Xin

    2016-03-01

    In equilibrium molecular simulations, it is usually difficult to efficiently visit the whole conformational space of a complex system, which is separated into metastable regions by high free energy barriers. Nonequilibrium simulations can enhance transitions among these metastable regions and can then be applied to sample equilibrium distributions in complex systems, since the associated nonequilibrium effects can be removed by employing the Jarzynski equality (JE). Here we present such a systematic method, named reweighted nonequilibrium ensemble dynamics (RNED), to efficiently sample equilibrium conformations. The RNED is a combination of the JE and our previous reweighted ensemble dynamics (RED) method. The original JE recovers equilibrium from many nonequilibrium trajectories but requires that the initial distribution of these trajectories be the equilibrium one. The RED reweights many equilibrium trajectories from an arbitrary initial distribution to obtain the equilibrium distribution, whereas the RNED combines the advantages of both methods, recovering equilibrium from many nonequilibrium simulation trajectories with an arbitrary initial conformational distribution. We illustrate the application of the RNED in a toy model and in a Lennard-Jones fluid to detect its liquid-solid phase coexistence. The results indicate that the RNED substantially extends the applicability of both the original JE and the RED to equilibrium sampling of complex systems.

  8. Method for improving instrument response

    DOEpatents

    Hahn, David W.; Hencken, Kenneth R.; Johnsen, Howard A.; Flower, William L.

    2000-01-01

    This invention pertains generally to a method for improving the accuracy of particle analysis under conditions of discrete particle loading, and particularly to a method for improving the signal-to-noise ratio and instrument response in laser spark spectroscopic analysis of particulate emissions. Under conditions of low particle density loading (particles/m³), resulting from low overall metal concentrations and/or large particle size, uniform sampling cannot be guaranteed. The present invention discloses a technique for separating laser sparks that arise from sample particles from those that do not; that is, a process for systematically "gating" the instrument responses arising from sampled particles apart from those that do not is disclosed as a solution to this problem. The disclosed approach is based on random sampling combined with a conditional analysis of each pulse. A threshold value is determined for the ratio of the intensity of a spectral line for a given element to a baseline region. If the threshold value is exceeded, the pulse is classified as a "hit"; the data from hits are collected and an average spectrum is generated from their arithmetic average. The true metal concentration is determined from the averaged spectrum.
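
    The conditional "gating" scheme translates almost directly into code: compute a line-to-baseline intensity ratio for each spark spectrum, keep only the spectra that exceed the threshold, and average those. Wavelength windows, the threshold, and the simulated spectra below are illustrative assumptions, not the patent's calibration values.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    wavelengths = np.linspace(250, 260, 512)                  # nm
    LINE = (wavelengths > 254.8) & (wavelengths < 255.2)      # analyte line
    BASE = (wavelengths > 257.0) & (wavelengths < 259.0)      # nearby baseline

    def is_hit(spectrum, threshold=2.0):
        return spectrum[LINE].mean() / spectrum[BASE].mean() > threshold

    spectra = rng.normal(100.0, 5.0, (1000, wavelengths.size))
    spectra[::7, LINE] += 300.0            # ~1 in 7 sparks samples a particle

    hits = [s for s in spectra if is_hit(s)]
    avg_spectrum = np.mean(hits, axis=0)   # averaged spectrum of "hits" only
    print(f"{len(hits)} hits out of {len(spectra)} sparks")
    ```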

  9. Determination of airborne carbonyls: comparison of a thermal desorption/GC method with the standard DNPH/HPLC method.

    PubMed

    Ho, Steven Sai Hang; Yu, Jian Zhen

    2004-02-01

    The standard method for the determination of gaseous carbonyls is to collect the carbonyls onto 2,4-dinitrophenyl hydrazine (DNPH) coated solid sorbent, followed by solvent extraction of the sorbent and analysis of the derivatives using high-pressure liquid chromatography (HPLC). This paper describes a newly developed approach that involves collection of the carbonyls onto pentafluorophenyl hydrazine (PFPH) coated solid sorbents, followed by thermal desorption and gas chromatographic (GC) analysis of the PFPH derivatives with mass spectrometric (MS) detection. Sampling tubes loaded with 510 nmol of PFPH on Tenax sorbent effectively collect gaseous carbonyls, including formaldehyde, acetaldehyde, propanal, butanal, heptanal, octanal, acrolein, 2-furfural, benzaldehyde, p-tolualdehyde, glyoxal, and methylglyoxal, at flow rates up to at least 100 mL/min. All of the tested carbonyls have method detection limits (MDLs) of subnanomoles per sampling tube, corresponding to air concentrations of <0.3 ppbv for a sampled volume of 24 L. These limits are 2-12 times lower than those obtainable with the DNPH/HPLC method. The improvement in MDLs is especially pronounced for carbonyls larger than formaldehyde and acetaldehyde. The PFPH/GC method also offers better peak separation and more sensitive and specific detection through the use of MS detection. Comparison studies on ambient samples and kitchen exhaust samples demonstrated that the two methods do not yield systematic differences in the concentrations of carbonyls above their respective MDLs in both methods, including formaldehyde, acetaldehyde, acrolein, and butanal. The lower MDLs afforded by the PFPH/GC method also enable the determination of a few more carbonyls in both applications.

  10. Prevalence of anxiety and depressive disorders among youth with intellectual disabilities: A systematic review and meta-analysis.

    PubMed

    Maïano, Christophe; Coutu, Sylvain; Tracey, Danielle; Bouchard, Stéphane; Lepage, Geneviève; Morin, Alexandre J S; Moullec, Grégory

    2018-04-06

    The purpose of this meta-analytic study was to determine the pooled prevalence estimates of anxiety and depressive disorders among children and adolescents with intellectual disabilities (ID) and to assess the extent to which these pooled prevalence rates differed according to study characteristics. A systematic literature search was performed in nine databases, and 21 studies published between 1975 and 2015 met the inclusion criteria. The resulting pooled prevalence estimates of combined subtypes of anxiety and depressive disorders were, respectively, (a) 5.4% and 2.8% across samples; (b) 1.2% and 0.03% among children; and (c) 7.9% and 1.4% among adolescents. Pooled prevalence estimates for specific subtypes of anxiety disorders ranged from (a) 0.2% to 11.5% across samples; (b) 0.7% to 17.6% among children; and (c) 0.6% to 19.8% among adolescents. Pooled prevalence estimates of dysthymic disorder and major depressive disorder were, respectively, (a) 3.4% and 2.5% across samples; (b) 2.1% and 3.2% among children; and (c) 6.9% and 5.7% among adolescents. Finally, subgroup analyses showed significant variations in the pooled prevalence estimates of combined subtypes of anxiety disorders, obsessive-compulsive disorder, and generalized anxiety disorder, as well as of combined subtypes of depressive disorders. The findings of this meta-analysis should be interpreted with caution given several limitations related to the characteristics of the populations, the diagnostic methods, and the sampling methods. The findings provide recommendations for future studies investigating psychological disorders among youth with ID, as well as for how clinicians and policy makers can improve diagnostic practices and support for youth with ID. Copyright © 2018 Elsevier B.V. All rights reserved.
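
    Pooled prevalence estimates of this kind are commonly obtained with a random-effects model. The sketch below shows one standard approach (DerSimonian-Laird pooling on the logit scale) for illustration; it is not necessarily the exact procedure used by the authors.

        import numpy as np

        def pooled_prevalence(events, totals):
            """DerSimonian-Laird random-effects pooling of prevalence estimates.

            events/totals : per-study case counts and sample sizes.
            Works on the logit scale, then back-transforms the pooled estimate.
            """
            events = np.asarray(events, float)
            totals = np.asarray(totals, float)
            p = (events + 0.5) / (totals + 1.0)            # continuity correction
            y = np.log(p / (1 - p))                        # logit prevalence
            v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)
            w = 1 / v
            y_fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
            w_re = 1 / (v + tau2)
            y_re = np.sum(w_re * y) / np.sum(w_re)
            return 1 / (1 + np.exp(-y_re))                 # back to a proportion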

  11. Evaluation Studies of Robotic Rollators by the User Perspective: A Systematic Review.

    PubMed

    Werner, Christian; Ullrich, Phoebe; Geravand, Milad; Peer, Angelika; Hauer, Klaus

    2016-01-01

    Robotic rollators enhance the basic functions of established devices by technically advanced physical, cognitive, or sensory support to increase autonomy in persons with severe impairment. In the evaluation of such ambient assisted living solutions, both the technical and user perspectives are important to prove usability, effectiveness and safety, and to ensure adequate device application. The aim of this systematic review is to summarize the methodology of studies evaluating robotic rollators with a focus on the user perspective and to give recommendations for future evaluation studies. A systematic literature search up to December 31, 2014, was conducted based on the Cochrane Review methodology using the electronic databases PubMed and IEEE Xplore. Articles were selected according to the following inclusion criteria: evaluation studies of robotic rollators documenting human-robot interaction, no case reports, published in the English language. Twenty-eight studies were identified that met the predefined inclusion criteria. Large heterogeneity in the definitions of the target user group, study populations, study designs and assessment methods was found across the included studies. No generic methodology to evaluate robotic rollators could be identified. We found major methodological shortcomings related to insufficient sample descriptions and sample sizes, and a lack of appropriate, standardized and validated assessment methods. Long-term use in the habitual environment was also not evaluated. Apart from the heterogeneity, methodological deficits became apparent in most of the identified studies. Recommendations for future evaluation studies include: clear definition of the target user group, adequate selection of subjects, inclusion of other assistive mobility devices for comparison, evaluation of the habitual use of advanced prototypes, an adequate assessment strategy with established, standardized and validated methods, and statistical analysis of study results. Assessment strategies may additionally focus on specific functionalities of the robotic rollators, allowing an individually tailored assessment of innovative features to document their added value. © 2016 S. Karger AG, Basel.

  12. Characteristics of meta-analyses and their component studies in the Cochrane Database of Systematic Reviews: a cross-sectional, descriptive analysis

    PubMed Central

    2011-01-01

    Background Cochrane systematic reviews collate and summarise studies of the effects of healthcare interventions. The characteristics of these reviews and the meta-analyses and individual studies they contain provide insights into the nature of healthcare research and important context for the development of relevant statistical and other methods. Methods We classified every meta-analysis with at least two studies in every review in the January 2008 issue of the Cochrane Database of Systematic Reviews (CDSR) according to the medical specialty, the types of interventions being compared and the type of outcome. We provide descriptive statistics for numbers of meta-analyses, numbers of component studies and sample sizes of component studies, broken down by these categories. Results We included 2321 reviews containing 22,453 meta-analyses, which themselves consist of data from 112,600 individual studies (which may appear in more than one meta-analysis). Meta-analyses in the areas of gynaecology, pregnancy and childbirth (21%), mental health (13%) and respiratory diseases (13%) are well represented in the CDSR. Most meta-analyses address drugs, either with a control or placebo group (37%) or in a comparison with another drug (25%). The median number of meta-analyses per review is six (inter-quartile range 3 to 12). The median number of studies included in the meta-analyses with at least two studies is three (inter-quartile range 2 to 6). Sample sizes of individual studies range from 2 to 1,242,071, with a median of 91 participants. Discussion It is clear that the numbers of studies eligible for meta-analyses are typically very small for all medical areas, outcomes and interventions covered by Cochrane reviews. This highlights the particular importance of suitable methods for the meta-analysis of small data sets. There was little variation in number of studies per meta-analysis across medical areas, across outcome data types or across types of interventions being compared. PMID:22114982

  13. Analysis of biomolecular solvation sites by 3D-RISM theory.

    PubMed

    Sindhikara, Daniel J; Hirata, Fumio

    2013-06-06

    We derive, implement, and apply an equilibrium solvation site analysis for biomolecules. Our method utilizes 3D-RISM calculations to quickly obtain equilibrium solvent distributions without the need for simulation and without the limitations of finite solvent sampling. Our analysis of these distributions extracts the highest-likelihood poses of solvent as well as localized entropies, enthalpies, and solvation free energies. We demonstrate our method on a structure of HIV-1 protease, for which excellent structural and thermodynamic data are available for comparison. Our results, obtained within minutes, show systematic agreement with available experimental data. Further, our results are in good agreement with established simulation-based solvent analysis methods. This method can be used not only for visual analysis of active-site solvation but also for virtual screening methods and experimental refinement.

  14. Control chart pattern recognition using RBF neural network with new training algorithm and practical features.

    PubMed

    Addeh, Abdoljalil; Khormali, Aminollah; Golilarz, Noorbakhsh Amiri

    2018-05-04

    Control chart patterns are among the most commonly used statistical process control (SPC) tools for monitoring process changes. When a control chart produces an out-of-control signal, the process has changed. In this study, a new method based on an optimized radial basis function neural network (RBFNN) is proposed for control chart pattern (CCP) recognition. The proposed method consists of four main modules: feature extraction, feature selection, classification and learning. In the feature extraction module, shape and statistical features are used; various shape and statistical features have recently been proposed for CCP recognition. In the feature selection module, the association rules (AR) method is employed to select the best set of shape and statistical features. In the classification module, an RBFNN is used. Finally, because the learning algorithm has a high impact on network performance, a new learning algorithm based on the bees algorithm is used in the learning module. Most studies have considered only six patterns: Normal, Cyclic, Increasing Trend, Decreasing Trend, Upward Shift and Downward Shift. Because the three patterns Normal, Stratification, and Systematic are very similar to one another and distinguishing them is very difficult, most studies have not considered Stratification and Systematic. To support continuous monitoring and control of the production process and exact identification of the type of problem encountered, eight patterns are investigated in this study. The proposed method was tested on a dataset containing 1600 samples (200 samples from each pattern), and the results showed that it performs very well. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
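
    For illustration, a minimal radial basis function network can be written in a few lines. The sketch below uses randomly chosen centers and a least-squares output layer; the cited study instead tunes the network with a bees-algorithm-based learning module, which is not reproduced here.

        import numpy as np

        class SimpleRBFN:
            """Minimal RBF network: Gaussian hidden units, linear output layer."""

            def __init__(self, n_centers=20, sigma=1.0, seed=0):
                self.n_centers = n_centers
                self.sigma = sigma
                self.rng = np.random.default_rng(seed)

            def _phi(self, X):
                # Gaussian activations of each sample against each center
                d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
                return np.exp(-d2 / (2 * self.sigma ** 2))

            def fit(self, X, y_onehot):
                # Centers sampled from the training data; weights by least squares
                idx = self.rng.choice(len(X), self.n_centers, replace=False)
                self.centers = X[idx]
                self.W, *_ = np.linalg.lstsq(self._phi(X), y_onehot, rcond=None)
                return self

            def predict(self, X):
                # argmax over columns gives the predicted CCP class
                return self._phi(X) @ self.W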

  15. Diagnostic performance and safety of a three-dimensional 14-core systematic biopsy method.

    PubMed

    Takeshita, Hideki; Kawakami, Satoru; Numao, Noboru; Sakura, Mizuaki; Tatokoro, Manabu; Yamamoto, Shinya; Kijima, Toshiki; Komai, Yoshinobu; Saito, Kazutaka; Koga, Fumitaka; Fujii, Yasuhisa; Fukui, Iwao; Kihara, Kazunori

    2015-03-01

    To investigate the diagnostic performance and safety of a three-dimensional 14-core biopsy (3D14PBx) method, which is a combination of the transrectal six-core and transperineal eight-core biopsy methods. Between December 2005 and August 2010, 1103 men underwent 3D14PBx at our institutions and were analysed prospectively. Biopsy criteria included a PSA level of 2.5-20 ng/mL or abnormal digital rectal examination (DRE) findings, or both. The primary endpoint of the study was diagnostic performance and the secondary endpoint was safety. We applied recursive partitioning to the entire study cohort to delineate the unique contribution of each sampling site to overall and clinically significant cancer detection. Prostate cancer was detected in 503 of the 1103 patients (45.6%). Age, family history of prostate cancer, DRE, PSA, percentage of free PSA and prostate volume were associated with the positive biopsy results significantly and independently. Of the 503 cancers detected, 39 (7.8%) were clinically locally advanced (≥cT3a), 348 (69%) had a biopsy Gleason score (GS) of ≥7, and 463 (92%) met the definition of biopsy-based significant cancer. Recursive partitioning analysis showed that each sampling site contributed uniquely to both the overall and the biopsy-based significant cancer detection rate of the 3D14PBx method. The overall cancer-positive rate of each sampling site ranged from 14.5% in the transrectal far lateral base to 22.8% in the transrectal far lateral apex. As of August 2010, 210 patients (42%) had undergone radical prostatectomy, of whom 55 (26%) were found to have pathologically non-organ-confined disease, 174 (83%) had prostatectomy GS ≥7 and 185 (88%) met the definition of prostatectomy-based significant cancer. This is the first prospective analysis of the diagnostic performance of an extended biopsy method, which is a simplified version of the somewhat redundant super-extended three-dimensional 26-core biopsy. As expected, each sampling site uniquely contributed not only to overall cancer detection, but also to significant cancer detection. 3D14PBx is a feasible systematic biopsy method in men with PSA <20 ng/mL. © 2014 The Authors. BJU International © 2014 BJU International.

  16. Evaluation of Normalization Methods to Pave the Way Towards Large-Scale LC-MS-Based Metabolomics Profiling Experiments

    PubMed Central

    Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-01-01

    Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time while maintaining the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
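
    The cyclic-Loess idea can be sketched compactly: for each pair of samples, a loess curve is fit to the M-versus-A plot of log intensities and half of the fitted trend is removed from each sample. The code below is a generic illustration using statsmodels, not the authors' implementation; the smoothing fraction is a placeholder.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        def cyclic_loess(X, frac=0.4, n_iter=2):
            """Cyclic-loess normalization of a log-intensity matrix X
            (features x samples): pairwise M-vs-A loess fits, with half
            of each fitted trend subtracted from one sample of the pair
            and added to the other.
            """
            X = X.copy()
            n = X.shape[1]
            for _ in range(n_iter):
                for i in range(n):
                    for j in range(i + 1, n):
                        m = X[:, i] - X[:, j]            # log ratio
                        a = 0.5 * (X[:, i] + X[:, j])    # mean log intensity
                        fit = lowess(m, a, frac=frac, return_sorted=False)
                        X[:, i] -= fit / 2
                        X[:, j] += fit / 2
            return X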

  17. The importance of transmission electron microscopy analysis of spermatozoa: Diagnostic applications and basic research.

    PubMed

    Moretti, Elena; Sutera, Gaetano; Collodel, Giulia

    2016-06-01

    This review is aimed at discussing the role of ultrastructural studies of human spermatozoa and evaluating transmission electron microscopy (TEM) as a diagnostic tool that can complete andrology protocols. It is clear that morphological sperm defects may explain decreased fertilizing potential and acquire particular value in the field of male infertility. Electron microscopy is the best method to identify systematic (monomorphic) and non-systematic (polymorphic) sperm defects. The systematic defects are characterized by a particular anomaly that affects the vast majority of spermatozoa in a semen sample, whereas a heterogeneous combination of head and tail defects found in variable percentages typifies non-systematic or polymorphic sperm defects. A correct diagnosis of these specific sperm alterations is important for choosing the therapy for male infertility and for deciding whether to turn to assisted reproduction techniques. TEM also represents a valuable method to explore the in vitro effects of different compounds (for example, drugs with potential spermicidal activity) on the morphology of human spermatozoa. Finally, TEM used in combination with immunohistochemical techniques integrates structural and functional aspects that provide a wide horizon in the understanding of sperm physiology and pathology. Abbreviations: TEM, transmission electron microscopy; WHO, World Health Organization; LM, light microscopy; MSOME, motile sperm organelle morphology examination; IMSI, intracytoplasmic morphologically selected sperm injection; ICSI, intracytoplasmic sperm injection; DFS, dysplasia of the fibrous sheath; PCD, primary ciliary dyskinesia; ODF, outer dense fibers; ART, assisted reproduction technologies; SEM, scanning electron microscopy; PVP, polyvinylpyrrolidone; TBHP, tert-butylhydroperoxide.

  18. Systematic Review of the Use of Online Questionnaires among the Geriatric Population

    PubMed Central

    Remillard, Meegan L.; Mazor, Kathleen M.; Cutrona, Sarah L.; Gurwitz, Jerry H.; Tjia, Jennifer

    2014-01-01

    Background/Objectives: The use of internet-based questionnaires to collect information from older adults is not well established. This systematic literature review of studies using online questionnaires in older adult populations aims to 1. describe methodologic approaches to population targeting and sampling and 2. summarize limitations of Internet-based questionnaires in geriatric populations. Design, Setting, Participants: We identified English language articles using search terms for geriatric, age 65 and over, Internet survey, online survey, Internet questionnaire, and online questionnaire in PubMed and EBSCOhost between 1984 and July 2012. Inclusion criteria were: study population mean age ≥65 years old and use of an online questionnaire for research. Review of 336 abstracts yielded 14 articles for full review by 2 investigators; 11 articles met inclusion criteria. Measurements: Articles were extracted for study design and setting, patient characteristics, recruitment strategy, country, and study limitations. Results: All 11 articles were published after 2001. Studies had populations with a mean age of 65 to 78 years, included descriptive and analytical designs, and were conducted in the United States, Australia, and Japan. Recruiting methods varied widely, from paper fliers and personal emails to the use of consumer marketing panels. Investigator-reported study limitations included the use of small convenience samples and limited generalizability. Conclusion: Online questionnaires are a feasible method of surveying older adults in some geographic regions and for some subsets of older adults, but limited Internet access constrains recruiting methods and often limits study generalizability. PMID:24635138

  19. Participant comprehension of research for which they volunteer: a systematic review.

    PubMed

    Montalvo, Wanda; Larson, Elaine

    2014-11-01

    Evidence indicates that research participants often do not fully understand the studies for which they have volunteered. The aim of this systematic review was to examine the relationship between the process of obtaining informed consent for research and participant comprehension of and satisfaction with the research. This was a systematic review of published research on informed consent and participant comprehension of research for which they volunteered, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement as a guide. PubMed, the Cumulative Index to Nursing and Allied Health Literature, the Cochrane Central Register of Controlled Trials, and the Cochrane Database of Systematic Reviews were used to search the literature for studies meeting the following inclusion criteria: (a) published between January 1, 2006, and December 31, 2013, (b) interventional or descriptive quantitative design, (c) published in a peer-reviewed journal, (d) written in English, and (e) assessed participant comprehension or satisfaction with the research process. Studies were assessed for quality using seven indicators: sampling method, use of controls or comparison groups, response rate, description of intervention, description of outcome, statistical method, and health literacy assessment. Of 176 studies identified, 27 met the inclusion criteria: 13 (48%) were randomized interventional designs and 14 (52%) were descriptive. The studies fell into three categories, assessing (a) an enhanced consent process or form, (b) multimedia methods, and (c) education to improve participant understanding. Most (78%) used investigator-developed tools to assess participant comprehension, did not assess participant health literacy (74%), or did not assess the readability level of the consent form (89%). Researchers found participants lacked basic understanding of research elements: randomization, placebo, risks, and therapeutic misconception. Findings indicate (a) inconsistent assessment of participant reading or health literacy level, (b) measurement variation associated with the use of nonstandardized tools, and (c) continued therapeutic misconception and lack of understanding among research participants of randomization, placebo, benefit, and risk. While the Agency for Healthcare Research and Quality and the National Quality Forum have published informed consent and authorization toolkits, previously published validated tools are underutilized. Informed consent requires the assessment of health literacy, reading level, and comprehension of research participants using validated assessment tools and methods. © 2014 Sigma Theta Tau International.

  20. Convergent and sequential synthesis designs: implications for conducting and reporting systematic reviews of qualitative and quantitative evidence.

    PubMed

    Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy

    2017-03-23

    Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an interpretation of the results in the discussion. Performing systematic reviews of qualitative and quantitative evidence is challenging because of the multiple synthesis options. The findings provide guidance on how to combine qualitative and quantitative evidence. Also, recommendations are made to improve the conducting and reporting of this type of review.

  1. Systematics of stretching of fluid inclusions I: fluorite and sphalerite at 1 atmosphere confining pressure.

    USGS Publications Warehouse

    Bodnar, R.J.; Bethke, P.M.

    1984-01-01

    Measured homogenization T of fluid inclusions in fluorite and sphalerite may be higher than the true homogenization T in samples that have been previously heated in the laboratory or naturally in post-entrapment events. As T, and with it internal P, is increased, the resulting volume increase may become inelastic. If the volume increase exceeds the precision of T measurement, the inclusion is said to have stretched. More than 1300 measurements on fluid inclusions in fluorite and sphalerite indicate that stretching is systematically related to the P-V-T-X properties of the fluid, inclusion size and shape, physical properties of the host mineral, and the confining P. Experimental methods are detailed in an appendix. The mechanism of stretching is probably plastic deformation or - although not observed - microfracturing. The systematic relationship between the internal P necessary to initiate stretching and the inclusion volume provides a means of recognizing previously stretched inclusions and estimating the magnitude of post-entrapment thermal events. -G.J.N.

  2. Single-Blinded Prospective Implementation of a Preoperative Imaging Checklist for Endoscopic Sinus Surgery.

    PubMed

    Error, Marc; Ashby, Shaelene; Orlandi, Richard R; Alt, Jeremiah A

    2018-01-01

    Objective: To determine if the introduction of a systematic preoperative sinus computed tomography (CT) checklist improves identification of critical anatomic variations in sinus anatomy among patients undergoing endoscopic sinus surgery. Study Design: Single-blinded prospective cohort study. Setting: Tertiary care hospital. Subjects and Methods: Otolaryngology residents were asked to identify critical surgical sinus anatomy on preoperative CT scans before and after introduction of a systematic approach to reviewing sinus CT scans. The percentage of correctly identified structures was documented and compared with a 2-sample t test. Results: A total of 57 scans were reviewed: 28 preimplementation and 29 postimplementation. Implementation of the sinus CT checklist improved identification of critical sinus anatomy from 24% to 84% correct (P < .001). All residents, junior and senior, demonstrated significant improvement in identification of sinus anatomic variants, including those not directly included in the systematic review implemented. Conclusion: The implementation of a preoperative endoscopic sinus surgery radiographic checklist improves identification of critical anatomic sinus variations in a training population.

  3. The perpetrators of medical child abuse (Munchausen Syndrome by Proxy) - A systematic review of 796 cases.

    PubMed

    Yates, Gregory; Bass, Christopher

    2017-10-01

    Little is known about the perpetrators of medical child abuse (MCA) which is often described as "Munchausen's syndrome by proxy" or "factitious disorder imposed on another". The demographic and clinical characteristics of these abusers have yet to be described in a sufficiently large sample. We aimed to address this issue through a systematic review of case reports and series in the professional literature. A systematic search for case reports and series published since 1965 was undertaken using MEDLINE, Web of Science and EMBASE. 4100 database records were screened. A supplementary search was then conducted using GoogleScholar and reference lists of eligible studies. Our search yielded a total sample of 796 perpetrators: 309 from case reports and 487 from case series. Information extracted included demographic and clinical characteristics, in addition to methods of abuse and case outcomes. Nearly all abusers were female (97.6%) and the victim's mother (95.6%). Most were married (75.8%). Mean caretaker age at the child's presentation was 27.6 years. Perpetrators were frequently reported to be in healthcare-related professions (45.6%), to have had obstetric complications (23.5%), or to have histories of childhood maltreatment (30%). The most common psychiatric diagnoses recorded were factitious disorder imposed on self (30.9%), personality disorder (18.6%), and depression (14.2%). From the largest analysis of MCA perpetrators to date, we provide several clinical recommendations. In particular, we urge clinicians to consider mothers with a personal history of childhood maltreatment, obstetric complications, and/or factitious disorder at heightened risk for MCA. Longitudinal studies are required to establish the true prognostic value of these factors as our method may have been vulnerable to publication bias. Copyright © 2017. Published by Elsevier Ltd.

  4. Understanding Ethical Issues of Research Participation from the Perspective of Participating Children and Adolescents: A Systematic Review

    PubMed Central

    Broome, Marion E.

    2017-01-01

    Background: The past twenty years have seen distinct shifts in the way the participation of children and adolescents in research is viewed, as emphasized by the growing pediatric research enterprise. Additional information on children's and adolescents' experiences during research participation is needed to better inform researchers on the ethical conduct of research with this vulnerable population. Aims: The objective of this analysis was to examine ethical issues in research with children and adolescents from their perspective as participants, including: assent, parental consent, risk perception, impact of research participation, and incentives. Methods: This systematic review was conducted per the Long et al. framework by means of an iterative searching process. Using the key words 'research ethics' and 'child or pediatric or adolescent', the PubMed, CINAHL, and EBSCOhost databases were searched to identify articles. Limitations placed on the original searches were: English language, year of publication 2003-2014, humans, abstract available, and age birth to 18 years. Findings: Twenty-three empirical studies were identified and formed the sample. Included studies represented a diverse range of areas of research, methods, settings, sample demographics, authors, and journals. Discussion: Even young children demonstrated the ability to understand essential elements of research, although there is variability in children's level of understanding. Trust was a significant contributing factor to children's and adolescents' participation in research, and also shaped their assessments of risk. Research participation was mainly beneficial for children and adolescents. Incentives were mainly viewed positively, although concerns about possible undue influence were expressed. Linking Evidence to Action: This systematic review highlights the importance of including the perspectives of children and adolescents and provides researchers and nurse clinicians with best practices for involving children in research. PMID:28207982

  5. The effectiveness of hydrotherapy in the treatment of social and behavioral aspects of children with autism spectrum disorders: a systematic review.

    PubMed

    Mortimer, Rachel; Privopoulos, Melinda; Kumar, Saravana

    2014-01-01

    Autism spectrum disorders (ASDs) are increasing in prevalence. Children with ASDs present with impairments in social interactions; communication; restricted, repetitive, and stereotyped patterns of behavior, interests, or activities; as well as motor delays. Hydrotherapy is used as a treatment for children with disabilities and motor delays. There have been no systematic reviews conducted on the effectiveness of hydrotherapy in children with ASDs. We aimed to examine the effectiveness of hydrotherapy on social interactions and behaviors in the treatment of children with ASDs. A systematic search of Cochrane, CINAHL, PsycINFO, Embase, MEDLINE®, and Academic Search Premier was conducted. Studies of participants, aged 3-18 years, with ASDs at a high-functioning level were included if they utilized outcome measures assessing social interactions and behaviors through questionnaire or observation. A critical appraisal, using the McMaster Critical Review Form for Quantitative Studies, was performed to assess methodological quality. Four studies of varying research design and quality met the inclusion criteria. The participants in these studies were aged 3-12 years. The duration of the intervention ranged from 10-14 weeks, and each study used varied measures of outcome. Overall, all the studies showed some improvements in social interactions or behaviors following a Halliwick-based hydrotherapy intervention. Few studies have investigated the effect of hydrotherapy on the social interactions and behaviors of children with ASDs. While there is an increasing body of evidence for hydrotherapy for children with ASDs, this is constrained by small sample size, lack of comparator, crude sampling methods, and the lack of standardized outcome measures. Hydrotherapy shows potential as a treatment method for social interactions and behaviors in children with ASDs.

  6. Circulating cancer stem cell markers in breast carcinomas: a systematic review protocol.

    PubMed

    Mansoori, Maryam; Madjd, Zahra; Janani, Leila; Rasti, Arezoo

    2017-12-19

    Breast cancer is one of the most common types of cancer in women worldwide. Recent studies have provided strong support for the cancer stem cell (CSC) hypothesis, which suggests that many cancers, including breast cancer, are driven by a subpopulation of cells that display stem cell-like properties. The hypothesis that a subpopulation of circulating tumor cells (CTCs) possesses many CSC-like hallmarks is reinforced by the expression of related molecular markers between these two cell populations. The aim of this study is to systematically review primary studies and identify circulating CSC markers in breast cancer patients. Relevant observational studies evaluating the expression of circulating breast cancer stem cell markers through October 31, 2016, will be searched in PubMed, SCOPUS, Embase, ISI Web of Science, and Google Scholar, with no restriction on language. Full copies of articles identified by the search and considered to meet the inclusion criteria will be obtained for data extraction and synthesis. Two quality assessment tools will be used to evaluate the observational studies: the tool suggested by Hoy et al. and the Newcastle-Ottawa Scale (NOS). Publication bias will be assessed by funnel plots or Egger's test (i.e., plots of study results against precision), and data synthesis will be performed using Stata software (Stata Corp V.12, TX, USA). This systematic review will be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Detecting cancer stem cells in blood will help clinicians to monitor cancer patients by obtaining as many samples as needed, non-invasively and at any stage; repeated sampling is not possible when working with tissue samples. By identifying cancer stem cells in blood early, it will be possible to detect metastasis at an early stage. CRD42016043810.

  7. astroABC : An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Jennings, E.; Madigan, M.

    2017-04-01

    Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated datasets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the likelihood is intractable or unknown. The ABC method is called "likelihood free" as it avoids explicit evaluation of the likelihood by using a forward-model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high-dimensional, possibly correlated parameter spaces. With this in mind, astroABC allows for massive parallelization using MPI, a framework that handles the spawning of processes across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward-model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features of this new sampler include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimation using scikit-learn's KDTree; modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel and a weighted covariance metric; restart files output frequently so an interrupted sampling run can be resumed at any iteration; output and restart files backed up at every iteration; user-defined distance metrics and simulation methods; a module for specifying heterogeneous parameter priors, including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance level; and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC.
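
    The core likelihood-free idea is easy to state in code. The sketch below is a bare-bones ABC rejection sampler for illustration only; astroABC itself implements the more sophisticated SMC variant with adaptive tolerance levels and MPI parallelization, and its actual API is not reproduced here.

        import numpy as np

        def abc_rejection(data_summary, simulate, prior_sample, distance, eps, n_keep):
            """Minimal likelihood-free ABC rejection sampler.

            simulate(theta)  -> summary statistics of one forward-model realization
            prior_sample()   -> one draw of parameters from the prior
            distance(s1, s2) -> scalar distance between two summaries
            eps              -> acceptance tolerance (SMC would shrink this iteratively)
            """
            kept = []
            while len(kept) < n_keep:
                theta = prior_sample()
                # accept theta only if its simulated data lie close to the observation
                if distance(simulate(theta), data_summary) < eps:
                    kept.append(theta)
            return np.array(kept)

    The accepted draws approximate the posterior under the chosen summaries and tolerance; shrinking eps trades acceptance rate for accuracy.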

  8. Systematic Product Development of Control and Diagnosis Functionalities

    NASA Astrophysics Data System (ADS)

    Stetter, R.; Simundsson, A.

    2017-01-01

    In the scientific field of systematic product development a wide range of helpful methods, guidelines and tools were generated and published in recent years. Until now little special attention was given to design guidelines aiming at supporting product development engineers to design products that allow and support control or diagnosis functions. The general trend to ubiquitous computing and the first development steps towards cognitive systems as well as a general trend toward higher product safety, reliability and reduced total cost of ownership (TCO) in many engineering fields lead to a higher importance of control and diagnosis. In this paper a first attempt is made to formulate general valid guidelines how products can be developed in order to allow and to achieve effective and efficient control and diagnosis. The guidelines are elucidated on the example of an automated guided vehicle. One main concern of this paper is the integration of control and diagnosis functionalities into the development of complete systems which include mechanical, electrical and electronic subsystems. For the development of such systems the strategies, methods and tools of systematic product development have attracted significant attention during the last decades. Today, the functionality and safety of most products is to a large degree dependent on control and diagnosis functionalities. Still, there is comparatively little research concentrating on the integration of the development of these functionalities into the overall product development processes. The paper starts with a background describing Systematic Product Development. The second section deals with the product development of the sample product. The third part clarifies the notions monitoring, control and diagnosis. The following parts summarize some insights and formulate first hypotheses concerning control and diagnosis in Systematic Product Development.

  9. Meta-Storms: efficient search for similar microbial communities based on a novel indexing scheme and similarity score for metagenomic data.

    PubMed

    Su, Xiaoquan; Xu, Jian; Ning, Kang

    2012-10-01

    Effectively comparing different microbial communities (also referred to as 'metagenomic samples' here) on a large scale has long intrigued scientists: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest, against which any metagenomic sample could then be searched to find the most similar sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we propose a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database with a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on them. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and that it achieves accuracies similar to those of the currently popular significance testing-based methods. Meta-Storms would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples. Supplementary data are available at Bioinformatics online.
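
    The flavor of scoring similarity from taxonomic annotations can be sketched as follows. This flat overlap score is a toy stand-in for illustration only: Meta-Storms scores samples quantitatively against a hierarchical phylogeny, which is not reproduced here.

        def taxon_overlap_score(sample_a, sample_b):
            """Toy similarity between two taxonomically annotated samples,
            each given as a {taxon: relative abundance} dict. The score is
            the total abundance shared by both samples (1.0 = identical
            composition, 0.0 = no taxa in common).
            """
            shared = set(sample_a) & set(sample_b)
            return sum(min(sample_a[t], sample_b[t]) for t in shared)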

  10. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    PubMed

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is usually limited by sampling size. Sequence conservation-based methods are further confounded by structural constraints and multifunctionality of proteins. Here we present a method that can systematically identify and annotate functional residues of a given protein. We used a high-throughput functional profiling platform to identify essential residues. Coupling it with homologous-structure comparison, we were able to annotate multiple functions of proteins. We demonstrated the method with the PB1 protein of influenza A virus and identified novel functional residues in addition to its canonical function as an RNA-dependent RNA polymerase. Not limited to virology, this method is generally applicable to other proteins that can be functionally selected and about which homologous-structure information is available. Copyright © 2016 Du et al.
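
    A common way to turn such high-throughput sequencing counts into per-mutation fitness values is a wild-type-normalized enrichment ratio. The sketch below shows one such convention for illustration; it is not the authors' exact pipeline, and the pseudocount is an assumption.

        import numpy as np

        def relative_fitness(counts_before, counts_after, wt_before, wt_after):
            """Per-mutation relative fitness from deep sequencing counts:
            the mutant's enrichment across selection, normalized by the
            wild type's enrichment, on a log2 scale.
            """
            counts_before = np.asarray(counts_before, float)
            counts_after = np.asarray(counts_after, float)
            f_mut = (counts_after + 0.5) / (counts_before + 0.5)   # pseudocount 0.5
            f_wt = (wt_after + 0.5) / (wt_before + 0.5)
            return np.log2(f_mut / f_wt)   # ~0 neutral, strongly negative = defective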

  11. Mixed methods in psychotherapy research: A review of method(ology) integration in psychotherapy science.

    PubMed

    Bartholomew, Theodore T; Lockard, Allison J

    2018-06-13

    Mixed methods can foster depth and breadth in psychological research. However, its use remains in development in psychotherapy research. Our purpose was to review the use of mixed methods in psychotherapy research. Thirty-one studies were identified via the PRISMA systematic review method. Using Creswell & Plano Clark's typologies to identify design characteristics, we assessed each study for rigor and how each used mixed methods. Key features of mixed methods designs and these common patterns were identified: (a) integration of clients' perceptions via mixing; (b) understanding group psychotherapy; (c) integrating methods with cases and small samples; (d) analyzing clinical data as qualitative data; and (e) exploring cultural identities in psychotherapy through mixed methods. The review is discussed with respect to the value of integrating multiple data in single studies to enhance psychotherapy research. © 2018 Wiley Periodicals, Inc.

  12. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

    The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed that auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
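
    One widely used correction for plate-level systematic errors of this kind is the median-polish B-score. The sketch below illustrates that standard technique; the paper's automatic QC pipeline is more elaborate and is not reproduced here.

        import numpy as np

        def b_score(plate, n_iter=10):
            """Median-polish B-score for one HTS plate (2-D array of raw signals).

            Iteratively removes row and column median effects (systematic
            edge/gradient errors), then scales the residuals by a robust
            MAD-based estimate of spread.
            """
            r = plate.astype(float).copy()
            for _ in range(n_iter):
                r -= np.median(r, axis=1, keepdims=True)   # remove row effects
                r -= np.median(r, axis=0, keepdims=True)   # remove column effects
            mad = np.median(np.abs(r - np.median(r)))
            return r / (1.4826 * mad)                      # robust z-scale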

  13. Monte-Carlo-based phase retardation estimator for polarization sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Duan, Lian; Makita, Shuichi; Yamanari, Masahiro; Lim, Yiheng; Yasuno, Yoshiaki

    2011-08-01

    A Monte-Carlo-based phase retardation estimator is developed to correct the systematic error in phase retardation measurement by polarization sensitive optical coherence tomography (PS-OCT). Recent research has revealed that the phase retardation measured by PS-OCT has a distribution that is neither symmetric nor centered at the true value. Hence, a standard mean estimator yields erroneous estimates of phase retardation and degrades the performance of PS-OCT for quantitative assessment. In this paper, the noise properties of phase retardation are investigated in detail by Monte-Carlo simulation and experiments. A distribution transform function is designed, using the results of the Monte-Carlo simulation, to eliminate the systematic error. This distribution transformation is followed by a mean estimator, and the combined process provides a significantly better estimate of phase retardation than a standard mean estimator. The method is validated both by numerical simulations and experiments, and its application to in vitro and in vivo biological samples is also demonstrated.

  14. HICOSMO: cosmology with a complete sample of galaxy clusters - II. Cosmological results

    NASA Astrophysics Data System (ADS)

    Schellenberger, G.; Reiprich, T. H.

    2017-10-01

    The X-ray bright, hot gas in the potential well of a galaxy cluster enables systematic X-ray studies of samples of galaxy clusters to constrain cosmological parameters. HIFLUGCS consists of the 64 X-ray brightest galaxy clusters in the Universe, building up a local sample. Here, we utilize this sample to determine, for the first time, individual hydrostatic mass estimates for all the clusters of the sample and, by making use of the completeness of the sample, we quantify constraints on the two interesting cosmological parameters, Ωm and σ8. We apply our total hydrostatic and gas mass estimates from the X-ray analysis to a Bayesian cosmological likelihood analysis and leave several parameters free to be constrained. We find Ωm = 0.30 ± 0.01 and σ8 = 0.79 ± 0.03 (statistical uncertainties, 68 per cent credibility level) using our default analysis strategy combining both a mass function analysis and the gas mass fraction results. The main sources of biases that we correct here are (1) the influence of galaxy groups (incompleteness in parent samples and differing behaviour of the Lx-M relation), (2) the hydrostatic mass bias, (3) the extrapolation of the total mass (comparing various methods), (4) the theoretical halo mass function and (5) other physical effects (non-negligible neutrino mass). We find that galaxy groups introduce a strong bias, since their number density seems to be overpredicted by the halo mass function. On the other hand, incorporating baryonic effects does not result in a significant change in the constraints. The total (uncorrected) systematic uncertainties (∼20 per cent) clearly dominate the statistical uncertainties on cosmological parameters for our sample.

  15. An evaluation of a reagentless method for the determination of total mercury in aquatic life

    USGS Publications Warehouse

    Haynes, Sekeenia; Gragg, Richard D.; Johnson, Elijah; Robinson, Larry; Orazio, Carl E.

    2006-01-01

    Multiple treatment (i.e., drying, chemical digestion, and oxidation) steps are often required during preparation of biological matrices for quantitative analysis of mercury; these multiple steps could potentially lead to systematic errors and poor recovery of the analyte. In this study, the Direct Mercury Analyzer (Milestone Inc., Monroe, CT) was utilized to measure total mercury in fish tissue by integrating steps of drying, sample combustion and gold sequestration with successive identification using atomic absorption spectrometry. We also evaluated the differences between the mercury concentrations found in samples that were homogenized and samples with no preparation. These results were confirmed with cold vapor atomic absorbance and fluorescence spectrometric methods of analysis. Finally, total mercury in wild captured largemouth bass (n = 20) were assessed using the Direct Mercury Analyzer to examine internal variability between mercury concentrations in muscle, liver and brain organs. Direct analysis of total mercury measured in muscle tissue was strongly correlated with muscle tissue that was homogenized before analysis (r = 0.81, p < 0.0001). Additionally, results using this integrated method compared favorably (p < 0.05) with conventional cold vapor spectrometry with atomic absorbance and fluorescence detection methods. Mercury concentrations in brain were significantly lower than concentrations in muscle (p < 0.001) and liver (p < 0.05) tissues. This integrated method can measure a wide range of mercury concentrations (0-500 µg) using small sample sizes. Total mercury measurements in this study are comparative to the methods (cold vapor) commonly used for total mercury analysis and are devoid of laborious sample preparation and expensive hazardous waste. © Springer 2006.

  16. Simulating and assessing boson sampling experiments with phase-space representations

    NASA Astrophysics Data System (ADS)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  17. Removing the echoes from terahertz pulse reflection system and sample

    NASA Astrophysics Data System (ADS)

    Liu, Haishun; Zhang, Zhenwei; Zhang, Cunlin

    2018-01-01

    Due to echoes both from the terahertz (THz) pulse reflection system and from the sample, the primary THz pulse is distorted. The system echoes are of two types: one, preceding the main peak, is probably caused by the ultrafast laser pulse; the other, following the primary pulse, is caused by the Fabry-Perot (F-P) etalon effect of the detector. We attempt to remove the corresponding echoes by using two kinds of deconvolution. A Si wafer of 400 μm was selected as the test sample. First, double Gaussian filter (DGF) deconvolution was used to remove the systematic echoes, and then another deconvolution technique was employed to eliminate the two obvious echoes of the sample. The results indicated that although the combination of the two deconvolution techniques could not entirely remove the echoes of the sample and the system, the echoes were largely reduced.
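
    Double Gaussian filter deconvolution is typically carried out in the frequency domain: the sample spectrum is divided by a reference spectrum and a band-pass window built from two Gaussians suppresses noise outside the usable band. The sketch below illustrates this generic scheme only; the cutoff frequencies and the small regularization term are placeholder assumptions, not values from the paper.

        import numpy as np

        def dgf_deconvolve(signal, reference, dt, f_lo=0.2e12, f_hi=3.0e12):
            """Frequency-domain deconvolution with a double-Gaussian band-pass.

            signal, reference : time-domain waveforms sampled at interval dt (s)
            f_lo, f_hi        : low/high Gaussian cutoff frequencies (Hz)
            """
            n = len(signal)
            f = np.fft.rfftfreq(n, dt)
            R = np.fft.rfft(reference)
            H = np.fft.rfft(signal) / (R + 1e-12)   # regularized spectral division
            # wide Gaussian minus narrow Gaussian = band-pass window (blocks DC)
            window = np.exp(-(f / f_hi) ** 2) - np.exp(-(f / f_lo) ** 2)
            return np.fft.irfft(H * window, n)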

  18. A field study of selected U.S. Geological Survey analytical methods for measuring pesticides in filtered stream water, June - September 2012

    USGS Publications Warehouse

    Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.

    2017-09-06

    U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change.A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012. Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers.The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method.Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis. 
In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods. This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
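    The paired-sample comparison the report recommends lends itself to a simple screening computation. The sketch below is only an illustration of how paired old/new results for one pesticide might be flagged; the function name, the pairing scheme, and the choice of a Wilcoxon signed-rank test are assumptions, not the USGS report's procedure.

        # Hypothetical screen for a systematic old-vs-new method difference.
        # `old` and `new` are recovery-adjusted concentrations (e.g., ng/L)
        # measured on the same water samples; names are illustrative only.
        import numpy as np
        from scipy.stats import wilcoxon

        def compare_methods(old, new, min_pairs=10):
            old, new = np.asarray(old, float), np.asarray(new, float)
            keep = ~(np.isnan(old) | np.isnan(new))    # paired detections only
            old, new = old[keep], new[keep]
            if old.size < min_pairs:
                return None                            # too few pairs to judge
            rpd = 200.0 * (new - old) / (new + old)    # relative percent difference
            stat, p = wilcoxon(new - old)              # paired nonparametric test
            return {"n": int(old.size),
                    "median_rpd_pct": float(np.median(rpd)),
                    "p": float(p)}

    A median relative difference above the report's 30-percent threshold, together with a small p-value, would mark a pesticide whose record should not be pooled across the 2012 method change without adjustment.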

  19. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  20. Elimination of 'ghost'-effect-related systematic error in metrology of X-ray optics with a long trace profiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, Valeriy V.; Irick, Steve C.; MacDowell, Alastair A.

    2005-04-28

    A data acquisition technique and relevant program for suppression of one of the systematic effects, namely the 'ghost' effect, of a second generation long trace profiler (LTP) is described. The 'ghost' effect arises when there is an unavoidable cross-contamination of the LTP sample and reference signals into one another, leading to a systematic perturbation in the recorded interference patterns and, therefore, a systematic variation of the measured slope trace. Perturbations of about 1–2 µrad have been observed with a cylindrically shaped X-ray mirror. Even stronger 'ghost' effects show up in an LTP measurement with a mirror having a toroidal surface figure. The developed technique employs separate measurement of the 'ghost'-effect-related interference patterns in the sample and the reference arms and then subtraction of the 'ghost' patterns from the sample and the reference interference patterns. The procedure preserves the advantage of simultaneously measuring the sample and reference signals. The effectiveness of the technique is illustrated with LTP metrology of a variety of X-ray mirrors.
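    The correction itself is arithmetically simple. The sketch below illustrates the subtraction step under the assumption that each interference pattern is recorded as a one-dimensional detector intensity profile; the names and the centroid-based position estimate are illustrative stand-ins, not the authors' code.

        import numpy as np

        def remove_ghost(raw_pattern, ghost_pattern):
            """Subtract the separately measured ghost pattern from a raw one."""
            return raw_pattern - ghost_pattern

        def pattern_position(pattern):
            """Centroid as a simple stand-in for the LTP position estimate."""
            x = np.arange(pattern.size)
            return np.sum(x * pattern) / np.sum(pattern)

        def corrected_slope_signal(sample_raw, sample_ghost, ref_raw, ref_ghost):
            sample = remove_ghost(sample_raw, sample_ghost)
            reference = remove_ghost(ref_raw, ref_ghost)
            # Differencing against the reference arm keeps the benefit of
            # simultaneous measurement: common-mode drift cancels out.
            return pattern_position(sample) - pattern_position(reference)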

  1. Systematical Optimization of Reverse-phase Chromatography for Shotgun Proteomics

    PubMed Central

    Xu, Ping; Duong, Duc M.; Peng, Junmin

    2009-01-01

    Summary We report the optimization of a common LC/MS/MS platform to maximize the number of proteins identified from a complex biological sample. The platform uses digested yeast lysate on a 75 μm internal diameter × 12 cm reverse-phase column that is combined with an LTQ-Orbitrap mass spectrometer. We first generated a yeast peptide mix that was quantified by multiple methods, including the strategy of stable isotope labeling with amino acids in cell culture (SILAC). The peptide mix was analyzed on a highly reproducible, automated nanoLC/MS/MS system with systematic adjustment of loading amount, flow rate, and elution gradient range and length. Interestingly, the column was found to be almost saturated by loading ~1 μg of the sample. Whereas the optimal flow rate (~0.2 μl/min) and elution buffer range (13–32% acetonitrile) appeared to be independent of the loading amount, the best gradient length varied according to the amount of sample: 160 min for 1 μg of the peptide mix, but 40 min for 10 ng of the same sample. The effect of these parameters on eluting peptide peak width was also evaluated. After full optimization, 1,012 proteins (clustered in 806 groups) with an estimated protein false discovery rate of ~3% were identified in 1 μg of yeast lysate in a single 160-min LC/MS/MS run. PMID:19566079

  2. Sitting-Meditation Interventions Among Youth: A Review of Treatment Efficacy

    PubMed Central

    Black, David S.; Milam, Joel; Sussman, Steve

    2011-01-01

    OBJECTIVE Although the efficacy of meditation interventions has been examined among adult samples, meditation treatment effects among youth are relatively unknown. We systematically reviewed empirical studies for the health-related effects of sitting-meditative practices implemented among youth aged 6 to 18 years in school, clinic, and community settings. METHODS A systematic review of electronic databases (PubMed, Ovid, Web of Science, Cochrane Reviews Database, Google Scholar) was conducted from 1982 to 2008, obtaining a sample of 16 empirical studies related to sitting-meditation interventions among youth. RESULTS Meditation modalities included mindfulness meditation, transcendental meditation, mindfulness-based stress reduction, and mindfulness-based cognitive therapy. Study samples primarily consisted of youth with preexisting conditions such as high-normal blood pressure, attention-deficit/hyperactivity disorder, and learning disabilities. Studies that examined physiologic outcomes were composed almost entirely of African American/black participants. Median effect sizes were slightly smaller than those obtained from adult samples and ranged from 0.16 to 0.29 for physiologic outcomes and 0.27 to 0.70 for psychosocial/behavioral outcomes. CONCLUSIONS Sitting meditation seems to be an effective intervention in the treatment of physiologic, psychosocial, and behavioral conditions among youth. Because of current limitations, carefully constructed research is needed to advance our understanding of sitting meditation and its future use as an effective treatment modality among younger populations. PMID:19706568

  3. Response rate differences between web and alternative data collection methods for public health research: a systematic review of the literature.

    PubMed

    Blumenberg, Cauane; Barros, Aluísio J D

    2018-07-01

    To systematically review the literature and compare response rates (RRs) of web surveys with those of alternative data collection methods in the context of epidemiologic and public health studies. We reviewed the literature using the PubMed, LILACS, SciELO, WebSM, and Google Scholar databases. We selected epidemiologic and public health studies that considered the general population and used two parallel data collection methods, one of them web-based. RR differences were analyzed using a two-sample test of proportions and pooled using random effects. We investigated agreement using Bland-Altman analysis and correlation using Pearson's coefficient. We selected 19 studies (nine randomized trials). The RR of web-based data collection was 12.9 percentage points (p.p.) lower (95% CI = -19.0, -6.8) than that of the alternative methods, and 15.7 p.p. lower (95% CI = -24.2, -7.3) considering only randomized trials. Monetary incentives did not reduce the RR differences. A strong positive correlation (r = 0.83) between the RRs was observed. Web-based data collection presents lower RRs compared with alternative methods. However, this should not be interpreted as meta-analytical evidence because of the high heterogeneity of the studies.
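    As a worked illustration of the pooling described above (per-study proportion differences combined under a random-effects model), here is a minimal DerSimonian-Laird sketch. The per-study counts in the usage line are invented placeholders, not data from the review; multiplying the pooled difference by 100 gives percentage points.

        import numpy as np

        def prop_diff_and_var(r_web, n_web, r_alt, n_alt):
            p1, p2 = r_web / n_web, r_alt / n_alt
            diff = p1 - p2
            var = p1 * (1 - p1) / n_web + p2 * (1 - p2) / n_alt
            return diff, var

        def pool_random_effects(diffs, variances):
            d, v = np.asarray(diffs), np.asarray(variances)
            w = 1.0 / v                               # fixed-effect weights
            d_fe = np.sum(w * d) / np.sum(w)
            q = np.sum(w * (d - d_fe) ** 2)           # Cochran's Q
            tau2 = max(0.0, (q - (len(d) - 1)) /
                       (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
            w_re = 1.0 / (v + tau2)                   # random-effects weights
            d_re = np.sum(w_re * d) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return d_re, (d_re - 1.96 * se, d_re + 1.96 * se)

        studies = [(120, 400, 180, 400), (60, 150, 80, 160), (200, 900, 260, 880)]
        diffs, variances = zip(*(prop_diff_and_var(*s) for s in studies))
        print(pool_random_effects(diffs, variances))  # pooled diff and 95% CI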

  4. Three-dimensional Imaging Methods for Quantitative Analysis of Facial Soft Tissues and Skeletal Morphology in Patients with Orofacial Clefts: A Systematic Review

    PubMed Central

    Kuijpers, Mette A. R.; Chiu, Yu-Ting; Nada, Rania M.; Carels, Carine E. L.; Fudalej, Piotr S.

    2014-01-01

    Background Current guidelines for evaluating cleft palate treatments are mostly based on two-dimensional (2D) evaluation, but three-dimensional (3D) imaging methods to assess treatment outcome are steadily rising. Objective To identify 3D imaging methods for quantitative assessment of soft tissue and skeletal morphology in patients with cleft lip and palate. Data sources Literature was searched using PubMed (1948–2012), EMBASE (1980–2012), Scopus (2004–2012), Web of Science (1945–2012), and the Cochrane Library. The last search was performed September 30, 2012. Reference lists were hand searched for potentially eligible studies. There was no language restriction. Study selection We included publications using 3D imaging techniques to assess facial soft tissue or skeletal morphology in patients older than 5 years with a cleft lip with or without cleft palate. We reviewed studies involving the facial region when at least 10 subjects in the sample had at least one cleft type. Only primary publications were included. Data extraction Independent extraction of data and quality assessments were performed by two observers. Results Five hundred full text publications were retrieved, 144 met the inclusion criteria, with 63 high quality studies. There were differences in study designs, topics studied, patient characteristics, and success measurements; therefore, only a systematic review could be conducted. The main 3D techniques used in cleft lip and palate patients are CT, CBCT, MRI, stereophotogrammetry, and laser surface scanning. These techniques are mainly used for soft tissue analysis, evaluation of bone grafting, and changes in the craniofacial skeleton. Digital dental casts are used to evaluate treatment and changes over time. Conclusion Available evidence implies that 3D imaging methods can be used for documentation of CLP patients. No data are available yet showing that 3D methods are more informative than conventional 2D methods. Further research is warranted to elucidate this. Systematic review registration International Prospective Register of Systematic Reviews, PROSPERO CRD42012002041 PMID:24710215

  5. Systematic construction and control of stereo nerve vision network in intelligent manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Hua; Wang, Helong; Guo, Chunjie; Ding, Quanxin; Zhou, Liwei

    2017-10-01

    A systematic method for constructing stereo vision with a neural network is proposed, together with the operation and control mechanism used in actual operation. The method makes effective use of the learning and memory functions of a neural network that has been trained with samples: the network can learn the nonlinear relationships within the stereoscopic vision system and its interior and exterior orientation elements. Aspects worthy of attention include the limiting constraints, the handling of the critical group, the operating speed, and the operability in technical respects. The results support our theoretical forecast.

  6. Estimating the encounter rate variance in distance sampling

    USGS Publications Warehouse

    Fewster, R.M.; Buckland, S.T.; Burnham, K.P.; Borchers, D.L.; Jupp, P.E.; Laake, J.L.; Thomas, L.

    2009-01-01

    The dominant source of variance in line transect sampling is usually the encounter rate variance. Systematic survey designs are often used to reduce the true variability among different realizations of the design, but estimating the variance is difficult and estimators typically approximate the variance by treating the design as a simple random sample of lines. We explore the properties of different encounter rate variance estimators under random and systematic designs. We show that a design-based variance estimator improves upon the model-based estimator of Buckland et al. (2001, Introduction to Distance Sampling. Oxford: Oxford University Press, p. 79) when transects are positioned at random. However, if populations exhibit strong spatial trends, both estimators can have substantial positive bias under systematic designs. We show that poststratification is effective in reducing this bias. © 2008, The International Biometric Society.
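    The poststratification idea can be shown with a toy estimator: group neighbouring transects into small strata so that only local between-line differences contribute to the variance. This is a simplified sketch in the spirit of the poststratification discussed above, not the paper's exact estimators; `counts` and `lengths` are per-transect arrays.

        import numpy as np

        def var_er_srs(counts, lengths):
            """Encounter rate variance treating lines as a simple random sample."""
            c, l = np.asarray(counts, float), np.asarray(lengths, float)
            k, L = len(c), l.sum()
            er = c.sum() / L
            return k / (L**2 * (k - 1)) * np.sum(l**2 * (c / l - er) ** 2)

        def var_er_poststratified(counts, lengths):
            """Apply the SRS estimator within successive pairs of transects,
            so large-scale spatial trends no longer inflate the variance
            (an odd final transect is ignored in this toy)."""
            c, l = np.asarray(counts, float), np.asarray(lengths, float)
            L = l.sum()
            var = 0.0
            for i in range(0, len(c) - 1, 2):
                ch, lh = c[i:i + 2], l[i:i + 2]
                var += (lh.sum() / L) ** 2 * var_er_srs(ch, lh)
            return var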

  7. Cancer-Related Fatigue and Its Associations with Depression and Anxiety: A Systematic Review

    PubMed Central

    Brown, Linda F.; Kroenke, Kurt

    2010-01-01

    Background Fatigue is an important symptom in cancer and has been shown to be associated with psychological distress. Objectives This review assesses evidence regarding associations of CRF with depression and anxiety. Methods Database searches yielded 59 studies reporting correlation coefficients or odds ratios. Results Combined sample size was 12,103. Average correlation of fatigue with depression, weighted by sample size, was 0.56 and for anxiety, 0.46. Thirty-one instruments were used to assess fatigue, suggesting a lack of consensus on measurement. Conclusion This review confirms the association of fatigue with depression and anxiety. Directionality needs to be better delineated in longitudinal studies. PMID:19855028
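    The sample-size-weighted average correlation reported above can be computed in more than one way. The sketch below uses the common Fisher z-transform variant rather than weighting raw correlations directly, and the (r, n) pairs are invented examples, not values from the reviewed studies.

        import numpy as np

        def weighted_mean_r(rs, ns):
            rs, ns = np.asarray(rs, float), np.asarray(ns, float)
            z = np.arctanh(rs)        # Fisher z-transform stabilizes the variance
            w = ns - 3.0              # inverse variance of z is 1/(n - 3)
            return np.tanh(np.sum(w * z) / np.sum(w))

        print(weighted_mean_r([0.61, 0.48, 0.55], [120, 300, 85]))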

  8. Systematic Development and Validation of a Thin-Layer Densitometric Bioanalytical Method for Estimation of Mangiferin Employing Analytical Quality by Design (AQbD) Approach

    PubMed Central

    Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O.P.; Singh, Bhupinder

    2016-01-01

    The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) earmarked, namely, retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. Face-centered cubic design was selected for optimization of volume loaded and plate dimensions as the critical method parameters selected from screening studies employing D-optimal and Plackett–Burman design studies, followed by evaluating their effect on the CAAs. The mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50–800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. PMID:26912808
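    For readers unfamiliar with the design named above: a two-factor face-centered design consists of the 2^2 factorial corners, four axial points placed on the cube faces (alpha = 1), and replicated center runs. The sketch constructs that coded matrix; the mapping to an actual factor setting is an invented illustration, not the study's settings.

        import itertools
        import numpy as np

        def face_centered_design(n_center=3):
            corners = list(itertools.product([-1.0, 1.0], repeat=2))  # 2^2 factorial
            faces = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]  # alpha = 1
            centers = [(0.0, 0.0)] * n_center                # replicated center runs
            return np.array(corners + faces + centers)

        coded = face_centered_design()
        volume_uL = 6.0 + 4.0 * coded[:, 0]   # e.g. map factor 1 to 2-10 uL loads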

  9. Epidemiology and Reporting Characteristics of Systematic Reviews of Biomedical Research: A Cross-Sectional Study.

    PubMed

    Page, Matthew J; Shamseer, Larissa; Altman, Douglas G; Tetzlaff, Jennifer; Sampson, Margaret; Tricco, Andrea C; Catalá-López, Ferrán; Li, Lun; Reid, Emma K; Sarkis-Onofre, Rafael; Moher, David

    2016-05-01

    Systematic reviews (SRs) can help decision makers interpret the deluge of published biomedical literature. However, an SR may be of limited use if the methods used to conduct it are flawed or its reporting is incomplete. To our knowledge, since 2004 there has been no cross-sectional study of the prevalence, focus, and completeness of reporting of SRs across different specialties. Therefore, the aim of our study was to investigate the epidemiological and reporting characteristics of a more recent cross-section of SRs. We searched MEDLINE to identify potentially eligible SRs indexed during the month of February 2014. Citations were screened using prespecified eligibility criteria. Epidemiological and reporting characteristics of a random sample of 300 SRs were extracted by one reviewer, with a 10% sample extracted in duplicate. We compared characteristics of Cochrane versus non-Cochrane reviews, and the 2014 sample of SRs versus a 2004 sample of SRs. We identified 682 SRs, suggesting that more than 8,000 SRs are being indexed in MEDLINE annually, corresponding to a 3-fold increase over the last decade. The majority of SRs addressed a therapeutic question and were conducted by authors based in China, the UK, or the US; they included a median of 15 studies involving 2,072 participants. Meta-analysis was performed in 63% of SRs, mostly using standard pairwise methods. Study risk of bias/quality assessment was performed in 70% of SRs but was rarely incorporated into the analysis (16%). Few SRs (7%) searched sources of unpublished data, and the risk of publication bias was considered in less than half of SRs. Reporting quality was highly variable; at least a third of SRs did not report use of a SR protocol, eligibility criteria relating to publication status, years of coverage of the search, a full Boolean search logic for at least one database, methods for data extraction, methods for study risk of bias assessment, a primary outcome, an abstract conclusion that incorporated study limitations, or the funding source of the SR. Cochrane SRs, which accounted for 15% of the sample, had more complete reporting than all other types of SRs. Reporting has generally improved since 2004, but remains suboptimal for many characteristics. An increasing number of SRs are being published, and many are poorly conducted and reported. Strategies are needed to help reduce this avoidable waste in research.

  10. MRI/US fusion-guided prostate biopsy allows for equivalent cancer detection with significantly fewer needle cores in biopsy-naive men

    PubMed Central

    Yarlagadda, Vidhush K.; Lai, Win Shun; Gordetsky, Jennifer B.; Porter, Kristin K.; Nix, Jeffrey W.; Thomas, John V.; Rais-Bahrami, Soroush

    2018-01-01

    PURPOSE We aimed to investigate the efficiency and cancer detection of magnetic resonance imaging (MRI)/ultrasonography (US) fusion-guided prostate biopsy in a cohort of biopsy-naive men compared with standard-of-care systematic extended sextant transrectal ultrasonography (TRUS)-guided biopsy. METHODS From 2014 to 2016, 72 biopsy-naive men referred for initial prostate cancer evaluation who underwent MRI of the prostate were prospectively evaluated. Retrospective review was performed on 69 patients with lesions suspicious for malignancy who underwent MRI/US fusion-guided biopsy in addition to systematic extended sextant biopsy. Biometric, imaging, and pathology data from both the MRI-targeted biopsies and systematic biopsies were analyzed and compared. RESULTS There were no significant differences in overall prostate cancer detection when comparing MRI-targeted biopsies to standard systematic biopsies (P = 0.39). Furthermore, there were no significant differences in the distribution of severity of cancers based on grade groups in cases with cancer detection (P = 0.68). However, significantly fewer needle cores were taken during the MRI/US fusion-guided biopsy compared with systematic biopsy (63% fewer cores sampled, P < 0.001). CONCLUSION In biopsy-naive men, MRI/US fusion-guided prostate biopsy offers equal prostate cancer detection compared with systematic TRUS-guided biopsy with significantly fewer tissue cores using the targeted technique. This approach can potentially reduce morbidity in the future if used instead of systematic biopsy, without sacrificing the ability to detect prostate cancer, particularly in cases with higher grade disease. PMID:29770762

  11. Systematic review of the evidence for Trails B cut-off scores in assessing fitness-to-drive

    PubMed Central

    Roy, Mononita; Molnar, Frank

    2013-01-01

    Background Fitness-to-drive guidelines recommend employing the Trail Making B Test (a.k.a. Trails B), but do not provide guidance regarding cut-off scores. There is ongoing debate regarding the optimal cut-off score on the Trails B test. The objective of this study was to address this controversy by systematically reviewing the evidence for specific Trails B cut-off scores (e.g., cut-offs in both time to completion and number of errors) with respect to fitness-to-drive. Methods Systematic review of all prospective cohort, retrospective cohort, case-control, correlation, and cross-sectional studies reporting the ability of the Trails B to predict driving safety that were published in English-language, peer-reviewed journals. Results Forty-seven articles were reviewed. None of the articles justified sample sizes via formal calculations. Cut-off scores reported based on research include: 90 seconds, 133 seconds, 147 seconds, 180 seconds, and < 3 errors. Conclusions There is support for the previously published Trails B cut-offs of 3 minutes or 3 errors (the ‘3 or 3 rule’). Major methodological limitations of this body of research were uncovered including (1) lack of justification of sample size leaving studies open to Type II error (i.e., false negative findings), and (2) excessive focus on associations rather than clinically useful cut-off scores. PMID:23983828

  12. Dispersive liquid-liquid microextraction based on the solidification of floating organic droplet for the determination of polychlorinated biphenyls in aqueous samples.

    PubMed

    Dai, Liping; Cheng, Jing; Matsadiq, Guzalnur; Liu, Lu; Li, Jun-Kai

    2010-08-03

    In the proposed method, an extraction solvent with a lower toxicity and density than the solvents typically used in dispersive liquid-liquid microextraction was used to extract seven polychlorinated biphenyls (PCBs) from aqueous samples. Because of the density and melting point of the extraction solvent, the extract forms a layer on top of the aqueous sample and can be collected by solidifying it at low temperature; the solidified phase is then easily removed from the aqueous phase. Based on preliminary studies, 1-undecanol was selected as the extraction solvent, and a series of parameters that affect the extraction efficiency were systematically investigated. Under the optimized conditions, enrichment factors for PCBs ranged between 494 and 606. Based on a signal-to-noise ratio of 3, the limit of detection for the method ranged between 3.3 and 5.4 ng/L. Good linearity, reproducibility and recovery were also obtained. © 2010 Elsevier B.V. All rights reserved.
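    The two figures of merit quoted above follow from standard definitions; this sketch shows the arithmetic with placeholder numbers, using one common S/N-based formulation of the detection limit.

        def enrichment_factor(c_extract, c_initial):
            """EF = analyte concentration in the collected drop / initial one."""
            return c_extract / c_initial

        def lod_from_snr(noise_sd, slope, snr=3.0):
            """Concentration whose signal is `snr` times the baseline noise."""
            return snr * noise_sd / slope

        print(enrichment_factor(30.3, 0.05))   # -> 606, the upper EF reported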

  13. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    NASA Astrophysics Data System (ADS)

    Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.

    2018-06-01

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ∼ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
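    As a heavily simplified illustration of the clustering-redshift idea (not the DES pipeline): the unknown sample's redshift distribution can be estimated, up to bias factors, from its angular cross-correlation with reference samples binned in redshift. Dividing by the square root of the reference autocorrelation absorbs only the reference bias; the unknown-sample bias evolution the paper marginalizes over is left unmodeled here, and all arrays are synthetic placeholders.

        import numpy as np

        def nz_from_cross_correlations(w_ur, w_rr, z_bins):
            """w_ur[i]: unknown-reference cross-correlation in redshift bin i;
            w_rr[i]: reference autocorrelation (scales as reference bias^2)."""
            nz = np.asarray(w_ur) / np.sqrt(np.asarray(w_rr))
            return nz / np.trapz(nz, z_bins)   # normalize to a unit-area n(z)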

  16. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    PubMed

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
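    The two schemes are easy to contrast in a few lines. This toy keeps raw substrings in a dictionary rather than building a real hashed index, and the parameter choices are arbitrary illustrations.

        def fixed_sampling(seq, k, step):
            """Index every step-th k-mer; queries must probe all query k-mers."""
            return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, step)}

        def minimizer_sampling(seq, k, w):
            """Index the lexicographically smallest k-mer in each window of
            w consecutive k-mers; queries can be sampled the same way."""
            chosen = {}
            for start in range(len(seq) - k - w + 2):
                window = [(seq[i:i + k], i) for i in range(start, start + w)]
                kmer, pos = min(window)
                chosen[pos] = kmer     # adjacent windows often reuse a minimizer
            return chosen

        idx = minimizer_sampling("ACGTACGTGGA", k=4, w=3)

    Fixed sampling stores roughly one k-mer per step, while minimizer sampling guarantees that any window of w consecutive k-mers shares at least one sampled k-mer with a query sampled the same way, which is what permits query-side sampling.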

  18. Luminescence dating of quaternary deposits in geology in Brazil.

    PubMed

    Tatumi, Sonia Hatsue; Gozzi, Giuliano; Yee, Márcio; de Oliveira, Victor Inácio; Sallun, Alethéa Ernandes Martins; Suguio, Kenitiro

    2006-01-01

    In the present work, systematic dating by luminescence methods was performed on 50 Quaternary geological samples from the study area in São Paulo State, Brazil. Bleaching experiments showed that the residual TL intensity of the 375 °C peak of quartz was reached after 10 h of sunlight exposure. The intensity decays of the 325 and 375 °C TL peaks can be fitted with a second-order exponential equation. Paleodose values were evaluated using regeneration methods with multiple aliquots. The samples dated indicate preliminary ages varying from 9 ± 1 to 935 ± 130 ka for colluvio-elluvial deposits, and from 17 ± 2 to 215 ± 30 ka for alluvial deposits of the study area. They cover four peneplained surfaces shaped during the Quaternary: I (1000-400 ka), II (400-120 ka), III (120-10 ka) and IV (10 ka until today), in decreasing order.
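    Two quantitative steps mentioned above are easy to sketch: fitting a two-component exponential decay to TL intensity versus sunlight-exposure time, and converting a paleodose into an age. All data points and parameter values below are invented placeholders.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_exp(t, a1, k1, a2, k2, c):
            """Second-order (two-component) decay toward a residual level c."""
            return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t) + c

        t = np.linspace(0, 10, 12)                  # hours of sunlight (synthetic)
        tl = two_exp(t, 0.5, 2.0, 0.35, 0.25, 0.15) # synthetic TL intensities
        params, _ = curve_fit(two_exp, t, tl, p0=(0.4, 1.0, 0.4, 0.1, 0.1))

        def luminescence_age(paleodose_gy, dose_rate_gy_per_ka):
            """Age (ka) = paleodose (Gy) / environmental dose rate (Gy/ka)."""
            return paleodose_gy / dose_rate_gy_per_ka

        print(luminescence_age(25.0, 2.5))          # -> 10.0 ka, placeholder values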

  19. Characterization of Factors Affecting Nanoparticle Tracking Analysis Results With Synthetic and Protein Nanoparticles.

    PubMed

    Krueger, Aaron B; Carnell, Pauline; Carpenter, John F

    2016-04-01

    In many manufacturing and research areas, the ability to accurately monitor and characterize nanoparticles is becoming increasingly important. Nanoparticle tracking analysis is rapidly becoming a standard method for this characterization, yet several key factors in data acquisition and analysis may affect results. Nanoparticle tracking analysis is prone to user input and bias because of the large number of adjustable parameters, analyzes only a limited sample volume, and can be affected by individual sample characteristics such as polydispersity or complex protein solutions. This study systematically addressed these key issues. The integrated syringe pump was used to increase the sample volume analyzed. Measurements recorded under flow showed a reduction in total particle counts for both polystyrene and protein particles compared with those collected under static conditions. In addition, data for polydisperse samples tended to lose peak resolution at higher flow rates, masking distinct particle populations. Furthermore, in a bimodal particle population, a bias was seen toward the larger species within the sample. The impacts of filtration on an agitated intravenous immunoglobulin sample and of operating parameters including "MINexps" and "blur" were investigated to optimize the method. Taken together, this study provides recommendations on instrument settings and sample preparation to properly characterize complex samples. Copyright © 2016. Published by Elsevier Inc.

  20. Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.

    PubMed

    Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo

    2018-02-23

    The study aim was to evaluate and compare the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI and run on the DxI platform, with those of the Access AccuTnI+3 method and the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. The LoB, LoD, and LoQ at 20% and 10% CV of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and results similar to those of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values measured in EDTA and heparin plasma samples. Access hs-cTnI has analytical sensitivity parameters significantly improved compared with the Access AccuTnI+3 method and similar to those of the high-sensitivity method on the ARCHITECT platform.
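    The limit definitions used in such protocols (CLSI EP17-style) reduce to short formulas. The sketch below shows the standard arithmetic with synthetic replicate values; it is an illustration of the definitions, not the study's actual protocol or data.

        import numpy as np

        blanks = np.array([0.2, 0.4, 0.3, 0.5, 0.3, 0.4])       # ng/L, synthetic
        low_samples = np.array([1.1, 1.5, 1.2, 1.6, 1.3, 1.4])  # ng/L, synthetic

        lob = blanks.mean() + 1.645 * blanks.std(ddof=1)        # 95th pct of blanks
        lod = lob + 1.645 * low_samples.std(ddof=1)             # lowest detectable

        def cv(x):
            """LoQ at a target CV: the lowest concentration whose replicate
            CV stays at or below 10% (or 20%)."""
            return x.std(ddof=1) / x.mean()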

  1. I-Xe systematics in LL chondrites

    NASA Technical Reports Server (NTRS)

    Bernatowicz, T. J.; Podosek, F. A.; Swindle, T. D.; Honda, M.

    1988-01-01

    A stepwise heating analysis of Ar and Xe data from five neutron-irradiated whole rock LL chondrites (Soko Banja, Alta Ameen, Tuxtuac, Guidder, and Olivenza) is presented, emphasizing the complicated thermal history of ordinary chondrites. None of the present meteorites shows a well-defined (Ar-40)-(Ar-39) apparent age plateau comprising more than two release fractions. Most of the samples were found to yield well-defined high-temperature correlations between Xe-129/Xe-130 and Xe-128/Xe-130, and thus determinations of I-129/I-127 and Xe-129/Xe-130 at the time of isotopic closure for Xe. As in the case of other ordinary chondrites, the I-Xe systematics for LL chondrites correlate neither with metamorphic grade nor with chronologies based upon other methods.
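    The high-temperature correlation described above is, computationally, a straight-line fit. In I-Xe dating, 128Xe is produced from 127I by the neutron irradiation, so the slope of 129Xe/130Xe against 128Xe/130Xe in the stepwise-release data carries the 129I/127I ratio at xenon closure. The ratios below are synthetic placeholders, not the paper's measurements.

        import numpy as np

        r128 = np.array([1.0, 2.5, 4.0, 6.0])     # 128Xe/130Xe, high-T steps
        r129 = np.array([7.1, 9.4, 11.8, 14.9])   # 129Xe/130Xe, same steps
        slope, intercept = np.polyfit(r128, r129, 1)
        # slope -> 129I/127I times an irradiation conversion factor;
        # intercept -> the trapped (initial) 129Xe/130Xe component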

  2. Counting glomeruli and podocytes: rationale and methodologies

    PubMed Central

    Puelles, Victor G.; Bertram, John F.

    2015-01-01

    Purpose of review There is currently much interest in the numbers of both glomeruli and podocytes. This interest stems from greater understanding of the effects of suboptimal fetal events on nephron endowment, the associations between low nephron number and chronic cardiovascular and kidney disease in adults, and the emergence of the podocyte depletion hypothesis. Recent findings Obtaining accurate and precise estimates of glomerular and podocyte number has proven surprisingly difficult. When whole kidneys or large tissue samples are available, design-based stereological methods are considered gold-standard because they are based on principles that negate systematic bias. However, these methods are often tedious and time-consuming, and oftentimes inapplicable when dealing with small samples such as biopsies. Therefore, novel methods suitable for small tissue samples, and innovative approaches to facilitate high-throughput measurements, such as magnetic resonance imaging (MRI) to estimate glomerular number and flow cytometry to estimate podocyte number, have recently been described. Summary This review describes current gold-standard methods for estimating glomerular and podocyte number, as well as methods developed in the past 3 years. We are now better placed than ever before to accurately and precisely estimate glomerular and podocyte number, and to examine relationships between these measurements and kidney health and disease. PMID:25887899

  3. Conventional, Bayesian, and Modified Prony's methods for characterizing fast and slow waves in equine cancellous bone

    PubMed Central

    Groopman, Amber M.; Katz, Jonathan I.; Holland, Mark R.; Fujita, Fuminori; Matsukawa, Mami; Mizuno, Katsunori; Wear, Keith A.; Miller, James G.

    2015-01-01

    Conventional, Bayesian, and the modified least-squares Prony's plus curve-fitting (MLSP + CF) methods were applied to data acquired using 1 MHz center frequency, broadband transducers on a single equine cancellous bone specimen that was systematically shortened from 11.8 mm down to 0.5 mm for a total of 24 sample thicknesses. Due to overlapping fast and slow waves, conventional analysis methods were restricted to data from sample thicknesses ranging from 11.8 mm to 6.0 mm. In contrast, Bayesian and MLSP + CF methods successfully separated fast and slow waves and provided reliable estimates of the ultrasonic properties of fast and slow waves for sample thicknesses ranging from 11.8 mm down to 3.5 mm. Comparisons of the three methods were carried out for phase velocity at the center frequency and the slope of the attenuation coefficient for the fast and slow waves. Good agreement among the three methods was also observed for average signal loss at the center frequency. The Bayesian and MLSP + CF approaches were able to separate the fast and slow waves and provide good estimates of the fast and slow wave properties even when the two wave modes overlapped in both time and frequency domains making conventional analysis methods unreliable. PMID:26328678
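    As context for how exponential-fitting methods separate overlapping wave modes, here is a classical least-squares Prony fit: it models a uniformly sampled record as a sum of damped complex exponentials. This is a minimal stand-in for illustration only, not the authors' modified MLSP + CF procedure or the Bayesian analysis; the model order and test signal are arbitrary.

        import numpy as np

        def prony(y, p):
            N = len(y)
            # 1) linear-prediction coefficients by least squares
            A = np.column_stack([y[p - m - 1:N - m - 1] for m in range(p)])
            a = np.linalg.lstsq(A, -y[p:], rcond=None)[0]
            # 2) poles are the roots of the prediction polynomial
            z = np.roots(np.concatenate(([1.0], a)))
            # 3) complex amplitudes from a Vandermonde least-squares fit
            V = z[None, :] ** np.arange(N)[:, None]
            amps = np.linalg.lstsq(V, y, rcond=None)[0]
            return z, amps    # |z|: damping per sample; angle(z): frequency

        # two overlapping damped modes, then recover their poles/amplitudes
        n = np.arange(200)
        y = 1.0 * 0.98**n * np.cos(0.3 * n) + 0.6 * 0.95**n * np.cos(0.45 * n + 1.0)
        poles, amps = prony(y.astype(complex), p=4)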

  4. Multi-pesticides residue analysis of grains using modified magnetic nanoparticle adsorbent for facile and efficient cleanup.

    PubMed

    Liu, Zhenzhen; Qi, Peipei; Wang, Xiangyun; Wang, Zhiwei; Xu, Xiahong; Chen, Wenxue; Wu, Liyu; Zhang, Hu; Wang, Qiang; Wang, Xinquan

    2017-09-01

    A facile, rapid sample pretreatment method was developed based on magnetic nanoparticles for multi-pesticide residue analysis of grains. Magnetite (Fe3O4) nanoparticles modified with 3-(N,N-diethylamino)propyltrimethoxysilane (Fe3O4-PSA) and commercial C18 were selected as the cleanup adsorbents to remove the target interferences of the matrix, such as fatty acids and non-polar compounds. Rice was used as the representative grain sample for method optimization. The amounts of Fe3O4-PSA and C18 were systematically investigated to select suitable purification conditions, and the simultaneous determination of 50 pesticides and 8 related metabolites in rice was established by liquid chromatography-tandem mass spectrometry. Under the optimal conditions, method validation was performed, including linearity, sensitivity, matrix effect, recovery and precision, all of which satisfy the requirements for pesticide residue analysis. Compared with the conventional QuEChERS method using a non-magnetic cleanup adsorbent, the present method saves 30% of the pretreatment time, making high-throughput analysis possible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. NetBenchmark: a bioconductor package for reproducible benchmarks of gene regulatory network inference.

    PubMed

    Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E

    2015-09-29

    In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow those methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool, able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework, which uses various datasets, highlights the specialization of some methods toward particular network types and data. As a result, it is possible to identify the techniques that have broad overall performance.

  6. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    PubMed

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    PubMed

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewing data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions via email or telephone with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded leaving 29 journals in the sample. Of 29 journals, 24 (83%) responded to our initial inquiry of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ with 4 (29%) journals indicating ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or never detected by their journal previously. The majority (n = 10/14, 72%) of journals were unwilling to report or make discrepancies found in manuscripts available to the public. Although the minority, there were some journals (n = 4/14, 29%) which described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB and its estimated prevalence was much lower than that reported in literature suggesting inadequate detection. There exists a potential for overestimation of treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to utilize comprehensive methods for detection of ORB, but overall, the data suggest improvements at the biomedical journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  8. Canine retraction: A systematic review of different methods used.

    PubMed

    Kulshrestha, Rohit S; Tandon, Ragni; Chandra, Pratik

    2015-01-01

    Canine retraction is a very important step in the treatment of patients with crowding or first premolar extraction cases. In severe crowding cases, until the canines have been distalized to relieve the crowding, space to correctly align the incisors will not be available. Correct positioning of the canines after retraction is of great importance for function, stability, and esthetics. The aim of this systematic review was to examine, in an evidence-based way, which kinds of canine retraction methods/techniques are most effective and which have the least side effects. A literature survey was performed using the Medline database (Entrez PubMed) and the Science Direct database, covering the period from 1985 to 2014, to find efficient ways to accomplish canine retraction. Randomized controlled trials (RCTs), prospective and retrospective controlled studies, and clinical trials were included. Two reviewers selected and extracted the data independently and assessed the quality of the retrieved studies. The search strategy resulted in 324 articles, of which 22 met the inclusion criteria. Due to the vast heterogeneity in study methods, the scientific evidence was too weak to evaluate retraction efficiency during space closure. The data reviewed so far showed that elastomeric power chains, elastic threads, magnets, NiTi coil springs, corticotomies, distraction osteogenesis, and laser therapy are all able to provide an optimum rate of tooth movement. All the methods were nearly similar to each other for retraction of canines. Most of the techniques lead to anchorage loss in various amounts depending on the methods used. Most of the studies had serious problems with small sample size, confounding factors, lack of method error analysis, and no blinding in measurements. To obtain reliable scientific evidence, controlled RCTs with sufficient sample sizes are needed to determine which method/technique is the most effective in the respective retraction situation. Further studies should also consider patient acceptance and cost analysis, as well as implants and minor surgeries for canine retraction.

  9. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on model development, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42%), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71%) reported using univariate screening, which is not recommended in building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86%) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the probability estimates for predicting postoperative pancreatic fistula.

  10. Accuracy of recommended sampling and assay methods for the determination of plasma-free and urinary fractionated metanephrines in the diagnosis of pheochromocytoma and paraganglioma: a systematic review.

    PubMed

    Därr, Roland; Kuhn, Matthias; Bode, Christoph; Bornstein, Stefan R; Pacak, Karel; Lenders, Jacques W M; Eisenhofer, Graeme

    2017-06-01

    To determine the accuracy of biochemical tests for the diagnosis of pheochromocytoma and paraganglioma. A search of the PubMed database was conducted for English-language articles published between October 1958 and December 2016 on the biochemical diagnosis of pheochromocytoma and paraganglioma using immunoassay methods or high-performance liquid chromatography with coulometric/electrochemical or tandem mass spectrometric detection for measurement of fractionated metanephrines in 24-h urine collections or plasma-free metanephrines obtained under seated or supine blood sampling conditions. Application of the Standards for Reporting of Diagnostic Studies Accuracy Group criteria yielded 23 suitable articles. Summary receiver operating characteristic analysis revealed sensitivities/specificities of 94/93% and 91/93% for measurement of plasma-free metanephrines and urinary fractionated metanephrines using high-performance liquid chromatography or immunoassay methods, respectively. Partial areas under the curve were 0.947 vs. 0.911. Irrespective of the analytical method, sensitivity was significantly higher for supine compared with seated sampling, 95 vs. 89% (p < 0.02), while specificity was significantly higher for supine sampling compared with 24-h urine, 95 vs. 90% (p < 0.03). Partial areas under the curve were 0.942, 0.913, and 0.932 for supine sampling, seated sampling, and urine. Test accuracy increased linearly from 90 to 93% for 24-h urine at prevalence rates of 0.0-1.0, decreased linearly from 94 to 89% for seated sampling and was constant at 95% for supine conditions. Current tests for the biochemical diagnosis of pheochromocytoma and paraganglioma show excellent diagnostic accuracy. Supine sampling conditions and measurement of plasma-free metanephrines using high-performance liquid chromatography with coulometric/electrochemical or tandem mass spectrometric detection provides the highest accuracy at all prevalence rates.

  11. Tactical Defenses Against Systematic Variation in Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2002-01-01

    This paper examines the role of unexplained systematic variation on the reproducibility of wind tunnel test results. Sample means and variances estimated in the presence of systematic variations are shown to be susceptible to bias errors that are generally non-reproducible functions of those variations. Unless certain precautions are taken to defend against the effects of systematic variation, it is shown that experimental results can be difficult to duplicate and of dubious value for predicting system response with the highest precision or accuracy that could otherwise be achieved. Results are reported from an experiment designed to estimate how frequently systematic variations are in play in a representative wind tunnel experiment. These results suggest that significant systematic variation occurs frequently enough to cast doubts on the common assumption that sample observations can be reliably assumed to be independent. The consequences of ignoring correlation among observations induced by systematic variation are considered in some detail. Experimental tactics are described that defend against systematic variation. The effectiveness of these tactics is illustrated through computational experiments and real wind tunnel experimental results. Some tutorial information describes how to analyze experimental results that have been obtained using such quality assurance tactics.
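    A small computational toy makes the paper's central point concrete: when an unexplained systematic drift acts during a test, run order matters, and randomizing the set-point order converts the drift's bias into ordinary noise. All numbers below are invented for illustration and do not come from the paper's experiments.

        import numpy as np

        rng = np.random.default_rng(1)
        alpha = np.repeat(np.arange(0, 10), 5).astype(float)  # set points (deg)
        true_slope = 0.1                                      # per degree

        def run_experiment(order):
            drift = 0.002 * np.arange(len(order))             # slow facility drift
            y = (true_slope * alpha[order] + drift
                 + rng.normal(0, 0.005, len(order)))
            return np.polyfit(alpha[order], y, 1)[0]          # estimated slope

        sequential = np.arange(len(alpha))                    # ordered sweep
        slopes_seq = [run_experiment(sequential) for _ in range(200)]
        slopes_rnd = [run_experiment(rng.permutation(len(alpha)))
                      for _ in range(200)]
        print(np.mean(slopes_seq) - true_slope)   # systematic bias remains
        print(np.mean(slopes_rnd) - true_slope)   # bias near zero, more variance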

  12. Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?

    PubMed

    Li, Tianjing; Dickersin, Kay

    2013-06-01

    Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  13. Use of Matrix Sampling Procedures to Assess Achievement in Solving Open Addition and Subtraction Sentences.

    ERIC Educational Resources Information Center

    Montague, Margariete A.

    This study investigated the feasibility of concurrently and randomly sampling examinees and items in order to estimate group achievement. Seven 32-item tests reflecting a 640-item universe of simple open sentences were used such that item selection (random, systematic) and assignment (random, systematic) of items (four, eight, sixteen) to forms…

  14. Inert gases in a terra sample - Measurements in six grain-size fractions and two single particles from Lunar 20.

    NASA Technical Reports Server (NTRS)

    Heymann, D.; Lakatos, S.; Walton, J. R.

    1973-01-01

    Review of the results of inert gas measurements performed on six grain-size fractions and two single particles from four samples of Luna 20 material. Presented and discussed data include the inert gas contents, element and isotope systematics, radiation ages, and Ar-36/Ar-40 systematics.

  15. Transport Coefficients from Large Deviation Functions

    NASA Astrophysics Data System (ADS)

    Gao, Chloe; Limmer, David

    2017-10-01

    We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. This method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which are scaled cumulant generating functions analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
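
    For contrast, the traditional Green-Kubo route the paper benchmarks against integrates the equilibrium current autocorrelation function. A toy sketch with a synthetic AR(1) "current"; the prefactor and integration cutoff are assumptions, and for this series the exact integral is about 205:

```python
import numpy as np

def green_kubo(current, dt, cutoff, prefactor=1.0):
    """Transport coefficient as prefactor * time integral of the current
    autocorrelation; the prefactor depends on which coefficient is sought."""
    n = len(current)
    acf = np.correlate(current, current, mode="full")[n - 1:n - 1 + cutoff]
    acf /= np.arange(n, n - cutoff, -1)    # unbiased per-lag normalization
    return prefactor * dt * acf.sum()      # rectangle-rule time integral

rng = np.random.default_rng(1)
j = np.empty(20_000)
j[0] = 0.0
for i in range(1, j.size):                 # AR(1) stand-in for a molecular current
    j[i] = 0.95 * j[i - 1] + rng.normal()

print(f"{green_kubo(j, dt=1.0, cutoff=200):.0f}")   # near the analytic ~205
```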

  16. Zeta potential and Raman studies of PVP capped Bi2S3 nanoparticles synthesized by polyol method

    NASA Astrophysics Data System (ADS)

    Tarachand; Sathe, Vasant G.; Okram, Gunadhor S.

    2018-05-01

    Here we report the synthesis and characterisation of polyvinylpyrrolidone (PVP) capped Bi2S3 nanoparticles via a one-step, catalyst-free polyol method. Raman spectroscopy, dynamic light scattering (DLS) and zeta potential analyses were performed on the products. Rietveld refinement of powder XRD patterns confirmed the formation of single-phase orthorhombic Bi2S3 for all PVP-capped samples. The presence of eight clear Raman modes further confirmed the formation of stoichiometric Bi2S3. DLS studies show a clear increase in hydrodynamic diameter for samples made with increasing PVP concentration. Particle sizes obtained from DLS and XRD (using Scherrer's formula), combined with changes in the full width at half maximum of the Raman modes, collectively suggest an overall improvement in crystallinity and product quality on introducing PVP. In zeta potential (ζ) measurements, steric hindrance of the carbon chains plays a crucial role, and a systematic reduction of the ζ value is observed for samples made with decreasing PVP concentration. An isoelectric point is obtained for the sample made with low PVP (1 g). These results are likely to open a window for medical and catalytic applications of Bi2S3.
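
    Scherrer's formula, used above to estimate crystallite size from XRD peak broadening, is straightforward to apply; the peak parameters below are illustrative, not taken from the paper:

```python
import numpy as np

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), with beta the
    peak FWHM in radians and Cu K-alpha radiation assumed by default."""
    beta = np.radians(fwhm_deg)
    theta = np.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * np.cos(theta))

# A hypothetical Bi2S3 reflection near 2-theta = 28.6 deg with 0.4 deg FWHM:
print(f"{scherrer_size_nm(fwhm_deg=0.4, two_theta_deg=28.6):.0f} nm")   # ~20 nm
```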

  17. Supernova Cosmology Without Spectroscopy

    NASA Astrophysics Data System (ADS)

    Johnson, Elizabeth; Scolnic, Daniel; Kessler, Rick; Rykoff, Eli; Rozo, Eduardo

    2018-01-01

    Present and future supernova (SN) surveys face several challenges: the ability to acquire redshifts of either the SN or its host galaxy, the ability to classify a SN without a spectrum, and unknown relations between SN luminosity and host galaxy type. We present here a new approach that addresses these challenges. From the large sample of SNe discovered and measured by the Dark Energy Survey (DES), we cull the sample to only SNe located in luminous red galaxies (LRGs). For these galaxies, photometric redshift estimates are expected to be accurate to a standard deviation of 0.02x(1+z). In addition, only Type Ia supernovae are expected to exist in these galaxies, thereby providing a pure SN Ia sample. Furthermore, we can combine this high-redshift sample with a low-redshift sample of SNe located in LRGs, thereby producing a sample that is less sensitive to host galaxy relations because the host galaxy demographic is consistent across the redshift range. We find that the current DES sample has ~250 SNe in LRGs, similar in size to current SN Ia samples used to measure cosmological parameters. We present our method to produce a photometric-only Hubble diagram and measure cosmological parameters. Finally, we discuss systematic uncertainties from this approach and forecast constraints from this method for LSST, which should have a sample roughly 200 times as large.

  18. Molecular testing of adult Pacific salmon and trout (Oncorhynchus spp.) for several RNA viruses demonstrates widespread distribution of piscine orthoreovirus in Alaska and Washington

    USGS Publications Warehouse

    Purcell, Maureen; Thompson, Rachel L.; Evered, Joy; Kerwin, John; Meyers, Ted R.; Stewart, Bruce; Winton, James

    2018-01-01

    This research was initiated in conjunction with a systematic, multiagency surveillance effort in the United States (U.S.) in response to reported findings of infectious salmon anaemia virus (ISAV) RNA in British Columbia, Canada. In the systematic surveillance study reported in a companion paper, tissues from various salmonids taken from Washington and Alaska were surveyed for ISAV RNA using the U.S.-approved diagnostic method, and samples were released for use in this present study only after testing negative. Here, we tested a subset of these samples for ISAV RNA with three additional published molecular assays, as well as for RNA from salmonid alphavirus (SAV), piscine myocarditis virus (PMCV) and piscine orthoreovirus (PRV). All samples (n = 2,252; 121 stock cohorts) tested negative for RNA from ISAV, PMCV, and SAV. In contrast, there were 25 stock cohorts from Washington and Alaska that had one or more individuals test positive for PRV RNA; prevalence within stocks varied and ranged from 2% to 73%. The overall prevalence of PRV RNA-positive individuals across the study was 3.4% (77 of 2,252 fish tested). Findings of PRV RNA were most common in coho (Oncorhynchus kisutch Walbaum) and Chinook (O. tshawytscha Walbaum) salmon.

  19. The modified Ottawa method to establish the update need of a systematic review: glass-ionomer versus resin sealants for caries prevention

    PubMed Central

    MICKENAUTSCH, Steffen; YENGOPAL, Veerasamy

    2013-01-01

    Objective: To demonstrate the application of the modified Ottawa method by establishing the update need of a systematic review focused on the caries-preventive effect of GIC versus resin pit and fissure sealants; to answer the question of whether the existing conclusions of this systematic review are still current; and to establish whether a new update of this systematic review was needed. Methods: Application of the modified Ottawa method; application date: April/May 2012. Results: Four signals aligned with the criteria of the modified Ottawa method were identified. The content of these signals suggests that higher precision of the current systematic review results might be achieved if an update of the current review were conducted at this point in time. However, these signals further indicate that such a systematic review update, despite its higher precision, would only confirm the existing review conclusion that no statistically significant difference exists in the caries-preventive effect of GIC and resin-based fissure sealants. Conclusion: This study demonstrated that the modified Ottawa method is an effective tool for establishing the update need of a systematic review. In addition, it was established that the conclusions of the systematic review in relation to the caries-preventive effect of GIC versus resin-based fissure sealants are still current, and that no update of this systematic review was warranted at the date of application. PMID:24212996

  20. The Emergence of Systematic Review in Toxicology

    PubMed Central

    Stephens, Martin L.; Betts, Kellyn; Beck, Nancy B.; Cogliano, Vincent; Dickersin, Kay; Fitzpatrick, Suzanne; Freeman, James; Gray, George; Hartung, Thomas; McPartland, Jennifer; Rooney, Andrew A.; Scherer, Roberta W.; Verloo, Didier; Hoffmann, Sebastian

    2016-01-01

    The Evidence-based Toxicology Collaboration hosted a workshop on “The Emergence of Systematic Review and Related Evidence-based Approaches in Toxicology,” on November 21, 2014 in Baltimore, Maryland. The workshop featured speakers from agencies and organizations applying systematic review approaches to questions in toxicology, speakers with experience in conducting systematic reviews in medicine and healthcare, and stakeholders in industry, government, academia, and non-governmental organizations. Based on the workshop presentations and discussion, here we address the state of systematic review methods in toxicology, historical antecedents in both medicine and toxicology, challenges to the translation of systematic review from medicine to toxicology, and thoughts on the way forward. We conclude with a recommendation that as various agencies and organizations adapt systematic review methods, they continue to work together to ensure that there is a harmonized process for how the basic elements of systematic review methods are applied in toxicology. PMID:27208075

  1. A novel reversed-phase HPLC method for the determination of urinary creatinine by pre-column derivatization with ethyl chloroformate: comparative studies with the standard Jaffé and isotope-dilution mass spectrometric assays.

    PubMed

    Leung, Elvis M K; Chan, Wan

    2014-02-01

    Creatinine is an important biomarker for renal function diagnosis and for normalizing variations in urinary drug/metabolite concentrations. Quantification of creatinine in biological fluids such as urine and plasma is important for clinical diagnosis as well as in biomonitoring programs and urinary metabolomics/metabonomics research. Current methods for creatinine determination either are nonselective or involve the use of expensive mass spectrometers. In this paper, a novel reversed-phase high-performance liquid chromatographic (HPLC) method is presented for the determination of creatinine, a compound of high hydrophilicity, by pre-column derivatization with ethyl chloroformate. N-Ethyloxycarbonylation of creatinine significantly enhanced its hydrophobicity, facilitating its chromatographic retention as well as its quantification by HPLC. Factors governing the derivatization reaction were studied and optimized. The developed method was validated and applied to the determination of creatinine in rat urine samples. Comparative studies with the isotope-dilution mass spectrometric method revealed that the two methods do not yield systematically different creatinine concentrations, indicating that the HPLC method is suitable for the determination of creatinine in urine samples.
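
    A common way to verify that two assays show "no systematic differences" is a paired comparison on the same specimens. A minimal sketch with toy concentrations (not the paper's data):

```python
import numpy as np
from scipy import stats

# Toy creatinine concentrations (g/L) for the same urine samples by both assays.
hplc = np.array([0.52, 0.61, 0.48, 0.55, 0.70, 0.66, 0.58, 0.63])
idms = np.array([0.50, 0.63, 0.47, 0.57, 0.69, 0.65, 0.60, 0.62])

t, p = stats.ttest_rel(hplc, idms)   # paired t-test on per-sample differences
print(f"mean difference = {np.mean(hplc - idms):+.3f} g/L, p = {p:.2f}")
# A small mean difference with a large p-value is consistent with no systematic bias.
```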

  2. Comparison of Three Methods of Reducing Test Anxiety: Systematic Desensitization, Implosive Therapy, and Study Counseling

    ERIC Educational Resources Information Center

    Cornish, Richard D.; Dilley, Josiah S.

    1973-01-01

    Systematic desensitization, implosive therapy, and study counseling have all been effective in reducing test anxiety. In addition, systematic desensitization has been compared to study counseling for effectiveness. This study compares all three methods and suggests that systematic desensitization is more effective than the others, and that implosive…

  3. Comparison of Experimental Methods for Estimating Matrix Diffusion Coefficients for Contaminant Transport Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Telfeyan, Katherine Christina; Ware, Stuart Douglas; Reimus, Paul William

    Diffusion cell and diffusion wafer experiments were conducted to compare methods for estimating matrix diffusion coefficients in rock core samples from Pahute Mesa at the Nevada Nuclear Security Site (NNSS). A diffusion wafer method, in which a solute diffuses out of a rock matrix that is pre-saturated with water containing the solute, is presented as a simpler alternative to the traditional through-diffusion (diffusion cell) method. Both methods yielded estimates of matrix diffusion coefficients that were within the range of values previously reported for NNSS volcanic rocks. The difference between the estimates of the two methods ranged from 14 to 30%, and there was no systematic high or low bias of one method relative to the other. From a transport modeling perspective, these differences are relatively minor when one considers that other variables (e.g., fracture apertures, fracture spacings) influence matrix diffusion to a greater degree and tend to have greater uncertainty than diffusion coefficients. For the same relative random errors in concentration measurements, the diffusion cell method yields diffusion coefficient estimates that have less uncertainty than the wafer method. However, the wafer method is easier and less costly to implement and yields estimates more quickly, thus allowing a greater number of samples to be analyzed for the same cost and time. Given the relatively good agreement between the methods, and the lack of any apparent bias between the methods, the diffusion wafer method appears to offer advantages over the diffusion cell method if better statistical representation of a given set of rock samples is desired.

  4. Comparison of experimental methods for estimating matrix diffusion coefficients for contaminant transport modeling

    NASA Astrophysics Data System (ADS)

    Telfeyan, Katherine; Ware, S. Doug; Reimus, Paul W.; Birdsell, Kay H.

    2018-02-01

    Diffusion cell and diffusion wafer experiments were conducted to compare methods for estimating effective matrix diffusion coefficients in rock core samples from Pahute Mesa at the Nevada Nuclear Security Site (NNSS). A diffusion wafer method, in which a solute diffuses out of a rock matrix that is pre-saturated with water containing the solute, is presented as a simpler alternative to the traditional through-diffusion (diffusion cell) method. Both methods yielded estimates of effective matrix diffusion coefficients that were within the range of values previously reported for NNSS volcanic rocks. The difference between the estimates of the two methods ranged from 14 to 30%, and there was no systematic high or low bias of one method relative to the other. From a transport modeling perspective, these differences are relatively minor when one considers that other variables (e.g., fracture apertures, fracture spacings) influence matrix diffusion to a greater degree and tend to have greater uncertainty than effective matrix diffusion coefficients. For the same relative random errors in concentration measurements, the diffusion cell method yields effective matrix diffusion coefficient estimates that have less uncertainty than the wafer method. However, the wafer method is easier and less costly to implement and yields estimates more quickly, thus allowing a greater number of samples to be analyzed for the same cost and time. Given the relatively good agreement between the methods, and the lack of any apparent bias between the methods, the diffusion wafer method appears to offer advantages over the diffusion cell method if better statistical representation of a given set of rock samples is desired.
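
    As a rough illustration of the through-diffusion (diffusion cell) analysis both records describe, the steady-state flux through a saturated wafer gives the effective diffusion coefficient via Fick's first law. This is a sketch under idealized steady-state assumptions with invented numbers, not the authors' fitting procedure:

```python
# Steady-state through-diffusion estimate: De ~ J * L / C0, where J is the
# measured flux per unit area, L the wafer thickness, and C0 the (constant)
# upstream concentration. All values below are hypothetical.
J = 2.0e-9     # mol / (m^2 s), steady-state flux into the downstream reservoir
L = 0.01       # m, rock wafer thickness
C0 = 1.0       # mol / m^3, upstream concentration

De = J * L / C0
print(f"De ~ {De:.1e} m^2/s")
```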

  5. The association between the environmental endocrine disruptor bisphenol A and polycystic ovary syndrome: a systematic review and meta-analysis.

    PubMed

    Hu, Ying; Wen, Shu; Yuan, Dongzhi; Peng, Le; Zeng, Rujun; Yang, Zhilan; Liu, Qi; Xu, Liangzhi; Kang, Deying

    2018-05-01

    To investigate the association between bisphenol A (BPA) and polycystic ovary syndrome (PCOS). A systematic review and meta-analysis using STATA software for observational studies. Nine studies involving 493 PCOS patients and 440 controls were included in this review. The meta-analysis demonstrated that PCOS patients had significantly higher BPA levels compared with control groups (standardized mean difference (SMD): 2.437, 95% confidence interval (CI): (1.265, 3.609), p < .001). For studies of serum samples detected by enzyme-linked immunosorbent assay (ELISA), subgroup analyses according to ethnicity, body mass index (BMI), sample size, detection method (high-performance liquid chromatography (HPLC) and ELISA), PCOS-to-control ratio and study quality showed that high BPA levels were significantly associated with Caucasian PCOS patients (SMD: 0.615, 95% CI: (0.308, 0.922), p < .001), high BMI (SMD: 0.512, 95% CI: (0.180, 0.843), p = .002), high quality (SMD: 0.624, 95% CI: (0.391, 0.856), p < .001), and high HOMA-IR (SMD: 0.467, 95% CI: (0.121, 0.813), p = .008). Serum BPA may be positively associated with PCOS, and BPA might be involved in the insulin resistance and hyperandrogenism of PCOS. More evidence from high-quality studies, advanced detection methods, and larger cohorts in observational trials is needed to further confirm the association between BPA and PCOS.

  6. Variable Lifting Index (VLI): A New Method for Evaluating Variable Lifting Tasks.

    PubMed

    Waters, Thomas; Occhipinti, Enrico; Colombini, Daniela; Alvarez-Casado, Enrique; Fox, Robert

    2016-08-01

    We seek to develop a new approach for analyzing the physical demands of highly variable lifting tasks through an adaptation of the Revised NIOSH (National Institute for Occupational Safety and Health) Lifting Equation (RNLE) into a Variable Lifting Index (VLI). There are many jobs that contain individual lifts that vary from lift to lift due to the task requirements. The NIOSH Lifting Equation is not suitable in its present form to analyze variable lifting tasks. In extending the prior work on the VLI, two procedures are presented to allow users to analyze variable lifting tasks. One approach involves the sampling of lifting tasks performed by a worker over a shift and the calculation of the Frequency Independent Lift Index (FILI) for each sampled lift and the aggregation of the FILI values into six categories. The Composite Lift Index (CLI) equation is used with lifting index (LI) category frequency data to calculate the VLI. The second approach employs a detailed systematic collection of lifting task data from production and/or organizational sources. The data are organized into simplified task parameter categories and further aggregated into six FILI categories, which also use the CLI equation to calculate the VLI. The two procedures will allow practitioners to systematically apply the VLI method to a variety of work situations where highly variable lifting tasks are performed. The scientific basis for the VLI procedure is similar to that for the CLI originally presented by NIOSH; however, the VLI method remains to be validated. The VLI method allows an analyst to assess highly variable manual lifting jobs in which the task characteristics vary from lift to lift during a shift. © 2015, Human Factors and Ergonomics Society.

  7. Variable Lifting Index (VLI)

    PubMed Central

    Waters, Thomas; Occhipinti, Enrico; Colombini, Daniela; Alvarez-Casado, Enrique; Fox, Robert

    2015-01-01

    Objective: We seek to develop a new approach for analyzing the physical demands of highly variable lifting tasks through an adaptation of the Revised NIOSH (National Institute for Occupational Safety and Health) Lifting Equation (RNLE) into a Variable Lifting Index (VLI). Background: There are many jobs that contain individual lifts that vary from lift to lift due to the task requirements. The NIOSH Lifting Equation is not suitable in its present form to analyze variable lifting tasks. Method: In extending the prior work on the VLI, two procedures are presented to allow users to analyze variable lifting tasks. One approach involves the sampling of lifting tasks performed by a worker over a shift and the calculation of the Frequency Independent Lift Index (FILI) for each sampled lift and the aggregation of the FILI values into six categories. The Composite Lift Index (CLI) equation is used with lifting index (LI) category frequency data to calculate the VLI. The second approach employs a detailed systematic collection of lifting task data from production and/or organizational sources. The data are organized into simplified task parameter categories and further aggregated into six FILI categories, which also use the CLI equation to calculate the VLI. Results: The two procedures will allow practitioners to systematically apply the VLI method to a variety of work situations where highly variable lifting tasks are performed. Conclusions: The scientific basis for the VLI procedure is similar to that for the CLI originally presented by NIOSH; however, the VLI method remains to be validated. Application: The VLI method allows an analyst to assess highly variable manual lifting jobs in which the task characteristics vary from lift to lift during a shift. PMID:26646300
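
    The CLI aggregation used by both procedures follows the published NIOSH composite formula, CLI = STLI1 + sum over the remaining tasks of FILIi * (1/FM(1..i) - 1/FM(1..i-1)), with tasks ordered by decreasing lifting index. A sketch with illustrative numbers; the FILI values and frequency multipliers below are invented, and real FMs come from the RNLE frequency table:

```python
def composite_lift_index(stli1, fili_rest, fm_cum):
    """stli1: lifting index of the most demanding task (its frequency included);
    fili_rest: frequency-independent LIs of the remaining tasks, sorted in
    decreasing order; fm_cum[i]: frequency multiplier for the cumulative
    frequency of tasks 1..i+1, looked up in the RNLE FM table."""
    cli = stli1
    for i, fili in enumerate(fili_rest, start=1):
        cli += fili * (1.0 / fm_cum[i] - 1.0 / fm_cum[i - 1])
    return cli

# Three aggregated FILI categories with invented frequency multipliers:
print(round(composite_lift_index(1.2, [1.0, 0.8], fm_cum=[0.94, 0.88, 0.84]), 2))  # ~1.32
```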

  8. Sampling in Atypical Endometrial Hyperplasia: Which Method Results in the Lowest Underestimation of Endometrial Cancer? A Systematic Review and Meta-analysis.

    PubMed

    Bourdel, Nicolas; Chauvet, Pauline; Tognazza, Enrica; Pereira, Bruno; Botchorishvili, Revaz; Canis, Michel

    2016-01-01

    Our objective was to identify the most accurate method of endometrial sampling for the diagnosis of complex atypical hyperplasia (CAH), and the related risk of underestimation of endometrial cancer. We conducted a systematic literature search in PubMed and EMBASE (January 1999-September 2013) to identify all registered articles on this subject. Studies were selected with a 2-step method. First, titles and abstracts were analyzed by 2 reviewers, and 69 relevant articles were selected for full reading. Then, the full articles were evaluated to determine whether full inclusion criteria were met. We selected 27 studies that compared the histology of endometrial hyperplasia obtained by the diagnostic tests of interest (uterine curettage, hysteroscopically guided biopsy, or hysteroscopic endometrial resection) with the subsequent results of hysterectomy. Analysis of the studies reviewed focused on 1106 patients with a preoperative diagnosis of atypical endometrial hyperplasia. The mean risk of finding endometrial cancer at hysterectomy after atypical endometrial hyperplasia diagnosed by uterine curettage was 32.7% (95% confidence interval [CI], 26.2-39.9), with a risk of 45.3% (95% CI, 32.8-58.5) after hysteroscopically guided biopsy and 5.8% (95% CI, 0.8-31.7) after hysteroscopic resection. In total, the risk of underestimation of endometrial cancer reaches a very high rate in patients with CAH evaluated using the classic methods (i.e., uterine curettage or hysteroscopically guided biopsy). This rate of underdiagnosed endometrial cancer leads to the risk of inappropriate surgical procedures (31.7% of tubal conservation in the data available and no abdominal exploration in 24.6% of the cases). Hysteroscopic resection seems to reduce the risk of underdiagnosed endometrial cancer. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  9. Epidemiology and Reporting Characteristics of Systematic Reviews of Biomedical Research: A Cross-Sectional Study

    PubMed Central

    Page, Matthew J.; Shamseer, Larissa; Altman, Douglas G.; Tetzlaff, Jennifer; Tricco, Andrea C.; Catalá-López, Ferrán; Li, Lun; Reid, Emma K.; Sarkis-Onofre, Rafael; Moher, David

    2016-01-01

    Background Systematic reviews (SRs) can help decision makers interpret the deluge of published biomedical literature. However, a SR may be of limited use if the methods used to conduct the SR are flawed, and reporting of the SR is incomplete. To our knowledge, since 2004 there has been no cross-sectional study of the prevalence, focus, and completeness of reporting of SRs across different specialties. Therefore, the aim of our study was to investigate the epidemiological and reporting characteristics of a more recent cross-section of SRs. Methods and Findings We searched MEDLINE to identify potentially eligible SRs indexed during the month of February 2014. Citations were screened using prespecified eligibility criteria. Epidemiological and reporting characteristics of a random sample of 300 SRs were extracted by one reviewer, with a 10% sample extracted in duplicate. We compared characteristics of Cochrane versus non-Cochrane reviews, and the 2014 sample of SRs versus a 2004 sample of SRs. We identified 682 SRs, suggesting that more than 8,000 SRs are being indexed in MEDLINE annually, corresponding to a 3-fold increase over the last decade. The majority of SRs addressed a therapeutic question and were conducted by authors based in China, the UK, or the US; they included a median of 15 studies involving 2,072 participants. Meta-analysis was performed in 63% of SRs, mostly using standard pairwise methods. Study risk of bias/quality assessment was performed in 70% of SRs but was rarely incorporated into the analysis (16%). Few SRs (7%) searched sources of unpublished data, and the risk of publication bias was considered in less than half of SRs. Reporting quality was highly variable; at least a third of SRs did not report use of a SR protocol, eligibility criteria relating to publication status, years of coverage of the search, a full Boolean search logic for at least one database, methods for data extraction, methods for study risk of bias assessment, a primary outcome, an abstract conclusion that incorporated study limitations, or the funding source of the SR. Cochrane SRs, which accounted for 15% of the sample, had more complete reporting than all other types of SRs. Reporting has generally improved since 2004, but remains suboptimal for many characteristics. Conclusions An increasing number of SRs are being published, and many are poorly conducted and reported. Strategies are needed to help reduce this avoidable waste in research. PMID:27218655

  10. Level 2 validation of a flow cytometric method for detection of Escherichia coli O157:H7 in raw spinach.

    PubMed

    Williams, Anna J; Cooper, Willie M; Summage-West, Christine V; Sims, Lillie M; Woodruff, Robert; Christman, Jessica; Moskal, Ted J; Ramsaroop, Shawn; Sutherland, John B; Alusta, Pierre; Wilkes, Jon G; Buzatu, Dan A

    2015-12-23

    The Bacteriological Analytical Manual (BAM) method currently used by the United States Food and Drug Administration (FDA) to detect Escherichia coli O157:H7 in spinach was systematically compared to a new flow cytometry based method. This FDA level 2 external laboratory validation study was designed to determine the latter method's sensitivity and speed for analysis of this pathogen in raw spinach. Detection of target cell inoculations with a low cell count is critical, since enterohemorrhagic strains of E. coli require an infective dose of as few as 10 cells (Schmid-Hempel and Frank, 2007), although, according to the FDA, the infectious dose is unknown (Food and Drug Administration, 1993). Therefore, the inoculation level into the spinach, a total of 2.0±2.6 viable E. coli O157 cells, was specified to yield between 25% and 75% detection by the new method, out of 20 samples (10 positives and 10 negatives). This criterion was met in that the new method detected 60% of the nominally positive samples; the corresponding sensitivity of the reference method was 50%. For both methods the most likely explanation for false negatives was that no viable cells were actually introduced into the sample. In this validation study, the flow cytometry method was equal to the BAM in sensitivity and far superior in speed. Published by Elsevier B.V.
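
    The "no viable cells actually introduced" explanation for false negatives is a direct consequence of Poisson statistics at very low inoculation levels; a toy calculation (the mean inoculum values are illustrative):

```python
import math

# Probability that a nominally positive sample actually contains >= 1 viable
# cell when the inoculum is Poisson-distributed with mean lam.
for lam in (0.7, 1.0, 2.0):
    p_detectable = 1.0 - math.exp(-lam)
    print(f"mean inoculum {lam}: P(>=1 viable cell) = {p_detectable:.2f}")
```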

  11. Impact of Different Creatinine Measurement Methods on Liver Transplant Allocation

    PubMed Central

    Kaiser, Thorsten; Kinny-Köster, Benedict; Bartels, Michael; Parthaune, Tanja; Schmidt, Michael; Thiery, Joachim

    2014-01-01

    Introduction The model for end-stage liver disease (MELD) score is used in many countries to prioritize organ allocation for the majority of patients who require orthotopic liver transplantation. This score is calculated based on the following laboratory parameters: creatinine, bilirubin and the international normalized ratio (INR). Consequently, high measurement accuracy is essential for equitable and fair organ allocation. For serum creatinine measurements, the Jaffé method and enzymatic detection are well-established routine diagnostic tests. Methods A total of 1,013 samples from 445 patients on the waiting list or in evaluation for liver transplantation were measured using both creatinine methods from November 2012 to September 2013 at the University Hospital Leipzig, Germany. The measurements were performed in parallel according to the manufacturer's instructions after the samples arrived at the institute of laboratory medicine. Patients who had required renal replacement therapy twice in the previous week were excluded from analyses. Results Despite the good correlation between the results of both creatinine quantification methods, relevant differences were observed, which led to different MELD scores. The Jaffé measurement led to a greater MELD score in 163/1,013 (16.1%) samples, with differences of up to 4 points in one patient, whereas differences of up to 2 points were identified in 15/1,013 (1.5%) samples using the enzymatic assay. Overall, 50/152 (32.9%) patients with MELD scores >20 had higher scores when the Jaffé method was used. Discussion Using the Jaffé method to measure creatinine levels in samples from patients who require liver transplantation may lead to a systematic preference in organ allocation. In this study, the differences were particularly pronounced in samples with MELD scores >20, which has clinical relevance in the context of urgency of transplantation. These data suggest that official recommendations are needed to determine which laboratory diagnostic methods should be used when calculating MELD scores. PMID:24587188
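
    The MELD score behind these allocation decisions is a published formula of the three laboratory values, so the effect of an assay-dependent creatinine shift is easy to demonstrate; the laboratory values below are hypothetical:

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic (pre-2016) MELD. Values below 1.0 are raised to 1.0 and
    creatinine is capped at 4.0 mg/dL, following the UNOS convention."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    return round(3.78 * math.log(bili) + 11.2 * math.log(inr)
                 + 9.57 * math.log(crea) + 6.43)

# Same hypothetical patient, two creatinine results differing only by assay:
print(meld(4.0, 1.8, 1.30))   # e.g. enzymatic creatinine    -> 21
print(meld(4.0, 1.8, 1.55))   # e.g. higher Jaffé creatinine -> 22
```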

  12. Rapidly differentiating grape seeds from different sources based on characteristic fingerprints using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics.

    PubMed

    Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li

    2015-09-01

    The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H]-. Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
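
    The chemometric step described above (hierarchical clustering plus principal component analysis on the fingerprint matrix) can be sketched in a few lines; the fingerprint data here are synthetic stand-ins for the 7 sources x 20 component intensities:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic fingerprints: 7 seed sources x 20 component intensities,
# constructed so that the sources fall into two groups.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (4, 20)),
               rng.normal(3.0, 1.0, (3, 20))])

scores = PCA(n_components=2).fit_transform(X)                            # PCA view
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")  # HCA

print(scores.round(2))
print(labels)   # both analyses separate the two groups of sources
```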

  13. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    PubMed

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in Python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
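
    The hierarchical structure described above can be written as a covariance that sums a shared gene-level kernel with a replicate-specific kernel, which is what allows replicates to be irregularly and unequally sampled. A minimal numpy sketch of drawing from such a model; the kernel forms and hyperparameters are assumptions:

```python
import numpy as np

def rbf(t1, t2, var, ls):
    d = t1[:, None] - t2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(0)
# Two replicates with different, irregular sampling times.
t = np.concatenate([np.sort(rng.uniform(0, 10, 8)), np.sort(rng.uniform(0, 10, 5))])
rep = np.array([0] * 8 + [1] * 5)

K = rbf(t, t, var=1.0, ls=2.0)                                    # shared gene profile
K += rbf(t, t, var=0.3, ls=1.0) * (rep[:, None] == rep[None, :])  # replicate deviation
K += 0.05 * np.eye(len(t))                                        # observation noise

y = np.linalg.cholesky(K) @ rng.normal(size=len(t))               # one draw from the model
print(y.round(2))
```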

  14. Enhanced conformational sampling of nucleic acids by a new Hamiltonian replica exchange molecular dynamics approach.

    PubMed

    Curuksu, Jeremy; Zacharias, Martin

    2009-03-14

    Although molecular dynamics (MD) simulations have been applied frequently to study flexible molecules, the sampling of conformational states separated by barriers is limited due to currently possible simulation time scales. Replica-exchange (Rex)MD simulations that allow for exchanges between simulations performed at different temperatures (T-RexMD) can achieve improved conformational sampling. However, in the case of T-RexMD the computational demand grows rapidly with system size. A Hamiltonian RexMD method that specifically enhances coupled dihedral angle transitions has been developed. The method employs added biasing potentials as replica parameters that destabilize available dihedral substates and was applied to study coupled dihedral transitions in nucleic acid molecules. The biasing potentials can be either fixed at the beginning of the simulation or optimized during an equilibration phase. The method was extensively tested and compared to conventional MD simulations and T-RexMD simulations on an adenine dinucleotide system and on a DNA abasic site. The biasing potential RexMD method showed improved sampling of conformational substates compared to conventional MD simulations similar to T-RexMD simulations but at a fraction of the computational demand. It is well suited to study systematically the fine structure and dynamics of large nucleic acids under realistic conditions including explicit solvent and ions and can be easily extended to other types of molecules.
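
    At the heart of any Hamiltonian replica exchange scheme, including the biasing-potential variant described here, is a Metropolis test for swapping configurations between replicas that share a temperature but differ in potential. A minimal sketch; the energies are placeholders:

```python
import math
import random

def hrex_swap_accept(beta, u_i_xi, u_i_xj, u_j_xi, u_j_xj):
    """Accept swapping configurations x_i, x_j between replicas i, j whose
    potentials U_i, U_j differ only by their biasing terms; beta = 1/kT."""
    delta = beta * ((u_i_xj + u_j_xi) - (u_i_xi + u_j_xj))
    return delta <= 0.0 or random.random() < math.exp(-delta)

# The swap lowers the combined energy here, so it is always accepted:
print(hrex_swap_accept(beta=1.0, u_i_xi=5.0, u_i_xj=4.0, u_j_xi=3.0, u_j_xj=6.0))
```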

  15. Investigation of a systematic offset in the measurement of organic carbon with a semicontinuous analyzer.

    PubMed

    Offenberg, John H; Lewandowski, Michael; Edney, Edward O; Kleindienst, Tadeusz E; Jaoui, Mohammed

    2007-05-01

    Organic carbon (OC) was measured semicontinuously in laboratory experiments of steady-state secondary organic aerosol formed by hydrocarbon + nitrogen oxide irradiations. Examination of the mass of carbon measured on the filter for various sample volumes reveals a systematic offset that is not observed when performing an instrumental blank. These findings suggest that simple subtraction of instrumental blanks determined as the standard analysis without sample collection (i.e., by cycling the pump and valves yet filtering zero liters of air followed by routine chemical analysis) from measured concentrations may be inadequate. This may be especially true for samples collected through the filtration of small air volumes wherein the influence of the systematic offset is greatest. All of the experiments show that filtering a larger volume of air minimizes the influence of contributions from the systematic offset. Application of these results to measurements of ambient concentrations of carbonaceous aerosol suggests a need for collection of sufficient carbon mass to minimize the relative influence of the offset signal.
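
    The offset the authors describe can be seen by regressing collected carbon mass on sampled air volume: the slope estimates the true concentration, while a nonzero intercept is the volume-independent offset that a zero-volume blank misses. A toy sketch with synthetic data:

```python
import numpy as np

# Synthetic data: true OC concentration 0.02 ug/L plus a constant 0.5 ug offset.
volume = np.array([50.0, 100.0, 200.0, 400.0, 800.0])            # liters sampled
mass = 0.02 * volume + 0.5 + np.random.default_rng(3).normal(0, 0.02, 5)

slope, intercept = np.polyfit(volume, mass, 1)
print(f"concentration ~ {slope:.3f} ug/L, systematic offset ~ {intercept:.2f} ug")
# The offset's relative influence, intercept / (slope * V), shrinks as V grows,
# matching the paper's advice to filter larger air volumes.
```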

  16. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are among the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, in such literature, there is a lack of articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
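
    As the simplest member of the calibration-strategy family the review surveys, an external calibration fits signal against standards of known content and inverts the fit for an unknown; the numbers below are toy values, and real (bio)imaging work needs matrix-matched standards:

```python
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])       # standard concentrations (ug/g)
signal = np.array([0.8, 3.1, 5.2, 11.9, 23.0])    # instrument response (toy data)

slope, intercept = np.polyfit(conc, signal, 1)    # linear calibration function
unknown_signal = 8.4
print(f"{(unknown_signal - intercept) / slope:.1f} ug/g")   # ~3.4 ug/g
```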

  17. Optical characterization of porcine articular cartilage using a polarimetry technique with differential Mueller matrix formulism.

    PubMed

    Chang, Ching-Min; Lo, Yu-Lung; Tran, Nghia-Khanh; Chang, Yu-Jen

    2018-03-20

    A method is proposed for characterizing the optical properties of articular cartilage sliced from a pig's thighbone using a Stokes-Mueller polarimetry technique. The principal axis angle, phase retardance, optical rotation angle, circular diattenuation, diattenuation axis angle, linear diattenuation, and depolarization index properties of the cartilage sample are all decoupled in the proposed analytical model. Consequently, the accuracy and robustness of the extracted results are improved. The glucose concentration, collagen distribution, and scattering properties of samples from various depths of the articular cartilage are systematically explored via an inspection of the related parameters. The results show that the glucose concentration and scattering effect are both enhanced in the superficial region of the cartilage. By contrast, the collagen density increases with an increasing sample depth.

  18. Thallium Bromide Deposited Using Spray Coating

    NASA Astrophysics Data System (ADS)

    Ferreira, E. S.; Mulato, M.

    2012-08-01

    Spray coating was used to produce thallium bromide samples on glass substrates. The influence of several fabrication parameters on the final structural properties of the samples was investigated. Substrate position, substrate temperature, solution concentration, carrying gas, and solution flow were varied systematically, and the physical deposition mechanism involved in each case is discussed. A total deposition time of about 3.5 h can lead to 62-μm-thick films comprising completely packed micrometer-sized crystalline grains. X-ray diffraction and scanning electron microscopy were used to characterize the samples. On the basis of the experimental data, the optimum fabrication conditions were identified. The technique offers an alternative method for fast, cheap fabrication of large-area devices for the detection of high-energy radiation, i.e., X-rays and γ-rays, in medical imaging.

  19. Energy landscapes and properties of biomolecules.

    PubMed

    Wales, David J

    2005-11-09

    Thermodynamic and dynamic properties of biomolecules can be calculated using a coarse-grained approach based upon sampling stationary points of the underlying potential energy surface. The superposition approximation provides an overall partition function as a sum of contributions from the local minima, and hence functions such as internal energy, entropy, free energy and the heat capacity. To obtain rates we must also sample transition states that link the local minima, and the discrete path sampling method provides a systematic means to achieve this goal. A coarse-grained picture is also helpful in locating the global minimum using the basin-hopping approach. Here we can exploit a fictitious dynamics between the basins of attraction of local minima, since the objective is to find the lowest minimum, rather than to reproduce the thermodynamics or dynamics.
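
    Basin-hopping, mentioned above for global minimization, is available off the shelf; a toy run on a one-dimensional rugged "landscape" (the function is invented for illustration):

```python
import numpy as np
from scipy.optimize import basinhopping

def energy(x):
    # Many local minima from the sine term, one global minimum near x ~ -0.5.
    return 0.1 * x[0] ** 2 + np.sin(3.0 * x[0]) + 1.0

result = basinhopping(energy, x0=[4.0], niter=200, seed=1)
print(result.x.round(3), round(float(result.fun), 3))
```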

  20. Harman Measurements for Thermoelectric Materials and Modules under Non-Adiabatic Conditions

    NASA Astrophysics Data System (ADS)

    Roh, Im-Jun; Lee, Yun Goo; Kang, Min-Su; Lee, Jae-Uk; Baek, Seung-Hyub; Kim, Seong Keun; Ju, Byeong-Kwon; Hyun, Dow-Bin; Kim, Jin-Sang; Kwon, Beomjin

    2016-12-01

    Accuracy of the Harman measurement largely depends on the heat transfer between the sample and its surroundings, so-called parasitic thermal effects (PTEs). Similar to the material evaluations, measuring thermoelectric modules (TEMs) is also affected by the PTEs especially when measuring under atmospheric condition. Here, we study the correction methods for the Harman measurements with systematically varied samples (both bulk materials and TEMs) at various conditions. Among several PTEs, the heat transfer via electric wires is critical. Thus, we estimate the thermal conductance of the electric wires, and correct the measured properties for a certain sample shape and measuring temperature. The PTEs are responsible for the underestimation of the TEM properties especially under atmospheric conditions (10-35%). This study will be useful to accurately characterize the thermoelectric properties of materials and modules.
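
    Under ideal adiabatic conditions the Harman method reads the figure of merit directly off the voltage partition, ZT = V_total/V_R - 1; the parasitic heat losses discussed above make the apparent ZT smaller, which is what the corrections target. A sketch with illustrative voltages:

```python
def harman_zt(v_total, v_resistive):
    """Ideal (adiabatic) Harman relation: ZT = V_total / V_R - 1, where V_total
    is the steady DC voltage and V_R its purely resistive part. Parasitic
    thermal effects reduce the measured Seebeck contribution, so uncorrected
    data underestimate ZT."""
    return v_total / v_resistive - 1.0

print(round(harman_zt(1.85e-3, 1.0e-3), 2))   # hypothetical voltages -> ZT = 0.85
```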

  1. Anthelmintic Intake on the Nutritional Status, Hemoglobin Content, and Learning Achievement of the Elementary School Student in Sukarami Palembang

    NASA Astrophysics Data System (ADS)

    Hartati; Aryanti, S.; Muherman, S. Y.

    2017-03-01

    The main purpose of this study was to determine the effect of a once-yearly dose of 400 mg albendazole on the nutritional status and learning achievement of elementary school students in Sukarami, Palembang. The study used a quasi-experimental, non-equivalent control group pretest-posttest design and was conducted in Palembang, South Sumatera, for one year. The sample comprised 1,914 students selected by systematic stratified random sampling and divided into two groups: 986 students in the treatment group were given 400 mg albendazole, and 928 students in the control group were given a placebo. The study found a decrease in the prevalence of worm infection in both the treatment and control groups; however, the decrease in the number of infected students was greater in the treatment group. The implication of this research is that albendazole is regarded as the most effective drug for treating intestinal worm infections.

  2. Validation of a sampling plan to generate food composition data.

    PubMed

    Sammán, N C; Gimenez, M A; Bassett, N; Lobo, M O; Marcoleri, M E

    2016-02-15

    A methodology to develop systematic plans for food sampling was proposed. Long-life whole and skimmed milk and sunflower oil were selected to validate the methodology in Argentina. The fatty acid profile of all foods, proximal composition, and calcium content of the milks were determined with AOAC methods. The number of samples (n) was calculated by applying Cochran's formula with coefficients of variation ⩽12% and a maximum permissible estimation error (r) ⩽5% for the calcium content in milks and the unsaturated fatty acids in oil. The resulting n was 9, 11, and 21 for long-life whole milk, skimmed milk, and sunflower oil, respectively. Sample units were randomly collected from production sites and sent to labs. The r calculated from the experimental data was ⩽10%, indicating high accuracy in determining the content of the most variable analytes and confirming the reliability of the proposed sampling plan. The methodology is an adequate and useful tool for developing sampling plans for food composition analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
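
    One common form of Cochran's sample-size formula for a relative error bound is n = (z * CV / r)^2; a sketch with the abstract's limiting inputs (the study's exact per-food inputs differ):

```python
import math

def cochran_n(cv_percent, rel_error_percent, z=1.96):
    """Sample size for estimating a mean with relative error r at ~95%
    confidence, given a coefficient of variation CV (both in percent)."""
    return math.ceil((z * cv_percent / rel_error_percent) ** 2)

print(cochran_n(12, 5))   # CV = 12%, r = 5%  ->  23 sample units
```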

  3. Geostatistical modeling of riparian forest microclimate and its implications for sampling

    USGS Publications Warehouse

    Eskelson, B.N.I.; Anderson, P.D.; Hagar, J.C.; Temesgen, H.

    2011-01-01

    Predictive models of microclimate under various site conditions in forested headwater stream - riparian areas are poorly developed, and sampling designs for characterizing underlying riparian microclimate gradients are sparse. We used riparian microclimate data collected at eight headwater streams in the Oregon Coast Range to compare ordinary kriging (OK), universal kriging (UK), and kriging with external drift (KED) for point prediction of mean maximum air temperature (Tair). Several topographic and forest structure characteristics were considered as site-specific parameters. Height above stream and distance to stream were the most important covariates in the KED models, which outperformed OK and UK in terms of root mean square error. Sample patterns were optimized based on the kriging variance and the weighted means of shortest distance criterion using the simulated annealing algorithm. The optimized sample patterns outperformed systematic sample patterns in terms of mean kriging variance mainly for small sample sizes. These findings suggest methods for increasing efficiency of microclimate monitoring in riparian areas.

  4. Effect size measures in a two-independent-samples case with nonnormal and nonhomogeneous data.

    PubMed

    Li, Johnson Ching-Hong

    2016-12-01

    In psychological science, the "new statistics" refer to the new statistical practices that focus on effect size (ES) evaluation instead of conventional null-hypothesis significance testing (Cumming, Psychological Science, 25, 7-29, 2014). In a two-independent-samples scenario, Cohen's (1988) standardized mean difference (d) is the most popular ES, but its accuracy relies on two assumptions: normality and homogeneity of variances. Five other ESs may be robust to violations of these assumptions: the unscaled robust d (dr*; Hogarty & Kromrey, 2001), the scaled robust d (dr; Algina, Keselman, & Penfield, Psychological Methods, 10, 317-328, 2005), the point-biserial correlation (rpb; McGrath & Meyer, Psychological Methods, 11, 386-401, 2006), the common-language ES (CL; Cliff, Psychological Bulletin, 114, 494-509, 1993), and a nonparametric estimator for the CL (Aw; Ruscio, Psychological Methods, 13, 19-30, 2008); however, no study has systematically evaluated their performance. Thus, in this simulation study the performance of these six ESs was examined across five factors: data distribution, sample, base rate, variance ratio, and sample size. The results showed that Aw and dr were generally robust to these violations, and Aw slightly outperformed dr. Implications for the use of Aw and dr in real-world research are discussed.
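
    Two of the winning estimators are simple to compute; a minimal sketch of Cohen's d and the nonparametric Aw (a probability-of-superiority estimate) on skewed toy data:

```python
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference with the pooled SD (Cohen, 1988)."""
    nx, ny = len(x), len(y)
    sp = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                 / (nx + ny - 2))
    return (np.mean(x) - np.mean(y)) / sp

def a_w(x, y):
    """Nonparametric common-language effect size: P(X > Y) + 0.5 * P(X = Y),
    estimated over all cross-group pairs (in the spirit of Ruscio, 2008)."""
    x, y = np.asarray(x), np.asarray(y)
    return ((x[:, None] > y[None, :]).mean()
            + 0.5 * (x[:, None] == y[None, :]).mean())

rng = np.random.default_rng(4)
x = rng.lognormal(0.4, 0.8, 40)   # skewed, heteroscedastic toy samples
y = rng.lognormal(0.0, 0.4, 30)
print(f"d = {cohens_d(x, y):.2f}, Aw = {a_w(x, y):.2f}")
```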

  5. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    PubMed

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. This work represents an innovative way to detect the nucleotide composition of DNA strands without the need for attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
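
    The self-referencing idea (one phosphate backbone per nucleotide) makes the base-to-backbone intensity ratio invariant to the wildly varying SERS enhancement from spot to spot. A toy model of why that works; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n_spots = 200
gain = rng.lognormal(0.0, 1.0, n_spots)   # spot-to-spot SERS enhancement
true_fraction = 0.4                       # e.g. cytosine fraction in the strand

# Both bands in a spectrum scale with the same unknown gain, so the ratio
# cancels it; small multiplicative noise stands in for measurement error.
I_base = gain * true_fraction * (1 + rng.normal(0, 0.05, n_spots))
I_backbone = gain * (1 + rng.normal(0, 0.05, n_spots))

print(np.mean(I_base / I_backbone).round(3))   # recovers ~0.40 despite the gain spread
```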

  6. Transcutaneous bilirubinometry reduces the need for blood sampling in neonates with visible jaundice.

    PubMed

    Mishra, S; Chawla, D; Agarwal, R; Deorari, A K; Paul, V K; Bhutani, V K

    2009-12-01

    We determined the usefulness of transcutaneous bilirubinometry to decrease the need for blood sampling to assay serum total bilirubin (STB) in the management of jaundiced healthy Indian neonates. Newborns of ≥35 weeks' gestation with clinical evidence of jaundice were enrolled in an institutionally approved randomized clinical trial. The severity of hyperbilirubinaemia was determined by two non-invasive methods: i) protocol-based visual assessment of bilirubin (VaB) and ii) transcutaneous bilirubin (TcB) determination (BiliCheck). By random allocation, either method was used to decide the need for blood sampling, which was defined to be present if the STB assessed by the allocated method exceeded 80% of the hour-specific threshold values for phototherapy (2004 AAP Guidelines). A total of 617 neonates were randomized to either the TcB (n = 314) or VaB (n = 303) group, with comparable gestation, birth weight and postnatal age. The need for blood sampling to assay STB was 34% lower (95% CI: 10% to 51%) in the TcB group compared with the VaB group (17.5% vs 26.4% of assessments; risk difference: -8.9%, 95% CI: -2.4% to -15.4%; p = 0.008). Routine use of transcutaneous bilirubinometry, compared with systematic visual assessment of bilirubin, significantly reduced the need for blood sampling to assay STB in jaundiced term and late-preterm neonates. (ClinicalTrials.gov number, NCT00653874).
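
    The reported risk difference and its confidence interval follow from the two proportions and group sizes; the sketch below reproduces the abstract's -8.9% (95% CI -15.4% to -2.4%) under the assumption that the denominators are the randomized group sizes:

```python
import math

p1, n1 = 0.175, 314    # TcB group: proportion needing a blood sample
p2, n2 = 0.264, 303    # VaB group

rd = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = rd - 1.96 * se, rd + 1.96 * se
print(f"risk difference = {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```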

  7. Importance of tissue preparation methods in FTIR micro-spectroscopical analysis of biological tissues: 'traps for new users'.

    PubMed

    Zohdi, Vladislava; Whelan, Donna R; Wood, Bayden R; Pearson, James T; Bambery, Keith R; Black, M Jane

    2015-01-01

    Fourier Transform Infrared (FTIR) micro-spectroscopy is an emerging technique for the biochemical analysis of tissues and cellular materials. It provides objective information on the holistic biochemistry of a cell or tissue sample and has been applied in many areas of medical research. However, it has become apparent that how the tissue is handled prior to FTIR micro-spectroscopic imaging requires special consideration, particularly with regard to methods for preservation of the samples. We have performed FTIR micro-spectroscopy on rodent heart and liver tissue sections (two spectroscopically very different biological tissues) that were prepared by desiccation drying, ethanol substitution and formalin fixation and have compared the resulting spectra with those of fully hydrated, freshly excised tissues. We have systematically examined the spectra for any biochemical changes to the native state of the tissue caused by the three methods of preparation and have detected changes in infrared (IR) absorption band intensities and peak positions. In particular, the position and profile of the amide I band, key to assigning protein secondary structure, change depending on the preparation method, and the lipid absorptions lose intensity drastically when these tissues are hydrated with ethanol. Indeed, we demonstrate that preserving samples through desiccation drying, ethanol substitution or formalin fixation significantly alters the biochemical information detected using spectroscopic methods when compared to spectra of fresh hydrated tissue. It is therefore imperative to consider tissue preparative effects when preparing, measuring, and analyzing samples using FTIR spectroscopy.

  8. The Epidemiology of Substance Use Disorders in US Veterans: A Systematic Review and Analysis of Assessment Methods

    PubMed Central

    Lan, Chiao-Wen; Fiellin, David A.; Barry, Declan T.; Bryant, Kendall J.; Gordon, Adam J.; Edelman, E. Jennifer; Gaither, Julie R.; Maisto, Stephen A.; Marshall, Brandon D.L.

    2016-01-01

    Background Substance use disorders (SUDs), which encompass alcohol and drug use disorders (AUDs, DUDs), constitute a major public health challenge among US veterans. SUDs are among the most common and costly of all health conditions among veterans. Objectives This study sought to examine the epidemiology of SUDs among US veterans, compare the prevalence of SUDs in studies using diagnostic and administrative criteria assessment methods, and summarize trends in the prevalence of SUDs reported in studies sampling US veterans over time. Methods Comprehensive electronic database searches were conducted. A total of 3,490 studies were identified. We analyzed studies that sampled US veterans and reported the prevalence and distribution of AUDs and DUDs. Results Of the studies identified, 72 met inclusion criteria. The studies were published between 1995 and 2013. Studies using diagnostic criteria reported higher prevalence of AUDs (32% vs. 10%) and DUDs (20% vs. 5%) than administrative criteria, respectively. Regardless of assessment method, both the lifetime and past-year prevalence of AUDs in studies sampling US veterans has declined gradually over time. Conclusion The prevalence of SUDs reported in studies sampling US veterans is affected by assessment method. Given the significant public health problems of SUDs among US veterans, improved guidelines for clinical screening using validated diagnostic criteria to assess AUDs and DUDs in US veteran populations are needed. Scientific Significance These findings may inform VA and other healthcare systems in prevention, diagnosis, and intervention for SUDs among US veterans. PMID:26693830

  9. Got Power? A Systematic Review of Sample Size Adequacy in Health Professions Education Research

    ERIC Educational Resources Information Center

    Cook, David A.; Hatala, Rose

    2015-01-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011,…

  10. Forest resources of southeast Alaska, 2000: results of a single-phase systematic sample.

    Treesearch

    Willem W.S. van Hees

    2003-01-01

    A baseline assessment of forest resources in southeast Alaska was made by using a single-phase, unstratified, systematic-grid sample, with ground plots established at each grid intersection. Ratio-of-means estimators were used to develop population estimates. Forests cover an estimated 48 percent of the 22.9-million-acre southeast Alaska inventory unit. Dominant forest...

  11. Methodological and ethical challenges in studying patients’ perceptions of coercion: a systematic mixed studies review

    PubMed Central

    2014-01-01

    Background Despite improvements in psychiatric inpatient care, patient restrictions in psychiatric hospitals are still in use. Studying perceptions among patients who have been secluded or physically restrained during their hospital stay is challenging. We sought to review the methodological and ethical challenges in qualitative and quantitative studies aiming to describe patients’ perceptions of coercive measures, especially seclusion and physical restraint, during their hospital stay. Methods A systematic mixed studies review was the study method. Studies reporting patients’ perceptions of coercive measures, especially seclusion and physical restraint, during hospital stay were included. Methodological issues such as study design, data collection and recruitment process, participants, sampling, patient refusal or non-participation, and ethical issues such as the informed consent process and ethical approval were synthesized systematically. Electronic searches of CINAHL, MEDLINE, PsycINFO and The Cochrane Library (1976-2012) were carried out. Results Out of 846 initial citations, 32 studies were included, 14 qualitative and 18 quantitative. A variety of methodological approaches were used, although descriptive and explorative designs were used in most cases. Data were mainly collected by interviews in the qualitative studies (n = 13) or by self-report questionnaires in the quantitative studies (n = 12). The recruitment process was explained in 59% (n = 19) of the studies. In most cases convenience sampling was used, yet five studies used randomization. Patients’ refusal or non-participation was reported in 37% (n = 11) of studies. Of all studies, 56% (n = 18) reported having undergone an ethical review process by an official board or committee. Respondents were informed and consent was requested in 69% of studies (n = 22). Conclusions The use of different study designs made comparison methodologically challenging. The timing of data collection (considering bias and confounding factors) and the reasons for non-participation of eligible participants are likewise methodological challenges; the recommended flow charts could aid reporting of this information. Other challenges identified were the recruitment of large and representative samples. Ethical challenges included requesting participants’ informed consent and respecting ethical procedures. PMID:24894162

  12. Comparison of ASE and SFE with Soxhlet, Sonication, and Methanolic Saponification Extractions for the Determination of Organic Micropollutants in Marine Particulate Matter.

    PubMed

    Heemken, O P; Theobald, N; Wenclawiak, B W

    1997-06-01

    The methods of accelerated solvent extraction (ASE) and supercritical fluid extraction (SFE) of polycyclic aromatic hydrocarbons (PAHs), aliphatic hydrocarbons, and chlorinated hydrocarbons from marine samples were investigated. The results of extractions of a certified sediment and four samples of suspended particulate matter (SPM) were compared to classical Soxhlet (SOX), ultrasonication (USE), and methanolic saponification extraction (MSE) methods. The recovery data, including precision and systematic deviations of each method, were evaluated statistically. It was found that recoveries and precision of ASE and SFE compared well with the other methods investigated. Using SFE, the average recoveries of PAHs in three different samples ranged from 96 to 105%; for ASE, the recoveries were in the range of 97-108% compared to the reference methods. Compared to the certified values of sediment HS-6, the average recoveries of SFE and ASE were 87 and 88%, with most compounds within the confidence limits. Also, for alkanes the average recoveries by SFE and ASE were equal to the results obtained by SOX, USE, and MSE. In the case of SFE, the recoveries were in the range 93-115%, and ASE achieved recoveries of 94-107% as compared to the other methods. For ASE and SFE, the influence of water on the extraction efficiency was examined. While the natural water content of the SPM sample (56 wt %) led to insufficient recoveries in ASE and SFE, quantitative extractions were achieved in SFE after addition of anhydrous sodium sulfate to the sample. Finally, ASE was applied to SPM-loaded filter candles, whereby a mixture of n-hexane/acetone as extraction solvent allowed the simultaneous determination of PAHs, alkanes, and chlorinated hydrocarbons.

  13. Systematic forensic toxicological analysis by liquid-chromatography-quadrupole-time-of-flight mass spectrometry in serum and comparison to gas chromatography-mass spectrometry.

    PubMed

    Grapp, Marcel; Kaufmann, Christoph; Streit, Frank; Binder, Lutz

    2018-06-01

    Comprehensive screening procedures for psychoactive agents in body fluids are an essential task in clinical and forensic toxicology. With the continuous emergence and adaptation of new psychoactive substances (NPS), keeping a screening method up to date is challenging. To meet these demands, hyphenated high-resolution mass spectrometry has gained interest as an extensive and expandable screening approach. Here we present a comprehensive method for systematic toxicological analysis of serum by liquid chromatography-quadrupole-time-of-flight mass spectrometry (LC-QTOF-MS) with data-independent acquisition. The potential of this method was demonstrated by analysis of 247 authentic serum samples and 12 post-mortem femoral blood samples, in which 950 compounds, comprising 185 different drugs and metabolites, could be identified. For the detected substances, including pharmaceutical substances, illicit drugs as well as NPS, serum concentrations were confirmed ranging from traces to toxic values, indicating the method's suitability for forensic toxicological requirements. Positive identification of drugs was achieved by accurate mass measurement (±5 ppm for [M+H]+; ±10 ppm for [M−H]−), retention time (±0.35 min), isotopic pattern match (less than 10 m/z RMS [ppm]), isotope match intensity (less than 20% RMS) and the presence of at least two fragment ions. The LC-QTOF-MS procedure was shown to be superior to serum screening by GC-MS, since 2.4 times as many drugs (335 versus 141) were identified in the serum samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Economic Evaluation alongside Multinational Studies: A Systematic Review of Empirical Studies

    PubMed Central

    Oppong, Raymond; Jowett, Sue; Roberts, Tracy E.

    2015-01-01

    Purpose of the study This study explores methods for conducting economic evaluations alongside multinational trials through a systematic review of the methods used in practice and the challenges typically faced by the researchers who conducted the economic evaluations. Methods A review was conducted for the period 2002 to 2012, with potentially relevant articles identified by searching the Medline, Embase and NHS EED databases. Studies were included if they were full economic evaluations conducted alongside a multinational trial. Results A total of 44 studies out of a possible 2667 met the inclusion criteria. Methods used for the analyses varied between studies, indicating a lack of consensus on how economic evaluation alongside multinational studies should be carried out. The most common challenge appeared to be related to addressing differences between countries, which potentially hinders the generalisability and transferability of results. Other challenges reported included inadequate sample sizes and choosing cost-effectiveness thresholds. Conclusions It is recommended that additional guidelines be developed to aid researchers in this area and that these be based on an understanding of the challenges associated with multinational trials and the strengths and limitations of alternative approaches. Guidelines should focus on ensuring that results will aid decision makers in their individual countries. PMID:26121465

  15. Review of teaching methods and critical thinking skills.

    PubMed

    Kowalczyk, Nina

    2011-01-01

    Critical information is needed to inform radiation science educators regarding successful critical thinking educational strategies. From an evidence-based research perspective, systematic reviews are identified as the most current and highest level of evidence, and analysis at this level is crucial in identifying the teaching methods most appropriate to the development of critical thinking skills. The purpose of this study was to conduct a systematic literature review to identify teaching methods that demonstrate a positive effect on the development of students' critical thinking skills and to identify how these teaching strategies can best translate to radiologic science educational programs. A comprehensive literature search was conducted, resulting in an assessment of 59 full reports. Nineteen of the 59 reports met inclusion criteria and were reviewed based on the level of evidence presented. Inclusion criteria included studies conducted in the past 10 years with sample sizes of 20 or more individuals that demonstrated use of specific teaching interventions for 5 to 36 months in postsecondary health-related educational programs. The majority of the research focused on problem-based learning (PBL) requiring standardized small-group activities. Six of the 19 studies focused on PBL and demonstrated significant differences in student critical thinking scores. PBL, as described in the nursing literature, is an effective teaching method that should be used in radiation science education. ©2011 by the American Society of Radiologic Technologists.

  16. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for the standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for the generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include the determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point counting, the determination of the extent of tissue shrinkage related to histological embedding of samples, and the generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.

  17. Audit of lymphadenectomy in lung cancer resections using a specimen collection kit and checklist

    PubMed Central

    Osarogiagbon, Raymond U.; Sareen, Srishti; Eke, Ransome; Yu, Xinhua; McHugh, Laura M.; Kernstine, Kemp H.; Putnam, Joe B.; Robbins, Edward T.

    2014-01-01

    Background Audits of operative summaries and pathology reports reveal wide discordance in identifying the extent of lymphadenectomy performed (the communication gap). We tested the ability of a pre-labeled lymph node specimen collection kit and checklist to narrow the communication gap between operating surgeons, pathologists, and auditors of surgeons’ operation notes. Methods We conducted a prospective single-cohort study of lung cancer resections performed with a lymph node collection kit from November 2010 to January 2013. We used the kappa statistic to compare surgeon claims on a checklist of lymph node stations harvested intraoperatively with pathology reports and with an independent audit of surgeons’ operative summaries. Lymph node collection procedures were classified into 4 groups based on the anatomic origin of resected lymph nodes: mediastinal lymph node dissection, systematic sampling, random sampling, and no sampling. Results From the pathology report, 73% of 160 resections had a mediastinal lymph node dissection or systematic sampling procedure, and 27% had random sampling. The concordance with surgeon claims was 80% (kappa statistic 0.69 [CI 0.60 – 0.79]). Concordance between independent audits of the operation notes and either the pathology report (kappa 0.14 [0.04 – 0.23]) or surgeon claims (kappa 0.09 [0.03 – 0.22]) was poor. Conclusion A pre-labeled specimen collection kit and checklist significantly narrowed the communication gap between surgeons and pathologists in identifying the extent of lymphadenectomy. Audit of surgeons’ operation notes did not accurately reflect the procedure performed, bringing its value for quality improvement work into question. PMID:25530090
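
    Since the kappa statistic carries this study's comparisons, a minimal sketch of Cohen's kappa may help: it rescales observed agreement by the agreement expected from the raters' marginal label frequencies. The labels below are hypothetical stand-ins for the four collection classes, not data from the study.

    # Minimal sketch: Cohen's kappa for two equal-length label sequences
    # (e.g., surgeon checklist vs. pathology report). Toy labels only.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Kappa = (observed - chance agreement) / (1 - chance agreement)."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement from the marginal frequency of each label.
        chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
        return (observed - chance) / (1 - chance)

    surgeon = ["MLND", "systematic", "random", "MLND", "systematic"]
    pathology = ["MLND", "systematic", "random", "systematic", "systematic"]
    print(round(cohens_kappa(surgeon, pathology), 2))  # 0.69 for these toy labels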

  18. Tracking Matrix Effects in the Analysis of DNA Adducts of Polycyclic Aromatic Hydrocarbons

    PubMed Central

    Klaene, Joshua J.; Flarakos, Caroline; Glick, James; Barret, Jennifer T.; Zarbl, Helmut; Vouros, Paul

    2015-01-01

    LC-MS using electrospray ionization is currently the method of choice in bio-organic analysis, covering a wide range of applications in a broad spectrum of biological media. The technique is noted for its high sensitivity, but one major limitation which hinders achievement of its optimal sensitivity is signal suppression due to matrix interferences introduced by the presence of co-extracted compounds during the sample preparation procedure. The analysis of DNA adducts of common environmental carcinogens is particularly sensitive to such matrix effects, as sample preparation is a multistep process which involves “contamination” of the sample due to the addition of enzymes and other reagents for digestion of the DNA in order to isolate the analyte(s). This problem is further exacerbated by the need to reach low levels of quantitation (LOQ at the ppb level) while also working with limited (2-5 μg) quantities of sample. We report here on the systematic investigation of ion signal suppression contributed by each individual step involved in the sample preparation associated with the analysis of DNA adducts of polycyclic aromatic hydrocarbons (PAHs), using dG-BaP, the deoxyguanosine adduct of benzo[a]pyrene (BaP), as the model analyte. The individual matrix contribution of each one of these sources to analyte signal was systematically addressed, as were any interactive effects. The information was used to develop a validated analytical protocol for the target biomarker at levels typically encountered in vivo using as little as 2 μg of DNA, and applied to a dose-response study using a metabolically competent cell line. PMID:26607319

  19. The Universal Multizone Crystallizator (UMC) Furnace: An International Cooperative Agreement

    NASA Technical Reports Server (NTRS)

    Watring, D. A.; Su, C.-H.; Gillies, D.; Roosz, T.; Babcsan, N.

    1996-01-01

    The Universal Multizone Crystallizator (UMC) is a special apparatus for crystal growth under terrestrial and microgravity conditions. The use of twenty-five zones allows the UMC to be used for several normal freezing growth techniques. The thermal profile is electronically translated along the stationary sample by systematically reducing the power to the control zones. Elimination of mechanical translation devices increases the system's reliability while simultaneously reducing its size and weight. This paper addresses the UMC furnace design, the sample cartridge, and the typical thermal profiles and corresponding power requirements necessary for the dynamic gradient freeze crystal growth technique. Results from physical vapor transport and traveling heater method crystal growth experiments are also discussed.

  20. Directional view interpolation for compensation of sparse angular sampling in cone-beam CT.

    PubMed

    Bertram, Matthias; Wiegert, Jens; Schafer, Dirk; Aach, Til; Rose, Georg

    2009-07-01

    In flat detector cone-beam computed tomography and related applications, sparse angular sampling frequently leads to characteristic streak artifacts. To overcome this problem, it has been suggested to generate additional views by means of interpolation. The practicality of this approach is investigated in combination with a dedicated method for angular interpolation of 3-D sinogram data. For this purpose, a novel shape-driven directional interpolation algorithm based on a structure tensor approach is developed. Quantitative evaluation shows that this method clearly outperforms conventional scene-based interpolation schemes. Furthermore, the image quality trade-offs associated with the use of interpolated intermediate views are systematically evaluated for simulated and clinical cone-beam computed tomography data sets of the human head. It is found that utilization of directionally interpolated views significantly reduces streak artifacts and noise, at the expense of a small amount of introduced image blur.

  1. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    PubMed

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  2. Pharmacists' knowledge and the difficulty of obtaining emergency contraception.

    PubMed

    Bennett, Wendy; Petraitis, Carol; D'Anella, Alicia; Marcella, Stephen

    2003-10-01

    This cross-sectional study was performed to examine knowledge and attitudes among pharmacists about emergency contraception (EC) and to determine the factors associated with their provision of EC. A systematic random sampling method was used to obtain a sample (N = 320) of pharmacies in Pennsylvania. A "mystery shopper" telephone survey method was used. Only 35% of pharmacists stated that they would be able to fill a prescription for EC that day. Also, many community pharmacists do not have sufficient or accurate information about EC. In a logistic regression model, pharmacists' lack of information was associated with the low proportion of pharmacists able to dispense EC. In conclusion, access to EC from community pharmacists in Pennsylvania is severely limited. Interventions to improve timely access to EC include increased education for pharmacists, as well as increased community demand for these products as an incentive for pharmacists to stock them.
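
    The design named here, systematic random sampling, picks a random start within the first sampling interval and then takes every k-th unit from the ordered frame. A minimal sketch, assuming a hypothetical frame of pharmacy identifiers (the frame size is invented; only the sample size of 320 comes from the study):

    # Systematic random sampling: random start, then every k-th unit.
    import random

    def systematic_sample(frame, n):
        k = len(frame) // n            # sampling interval
        start = random.randrange(k)    # random start in the first interval
        return [frame[start + i * k] for i in range(n)]

    pharmacies = [f"pharmacy_{i:04d}" for i in range(3200)]  # hypothetical frame
    sample = systematic_sample(pharmacies, 320)
    print(len(sample), sample[:3])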

  3. A Comparative Study of Sample Preparation for Staining and Immunodetection of Plant Cell Walls by Light Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verhertbruggen, Yves; Walker, Jesse L.; Guillon, Fabienne

    Staining and immunodetection by light microscopy are methods widely used to investigate plant cell walls. The two techniques have been crucial to study the cell wall architecture in planta, its deconstruction by chemicals or cell wall-degrading enzymes. They have been instrumental in detecting the presence of cell types, in deciphering plant cell wall evolution and in characterizing plant mutants and transformants. The success of immunolabeling relies on how plant materials are embedded and sectioned. Agarose coating, wax and resin embedding are, respectively, associated with vibratome, microtome and ultramicrotome sectioning. Here, we have systematically carried out a comparative analysis of these three methods of sample preparation when they are applied for cell wall staining and cell wall immunomicroscopy. In order to help the plant community in understanding and selecting adequate methods of embedding and sectioning for cell wall immunodetection, we review in this article the advantages and limitations of these three methods. Moreover, we offer detailed protocols of embedding for studying plant materials through microscopy.

  4. A Comparative Study of Sample Preparation for Staining and Immunodetection of Plant Cell Walls by Light Microscopy

    DOE PAGES

    Verhertbruggen, Yves; Walker, Jesse L.; Guillon, Fabienne; ...

    2017-08-29

    Staining and immunodetection by light microscopy are methods widely used to investigate plant cell walls. The two techniques have been crucial to study the cell wall architecture in planta, its deconstruction by chemicals or cell wall-degrading enzymes. They have been instrumental in detecting the presence of cell types, in deciphering plant cell wall evolution and in characterizing plant mutants and transformants. The success of immunolabeling relies on how plant materials are embedded and sectioned. Agarose coating, wax and resin embedding are, respectively, associated with vibratome, microtome and ultramicrotome sectioning. Here, we have systematically carried out a comparative analysis of these three methods of sample preparation when they are applied for cell wall staining and cell wall immunomicroscopy. In order to help the plant community in understanding and selecting adequate methods of embedding and sectioning for cell wall immunodetection, we review in this article the advantages and limitations of these three methods. Moreover, we offer detailed protocols of embedding for studying plant materials through microscopy.

  5. A Comparative Study of Sample Preparation for Staining and Immunodetection of Plant Cell Walls by Light Microscopy

    PubMed Central

    Verhertbruggen, Yves; Walker, Jesse L.; Guillon, Fabienne; Scheller, Henrik V.

    2017-01-01

    Staining and immunodetection by light microscopy are methods widely used to investigate plant cell walls. The two techniques have been crucial to study the cell wall architecture in planta, its deconstruction by chemicals or cell wall-degrading enzymes. They have been instrumental in detecting the presence of cell types, in deciphering plant cell wall evolution and in characterizing plant mutants and transformants. The success of immunolabeling relies on how plant materials are embedded and sectioned. Agarose coating, wax and resin embedding are, respectively, associated with vibratome, microtome and ultramicrotome sectioning. Here, we have systematically carried out a comparative analysis of these three methods of sample preparation when they are applied for cell wall staining and cell wall immunomicroscopy. In order to help the plant community in understanding and selecting adequate methods of embedding and sectioning for cell wall immunodetection, we review in this article the advantages and limitations of these three methods. Moreover, we offer detailed protocols of embedding for studying plant materials through microscopy. PMID:28900439

  6. A Novel Application for the Cavalieri Principle: A Stereological and Methodological Study

    PubMed Central

    Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin

    2009-01-01

    Objective The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification and used to estimate tissue volumes by superimposing a point-counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. Materials and Methods In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed at 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. Results There was no statistically significant difference between the volume measurements of the two techniques (P > 0.05; paired-samples t-test). Conclusion This study demonstrates that personal computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method. PMID:25610077
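
    The estimator behind both measurement routes is the Cavalieri formula V = t · (a/p) · ΣP, where t is the distance between consecutive sections, a/p is the area represented by one grid point, and ΣP is the total number of grid points hitting the structure across all sections. A minimal sketch with invented point counts (not the study's data):

    # Cavalieri volume estimate from per-slice point counts.
    def cavalieri_volume(point_counts, section_sep_mm, area_per_point_mm2):
        """V = t * (a/p) * sum of points hitting the structure."""
        return section_sep_mm * area_per_point_mm2 * sum(point_counts)

    counts = [12, 25, 31, 28, 18, 7]   # points on brain tissue, per slice
    volume = cavalieri_volume(counts, section_sep_mm=1.0, area_per_point_mm2=4.0)
    print(f"Estimated volume: {volume:.1f} mm^3")   # 484.0 mm^3 for these counts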

  7. Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2012-01-01

    This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
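
    The variance partition at the heart of MDOE can be sketched with an ordinary least-squares fit: the total sum of squares splits into the part captured by a response model and a residual, unexplained part. The data and model below are synthetic illustrations, not the FAVOR analysis itself; in the MDOE framework the residual would be split further into random and systematic parts, for example by examining structure in the residuals over time.

    # Partition total variance into explained and unexplained components.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-4, 4, 50)                                # e.g., angle of attack
    y = 0.1 * x + 0.02 * x**2 + rng.normal(0, 0.01, x.size)   # response + noise

    X = np.column_stack([np.ones_like(x), x, x**2])           # quadratic model
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta

    ss_total = np.sum((y - y.mean()) ** 2)
    ss_resid = np.sum(residuals ** 2)          # unexplained component
    ss_model = ss_total - ss_resid             # explained component
    print(f"explained fraction: {ss_model / ss_total:.3f}")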

  8. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destroying the sample. However, to successfully adapt PAT tools to a pharmaceutical or biopharmaceutical environment, thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  9. Grizzly bear density in Glacier National Park, Montana

    USGS Publications Warehouse

    Kendall, K.C.; Stetz, J.B.; Roon, David A.; Waits, L.P.; Boulanger, J.B.; Paetkau, David

    2008-01-01

    We present the first rigorous estimate of grizzly bear (Ursus arctos) population density and distribution in and around Glacier National Park (GNP), Montana, USA. We used genetic analysis to identify individual bears from hair samples collected via 2 concurrent sampling methods: 1) systematically distributed, baited, barbed-wire hair traps and 2) unbaited bear rub trees found along trails. We used Huggins closed mixture models in Program MARK to estimate total population size and developed a method to account for heterogeneity caused by unequal access to rub trees. We corrected our estimate for lack of geographic closure using a new method that utilizes information from radiocollared bears and the distribution of bears captured with DNA sampling. Adjusted for closure, the average number of grizzly bears in our study area was 240.7 (95% CI = 202–303) in 1998 and 240.6 (95% CI = 205–304) in 2000. Average grizzly bear density was 30 bears/1,000 km², with 2.4 times more bears detected per hair trap inside than outside GNP. We provide baseline information important for managing one of the few remaining populations of grizzlies in the contiguous United States.

  10. Determination of mutagenic amines in water and food samples by high pressure liquid chromatography with amperometric detection using a multiwall carbon nanotubes-glassy carbon electrode.

    PubMed

    Bueno, Ana María; Marín, Miguel Ángel; Contento, Ana María; Ríos, Ángel

    2016-02-01

    A chromatographic method using amperometric detection for the sensitive determination of six representative mutagenic amines was developed. A glassy carbon electrode (GCE), modified with multiwall carbon nanotubes (GCE-CNTs), was prepared and its response compared to that of a conventional glassy carbon electrode. The chromatographic method (HPLC-GCE-CNTs) allowed the separation and determination of heterocyclic aromatic amines (HAAs) classified as mutagenic amines by the International Agency for Research on Cancer. The new electrode was systematically studied in terms of stability, sensitivity, and reproducibility. Statistical analysis of the obtained data demonstrated that the modified electrode provided better sensitivity than the conventional unmodified one. Detection limits were in the 3.0-7.5 ng/mL range, whereas quantification limits ranged between 9.5 and 25.0 ng/mL. The applicability of the method was demonstrated by the determination of the amines in several types of samples (water and food samples). Recoveries indicate very good agreement between amounts added and those found for all HAAs (recoveries in the 92-105% range). Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Direct and indirect measurement of the magnetocaloric effect in bulk and nanostructured Ni-Mn-In Heusler alloy

    NASA Astrophysics Data System (ADS)

    Ghahremani, Mohammadreza; Aslani, Amir; Hosseinnia, Marjan; Bennett, Lawrence H.; Della Torre, Edward

    2018-05-01

    A systematic study of the magnetocaloric effect of a Ni51Mn33.4In15.6 Heusler alloy converted to nanoparticles via a high-energy ball-milling technique, in the temperature range of 270 to 310 K, has been performed. The properties of the particles were characterized by x-ray diffraction, electron microscopy, and magnetometry. The isothermal magnetic-field variation of magnetization exhibits field hysteresis in the bulk Ni51Mn33.4In15.6 alloy across the martensitic transition, which is significantly reduced in the nanoparticles. The magnetocaloric effects of the bulk and nanoparticle samples were measured both with a direct method, using a state-of-the-art test bed apparatus with control over the applied fields and temperatures, and with an indirect method based on Maxwell and thermodynamic equations. In the direct measurements, the nanoparticle sample's critical temperature decreased by 6 K, but its magnetocaloric effect was enhanced by 17% over the bulk counterpart. Additionally, when comparing the direct and indirect magnetocaloric curves, the direct method showed 14% less adiabatic temperature change in the bulk and 5% less adiabatic temperature change in the nanostructured sample.

  12. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle-induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E0 ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental mineral monazite sample from Malaysia, the sensitivity of REIXE is compared to well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules collected from the Pacific deep sea is discussed, which showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for Pt did not show any measurable content, within an upper limit of 250 ppm.

  13. Measurement of the top quark mass using template methods on dilepton events in p anti-p collisions at √s = 1.96 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.

    2006-02-01

    The authors describe a measurement of the top quark mass from events produced in pp̄ collisions at a center-of-mass energy of 1.96 TeV, using the Collider Detector at Fermilab. They identify tt̄ candidates where both W bosons from the top quarks decay into leptons (eν, μν, or τν) from a data sample of 360 pb⁻¹. The top quark mass is reconstructed in each event separately by three different methods, which draw upon simulated distributions of the neutrino pseudorapidity, tt̄ longitudinal momentum, or neutrino azimuthal angle in order to extract probability distributions for the top quark mass. For each method, representative mass distributions, or templates, are constructed from simulated samples of signal and background events, and parameterized to form continuous probability density functions. A likelihood fit incorporating these parameterized templates is then performed on the data sample masses in order to derive a final top quark mass. Combining the three template methods, taking into account correlations in their statistical and systematic uncertainties, results in a top quark mass measurement of 170.1 ± 6.0 (stat.) ± 4.1 (syst.) GeV/c².

  14. A Systematic Review of Psychological Studies Applied to Futsal

    PubMed Central

    Dias, Cláudia Salomé; Fonseca, António Manuel

    2016-01-01

    Abstract This study presents a systematic review of psychological studies applied to futsal. A total of 23 studies were analyzed within five sections: publication year and journal, research design, data collection, sample characteristics, and focus category. This study found that the first psychological articles applied to futsal were published in 2008, and the number of publications has gradually increased since then. The majority of the examined studies used cross-sectional designs and were conducted at the elite level in European and Asian countries. Most studies did not use mixed methods and did not specify the age of the subjects. Psychological research applied to futsal focused on athletes, non-athletes and several psychological factors. Critical and innovative reflections were made to highlight research gaps and present suggestions for further research. PMID:28149362

  15. Vibrational Spectroscopy as a Promising Toolbox for Analyzing Functionalized Ceramic Membranes.

    PubMed

    Kiefer, Johannes; Bartels, Julia; Kroll, Stephen; Rezwan, Kurosch

    2018-01-01

    Ceramic materials find use in many fields, including the life sciences and environmental engineering. For example, ceramic membranes have been shown to be promising filters for water treatment and virus retention. The analysis of such materials, however, remains challenging. In the present study, the potential of three vibrational spectroscopic methods for characterizing functionalized ceramic membranes for water treatment is evaluated. For this purpose, Raman scattering, infrared (IR) absorption, and solvent infrared spectroscopy (SIRS) were employed. The data were analyzed with respect to spectral changes as well as using principal component analysis (PCA). The Raman spectra allow an unambiguous discrimination of the sample types. The IR spectra do not change systematically with the functionalization state of the material. Solvent infrared spectroscopy allows a systematic distinction and enables the study of molecular interactions between the membrane surface and the solvent.
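
    Of the analysis steps named here, PCA is easy to sketch: each spectrum (one row per sample, one column per wavenumber channel) is projected onto the directions of largest variance, and discriminable sample types separate in the score plot. The spectra below are synthetic, with an extra band standing in for the effect of functionalization:

    # PCA on a matrix of synthetic "spectra"; groups separate on PC1.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    base = np.sin(np.linspace(0, 6 * np.pi, 300))        # shared spectral shape
    plain = base + rng.normal(0, 0.05, (10, 300))        # untreated membranes
    extra_band = 0.3 * np.exp(-np.linspace(-3, 3, 300) ** 2)
    functionalized = base + extra_band + rng.normal(0, 0.05, (10, 300))

    spectra = np.vstack([plain, functionalized])
    scores = PCA(n_components=2).fit_transform(spectra)
    print(scores[:10, 0].mean(), scores[10:, 0].mean())  # well-separated PC1 means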

  16. Multiplex Microsphere Immunoassays for the Detection of IgM and IgG to Arboviral Diseases

    PubMed Central

    Basile, Alison J.; Horiuchi, Kalanthe; Panella, Amanda J.; Laven, Janeen; Kosoy, Olga; Lanciotti, Robert S.; Venkateswaran, Neeraja; Biggerstaff, Brad J.

    2013-01-01

    Serodiagnosis of arthropod-borne viruses (arboviruses) at the Division of Vector-Borne Diseases, CDC, employs a combination of individual enzyme-linked immunosorbent assays and microsphere immunoassays (MIAs) to test for IgM and IgG, followed by confirmatory plaque-reduction neutralization tests. Based upon the geographic origin of a sample, it may be tested concurrently for multiple arboviruses, which can be a cumbersome task. The advent of multiplexing represents an opportunity to streamline these types of assays; however, because serologic cross-reactivity of the arboviral antigens often confounds results, it is of interest to employ data analysis methods that address this issue. Here, we constructed 13-virus multiplexed IgM and IgG MIAs that included internal and external controls, based upon the Luminex platform. Results from samples tested using these methods were analyzed using 8 different statistical schemes to identify the best way to classify the data. Geographic batteries were also devised to serve as a more practical diagnostic format, and further samples were tested using the abbreviated multiplexes. Comparative error rates for the classification schemes identified a boosting method based on logistic regression, “LogitBoost”, as the classification method of choice. When the data from all samples tested were combined into one set, error rates from the multiplex IgM and IgG MIAs were <5% for all geographic batteries. This work represents both the most comprehensive validated multiplexing method for arboviruses to date and the most systematic attempt to determine the most useful classification method for use with these types of serologic tests. PMID:24086608

  17. The Emergence of Systematic Review in Toxicology.

    PubMed

    Stephens, Martin L; Betts, Kellyn; Beck, Nancy B; Cogliano, Vincent; Dickersin, Kay; Fitzpatrick, Suzanne; Freeman, James; Gray, George; Hartung, Thomas; McPartland, Jennifer; Rooney, Andrew A; Scherer, Roberta W; Verloo, Didier; Hoffmann, Sebastian

    2016-07-01

    The Evidence-based Toxicology Collaboration hosted a workshop on "The Emergence of Systematic Review and Related Evidence-based Approaches in Toxicology," on November 21, 2014 in Baltimore, Maryland. The workshop featured speakers from agencies and organizations applying systematic review approaches to questions in toxicology, speakers with experience in conducting systematic reviews in medicine and healthcare, and stakeholders in industry, government, academia, and non-governmental organizations. Based on the workshop presentations and discussion, here we address the state of systematic review methods in toxicology, historical antecedents in both medicine and toxicology, challenges to the translation of systematic review from medicine to toxicology, and thoughts on the way forward. We conclude with a recommendation that as various agencies and organizations adapt systematic review methods, they continue to work together to ensure that there is a harmonized process for how the basic elements of systematic review methods are applied in toxicology. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  18. Empiric antibiotic treatment for urinary tract infection in preschool children: susceptibilities of urine sample isolates.

    PubMed

    Butler, Christopher C; O'Brien, Kathryn; Wootton, Mandy; Pickles, Timothy; Hood, Kerenza; Howe, Robin; Waldron, Cherry-Ann; Thomas-Jones, Emma; Dudley, Jan; Van Der Voort, Judith; Rumsby, Kate; Little, Paul; Downing, Harriet; Harman, Kim; Hay, Alastair D

    2016-04-01

    Antibiotic treatment recommendations based on susceptibility data from routinely submitted urine samples may be biased because of variation in sampling, laboratory procedures and inclusion of repeat samples, leading to uncertainty about empirical treatment. To describe and compare susceptibilities of Escherichia coli cultured from routinely submitted samples with those of E. coli causing urinary tract infection (UTI) in a cohort of systematically sampled, acutely unwell children. Susceptibilities of 1458 E. coli isolates submitted during the course of routine primary care for children <5 years (routine care samples) were compared to susceptibilities of 79 E. coli isolates causing UTI from 5107 children <5 years presenting to primary care with an acute illness [systematic sampling: the Diagnosis of Urinary Tract infection in Young children (DUTY) cohort]. The percentages of E. coli cultured from routinely submitted samples sensitive to each antibiotic were as follows: amoxicillin 45.1% (95% confidence interval: 42.5-47.7%); co-amoxiclav using the lower systemic break point (BP) 86.6% (84.7-88.3%); cephalexin 95.1% (93.9-96.1%); trimethoprim 74.0% (71.7-76.2%) and nitrofurantoin 98.2% (97.4-98.8%). The percentages of E. coli cultured from systematically sampled DUTY urines considered to be positive for UTI sensitive to each antibiotic were as follows: amoxicillin 50.6% (39.8-61.4%); co-amoxiclav using the systemic BP 83.5% (73.9-90.1%); co-amoxiclav using the urinary BP 94.9% (87.7-98.4%); cephalexin 98.7% (93.2-99.8%); trimethoprim 70.9% (60.1-80.0%); nitrofurantoin 100% (95.3-100.0%) and ciprofloxacin 96.2% (89.4-98.7%). Escherichia coli susceptibilities from routine and systematically obtained samples were similar. Most UTIs in preschool children remain susceptible to nitrofurantoin, co-amoxiclav and cephalexin. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Correcting for the influence of sampling conditions on biomarkers of exposure to phenols and phthalates: a 2-step standardization method based on regression residuals.

    PubMed

    Mortamais, Marion; Chevrier, Cécile; Philippat, Claire; Petit, Claire; Calafat, Antonia M; Ye, Xiaoyun; Silva, Manori J; Brambilla, Christian; Eijkemans, Marinus J C; Charles, Marie-Aline; Cordier, Sylvaine; Slama, Rémy

    2012-04-26

    Environmental epidemiology and biomonitoring studies typically rely on biological samples to assay the concentration of non-persistent exposure biomarkers. Between-participant variations in the sampling conditions of these biological samples constitute a potential source of exposure misclassification. Few studies have attempted to correct biomarker levels for this error. We aimed to assess the influence of sampling conditions on concentrations of urinary biomarkers of select phenols and phthalates, two widely produced families of chemicals, and to standardize biomarker concentrations on sampling conditions. Urine samples were collected between 2002 and 2006 among 287 pregnant women from the Eden and Pélagie cohorts, in which phthalate and phenol metabolite levels were assayed. We applied a 2-step standardization method based on regression residuals. First, the influences of sampling conditions (including sampling hour and duration of storage before freezing) and of creatinine levels on biomarker concentrations were characterized using adjusted linear regression models. In the second step, the model estimates were used to remove the variability in biomarker concentrations due to sampling conditions and to standardize concentrations as if all samples had been collected under the same conditions (e.g., the same hour of urine collection). Sampling hour was associated with concentrations of several exposure biomarkers. After standardization for sampling conditions, median concentrations differed by −38% for 2,5-dichlorophenol to +80% for a metabolite of diisodecyl phthalate. However, at the individual level, standardized biomarker levels were strongly correlated (correlation coefficients above 0.80) with unstandardized measures. Sampling conditions, such as sampling hour, should be systematically collected in biomarker-based studies, in particular when the biomarker half-life is short. The 2-step standardization method based on regression residuals that we propose to limit the impact of heterogeneity in sampling conditions could be further tested in studies describing levels of biomarkers or their influence on health.
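
    A minimal sketch of the 2-step method on synthetic data: regress the (log) biomarker concentration on sampling conditions, then replace each measurement with the model prediction at fixed reference conditions plus the participant's own residual. The covariates, coefficients and reference conditions are invented for illustration.

    # Two-step standardization based on regression residuals.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    hour = rng.uniform(7, 19, n)       # sampling hour
    storage = rng.uniform(0, 48, n)    # hours stored before freezing
    log_conc = 1.0 - 0.05 * hour + 0.01 * storage + rng.normal(0, 0.3, n)

    # Step 1: model the influence of sampling conditions.
    X = np.column_stack([np.ones(n), hour, storage])
    beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
    residuals = log_conc - X @ beta

    # Step 2: standardize to common reference conditions
    # (here: collected at 9 a.m., frozen after 2 hours).
    x_ref = np.array([1.0, 9.0, 2.0])
    standardized = x_ref @ beta + residuals
    print(np.corrcoef(log_conc, standardized)[0, 1])  # strong correlation, as reported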

  20. Dietitians and Nutritionists: Stigma in the Context of Obesity. A Systematic Review

    PubMed Central

    Jung, Franziska U. C. E.; Luck-Sikorski, Claudia; Wiemers, Nina; Riedel-Heller, Steffi G.

    2015-01-01

    Aim Negative attitudes towards people with obesity are common, even in health care settings. So far, the attitudes and causal beliefs of dietitians and nutritionists have not been investigated systematically. The aim of this article was to review the current state of quantitative research on weight-related stigma among dietitians and nutritionists. Method A systematic literature review was conducted in 2014 using PubMed, PsycINFO, Web of Science and the Cochrane Library. Results Eight studies were found, differing in study characteristics, instruments and the origin of the sample. Six of the eight studies reported weight stigma expressed by dietitians and nutritionists. Their beliefs about the causes of obesity indicated a clear preference for internal factors rather than genetics or biology. Discussion The results of the studies were not homogeneous. The degree of negative attitudes by dietitians and nutritionists towards people with obesity appeared to be slightly less pronounced compared to the general public and other health care professionals. Stigma and its consequences should be included in educational programs to optimally prepare dietitians and nutritionists. PMID:26466329

  1. A systematic review on empowerment for healthy nutrition in health promotion.

    PubMed

    Brandstetter, Susanne; Rüter, Jana; Curbach, Janina; Loss, Julika

    2015-12-01

    The present review aimed to identify and synthesize studies that used an empowerment approach within the field of healthy nutrition. A systematic review was conducted. Studies were identified by database searching (PubMed, PsycINFO, Web of Science and Psyndex). Searching, selecting and reporting were done according to the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) Statement. The setting was health promotion addressing healthy nutrition; the subjects were individuals from non-clinical populations. A total of 1226 studies were screened for eligibility, and eight studies were finally included. Three studies used the empowerment approach within a qualitative research paradigm and five studies within (quasi-)experimental intervention studies. Heterogeneity in settings, samples and evaluation methods was high. Most studies referred to the key message of empowerment, i.e., taking control over one's own life. However, the ways in which this key message was implemented in the interventions differed considerably. The number of studies included was very low. Furthermore, most studies had some limitations in terms of reporting how the empowerment approach was actually applied. The empowerment approach still seems to be unfamiliar within the field of healthy nutrition.

  2. A systematic review of grounded theory studies in physiotherapy.

    PubMed

    Ali, Nancy; May, Stephen; Grafton, Kate

    2018-05-23

    This systematic review aimed to appraise the methodological rigor of grounded theory research published in the field of physiotherapy, in order to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINAHL, SPORTDiscus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or its methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices, including sampling, data collection, and analysis.

  3. Congruence between patients’ preferred and perceived participation in medical decision-making: a review of the literature

    PubMed Central

    2014-01-01

    Background Patients are increasingly expected and asked to be involved in health care decisions. In this decision-making process, preferences for participation are important. In this systematic review we aim to provide an overview of the literature on the congruence between patients’ preferred and perceived participation in medical decision-making. We also explore the direction of mismatches and outline factors associated with congruence. Methods A systematic review was performed on patient participation in medical decision-making. The Medline, PsycINFO, CINAHL, EMBASE and Cochrane Library databases were searched up to September 2012, and all studies were rigorously critically appraised. In total, 44 papers were included, containing 52 different patient samples. Results Mean congruence between preferred and perceived participation in decision-making was 60% (25th and 75th percentiles: 49% and 70%). Where preferences and perceptions did not match, most patients in 36 of the patient samples preferred more involvement, and most patients in 9 samples preferred less. The most frequently investigated factors associated with preferences were age and educational level: younger and more highly educated patients more often preferred an active or shared role. Conclusion This review suggests that a similar approach to all patients is not likely to meet patients’ wishes, since preferences for participation vary among patients. Health care professionals should be sensitive to patients’ individual preferences and communicate about patients’ participation wishes on a regular basis during their illness trajectory. PMID:24708833

  4. Automated Detection of Knickpoints and Knickzones Across Transient Landscapes

    NASA Astrophysics Data System (ADS)

    Gailleton, B.; Mudd, S. M.; Clubb, F. J.

    2017-12-01

    Mountainous regions are ubiquitously dissected by river channels, which transmit climate and tectonic signals to the rest of the landscape by adjusting their long profiles. Fluvial response to allogenic forcing is often expressed through the upstream propagation of steepened reaches, referred to as knickpoints or knickzones. The identification and analysis of these steepened reaches has numerous applications in geomorphology, such as modelling long-term landscape evolution, understanding controls on fluvial incision, and constraining tectonic uplift histories. Traditionally, the identification of knickpoints or knickzones from fluvial profiles requires manual selection or calibration. This process is both time-consuming and subjective, as different workers may select different steepened reaches within the profile. We propose an objective, statistically based method to systematically pick knickpoints/knickzones at the landscape scale using an outlier-detection algorithm. Our method integrates river profiles normalised by drainage area (Chi, using the approach of Perron and Royden, 2013), then separates the chi-elevation plots into a series of transient segments using the method of Mudd et al. (2014). This method allows the systematic detection of knickpoints across a DEM, regardless of size, using a high-performance algorithm implemented in the open-source Edinburgh Land Surface Dynamics Topographic Tools (LSDTopoTools) software package. After initial knickpoint identification, outliers are selected using several sorting and binning methods based on the Median Absolute Deviation, to avoid the influence of sample size. We test our method on a series of DEMs and grid resolutions, and show that our method consistently identifies accurate knickpoint locations across each landscape tested.
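
    Outlier selection with the Median Absolute Deviation uses robust analogues of the mean and standard deviation, so a handful of extreme values cannot mask themselves. A minimal sketch (the segment-steepness values are invented, and this is a generic illustration rather than the LSDTopoTools implementation):

    # Flag outliers via the modified z-score based on the MAD.
    import numpy as np

    def mad_outliers(values, threshold=3.5):
        values = np.asarray(values, dtype=float)
        med = np.median(values)
        mad = np.median(np.abs(values - med))
        modified_z = 0.6745 * (values - med) / mad  # 0.6745 scales MAD to sigma
        return np.abs(modified_z) > threshold

    steepness = np.array([0.8, 1.0, 0.9, 1.1, 0.95, 4.2, 1.05])  # one steep reach
    print(mad_outliers(steepness))   # only the 4.2 segment is flagged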

  5. Systematic Development and Validation of a Thin-Layer Densitometric Bioanalytical Method for Estimation of Mangiferin Employing Analytical Quality by Design (AQbD) Approach.

    PubMed

    Khurana, Rajneet Kaur; Rao, Satish; Beg, Sarwar; Katare, O P; Singh, Bhupinder

    2016-01-01

    The present work aims at the systematic development of a simple, rapid and highly sensitive densitometry-based thin-layer chromatographic method for the quantification of mangiferin in bioanalytical samples. Initially, the quality target method profile was defined and critical analytical attributes (CAAs) were earmarked, namely retardation factor (Rf), peak height, capacity factor, theoretical plates and separation number. A face-centered cubic design was selected for optimization of the volume loaded and plate dimensions, the critical method parameters selected from screening studies employing D-optimal and Plackett-Burman designs, followed by evaluation of their effect on the CAAs. A mobile phase containing a mixture of ethyl acetate : acetic acid : formic acid : water in a 7 : 1 : 1 : 1 (v/v/v/v) ratio was finally selected as the optimized solvent for apt chromatographic separation of mangiferin at 262 nm with Rf 0.68 ± 0.02 and all other parameters within the acceptance limits. Method validation studies revealed high linearity in the concentration range of 50-800 ng/band for mangiferin. The developed method showed high accuracy, precision, ruggedness, robustness, specificity, sensitivity, selectivity and recovery. In a nutshell, the bioanalytical method for analysis of mangiferin in plasma revealed the presence of well-resolved peaks and high recovery of mangiferin. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Primer on statistical interpretation or methods report card on propensity-score matching in the cardiology literature from 2004 to 2006: a systematic review.

    PubMed

    Austin, Peter C

    2008-09-01

    Propensity-score matching is frequently used in the cardiology literature. Recent systematic reviews have found that this method is, in general, poorly implemented in the medical literature. The study objective was to examine the quality of the implementation of propensity-score matching in the general cardiology literature. A total of 44 articles published in the American Heart Journal, the American Journal of Cardiology, Circulation, the European Heart Journal, Heart, the International Journal of Cardiology, and the Journal of the American College of Cardiology between January 1, 2004, and December 31, 2006, were examined. Twenty of the 44 studies did not provide adequate information on how the propensity-score-matched pairs were formed. Fourteen studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. Only 4 studies explicitly used statistical methods appropriate for matched studies to compare baseline characteristics between treated and untreated subjects. Only 11 (25%) of the 44 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Only 2 studies described the matching method used, assessed balance in baseline covariates by appropriate methods, and used appropriate statistical methods to estimate the treatment effect and its significance. Application of propensity-score matching was poor in the cardiology literature. Suggestions for improving the reporting and analysis of studies that use propensity-score matching are provided.
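
    A minimal sketch of the generic propensity-score matching workflow whose reporting the review grades, on simulated data (greedy 1:1 nearest-neighbour matching followed by a paired analysis; all numbers are illustrative):

        import numpy as np
        from scipy import stats
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000
        x = rng.normal(size=(n, 3))                       # baseline covariates
        p_treat = 1 / (1 + np.exp(-(x[:, 0] - 1)))        # confounded assignment
        treated = rng.binomial(1, p_treat)
        y = x[:, 0] + 0.5 * treated + rng.normal(size=n)  # true effect = 0.5

        # 1. Estimate propensity scores from baseline covariates.
        ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

        # 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
        t_idx = np.flatnonzero(treated == 1)
        available = set(np.flatnonzero(treated == 0))
        pairs = []
        for i in t_idx:
            j = min(available, key=lambda k: abs(ps[k] - ps[i]))
            pairs.append((i, j))
            available.remove(j)

        # 3. Analyse as matched data: a paired test, not a two-sample test.
        t_sel = np.array([i for i, _ in pairs])
        c_sel = np.array([j for _, j in pairs])
        print(np.mean(y[t_sel] - y[c_sel]))               # effect estimate
        print(stats.ttest_rel(y[t_sel], y[c_sel]).pvalue)

    The final step illustrates the review's central complaint: once pairs are formed, both balance checks and outcome analysis should use methods for matched data rather than independent-sample tests.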

  7. Estimation of population mean under systematic sampling

    NASA Astrophysics Data System (ADS)

    Noor-ul-amin, Muhammad; Javaid, Amjad

    2017-11-01

    In this study, we propose a generalized ratio estimator under non-response for systematic random sampling. We also generate a class of estimators as special cases of the generalized estimator, using different combinations of the coefficients of correlation, kurtosis and variation. The mean square errors and mathematical conditions are also derived to prove the efficiency of the proposed estimators. A numerical illustration using three populations is included to support the results.
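
    A sketch of the classical (unadjusted) ratio estimator under linear systematic sampling, for orientation; the paper's generalized, non-response estimator is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(1)
        N, n = 1000, 50                      # population and sample size
        x = rng.uniform(10, 50, N)           # auxiliary variable, known for all units
        y = 2.0 * x + rng.normal(0, 5, N)    # study variable, observed only in sample

        # Linear systematic sample: random start r in [0, k), then every k-th unit.
        k = N // n
        r = rng.integers(k)
        idx = np.arange(r, N, k)[:n]

        # Ratio estimator of the population mean of y:
        #   ybar_R = (ybar / xbar) * Xbar, with Xbar the known population mean of x.
        ybar, xbar, Xbar = y[idx].mean(), x[idx].mean(), x.mean()
        ybar_ratio = (ybar / xbar) * Xbar
        print(ybar_ratio, y.mean())          # estimate vs true population mean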

  8. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
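
    A sketch of the ensemble idea on synthetic numbers: treat each independent replica's estimate as one draw, and report the ensemble mean with a bootstrap confidence interval (the replica values below are invented):

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical binding free energy estimates (kcal/mol) from 25
        # independent MD replicas differing only in initial velocities.
        replicas = rng.normal(-7.5, 0.6, size=25)

        # Bootstrap over replicas to quantify the uncertainty of the mean.
        boot = np.array([rng.choice(replicas, replicas.size, replace=True).mean()
                         for _ in range(10_000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"dG = {replicas.mean():.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")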

  9. Detecting and overcoming systematic errors in genome-scale phylogenies.

    PubMed

    Rodríguez-Ezpeleta, Naiara; Brinkmann, Henner; Roure, Béatrice; Lartillot, Nicolas; Lang, B Franz; Philippe, Hervé

    2007-06-01

    Genome-scale data sets result in an enhanced resolution of the phylogenetic inference by reducing stochastic errors. However, there is also an increase of systematic errors due to model violations, which can lead to erroneous phylogenies. Here, we explore the impact of systematic errors on the resolution of the eukaryotic phylogeny using a data set of 143 nuclear-encoded proteins from 37 species. The initial observation was that, despite the impressive amount of data, some branches had no significant statistical support. To demonstrate that this lack of resolution is due to a mutual annihilation of phylogenetic and nonphylogenetic signals, we created a series of data sets with slightly different taxon sampling. As expected, these data sets yielded strongly supported but mutually exclusive trees, thus confirming the presence of conflicting phylogenetic and nonphylogenetic signals in the original data set. To decide on the correct tree, we applied several methods expected to reduce the impact of some kinds of systematic error. Briefly, we show that (i) removing fast-evolving positions, (ii) recoding amino acids into functional categories, and (iii) using a site-heterogeneous mixture model (CAT) are three effective means of increasing the ratio of phylogenetic to nonphylogenetic signal. Finally, our results allow us to formulate guidelines for detecting and overcoming phylogenetic artefacts in genome-scale phylogenetic analyses.

  10. Effect of cervical cancer education and provider recommendation for screening on screening rates: A systematic review and meta-analysis

    PubMed Central

    Achenbach, Chad J.; O’Dwyer, Linda C.; Evans, Charlesnika T.; McHugh, Megan; Hou, Lifang; Simon, Melissa A.; Murphy, Robert L.; Jordan, Neil

    2017-01-01

    Background Although cervical cancer is largely preventable through screening, detection and treatment of precancerous abnormalities, it remains one of the top causes of cancer-related morbidity and mortality globally. Objectives The objective of this systematic review is to understand the evidence of the effect of cervical cancer education compared to control conditions on cervical cancer screening rates among eligible women at risk of cervical cancer. We also sought to understand the effect of provider recommendations for screening on cervical cancer screening (CCS) rates compared to control conditions in the same population. Methods We used the PICO (Problem or Population, Interventions, Comparison and Outcome) framework as described in the Cochrane Collaboration Handbook to develop our search strategy. The details of our search strategy have been described in our systematic review protocol published in the International Prospective Register of Systematic Reviews (PROSPERO). The protocol registration number is CRD42016045605, available at: http://www.crd.york.ac.uk/prospero/display_record.asp?src=trip&ID=CRD42016045605. The search string was used in PubMed, Embase, Cochrane Systematic Reviews and the Cochrane CENTRAL register of controlled trials to retrieve study reports that were screened for inclusion in this review. Our data synthesis and reporting were guided by the Preferred Reporting Items for Systematic Reviews and Meta-analysis (PRISMA). We did a qualitative synthesis of evidence and, where appropriate, individual study effects were pooled in meta-analyses using RevMan 5.3 Review Manager. The Higgins I2 was used to assess heterogeneity in studies pooled together for overall summary effects. We assessed the risk of bias of individual included studies and assessed the risk of publication bias across studies pooled in meta-analysis using funnel plots. Results Out of 3072 study reports screened, 28 articles were found to be eligible for inclusion in the qualitative synthesis (5 of which were included in a meta-analysis of educational interventions and 8 combined in a meta-analysis of HPV self-sampling interventions), while 45 were excluded for various reasons. The use of theory-based educational interventions significantly increased CCS rates, more than doubling the odds (OR = 2.46, 95% CI: 1.88, 3.21). Additionally, offering women the option of self-sampling for Human Papillomavirus (HPV) testing increased CCS rates by nearly 2-fold (OR = 1.71, 95% CI: 1.32, 2.22). We also found that invitation letters alone (or with a follow-up phone contact), making an appointment, and sending reminders to patients who are due or overdue for screening had a significant effect on improving participation and CCS rates in populations at risk. Conclusion Our findings support the implementation of theory-based cervical cancer educational interventions to increase women’s participation in cervical cancer screening programs, particularly when targeting communities with low literacy levels. Additionally, cervical cancer screening programs should consider offering women the opportunity for self-sample collection, particularly when such women have not responded to previous screening invitations or reminder letters for Pap smear collection. PMID:28873092

  11. Examining the cosmic acceleration with the latest Union2 supernova data

    NASA Astrophysics Data System (ADS)

    Li, Zhengxiang; Wu, Puxun; Yu, Hongwei

    2011-01-01

    In this Letter, by reconstructing the Om diagnostic and the deceleration parameter q from the latest Union2 Type Ia Supernova sample, with and without the systematic error, along with the baryon acoustic oscillation (BAO) and cosmic microwave background (CMB) data, we study the cosmic expansion history using the Chevallier-Polarski-Linder (CPL) parametrization. We find that Union2+BAO favor an expansion with a decreasing acceleration at z<0.3. However, once the CMB data are added to the analysis, the cosmic acceleration is found to be still increasing, indicating a tension between the low redshift and high redshift data. In order to reduce this tension significantly, two different methods are considered and thus two different subsamples of Union2 are selected. We then find that the two subsamples+BAO+CMB give completely different results on the cosmic expansion history when the systematic error is ignored, with one suggesting a decreasing cosmic acceleration and the other just the opposite, although both of them along with BAO support that the cosmic acceleration is slowing down. However, once the systematic error is considered, both subsamples of Union2 along with BAO and CMB favor an increasing present cosmic acceleration. Therefore, a clear-cut answer on whether the cosmic acceleration is slowing down calls for more consistent data and more reliable methods to analyze them.
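
    For reference, both diagnostics follow directly from H(z); a sketch for the CPL parametrization w(z) = w0 + wa z/(1+z) in a flat universe (parameter values are illustrative):

        import numpy as np

        def E2_cpl(z, om0, w0, wa):
            """Dimensionless H^2/H0^2 for flat CPL dark energy."""
            a = 1.0 / (1.0 + z)
            de = (1 - om0) * (1 + z) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * (1 - a))
            return om0 * (1 + z) ** 3 + de

        def om_diagnostic(z, om0, w0, wa):
            """Om(z) = (E^2 - 1)/((1+z)^3 - 1); constant (= om0) for LCDM."""
            return (E2_cpl(z, om0, w0, wa) - 1) / ((1 + z) ** 3 - 1)

        def deceleration(z, om0, w0, wa, dz=1e-4):
            """q(z) = (1+z) E'/E - 1, with E' from a central difference."""
            E = np.sqrt(E2_cpl(z, om0, w0, wa))
            dE = (np.sqrt(E2_cpl(z + dz, om0, w0, wa))
                  - np.sqrt(E2_cpl(z - dz, om0, w0, wa))) / (2 * dz)
            return (1 + z) * dE / E - 1

        z = np.linspace(0.01, 1.0, 5)
        print(om_diagnostic(z, 0.27, -1.0, 0.0))   # flat LCDM: constant 0.27
        print(deceleration(z, 0.27, -1.0, 0.0))    # negative at low z (accelerating)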

  12. Getting under the skin of the primary care consultation using video stimulated recall: a systematic review

    PubMed Central

    2014-01-01

    Background Video stimulated recall (VSR) is a method of enhancing participants’ accounts of the consultation, using a video recording of the event to encourage and prompt recall in a post-consultation interview. VSR is used in education and education research, and to a lesser extent in medical and nursing research. Little is known about the sort of research questions that lend themselves best to the use of VSR, or about the impact of the specific VSR procedure on study quality. This systematic review describes studies in primary care that have used the method and aims to identify the strengths, weaknesses and role of VSR. Methods A systematic literature search was conducted to identify primary care consultation research using VSR. Two authors undertook data extraction and quality appraisal of identified papers, and a narrative synthesis was conducted to draw together the findings. In addition, theory on classifying VSR procedures derived from other disciplines is used as a lens through which to assess the relevance of VSR technique. Results Twenty-eight publications were identified that reported VSR in primary care doctor-patient consultation research. VSR was identified as a useful method to explore specific events within the consultation, mundane or routine occurrences and non-spoken events, and appears to add particular value to doctors’ post-consultation accounts. However, studies frequently had insufficient description of methods to properly evaluate both the quality of the study and the influence of VSR technique on findings. Conclusions VSR is particularly useful for the study of specific consultation events when a ‘within case’ approach is used in analysis, comparing and contrasting findings from the consultation and the post-consultation interview. Alignment of the choice of VSR procedure and sampling to the study research question was established as particularly important to study quality. Future researchers may consider the role of process evaluation to understand further the impact of research design on the data yielded and the acceptability of the method to participants. PMID:25175450

  13. Abundances of disk and bulge giants from high-resolution optical spectra. I. O, Mg, Ca, and Ti in the solar neighborhood and Kepler field samples

    NASA Astrophysics Data System (ADS)

    Jönsson, H.; Ryde, N.; Nordlander, T.; Pehlivan Rhodin, A.; Hartman, H.; Jönsson, P.; Eriksson, K.

    2017-02-01

    Context. The Galactic bulge is an intriguing and significant part of our Galaxy, but it is hard to observe because it is both distant and covered by dust in the disk. Therefore, there are not many high-resolution optical spectra of bulge stars with large wavelength coverage, whose determined abundances can be compared with nearby, similarly analyzed stellar samples. Aims: We aim to determine the diagnostically important alpha elements of a sample of bulge giants using high-resolution optical spectra with large wavelength coverage. The abundances found are compared to similarly derived abundances from similar spectra of similar stars in the local thin and thick disks. In this first paper we focus on the solar neighborhood reference sample. Methods: We used spectral synthesis to derive the stellar parameters as well as the elemental abundances of both the local and bulge samples of giants. We took special care to benchmark our method of determining stellar parameters against independent measurements of effective temperatures from angular diameter measurements and surface gravities from asteroseismology. Results: In this first paper we present the method used to determine the stellar parameters and elemental abundances, evaluate them, and present the results for our local disk sample of 291 giants. Conclusions: When comparing our determined spectroscopic temperatures to those derived from angular diameter measurements, we reproduce these with a systematic difference of +10 K and a standard deviation of 53 K. The spectroscopic gravities reproduce those determined from asteroseismology with a systematic offset of +0.10 dex and a standard deviation of 0.12 dex. When it comes to the abundance trends, our sample of local disk giants closely follows trends found in other works analyzing solar neighborhood dwarfs, showing that the much brighter giant stars are as good abundance probes as the often used dwarfs. Based on observations made with the Nordic Optical Telescope (programs 51-018 and 53-002), operated by the Nordic Optical Telescope Scientific Association at the Observatorio del Roque de los Muchachos, La Palma, Spain, of the Instituto de Astrofisica de Canarias, and on spectral data retrieved from PolarBase at Observatoire Midi Pyrénées. Full Tables A.1 and A.3 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A100

  14. Histamine food poisonings: A systematic review and meta-analysis.

    PubMed

    Colombo, Fabio M; Cattaneo, Patrizia; Confalonieri, Enrica; Bernardi, Cristian

    2018-05-03

    The aim of this study was to assess the mean histamine concentration in food poisoning. Systematic review and meta-analysis of reports published between 1959 and 2013. Main criteria for inclusion of studies were: all report types that present outbreaks of "histamine poisoning" or "scombroid syndrome" from food, including histamine content and type of food; the health status of the people involved had to be nonpathological. Fifty-five (55) reports were included; these studies reported 103 incidents. All pooled analyses were based on a random effect model; the mean histamine concentration in poisoning samples was 1107.21 mg/kg with a confidence interval for the meta-mean of 422.62-2900.78 mg/kg; the heterogeneity index (I2) was 100% (P < 0.0001); the prediction interval was 24.12-50822.78 mg/kg. Fish involved in histamine poisoning were mainly tuna or Istiophoridae species. No clear association between concomitant conditions (female sex, alcohol consumption, previous medication, and consumption of histamine-releasing food) and histamine poisoning was highlighted. This is the first systematic review and meta-analysis that analyzes all the available data on histamine poisoning outbreaks, evaluating the histamine concentration in the food involved. The mean histamine concentration in poisoning samples was fairly high. Our study suffers from some limitations, which are intrinsic to the included studies, for instance the lack of a complete anamnesis for each poisoning episode. Protocol registration: methods were specified in advance and have been published as a protocol in the PROSPERO database (18/07/2012 - CRD42012002566).

  15. Racism and health service utilisation: A systematic review and meta-analysis

    PubMed Central

    Cormack, Donna; Harris, Ricci; Paradies, Yin

    2017-01-01

    Although racism has been posited as a driver of racial/ethnic inequities in healthcare, the relationship between racism and health service use and experience has yet to be systematically reviewed or meta-analysed. This paper presents a systematic review and meta-analysis of quantitative empirical studies that report associations between self-reported racism and various measures of healthcare service utilisation. Data were reviewed and extracted from 83 papers reporting 70 studies. Studies included 250,850 participants and were conducted predominantly in the U.S. The meta-analysis included 59 papers reporting 52 studies, which were analysed using random effects models and mean weighted effect sizes. Racism was associated with more negative patient experiences of health services (HSU-E) (OR = 0.351, 95% CI [0.236, 0.521], k = 19), including lower levels of healthcare-related trust, satisfaction, and communication. Racism was not associated with health service use (HSU-U) as an outcome group, and was not associated with most individual HSU-U outcomes, including having had examinations, health service visits and admissions to health professionals and services. Racism was associated with health service use outcomes such as delaying/not getting healthcare, and lack of adherence to treatment uptake, although these effects may be influenced by a small sample of studies, and publication bias, respectively. Limitations to the literature reviewed in terms of study designs, sampling methods and measurements are discussed along with suggested future directions in the field. PMID:29253855

  16. Racism and health service utilisation: A systematic review and meta-analysis.

    PubMed

    Ben, Jehonathan; Cormack, Donna; Harris, Ricci; Paradies, Yin

    2017-01-01

    Although racism has been posited as a driver of racial/ethnic inequities in healthcare, the relationship between racism and health service use and experience has yet to be systematically reviewed or meta-analysed. This paper presents a systematic review and meta-analysis of quantitative empirical studies that report associations between self-reported racism and various measures of healthcare service utilisation. Data were reviewed and extracted from 83 papers reporting 70 studies. Studies included 250,850 participants and were conducted predominantly in the U.S. The meta-analysis included 59 papers reporting 52 studies, which were analysed using random effects models and mean weighted effect sizes. Racism was associated with more negative patient experiences of health services (HSU-E) (OR = 0.351, 95% CI [0.236, 0.521], k = 19), including lower levels of healthcare-related trust, satisfaction, and communication. Racism was not associated with health service use (HSU-U) as an outcome group, and was not associated with most individual HSU-U outcomes, including having had examinations, health service visits and admissions to health professionals and services. Racism was associated with health service use outcomes such as delaying/not getting healthcare, and lack of adherence to treatment uptake, although these effects may be influenced by a small sample of studies, and publication bias, respectively. Limitations to the literature reviewed in terms of study designs, sampling methods and measurements are discussed along with suggested future directions in the field.

  17. A novel dispersive liquid-liquid microextraction based on solidification of floating organic droplet method for determination of polycyclic aromatic hydrocarbons in aqueous samples.

    PubMed

    Xu, Hui; Ding, Zongqing; Lv, Lili; Song, Dandan; Feng, Yu-Qi

    2009-03-16

    A new dispersive liquid-liquid microextraction based on solidification of floating organic droplet method (DLLME-SFO) was developed for the determination of five kinds of polycyclic aromatic hydrocarbons (PAHs) in environmental water samples. In this method, no specific holder, such as the needle tip of a microsyringe or a hollow fiber, is required to support the organic microdrop, owing to the use of an organic solvent with low density and a suitable melting point. Furthermore, the extractant droplet can be collected easily by solidifying it at low temperature. 1-Dodecanol was chosen as the extraction solvent in this work. A series of parameters that influence extraction were investigated systematically. Under optimal conditions, enrichment factors (EFs) for PAHs were in the range of 88-118. The limits of detection (LODs) for naphthalene, diphenyl, acenaphthene, anthracene and fluoranthene were 0.045, 0.86, 0.071, 1.1 and 0.66 ng mL(-1), respectively. Good reproducibility and recovery of the method were also obtained. Compared with the traditional liquid-phase microextraction (LPME) and dispersive liquid-liquid microextraction (DLLME) methods, the proposed method achieved an enrichment factor about 2 times higher than that of LPME. Moreover, the solidification of the floating organic solvent facilitated the phase transfer and, most importantly, avoided the use of the high-density, toxic solvents of the traditional DLLME method. The proposed method was successfully applied to determine PAHs in environmental water samples. This simple and low-cost method provides an alternative for the analysis of non-polar compounds in complex environmental water.

  18. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    PubMed

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  19. Factors affecting the quality of hospital hotel services from the patients and their companions’ point of view: A national study in Iran

    PubMed Central

    Shirzadi, Seyed Majid; Raeissi, Pouran; Nasiripour, Amir Ashkan; Tabibi, Seyed Jamaleddin

    2016-01-01

    Background: The hospitality design of a hospital is a complex process that depends on careful planning, systematic thinking, and consideration of various factors. This study aimed to determine the viewpoints of patients and their relatives on factors affecting hospital hotel services in Iran in 2015. The results of this study can be used to design a suitable model for the assessment and improvement of hospitality service quality. Materials and Methods: In this cross-sectional descriptive study, 10 hospitals of Iran were included. The subjects of the study included 480 patients and their companions from different internal and surgical wards. Hospitals were chosen by simple random sampling; within each hospital, patients were selected through stratified sampling based on hospital wards and, within each ward, through systematic sampling based on bed numbers. A researcher-made questionnaire, developed through a review of the literature and the opinions of experts, was used as the study tool. Its internal reliability was determined based on Cronbach's alpha coefficient (α = 0.85). Results: In the patients’ and their companions’ view, services related to all eleven aspects of hospital hotel services examined (human, economic, operational, personnel identification, safety, health care services, physical, clinical welfare, cultural, patient guidance, and public welfare) received mean scores higher than three (out of five). Conclusion: The present study showed that, in the patients’ and their companions’ view, factors affecting hospital hotel services in the country are very important. The tool used in this study can serve as a criterion for assessing the status of hotel services in the country's major hospitals, making assessment and improvement of the existing conditions possible. PMID:27904592

  20. The development of open access journal publishing from 1993 to 2009.

    PubMed

    Laakso, Mikael; Welling, Patrik; Bukvova, Helena; Nyman, Linus; Björk, Bo-Christer; Hedlund, Turid

    2011-01-01

    Open Access (OA) is a model for publishing scholarly peer reviewed journals, made possible by the Internet. The full text of OA journals and articles can be freely read, as the publishing is funded through means other than subscriptions. Empirical research concerning the quantitative development of OA publishing has so far consisted of scattered individual studies providing brief snapshots, using varying methods and data sources. This study adopts a systematic method for studying the development of OA journals from their beginnings in the early 1990s until 2009. Because no comprehensive index of OA articles exists, systematic manual data collection from journal web sites was conducted based on journal-level data extracted from the Directory of Open Access Journals (DOAJ). Due to the high number of journals registered in the DOAJ, almost 5000 at the time of the study, stratified random sampling was used. A separate sample of verified early pioneer OA journals was also studied. The results show a very rapid growth of OA publishing during the period 1993-2009. During the last year an estimated 191 000 articles were published in 4769 journals. Since the year 2000, the average annual growth rate has been 18% for the number of journals and 30% for the number of articles. This can be contrasted to the reported 3.5% yearly volume increase in journal publishing in general. In 2009 the share of articles in OA journals, of all peer reviewed journal articles, reached 7.7%. Overall, the results document a rapid growth in OA journal publishing over the last fifteen years. Based on the sampling results and qualitative data a division into three distinct periods is suggested: The Pioneering years (1993-1999), the Innovation years (2000-2004), and the Consolidation years (2005-2009).

  1. The Development of Open Access Journal Publishing from 1993 to 2009

    PubMed Central

    Laakso, Mikael; Welling, Patrik; Bukvova, Helena; Nyman, Linus; Björk, Bo-Christer; Hedlund, Turid

    2011-01-01

    Open Access (OA) is a model for publishing scholarly peer reviewed journals, made possible by the Internet. The full text of OA journals and articles can be freely read, as the publishing is funded through means other than subscriptions. Empirical research concerning the quantitative development of OA publishing has so far consisted of scattered individual studies providing brief snapshots, using varying methods and data sources. This study adopts a systematic method for studying the development of OA journals from their beginnings in the early 1990s until 2009. Because no comprehensive index of OA articles exists, systematic manual data collection from journal web sites was conducted based on journal-level data extracted from the Directory of Open Access Journals (DOAJ). Due to the high number of journals registered in the DOAJ, almost 5000 at the time of the study, stratified random sampling was used. A separate sample of verified early pioneer OA journals was also studied. The results show a very rapid growth of OA publishing during the period 1993–2009. During the last year an estimated 191 000 articles were published in 4769 journals. Since the year 2000, the average annual growth rate has been 18% for the number of journals and 30% for the number of articles. This can be contrasted to the reported 3.5% yearly volume increase in journal publishing in general. In 2009 the share of articles in OA journals, of all peer reviewed journal articles, reached 7.7%. Overall, the results document a rapid growth in OA journal publishing over the last fifteen years. Based on the sampling results and qualitative data a division into three distinct periods is suggested: The Pioneering years (1993–1999), the Innovation years (2000–2004), and the Consolidation years (2005–2009). PMID:21695139

  2. Quality of reporting of pilot and feasibility cluster randomised trials: a systematic review

    PubMed Central

    Chan, Claire L; Leyrat, Clémence; Eldridge, Sandra M

    2017-01-01

    Objectives To systematically review the quality of reporting of pilot and feasibility cluster randomised trials (CRTs). In particular, to assess (1) the number of pilot CRTs conducted between 1 January 2011 and 31 December 2014, (2) whether objectives and methods were appropriate and (3) reporting quality. Methods We searched PubMed (2011–2014) for CRTs with ‘pilot’ or ‘feasibility’ in the title or abstract that were assessing some element of feasibility and showed evidence the study was in preparation for a main effectiveness/efficacy trial. Quality assessment criteria were based on the Consolidated Standards of Reporting Trials (CONSORT) extensions for pilot trials and CRTs. Results Eighteen pilot CRTs were identified. Forty-four per cent did not have feasibility as their primary objective, and many (50%) performed formal hypothesis testing for effectiveness/efficacy despite being underpowered. Most (83%) included ‘pilot’ or ‘feasibility’ in the title, and discussed implications for progression from the pilot to the future definitive trial (89%), but fewer reported reasons for the randomised pilot trial (39%), sample size rationale (44%) or progression criteria (17%). Most defined the cluster (100%) and the number of clusters randomised (94%), but few reported how the cluster design affected sample size (17%), whether consent was sought from clusters (11%), or who enrolled clusters (17%). Conclusions That only 18 pilot CRTs were identified necessitates increased awareness of the importance of conducting and publishing pilot CRTs, and improved reporting. Pilot CRTs should primarily assess feasibility, avoiding formal hypothesis testing for effectiveness/efficacy, and should report reasons for the pilot, the sample size rationale and progression criteria, as well as enrolment of clusters and how the cluster design affects design aspects. We recommend adherence to the CONSORT extensions for pilot trials and CRTs. PMID:29122791

  3. Comparative evaluation of direct plating and most probable number for enumeration of low levels of Listeria monocytogenes in naturally contaminated ice cream products.

    PubMed

    Chen, Yi; Pouillot, Régis; S Burall, Laurel; Strain, Errol A; Van Doren, Jane M; De Jesus, Antonio J; Laasri, Anna; Wang, Hua; Ali, Laila; Tatavarthy, Aparna; Zhang, Guodong; Hu, Lijun; Day, James; Sheth, Ishani; Kang, Jihun; Sahu, Surasri; Srinivasan, Devayani; Brown, Eric W; Parish, Mickey; Zink, Donald L; Datta, Atin R; Hammack, Thomas S; Macarisin, Dumitru

    2017-01-16

    A precise and accurate method for enumeration of low levels of Listeria monocytogenes in foods is critical to a variety of studies. In this study, a paired comparison of most probable number (MPN) and direct plating enumeration of L. monocytogenes was conducted on a total of 1730 outbreak-associated ice cream samples that were naturally contaminated with low levels of L. monocytogenes. MPN was performed on all 1730 samples. Direct plating was performed on all samples using the RAPID'L.mono (RLM) agar (1600 samples) and agar Listeria Ottaviani and Agosti (ALOA; 130 samples). Probabilistic analysis with a Bayesian inference model was used to compare paired direct plating and MPN estimates of L. monocytogenes in ice cream samples, because assumptions implicit in ordinary least squares (OLS) linear regression analyses were not met for such a comparison. The probabilistic analysis revealed good agreement between the MPN and direct plating estimates, and this agreement showed that the MPN schemes and direct plating schemes using ALOA or RLM evaluated in the present study were suitable for enumerating low levels of L. monocytogenes in these ice cream samples. The statistical analysis further revealed that OLS linear regression analyses of direct plating and MPN data did introduce bias that incorrectly characterized systematic differences between estimates from the two methods. Published by Elsevier B.V.

  4. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  5. Evaluating sampling designs by computer simulation: A case study with the Missouri bladderpod

    USGS Publications Warehouse

    Morrison, L.W.; Smith, D.R.; Young, C.; Nichols, D.W.

    2008-01-01

    To effectively manage rare populations, accurate monitoring data are critical. Yet many monitoring programs are initiated without careful consideration of whether chosen sampling designs will provide accurate estimates of population parameters. Obtaining accurate estimates is especially difficult when natural variability is high, or limited budgets determine that only a small fraction of the population can be sampled. The Missouri bladderpod, Lesquerella filiformis Rollins, is a federally threatened winter annual that has an aggregated distribution pattern and exhibits dramatic interannual population fluctuations. Using the simulation program SAMPLE, we evaluated five candidate sampling designs appropriate for rare populations, based on 4 years of field data: (1) simple random sampling, (2) adaptive simple random sampling, (3) grid-based systematic sampling, (4) adaptive grid-based systematic sampling, and (5) GIS-based adaptive sampling. We compared the designs based on the precision of density estimates for fixed sample size, cost, and distance traveled. Sampling fraction and cost were the most important factors determining precision of density estimates, and relative design performance changed across the range of sampling fractions. Adaptive designs did not provide uniformly more precise estimates than conventional designs, in part because the spatial distribution of L. filiformis was relatively widespread within the study site. Adaptive designs tended to perform better as sampling fraction increased and when sampling costs, particularly distance traveled, were taken into account. The rate at which units occupied by L. filiformis were encountered was higher for adaptive than for conventional designs. Overall, grid-based systematic designs were more efficient and more practical to implement than the others. © 2008 The Society of Population Ecology and Springer.
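
    A toy version of such a design comparison (not the SAMPLE program): simulate an aggregated population over a line of plots and compare the sampling variability of density estimates under simple random and systematic selection (all population parameters are invented):

        import numpy as np

        rng = np.random.default_rng(3)
        N, n, reps = 400, 40, 2000

        # Aggregated population: counts concentrated in a few clustered plots.
        counts = np.zeros(N)
        for c in rng.choice(N, 8, replace=False):        # 8 cluster centres
            span = np.arange(max(c - 5, 0), min(c + 5, N))
            counts[span] += rng.poisson(20, span.size)

        def srs_mean():
            """Density estimate from a simple random sample of n plots."""
            return counts[rng.choice(N, n, replace=False)].mean()

        def systematic_mean():
            """Density estimate from a systematic sample with random start."""
            k = N // n
            start = rng.integers(k)
            return counts[start::k][:n].mean()

        srs = [srs_mean() for _ in range(reps)]
        sysm = [systematic_mean() for _ in range(reps)]
        print("true mean:    ", counts.mean())
        print("SRS        sd:", np.std(srs))
        print("systematic sd:", np.std(sysm))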

  6. Normalization Approaches for Removing Systematic Biases Associated with Mass Spectrometry and Label-Free Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callister, Stephen J.; Barry, Richard C.; Adkins, Joshua N.

    2006-02-01

    Central tendency, linear regression, locally weighted regression, and quantile techniques were investigated for normalization of peptide abundance measurements obtained from high-throughput liquid chromatography-Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR MS). Arbitrary abundances of peptides were obtained from three sample sets, including a standard protein sample, two Deinococcus radiodurans samples taken from different growth phases, and two mouse striatum samples from control and methamphetamine-stressed mice (strain C57BL/6). The selected normalization techniques were evaluated in both the absence and presence of biological variability by estimating extraneous variability prior to and following normalization. Prior to normalization, replicate runs from each sample set were observed to be statistically different, while following normalization replicate runs were no longer statistically different. Although all techniques reduced systematic bias, assigned ranks among the techniques revealed significant trends. For most LC-FTICR MS analyses, linear regression normalization ranked either first or second among the four techniques, suggesting that this technique was more generally suitable for reducing systematic biases.
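
    A sketch of the two simplest techniques compared above, central tendency (median) and linear regression normalization, on synthetic log2 abundances from two hypothetical runs:

        import numpy as np

        rng = np.random.default_rng(4)
        truth = rng.normal(20, 2, 500)                        # 'true' log2 abundances
        run_a = truth + rng.normal(0, 0.2, 500)               # baseline run
        run_b = 1.05 * truth + 0.8 + rng.normal(0, 0.2, 500)  # biased run

        # Central tendency normalization: align medians to the baseline run.
        b_median = run_b - (np.median(run_b) - np.median(run_a))

        # Linear regression normalization: regress run_b on run_a, invert the fit.
        slope, intercept = np.polyfit(run_a, run_b, 1)
        b_regress = (run_b - intercept) / slope

        # Mean absolute deviation from the baseline, before and after.
        for name, vals in [("raw", run_b), ("median", b_median),
                           ("regression", b_regress)]:
            print(name, np.mean(np.abs(vals - run_a)).round(3))

    Median normalization removes only the offset; regression normalization also removes the multiplicative component, which is why it tends to rank higher in the comparison above.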

  7. How hot? Systematic convergence of the replica exchange method using multiple reservoirs.

    PubMed

    Ruscio, Jory Z; Fawzi, Nicolas L; Head-Gordon, Teresa

    2010-02-01

    We have devised a systematic approach to converge a replica exchange molecular dynamics simulation by dividing the full temperature range into a series of higher temperature reservoirs and a finite number of lower temperature subreplicas. A defined highest temperature reservoir of equilibrium conformations is used to help converge a lower but still hot temperature subreplica, which in turn serves as the high-temperature reservoir for the next set of lower temperature subreplicas. The process is continued until an optimal temperature reservoir is reached to converge the simulation at the target temperature. This gradual convergence of subreplicas allows for better and faster convergence at the temperature of interest and all intermediate temperatures for thermodynamic analysis, as well as optimizing the use of multiple processors. We illustrate the overall effectiveness of our multiple reservoir replica exchange strategy by comparing sampling and computational efficiency with respect to replica exchange, as well as comparing methods when converging the structural ensemble of the disordered Abeta(21-30) peptide simulated with explicit water by comparing calculated Rotating Overhauser Effect Spectroscopy intensities to experimentally measured values. Copyright 2009 Wiley Periodicals, Inc.
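
    For orientation, the exchange step underlying any replica exchange scheme is the standard Metropolis swap between two temperatures; a sketch with illustrative energies (not the paper's reservoir bookkeeping):

        import numpy as np

        rng = np.random.default_rng(5)

        def swap_accepted(E_i, E_j, T_i, T_j, k_B=1.0):
            """Metropolis criterion for exchanging configurations between
            temperatures: accept with prob min(1, exp(dbeta * dE)), where
            dbeta = beta_i - beta_j and dE = E_i - E_j."""
            dbeta = 1.0 / (k_B * T_i) - 1.0 / (k_B * T_j)
            dE = E_i - E_j
            return rng.random() < min(1.0, np.exp(dbeta * dE))

        # A hot reservoir replica at T=600 offering a configuration to T=300:
        print(swap_accepted(E_i=-120.0, E_j=-95.0, T_i=300.0, T_j=600.0))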

  8. Celiac disease in Iran: a systematic review and meta-analysis

    PubMed Central

    Mohammadibakhsh, Roghayeh; Sohrabi, Rahim; Salemi, Morteza; Mirghaed, Masood Taheri; Behzadifar, Masoud

    2017-01-01

    Introduction Celiac disease (CD) is a chronic autoimmune-mediated disorder with both intestinal and systemic manifestations. The aim of this study was to determine the prevalence of celiac disease in Iran. Methods We conducted a systematic search of Embase, PubMed, Web of Science, Google Scholar, MagIran, the Scientific Information Database (SID) and Iranmedex from 2003 through to November 2015. The DerSimonian-Laird (DL) random-effects method, with a 95% confidence interval, was employed to estimate the overall pooled prevalence. Heterogeneity was investigated using subgroup analysis based on sample size and time of study. Results Sixty-three studies with 36,833 participants met the inclusion criteria for analysis. The overall prevalence of celiac disease in the 63 studies that had used serological tests for diagnosis was 3% (95% CI: 0.03–0.03), and the overall prevalence in studies that had used the biopsy method for diagnosis was 2% (95% CI: 0.01–0.02). Conclusion The prevalence of celiac disease in Iran was similar to, or even higher than, that reported worldwide. PMID:28461861
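
    A sketch of the DerSimonian-Laird pooling step on invented per-study counts, working on the logit scale (a common choice; the review's exact transformation is not stated above):

        import numpy as np

        # Hypothetical study data: cases and sample sizes.
        cases = np.array([30, 12, 55, 20, 9])
        n = np.array([900, 400, 2100, 800, 300])

        p = cases / n
        yi = np.log(p / (1 - p))            # logit-transformed prevalence
        vi = 1 / cases + 1 / (n - cases)    # approximate variance of the logit

        # DerSimonian-Laird estimate of between-study variance tau^2.
        w = 1 / vi
        y_fixed = np.sum(w * yi) / np.sum(w)
        Q = np.sum(w * (yi - y_fixed) ** 2)
        df = len(yi) - 1
        C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - df) / C)

        # Random-effects pooled logit and 95% CI, back-transformed to a prevalence.
        w_star = 1 / (vi + tau2)
        y_re = np.sum(w_star * yi) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        inv = lambda t: 1 / (1 + np.exp(-t))
        lo, hi = y_re - 1.96 * se, y_re + 1.96 * se
        print(f"pooled prevalence {inv(y_re):.3f} (95% CI {inv(lo):.3f}-{inv(hi):.3f})")
        print(f"I^2 = {max(0, (Q - df) / Q):.1%}")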

  9. Meta-analysis, complexity, and heterogeneity: a qualitative interview study of researchers' methodological values and practices.

    PubMed

    Lorenc, Theo; Felix, Lambert; Petticrew, Mark; Melendez-Torres, G J; Thomas, James; Thomas, Sian; O'Mara-Eves, Alison; Richardson, Michelle

    2016-11-16

    Complex or heterogeneous data pose challenges for systematic review and meta-analysis. In recent years, a number of new methods have been developed to meet these challenges. This qualitative interview study aimed to understand researchers' understanding of complexity and heterogeneity and the factors which may influence the choices researchers make in synthesising complex data. We conducted interviews with a purposive sample of researchers (N = 19) working in systematic review or meta-analysis across a range of disciplines. We analysed data thematically using a framework approach. Participants reported using a broader range of methods and data types in complex reviews than in traditional reviews. A range of techniques are used to explore heterogeneity, but there is some debate about their validity, particularly when applied post hoc. Technical considerations of how to synthesise complex evidence cannot be isolated from questions of the goals and contexts of research. However, decisions about how to analyse data appear to be made in a largely informal way, drawing on tacit expertise, and their relation to these broader questions remains unclear.

  10. Mission Drift in Qualitative Research, or Moving Toward a Systematic Review of Qualitative Studies, Moving Back to a More Systematic Narrative Review

    ERIC Educational Resources Information Center

    Jones, Kip

    2004-01-01

    The paper argues that the systematic review of qualitative research is best served by reliance upon qualitative methods themselves. A case is made for strengthening the narrative literature review and using narrative itself as a method of review. A technique is proposed that builds upon recent developments in qualitative systematic review by the…

  11. Resonance Rayleigh scattering method for determination of 2-mercaptobenzothiazole using gold nanoparticles probe.

    PubMed

    Parham, Hooshang; Pourreza, Nahid; Marahel, Farzaneh

    2015-01-01

    A sensitive, simple and novel method was developed to determine 2-mercaptobenzothiazole (2MBT) in water samples. This method was based on the interaction between gold nanoparticles (AuNPs) and 2MBT, followed by an increase in the resonance Rayleigh scattering (RRS) intensity of the nanoparticles. The change in RRS intensity (ΔIRRS) was linearly correlated to the concentration of 2MBT over the ranges of 5.0-100.0 and 100.0-300.0 μg L(-1). 2MBT can be measured in a short time (5 min) without any complicated or time-consuming sample pretreatment. Parameters that affect the RRS intensities, such as pH, concentration of AuNPs, standing time, electrolyte concentration, and coexisting substances, were systematically investigated and optimized. Interference tests showed that the developed method has very good selectivity and could be used conveniently for the determination of 2MBT. The limit of detection (LOD) and limit of quantification (LOQ) were 1.0 and 3.0 μg L(-1), respectively. Relative standard deviations (RSD) for 20.0 and 80.0 μg L(-1) of 2MBT were 1.1 and 2.3, respectively. Possible mechanisms for the RRS changes of AuNPs in the presence of 2MBT are discussed, and the method was successfully applied to the analysis of real water samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Sampling for mercury at subnanogram per litre concentrations for load estimation in rivers

    USGS Publications Warehouse

    Colman, J.A.; Breault, R.F.

    2000-01-01

    Estimation of constituent loads in streams requires collection of stream samples that are representative of constituent concentrations, that is, composites of isokinetic multiple verticals collected along a stream transect. An all-Teflon isokinetic sampler (DH-81) cleaned in 75 °C, 4 N HCl was tested using blank, split, and replicate samples to assess systematic and random sample contamination by mercury species. Mean mercury concentrations in field-equipment blanks were low: 0.135 ng L-1 for total mercury (ΣHg) and 0.0086 ng L-1 for monomethyl mercury (MeHg). Mean square errors (MSE) for ΣHg and MeHg duplicate samples collected at eight sampling stations were not statistically different from the MSE of samples split in the laboratory, which represent the analytical and splitting error. Low field-blank concentrations and statistically equal duplicate- and split-sample MSE values indicate that no measurable contamination was occurring during sampling. Standard deviations associated with example mercury load estimations were four to five times larger, on a relative basis, than standard deviations calculated from duplicate samples, indicating that error of the load determination was primarily a function of the loading model used, not of the sampling or analytical methods.

  13. Measuring attitudes towards the dying process: A systematic review of tools.

    PubMed

    Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond

    2018-04-01

    At the end of life, anxious attitudes concerning the dying process are common in patients in Palliative Care. Measurement tools can identify vulnerabilities, resources and the need for subsequent treatment to relieve suffering and support well-being. To systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement and the methodological quality, including generalizability to different contexts. Systematic review according to the PRISMA Statement. Methodological quality of tools was assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 were alternative measurement methods (e.g. projective measures). In 34 of the 37 questionnaires, the emotional evaluation (e.g. anxiety) towards dying is measured. Dying is operationalized in general items (n = 20), in several specific aspects of dying (n = 34) and as the dying of others (n = 14). Methodological quality of tools was reported inconsistently. Nine tools reported good internal consistency. Of the 37 tools, 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as a decision framework on which tool to apply in which contexts. For clinical application, only few tools were available. Further validation of existing tools and potential alternative methods in various populations is needed.

  14. Enzymatic electrochemical detection coupled to multivariate calibration for the determination of phenolic compounds in environmental samples.

    PubMed

    Hernandez, Silvia R; Kergaravat, Silvina V; Pividori, Maria Isabel

    2013-03-15

    An approach based on the electrochemical detection of the horseradish peroxidase enzymatic reaction by means of square wave voltammetry was developed for the determination of phenolic compounds in environmental samples. First, a systematic optimization procedure for three factors involved in the enzymatic reaction was carried out using response surface methodology through a central composite design. Second, the enzymatic electrochemical detection coupled with a multivariate calibration method based on the partial least-squares technique was optimized for the determination of a mixture of five phenolic compounds, i.e. phenol, p-aminophenol, p-chlorophenol, hydroquinone and pyrocatechol. The calibration and validation sets were built and assessed. In the calibration model, the LODs for the phenolic compounds ranged from 0.6 to 1.4 × 10(-6) mol L(-1). Recoveries for prediction samples were higher than 85%. These compounds were analyzed simultaneously in spiked samples and in water samples collected close to tanneries and landfills. Published by Elsevier B.V.
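
    A sketch of the partial least-squares calibration step using scikit-learn, on synthetic overlapping voltammetric peaks (all peak positions, widths and noise levels are invented):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        n_samples, n_potentials, n_analytes = 120, 200, 5

        # Synthetic mixtures: random concentrations, overlapping Gaussian
        # peaks standing in for the five phenolic compounds, plus noise.
        conc = rng.uniform(0, 1, (n_samples, n_analytes))
        grid = np.linspace(0, 1, n_potentials)
        peaks = np.stack([np.exp(-((grid - c) ** 2) / 0.004)
                          for c in [0.2, 0.35, 0.5, 0.65, 0.8]])
        X = conc @ peaks + rng.normal(0, 0.02, (n_samples, n_potentials))

        # Calibration and validation sets, then a multi-output PLS model.
        X_cal, X_val, y_cal, y_val = train_test_split(X, conc, random_state=0)
        pls = PLSRegression(n_components=6).fit(X_cal, y_cal)
        pred = pls.predict(X_val)
        rmsep = np.sqrt(((pred - y_val) ** 2).mean(axis=0))
        print("RMSEP per analyte:", rmsep.round(4))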

  15. Comparison of experimental methods for estimating matrix diffusion coefficients for contaminant transport modeling

    DOE PAGES

    Telfeyan, Katherine Christina; Ware, Stuart Doug; Reimus, Paul William; ...

    2018-01-31

    Here, diffusion cell and diffusion wafer experiments were conducted to compare methods for estimating effective matrix diffusion coefficients in rock core samples from Pahute Mesa at the Nevada Nuclear Security Site (NNSS). A diffusion wafer method, in which a solute diffuses out of a rock matrix that is pre-saturated with water containing the solute, is presented as a simpler alternative to the traditional through-diffusion (diffusion cell) method. Both methods yielded estimates of effective matrix diffusion coefficients that were within the range of values previously reported for NNSS volcanic rocks. The difference between the estimates of the two methods ranged from 14 to 30%, and there was no systematic high or low bias of one method relative to the other. From a transport modeling perspective, these differences are relatively minor when one considers that other variables (e.g., fracture apertures, fracture spacings) influence matrix diffusion to a greater degree and tend to have greater uncertainty than effective matrix diffusion coefficients. For the same relative random errors in concentration measurements, the diffusion cell method yields effective matrix diffusion coefficient estimates that have less uncertainty than the wafer method. However, the wafer method is easier and less costly to implement and yields estimates more quickly, thus allowing a greater number of samples to be analyzed for the same cost and time. Given the relatively good agreement between the methods, and the lack of any apparent bias between the methods, the diffusion wafer method appears to offer advantages over the diffusion cell method if better statistical representation of a given set of rock samples is desired.

  16. Comparison of experimental methods for estimating matrix diffusion coefficients for contaminant transport modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Telfeyan, Katherine Christina; Ware, Stuart Doug; Reimus, Paul William

    Here, diffusion cell and diffusion wafer experiments were conducted to compare methods for estimating effective matrix diffusion coefficients in rock core samples from Pahute Mesa at the Nevada Nuclear Security Site (NNSS). A diffusion wafer method, in which a solute diffuses out of a rock matrix that is pre-saturated with water containing the solute, is presented as a simpler alternative to the traditional through-diffusion (diffusion cell) method. Both methods yielded estimates of effective matrix diffusion coefficients that were within the range of values previously reported for NNSS volcanic rocks. The difference between the estimates of the two methods ranged from 14 to 30%, and there was no systematic high or low bias of one method relative to the other. From a transport modeling perspective, these differences are relatively minor when one considers that other variables (e.g., fracture apertures, fracture spacings) influence matrix diffusion to a greater degree and tend to have greater uncertainty than effective matrix diffusion coefficients. For the same relative random errors in concentration measurements, the diffusion cell method yields effective matrix diffusion coefficient estimates that have less uncertainty than the wafer method. However, the wafer method is easier and less costly to implement and yields estimates more quickly, thus allowing a greater number of samples to be analyzed for the same cost and time. Given the relatively good agreement between the methods, and the lack of any apparent bias between the methods, the diffusion wafer method appears to offer advantages over the diffusion cell method if better statistical representation of a given set of rock samples is desired.

  17. Designing dipolar recoupling and decoupling experiments for biological solid-state NMR using interleaved continuous wave and RF pulse irradiation.

    PubMed

    Bjerring, Morten; Jain, Sheetal; Paaske, Berit; Vinther, Joachim M; Nielsen, Niels Chr

    2013-09-17

    Rapid developments in solid-state NMR methodology have boosted this technique into a highly versatile tool for structural biology. The invention of increasingly advanced rf pulse sequences that take advantage of better hardware and sample preparation have played an important part in these advances. In the development of these new pulse sequences, researchers have taken advantage of analytical tools, such as average Hamiltonian theory or lately numerical methods based on optimal control theory. In this Account, we focus on the interplay between these strategies in the systematic development of simple pulse sequences that combines continuous wave (CW) irradiation with short pulses to obtain improved rf pulse, recoupling, sampling, and decoupling performance. Our initial work on this problem focused on the challenges associated with the increasing use of fully or partly deuterated proteins to obtain high-resolution, liquid-state-like solid-state NMR spectra. Here we exploit the overwhelming presence of (2)H in such samples as a source of polarization and to gain structural information. The (2)H nuclei possess dominant quadrupolar couplings which complicate even the simplest operations, such as rf pulses and polarization transfer to surrounding nuclei. Using optimal control and easy analytical adaptations, we demonstrate that a series of rotor synchronized short pulses may form the basis for essentially ideal rf pulse performance. Using similar approaches, we design (2)H to (13)C polarization transfer experiments that increase the efficiency by one order of magnitude over standard cross polarization experiments. We demonstrate how we can translate advanced optimal control waveforms into simple interleaved CW and rf pulse methods that form a new cross polarization experiment. This experiment significantly improves (1)H-(15)N and (15)N-(13)C transfers, which are key elements in the vast majority of biological solid-state NMR experiments. In addition, we demonstrate how interleaved sampling of spectra exploiting polarization from (1)H and (2)H nuclei can substantially enhance the sensitivity of such experiments. Finally, we present systematic development of (1)H decoupling methods where CW irradiation of moderate amplitude is interleaved with strong rotor-synchronized refocusing pulses. We show that these sequences remove residual cross terms between dipolar coupling and chemical shielding anisotropy more effectively and improve the spectral resolution over that observed in current state-of-the-art methods.

  18. Quantification of HTLV-1 Clonality and TCR Diversity

    PubMed Central

    Laydon, Daniel J.; Melamed, Anat; Sim, Aaron; Gillet, Nicolas A.; Sim, Kathleen; Darko, Sam; Kroll, J. Simon; Douek, Daniel C.; Price, David A.; Bangham, Charles R. M.; Asquith, Becca

    2014-01-01

    Estimation of immunological and microbiological diversity is vital to our understanding of infection and the immune response. For instance, what is the diversity of the T cell repertoire? Such questions are partially addressed by high-throughput sequencing techniques that enable identification of immunological and microbiological “species” in a sample. Estimators of the number of unseen species are needed to estimate population diversity from sample diversity. Here we test five widely used non-parametric estimators, and develop and validate a novel method, DivE, to estimate species richness and distribution. We used three independent datasets: (i) viral populations from subjects infected with human T-lymphotropic virus type 1; (ii) T cell antigen receptor clonotype repertoires; and (iii) microbial data from infant faecal samples. When applied to datasets with rarefaction curves that did not plateau, existing estimators produced estimates that systematically increased with sample size. In contrast, DivE consistently and accurately estimated diversity for all datasets. We identify conditions that limit the application of DivE. We also show that DivE can be used to accurately estimate the underlying population frequency distribution. We have developed a novel method that is significantly more accurate than commonly used biodiversity estimators in microbiological and immunological populations. PMID:24945836
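
    For context, the sketch below implements one member of the class of widely used non-parametric estimators referred to above, the bias-corrected Chao1 lower bound on species richness, and applies it to nested subsamples of a synthetic skewed "clone" population; on data whose rarefaction curve has not plateaued, the estimate typically keeps growing with sample size, which is the failure mode the abstract reports. The synthetic population and all parameters are invented for illustration; DivE itself (which fits models to rarefaction curves) is not reproduced here.

```python
# Bias-corrected Chao1 richness estimator applied to nested subsamples of a
# synthetic skewed population (illustration only; not the DivE method).
import numpy as np

def chao1(counts):
    """Bias-corrected Chao1 richness estimate from per-species abundance counts."""
    counts = np.asarray(counts)
    s_obs = np.count_nonzero(counts)           # observed species
    f1 = np.sum(counts == 1)                   # singletons
    f2 = np.sum(counts == 2)                   # doubletons
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

# Skewed synthetic population: 5000 "species" with geometric clone sizes
rng = np.random.default_rng(1)
population = np.repeat(np.arange(5000), rng.geometric(0.2, size=5000))
for n in (1000, 5000, 20000):                  # nested subsample sizes
    sample = rng.choice(population, size=n, replace=False)
    counts = np.unique(sample, return_counts=True)[1]
    print(f"sample size {n:>6}: Chao1 estimate {chao1(counts):.0f}")
```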

  19. A Hybrid Monte Carlo importance sampling of rare events in Turbulence and in Turbulent Models

    NASA Astrophysics Data System (ADS)

    Margazoglou, Georgios; Biferale, Luca; Grauer, Rainer; Jansen, Karl; Mesterhazy, David; Rosenow, Tillmann; Tripiccione, Raffaele

    2017-11-01

    Extreme and rare events are a challenging topic in the field of turbulence. Investigating such instances with traditional numerical tools is notoriously difficult, as these tools fail to sample the fluctuations around them systematically. We propose instead that an importance sampling Monte Carlo method can selectively highlight extreme events in remote areas of phase space and induce their occurrence. We present a new computational approach, based on the path integral formulation of stochastic dynamics, and employ an accelerated Hybrid Monte Carlo (HMC) algorithm for this purpose. Through the paradigm of the stochastic one-dimensional Burgers equation, subjected to random noise that is white in time and power-law correlated in Fourier space, we prove our concept and benchmark our results against standard CFD methods. Furthermore, we present our first results on constrained sampling around saddle-point instanton configurations (optimal fluctuations). The research leading to these results has received funding from the EU Horizon 2020 research and innovation programme under Grant Agreement No. 642069, and from the EU Seventh Framework Programme (FP7/2007-2013) under ERC Grant Agreement No. 339032.
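
    The algorithmic core here is standard Hamiltonian/Hybrid Monte Carlo. The sketch below shows it on a toy one-dimensional double-well "action", where ordinary local sampling struggles to cross between the wells, much as direct simulation rarely visits extreme turbulent events. In the paper the sampled state is an entire space-time noise history of the stochastic Burgers equation; the quartic action, step size, and trajectory length below are illustrative assumptions only.

```python
# Minimal Hybrid (Hamiltonian) Monte Carlo sketch on a toy double-well action
# S(x) = (x^2 - 1)^2 / (2*eps). Leapfrog dynamics plus a Metropolis test.
import numpy as np

EPS = 0.1  # smaller eps -> weight concentrates near x = +-1, rarer crossings

def action(x):
    return np.sum((x**2 - 1.0)**2) / (2 * EPS)

def grad_action(x):
    return 2.0 * x * (x**2 - 1.0) / EPS

def hmc_step(x, rng, step=0.05, n_leap=25):
    """One HMC update: momentum refresh, leapfrog trajectory, Metropolis test."""
    p = rng.standard_normal(x.shape)
    h_old = action(x) + 0.5 * np.dot(p, p)
    x_new, p_new = x.copy(), p - 0.5 * step * grad_action(x)  # half kick
    for i in range(n_leap):
        x_new = x_new + step * p_new                          # drift
        kick = step if i < n_leap - 1 else 0.5 * step         # final half kick
        p_new = p_new - kick * grad_action(x_new)
    h_new = action(x_new) + 0.5 * np.dot(p_new, p_new)
    accept = rng.random() < np.exp(min(0.0, h_old - h_new))   # detailed balance
    return (x_new if accept else x), accept

rng = np.random.default_rng(0)
x, n_acc, traj = np.zeros(1), 0, []
for _ in range(5000):
    x, acc = hmc_step(x, rng)
    n_acc += acc
    traj.append(x[0])
print(f"acceptance: {n_acc / 5000:.2f}, mean |x|: {np.mean(np.abs(traj[500:])):.2f}")
```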

  20. A mixed-methods approach to systematic reviews.

    PubMed

    Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela

    2015-09-01

    There are an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely to become increasingly difficult for them to identify 'what to do' if they must find and digest a plethora of syntheses related to a particular topic. Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners. On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis, as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. The Joanna Briggs Institute's mixed-methods synthesis then uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes and pool these with the findings of the initial qualitative synthesis.
