Sample records for adequate sample sizes

  1. Methodological reporting of randomized clinical trials in respiratory research in 2010.

    PubMed

    Lu, Yi; Yao, Qiuju; Gu, Jie; Shen, Ce

    2013-09-01

    Although randomized controlled trials (RCTs) are considered the highest level of evidence, they are also subject to bias, due to a lack of adequately reported randomization, and therefore the reporting should be as explicit as possible for readers to determine the significance of the contents. We evaluated the methodological quality of RCTs in respiratory research in high-ranking clinical journals, published in 2010. We assessed the methodological quality, including generation of the allocation sequence, allocation concealment, double-blinding, sample-size calculation, intention-to-treat analysis, flow diagrams, number of medical centers involved, diseases, funding sources, types of interventions, trial registration, number of times the papers have been cited, journal impact factor, journal type, and journal endorsement of the CONSORT (Consolidated Standards of Reporting Trials) rules, in RCTs published in 12 top-ranking clinical respiratory journals and 5 top-ranking general medical journals. We included 176 trials, of which 93 (53%) reported adequate generation of the allocation sequence, 66 (38%) reported adequate allocation concealment, 79 (45%) were double-blind, 123 (70%) reported adequate sample-size calculation, 88 (50%) reported intention-to-treat analysis, and 122 (69%) included a flow diagram. Multivariate logistic regression analysis revealed that a journal impact factor ≥ 5 was the only variable that significantly influenced adequate allocation sequence generation. Trial registration and journal impact factor ≥ 5 significantly influenced adequate allocation concealment. Medical interventions, trial registration, and journal endorsement of the CONSORT statement influenced adequate double-blinding. Publication in one of the general medical journals influenced adequate sample-size calculation. The methodological quality of RCTs in respiratory research needs improvement. Stricter enforcement of the CONSORT statement should enhance the quality of RCTs.

  2. How large a training set is needed to develop a classifier for microarray data?

    PubMed

    Dobbin, Kevin K; Zhao, Yingdong; Simon, Richard M

    2008-01-01

    A common goal of gene expression microarray studies is the development of a classifier that can be used to divide patients into groups with different prognoses, or with different expected responses to a therapy. These types of classifiers are developed on a training set, which is the set of samples used to train a classifier. The question of how many samples are needed in the training set to produce a good classifier from high-dimensional microarray data is challenging. We present a model-based approach to determining the sample size required to adequately train a classifier. It is shown that sample size can be determined from three quantities: standardized fold change, class prevalence, and number of genes or features on the arrays. Numerous examples and important experimental design issues are discussed. The method is adapted to address ex post facto determination of whether the size of a training set used to develop a classifier was adequate. An interactive web site for performing the sample size calculations is provided. We showed that sample size calculations for classifier development from high-dimensional microarray data are feasible, discussed numerous important considerations, and presented examples.
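
    The abstract's three design quantities lend themselves to a quick simulation check. The sketch below (Python) is not the authors' model-based method or their web calculator: it trains a nearest-centroid classifier, a common choice for microarray data, on simulated arrays and tracks test accuracy as the training set grows; all gene counts, effect sizes, and prevalences are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_accuracy(n_train, n_genes=5000, n_informative=20,
                          fold_change=1.0, prevalence=0.5, n_test=500):
        """Test accuracy of a nearest-centroid classifier for a given training size."""
        mu = np.zeros(n_genes)
        mu[:n_informative] = fold_change          # standardized mean shift per gene

        def draw(n):
            y = rng.random(n) < prevalence        # class labels by prevalence
            x = rng.standard_normal((n, n_genes)) + np.outer(y, mu)
            return x, y

        x_tr, y_tr = draw(n_train)
        if y_tr.all() or not y_tr.any():          # need both classes to train
            return np.nan
        c1, c0 = x_tr[y_tr].mean(axis=0), x_tr[~y_tr].mean(axis=0)
        x_te, y_te = draw(n_test)
        pred = ((x_te - c0) ** 2).sum(axis=1) > ((x_te - c1) ** 2).sum(axis=1)
        return (pred == y_te).mean()

    for n in (10, 20, 40, 80, 160):
        acc = np.nanmean([simulate_accuracy(n) for _ in range(20)])
        print(f"n_train={n:4d}  mean test accuracy={acc:.3f}")
    ```

    Accuracy rises with training size and levels off, which is the trade-off the paper's formula quantifies analytically.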

  3. Methodological reporting quality of randomized controlled trials: A survey of seven core journals of orthopaedics from Mainland China over 5 years following the CONSORT statement.

    PubMed

    Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J

    2016-11-01

    In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. We therefore performed a survey of RCTs to assess: (1) the quality of RCTs in the field of orthopaedics in Mainland China; and (2) whether there is a difference between the core journals of the Chinese department of orthopaedics and Orthopaedics & Traumatology: Surgery & Research (OTSR). This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization", or "randomized" were employed to describe the allocation method. Trials including animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without outcomes were excluded. In addition, eight articles selected from OTSR between 2010 and 2014 were included in this study for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) checklist, including sample size calculation, allocation sequence generation, allocation concealment, blinding, and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trials reported adequate sample size calculation, 74 (33.4%) reported adequate allocation generation, 8 (3.7%) reported adequate allocation concealment, 18 (8.1%) reported adequate blinding, and 16 (7.2%) reported handling of dropouts. In OTSR, 1 (12.5%) trial reported adequate sample size calculation, 4 (50.0%) reported adequate allocation generation, 1 (12.5%) reported adequate allocation concealment, 2 (25.0%) reported adequate blinding, and 5 (62.5%) reported handling of dropouts. There were statistically significant differences in sample size calculation and handling of dropouts between papers from Mainland China and OTSR (P<0.05). The findings of this study show that the methodological reporting quality of RCTs in the seven core orthopaedic journals from Mainland China is far from satisfactory and needs further improvement to meet the standards of the CONSORT statement. Level of evidence: III, case-control study. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  4. STREAMBED PARTICLE SIZE FROM PEBBLE COUNTS USING VISUALLY ESTIMATED SIZE CLASSES: JUNK OR USEFUL DATA?

    EPA Science Inventory

    In large-scale studies, it is often neither feasible nor necessary to obtain the large samples of 400 particles advocated by many geomorphologists to adequately quantify streambed surface particle-size distributions. Synoptic surveys such as U.S. Environmental Protection Agency...

  5. Capturing heterogeneity: The role of a study area's extent for estimating mean throughfall

    NASA Astrophysics Data System (ADS)

    Zimmermann, Alexander; Voss, Sebastian; Metzger, Johanna Clara; Hildebrandt, Anke; Zimmermann, Beate

    2016-11-01

    The selection of an appropriate spatial extent of a sampling plot is one among several important decisions involved in planning a throughfall sampling scheme. In fact, the choice of the extent may determine whether or not a study can adequately characterize the hydrological fluxes of the studied ecosystem. Previous attempts to optimize throughfall sampling schemes focused on the selection of an appropriate sample size, support, and sampling design, while comparatively little attention has been given to the role of the extent. In this contribution, we investigated the influence of the extent on the representativeness of mean throughfall estimates for three forest ecosystems of varying stand structure. Our study is based on virtual sampling of simulated throughfall fields. We derived these fields from throughfall data sampled in a simply structured forest (young tropical forest) and two heterogeneous forests (old tropical forest, unmanaged mixed European beech forest). We then sampled the simulated throughfall fields with three common extents and various sample sizes for a range of events and for accumulated data. Our findings suggest that the size of the study area should be carefully adapted to the complexity of the system under study and to the required temporal resolution of the throughfall data (i.e. event-based versus accumulated). Generally, event-based sampling in complex structured forests (conditions that favor comparatively long autocorrelations in throughfall) requires the largest extents. For event-based sampling, the choice of an appropriate extent can be as important as using an adequate sample size.

  6. Sampling intraspecific variability in leaf functional traits: Practical suggestions to maximize collected information.

    PubMed

    Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni

    2017-12-01

    The choice of the best sampling strategy to capture mean values of functional traits for a species/population, while maintaining information about the traits' variability and minimizing the sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITV_BI) and among populations (ITV_POP), relatively few studies have analyzed intraspecific variability within individuals (ITV_WI). Here, we provide an analysis of ITV_WI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITV_WI level of variation between the two traits and provided the minimum and optimal sampling sizes needed to take ITV_WI into account, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance in the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy can significantly affect trait variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analyses involving different traits.

  7. Linear Combinations of Multiple Outcome Measures to Improve the Power of Efficacy Analysis: Application to Clinical Trials on Early Stage Alzheimer Disease

    PubMed Central

    Xiong, Chengjie; Luo, Jingqin; Morris, John C; Bateman, Randall

    2018-01-01

    Modern clinical trials on Alzheimer disease (AD) focus on the early symptomatic stage or even the preclinical stage. Subtle disease progression at the early stages, however, poses a major challenge in designing such clinical trials. We propose a multivariate mixed model on repeated measures to model the disease progression over time on multiple efficacy outcomes, and derive the optimum weights to combine multiple outcome measures by minimizing the sample sizes to adequately power the clinical trials. A cross-validation simulation study is conducted to assess the accuracy for the estimated weights as well as the improvement in reducing the sample sizes for such trials. The proposed methodology is applied to the multiple cognitive tests from the ongoing observational study of the Dominantly Inherited Alzheimer Network (DIAN) to power future clinical trials in the DIAN with a cognitive endpoint. Our results show that the optimum weights to combine multiple outcome measures can be accurately estimated, and that compared to the individual outcomes, the combined efficacy outcome with these weights significantly reduces the sample size required to adequately power clinical trials. When applied to the clinical trial in the DIAN, the estimated linear combination of six cognitive tests can adequately power the clinical trial. PMID:29546251
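
    For a single assessment time, the classical result behind this kind of weight optimization is that the standardized effect of a linear combination, w'mu / sqrt(w' Sigma w), is maximized by w proportional to inv(Sigma) mu, and the required sample size scales as 1/effect^2. The sketch below (Python) illustrates that calculation with invented effects and correlations; the paper's actual weights come from a multivariate mixed model for repeated measures, which is not reproduced here.

    ```python
    import numpy as np

    mu = np.array([0.20, 0.15, 0.10])       # assumed per-outcome treatment effects
    sigma = np.array([[1.0, 0.5, 0.3],      # assumed covariance across outcomes
                      [0.5, 1.0, 0.4],
                      [0.3, 0.4, 1.0]])

    w = np.linalg.solve(sigma, mu)          # power-maximizing direction
    w /= np.abs(w).sum()                    # scale is arbitrary; normalize for display

    def std_effect(weights):
        return weights @ mu / np.sqrt(weights @ sigma @ weights)

    d_comb = std_effect(w)
    d_single = max(mu[i] / np.sqrt(sigma[i, i]) for i in range(len(mu)))
    print(f"weights = {np.round(w, 3)}")
    print(f"combined d = {d_comb:.3f} vs best single d = {d_single:.3f}; "
          f"n(combined)/n(single) = {(d_single / d_comb) ** 2:.2f}")
    ```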

  8. Sample Size in Qualitative Interview Studies: Guided by Information Power.

    PubMed

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit

    2015-11-27

    Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds that is relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed. © The Author(s) 2015.

  9. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, the probability that a trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by limited sample sizes. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.

  10. A novel sample size formula for the weighted log-rank test under the proportional hazards cure model.

    PubMed

    Xiong, Xiaoping; Wu, Jianrong

    2017-01-01

    The treatment of cancer has progressed dramatically in recent decades, such that it is no longer uncommon to see a cure or long-term survival in a significant proportion of patients with various types of cancer. To adequately account for the cure fraction when designing clinical trials, cure models should be used. In this article, a sample size formula for the weighted log-rank test is derived under the fixed alternative hypothesis for the proportional hazards cure models. Simulations showed that the proposed sample size formula provides an accurate estimation of sample size for designing clinical trials under the proportional hazards cure models. Copyright © 2016 John Wiley & Sons, Ltd.
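
    The cure-model formula itself is not given in the abstract. For orientation, the sketch below (Python) computes the classical Schoenfeld events requirement for the ordinary (unweighted) log-rank test, the non-cure baseline that such formulas generalize; the hazard ratio and event probability are illustrative assumptions, and a cure fraction lowers the event probability, which is exactly why the specialized formula is needed.

    ```python
    import math
    from scipy.stats import norm

    def schoenfeld_events(hr, alpha=0.05, power=0.80, alloc=0.5):
        """Events needed for a two-sided log-rank test at the given hazard ratio."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return z ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

    d = schoenfeld_events(hr=0.7)       # ~247 events for HR = 0.7
    p_event = 0.6                       # assumed overall event probability;
                                        # a cure fraction reduces this, raising n
    print(f"events = {d:.0f}, total n ~ {d / p_event:.0f}")
    ```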

  11. Fish habitat conditions: Using the Northern/Intermountain Regions' inventory procedures for detecting differences on two differently managed watersheds

    Treesearch

    C. Kerry Overton; Michael A. Radko; Rodger L. Nelson

    1993-01-01

    Differences in fish habitat variables between two studied watersheds may be related to differences in land management. In using the R1/R4 Watershed-Scale Fish Habitat Inventory Process, for most habitat variables, evaluations of sample sizes of at least 30 habitat units were adequate. Guidelines will help land managers in determining sample sizes required to detect...

  12. Sampling methods for amphibians in streams in the Pacific Northwest.

    Treesearch

    R. Bruce Bury; Paul Stephen Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  13. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    USDA-ARS?s Scientific Manuscript database

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...

  14. Mixing problems in using indicators for measuring regional blood flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ushioda, E.; Nuwayhid, B.; Tabsh, K.

    A basic requirement for using indicators for measuring blood flow is adequate mixing of the indicator with blood prior to sampling the site. This requirement has been met by depositing the indicator in the heart and sampling from an artery. Recently, authors have injected microspheres into veins and sampled from venous sites. The present studies were designed to investigate the mixing problems in sheep and rabbits by means of Cardio-Green and labeled microspheres. The indicators were injected at different points in the circulatory system, and blood was sampled at different levels of the venous and arterial systems. Results show the following: (a) When an indicator of small molecular size (Cardio-Green) is allowed to pass through the heart chambers, adequate mixing is achieved, yielding accurate and reproducible results. (b) When any indicator (Cardio-Green or microspheres) is injected into veins, and sampling is done at any point in the venous system, mixing is inadequate, yielding flow results which are inconsistent and erratic. (c) For an indicator of large molecular size (microspheres), injecting into the left side of the heart and sampling from arterial sites yields accurate and reproducible results regardless of whether blood is sampled continuously or intermittently.
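
    For background (not stated in the abstract), dye-dilution flow measurement of this kind rests on the Stewart-Hamilton relation:

    ```latex
    % Stewart-Hamilton indicator-dilution relation: injected dose m, downstream
    % concentration-time curve c(t), volumetric flow Q.
    \[
      Q \,=\, \frac{m}{\int_{0}^{\infty} c(t)\, \mathrm{d}t}
    \]
    ```

    Adequate mixing is what licenses reading the sampled c(t) as the flow-weighted mean concentration; the erratic venous-sampling results above are what failure of that assumption looks like.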

  15. Flex-rigid pleuroscopic biopsy with the SB knife Jr is a novel technique for diagnosis of malignant or benign fibrothorax.

    PubMed

    Wang, Xiao-Bo; Yin, Yan; Miao, Yuan; Eberhardt, Ralf; Hou, Gang; Herth, Felix J; Kang, Jian

    2016-11-01

    Diagnosing pleural effusion is challenging, especially in patients with malignant or benign fibrothorax, which is difficult to sample using standard flexible forceps (SFF) via flex-rigid pleuroscopy. An adequate sample is crucial for the differential diagnosis of malignant fibrothorax (malignant pleural mesothelioma, metastatic lung carcinoma, etc.) from benign fibrothorax (benign asbestos pleural disease, tuberculous pleuritis, etc.). Novel biopsy techniques are required in flex-rigid pleuroscopy to improve the sample size and quality. The SB knife Jr, a scissors-type forceps that applies monopolar high-frequency current, was developed to allow convenient and accurate resection of larger lesions during endoscopic submucosal dissection (ESD). Herein, we report two patients with fibrothorax who underwent a pleural biopsy using an SB knife Jr to investigate the potential use of this tool in flex-rigid pleuroscopy when pleural lesions are difficult to biopsy via SFF. The biopsies were successful, with sufficient size and quality for definitive diagnosis. We also successfully performed adhesiolysis with the SB knife Jr in one case, and adequate biopsies were conducted. No complications were observed. Electrosurgical biopsy with the SB knife Jr during flex-rigid pleuroscopy allowed us to obtain adequate samples for the diagnosis of malignant versus benign fibrothorax, which is usually not possible with SFF. The SB knife Jr also demonstrated a potential use for pleuropulmonary adhesions.

  16. Evaluation of species richness estimators based on quantitative performance measures and sensitivity to patchiness and sample grain size

    NASA Astrophysics Data System (ADS)

    Willie, Jacob; Petre, Charles-Albert; Tagg, Nikki; Lens, Luc

    2012-11-01

    Data from forest herbaceous plants in a site of known species richness in Cameroon were used to test the performance of rarefaction and eight species richness estimators (ACE, ICE, Chao1, Chao2, Jack1, Jack2, Bootstrap and MM). Bias, accuracy, precision and sensitivity to patchiness and sample grain size were the evaluation criteria. An evaluation of the effects of sampling effort and patchiness on diversity estimation is also provided. Stems were identified and counted in linear series of 1-m² contiguous square plots distributed in six habitat types. Initially, 500 plots were sampled in each habitat type. The sampling process was monitored using rarefaction and a set of richness estimator curves. Curves from the first dataset suggested adequate sampling in riparian forest only. Additional plots ranging from 523 to 2143 were subsequently added in the undersampled habitats until most of the curves stabilized. Jack1 and ICE, the non-parametric richness estimators, performed better, being more accurate and less sensitive to patchiness and sample grain size, and significantly reducing biases that could not be detected by rarefaction and other estimators. This study confirms the usefulness of non-parametric incidence-based estimators, and recommends Jack1 or ICE alongside rarefaction while describing taxon richness and comparing results across areas sampled using similar or different grain sizes. As patchiness varied across habitat types, accurate estimations of diversity did not require the same number of plots. The number of samples needed to fully capture diversity is not necessarily the same across habitats, and can only be known when taxon sampling curves have indicated adequate sampling. Differences in observed species richness between habitats were generally due to differences in patchiness, except between two habitats where they resulted from differences in abundance. We suggest that communities should first be sampled thoroughly using appropriate taxon sampling curves before explaining differences in diversity.
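
    Two of the estimators compared above are compact enough to state directly. A minimal sketch (Python) using the standard incidence-based formulas, Chao2 = S_obs + Q1²/(2·Q2) and Jack1 = S_obs + Q1·(m-1)/m; the random plots-by-species matrix is a placeholder for real incidence data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Placeholder incidence matrix: 500 plots x 120 species, presence/absence.
    incidence = rng.random((500, 120)) < rng.random(120) * 0.1

    def chao2_jack1(x):
        x = np.asarray(x, dtype=bool)
        m = x.shape[0]                       # number of sampling units (plots)
        freq = x.sum(axis=0)                 # incidence count per species
        s_obs = int((freq > 0).sum())
        q1 = int((freq == 1).sum())          # "uniques": species in exactly 1 plot
        q2 = int((freq == 2).sum())          # "duplicates": species in exactly 2
        if q2 > 0:
            chao2 = s_obs + q1 * q1 / (2 * q2)
        else:
            chao2 = s_obs + q1 * (q1 - 1) / 2    # bias-corrected fallback
        jack1 = s_obs + q1 * (m - 1) / m
        return s_obs, chao2, jack1

    s_obs, chao2, jack1 = chao2_jack1(incidence)
    print(f"S_obs={s_obs}, Chao2={chao2:.1f}, Jack1={jack1:.1f}")
    ```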

  17. Use of a new jumbo forceps improves tissue acquisition of Barrett's esophagus surveillance biopsies.

    PubMed

    Komanduri, Sri; Swanson, Garth; Keefer, Laurie; Jakate, Shriram

    2009-12-01

    The major risk factors for the development of esophageal adenocarcinoma remain long-standing GERD and resultant Barrett's esophagus (BE). Finding the exact method of adequate tissue sampling for surveillance of dysplasia in BE remains a dilemma. We prospectively compared standard large-capacity biopsy forceps with a new jumbo biopsy forceps for dysplasia detection in BE in a prospective, single-center investigation. We enrolled 32 patients undergoing surveillance endoscopy for BE. Biopsy samples were obtained in paired fashion, alternating between the experimental (jumbo) and control (large-capacity) forceps. Each sample was assessed for histopathology, specimen size, and adequacy. A total of 712 specimens were available for analysis in this investigation. Six patients were found to have dysplasia, and in 5 of those patients, the dysplasia was detected only with the jumbo forceps. The mean width was significantly greater in the Radial Jaw 4 jumbo group (3.3 mm vs 1.9 mm [P < .005]), as was the mean depth (2.0 mm vs 1.1 mm [P < .005]). Sixteen percent of samples obtained with the standard forceps were adequate, whereas the jumbo forceps provided an adequate sample 79% of the time (P < .05). A limitation is the lack of a validated index for assessment of tissue adequacy in BE. The Radial Jaw 4 jumbo biopsy forceps significantly improves dysplasia detection and adequate tissue sampling in patients undergoing endoscopy for BE.

  18. Mutagenic Potential of 4-Nitrophenyl Monochloromethyl (Phenyl) Phosphinate Using the Drosophila melanogaster Sex-Linked Recessive Lethal Test.

    DTIC Science & Technology

    1983-08-01

    chromosomes were tested from the concurrent negative control. This sample size was adequate for analysis using the Fisher's Exact test (personal communication)...study may be regarded as adequate (personal communication, Dr. Gildengorin, Statistician, Information Sciences, Letterman Army Institute of Research)...

  19. What is the optimum sample size for the study of peatland testate amoeba assemblages?

    PubMed

    Mazei, Yuri A; Tsyganov, Andrey N; Esaulov, Anton S; Tychkov, Alexander Yu; Payne, Richard J

    2017-10-01

    Testate amoebae are widely used in ecological and palaeoecological studies of peatlands, particularly as indicators of surface wetness. To ensure data are robust and comparable it is important to consider methodological factors which may affect results. One significant question which has not been directly addressed in previous studies is how sample size (expressed here as number of Sphagnum stems) affects data quality. In three contrasting locations in a Russian peatland we extracted samples of differing size, analysed testate amoebae and calculated a number of widely-used indices: species richness, Simpson diversity, compositional dissimilarity from the largest sample and transfer function predictions of water table depth. We found that there was a trend for larger samples to contain more species across the range of commonly-used sample sizes in ecological studies. Smaller samples sometimes failed to produce counts of testate amoebae often considered minimally adequate. It seems likely that analyses based on samples of different sizes may not produce consistent data. Decisions about sample size need to reflect trade-offs between logistics, data quality, spatial resolution and the disturbance involved in sample extraction. For most common ecological applications we suggest that samples of more than eight Sphagnum stems are likely to be desirable. Copyright © 2017 Elsevier GmbH. All rights reserved.

  20. Kidney function endpoints in kidney transplant trials: a struggle for power.

    PubMed

    Ibrahim, A; Garg, A X; Knoll, G A; Akbari, A; White, C A

    2013-03-01

    Kidney function endpoints are commonly used in randomized controlled trials (RCTs) in kidney transplantation (KTx). We conducted this study to estimate the proportion of ongoing RCTs with kidney function endpoints in KTx where the proposed sample size is large enough to detect meaningful differences in glomerular filtration rate (GFR) with adequate statistical power. RCTs were retrieved using the key word "kidney transplantation" from the National Institute of Health online clinical trial registry. Included trials had at least one measure of kidney function tracked for at least 1 month after transplant. We determined the proportion of two-arm parallel trials that had sufficient sample sizes to detect a minimum 5, 7.5 and 10 mL/min difference in GFR between arms. Fifty RCTs met inclusion criteria. Only 7% of the trials were above a sample size of 562, the number needed to detect a minimum 5 mL/min difference between the groups should one exist (assumptions: α = 0.05; power = 80%, 10% loss to follow-up, common standard deviation of 20 mL/min). The result increased modestly to 36% of trials when a minimum 10 mL/min difference was considered. Only a minority of ongoing trials have adequate statistical power to detect between-group differences in kidney function using conventional sample size estimating parameters. For this reason, some potentially effective interventions which ultimately could benefit patients may be abandoned from future assessment. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.
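
    The abstract's benchmark figure can be recovered from the standard two-sample normal approximation. A sketch (Python) assuming the stated parameters (alpha = 0.05, power = 80%, SD = 20 mL/min, 10% loss to follow-up); the exact rounding convention the authors used is an assumption.

    ```python
    import math
    from scipy.stats import norm

    def total_n(delta, sd=20.0, alpha=0.05, power=0.80, loss=0.10):
        """Total randomized n for a two-arm comparison of mean GFR."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        n_arm = 2 * (z * sd / delta) ** 2                  # per arm, complete data
        n_arm = math.ceil(math.ceil(n_arm) / (1 - loss))   # inflate for dropout
        return 2 * n_arm

    for delta in (5.0, 7.5, 10.0):
        print(f"minimum difference {delta:4.1f} mL/min -> total n = {total_n(delta)}")
    # delta = 5 gives ~560-562 depending on rounding convention (abstract: 562)
    ```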

  21. Bone Marrow Biopsy: RNA Isolation with Expression Profiling in Men with Metastatic Castration-resistant Prostate Cancer—Factors Affecting Diagnostic Success

    PubMed Central

    Afonso, P. Diana; Vinson, Emily N.; Turnbull, James D.; Morris, Karla K.; Foye, Adam; Madden, John F.; Roy Choudhury, Kingshuk; Febbo, Phillip G.; George, Daniel J.

    2013-01-01

    Purpose: To determine the rate at which computed tomographically guided pelvic percutaneous bone biopsy in men with metastatic castration-resistant prostate cancer (mCRPC) yields adequate tissue for genomic profiling and to identify issues likely to affect diagnostic yields. Materials and Methods: This study was institutional review board approved, and written informed consent was obtained. In a phase II trial assessing response to everolimus, 31 men with mCRPC underwent 54 biopsy procedures (eight men before and 23 men both before and during treatment). Variables assessed were lesion location (iliac wing adjacent to sacroiliac joint, iliac wing anterior and/or superior to sacroiliac joint, sacrum, and remainder of pelvis), mean lesion attenuation, subjective lesion attenuation (purely sclerotic vs mixed), central versus peripheral lesion sampling, lesion size, core number, and use of zoledronic acid for more than 1 year. Results: Of 54 biopsy procedures, 21 (39%) yielded adequate tissue for RNA isolation and genomic profiling. Three of four sacral biopsies were adequate. Biopsies of the ilium adjacent to the sacroiliac joints were more likely adequate than those from elsewhere in the ilium (48% vs 28%, respectively). All five biopsies performed in other pelvic locations yielded inadequate tissue for RNA isolation. Mean attenuation of lesions with inadequate tissue was 172 HU greater than those with adequate tissue (621.1 HU ± 166 vs 449 HU ± 221, respectively; P = .002). Use of zoledronic acid, peripheral sampling, core number, and lesion size affected yields, but the differences were not statistically significant. Histologic examination with hematoxylin-eosin staining showed that results of 36 (67%) biopsies were positive for cancer; only mean attenuation differences were significant (707 HU ± 144 vs 473 HU ± 191, negative vs positive, respectively; P < .001). Conclusion: In men with mCRPC, percutaneous sampling of osseous metastases for genomic profiling is possible, but use of zoledronic acid for more than 1 year may reduce the yield of adequate tissue for RNA isolation. Sampling large low-attenuating lesions at their periphery maximizes yield. © RSNA, 2013 PMID:23925271

  22. Optimal sample sizes for the design of reliability studies: power consideration.

    PubMed

    Shieh, Gwowen

    2014-09-01

    Intraclass correlation coefficients are used extensively to measure the reliability or degree of resemblance among group members in multilevel research. This study concerns the problem of the necessary sample size to ensure adequate statistical power for hypothesis tests concerning the intraclass correlation coefficient in the one-way random-effects model. In view of the incomplete and problematic numerical results in the literature, the approximate sample size formula constructed from Fisher's transformation is reevaluated and compared with an exact approach across a wide range of model configurations. These comprehensive examinations showed that the Fisher transformation method is appropriate only under limited circumstances, and therefore it is not recommended as a general method in practice. For advance design planning of reliability studies, the exact sample size procedures are fully described and illustrated for various allocation and cost schemes. Corresponding computer programs are also developed to implement the suggested algorithms.
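
    The exact approach the abstract favors follows from the distribution of the ANOVA F ratio: in the one-way random-effects model with k groups of size n, MSB/MSW is (1 + (n-1)ρ)/(1-ρ) times a central F variate. A sketch of the resulting power computation (Python), with illustrative inputs:

    ```python
    from scipy.stats import f

    def icc_power(k, n, rho0, rho1, alpha=0.05):
        """Power of the exact F test of H0: rho <= rho0 when the true ICC is rho1."""
        df1, df2 = k - 1, k * (n - 1)
        theta0 = (1 + (n - 1) * rho0) / (1 - rho0)
        theta1 = (1 + (n - 1) * rho1) / (1 - rho1)
        crit = f.ppf(1 - alpha, df1, df2)   # reject when MSB/MSW > theta0 * crit
        return f.sf(crit * theta0 / theta1, df1, df2)

    # Illustrative: 30 groups of size 5, testing rho0 = 0.6 against a true 0.8.
    print(f"power = {icc_power(k=30, n=5, rho0=0.6, rho1=0.8):.3f}")
    ```

    Design planning then amounts to searching over k (and n) until the returned power reaches the target.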

  23. Model Choice and Sample Size in Item Response Theory Analysis of Aphasia Tests

    ERIC Educational Resources Information Center

    Hula, William D.; Fergadiotis, Gerasimos; Martin, Nadine

    2012-01-01

    Purpose: The purpose of this study was to identify the most appropriate item response theory (IRT) measurement model for aphasia tests requiring 2-choice responses and to determine whether small samples are adequate for estimating such models. Method: Pyramids and Palm Trees (Howard & Patterson, 1992) test data that had been collected from…

  24. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susana A.

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes of inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during testing, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. The improved protocol outlines an incremental-step method of determining optimal conditions, using increased sample sizes while considering test-system safety limits. The proposed improved test method increases confidence in results obtained with the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  25. Neuromuscular dose-response studies: determining sample size.

    PubMed

    Kopman, A F; Lien, C A; Naguib, M

    2011-02-01

    Investigators planning dose-response studies of neuromuscular blockers have rarely used a priori power analysis to determine the minimal sample size their protocols require. Institutional Review Boards and peer-reviewed journals now generally ask for this information. This study outlines a proposed method for meeting these requirements. The slopes of the dose-response relationships of eight neuromuscular blocking agents were determined using regression analysis. These values were substituted for γ in the Hill equation. When this is done, the coefficient of variation (COV) around the mean value of the ED₅₀ for each drug is easily calculated. Using these values, we performed an a priori one-sample two-tailed t-test of the means to determine the required sample size when the allowable error in the ED₅₀ was varied from ±10% to ±20%. The COV averaged 22% (range 15-27%). We used a COV value of 25% in determining the sample size. If the allowable error in finding the mean ED₅₀ is ±15%, a sample size of 24 is needed to achieve a power of 80%. Increasing 'accuracy' beyond this point requires increasingly greater sample sizes (e.g. an 'n' of 37 for a ±12% error). On the basis of the results of this retrospective analysis, a total sample size of not less than 24 subjects should be adequate for determining a neuromuscular blocking drug's clinical potency with a reasonable degree of assurance.
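
    The reported numbers can be reproduced with a textbook precision calculation. A sketch (Python) that finds the smallest n for which a two-tailed one-sample t-test at 80% power bounds the error in the mean ED₅₀ at ±E given COV = 25%; that the authors iterated in exactly this way is an assumption, but the outputs match (24 for ±15%, 37 for ±12%).

    ```python
    from scipy.stats import t

    def n_for_error(cov=0.25, err=0.15, alpha=0.05, power=0.80):
        """Smallest n bounding the relative error in the mean ED50 at +/-err."""
        for n in range(5, 1000):
            q = t.ppf(1 - alpha / 2, n - 1) + t.ppf(power, n - 1)
            if n >= (q * cov / err) ** 2:
                return n

    print(n_for_error(err=0.15))   # -> 24, matching the abstract
    print(n_for_error(err=0.12))   # -> 37, matching the abstract's 'n' of 37
    ```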

  26. A Future Moon Mission: Curatorial Statistics on Regolith Fragments Applicable to Sample Collection by Raking

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Bevill, T. J.

    2003-01-01

    The strategy of raking rock fragments from the lunar regolith as a means of acquiring representative samples has wide support due to science return, spacecraft simplicity (reliability) and economy [3, 4, 5]. While there exists widespread agreement that raking or sieving the bulk regolith is good strategy, there is lively discussion about the minimum sample size. Advocates of consortium studies desire fragments large enough to support petrologic and isotopic studies. Fragments from 5 to 10 mm are thought adequate [4, 5]. Yet, Jolliff et al. [6] demonstrated use of 2-4 mm fragments as representative of larger rocks. Here we make use of curatorial records and sample catalogs to give a different perspective on minimum sample size for a robotic sample collector.

  27. A survey sampling approach for pesticide monitoring of community water systems using groundwater as a drinking water source.

    PubMed

    Whitmore, Roy W; Chen, Wenlin

    2013-12-04

    The ability to infer human exposure to substances from drinking water using monitoring data helps determine and/or refine potential risks associated with drinking water consumption. We describe a survey sampling approach and its application to an atrazine groundwater monitoring study to adequately characterize upper exposure centiles and associated confidence intervals with predetermined precision. Study design and data analysis included sampling frame definition, sample stratification, sample size determination, allocation to strata, analysis weights, and weighted population estimates. The sampling frame encompassed 15,840 groundwater community water systems (CWS) in 21 states throughout the U.S. Median and 95th-percentile atrazine concentrations were 0.0022 and 0.024 ppb, respectively, for all CWS. Statistical estimates agreed with historical monitoring results, suggesting that the study design was adequate and robust. This methodology makes no assumptions regarding the occurrence distribution (e.g., lognormality); thus analyses based on the design-induced distribution provide the most robust basis for making inferences from the sample to the target population.
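
    The population estimates described above are design-weighted: each sampled system carries an analysis weight (inverse selection probability), and percentiles come from the weighted empirical CDF. A minimal sketch (Python) with invented concentrations and weights, not the study's atrazine data:

    ```python
    import numpy as np

    def weighted_percentile(values, weights, q):
        """q-th percentile from the weighted empirical CDF."""
        order = np.argsort(values)
        v = np.asarray(values, float)[order]
        w = np.asarray(weights, float)[order]
        cdf = np.cumsum(w) / w.sum()
        idx = min(int(np.searchsorted(cdf, q / 100.0)), len(v) - 1)
        return v[idx]

    conc = np.array([0.001, 0.002, 0.004, 0.010, 0.030])  # ppb, invented
    wts = np.array([120.0, 300.0, 80.0, 40.0, 10.0])      # systems each CWS represents
    print("median:", weighted_percentile(conc, wts, 50))
    print("95th:  ", weighted_percentile(conc, wts, 95))
    ```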

  28. Improving the Selection, Classification, and Utilization of Army Enlisted Personnel. Project A: Research Plan

    DTIC Science & Technology

    1983-05-01

    occur. 4) It is also true that during a given time period, at a given base, not all of the people in the sample will actually be available for testing...taken sample sizes into consideration, we currently estimate that with few exceptions, we will have adequate samples to perform the analysis of simple ...Balanced Half Sample Replications (BHSA). His analyses of simple cases have shown that this method is substantially more efficient than the

  29. Sampling studies to estimate the HIV prevalence rate in female commercial sex workers.

    PubMed

    Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Barbosa Júnior, Aristides

    2010-01-01

    We investigated the sampling methods being used to estimate the HIV prevalence rate among female commercial sex workers. The studies were classified according to the adequacy or not of the sample size to estimate the HIV prevalence rate and according to the sampling method (probabilistic or convenience). We identified 75 studies that estimated the HIV prevalence rate among female sex workers. Most of the studies employed convenience samples. The sample size was not adequate to estimate the HIV prevalence rate in 35 studies. The use of convenience samples limits statistical inference about the whole group. It was observed that there was an increase in the number of published studies since 2005, as well as in the number of studies that used probabilistic samples. This represents a large advance in the monitoring of risk behavior practices and the HIV prevalence rate in this group.

  30. Knowledge and use of information and communication technology by health sciences students of the University of Ghana.

    PubMed

    Dery, Samuel; Vroom, Frances da-Costa; Godi, Anthony; Afagbedzi, Seth; Dwomoh, Duah

    2016-09-01

    Studies have shown that ICT adoption contributes to productivity and economic growth. It is therefore important that health workers have knowledge of ICT to ensure adoption and uptake of ICT tools for efficient health delivery. To determine the knowledge and use of ICT among students of the College of Health Sciences at the University of Ghana. This was a cross-sectional study conducted among students in all five Schools of the College of Health Sciences at the University of Ghana. A total of 773 students were sampled from the Schools. Sampling proportionate to size was used to determine the sample sizes required for each school, academic programme and level of programme. Simple random sampling was subsequently used to select students from each stratum. Computer knowledge was high among students, at almost 99%. About 83% owned computers (p < 0.001) and self-rated computer knowledge was also 87% (p < 0.001). Usage was mostly for studying, at 93% (p < 0.001). This study shows students have adequate knowledge and use of computers. This presents an opportunity to introduce ICT in healthcare delivery to them, ensuring their adequate preparedness to embrace new ways of delivering care to improve service delivery. Africa Build Project, Grant Number: FP7-266474.

  31. Identification of missing variants by combining multiple analytic pipelines.

    PubMed

    Ren, Yingxue; Reddy, Joseph S; Pottier, Cyril; Sarangi, Vivekananda; Tian, Shulan; Sinnwell, Jason P; McDonnell, Shannon K; Biernacka, Joanna M; Carrasquillo, Minerva M; Ross, Owen A; Ertekin-Taner, Nilüfer; Rademakers, Rosa; Hudson, Matthew; Mainzer, Liudmila Sergeevna; Asmann, Yan W

    2018-04-16

    After decades of identifying risk factors using array-based genome-wide association studies (GWAS), genetic research of complex diseases has shifted to sequencing-based rare variant discovery. This requires large sample sizes for statistical power and has brought up questions about whether the current variant calling practices are adequate for large cohorts. It is well-known that there are discrepancies between variants called by different pipelines, and that using a single pipeline always misses true variants exclusively identifiable by other pipelines. Nonetheless, it is common practice today to call variants with one pipeline due to computational cost and to assume that false negative calls are a small percentage of the total. We analyzed 10,000 exomes from the Alzheimer's Disease Sequencing Project (ADSP) using multiple analytic pipelines consisting of different read aligners and variant calling strategies. We compared variants identified by using two aligners in 50, 100, 200, 500, 1000, and 1952 samples; and compared variants identified by adding single-sample genotyping to the default multi-sample joint genotyping in 50, 100, 500, 2000, 5000 and 10,000 samples. We found that using a single pipeline missed increasing numbers of high-quality variants as sample sizes grew. By combining two read aligners and two variant calling strategies, we rescued 30% of pass-QC variants at a sample size of 2000, and 56% at 10,000 samples. The rescued variants had higher proportions of low-frequency (minor allele frequency [MAF] 1-5%) and rare (MAF < 1%) variants, which are the very type of variants of interest. In 660 Alzheimer's disease cases with earlier onset ages of ≤65, 4 out of 13 (31%) previously published rare pathogenic and protective mutations in the APP, PSEN1, and PSEN2 genes were undetected by the default one-pipeline approach but recovered by the multi-pipeline approach. Identification of the complete variant set from sequencing data is a prerequisite of genetic association analyses. The current analytic practice of calling genetic variants from sequencing data using a single bioinformatics pipeline is no longer adequate for increasingly large projects. The number and percentage of quality variants that pass quality filters but are missed by the one-pipeline approach rapidly increases with sample size.

  32. Blinded sample size re-estimation in three-arm trials with 'gold standard' design.

    PubMed

    Mütze, Tobias; Friede, Tim

    2017-10-15

    In this article, we study blinded sample size re-estimation in the 'gold standard' design with an internal pilot study for normally distributed outcomes. The 'gold standard' design is a three-arm clinical trial design that includes an active and a placebo control in addition to an experimental treatment. We focus on the absolute margin approach to hypothesis testing in three-arm trials, in which the non-inferiority of the experimental treatment and the assay sensitivity are assessed by pairwise comparisons. We compare several blinded sample size re-estimation procedures in a simulation study assessing operating characteristics including power and type I error. We find that sample size re-estimation based on the popular one-sample variance estimator results in overpowered trials. Moreover, sample size re-estimation based on unbiased variance estimators such as the Xing-Ganju variance estimator results in underpowered trials, as expected, because an overestimation of the variance, and thus of the sample size, is in general required for the re-estimation procedure to eventually meet the target power. To overcome this problem, we propose an inflation factor for the sample size re-estimation with the Xing-Ganju variance estimator and show that this approach results in adequately powered trials. Because of favorable features of the Xing-Ganju variance estimator, such as unbiasedness and a distribution independent of the group means, the inflation factor does not depend on the nuisance parameter and, therefore, can be calculated prior to a trial. Moreover, we prove that sample size re-estimation based on the Xing-Ganju variance estimator does not bias the effect estimate. Copyright © 2017 John Wiley & Sons, Ltd.
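
    As a point of reference for the procedures compared above, the core blinded re-estimation step plugs a blinded interim variance estimate into the usual two-group formula, optionally multiplied by an inflation factor as the paper proposes. The sketch below (Python) is generic: it is not the Xing-Ganju estimator itself, and all inputs are invented.

    ```python
    import math
    from scipy.stats import norm

    def reestimated_n_per_arm(s2_blinded, margin, alpha=0.025, power=0.80,
                              inflation=1.0):
        """Plug a blinded interim variance estimate into the two-group formula."""
        z = norm.ppf(1 - alpha) + norm.ppf(power)
        return math.ceil(inflation * 2 * s2_blinded * (z / margin) ** 2)

    # Invented interim variance 4.0 and non-inferiority margin 1.5; a 10%
    # inflation factor stands in for the paper's (unreproduced) correction.
    print(reestimated_n_per_arm(s2_blinded=4.0, margin=1.5, inflation=1.1))
    ```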

  33. Finite element simulation and experimental verification of ultrasonic non-destructive inspection of defects in additively manufactured materials

    NASA Astrophysics Data System (ADS)

    Taheri, H.; Koester, L.; Bigelow, T.; Bond, L. J.

    2018-04-01

    Industrial applications of additively manufactured components are increasing quickly. Adequate quality control of the parts is necessary to ensure safety when using these materials. Base material properties, surface conditions, and the location and size of defects are some of the main targets for nondestructive evaluation of additively manufactured parts, and the problem of adequate characterization is compounded given the challenges of complex part geometry. Numerical modeling allows the interplay of the various factors to be studied, which can lead to improved measurement design. This paper presents a finite element simulation, verified by experimental results, of ultrasonic waves scattering from flat-bottom holes (FBHs) in additively manufactured material. A focused-beam immersion ultrasound transducer was used for both the simulations and the measurements on the additively manufactured samples. The samples were 17-4 PH stainless steel, made by laser sintering in a powder bed.

  34. Tullgren extraction of soil mites (Acarina): Effect of refrigeration time on extraction efficiency

    Treesearch

    Michelle B. Lakly; D.A. Crossley

    2000-01-01

    Soil microarthropods constitute one of the most species-rich communities in forest ecosystems (Crossley & Blair, 1991). The effects of soil fauna in these systems on decomposition rates, nutrient regeneration and soil structure have been well documented; however, dependable estimates of population size and community structure largely depend upon adequate sampling...

  35. The Power of the Test for Treatment Effects in Three-Level Block Randomized Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2008-01-01

    Experiments that involve nested structures may assign treatment conditions either to subgroups (such as classrooms) or individuals within subgroups (such as students). The design of such experiments requires knowledge of the intraclass correlation structure to compute the sample sizes necessary to achieve adequate power to detect the treatment…

  36. Sampling strategies for radio-tracking coyotes

    USGS Publications Warehouse

    Smith, G.J.; Cary, J.R.; Rongstad, O.J.

    1981-01-01

    Ten coyotes radio-tracked for 24 h periods were most active at night and moved little during daylight hours. Home-range size determined from radio-locations of 3 adult coyotes increased with the number of locations until an asymptote was reached at about 35-40 independent day locations or 3-6 nights of hourly radio-locations. Activity of the coyote did not affect the asymptotic nature of the home-range calculations, but home-range sizes determined from more than 3 nights of hourly locations were considerably larger than home-range sizes determined from daylight locations. Coyote home-range sizes were calculated from daylight locations, full-night tracking periods, and half-night tracking periods. Full- and half-night sampling strategies involved obtaining hourly radio-locations during 12 and 6 h periods, respectively. The half-night sampling strategy was the best compromise for our needs, as it adequately indexed the home-range size, reduced time and energy spent, and standardized the area calculation without requiring the researcher to become completely nocturnal. Sight tracking also provided information about coyote activity and sociability.
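
    The asymptote check described above is straightforward to reproduce: compute home-range area from the first n locations and watch it level off. A sketch (Python) assuming the classic minimum-convex-polygon (MCP) estimator, with simulated coordinates standing in for the radio-locations:

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(2)
    locs = rng.standard_normal((120, 2))   # simulated radio-locations (km east/north)

    for n in (10, 20, 35, 60, 120):
        hull = ConvexHull(locs[:n])        # in 2-D, hull.volume is the polygon area
        print(f"{n:3d} locations -> MCP area {hull.volume:.2f} km^2")
    ```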

  37. Combined effect of pulse density and grid cell size on predicting and mapping aboveground carbon in fast‑growing Eucalyptus forest plantation using airborne LiDAR data

    Treesearch

    Carlos Alberto Silva; Andrew Thomas Hudak; Carine Klauberg; Lee Alexandre Vierling; Carlos Gonzalez‑Benecke; Samuel de Padua Chaves Carvalho; Luiz Carlos Estraviz Rodriguez; Adrian Cardil

    2017-01-01

    LiDAR measurements can be used to predict and map AGC across variable-age Eucalyptus plantations with adequate levels of precision and accuracy using 5 pulses m⁻² and a grid cell size of 5 m. The promising results for AGC modeling in this study will allow for greater confidence in comparing AGC estimates with varying LiDAR sampling densities for Eucalyptus plantations...

  38. Size distribution and source identification of total suspended particulate matter and associated heavy metals in the urban atmosphere of Delhi.

    PubMed

    Srivastava, Arun; Jain, V K

    2007-06-01

    A study of the atmospheric particulate size distribution of total suspended particulate matter (TSPM) and associated heavy metal concentrations has been carried out for the city of Delhi. Urban particles were collected using a five-stage impactor at six sites in three different seasons, viz. winter, summer and monsoon, in the year 2001. Five samples from each site in each season were collected. Each sample (filter paper) was extracted with a mixture of nitric acid, hydrochloric acid and hydrofluoric acid. The acid solutions of the samples were analysed in five particle fractions by atomic absorption spectrometry (AAS). The impactor-stage fractionation of particles shows that a major portion of the TSPM concentration is in the form of PM0.7 (i.e. <0.7 µm). Similarly, most of the metal mass, viz. Mn, Cr, Cd, Pb, Ni and Fe, is also concentrated in the PM0.7 mode. The only exceptions are the size distributions pertaining to Cu and Ca. Though Cu is mostly in the PM0.7 mode, its presence in the size intervals 5.4-1.6 µm and 1.6-0.7 µm is also significant, whilst in the case of Ca there is no definite pattern in its distribution with particle size. The average PM10.9 (i.e. <10.9 µm) concentrations are approximately 90.2% ± 4.5%, 81.4% ± 1.4% and 86.4% ± 9.6% of TSPM for the winter, summer and monsoon seasons, respectively. Source apportionment reveals that there are two sources of TSPM and PM10.9, while three and four sources were observed for PM1.6 (i.e. <1.6 µm) and PM0.7, respectively. Results of regression analyses show definite correlations between PM10.9 and the other fine size fractions, suggesting PM10.9 may adequately act as a surrogate for both PM1.6 and PM0.7, while PM1.6 may adequately act as a surrogate for PM0.7.

  39. Caution regarding the choice of standard deviations to guide sample size calculations in clinical trials.

    PubMed

    Chen, Henian; Zhang, Nanhua; Lu, Xiaosun; Chen, Sophie

    2013-08-01

    The method used to determine choice of standard deviation (SD) is inadequately reported in clinical trials. Underestimations of the population SD may result in underpowered clinical trials. This study demonstrates how using the wrong method to determine population SD can lead to inaccurate sample sizes and underpowered studies, and offers recommendations to maximize the likelihood of achieving adequate statistical power. We review the practice of reporting sample size and its effect on the power of trials published in major journals. Simulated clinical trials were used to compare the effects of different methods of determining SD on power and sample size calculations. Prior to 1996, sample size calculations were reported in just 1%-42% of clinical trials. This proportion increased from 38% to 54% after the initial Consolidated Standards of Reporting Trials (CONSORT) was published in 1996, and from 64% to 95% after the revised CONSORT was published in 2001. Nevertheless, underpowered clinical trials are still common. Our simulated data showed that all minimal and 25th-percentile SDs fell below 44 (the population SD), regardless of sample size (from 5 to 50). For sample sizes 5 and 50, the minimum sample SDs underestimated the population SD by 90.7% and 29.3%, respectively. If only one sample was available, there was less than 50% chance that the actual power equaled or exceeded the planned power of 80% for detecting a median effect size (Cohen's d = 0.5) when using the sample SD to calculate the sample size. The proportions of studies with actual power of at least 80% were about 95%, 90%, 85%, and 80% when we used the larger SD, 80% upper confidence limit (UCL) of SD, 70% UCL of SD, and 60% UCL of SD to calculate the sample size, respectively. When more than one sample was available, the weighted average SD resulted in about 50% of trials being underpowered; the proportion of trials with power of 80% increased from 90% to 100% when the 75th percentile and the maximum SD from 10 samples were used. Greater sample size is needed to achieve a higher proportion of studies having actual power of 80%. This study only addressed sample size calculation for continuous outcome variables. We recommend using the 60% UCL of SD, maximum SD, 80th-percentile SD, and 75th-percentile SD to calculate sample size when 1 or 2 samples, 3 samples, 4-5 samples, and more than 5 samples of data are available, respectively. Using the sample SD or average SD to calculate sample size should be avoided.
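
    The paper's recommended safeguard, using an upper confidence limit (UCL) of the sample SD, follows from the chi-square distribution of (n-1)s²/σ². A sketch (Python) with an invented pilot SD sitting below the paper's population SD of 44, and the effect size d = 0.5 the paper uses:

    ```python
    import math
    from scipy.stats import chi2, norm

    def sd_ucl(s, n, level=0.80):
        """One-sided upper confidence limit for the population SD."""
        return s * math.sqrt((n - 1) / chi2.ppf(1 - level, n - 1))

    def n_per_arm(sd, delta, alpha=0.05, power=0.80):
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return math.ceil(2 * (z * sd / delta) ** 2)

    s_pilot, n_pilot = 38.0, 20        # invented pilot SD, below the true SD of 44
    sd80 = sd_ucl(s_pilot, n_pilot)    # the paper's 80% UCL recommendation
    print(f"80% UCL of SD = {sd80:.1f}")
    print("n per arm for delta = 22 (d = 0.5):", n_per_arm(sd80, delta=22.0))
    ```

    Planning with the UCL rather than the raw sample SD buys insurance against the underestimation the simulations above document.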

  40. Application of SAXS and SANS in evaluation of porosity, pore size distribution and surface area of coal

    USGS Publications Warehouse

    Radlinski, A.P.; Mastalerz, Maria; Hinde, A.L.; Hainbuchner, M.; Rauch, H.; Baron, M.; Lin, J.S.; Fan, L.; Thiyagarajan, P.

    2004-01-01

    This paper discusses the applicability of small angle X-ray scattering (SAXS) and small angle neutron scattering (SANS) techniques for determining the porosity, pore size distribution and internal specific surface area in coals. The method is noninvasive, fast, inexpensive and does not require complex sample preparation. It uses coal grains of about 0.8 mm size mounted in standard pellets as used for petrographic studies. Assuming spherical pore geometry, the scattering data are converted into the pore size distribution in the size range 1 nm (10 Å) to 20 µm (200,000 Å) in diameter, accounting for both open and closed pores. FTIR as well as SAXS and SANS data for seven samples of oriented whole coals and corresponding pellets with vitrinite reflectance (Ro) values in the range 0.55% to 5.15% are presented and analyzed. Our results demonstrate that pellets adequately represent the average microstructure of coal samples. The scattering data have been used to calculate the maximum surface area available for methane adsorption. Total porosity as a percentage of sample volume is calculated and compared with worldwide trends. By demonstrating the applicability of SAXS and SANS techniques to determine the porosity, pore size distribution and surface area in coals, we provide a new and efficient tool, which can be used for any type of coal sample, from a thin slice to a representative sample of a thick seam. © 2004 Elsevier B.V. All rights reserved.

  1. Influences of sampling size and pattern on the uncertainty of correlation estimation between soil water content and its influencing factors

    NASA Astrophysics Data System (ADS)

    Lai, Xiaoming; Zhu, Qing; Zhou, Zhiwen; Liao, Kaihua

    2017-12-01

    In this study, seven random combination sampling strategies were applied to investigate the uncertainties in estimating the hillslope mean soil water content (SWC) and the correlation coefficients between SWC and soil/terrain properties on a tea + bamboo hillslope. One of the sampling strategies is global random sampling; the other six are stratified random sampling on the top, middle, toe, top + mid, top + toe and mid + toe slope positions. For each sampling strategy, sample sizes were gradually reduced, with 3000 replicates at each sample size. Under each sample size of each sampling strategy, the relative errors (REs) and coefficients of variation (CVs) of the estimated hillslope mean SWC and correlation coefficients between SWC and soil/terrain properties were calculated to quantify accuracy and uncertainty. The results showed that the uncertainty of the estimates decreased as the sample size increased. However, larger sample sizes were required to reduce the uncertainty in correlation coefficient estimation than in hillslope mean SWC estimation. Under global random sampling, 12 randomly sampled sites on this hillslope were adequate to estimate the hillslope mean SWC with RE and CV ≤10%. However, at least 72 randomly sampled sites were needed to ensure that the estimated correlation coefficients had REs and CVs ≤10%. Among all sampling strategies, reducing sampling sites on the middle slope had the least influence on the estimation of hillslope mean SWC and correlation coefficients. Under this strategy, 60 sites (10 on the middle slope and 50 on the top and toe slopes) were enough to ensure estimated correlation coefficients with REs and CVs ≤10%. This suggests that when designing SWC sampling, the proportion of sites on the middle slope can be reduced to 16.7% of the total number of sites. Findings of this study will be useful for optimal SWC sampling design.
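
    A toy Monte Carlo version of this design exercise is easy to set up (synthetic SWC values stand in for the field data; the 3000 replicates mirror the study, everything else is an assumption):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    swc = rng.normal(loc=0.30, scale=0.05, size=120)  # hypothetical hillslope SWC
    true_mean = swc.mean()

    for n_sites in (100, 72, 30, 12, 5):
        means = np.array([rng.choice(swc, size=n_sites, replace=False).mean()
                          for _ in range(3000)])          # 3000 replicates per size
        re = np.abs(means - true_mean) / true_mean * 100  # relative error, %
        cv = means.std(ddof=1) / means.mean() * 100       # coefficient of variation, %
        print(f"n={n_sites:3d}  mean RE={re.mean():5.2f}%  CV={cv:5.2f}%")
    ```

    As in the study, the uncertainty shrinks monotonically with sample size; estimating correlation coefficients rather than means would require a second, correlated variable and considerably more sites.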

  2. Sample Size Methods for Estimating HIV Incidence from Cross-Sectional Surveys

    PubMed Central

    Brookmeyer, Ron

    2015-01-01

    Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this paper we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this paper at the Biometrics website on Wiley Online Library. PMID:26302040
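
    The snapshot estimator at the heart of these designs is simple (a simplified form; the paper's methods additionally propagate uncertainty in the mean duration of the early stage, and all counts below are invented):

    ```python
    def incidence_estimate(n_early: int, n_uninfected: int, mu_years: float) -> float:
        """Cross-sectional incidence: early-stage count divided by
        (uninfected count * mean years spent in the early stage)."""
        return n_early / (n_uninfected * mu_years)

    # 60 early-stage infections among 4000 uninfected, mean duration 0.5 years:
    print(incidence_estimate(60, 4000, 0.5))  # 0.03 infections per person-year
    ```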

  3. Sample size methods for estimating HIV incidence from cross-sectional surveys.

    PubMed

    Konikoff, Jacob; Brookmeyer, Ron

    2015-12-01

    Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this article, we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this article at the Biometrics website on Wiley Online Library. © 2015, The International Biometric Society.

  4. Only pick the right grains: Modelling the bias due to subjective grain-size interval selection for chronometric and fingerprinting approaches.

    NASA Astrophysics Data System (ADS)

    Dietze, Michael; Fuchs, Margret; Kreutzer, Sebastian

    2016-04-01

    Many modern approaches to radiometric dating or geochemical fingerprinting rely on sampling sedimentary deposits. A key assumption of most concepts is that the extracted grain-size fraction of the sampled sediment adequately represents the actual process to be dated or the source area to be fingerprinted. However, these assumptions are not always well constrained. Rather, they have to align with arbitrary, method-determined size intervals, such as "coarse grain" or "fine grain", which are sometimes defined differently. Such arbitrary intervals violate principal process-based concepts of sediment transport and can thus introduce significant bias into the analysis outcome (i.e., a deviation of the measured from the true value). We present a flexible numerical framework (numOlum) for the statistical programming language R that allows quantifying the bias due to any given analysis size interval for different types of sediment deposits. This framework is applied to synthetic samples from the realms of luminescence dating and geochemical fingerprinting, i.e. a virtual reworked loess section. We show independent validation data from artificially dosed and subsequently mixed grain-size proportions, and we present a statistical approach (end-member modelling analysis, EMMA) that allows accounting for the effect of measuring the compound dosimetric history or geochemical composition of a sample. EMMA separates polymodal grain-size distributions into the underlying transport-process-related distributions and their contribution to each sample. These underlying distributions can then be used to adjust grain-size preparation intervals to minimise the incorporation of "undesired" grain-size fractions.
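
    End-member decomposition of this kind can be approximated with off-the-shelf tools. The sketch below uses non-negative matrix factorization as a stand-in for EMMA (an assumption: EMMA proper uses a different eigenspace algorithm, and the grain-size modes here are synthetic):

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(4)
    bins = np.linspace(-2, 8, 60)                  # phi-scale grid, illustrative

    def mode(mu, sig):
        g = np.exp(-0.5 * ((bins - mu) / sig) ** 2)
        return g / g.sum()

    ends = np.vstack([mode(1.0, 0.6), mode(4.5, 1.0)])   # two "true" end members
    loads = rng.dirichlet((2, 2), size=40)               # mixing proportions
    X = loads @ ends + rng.normal(0, 1e-4, (40, 60)).clip(0)

    nmf = NMF(n_components=2, init="nndsvda", max_iter=2000)
    W = nmf.fit_transform(X)   # per-sample contributions of each end member
    H = nmf.components_        # recovered end-member distributions
    print(W.shape, H.shape)    # (40, 2) (2, 60)
    ```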

  5. Enrichment methods provide a feasible approach to comprehensive and adequately powered investigations of the brain methylome

    PubMed Central

    Chan, Robin F.; Shabalin, Andrey A.; Xie, Lin Y.; Adkins, Daniel E.; Zhao, Min; Turecki, Gustavo; Clark, Shaunna L.; Aberg, Karolina A.

    2017-01-01

    Methylome-wide association studies are typically performed using microarray technologies that only assay a very small fraction of the CG methylome and entirely miss two forms of methylation that are common in brain and likely of particular relevance for neuroscience and psychiatric disorders. The alternative is whole genome bisulfite (WGB) sequencing, but this approach is not yet practically feasible with the sample sizes required for adequate statistical power. We argue for revisiting methylation enrichment methods that, provided optimal protocols are used, enable comprehensive, adequately powered and cost-effective genome-wide investigations of the brain methylome. To support our claim we use data showing that enrichment methods approximate the sensitivity obtained with WGB methods, with slightly better specificity. However, this performance is achieved at <5% of the reagent costs. Furthermore, because many more samples can be sequenced simultaneously, projects can be completed about 15 times faster. Currently the only viable option available for comprehensive brain methylome studies, enrichment methods may be critical for moving the field forward. PMID:28334972

  6. Valid approximation of spatially distributed grain size distributions - A priori information encoded to a feedforward network

    NASA Astrophysics Data System (ADS)

    Berthold, T.; Milbradt, P.; Berkhahn, V.

    2018-04-01

    This paper presents a model for the approximation of multiple, spatially distributed grain size distributions based on a feedforward neural network. Since a classical feedforward network does not guarantee valid cumulative distribution functions, a priori information is incorporated into the model by applying weight and architecture constraints. The model is derived in two steps. First, a model is presented that is able to produce a valid distribution function for a single sediment sample. Although initially developed for sediment samples, the model is not limited in its application; it can also be used to approximate any other multimodal continuous distribution function. In the second part, the network is extended in order to capture the spatial variation of the sediment samples, which were obtained from 48 locations in the investigation area. Results show that the model provides an adequate approximation of grain size distributions, satisfying the requirements of a cumulative distribution function.
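
    The key constraint is easy to see in a minimal construction (ours, not the authors' architecture): with non-negative input weights and a convex combination of logistic units, the output is automatically monotone nondecreasing and bounded in [0, 1], i.e. shaped like a valid CDF.

    ```python
    import numpy as np

    def softplus(z):
        return np.log1p(np.exp(z))

    def valid_cdf(x, v, b, u):
        """x: (log-scaled) grain sizes; v, b: hidden-layer parameters;
        u: output-layer logits. The constraints guarantee monotonicity."""
        w = softplus(v)                  # hidden weights >= 0 -> monotone in x
        a = np.exp(u) / np.exp(u).sum()  # mixture weights >= 0, summing to 1
        h = 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))  # logistic units
        return h @ a                     # CDF values in (0, 1)

    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 7)            # hypothetical log grain-size grid
    F = valid_cdf(x, rng.normal(size=4), rng.normal(size=4), rng.normal(size=4))
    assert np.all(np.diff(F) >= 0)       # monotone by construction
    ```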

  7. Samples in applied psychology: over a decade of research in review.

    PubMed

    Shen, Winny; Kiger, Thomas B; Davies, Stacy E; Rasch, Rena L; Simon, Kara M; Ones, Deniz S

    2011-09-01

    This study examines sample characteristics of articles published in Journal of Applied Psychology (JAP) from 1995 to 2008. At the individual level, the overall median sample size over the period examined was approximately 173, which is generally adequate for detecting the average magnitude of effects of primary interest to researchers who publish in JAP. Samples at higher units of analysis (e.g., teams, departments/work units, and organizations) had lower median sample sizes (Mdn ≈ 65), yet were arguably robust given typical multilevel design choices of JAP authors, despite the practical constraints of collecting data at higher units of analysis. A substantial proportion of studies used student samples (~40%); surprisingly, median sample sizes for student samples were smaller than for working adult samples. Samples were more commonly occupationally homogeneous (~70%) than occupationally heterogeneous. U.S. and English-speaking participants made up the vast majority of samples, whereas Middle Eastern, African, and Latin American samples were largely unrepresented. On the basis of study results, recommendations are provided for authors, editors, and readers, which converge on 3 themes: (a) appropriateness and match between sample characteristics and research questions, (b) careful consideration of statistical power, and (c) the increased popularity of quantitative synthesis. Implications are discussed in terms of theory building, generalizability of research findings, and statistical power to detect effects. PsycINFO Database Record (c) 2011 APA, all rights reserved

  8. Design, analysis and presentation of factorial randomised controlled trials

    PubMed Central

    Montgomery, Alan A; Peters, Tim J; Little, Paul

    2003-01-01

    Background: The evaluation of more than one intervention in the same randomised controlled trial can be achieved using a parallel group design. However this requires increased sample size and can be inefficient, especially if there is also interest in considering combinations of the interventions. An alternative may be a factorial trial, where for two interventions participants are allocated to receive neither intervention, one or the other, or both. Factorial trials require special considerations, however, particularly at the design and analysis stages. Discussion: Using a 2 × 2 factorial trial as an example, we present a number of issues that should be considered when planning a factorial trial. The main design issue is that of sample size. Factorial trials are most often powered to detect the main effects of interventions, since adequate power to detect plausible interactions requires greatly increased sample sizes. The main analytical issues relate to the investigation of main effects and the interaction between the interventions in appropriate regression models, as sketched below. Presentation of results should reflect the analytical strategy, with an emphasis on the principal research questions. We also give an example of how baseline and follow-up data should be presented. Lastly, we discuss the implications of the design, analytical and presentational issues covered. Summary: The difficulty of interpreting the results of factorial trials when an influential interaction is observed is the price paid for the potential of efficient, simultaneous consideration of two or more interventions. Factorial trials can in principle be designed to have adequate power to detect realistic interactions, and in any case they are the only design that allows such effects to be investigated. PMID:14633287
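
    The analytic core, main effects plus an interaction term in one regression, fits in a few lines (simulated data; the additive truth and the coefficients below are illustrative assumptions):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 400
    df = pd.DataFrame({"A": rng.integers(0, 2, n),   # allocated to A?
                       "B": rng.integers(0, 2, n)})  # allocated to B?
    # Simulated truth: additive main effects, no interaction.
    df["y"] = 1.0 + 0.5 * df.A + 0.3 * df.B + rng.normal(size=n)

    fit = smf.ols("y ~ A * B", data=df).fit()  # expands to A + B + A:B
    print(fit.summary().tables[1])             # inspect A, B and A:B estimates
    ```

    Powering the trial for the A:B term rather than the main effects would, as the authors note, demand a much larger n.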

  9. The Effect of Small Sample Size on Two-Level Model Estimates: A Review and Illustration

    ERIC Educational Resources Information Center

    McNeish, Daniel M.; Stapleton, Laura M.

    2016-01-01

    Multilevel models are an increasingly popular method to analyze data that originate from a clustered or hierarchical structure. To effectively utilize multilevel models, one must have an adequately large number of clusters; otherwise, some model parameters will be estimated with bias. The goals for this paper are to (1) raise awareness of the…

  10. Biostatistics Series Module 5: Determining Sample Size

    PubMed Central

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Determining the appropriate sample size for a study, whatever its type, is a fundamental aspect of biomedical research. An adequate sample ensures that the study will yield reliable information, regardless of whether the data ultimately suggest a clinically important difference between the interventions or elements being studied. The probabilities of Type 1 and Type 2 errors, the expected variance in the sample and the effect size are the essential determinants of sample size in interventional studies. Any method for deriving a conclusion from experimental data carries with it some risk of drawing a false conclusion. Two types of false conclusion may occur, called Type 1 and Type 2 errors, whose probabilities are denoted by the symbols α and β. A Type 1 error occurs when one concludes that a difference exists between the groups being compared when, in reality, it does not. This is akin to a false positive result. A Type 2 error occurs when one concludes that a difference does not exist when, in reality, a difference does exist and is equal to or larger than the effect size defined by the alternative to the null hypothesis. This may be viewed as a false negative result. When considering the risk of Type 2 error, it is more intuitive to think in terms of the power of the study, or (1 − β). Power denotes the probability of detecting a difference when a difference does exist between the groups being compared. A smaller α or larger power will increase sample size. Conventional acceptable values for power and α are 80% or above and 5% or below, respectively, when calculating sample size. Increasing variance in the sample tends to increase the sample size required to achieve a given power level. The effect size is the smallest clinically important difference that is sought to be detected and, rather than statistical convention, is a matter of past experience and clinical judgment. Larger samples are required if smaller differences are to be detected. Although the principles are long known, historically, sample size determination has been difficult, because of relatively complex mathematical considerations and numerous different formulas. However, of late, there has been remarkable improvement in the availability, capability, and user-friendliness of power and sample size determination software. Many can execute routines for determination of sample size and power for a wide variety of research designs and statistical tests. With the drudgery of mathematical calculation gone, researchers must now concentrate on determining appropriate sample size and achieving these targets, so that study conclusions can be accepted as meaningful. PMID:27688437
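
    As a worked example of these relationships (a generic two-sample t-test calculation, not taken from the module itself), power can be computed exactly from the noncentral t distribution:

    ```python
    import numpy as np
    from scipy import stats

    def power_two_sample_t(n: int, d: float, alpha: float = 0.05) -> float:
        df = 2 * n - 2
        nc = d * np.sqrt(n / 2.0)                   # noncentrality parameter
        t_crit = stats.t.ppf(1.0 - alpha / 2.0, df)
        return (1.0 - stats.nct.cdf(t_crit, df, nc)
                + stats.nct.cdf(-t_crit, df, nc))   # P(|T| > t_crit) under H1

    for n in (10, 20, 40, 64, 100):
        print(n, round(power_two_sample_t(n, d=0.5), 3))
    # About 64 per group reaches the conventional 80% power for d = 0.5.
    ```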

  11. [Sequential sampling plans to Orthezia praelonga Douglas (Hemiptera: Sternorrhyncha, Ortheziidae) in citrus].

    PubMed

    Costa, Marilia G; Barbosa, José C; Yamamoto, Pedro T

    2007-01-01

    Sequential sampling uses samples of variable size and has the advantage of reducing sampling time and costs compared to fixed-size sampling. To support adequate management of orthezia, sequential sampling plans were developed for orchards under low and high infestation. Data were collected in Matão, SP, in commercial stands of the orange variety 'Pêra Rio' at five, nine and 15 years of age. Twenty samplings were performed in the whole area of each stand by observing the presence or absence of scales on plants, with plots comprising ten plants. After observing that in all three stands the scale population was distributed according to a contagious (aggregated) model, fitting the negative binomial distribution in most samplings, two sequential sampling plans were constructed according to the Sequential Likelihood Ratio Test (SLRT). To construct these plans an economic threshold of 2% was adopted and the type I and II error probabilities were fixed at alpha = beta = 0.10. Results showed that the maximum numbers of samples expected to determine the need for control were 172 and 76 for stands with low and high infestation, respectively.
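
    The decision logic of such plans follows Wald's sequential probability ratio test; a generic presence/absence version is sketched below (textbook boundaries, not the authors' plan; the infestation thresholds are invented):

    ```python
    import math

    def sprt_step(d: int, n: int, p0=0.02, p1=0.08, alpha=0.10, beta=0.10) -> str:
        """d infested plots out of n inspected; compare the log-likelihood
        ratio against Wald's continue/accept/reject boundaries."""
        llr = d * math.log(p1 / p0) + (n - d) * math.log((1 - p1) / (1 - p0))
        if llr >= math.log((1 - beta) / alpha):
            return "stop: infestation above threshold, treat"
        if llr <= math.log(beta / (1 - alpha)):
            return "stop: infestation below threshold"
        return "continue sampling"

    print(sprt_step(d=4, n=20))  # heavy early infestation -> stop and treat
    print(sprt_step(d=0, n=20))  # still ambiguous -> continue sampling
    ```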

  12. Sampling benthic macroinvertebrates in a large flood-plain river: Considerations of study design, sample size, and cost

    USGS Publications Warehouse

    Bartsch, L.A.; Richardson, W.B.; Naimo, T.J.

    1998-01-01

    Estimation of benthic macroinvertebrate populations over large spatial scales is difficult due to the high variability in abundance and the cost of sample processing and taxonomic analysis. To determine a cost-effective, statistically powerful sample design, we conducted an exploratory study of the spatial variation of benthic macroinvertebrates in a 37 km reach of the Upper Mississippi River. We sampled benthos at 36 sites within each of two strata, contiguous backwater and channel border. Three standard ponar (525 cm²) grab samples were obtained at each site ('Original Design'). Analysis of variance and sampling cost of strata-wide estimates for abundance of Oligochaeta, Chironomidae, and total invertebrates showed that only one ponar sample per site ('Reduced Design') yielded essentially the same abundance estimates as the Original Design, while reducing the overall cost by 63%. A posteriori statistical power analysis (alpha = 0.05, beta = 0.20) on the Reduced Design estimated that at least 18 sites per stratum were needed to detect differences in mean abundance between contiguous backwater and channel border areas for Oligochaeta, Chironomidae, and total invertebrates. Statistical power was nearly identical for the three taxonomic groups. The abundances of several taxa of concern (e.g., Hexagenia mayflies and Musculium fingernail clams) were too spatially variable to estimate power with our method. Resampling simulations indicated that to achieve adequate sampling precision for Oligochaeta, at least 36 sample sites per stratum would be required, whereas a sampling precision of 0.2 would not be attained with any sample size for Hexagenia in channel border areas, or Chironomidae and Musculium in both strata given the variance structure of the original samples. Community-wide diversity indices (Brillouin and 1 − Simpson's) increased as sample area per site increased. The backwater area had higher diversity than the channel border area. The number of sampling sites required to sample benthic macroinvertebrates during our sampling period depended on the study objective and ranged from 18 to more than 40 sites per stratum. No single sampling regime would efficiently and adequately sample all components of the macroinvertebrate community.

  13. Randomized comparison of 3 different-sized biopsy forceps for quality of sampling in Barrett’s esophagus

    PubMed Central

    Gonzalez, Susana; Yu, Woojin M.; Smith, Michael S.; Slack, Kristen N.; Rotterdam, Heidrun; Abrams, Julian A.; Lightdale, Charles J.

    2011-01-01

    Background: Several types of forceps are available for use in sampling Barrett’s esophagus (BE). Few data exist with regard to biopsy quality for histologic assessment. Objective: To evaluate sampling quality of 3 different forceps in patients with BE. Design: Single-center, randomized clinical trial. Patients: Consecutive patients with BE undergoing upper endoscopy. Interventions: Patients randomized to have biopsy specimens taken with 1 of 3 types of forceps: standard, large capacity, or jumbo. Main Outcome Measurements: Specimen adequacy was defined a priori as a well-oriented biopsy sample 2 mm or greater in diameter and with at least muscularis mucosa present. Results: A total of 65 patients were enrolled and analyzed (standard forceps, n = 21; large-capacity forceps, n = 21; jumbo forceps, n = 23). Compared with jumbo forceps, a significantly higher proportion of biopsy samples with large-capacity forceps were adequate (37.8% vs 25.2%, P = .002). Of the standard forceps biopsy samples, 31.9% were adequate, which was not significantly different from specimens taken with large-capacity (P = .20) or jumbo (P = .09) forceps. Biopsy specimens taken with jumbo forceps had the largest diameter (median, 3.0 mm vs 2.5 mm [standard] vs 2.8 mm [large capacity]; P = .0001). However, jumbo forceps had the lowest proportion of specimens that were well oriented (overall P = .001). Limitations: Heterogeneous patient population precluded dysplasia detection analyses. Conclusions: Our results challenge the requirement of jumbo forceps and therapeutic endoscopes to properly perform the Seattle protocol. We found that standard and large-capacity forceps used with standard upper endoscopes produced biopsy samples at least as adequate as those obtained with jumbo forceps and therapeutic endoscopes in patients with BE. PMID:21034895

  14. The choice of catecholamines in septic shock: more and more good arguments to strengthen the known position, but don't lose the faith!

    PubMed

    Meier-Hellmann, Andreas

    2006-01-01

    The choice of catecholamines for hemodynamic stabilisation in septic shock patients has been an ongoing debate for several years. Several studies have investigated the regional effects of catecholamines in septic patients. However, because of often very small sample sizes, inconsistent results, and methodological problems in the monitoring techniques used in these studies, it is not possible to provide clear recommendations concerning the use of catecholamines in sepsis. Prospective, adequately sized studies are necessary because outcome data are completely lacking.

  15. Non-invasive genetic censusing and monitoring of primate populations.

    PubMed

    Arandjelovic, Mimi; Vigilant, Linda

    2018-03-01

    Knowing the density or abundance of primate populations is essential for their conservation management and for contextualizing socio-demographic and behavioral observations. When direct counts of animals are not possible, genetic analysis of non-invasive samples collected from wildlife populations allows estimates of population size with higher accuracy and precision than is possible using indirect signs. Furthermore, in contrast to traditional indirect survey methods, prolonged or periodic genetic sampling across months or years enables inference of group membership, movement, dynamics, and some kin relationships. Data may also be used to estimate sex ratios and sex differences in dispersal distances, and to detect gene flow among locations. Recent advances in capture-recapture models have further improved the precision of population estimates derived from non-invasive samples. Simulations using these methods have shown that the confidence interval of point estimates includes the true population size when the assumptions of the models are met, and therefore this range of population size minima and maxima should be emphasized in population monitoring studies. Innovations such as the use of sniffer dogs or anti-poaching patrols for sample collection are important to ensure adequate sampling, and the expected development of efficient and cost-effective genotyping-by-sequencing methods for DNA derived from non-invasive samples will automate and speed up analyses. © 2018 Wiley Periodicals, Inc.
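
    The capture-recapture logic mentioned above reduces, in its simplest two-session form, to the bias-corrected Lincoln-Petersen (Chapman) estimator, with genotyped individuals playing the role of marks (the counts below are invented):

    ```python
    def chapman_estimate(n1: int, n2: int, m2: int) -> float:
        """n1, n2: individuals genotyped in sessions 1 and 2;
        m2: individuals detected in both sessions."""
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    print(chapman_estimate(n1=40, n2=35, m2=12))  # about 112 individuals
    ```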

  16. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
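
    The underlying idea is that coarser grains keep pixel intensities correlated over larger offsets. A conceptual sketch follows (our toy version, not Rubin's published algorithm; the block-noise images are synthetic):

    ```python
    import numpy as np

    def correlogram(img, max_offset=30):
        """Mean intensity autocorrelation at horizontal offsets 1..max_offset."""
        img = (img - img.mean()) / img.std()
        return np.array([np.mean(img[:, :-k] * img[:, k:])
                         for k in range(1, max_offset + 1)])

    def synth(grain_px, rng, shape=(128, 128)):
        """Toy 'sediment' image: random blocks grain_px pixels across."""
        small = rng.random((shape[0] // grain_px + 1, shape[1] // grain_px + 1))
        return np.kron(small, np.ones((grain_px, grain_px)))[:shape[0], :shape[1]]

    rng = np.random.default_rng(3)
    sizes = [2, 4, 8, 16]
    catalog = [correlogram(synth(s, rng)) for s in sizes]  # calibration curves
    unknown = correlogram(synth(8, rng))
    errs = [np.sum((unknown - c) ** 2) for c in catalog]   # least-squares match
    print(sizes[int(np.argmin(errs))])                     # usually 8
    ```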

  17. Addressing the "Replication Crisis": Using Original Studies to Design Replication Studies with Appropriate Statistical Power.

    PubMed

    Anderson, Samantha F; Maxwell, Scott E

    2017-01-01

    Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study's sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that (1) original studies are adequately powered and (2) replication studies are designed with methods that are more likely to yield the intended level of power.
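
    The study's central point can be reproduced in a few lines of simulation (our own toy version using normal approximations; the true effect, original n, and truncation rule are assumptions):

    ```python
    import numpy as np
    from scipy import stats

    def n_for_power(d, alpha=0.05, power=0.80):
        z = stats.norm.ppf(1 - alpha / 2) + stats.norm.ppf(power)
        return int(np.ceil(2 * (z / d) ** 2))

    rng = np.random.default_rng(7)
    true_d, n_orig = 0.30, 40                              # underpowered original
    d_obs = rng.normal(true_d, np.sqrt(2 / n_orig), 5000)  # noisy observed d
    d_obs = d_obs[d_obs > 0.05]              # plan only on positive estimates
    n_rep = np.array([n_for_power(d) for d in d_obs])
    # Actual power of each planned replication, given the true effect:
    actual = 1 - stats.norm.cdf(stats.norm.ppf(0.975) - true_d * np.sqrt(n_rep / 2))
    print(f"P(actual power >= 0.80) = {(actual >= 0.80).mean():.2f}")  # near 0.5
    ```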

  18. Estimating the breeding population of long-billed curlew in the United States

    USGS Publications Warehouse

    Stanley, T.R.; Skagen, S.K.

    2007-01-01

    Determining population size and long-term trends in population size for species of high concern is a priority of international, national, and regional conservation plans. Long-billed curlews (Numenius americanus) are a species of special concern in North America due to apparent declines in their population. Because long-billed curlews are not adequately monitored by existing programs, we undertook a 2-year study with the goals of 1) determining present long-billed curlew distribution and breeding population size in the United States and 2) providing recommendations for a long-term long-billed curlew monitoring protocol. We selected a stratified random sample of survey routes in 16 western states for sampling in 2004 and 2005, and we analyzed count data from these routes to estimate detection probabilities and abundance. In addition, we evaluated habitat along roadsides to determine how well roadsides represented habitat throughout the sampling units. We estimated there were 164,515 (SE = 42,047) breeding long-billed curlews in 2004, and 109,533 (SE = 31,060) breeding individuals in 2005. These estimates far exceed currently accepted estimates based on expert opinion. We found that habitat along roadsides was representative of long-billed curlew habitat in general. We make recommendations for improving sampling methodology, and we present power curves to provide guidance on minimum sample sizes required to detect trends in abundance.

  19. Improving tritium exposure reconstructions using accelerator mass spectrometry

    PubMed Central

    Hunt, J. R.; Vogel, J. S.; Knezovich, J. P.

    2010-01-01

    Direct measurement of tritium atoms by accelerator mass spectrometry (AMS) enables rapid low-activity tritium measurements from milligram-sized samples and permits greater ease of sample collection, faster throughput, and increased spatial and/or temporal resolution. Because existing methodologies for quantifying tritium have some significant limitations, the development of tritium AMS has allowed improvements in reconstructing tritium exposure concentrations from environmental measurements and provides an important additional tool in assessing the temporal and spatial distribution of chronic exposure. Tritium exposure reconstructions using AMS were previously demonstrated for a tree growing on known levels of tritiated water and for trees exposed to atmospheric releases of tritiated water vapor. In these analyses, tritium levels were measured from milligram-sized samples with sample preparation times of a few days. Hundreds of samples were analyzed within a few months of sample collection and resulted in the reconstruction of spatial and temporal exposure from tritium releases. Although the current quantification limit of tritium AMS is not adequate to determine natural environmental variations in tritium concentrations, it is expected to be sufficient for studies assessing possible health effects from chronic environmental tritium exposure. PMID:14735274

  20. A Systematic Review of Published Respondent-Driven Sampling Surveys Collecting Behavioral and Biologic Data.

    PubMed

    Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G

    2016-08-01

    Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and an analytic method and, as such, it is important to describe both aspects adequately in publications. We extracted data from the peer-reviewed literature published through September 2013 that reported collecting biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including number of recruitment sites, seeds at start and end, maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.

  1. Alpha spectrometric characterization of process-related particle size distributions from active particle sampling at the Los Alamos National Laboratory uranium foundry

    NASA Astrophysics Data System (ADS)

    Plionis, A. A.; Peterson, D. S.; Tandon, L.; LaMont, S. P.

    2010-03-01

    Uranium particles within the respirable size range pose a significant hazard to the health and safety of workers. Significant differences in the deposition and incorporation patterns of aerosols within the respirable range can be identified and integrated into sophisticated health physics models. Data characterizing the uranium particle size distribution resulting from specific foundry-related processes are needed. Using personal air sampling cascade impactors, particles collected from several foundry processes were sorted by activity median aerodynamic diameter onto various Marple substrates. After an initial gravimetric assessment of each impactor stage, the substrates were analyzed by alpha spectrometry to determine the uranium content of each stage. Alpha spectrometry provides rapid non-destructive isotopic data that can distinguish process uranium from natural sources and the degree of uranium contribution to the total accumulated particle load. In addition, the particle size bins utilized by the impactors provide adequate resolution to determine whether a process particle size distribution is lognormal, bimodal, or trimodal. Data on process uranium particle size values and distributions facilitate the development of more sophisticated and accurate models for internal dosimetry, resulting in an improved understanding of foundry worker health and safety.

  2. An evaluation of space acquired data as a tool for wildlife management in Alaska

    NASA Technical Reports Server (NTRS)

    Vantries, B. J. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Interpretation of ERTS-1 imagery by color-coded densitometric displays and digitally processed data verified that, with adequate quadrat in situ sampling, ERTS-1 data could be extrapolated to accurately describe the vegetative characteristics of analogous sites, and that surface acres of water for waterfowl production were obtainable for ponds a minimum of 5 acres in size.

  3. The ovenbird (Seiurus aurocapilla) as a model for testing food-value theory

    USGS Publications Warehouse

    Streby, Henry M.; Peterson, Sean M.; Scholtens, Brian; Monroe, Adrian; Andersen, David

    2013-01-01

    Food-value theory states that territorial animals space themselves such that each territory contains adequate food for rearing young. The ovenbird (Seiurus aurocapilla) is often cited as a species for which this hypothesis is supported because ovenbird territory size is inversely related to ground-invertebrate abundance within territories. However, little is known about juvenile ovenbird diet and whether food availability is accurately assessed using ground-sampling methods. We examined the relationship between ground-litter food availability and juvenile ovenbird diet in mixed northern hardwood-coniferous forests of north-central Minnesota. We sampled food availability with pitfall traps and litter samples, and concurrently sampled diet of juvenile ovenbirds from stomach samples. We found that juvenile ovenbirds were fed selectively from available food resources. In addition, we found that both ground-sampling methods greatly under-sampled forest caterpillars and snails, which together comprised 63% of juvenile ovenbird diet by mass. Combined with recent radio-telemetry findings that spot-mapping methods can poorly estimate territory size for forest songbirds, our results suggest that comparisons of spot-mapped ovenbird territories with ground-sampled invertebrate availability may not be reliable tests of food-value theory.

  4. What is an adequate sample size? Operationalising data saturation for theory-based interview studies.

    PubMed

    Francis, Jill J; Johnston, Marie; Robertson, Clare; Glidewell, Liz; Entwistle, Vikki; Eccles, Martin P; Grimshaw, Jeremy M

    2010-12-01

    In interview studies, sample size is often justified by interviewing participants until reaching 'data saturation'. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and a stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for other beliefs or studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about undergoing genetic testing. Studywise data saturation was achieved at interview 17. We propose that these principles be specified when reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies.
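
    The two principles translate directly into a stopping rule (our own formalization; the belief codes below are hypothetical):

    ```python
    def saturation_point(interviews, initial=10, stopping=3):
        """interviews: list of sets of belief codes, in interview order.
        Returns the 1-based index at which saturation is declared, or None."""
        seen, run = set(), 0
        for i, codes in enumerate(interviews, start=1):
            run = 0 if codes - seen else run + 1  # any new idea resets the run
            seen |= codes
            if i > initial and run >= stopping:
                return i
        return None

    data = [{"b1", "b2"}, {"b2", "b3"}, {"b4"}, {"b1"}, {"b5"}, {"b2"},
            {"b6"}, {"b3"}, {"b1"}, {"b7"}, {"b2"}, {"b1"}, {"b3"}]
    print(saturation_point(data))  # 13: interviews 11-13 add no new codes
    ```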

  5. Influence of size-fractioning techniques on concentrations of selected trace metals in bottom materials from two streams in northeastern Ohio

    USGS Publications Warehouse

    Koltun, G.F.; Helsel, Dennis R.

    1986-01-01

    Identical stream-bottom material samples, when fractioned to the same size by different techniques, may contain significantly different trace-metal concentrations. Precision of techniques also may differ, which could affect the ability to discriminate between size-fractioned bottom-material samples having different metal concentrations. Bottom-material samples fractioned to less than 0.020 millimeters by means of three common techniques (air elutriation, sieving, and settling) were analyzed for six trace metals to determine whether the technique used to obtain the desired particle-size fraction affects the ability to discriminate between bottom materials having different trace-metal concentrations. In addition, this study attempts to assess whether median trace-metal concentrations in size-fractioned bottom materials of identical origin differ depending on the size-fractioning technique used. Finally, this study evaluates the efficiency of the three size-fractioning techniques in terms of time, expense, and effort involved. Bottom-material samples were collected at two sites in northeastern Ohio: one is located in an undeveloped forested basin, and the other is located in a basin having a mixture of industrial and surface-mining land uses. The sites were selected for their close physical proximity, similar contributing drainage areas, and the likelihood that trace-metal concentrations in the bottom materials would be significantly different. Statistically significant differences in the concentrations of trace metals were detected between bottom-material samples collected at the two sites when the samples had been size-fractioned by means of air elutriation or sieving. Samples that had been size-fractioned by settling in native water did not differ measurably in any of the six trace metals analyzed. Results of multiple comparison tests suggest that differences related to size-fractioning technique were evident in median copper, lead, and iron concentrations. Technique-related differences in copper concentrations most likely resulted from contamination of air-elutriated samples by a feed tip on the elutriator apparatus. No technique-related differences were observed in chromium, manganese, or zinc concentrations. Although air elutriation was the most expensive size-fractioning technique investigated, samples fractioned by this technique appeared to provide a superior level of discrimination between metal concentrations present in the bottom materials of the two sites. Sieving was an adequate lower-cost but more labor-intensive alternative.

  6. Effects of sample size on KERNEL home range estimates

    USGS Publications Warehouse

    Seaman, D.E.; Millspaugh, J.J.; Kernohan, Brian J.; Brundige, Gary C.; Raedeke, Kenneth J.; Gitzen, Robert A.

    1999-01-01

    Kernel methods for estimating home range are being used increasingly in wildlife research, but the effect of sample size on their accuracy is not known. We used computer simulations of 10-200 points/home range and compared accuracy of home range estimates produced by fixed and adaptive kernels with the reference (REF) and least-squares cross-validation (LSCV) methods for determining the amount of smoothing. Simulated home ranges varied from simple to complex shapes created by mixing bivariate normal distributions. We used the size of the 95% home range area and the relative mean squared error of the surface fit to assess the accuracy of the kernel home range estimates. For both measures, the bias and variance approached an asymptote at about 50 observations/home range. The fixed kernel with smoothing selected by LSCV provided the least-biased estimates of the 95% home range area. All kernel methods produced similar surface fit for most simulations, but the fixed kernel with LSCV had the lowest frequency and magnitude of very poor estimates. We reviewed 101 papers published in The Journal of Wildlife Management (JWM) between 1980 and 1997 that estimated animal home ranges. A minority of these papers used nonparametric utilization distribution (UD) estimators, and most did not adequately report sample sizes. We recommend that home range studies using kernel estimates use LSCV to determine the amount of smoothing, obtain a minimum of 30 observations per animal (but preferably ≥50), and report sample sizes in published results.
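
    For intuition, a toy fixed-kernel estimate with cross-validated smoothing can be built from scipy (our construction; leave-one-out likelihood stands in for true least-squares cross-validation, and the 50 simulated fixes echo the recommended minimum):

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)
    pts = np.vstack([rng.normal(0, 1, (30, 2)),
                     rng.normal((4, 3), 1, (20, 2))])   # 50 location fixes

    def loo_score(pts, bw):
        """Leave-one-out log-likelihood of a candidate bandwidth factor."""
        return sum(np.log(gaussian_kde(np.delete(pts, i, 0).T,
                                       bw_method=bw)(pts[i])[0] + 1e-300)
                   for i in range(len(pts)))

    best = max([0.2, 0.4, 0.8, 1.2], key=lambda b: loo_score(pts, b))
    kde = gaussian_kde(pts.T, bw_method=best)

    gx, gy = np.mgrid[-5:9:200j, -5:8:200j]             # evaluation grid
    dens = kde(np.vstack([gx.ravel(), gy.ravel()]))
    cell = (gx[1, 0] - gx[0, 0]) * (gy[0, 1] - gy[0, 0])
    order = np.sort(dens)[::-1]
    idx = min(np.searchsorted(np.cumsum(order) * cell, 0.95), len(order) - 1)
    area = cell * (dens >= order[idx]).sum()            # 95% isopleth area
    print(f"bandwidth factor {best}, 95% home range ~ {area:.1f} map units^2")
    ```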

  7. A Bayesian Perspective on the Reproducibility Project: Psychology

    PubMed Central

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors—a quantity that can be used to express comparative evidence for an hypothesis but also for the null hypothesis—for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable. PMID:26919473

  8. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors (a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis) for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
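
    Dedicated Bayes factor software was used in the paper; as a rough, self-contained stand-in, the BIC approximation of Wagenmakers (2007) converts two fitted models' BICs into an approximate Bayes factor (illustrated for a two-group mean difference on simulated data):

    ```python
    import numpy as np

    def bic_bayes_factor_01(x, y):
        """Approximate BF01 (evidence for 'no difference') for two samples."""
        z = np.concatenate([x, y])
        n = len(z)
        rss0 = np.sum((z - z.mean()) ** 2)                       # H0: one mean
        rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)
        bic0 = n * np.log(rss0 / n) + 2 * np.log(n)              # mean + variance
        bic1 = n * np.log(rss1 / n) + 3 * np.log(n)              # 2 means + variance
        return np.exp((bic1 - bic0) / 2.0)

    rng = np.random.default_rng(11)
    x, y = rng.normal(0, 1, 30), rng.normal(0.2, 1, 30)   # small true effect
    print(round(bic_bayes_factor_01(x, y), 2))  # often > 1: n = 30 is too weak
    ```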

  9. Statistical considerations in monitoring birds over large areas

    USGS Publications Warehouse

    Johnson, D.H.

    2000-01-01

    The proper design of a monitoring effort depends primarily on the objectives desired, constrained by the resources available to conduct the work. Typically, managers have numerous objectives, such as determining abundance of the species, detecting changes in population size, evaluating responses to management activities, and assessing habitat associations. A design that is optimal for one objective will likely not be optimal for others. Careful consideration of the importance of the competing objectives may lead to a design that adequately addresses the priority concerns, although it may not be optimal for any individual objective. Poor design or inadequate sample sizes may result in such weak conclusions that the effort is wasted. Statistical expertise can be used at several stages, such as estimating power of certain hypothesis tests, but is perhaps most useful in fundamental considerations of describing objectives and designing sampling plans.

  10. A pilot randomized trial of two cognitive rehabilitation interventions for mild cognitive impairment: caregiver outcomes.

    PubMed

    Cuc, Andrea V; Locke, Dona E C; Duncan, Noah; Fields, Julie A; Snyder, Charlene Hoffman; Hanna, Sherrie; Lunde, Angela; Smith, Glenn E; Chandler, Melanie

    2017-12-01

    This study aims to provide effect size estimates of the impact of two cognitive rehabilitation interventions provided to patients with mild cognitive impairment, computerized brain fitness exercise and the memory support system, on support partners' outcomes of depression, anxiety, quality of life, and partner burden. A randomized controlled pilot trial was performed. At 6 months, the partners from both treatment groups showed stable to improved depression scores, while partners in an untreated control group showed worsening depression over 6 months. There were no statistically significant differences in anxiety, quality of life, or burden outcomes in this small pilot trial; however, effect sizes were moderate, suggesting that the sample sizes in this pilot study were not adequate to detect statistical significance. Either form of cognitive rehabilitation may help partners' mood, compared with providing no treatment. However, effect size estimates related to other partner outcomes (i.e., burden, quality of life, and anxiety) suggest that follow-up efficacy trials will need sample sizes of at least 30-100 people per group to accurately determine significance. Copyright © 2017 John Wiley & Sons, Ltd.

  11. High Strain Rate Mechanical Properties of Epoxy and Epoxy-Based Particulate Composites

    DTIC Science & Technology

    2007-08-01

    and titanium alloy (Ti-6Al-4V) bar materials available. For all bar systems, the properties of the sample are determined by measuring the... polished, carbon-coated specimens provided adequate contrast between the aluminum particles, the epoxy matrix and any porosity present after curing... difference between the two measures of particle size can be explained by the higher levels of porosity observed in the Epoxy-65H2 specimen, which

  12. HSQC-1,n-ADEQUATE: a new approach to long-range 13C-13C correlation by covariance processing.

    PubMed

    Martin, Gary E; Hilton, Bruce D; Willcott, M Robert; Blinov, Kirill A

    2011-10-01

    Long-range, two-dimensional heteronuclear shift correlation NMR methods play a pivotal role in the assembly of novel molecular structures. The well-established GHMBC method is a high-sensitivity mainstay technique, affording connectivity information via (n)J(CH) coupling pathways. Unfortunately, there is no simple way of determining the value of n and hence no way of differentiating two-bond from three- and occasionally four-bond correlations. Three-bond correlations, however, generally predominate. Recent work has shown that the unsymmetrical indirect covariance or generalized indirect covariance processing of multiplicity-edited GHSQC and 1,1-ADEQUATE spectra provides high-sensitivity access to a (13)C-(13)C connectivity map in the form of an HSQC-1,1-ADEQUATE spectrum. Covariance processing of these data allows the 1,1-ADEQUATE connectivity information to be exploited with the inherent sensitivity of the GHSQC spectrum rather than the intrinsically lower sensitivity of the 1,1-ADEQUATE spectrum itself. Data acquisition times and/or sample size can be substantially reduced when covariance processing is to be employed. In an extension of that work, 1,n-ADEQUATE spectra can likewise be subjected to covariance processing to afford high-sensitivity access to the equivalent of (4)J(CH) GHMBC connectivity information. The method is illustrated using strychnine as a model compound. Copyright © 2011 John Wiley & Sons, Ltd.
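
    The covariance step itself is a matrix multiplication over the shared proton dimension (schematic toy matrices below, not real spectra):

    ```python
    import numpy as np

    n_c, n_h = 50, 200                 # 13C and 1H grid sizes, illustrative
    rng = np.random.default_rng(5)
    hsqc = rng.random((n_c, n_h))      # HSQC: rows 13C, columns 1H
    adequate = rng.random((n_c, n_h))  # 1,n-ADEQUATE: rows 13C, columns 1H

    # Contracting the shared 1H axis leaves a 13C x 13C connectivity map.
    cc_map = hsqc @ adequate.T
    print(cc_map.shape)                # (50, 50)
    ```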

  13. Towards Monitoring Biodiversity in Amazonian Forests: How Regular Samples Capture Meso-Scale Altitudinal Variation in 25 km2 Plots

    PubMed Central

    Norris, Darren; Fortin, Marie-Josée; Magnusson, William E.

    2014-01-01

    Background: Ecological monitoring and sampling optima are context and location specific. Novel applications (e.g. biodiversity monitoring for environmental service payments) call for renewed efforts to establish reliable and robust monitoring in biodiversity-rich areas. As there is little information on the distribution of biodiversity across the Amazon basin, we used altitude as a proxy for biological variables to test whether meso-scale variation can be adequately represented by different sample sizes in a standardized, regular-coverage sampling arrangement. Methodology/Principal Findings: We used Shuttle-Radar-Topography-Mission digital elevation values to evaluate whether the regular sampling arrangement in standard RAPELD (rapid assessments ("RAP") over the long term (LTER ["PELD" in Portuguese])) grids captured patterns of meso-scale spatial variation. The adequacy of different sample sizes (n = 4 to 120) was examined within 32,325 km²/3,232,500 ha (1,293 sample areas of 25 km² each) distributed across the legal Brazilian Amazon. Kolmogorov-Smirnov tests, correlation and root-mean-square error were used to measure sample representativeness, similarity and accuracy, respectively. Trends and thresholds of these responses in relation to sample size and standard deviation were modeled using Generalized Additive Models and conditional inference trees, respectively. We found that a regular arrangement of 30 samples captured the distribution of altitude values within these areas. Sample size was more important than sample standard deviation for representativeness and similarity. In contrast, accuracy was more strongly influenced by sample standard deviation. Additionally, analysis of spatially interpolated data showed that spatial patterns in altitude were also recovered within areas using a regular arrangement of 30 samples. Conclusions/Significance: Our findings show that the logistically feasible sample used in the RAPELD system successfully recovers meso-scale altitudinal patterns. This suggests that the sample size and regular arrangement may also be generally appropriate for quantifying spatial patterns in biodiversity at similar scales across at least 90% (≈5 million km²) of the Brazilian Amazon. PMID:25170894
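
    The representativeness check generalizes readily; a toy version with synthetic terrain (standing in for the SRTM tiles) compares a regular subsample against the full altitude distribution with a Kolmogorov-Smirnov test:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    u, v = np.meshgrid(np.linspace(0, 4 * np.pi, 200),
                       np.linspace(0, 4 * np.pi, 200))
    dem = 100 + 30 * np.sin(u) * np.cos(v) + rng.normal(0, 2, u.shape)

    def regular_sample(dem, n_per_side):
        idx = np.linspace(0, dem.shape[0] - 1, n_per_side).astype(int)
        return dem[np.ix_(idx, idx)].ravel()   # evenly spaced grid of sites

    for n_side in (2, 4, 6):                   # 4, 16, 36 regular samples
        d, p = stats.ks_2samp(regular_sample(dem, n_side), dem.ravel())
        print(f"n={n_side ** 2:2d}  KS D={d:.3f}  p={p:.3f}")
    ```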

  14. Does the bathing water classification depend on sampling strategy? A bootstrap approach for bathing water quality assessment, according to Directive 2006/7/EC requirements.

    PubMed

    López, Iago; Alvarez, César; Gil, José L; Revilla, José A

    2012-11-30

    Data on the 95th and 90th percentiles of bacteriological quality indicators are used to classify bathing waters in Europe, according to the requirements of Directive 2006/7/EC. However, percentile values, and consequently the classification of bathing waters, depend on both sampling effort and sample size, which may undermine an appropriate assessment of bathing water classification. To analyse the influence of sampling effort and sample size on water classification, a bootstrap approach was applied to 55 bacteriological quality datasets from several beaches in the Balearic Islands (Spain). Our results show that the probability of failing the regulatory standards of the Directive is high when sample size is low, due to higher variability in percentile values. Thus, 49% of the bathing waters reaching an "Excellent" classification (95th percentile of Escherichia coli under 250 cfu/100 ml) can fail the "Excellent" regulatory standard due to sampling strategy when 23 samples per season are considered. This percentage increases to 81% when 4 samples per season are considered. "Good" regulatory standards can also be failed in bathing waters with an "Excellent" classification as a result of these sampling strategies. The variability in percentile values may affect bathing water classification and is critical for the appropriate design and implementation of bathing water Quality Monitoring and Assessment Programs. Hence, the variability of percentile values should be taken into account by authorities if adequate management of these areas is to be achieved. Copyright © 2012 Elsevier Ltd. All rights reserved.
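
    The bootstrap logic is compact (illustrative lognormal counts, not the study's data): resample a season's E. coli record and tally how often the 95th percentile crosses the 250 cfu/100 ml "Excellent" threshold as the per-season sample count shrinks.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    season = rng.lognormal(mean=3.5, sigma=1.0, size=500)  # hypothetical cfu/100 ml

    for n in (23, 12, 4):
        p95 = np.array([np.percentile(rng.choice(season, n, replace=True), 95)
                        for _ in range(10_000)])
        print(f"n={n:2d}  P(95th percentile > 250) = {(p95 > 250).mean():.2f}")
    ```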

  15. Development and evaluation of a water level proportional water sampler

    NASA Astrophysics Data System (ADS)

    Schneider, P.; Lange, A.; Doppler, T.

    2013-12-01

    We developed and adapted a new type of sampler for time-integrated, water-level-proportional water quality sampling (e.g. nutrients, contaminants and stable isotopes). Our samplers are designed for sampling small to mid-sized streams and are based on the law of Hagen-Poiseuille, in which a capillary (or a valve) limits the sampling aliquot by reducing the air flux out of a submersed plastic (HDPE) sampling container. They are a good alternative to battery-operated automated water samplers when working in remote areas, or at streams characterized by pronounced daily discharge variations, such as glacier streams. We evaluated our samplers against standard automated water samplers (ISCO 2900 and ISCO 6712) during snowmelt in the Black Forest and the Alps, and tested them in remote glacial catchments in Iceland, Switzerland and Kyrgyzstan. The results clearly showed that our samplers are an adequate tool for time-integrated, water-level-proportional water sampling at remote test sites, as they do not need batteries and are relatively inexpensive, lightweight, and compact. They are well suited for headwater streams - especially when sampling for stable isotopes - as the sampled water is perfectly protected against evaporation. Moreover, our samplers have a reduced risk of icing in cold environments, as they are installed submersed in water, whereas automated samplers (typically installed outside the stream) may become clogged due to icing of hoses. Based on this study, we find these samplers to be an adequate replacement for automated samplers when time-integrated sampling or solute load estimates are the main monitoring tasks.
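
    A back-of-envelope reading of the Hagen-Poiseuille mechanism described above: the aliquot rate is set by how fast air can escape through the capillary. All dimensions and the pressure head below are assumed purely for illustration.

    ```python
    # Hagen-Poiseuille flow Q = pi * r^4 * dP / (8 * mu * L) for air escaping
    # the submersed container; every number here is an assumed example value.
    import math

    r = 25e-6        # capillary radius (m), assumed
    L = 0.05         # capillary length (m), assumed
    mu = 1.8e-5      # dynamic viscosity of air (Pa s)
    head = 0.30      # water level above the container inlet (m), assumed

    dP = 1000.0 * 9.81 * head                 # hydrostatic driving pressure (Pa)
    Q = math.pi * r**4 * dP / (8 * mu * L)    # air outflow rate (m^3/s)
    print(f"Q = {Q * 1e9:.2f} uL/s = {Q * 3.6e9:.1f} mL/h")
    ```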

  16. High Strain Rate Mechanical Properties of Epoxy and Epoxy-Based Particulate Composites (Preprint)

    DTIC Science & Technology

    2007-05-01

    ...WC) and titanium alloy (Ti-6Al-4V) bar materials available. For all bar systems, the properties of the sample are determined by measuring the ... metallographically polished, carbon-coated specimens provided adequate contrast between the aluminum particles, the epoxy matrix, and any porosity present after ... The difference between the two measures of particle size can be explained by the higher levels of porosity observed in the Epoxy-65H2 specimen, which ...

  17. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons, ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for the statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis-driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Clinimetric evaluation of shoulder disability questionnaires: a systematic review of the literature

    PubMed Central

    Bot, S; Terwee, C; van der Windt, D A W M; Bouter, L; Dekker, J; de Vet, H C W

    2004-01-01

    Methods: Systematic literature searches were performed to identify self-administered shoulder disability questionnaires. A checklist was developed to evaluate and compare the clinimetric quality of the instruments. Results: Two reviewers identified and evaluated 16 questionnaires using our checklist. Most studies were found for the Disability of the Arm, Shoulder, and Hand scale (DASH), the Shoulder Pain and Disability Index (SPADI), and the American Shoulder and Elbow Surgeons Standardised Shoulder Assessment Form (ASES). None of the questionnaires demonstrated satisfactory results for all properties. Most questionnaires claim to measure several domains (for example, pain, physical, emotional, and social functioning), yet dimensionality was studied in only three instruments. The internal consistency was calculated for seven questionnaires and only one received an adequate rating. Twelve questionnaires received positive ratings for construct validity, although, depending on the population studied, four of these questionnaires also received poor ratings. Seven questionnaires were shown to have adequate test-retest reliability (ICC >0.70), but five questionnaires were tested inadequately. In most clinimetric studies only small sample sizes (n<43) were used. Nearly all publications lacked information on the interpretation of scores. Conclusion: The DASH, SPADI, and ASES have been studied most extensively, and yet even published validation studies of these instruments have limitations in study design, sample sizes, or evidence for dimensionality. Overall, the DASH received the best ratings for its clinimetric properties. PMID:15020324

  19. [An investigation of the statistical power of the effect size in randomized controlled trials for the treatment of patients with type 2 diabetes mellitus using Chinese medicine].

    PubMed

    Ma, Li-Xin; Liu, Jian-Ping

    2012-01-01

    To investigate whether the power of the effect size was based on an adequate sample size in randomized controlled trials (RCTs) for the treatment of patients with type 2 diabetes mellitus (T2DM) using Chinese medicine. The China Knowledge Resource Integrated Database (CNKI), VIP Database for Chinese Technical Periodicals (VIP), Chinese Biomedical Database (CBM), and Wanfang Data were systematically searched using terms like "Xiaoke" or diabetes, Chinese herbal medicine, patent medicine, traditional Chinese medicine, randomized, controlled, blinded, and placebo-controlled. Searches were limited to trials with an intervention course ≥ 3 months in order to identify information on outcome assessment and sample size. Data collection forms were made according to the checklist of the CONSORT statement. Independent double data extraction was performed on all included trials. The statistical power of the effect size for each RCT was assessed using sample size calculation equations. (1) A total of 207 RCTs were included, comprising 111 superiority trials and 96 non-inferiority trials. (2) Among the 111 superiority trials, the fasting plasma glucose (FPG) and glycosylated hemoglobin (HbA1c) outcome measures were reported in 9% and 12% of the RCTs, respectively, with a sample size > 150 in each trial. For the outcome of HbA1c, only 10% of the RCTs had more than 80% power. For FPG, 23% of the RCTs had more than 80% power. (3) In the 96 non-inferiority trials, the outcomes FPG and HbA1c were reported in 31% and 36% of the RCTs, respectively, with a sample size > 150. For HbA1c, only 36% of the RCTs had more than 80% power. For FPG, only 27% of the studies had more than 80% power. The sample sizes used were distressingly low and most RCTs did not achieve 80% power. In order to obtain sufficient statistical power, it is recommended that clinical trials first establish a clear research objective and hypothesis, choose a scientific and evidence-based study design and outcome measures, and calculate the required sample size to ensure a precise research conclusion.
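
    As a sketch of the power arithmetic such a review applies, the snippet below computes the per-arm sample size needed to detect a between-group HbA1c difference with a two-sample t-test; the assumed difference (0.5%) and standard deviation (1.5%) are illustrative values, not figures from the reviewed trials.

    ```python
    # Two-sample t-test power calculation; delta and sd are assumed examples.
    from statsmodels.stats.power import TTestIndPower

    delta, sd = 0.5, 1.5                     # assumed HbA1c difference and SD (%)
    analysis = TTestIndPower()
    n_per_arm = analysis.solve_power(effect_size=delta / sd, alpha=0.05, power=0.80)
    print(f"~{n_per_arm:.0f} patients per arm")        # about 143 per arm

    # Conversely, the achieved power of a trial with 75 patients per arm:
    power = analysis.solve_power(effect_size=delta / sd, nobs1=75, alpha=0.05)
    print(f"power with 75 per arm: {power:.2f}")       # well below 0.80
    ```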

  20. Parameter recovery, bias and standard errors in the linear ballistic accumulator model.

    PubMed

    Visser, Ingmar; Poessé, Rens

    2017-05-01

    The linear ballistic accumulator (LBA) model (Brown & Heathcote, 2008, Cogn. Psychol., 57, 153) is increasingly popular in modelling response times from experimental data. An R package, glba, has been developed to fit the LBA model using maximum likelihood estimation, which is validated by means of a parameter recovery study. At sufficient sample sizes parameter recovery is good, whereas at smaller sample sizes there can be large bias in parameters. In a second simulation study, two methods for computing parameter standard errors are compared. The Hessian-based method is found to be adequate and is (much) faster than the alternative bootstrap method. The use of parameter standard errors in model selection and inference is illustrated in an example using data from an implicit learning experiment (Visser et al., 2007, Mem. Cogn., 35, 1502). It is shown that typical implicit learning effects are captured by different parameters of the LBA model. © 2017 The British Psychological Society.
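
    The LBA likelihood itself is too long to sketch here, but the Hessian-based standard-error idea the study evaluates is generic: invert (an approximation of) the Hessian of the negative log-likelihood at the maximum-likelihood estimate. The toy model below is a plain normal fit, chosen so the answer can be checked analytically; it is not the glba package.

    ```python
    # Hessian-based standard errors from a BFGS fit of a toy normal model.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(7)
    x = rng.normal(loc=0.6, scale=0.25, size=400)   # stand-in "response times"

    def nll(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        return 0.5 * np.sum(((x - mu) / sigma) ** 2) + x.size * log_sigma

    res = minimize(nll, x0=[0.0, 0.0], method="BFGS")
    se = np.sqrt(np.diag(res.hess_inv))   # BFGS's inverse-Hessian approximation
    print(f"mu_hat = {res.x[0]:.4f}, SE = {se[0]:.4f}")
    print(f"analytic SE of mu: {x.std(ddof=1) / np.sqrt(x.size):.4f}")
    ```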

  1. Annual variation in polychlorinated biphenyl (PCB) exposure in tree swallow (Tachycineta bicolor) eggs and nestlings at Great Lakes Restoration Initiative (GLRI) study sites

    USGS Publications Warehouse

    Custer, Christine M.; Custer, Thomas W.; Dummer, Paul; Goldberg, Diana R.; Franson, J. Christian

    2018-01-01

    Tree swallow (Tachycineta bicolor) eggs and nestlings were collected from 16 sites across the Great Lakes to quantify normal annual variation in total polychlorinated biphenyl (PCB) exposure and to validate the sample size choice in earlier work. A sample size of five eggs or five nestlings per site was adequate to quantify exposure to PCBs in tree swallows given the current exposure levels and variation. There was no difference in PCB exposure in two randomly selected sets of five eggs collected in the same year, but analyzed in different years. Additionally, there was only modest annual variation in exposure, with between 69% (nestlings) and 73% (eggs) of sites having no differences between years. There was a tendency, both statistically and qualitatively, for there to be less exposure in the second year compared to the first year.

  2. Women's health: periodontitis and its relation to hormonal changes, adverse pregnancy outcomes and osteoporosis.

    PubMed

    Krejci, Charlene B; Bissada, Nabil F

    2012-01-01

    To examine the literature with respect to periodontitis and issues specific to women's health, namely, hormonal changes, adverse pregnancy outcomes and osteoporosis. The literature was evaluated to review reported associations between periodontitis and gender-specific issues, namely, hormonal changes, adverse pregnancy outcomes and osteoporosis. Collectively, the literature provided a large body of evidence that supports various associations between periodontitis and hormonal changes, adverse pregnancy outcomes and osteoporosis; however, certain shortcomings were noted with respect to biases involving definitions, sample sizes and confounding variables. Specific cause and effect relationships could not be delineated at this time, and neither could definitive treatment interventions. Future research must include randomised controlled trials with consistent definitions, adequate controls and sufficiently large sample sizes in order to clarify specific associations, identify cause and effect relationships, define treatment options and determine treatment interventions which will lessen the untoward effects on at-risk populations.

  3. Online extraction LC-MS/MS method for the simultaneous quantitative confirmation of urine drugs of abuse and metabolites: amphetamines, opiates, cocaine, cannabis, benzodiazepines and methadone.

    PubMed

    de Jager, Andrew D; Bailey, Neville L

    2011-09-01

    A rapid LC-MS/MS method for confirmatory testing of five major categories of drugs of abuse (amphetamine-type substances, opiates, cocaine, cannabis metabolites and benzodiazepines) in urine has been developed. All drugs of abuse mandated by the Australian/New Zealand Standard AS/NZS 4308:2008 are quantified in a single chromatographic run. Urine samples are diluted with a mixture of isotope-labelled internal standards. An on-line trap-and-flush approach, followed by LC-ESI-MS/MS, has been successfully used to process samples in a functioning drugs-of-abuse laboratory. Following injection of diluted urine samples, compounds retained on the trap cartridge are flushed onto a reverse-phase C18 HPLC column (5-μm particle size) with embedded hydrophilic functionality. A total chromatographic run-time of 15 min is required for adequate resolution. Automated quantitation software algorithms have been developed in-house using XML scripting to partially automate the identification of positive samples, taking into account ion ratios (IR) and retention times (Rt). The sensitivity of the assay was found to be adequate for the quantitation of drugs in urine at and below the confirmation cut-off concentrations prescribed by AS/NZS 4308:2008. Copyright © 2011 Elsevier B.V. All rights reserved.
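
    A hypothetical sketch of the kind of rule such in-house identification algorithms encode: a compound is called positive only if its concentration meets the cut-off and its retention time and ion ratio fall within tolerances of the calibrator. The tolerances and example values below are invented, not taken from AS/NZS 4308:2008.

    ```python
    # Invented acceptance rule for illustration; tolerances are assumptions.
    def is_positive(conc, cutoff, rt, rt_ref, ir, ir_ref,
                    rt_tol=0.025, ir_tol=0.20):
        """Positive only if cut-off, retention time and ion ratio all pass."""
        rt_ok = abs(rt - rt_ref) <= rt_tol * rt_ref    # within 2.5% of calibrator Rt
        ir_ok = abs(ir - ir_ref) <= ir_tol * ir_ref    # within 20% relative IR
        return conc >= cutoff and rt_ok and ir_ok

    # e.g. a hypothetical opiate at 350 ng/mL against a 300 ng/mL cut-off:
    print(is_positive(350, 300, rt=6.42, rt_ref=6.38, ir=0.55, ir_ref=0.50))
    ```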

  4. The effectiveness of increased apical enlargement in reducing intracanal bacteria.

    PubMed

    Card, Steven J; Sigurdsson, Asgeir; Orstavik, Dag; Trope, Martin

    2002-11-01

    It has been suggested that the apical portion of a root canal is not adequately disinfected by typical instrumentation regimens. The purpose of this study was to determine whether instrumentation to sizes larger than typically used would more effectively remove culturable bacteria from the canal. Forty patients with clinical and radiographic evidence of apical periodontitis were recruited from the endodontic clinic. Mandibular cuspids (n = 2), bicuspids (n = 11), and molars (mesial roots) (n = 27) were selected for the study. Bacterial sampling was performed upon access and after each of two consecutive instrumentations. The first instrumentation utilized 1% NaOCl and 0.04 taper ProFile rotary files. The cuspid and bicuspid canals were instrumented to a #8 size and the molar canals to a #7 size. The second instrumentation utilized LightSpeed files and 1% NaOCl irrigation for further enlargement of the apical third. Typically, molars were instrumented to size 60 and cuspid/bicuspid canals to size 80. Our findings show that 100% of the cuspid/bicuspid canals and 81.5% of the molar canals were rendered bacteria-free after the first instrumentation. The molar results improved to 89% after the second instrumentation. Of the (59.3%) molar mesial canals without a clinically detectable communication, 93% were rendered bacteria-free with the first instrumentation. Using a Wilcoxon rank sum test, statistically significant differences (p < 0.0001) were found between the initial sample and the samples after the first and second instrumentations. The differences between the samples that followed the two instrumentation regimens were not significant (p = 0.0617). It is concluded that simple root canal systems (without multiple canal communications) may be rendered bacteria-free when preparation of this type is utilized.

  5. Sample size requirements for separating out the effects of combination treatments: randomised controlled trials of combination therapy vs. standard treatment compared to factorial designs for patients with tuberculous meningitis.

    PubMed

    Wolbers, Marcel; Heemskerk, Dorothee; Chau, Tran Thi Hong; Yen, Nguyen Thi Bich; Caws, Maxine; Farrar, Jeremy; Day, Jeremy

    2011-02-02

    In certain diseases clinical experts may judge that the intervention with the best prospects is the addition of two treatments to the standard of care. This can either be tested with a simple randomized trial of combination versus standard treatment or with a 2 × 2 factorial design. We compared the two approaches using the design of a new trial in tuberculous meningitis as an example. In that trial the combination of 2 drugs added to standard treatment is assumed to reduce the hazard of death by 30% and the sample size of the combination trial to achieve 80% power is 750 patients. We calculated the power of corresponding factorial designs with one- to sixteen-fold the sample size of the combination trial depending on the contribution of each individual drug to the combination treatment effect and the strength of an interaction between the two. In the absence of an interaction, an eight-fold increase in sample size for the factorial design as compared to the combination trial is required to get 80% power to jointly detect effects of both drugs if the contribution of the less potent treatment to the total effect is at least 35%. An eight-fold sample size increase also provides a power of 76% to detect a qualitative interaction at the one-sided 10% significance level if the individual effects of both drugs are equal. Factorial designs with a lower sample size have a high chance to be underpowered, to show significance of only one drug even if both are equally effective, and to miss important interactions. Pragmatic combination trials of multiple interventions versus standard therapy are valuable in diseases with a limited patient pool if all interventions test the same treatment concept, it is considered likely that either both or none of the individual interventions are effective, and only moderate drug interactions are suspected. An adequately powered 2 × 2 factorial design to detect effects of individual drugs would require at least 8-fold the sample size of the combination trial. Current Controlled Trials ISRCTN61649292.
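
    A rough cross-check of the stated design, using Schoenfeld's event-count formula for a log-rank comparison: detecting a 30% hazard reduction (HR = 0.7) with 80% power at two-sided alpha = 0.05 under 1:1 allocation needs about 247 deaths; the 750-patient total then depends on the anticipated event rate, which the abstract does not state.

    ```python
    # Schoenfeld's formula: events = (z_alpha/2 + z_beta)^2 / (p*(1-p)*ln(HR)^2).
    import numpy as np
    from scipy.stats import norm

    hr, alpha, power = 0.70, 0.05, 0.80
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    events = z**2 / (0.5 * 0.5 * np.log(hr) ** 2)   # 1:1 allocation, p = 0.5
    print(f"~{events:.0f} deaths required")          # about 247
    ```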

  6. Determination of the efficacy of preservation of non-eye area water-miscible cosmetic and toiletry formulations: collaborative study.

    PubMed

    Machtiger, N A; Fischler, G E; Adams, M C; Spielmaker, R; Graf, J F

    2001-01-01

    A collaborative study was conducted to test a method developed to distinguish between adequately and inadequately preserved cosmetic formulations. Nineteen laboratories participated in the study. Samples tested included shampoos, hair conditioners, oil-in-water emulsions, and water-in-oil-emulsions. Triplicate samples of 4 adequately preserved and 4 inadequately preserved cosmetic products were tested by each collaborative laboratory. Results showed that all inadequately preserved shampoo and conditioner samples failed to meet the acceptance criteria for adequately preserved formulations. Of the 51 preserved samples, 49 shampoos and 48 conditioners met the criteria for adequate preservation. All samples of inadequately preserved water-in-oil emulsions and oil-in-water emulsions failed to meet the acceptance criteria, whereas all adequately preserved emulsion formulations met the acceptance criteria.

  7. Sources of variability in collection and preparation of paint and lead-coating samples.

    PubMed

    Harper, S L; Gutknecht, W F

    2001-06-01

    Chronic exposure of children to lead (Pb) can result in permanent physiological impairment. Since surfaces coated with lead-containing paints and varnishes are potential sources of exposure, it is extremely important that reliable methods for sampling and analysis be available. The sources of variability in the collection and preparation of samples were investigated to improve the performance and comparability of methods and to ensure that data generated will be adequate for its intended use. Paint samples of varying sizes (areas and masses) were collected at different locations across a variety of surfaces including metal, plaster, concrete, and wood. A variety of grinding techniques were compared. Manual mortar and pestle grinding for at least 1.5 min and mechanized grinding techniques were found to generate similar homogenous particle size distributions required for aliquots as small as 0.10 g. When 342 samples were evaluated for sample weight loss during mortar and pestle grinding, 4% had 20% or greater loss with a high of 41%. Homogenization and sub-sampling steps were found to be the principal sources of variability related to the size of the sample collected. Analysis of samples from different locations on apparently identical surfaces were found to vary by more than a factor of two both in Pb concentration (mg cm-2 or %) and areal coating density (g cm-2). Analyses of substrates were performed to determine the Pb remaining after coating removal. Levels as high as 1% Pb were found in some substrate samples, corresponding to more than 35 mg cm-2 Pb. In conclusion, these sources of variability must be considered in development and/or application of any sampling and analysis methodologies.

  8. Seven ways to increase power without increasing N.

    PubMed

    Hansen, W B; Collins, L M

    1994-01-01

    Many readers of this monograph may wonder why a chapter on statistical power was included. After all, by now the issue of statistical power is in many respects mundane. Everyone knows that statistical power is a central research consideration, and certainly most National Institute on Drug Abuse grantees or prospective grantees understand the importance of including a power analysis in research proposals. However, there is ample evidence that, in practice, prevention researchers are not paying sufficient attention to statistical power. If they were, the findings observed by Hansen (1992) in a recent review of the prevention literature would not have emerged. Hansen (1992) examined statistical power based on 46 cohorts followed longitudinally, using nonparametric assumptions given the subjects' age at posttest and the numbers of subjects. Results of this analysis indicated that, in order for a study to attain 80-percent power for detecting differences between treatment and control groups, the difference between groups at posttest would need to be at least 8 percent (in the best studies) and as much as 16 percent (in the weakest studies). In order for a study to attain 80-percent power for detecting group differences in pre-post change, 22 of the 46 cohorts would have needed relative pre-post reductions of greater than 100 percent. Thirty-three of the 46 cohorts had less than 50-percent power to detect a 50-percent relative reduction in substance use. These results are consistent with other review findings (e.g., Lipsey 1990) that have shown a similar lack of power in a broad range of research topics. Thus, it seems that, although researchers are aware of the importance of statistical power (particularly of the necessity for calculating it when proposing research), they somehow are failing to end up with adequate power in their completed studies. This chapter argues that the failure of many prevention studies to maintain adequate statistical power is due to an overemphasis on sample size (N) as the only, or even the best, way to increase statistical power. It is easy to see how this overemphasis has come about. Sample size is easy to manipulate, has the advantage of being related to power in a straightforward way, and usually is under the direct control of the researcher, except for limitations imposed by finances or subject availability. Another option for increasing power is to increase the alpha used for hypothesis-testing but, as very few researchers seriously consider significance levels much larger than the traditional .05, this strategy seldom is used. Of course, sample size is important, and the authors of this chapter are not recommending that researchers cease choosing sample sizes carefully. Rather, they argue that researchers should not confine themselves to increasing N to enhance power. It is important to take additional measures to maintain and improve power over and above making sure the initial sample size is sufficient. The authors recommend two general strategies. One strategy involves attempting to maintain the effective initial sample size so that power is not lost needlessly. The other strategy is to take measures to maximize the third factor that determines statistical power: effect size.
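
    A sketch of the arithmetic behind figures like Hansen's: the per-group N needed to detect an 8-percentage-point difference in substance-use prevalence at 80% power. The 30% vs 22% control/treatment rates are assumed for illustration.

    ```python
    # Power for a two-proportion comparison via the arcsine effect size.
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    es = proportion_effectsize(0.30, 0.22)        # 8-point difference, assumed rates
    n = NormalIndPower().solve_power(effect_size=es, alpha=0.05, power=0.80)
    print(f"~{n:.0f} subjects per group")         # a few hundred per group
    ```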

  9. Assessing Unmet and Latent Demand for Pharmacists at the State Level

    PubMed Central

    Arora, Prachi; Mott, David A.; Chui, Michelle A.; Kreling, David H.

    2016-01-01

    Background Past reports suggest that a near balance has been reached in the supply and demand for pharmacists in the US. Although data on the level of supply of pharmacists is available, there is no continuous and systematic tracking of the level of demand (unmet and latent) for pharmacists at the state level. Unmet demand, an established construct in the pharmacy workforce literature, is important for measuring the number of vacancies and assessing pharmacist shortage consistently over time. Latent demand, or potential demand, is a novel construct and has never been measured in the pharmacy workforce. With the increase in supply, it is important to measure the potential demand that could be budgeted in pharmacies in the near future. Objective The objective of this study was to measure the unmet and latent demand for pharmacists and explore the association between latent demand and workload characteristics in community and hospital pharmacies in Wisconsin in 2011-12. Methods The study used a cross-sectional, descriptive survey design. A sample of community pharmacies (n=1,064) and hospital pharmacies (n=126) licensed in Wisconsin in 2011-12 was identified. Key informants (managers/owners) of sampled pharmacies were sent a one-page cover letter explaining the purpose of the study and requesting participation, and a three-page survey form. The main outcome measures of the study were the total number of FTE pharmacist positions vacant, presence of adequate staff size, additional number of FTE pharmacist positions needed to attain adequate staff size, prescription volume, daily census, hospital size, and number of hours the prescription department is open. Descriptive statistics were calculated for all the pharmacies collectively, then separately for community and hospital pharmacies. Pharmacy setting, vacancies, and workload characteristics of pharmacies with and without latent demand were compared using the chi-squared test of independence and/or t-test. Sample weights were calculated and used in all the analyses to weight the estimates to all pharmacies in Wisconsin. Results The overall response rate to the survey was 50.1%. Of the total number of FTE pharmacist positions budgeted in Wisconsin, 54.3 FTE positions (1.5%) were reported vacant in 2011-12. Approximately 28.2% of the community and hospital pharmacies reported the presence of latent demand. Latent demand was significantly associated with higher workload in community pharmacies and larger bed size in hospital pharmacies. Conclusion There appeared to be a balance between the supply and demand for pharmacists in Wisconsin in 2011-12. There is potential for additional FTE positions (latent demand) to be budgeted in pharmacies to attain adequate pharmacist staff size. It is important to consistently track the level of unmet and latent demand for pharmacists in Wisconsin and combine this information with other workforce characteristics to guide the decision making of pharmacy workforce planners and pharmacy managers. PMID:27330846

  10. Spatial distribution of nymphs of Scaphoideus titanus (Homoptera: Cicadellidae) in grapes, and evaluation of sequential sampling plans.

    PubMed

    Lessio, Federico; Alma, Alberto

    2006-04-01

    The spatial distribution of the nymphs of Scaphoideus titanus Ball (Homoptera: Cicadellidae), the vector of grapevine flavescence dorée (Candidatus Phytoplasma vitis, 16Sr-V), was studied by applying Taylor's power law. Studies were conducted from 2002 to 2005, in organic and conventional vineyards of Piedmont, northern Italy. Minimum sample sizes and fixed-precision-level stop lines were calculated to develop appropriate sampling plans. Model validation was performed, using independent field data, by means of the Resampling Validation of Sample Plans (RVSP) software. The nymphal distribution, analyzed via Taylor's power law, was aggregated, with b = 1.49. A sample of 32 plants was adequate at low pest densities with a precision level of D0 = 0.30; but for a more accurate estimate (D0 = 0.10), the required sample size rises to 292 plants. Green's fixed-precision-level stop lines seem to be more suitable for field sampling: RVSP simulations of this sampling plan showed precision levels very close to the desired levels. However, at a prefixed precision level of 0.10, sampling would become too time-consuming, whereas a precision level of 0.25 is easily achievable. How these results could influence the correct application of the compulsory control of S. titanus and Flavescence dorée in Italy is discussed.
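
    The minimum-sample-size arithmetic under Taylor's power law (s² = a·m^b) is compact enough to sketch: n = a·m^(b−2)/D0² for a target precision D0 = SE/mean. The exponent b = 1.49 is reported above; the coefficient a is not, so the value used below is an assumption.

    ```python
    # Minimum sample size from Taylor's power law; coefficient a is assumed.
    a, b = 2.0, 1.49                   # a illustrative; b from the study
    for D0 in (0.10, 0.25, 0.30):      # target precision (SE/mean)
        for m in (0.5, 2.0, 10.0):     # mean nymphs per plant
            n = a * m ** (b - 2) / D0 ** 2
            print(f"D0={D0:.2f}  mean={m:4.1f}  n={n:7.1f}")
    ```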

  11. Statistical power analysis in wildlife research

    USGS Publications Warehouse

    Steidl, R.J.; Hayes, J.P.

    1997-01-01

    Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be ≤0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true. We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the α-level, effect sizes, and sample sizes used in calculations must also be reported.

  12. Statistical inference involving binomial and negative binomial parameters.

    PubMed

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2009-05-01

    Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.
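
    A simulation sketch of the setting described: the parameter before the first success is estimated from geometric (negative binomial) sampling, the one after from binomial sampling, and a generic likelihood-ratio test of equality is checked for its Type-I error rate. The test construction is mine, not the authors' statistic, and the chi-squared reference may be rough at these small sample sizes.

    ```python
    # Type-I error of a generic LRT for p_before = p_after; illustrative only.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(3)

    def lrt_pvalue(x, k, n, eps=1e-12):
        """x: trials up to first success; k successes in n later trials."""
        p1, p2 = 1.0 / x, k / n                     # separate MLEs
        p0 = (1 + k) / (x + n)                      # common MLE under H0
        def ll(pa, pb):
            return (np.log(pa + eps) + (x - 1) * np.log(1 - pa + eps)
                    + k * np.log(pb + eps) + (n - k) * np.log(1 - pb + eps))
        stat = 2 * (ll(p1, p2) - ll(p0, p0))
        return chi2.sf(stat, df=1)

    p_true, n, reps = 0.3, 40, 20_000
    rejected = sum(lrt_pvalue(rng.geometric(p_true), rng.binomial(n, p_true), n) < 0.05
                   for _ in range(reps))
    print(f"empirical Type-I error: {rejected / reps:.3f}")   # nominal target 0.05
    ```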

  13. A preliminary psychometric evaluation of Music in Dementia Assessment Scales (MiDAS).

    PubMed

    McDermott, Orii; Orgeta, Vasiliki; Ridder, Hanne Mette; Orrell, Martin

    2014-06-01

    Music in Dementia Assessment Scales (MiDAS), an observational outcome measure for music therapy with people with moderate to severe dementia, was developed from qualitative data of focus groups and interviews. Expert and peer consultations were conducted at each stage of the scale development to maximize its content validity. This study aimed to evaluate the psychometric properties of MiDAS. Care home residents with dementia attended weekly group music therapy for up to ten sessions. Music therapists and care home staff were requested to complete weekly MiDAS ratings. The Quality of Life Scale (QoL-AD) was completed at three time-points. A total of 629 (staff = 306, therapist = 323) MiDAS forms were completed. The statistical analysis revealed that MiDAS has high therapist inter-rater reliability, low staff inter-rater reliability, adequate staff test-retest reliability, adequate concurrent validity, and good construct validity. High factor loadings between the five MiDAS Visual Analogue Scale (VAS) items, levels of Interest, Response, Initiation, Involvement, and Enjoyment, were found. This study indicates that MiDAS has good psychometric properties despite the small sample size. Future research with a larger sample size could provide a more in-depth psychometric evaluation, including further exploration of the underlying factors. MiDAS provides a measure of engagement with musical experience and offers insight into who is likely to benefit on other outcomes such as quality of life or reduction in psychiatric symptoms.

  14. Estimating the Effective Sample Size of Tree Topologies from Bayesian Phylogenetic Analyses

    PubMed Central

    Lanfear, Robert; Hua, Xia; Warren, Dan L.

    2016-01-01

    Bayesian phylogenetic analyses estimate posterior distributions of phylogenetic tree topologies and other parameters using Markov chain Monte Carlo (MCMC) methods. Before making inferences from these distributions, it is important to assess their adequacy. To this end, the effective sample size (ESS) estimates how many truly independent samples of a given parameter the output of the MCMC represents. The ESS of a parameter is frequently much lower than the number of samples taken from the MCMC because sequential samples from the chain can be non-independent due to autocorrelation. Typically, phylogeneticists use a rule of thumb that the ESS of all parameters should be greater than 200. However, we have no method to calculate an ESS of tree topology samples, despite the fact that the tree topology is often the parameter of primary interest and is almost always central to the estimation of other parameters. That is, we lack a method to determine whether we have adequately sampled one of the most important parameters in our analyses. In this study, we address this problem by developing methods to estimate the ESS for tree topologies. We combine these methods with two new diagnostic plots for assessing posterior samples of tree topologies, and compare their performance on simulated and empirical data sets. Combined, the methods we present provide new ways to assess the mixing and convergence of phylogenetic tree topologies in Bayesian MCMC analyses. PMID:27435794
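
    The scalar ESS the abstract builds on has a compact standard form, sketched below: ESS = N / (1 + 2·Σρ_k), with the autocorrelation sum truncated at the first non-positive term (a common simple rule; the authors' tree-topology ESS additionally needs distances between sampled trees, which is not shown).

    ```python
    # Scalar effective sample size with a simple truncation rule.
    import numpy as np

    def ess(x):
        x = np.asarray(x, dtype=float)
        n = x.size
        xc = x - x.mean()
        acf = np.correlate(xc, xc, mode="full")[n - 1:] / (xc @ xc)
        rho_sum = 0.0
        for rho in acf[1:]:
            if rho <= 0:             # truncate at first non-positive autocorrelation
                break
            rho_sum += rho
        return n / (1 + 2 * rho_sum)

    # AR(1) chain with phi = 0.9: theory gives ESS ~ N*(1-phi)/(1+phi) ~ 526.
    rng = np.random.default_rng(5)
    x = np.zeros(10_000)
    for t in range(1, x.size):
        x[t] = 0.9 * x[t - 1] + rng.normal()
    print(f"ESS ~ {ess(x):.0f} of {x.size} samples")
    ```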

  15. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by application of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit/Miss and signal-amplitude testing, where signal amplitudes are reduced to Hit/Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of the POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
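
    The binomial arithmetic behind the 90/95 criterion is short enough to show: with zero misses, the smallest demonstration satisfies 0.90^n ≤ 0.05, the familiar "29 of 29" rule. This is the textbook binomial bound, not the DOEPOD procedure itself.

    ```python
    # Smallest n with zero misses demonstrating 90/95 POD, plus the
    # Clopper-Pearson lower bound showing one miss breaks the demonstration.
    from scipy.stats import beta

    n = 1
    while 0.90 ** n > 0.05:
        n += 1
    print(f"{n} hits out of {n} attempts")             # 29 of 29

    k = 28                                             # one miss in 29 trials
    lower = beta.ppf(0.05, k, 29 - k + 1)              # one-sided 95% lower bound
    print(f"lower 95% bound with 28/29: {lower:.3f}")  # falls below 0.90
    ```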

  16. What is a species? A new universal method to measure differentiation and assess the taxonomic rank of allopatric populations, using continuous variables

    PubMed Central

    Donegan, Thomas M.

    2018-01-01

    Abstract Existing models for assigning species, subspecies, or no taxonomic rank to populations which are geographically separated from one another were analyzed. This was done by subjecting over 3,000 pairwise comparisons of vocal or biometric data based on birds to a variety of statistical tests that have been proposed as measures of differentiation. One current model which aims to test diagnosability (Isler et al. 1998) is highly conservative, applying a hard cut-off, which excludes from consideration differentiation below diagnosis. It also includes non-overlap as a requirement, a measure which penalizes increases to sample size. The “species scoring” model of Tobias et al. (2010) involves less drastic cut-offs, but unlike Isler et al. (1998), does not control adequately for sample size and attributes scores in many cases to differentiation which is not statistically significant. Four different models of assessing effect sizes were analyzed: using both pooled and unpooled standard deviations and controlling for sample size using t-distributions or omitting to do so. Pooled standard deviations produced more conservative effect sizes when uncontrolled for sample size but less conservative effect sizes when so controlled. Pooled models require assumptions to be made that are typically elusive or unsupported for taxonomic studies. Modifications to improve these frameworks are proposed, including: (i) introducing statistical significance as a gateway to attributing any weighting to findings of differentiation; (ii) abandoning non-overlap as a test; (iii) recalibrating Tobias et al. (2010) scores based on effect sizes controlled for sample size using t-distributions. A new universal method is proposed for measuring differentiation in taxonomy using continuous variables and a formula is proposed for ranking allopatric populations. This is based first on calculating effect sizes using unpooled standard deviations, controlled for sample size using t-distributions, for a series of different variables. All non-significant results are excluded by scoring them as zero. Distance between any two populations is calculated using Euclidean summation of non-zeroed effect size scores. If the score of an allopatric pair exceeds that of a related sympatric pair, then the allopatric population can be ranked as a species and, if not, then at most subspecies rank should be assigned. A spreadsheet has been programmed and made available which allows this and the other tests of differentiation and rank studied in this paper to be rapidly analyzed. PMID:29780266
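
    A sketch of the proposed ranking recipe as I read it: per-variable effect sizes from unpooled standard deviations, zeroed when a significance gateway is not passed, then combined by Euclidean summation. The Welch test as the gateway and the alpha level are my assumptions, and the paper's t-distribution small-sample correction is not reproduced here.

    ```python
    # Illustrative distance between two allopatric populations; gateway and
    # alpha are assumptions, data are invented.
    import numpy as np
    from scipy import stats

    def pairwise_distance(pop_a, pop_b, alpha=0.05):
        """pop_a, pop_b: dicts mapping variable name -> 1-D sample array."""
        scores = []
        for var in pop_a:
            t, p = stats.ttest_ind(pop_a[var], pop_b[var], equal_var=False)
            if p >= alpha:                      # significance gateway: score zero
                scores.append(0.0)
                continue
            sd = np.sqrt((pop_a[var].var(ddof=1) + pop_b[var].var(ddof=1)) / 2)
            scores.append(abs(pop_a[var].mean() - pop_b[var].mean()) / sd)
        return float(np.sqrt(np.sum(np.square(scores))))   # Euclidean summation

    rng = np.random.default_rng(11)
    pop_a = {"wing": rng.normal(60, 3, 12), "song": rng.normal(4.2, 0.5, 12)}
    pop_b = {"wing": rng.normal(64, 3, 10), "song": rng.normal(4.3, 0.5, 10)}
    print(f"distance = {pairwise_distance(pop_a, pop_b):.2f}")
    ```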

  17. Sampling populations of humans across the world: ELSI issues.

    PubMed

    Knoppers, Bartha Maria; Zawati, Ma'n H; Kirby, Emily S

    2012-01-01

    There are an increasing number of population studies collecting data and samples to illuminate gene-environment contributions to disease risk and health. The rising affordability of innovative technologies capable of generating large amounts of data helps achieve statistical power and has paved the way for new international research collaborations. Most data and sample collections can be grouped into longitudinal, disease-specific, or residual tissue biobanks, with accompanying ethical, legal, and social issues (ELSI). Issues pertaining to consent, confidentiality, and oversight cannot be examined using a one-size-fits-all approach; the particularities of each biobank must be taken into account. It remains to be seen whether current governance approaches will be adequate to handle the impact of next-generation sequencing technologies on communication with participants in population biobanking studies.

  18. Relationship between mid-water trawling effort and catch composition uncertainty in two large lakes (Huron and Michigan) dominated by alosines, osmerids, and coregonines

    USGS Publications Warehouse

    Warner, David M.; Claramunt, Randall M.; Schaeffer, Jeffrey S.; Yule, Daniel L.; Hrabik, Tom R.; Peintka, Bernie; Rudstam, Lars G.; Holuszko, Jeffrey D.; O'Brien, Timothy P.

    2012-01-01

    Because it is not possible to identify species with echosounders alone, trawling is widely used as a method for collecting species and size composition data for allocating acoustic fish density estimates to species or size groups. In the Laurentian Great Lakes, data from midwater trawls are commonly used for such allocations. However, there are no rules for how much midwater trawling effort is required to adequately describe species and size composition of the pelagic fish communities in these lakes, so the balance between acoustic sampling effort and trawling effort has been unguided. We used midwater trawl data collected between 1986 and 2008 in lakes Michigan and Huron and a variety of analytical techniques to develop guidance for appropriate levels of trawl effort. We used multivariate regression trees and re-sampling techniques to i. identify factors that influence species and size composition of the pelagic fish communities in these lakes, ii. identify stratification schemes for the two lakes, iii. determine if there was a relationship between uncertainty in catch composition and the number of tows made, and iv. predict the number of tows required to reach desired uncertainty targets. We found that depth occupied by fish below the surface was the most influential explanatory variable. Catch composition varied between lakes at depths <38.5 m below the surface, but not at depths ≥38.5 m below the surface. Year, latitude, and bottom depth influenced catch composition in the near-surface waters of Lake Michigan, while only year was important for Lake Huron surface waters. There was an inverse relationship between RSE [relative standard error = 100 × (SE/mean)] and the number of tows made for the proportions of the different size and species groups. We found for the fifth (Lake Huron) and sixth (Lake Michigan) largest lakes in the world, 15–35 tows were adequate to achieve target RSEs (15% and 30%) for ubiquitous species, but rarer species required much higher, and at times, impractical effort levels to reach these targets.

  19. Adequacy of Prenatal Care and Gestational Weight Gain

    PubMed Central

    Crandell, Jamie L.; Jones-Vessey, Kathleen

    2016-01-01

    Abstract Background: The goal of prenatal care is to maximize health outcomes for a woman and her fetus. We examined how prenatal care is associated with meeting the 2009 Institute of Medicine (IOM) guidelines for gestational weight gain. Sample: The study used deidentified birth certificate data supplied by the North Carolina State Center for Health Statistics. The sample included 197,354 women (≥18 years) who delivered singleton full-term infants in 2011 and 2012. Methods: A generalized multinomial model was used to identify how adequate prenatal care was associated with the odds of gaining excessive or insufficient weight during pregnancy according to the 2009 IOM guidelines. The model adjusted for prepregnancy body size, sociodemographic factors, and birth weight. Results: A total of 197,354 women (≥18 years) delivered singleton full-term infants. The odds ratio (OR) for excessive weight gain was 2.44 (95% CI 2.37–2.50) in overweight and 2.33 (95% CI 2.27–2.40) in obese women compared with normal weight women. The OR for insufficient weight gain was 1.15 (95% CI 1.09–1.22) for underweight and 1.34 (95% CI 1.30–1.39) for obese women compared with normal weight women. Prenatal care at the inadequate or intermediate levels was associated with insufficient weight gain (OR: 1.32, 95% CI 1.27–1.38; OR: 1.15, 95% CI 1.09–1.21, respectively) compared with adequate prenatal care. Women with inadequate care were less likely to gain excessive weight (OR: 0.88, 95% CI 0.86–0.91). Conclusions: Whereas prenatal care was effective for preventing insufficient weight gain regardless of prepregnancy body size, educational background, and racial/ethnic group, there were no indications that adequate prenatal care was associated with reduced risk for excessive gestational weight gain. Further research is needed to improve prenatal care programs for preventing excess weight gain. PMID:26741198

  20. Football Equipment Removal Improves Chest Compression and Ventilation Efficacy.

    PubMed

    Mihalik, Jason P; Lynall, Robert C; Fraser, Melissa A; Decoster, Laura C; De Maio, Valerie J; Patel, Amar P; Swartz, Erik E

    2016-01-01

    Airway access recommendations in potential catastrophic spine injury scenarios advocate for facemask removal, while keeping the helmet and shoulder pads in place for ensuing emergency transport. The anecdotal evidence to support these recommendations assumes that maintaining the helmet and shoulder pads assists inline cervical stabilization and that facial access guarantees adequate airway access. Our objective was to determine the effect of football equipment interference on performing chest compressions and delivering adequate ventilations on patient simulators. We hypothesized that conditions with more football equipment would decrease chest compression and ventilation efficacy. Thirty-two certified athletic trainers were block randomized to participate in six different compression conditions and six different ventilation conditions using human patient simulators. Data for chest compression (mean compression depth, compression rate, percentage of correctly released compressions, and percentage of adequate compressions) and ventilation (total ventilations, mean ventilation volume, and percentage of ventilations delivering adequate volume) conditions were analyzed across all conditions. The fully equipped athlete resulted in the lowest mean compression depth (F5,154 = 22.82; P < 0.001; Effect Size = 0.98) and delivery of adequate compressions (F5,154 = 15.06; P < 0.001; Effect Size = 1.09) compared to all other conditions. Bag-valve mask conditions resulted in delivery of significantly higher mean ventilation volumes compared to all 1- or 2-person pocket mask conditions (F5,150 = 40.05; P < 0.001; Effect Size = 1.47). Two-responder ventilation scenarios resulted in delivery of a greater number of total ventilations (F5,153 = 3.99; P = 0.002; Effect Size = 0.26) and percentage of adequate ventilations (F5,150 = 5.44; P < 0.001; Effect Size = 0.89) compared to one-responder scenarios. Non-chinstrap conditions permitted greater ventilation volumes (F3,28 = 35.17; P < 0.001; Effect Size = 1.78) and a greater percentage of adequate volume (F3,28 = 4.85; P = 0.008; Effect Size = 1.12) compared to conditions with the chinstrap buckled or with the chinstrap in place but not buckled. Chest compression and ventilation delivery are compromised in equipment-intense conditions when compared to conditions in which equipment was mostly or entirely removed. Emergency medical personnel should remove the helmet and shoulder pads from all football athletes who require cardiopulmonary resuscitation, while maintaining appropriate cervical spine stabilization when injury is suspected. Further research is needed to confirm our findings supporting full equipment removal for chest compression and ventilation delivery.

  1. 21 CFR 606.40 - Facilities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Provide adequate space for the following when applicable: (1) Private and accurate examinations of... convenient toilet facilities for donors and personnel. Drains shall be of adequate size and, where connected...

  2. 21 CFR 606.40 - Facilities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Provide adequate space for the following when applicable: (1) Private and accurate examinations of... convenient toilet facilities for donors and personnel. Drains shall be of adequate size and, where connected...

  3. 21 CFR 606.40 - Facilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Provide adequate space for the following when applicable: (1) Private and accurate examinations of... convenient toilet facilities for donors and personnel. Drains shall be of adequate size and, where connected...

  4. Evaluating estimators for numbers of females with cubs-of-the-year in the Yellowstone grizzly bear population

    USGS Publications Warehouse

    Cherry, S.; White, G.C.; Keating, K.A.; Haroldson, Mark A.; Schwartz, Charles C.

    2007-01-01

    Current management of the grizzly bear (Ursus arctos) population in Yellowstone National Park and surrounding areas requires annual estimation of the number of adult female bears with cubs-of-the-year. We examined the performance of nine estimators of population size via simulation. Data were simulated using two methods for different combinations of population size, sample size, and coefficient of variation of individual sighting probabilities. We show that the coefficient of variation does not, by itself, adequately describe the effects of capture heterogeneity, because two different distributions of capture probabilities can have the same coefficient of variation. All estimators produced biased estimates of population size, with bias decreasing as effort increased. Based on the simulation results, we recommend the Chao estimator for model Mh be used to estimate the number of female bears with cubs-of-the-year; however, the estimator of Chao and Shen may also be useful, depending on the goals of the research.
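
    The recommended estimator has a closed form worth sketching: Chao's estimator for model Mh is N_hat = S + f1²/(2·f2), where S is the number of distinct individuals seen, f1 the number seen exactly once and f2 the number seen exactly twice (f2 must be non-zero; bias-corrected variants handle f2 = 0). The sighting list below is invented.

    ```python
    # Chao's Mh estimator on invented sighting data.
    from collections import Counter

    sightings = ["F01", "F02", "F02", "F03", "F04", "F04", "F04",
                 "F05", "F06", "F06", "F07", "F08"]      # hypothetical bear IDs
    counts = Counter(sightings)
    S = len(counts)                                      # distinct bears seen
    f1 = sum(1 for c in counts.values() if c == 1)       # seen exactly once
    f2 = sum(1 for c in counts.values() if c == 2)       # seen exactly twice
    n_hat = S + f1 ** 2 / (2 * f2)
    print(f"S={S}, f1={f1}, f2={f2}, N_hat={n_hat:.1f}")  # N_hat = 14.2
    ```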

  5. 12 CFR 560.1 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... soundness, ensure adequate portfolio diversification and are appropriate for the size and condition of the... should adequately monitor the condition of its portfolio and the adequacy of any collateral securing its...

  6. (Sample) Size Matters: Best Practices for Defining Error in Planktic Foraminiferal Proxy Records

    NASA Astrophysics Data System (ADS)

    Lowery, C.; Fraass, A. J.

    2016-02-01

    Paleoceanographic research is a vital tool to extend modern observational datasets and to study the impact of climate events for which there is no modern analog. Foraminifera are one of the most widely used tools for this type of work, both as paleoecological indicators and as carriers for geochemical proxies. However, the use of microfossils as proxies for paleoceanographic conditions brings about a unique set of problems. This is primarily due to the fact that groups of individual foraminifera, which usually live about a month, are used to infer average conditions for time periods ranging from hundreds to tens of thousands of years. Because of this, adequate sample size is very important for generating statistically robust datasets, particularly for stable isotopes. In the early days of stable isotope geochemistry, instrumental limitations required hundreds of individual foraminiferal tests to return a value. This had the fortunate side-effect of smoothing any seasonal to decadal changes within the planktic foram population. With the advent of more sensitive mass spectrometers, smaller sample sizes have now become standard. While this has many advantages, the use of smaller numbers of individuals to generate a data point has lessened the amount of time averaging in the isotopic analysis and decreased precision in paleoceanographic datasets. With fewer individuals per sample, the differences between individual specimens will result in larger variation, and therefore error, and less precise values for each sample. Unfortunately, most (the authors included) do not make a habit of reporting the error associated with their sample size. We have created an open-source model in R to quantify the effect of sample sizes under various realistic and highly modifiable parameters (calcification depth, diagenesis in a subset of the population, improper identification, vital effects, mass, etc.). For example, a sample in which only 1 in 10 specimens is diagenetically altered can be off by >0.3‰ δ18O VPDB, or 1°C. Here, we demonstrate the use of this tool to quantify error in micropaleontological datasets, and suggest best practices for minimizing error when generating stable isotope data with foraminifera.
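
    A Python re-sketch (the authors' model is in R) of the headline example: if roughly 1 in 10 specimens is diagenetically altered toward heavier δ18O, the sample mean is biased by about p × offset regardless of n, while the scatter between samples shrinks with n. The +3‰ alteration, 0.3‰ within-population SD, and pristine mean are assumed values.

    ```python
    # Bias and spread of foram sample means with a diagenetically altered
    # subpopulation; all parameter values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2016)

    def sample_stats(n, p_altered=0.10, offset=3.0, sigma=0.3, reps=10_000):
        true_mean = -1.0                                   # pristine d18O (permil)
        vals = rng.normal(true_mean, sigma, size=(reps, n))
        altered = rng.random((reps, n)) < p_altered
        means = np.where(altered, vals + offset, vals).mean(axis=1)
        return means.mean() - true_mean, means.std()

    for n in (5, 10, 30):
        bias, spread = sample_stats(n)
        print(f"n={n:2d}: bias = {bias:+.2f} permil, SD of sample means = {spread:.2f}")
    ```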

  7. Sampling for Global Epidemic Models and the Topology of an International Airport Network

    PubMed Central

    Bobashev, Georgiy; Morris, Robert J.; Goedecke, D. Michael

    2008-01-01

    Mathematical models that describe the global spread of infectious diseases such as influenza, severe acute respiratory syndrome (SARS), and tuberculosis (TB) often consider a sample of international airports as a network supporting disease spread. However, there is no consensus on how many cities should be selected or on how to select those cities. Using airport flight data that commercial airlines reported to the Official Airline Guide (OAG) in 2000, we have examined the network characteristics of network samples obtained under different selection rules. In addition, we have examined different size samples based on largest flight volume and largest metropolitan populations. We have shown that although the bias in network characteristics increases with the reduction of the sample size, a relatively small number of areas that includes the largest airports, the largest cities, the most-connected cities, and the most central cities is enough to describe the dynamics of the global spread of influenza. The analysis suggests that a relatively small number of cities (around 200 or 300 out of almost 3000) can capture enough network information to adequately describe the global spread of a disease such as influenza. Weak traffic flows between small airports can contribute to noise and mask other means of spread such as the ground transportation. PMID:18776932

  8. Sample size requirements for separating out the effects of combination treatments: Randomised controlled trials of combination therapy vs. standard treatment compared to factorial designs for patients with tuberculous meningitis

    PubMed Central

    2011-01-01

    Background In certain diseases clinical experts may judge that the intervention with the best prospects is the addition of two treatments to the standard of care. This can either be tested with a simple randomized trial of combination versus standard treatment or with a 2 × 2 factorial design. Methods We compared the two approaches using the design of a new trial in tuberculous meningitis as an example. In that trial the combination of 2 drugs added to standard treatment is assumed to reduce the hazard of death by 30% and the sample size of the combination trial to achieve 80% power is 750 patients. We calculated the power of corresponding factorial designs with one- to sixteen-fold the sample size of the combination trial depending on the contribution of each individual drug to the combination treatment effect and the strength of an interaction between the two. Results In the absence of an interaction, an eight-fold increase in sample size for the factorial design as compared to the combination trial is required to get 80% power to jointly detect effects of both drugs if the contribution of the less potent treatment to the total effect is at least 35%. An eight-fold sample size increase also provides a power of 76% to detect a qualitative interaction at the one-sided 10% significance level if the individual effects of both drugs are equal. Factorial designs with a lower sample size have a high chance to be underpowered, to show significance of only one drug even if both are equally effective, and to miss important interactions. Conclusions Pragmatic combination trials of multiple interventions versus standard therapy are valuable in diseases with a limited patient pool if all interventions test the same treatment concept, it is considered likely that either both or none of the individual interventions are effective, and only moderate drug interactions are suspected. An adequately powered 2 × 2 factorial design to detect effects of individual drugs would require at least 8-fold the sample size of the combination trial. Trial registration Current Controlled Trials ISRCTN61649292 PMID:21288326

  9. Exome sequencing of extreme phenotypes identifies DCTN4 as a modifier of chronic Pseudomonas aeruginosa infection in cystic fibrosis.

    PubMed

    Emond, Mary J; Louie, Tin; Emerson, Julia; Zhao, Wei; Mathias, Rasika A; Knowles, Michael R; Wright, Fred A; Rieder, Mark J; Tabor, Holly K; Nickerson, Deborah A; Barnes, Kathleen C; Gibson, Ronald L; Bamshad, Michael J

    2012-07-08

    Exome sequencing has become a powerful and effective strategy for the discovery of genes underlying Mendelian disorders. However, use of exome sequencing to identify variants associated with complex traits has been more challenging, partly because the sample sizes needed for adequate power may be very large. One strategy to increase efficiency is to sequence individuals who are at both ends of a phenotype distribution (those with extreme phenotypes). Because the frequency of alleles that contribute to the trait is enriched in one or both phenotype extremes, a modest sample size can potentially be used to identify novel candidate genes and/or alleles. As part of the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project (ESP), we used an extreme phenotype study design to discover that variants in DCTN4, encoding a dynactin protein, are associated with time to first P. aeruginosa airway infection, chronic P. aeruginosa infection and mucoid P. aeruginosa in individuals with cystic fibrosis.

  10. Eddy Covariance Measurements of the Sea-Spray Aerosol Flux

    NASA Astrophysics Data System (ADS)

    Brooks, I. M.; Norris, S. J.; Yelland, M. J.; Pascal, R. W.; Prytherch, J.

    2015-12-01

    Historically, almost all estimates of the sea-spray aerosol source flux have been inferred through various indirect methods. Direct estimates via eddy covariance have been attempted by only a handful of studies, most of which measured only the total number flux, or achieved rather coarse size segregation. Applying eddy covariance to the measurement of sea-spray fluxes is challenging: most instrumentation must be located in a laboratory space requiring long sample lines to an inlet collocated with a sonic anemometer; however, larger particles are easily lost to the walls of the sample line. Marine particle concentrations are generally low, requiring a high sample volume to achieve adequate statistics. The highly hygroscopic nature of sea salt means particles change size rapidly with fluctuations in relative humidity; this introduces an apparent bias in flux measurements if particles are sized at ambient humidity. The Compact Lightweight Aerosol Spectrometer Probe (CLASP) was developed specifically to make high rate measurements of aerosol size distributions for use in eddy covariance measurements, and the instrument and data processing and analysis techniques have been refined over the course of several projects. Here we will review some of the issues and limitations related to making eddy covariance measurements of the sea spray source flux over the open ocean, summarise some key results from the last decade, and present new results from a 3-year long ship-based measurement campaign as part of the WAGES project. Finally we will consider requirements for future progress.
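    For readers unfamiliar with the technique, an eddy-covariance flux is the time-averaged product of the fluctuations of vertical wind speed and the scalar of interest. The sketch below is a minimal, hypothetical illustration with synthetic 10 Hz series; real processing (despiking, detrending, and the humidity corrections CLASP-type data require) is far more involved.

```python
import numpy as np

def ec_flux(w, c):
    """Eddy-covariance flux: mean product of the fluctuations (primes)
    of vertical wind speed w and concentration c, after removing the
    averaging-period means (Reynolds decomposition)."""
    w_prime = w - w.mean()
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

# Hypothetical 30-minute record sampled at 10 Hz
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)               # vertical wind, m/s
c = 5.0 + 0.8 * w + rng.normal(0, 1, 18000)   # particles per cm^3
print(ec_flux(w, c))  # positive value: net upward (source) flux
```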

  11. Design and simulation study of the immunization Data Quality Audit (DQA).

    PubMed

    Woodard, Stacy; Archer, Linda; Zell, Elizabeth; Ronveaux, Olivier; Birmingham, Maureen

    2007-08-01

    The goal of the Data Quality Audit (DQA) is to assess whether the Global Alliance for Vaccines and Immunization-funded countries are adequately reporting the number of diphtheria-tetanus-pertussis immunizations given, on which the "shares" are awarded. Because this sampling design is a modified two-stage cluster sample (modified because a stratified, rather than a simple, random sample of health facilities is obtained from the selected clusters), the formula for the calculation of the standard error of the estimate is unknown. An approximated standard error has been proposed, and the first goal of this simulation is to assess the accuracy of that standard error. Results from the simulations based on hypothetical populations were found not to be representative of the actual DQAs that were conducted. Additional simulations were then conducted on the actual DQA data to better assess the precision of the DQA with both the original and the increased sample sizes.
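    When no closed-form standard error exists, simulating the design gives an empirical one. The sketch below is a toy version under invented population parameters (district counts, facility counts, a 90% verification rate) and simplifies the stratified facility stage to simple random sampling; it only illustrates the simulate-and-repeat logic, not the published DQA design.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented population: 200 districts (clusters), 10 facilities each
N_CLUSTERS, N_FAC = 200, 10
reported = rng.poisson(500, size=(N_CLUSTERS, N_FAC))
verified = rng.binomial(reported, 0.9)   # assume 90% of reports verify

def one_dqa(m_clusters=12, k_fac=4):
    """One simulated (simplified) DQA: a two-stage sample estimating
    the ratio of verified to reported immunizations. Simple random
    sampling of facilities stands in for the real stratified stage."""
    r = v = 0
    for c in rng.choice(N_CLUSTERS, m_clusters, replace=False):
        fac = rng.choice(N_FAC, k_fac, replace=False)
        r += reported[c, fac].sum()
        v += verified[c, fac].sum()
    return v / r

est = np.array([one_dqa() for _ in range(2000)])
print(est.mean(), est.std(ddof=1))  # empirical SE of the ratio estimator,
                                    # to compare with the approximation
```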

  12. Serum Copper Status in School-Age Children and Pregnant Women in China Nutrition and Health Survey 2010-2012.

    PubMed

    Liu, Xiaobing; Piao, Jianhua; Zhang, Yu; Li, Min; Li, Weidong; Yang, Lichen; Yang, Xiaoguang

    2016-10-01

    Serum copper is an insensitive but reliable biomarker reflecting changes in copper nutritional status in both depleted and replete populations. The current study aimed to establish reference values of serum copper in school-age children and pregnant women in China and to explore the adequate range of serum copper for these two groups. A multistage, stratified, random sampling method combined with probability-proportional-to-regional-size sampling was employed. A total of 4019 subjects (2736 school-age children and 1283 pregnant women) were selected from the China Nutrition and Health Survey 2010-2012 (CNHS 2010-2012). The concentration of serum copper was determined by sector field inductively coupled plasma mass spectrometry (SF-ICP-MS). The adequate range of serum copper was determined from the logistic sigmoid saturation curve of the median derivatives. The median concentration of serum copper was 1140.9 μg/L with a range of 746.7-1677.6 μg/L for school-age children, and 1933.4 μg/L with a range of 947.4-3391.4 μg/L for pregnant women. The adequate range of serum copper was 905.7-1440.7 μg/L for school-age children and 1308.8-2537.8 μg/L for pregnant women. These parameters represent an essential prerequisite for the assessment of copper nutritional status, as well as for nutrition interventions.

  13. Microbiopsies versus Bergström needle for skeletal muscle sampling: impact on maximal mitochondrial respiration rate.

    PubMed

    Isner-Horobeti, M E; Charton, A; Daussin, F; Geny, B; Dufour, S P; Richard, R

    2014-05-01

    Microbiopsies are increasingly used as an alternative to the standard Bergström technique for skeletal muscle sampling. The potential impact of these two different procedures on mitochondrial respiration rate is unknown. The objective of this work was to compare the microbiopsy and Bergström procedures with respect to mitochondrial respiration in skeletal muscle. In total, 52 vastus lateralis muscle samples were obtained from 13 anesthetized pigs, either with a Bergström needle [6 gauge (G)] or with microbiopsy needles (12, 14, 18G). Maximal mitochondrial respiration (V GM-ADP) was assessed using an oxygraphic method on permeabilized fibers. The weight of the muscle samples and V GM-ADP decreased with increasing needle gauge. A positive nonlinear relationship was observed between the weight of the muscle sample and the level of maximal mitochondrial respiration (r = 0.99, p < 0.05) and between needle size and maximal mitochondrial respiration (r = 0.99, p < 0.05). Microbiopsies give lower muscle sample weights and maximal rates of mitochondrial respiration compared to the standard Bergström needle. Therefore, the higher the gauge (i.e., the smaller the size) of the microbiopsy needle, the lower the maximal rate of respiration. Microbiopsies of skeletal muscle underestimate the maximal mitochondrial respiration rate, and this finding needs to be highlighted for adequate interpretation and comparison with literature data.

  14. SRD5A1 genotype frequency differences in women with mild versus severe premenstrual symptoms.

    PubMed

    Adams, Marlene; McCrone, Susan

    2012-02-01

    The aims of this small pilot study were to explore the association between premenstrual symptom severity and two genes from the gamma-aminobutyric acid (GABA) pathway: steroid-5-alpha-reductase, alpha polypeptide 1 (SRD5A1) and gamma-aminobutyric acid receptor subunit alpha-4 (GABRA4). Saliva samples were obtained from a convenience sample of 19 Caucasian females ages 18-25 years, ten cases and nine controls. Deoxyribonucleic acid (DNA) was isolated, and genotyping was performed on ten single nucleotide polymorphisms (SNPs). Ten percent of cases and 44% of controls had the cytosine/cytosine (C/C) genotype for the SRD5A1 SNP rs501999, indicating that this genotype may protect women against severe premenstrual symptoms. Replication of this study using an adequately powered sample size is warranted.
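    To make the pilot's power limitation concrete, Fisher's exact test can be run on the genotype counts implied by the reported percentages; the 1/10 and 4/9 counts below are my reconstruction from the rounded percentages, not counts reported by the authors.

```python
from scipy.stats import fisher_exact

# C/C genotype counts implied by the rounded percentages:
# cases: 1 of 10 (10%); controls: 4 of 9 (44%)
table = [[1, 9],   # cases:    C/C, other genotypes
         [4, 5]]   # controls: C/C, other genotypes
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)  # p ~ 0.14: suggestive but nonsignificant,
                            # hence the call for a powered replication
```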

  15. The determination of specific forms of aluminum in natural water

    USGS Publications Warehouse

    Barnes, R.B.

    1975-01-01

    A procedure for analysis and pretreatment of natural-water samples to determine very low concentrations of Al is described which distinguishes the rapidly reacting equilibrium species from the metastable or slowly reacting macro ions and colloidal suspended material. Aluminum is complexed with 8-hydroxyquinoline (oxine), pH is adjusted to 8.3 to minimize interferences, and the aluminum oxinate is extracted with methyl isobutyl ketone (MIBK) prior to analysis by atomic absorption. To determine equilibrium species only, the contact time between sample and 8-hydroxyquinoline is minimized. The Al may be extracted at the sample site with a minimum of equipment and the MIBK extract stored for several weeks prior to atomic absorption analysis. Data obtained from analyses of 39 natural groundwater samples indicate that filtration through a 0.1-μm pore-size filter is not an adequate means of removing all insoluble and metastable Al species present, and extraction of Al immediately after collection is necessary if only dissolved and readily reactive species are to be determined. An average of 63% of the Al present in natural waters that had been filtered through 0.1-μm pore-size filters was in the form of monomeric ions. The total Al concentration, which includes all forms that passed through a 0.1-μm pore-size filter, ranged from 2 to 70 μg/L. The concentration of Al in the form of monomeric ions ranged from below detection to 57 μg/L. Most of the natural water samples used in this study were collected from thermal springs and oil wells. © 1975.

  16. Journal impact factor and methodological quality of surgical randomized controlled trials: an empirical study.

    PubMed

    Ahmed Ali, Usama; Reiber, Beata M M; Ten Hove, Joren R; van der Sluis, Pieter C; Gooszen, Hein G; Boermeester, Marja A; Besselink, Marc G

    2017-11-01

    The journal impact factor (IF) is often used as a surrogate marker for methodological quality. The objective of this study is to evaluate the relation between the journal IF and methodological quality of surgical randomized controlled trials (RCTs). Surgical RCTs published in PubMed in 1999 and 2009 were identified. According to IF, RCTs were divided into groups of low (<2), median (2-3) and high IF (>3), as well as into top-10 vs all other journals. Methodological quality characteristics and factors concerning funding, ethical approval and statistical significance of outcomes were extracted and compared between the IF groups. Additionally, a multivariate regression was performed. The median IF was 2.2 (IQR 2.37). The percentage of 'low-risk of bias' RCTs was 13% for top-10 journals vs 4% for other journals in 1999 (P < 0.02), and 30 vs 12% in 2009 (P < 0.02). Similar results were observed for high vs low IF groups. The presence of sample-size calculation, adequate generation of allocation and intention-to-treat analysis were independently associated with publication in higher IF journals; as were multicentre trials and multiple authors. Publication of RCTs in high IF journals is associated with moderate improvement in methodological quality compared to RCTs published in lower IF journals. RCTs with adequate sample-size calculation, generation of allocation or intention-to-treat analysis were associated with publication in a high IF journal. On the other hand, reporting a statistically significant outcome and being industry funded were not independently associated with publication in a higher IF journal.

  17. Delivery of cardiopulmonary resuscitation in the microgravity environment

    NASA Technical Reports Server (NTRS)

    Barratt, M. R.; Billica, R. D.

    1992-01-01

    The microgravity environment presents several challenges for delivering effective cardiopulmonary resuscitation (CPR). Chest compressions must be driven by muscular force rather than by the weight of the rescuer's upper torso. Airway stabilization is influenced by the neutral body posture. Rescuers will consist of crew members of varying sizes and degrees of physical deconditioning from space flight. Several methods of CPR designed to accommodate these factors were tested in the one-G environment, in parabolic flight, and on a recent shuttle flight. Methods: Utilizing study participants of varying sizes, different techniques of CPR delivery were evaluated using a recording CPR manikin to assess adequacy of compressive force and frequency. Under conditions of parabolic flight, methods tested included conventional positioning of rescuer and victim, free-floating 'Heimlich type' compressions, straddling the patient with active and passive restraints, and utilizing a mechanical cardiac compression assist device (CCAD). Multiple restraint systems and ventilation methods were also assessed. Results: Delivery of effective CPR was possible in all configurations tested. Reliance on muscular force alone was quickly fatiguing to the rescuer. Effectiveness of CPR was dependent on technique, adequate restraint of the rescuer and patient, and rescuer size and preference. Free-floating CPR was adequate but rapidly fatiguing. The CCAD was able to provide adequate compressive force, but positioning was problematic. Conclusions: Delivery of effective CPR in microgravity will be dependent on adequate rescuer and patient restraint, technique, and rescuer size and preference. Free-floating CPR may be employed as a stopgap method until patient restraint is available. Development of an adequate CCAD would be desirable to compensate for the effects of deconditioning.

  18. Optimal Order of Successive Office Hysteroscopy and Endometrial Biopsy for the Evaluation of Abnormal Uterine Bleeding: A Randomized Controlled Trial.

    PubMed

    Sarkar, Papri; Mikhail, Emad; Schickler, Robyn; Plosker, Shayne; Imudia, Anthony N

    2017-09-01

    To estimate the optimal order of office hysteroscopy and endometrial biopsy when performed successively for evaluation of abnormal uterine bleeding. Patients undergoing successive office hysteroscopy and endometrial biopsy were included in a single-blind, prospective, randomized trial. The primary outcome was to evaluate the effect of the order of the procedures on patients' pain scores. Prespecified secondary outcomes included procedure duration, hysteroscopic visualization of the uterine cavity, endometrial sample adequacy, and number of attempts at biopsy. Pain scores were assessed using a visual analog scale from 0 to 10, and endometrial sample adequacy was determined from the histopathology report. Hysteroscopy images were recorded. A sample size of 34 per group (n=68) was determined to be adequate to detect a difference of 20% in visual analog scale score between hysteroscopy first (group A) and biopsy first (group B) at α of 0.05 and 80% power. Between October 2015 and January 2017, 78 women were randomized to group A (n=40) and group B (n=38). There was no difference in global pain perception [7 (0-10) vs 7 (0-10); P=.57, 95% CI 5.8-7.1]. Procedure duration [3 (1-9) vs 3 (2-10), P=.32, 95% CI 3.3-4.1] and endometrial sample adequacy (78.9% vs 75.7%, P=.74) were similar in both groups. Group A patients had better endometrial visualization (P<.001) than group B based on the hysteroscopic images: excellent (50% vs 7.9%), good (20% vs 34.2%), and fair (22.5% vs 44.7%); group B participants required fewer endometrial biopsy attempts to obtain an adequate tissue sample (two vs one; P<.001, 1.6-1.9). For patients undergoing successive office hysteroscopy and endometrial biopsy for the evaluation of abnormal uterine bleeding, global pain perception and the time required are independent of the order in which the procedures are performed. Performing hysteroscopy first ensures a better image, whereas biopsy first yields an adequate tissue sample with fewer attempts. ClinicalTrials.gov, NCT02472184.

  19. Random forests ensemble classifier trained with data resampling strategy to improve cardiac arrhythmia diagnosis.

    PubMed

    Ozçift, Akin

    2011-05-01

    Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling-strategy-based Random Forests (RF) ensemble classifier to improve diagnosis of cardiac arrhythmia. Random Forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree from a classification performance point of view. In general, multiclass datasets having unbalanced distributions of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset, with multiple classes of small sample size, and it is therefore a suitable test case for our resampling-based training strategy. The dataset contains 452 samples in fourteen types of arrhythmias, and eleven of these classes have sample sizes of less than 15. Our diagnosis strategy consists of two parts: (i) a correlation-based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset; (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling, to evaluate the efficiency of the proposed training strategy. The resultant accuracy of the classifier is found to be 90.0%, which is a high diagnosis performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of the experiments demonstrate the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
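    The record does not publish code, but the training idea is easy to approximate: resample the small classes before growing the forest. The sketch below uses scikit-learn and reads "simple random sampling" as random oversampling up to the largest class size, which is an assumption on my part; X and y stand for the arrhythmia features and labels, which are not included here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

def oversample_minorities(X, y, seed=0):
    """Random oversampling: resample every class (with replacement)
    up to the size of the largest class before training."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    Xs, ys = [], []
    for c in classes:
        Xr, yr = resample(X[y == c], y[y == c], replace=True,
                          n_samples=n_max, random_state=seed)
        Xs.append(Xr)
        ys.append(yr)
    return np.vstack(Xs), np.concatenate(ys)

# X, y: numpy feature matrix and arrhythmia class labels (not shipped here)
# Xb, yb = oversample_minorities(X, y)
# rf = RandomForestClassifier(n_estimators=500, random_state=0)
# print(cross_val_score(rf, Xb, yb, cv=10).mean())
```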

  20. [Study on the resilience internal factors in a sample of Puerto Rican centenarians].

    PubMed

    Rosado-Medina, José J; Rodríguez-Gómez, José R; Altieri-Ramirez, Gladys

    2012-01-01

    Old age is a stage usually characterized by losses at the physiological, psychological and social levels, which generate much distress for individuals. However, centenarians have been identified as an example of successful aging, among other factors because they have adequate coping skills that help them deal with these normal losses. Resilience could be one of the factors that help centenarians age successfully. More studies of Puerto Rican centenarians are needed, since such investigations are lacking. This study has an ex post facto design; in addition, we evaluated the Symptom Checklist-90-R (SCL-90-R) psychometrically. The scale of Internal Resilience Factors (EFIR), a semi-structured interview and the SCL-90-R were used to identify factors associated with successful aging in centenarians, and we explored whether gender differences exist in the internal resilience factors within the sample. The sample comprised 23 centenarians, 15 men and 8 women, from different parts of Puerto Rico (average age = 101.5 years). Internal resilience factors associated with the aging process were identified: emotional stability, optimism, a behavioral factor, and a behavioral and emotional skills component. These factors are consistent with the reviewed literature on positive emotions and adaptive aging. On the other hand, no statistically significant difference (p < .05) was identified in the internal resilience factors on the basis of gender, a finding in agreement with the reviewed literature. The tests administered showed adequate internal consistency (EFIR: α = .726; SCL-90-R: Cronbach's α = .941). We identified internal resilience factors that could be linked with successful aging, a finding that is encouraging for the elderly population. Limitations related to the sample size and gender distribution were identified; we therefore suggest further research with larger samples.

  1. [Potentials in the regionalization of health indicators using small-area estimation methods : Exemplary results based on the 2009, 2010 and 2012 GEDA studies].

    PubMed

    Kroll, Lars Eric; Schumann, Maria; Müters, Stephan; Lampert, Thomas

    2017-12-01

    Nationwide health surveys can be used to estimate regional differences in health. Using traditional estimation techniques, the spatial depth of these estimates is limited by the constrained sample size. So far, without special refreshment samples, results have only been available for the more populous federal states of Germany. An alternative is regression-based small-area estimation techniques. These models can generate smaller-scale estimates, but are also subject to greater statistical uncertainty because of their model assumptions. In the present article, exemplary regionalized estimates of respondents' self-rated health status, based on the "Gesundheit in Deutschland aktuell" (GEDA) studies 2009, 2010 and 2012, are compared. The aim of the article is to analyze the range of the regional estimates in order to more adequately assess the usefulness of these techniques for health reporting. The results show that the estimated prevalence is relatively stable when different samples are used. Important determinants of the variation of the estimates are the achieved sample size at the district level and the type of district (cities vs. rural regions). Overall, the present study shows that small-area modeling of prevalence is associated with additional uncertainties compared to conventional estimates, which should be taken into account when interpreting the corresponding findings.

  2. Sound absorption by suspensions of nonspherical particles: Measurements compared with predictions using various particle sizing techniques

    NASA Astrophysics Data System (ADS)

    Richards, Simon D.; Leighton, Timothy G.; Brown, Niven R.

    2003-10-01

    Knowledge of the particle size distribution is required in order to predict ultrasonic absorption in polydisperse particulate suspensions. This paper shows that the method used to measure the particle size distribution can lead to important differences in the predicted absorption. A reverberation technique developed for measuring ultrasonic absorption by suspended particles is used to measure the absorption in suspensions of nonspherical particles. Two types of particulates are studied: (i) kaolin (china clay) particles which are platelike in form; and (ii) calcium carbonate particles which are more granular. Results are compared to theoretical predictions of visco-inertial absorption by suspensions of spherical particles. The particle size distributions, which are required for these predictions, are measured by laser diffraction, gravitational sedimentation and centrifugal sedimentation, all of which assume spherical particles. For a given sample, each sizing technique yields a different size distribution, leading to differences in the predicted absorption. The particle size distributions obtained by gravitational and centrifugal sedimentation are reinterpreted to yield a representative size distribution of oblate spheroids, and predictions for absorption by these spheroids are compared with the measurements. Good agreement between theory and measurement for the flat kaolin particles is obtained, demonstrating that these particles can be adequately represented by oblate spheroids.

  3. How many landmarks are enough to characterize shape and size variation?

    PubMed

    Watanabe, Akinobu

    2018-01-01

    Accurate characterization of morphological variation is crucial for generating reliable results and conclusions concerning changes and differences in form. Despite the prevalence of landmark-based geometric morphometric (GM) data in the scientific literature, a formal treatment of whether sampled landmarks adequately capture shape variation has remained elusive. Here, I introduce LaSEC (Landmark Sampling Evaluation Curve), a computational tool to assess the fidelity of morphological characterization by landmarks. This task is achieved by calculating how subsampled data converge to the pattern of shape variation in the full dataset as landmark sampling is increased incrementally. While the number of landmarks needed to capture shape variation adequately depends on the individual dataset, LaSEC helps the user (1) identify under- and oversampling of landmarks; (2) assess the robustness of morphological characterization; and (3) determine the number of landmarks that can be removed without compromising shape information. In practice, this knowledge can reduce the time and cost associated with data collection, maintain statistical power in certain analyses, and enable the incorporation of incomplete, but important, specimens into the dataset. Results based on simulated shape data also reveal general properties of landmark data, including statistical consistency, whereby sampling additional landmarks tends to asymptotically improve the accuracy of morphological characterization. As landmark-based GM data become more widely adopted, LaSEC provides a systematic approach to evaluating and refining the collection of shape data, a goal paramount for the accumulation and analysis of accurate morphological information.
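    LaSEC itself is distributed for R; the sketch below is only a crude Python analogue of the sampling-curve idea, under strong simplifying assumptions (landmarks already aligned; fit measured by correlating inter-specimen distances rather than the tool's actual criterion).

```python
import numpy as np

def subsample_fit(coords, k, n_rep=100, seed=None):
    """Crude sampling-curve analogue: correlate inter-specimen
    Euclidean distances computed from k randomly chosen landmarks
    with those from the full landmark set.
    coords: (specimens, landmarks, dims) array of aligned landmarks."""
    rng = np.random.default_rng(seed)
    n, p, d = coords.shape
    full = coords.reshape(n, -1)
    d_full = np.linalg.norm(full[:, None] - full[None, :], axis=-1)
    iu = np.triu_indices(n, 1)
    r = []
    for _ in range(n_rep):
        idx = rng.choice(p, k, replace=False)
        sub = coords[:, idx, :].reshape(n, -1)
        d_sub = np.linalg.norm(sub[:, None] - sub[None, :], axis=-1)
        r.append(np.corrcoef(d_full[iu], d_sub[iu])[0, 1])
    return float(np.mean(r))

# Example: 30 specimens, 40 aligned 3-D landmarks (synthetic data)
coords = np.random.default_rng(0).normal(size=(30, 40, 3))
print([round(subsample_fit(coords, k, seed=1), 3) for k in (5, 10, 20, 40)])
```

    Plotting this fit for k = 3 up to the full landmark count traces out a sampling curve; a plateau suggests additional landmarks add little shape information, which is the diagnostic the record describes.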

  4. A low intensity sampling method for assessing blue crab abundance at Aransas National Wildlife Refuge and preliminary results on the relationship of blue crab abundance to whooping crane winter mortality

    USGS Publications Warehouse

    Pugesek, Bruce H.; Baldwin, Michael J.; Stehn, Thomas; Folk, Martin J.; Nesbitt, Stephen A.

    2008-01-01

    We sampled blue crabs (Callinectes sapidus) in marshes on the Aransas National Wildlife Refuge, Texas, from 1997 to 2005 to determine whether whooping crane (Grus americana) mortality was related to the availability of this food source. For four years, 1997-2001, we sampled monthly from the fall through the spring. From these data, we developed a reduced-effort sampling method that adequately characterized crab abundance and reduced the potential for disturbance to the cranes. Four additional years of data were collected with the reduced-effort method. Yearly variation in crab numbers was high, ranging from a low of 0.1 crabs to a high of 3.4 crabs per 100-m transect section. Mortality among adult cranes was inversely related to crab abundance. We found no relationship between crab abundance and mortality among juvenile cranes, possibly as a result of the smaller population size of juveniles compared to adults.

  5. Multipinhole SPECT helical scan parameters and imaging volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond the EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show that optimal resolution is achieved when the axial step size is half, and the angular step size is about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with those of RMS resolution but are less critical in assessing the effects of helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple-pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
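    One reading of the reported rule of thumb, as a hedged sketch: compute the Nyquist sampling intervals from the estimated system resolution, then halve the axial step and double the angular step. Converting the resolution to an angle via the rotation radius is my assumption, not a formula from the paper.

```python
import math

def helical_steps(resolution_mm, rotation_radius_mm):
    """Nyquist sampling intervals from an estimated resolution
    (interval = resolution / 2), then the factors the study reports
    as optimal: half the Nyquist axial step, twice the angular step."""
    nyq_axial_mm = resolution_mm / 2.0
    nyq_angular_deg = math.degrees((resolution_mm / 2.0) / rotation_radius_mm)
    return {"axial_mm": nyq_axial_mm / 2.0,
            "angular_deg": 2.0 * nyq_angular_deg}

# e.g. 2 mm estimated resolution, 30 mm pinhole rotation radius
print(helical_steps(2.0, 30.0))
```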

  6. Electrofishing distance needed to estimate consistent Index of Biotic Integrity (IBI) scores in raftable Oregon rivers

    EPA Science Inventory

    An important issue surrounding assessment of riverine fish assemblages is the minimum amount of sampling distance needed to adequately determine biotic condition. Determining adequate sampling distance is important because sampling distance affects estimates of fish assemblage c...

  7. Hypnotherapy for labor and birth.

    PubMed

    Beebe, Kathleen R

    2014-01-01

    Hypnotherapy is an integrative mind-body technique with therapeutic potential in various health care applications, including labor and birth. Evaluating the efficacy of this modality in controlled studies can be difficult, because of methodologic challenges, such as obtaining adequate sample sizes and standardizing experimental conditions. Women using hypnosis techniques for childbirth in hospital settings may face barriers related to caregiver resistance or institutional policies. The potential anxiolytic and analgesic effects of clinical hypnosis for childbirth merit further study. Nurses caring for women during labor and birth can increase their knowledge and skills with strategies for supporting hypnotherapeutic techniques. © 2014 AWHONN.

  8. Addressing Challenges in Studies of Behavioral Responses of Whales to Noise.

    PubMed

    Cato, Douglas H; Dunlop, Rebecca A; Noad, Michael J; McCauley, Robert D; Kniest, Eric; Paton, David; Kavanagh, Ailbhe S

    2016-01-01

    Studying the behavioral response of whales to noise presents numerous challenges. In addition to the characteristics of the noise exposure, many factors may affect the response and these must be measured and accounted for in the analysis. An adequate sample size that includes matching controls is crucial if meaningful results are to be obtained. Field work is thus complicated, logistically difficult, and expensive. This paper discusses some of the challenges and how they are being met in a large-scale multiplatform project in which humpback whales are exposed to the noise of seismic air guns.

  9. Multicanonical hybrid Monte Carlo algorithm: Boosting simulations of compact QED

    NASA Astrophysics Data System (ADS)

    Arnold, G.; Schilling, K.; Lippert, Th.

    1999-03-01

    We demonstrate that substantial progress can be achieved in the study of the phase structure of four-dimensional compact QED by a joint use of hybrid Monte Carlo and multicanonical algorithms through an efficient parallel implementation. This is borne out by the observation of considerable speedup of tunnelling between the metastable states, close to the phase transition, on the Wilson line. We estimate that the creation of adequate samples (with order 100 flip-flops) becomes a matter of half a year's run time at 2 Gflops sustained performance for lattices of size up to 24^4.

  10. A sampling strategy for promoting and assessing medical student retention of physical examination skills.

    PubMed

    Williams, Reed G; Klamen, Debra L; Mayer, David; Valaski, Maureen; Roberts, Nicole K

    2007-10-01

    Skill acquisition and maintenance requires spaced deliberate practice. Assessing medical students' physical examination performance ability is resource intensive. The authors assessed the nature and size of physical examination performance samples necessary to accurately estimate total physical examination skill. Physical examination assessment data were analyzed from second year students at the University of Illinois College of Medicine at Chicago in 2002, 2003, and 2004 (N = 548). Scores on subgroups of physical exam maneuvers were compared with scores on the total physical exam, to identify sound predictors of total test performance. Five exam subcomponents were sufficiently correlated to overall test performance and provided adequate sensitivity and specificity to serve as a means to prompt continued student review and rehearsal of physical examination technical skills. Selection and administration of samples of the total physical exam provide a resource-saving approach for promoting and estimating overall physical examination skills retention.

  11. Development and characteristics of Mechanical Porous Ambient Comet Simulants as comet surface analogs

    NASA Astrophysics Data System (ADS)

    Carey, Elizabeth M.; Peters, Gregory H.; Choukroun, Mathieu; Chu, Lauren; Carpenter, Emma; Cohen, Brooklin; Panossian, Lara; Zhou, Yu Meng; Sarkissian, Ani; Moreland, Scott; Shiraishi, Lori R.; Backes, Paul; Zacny, Kris; Green, Jacklyn R.; Raymond, Carol

    2017-11-01

    Comets are icy remnants of the Solar System's formation and, as such, contain some of the most primitive volatiles and organic materials. Sampling the surface of a comet is a high priority for the New Frontiers program. Planetary simulants are crucial to the development of adequate in situ instruments and sample acquisition systems. A high-fidelity comet surface simulant has been developed to support hardware design and development for one Comet Surface Sample Return tool, the BiBlade Comet Sampler. Mechanical Porous Ambient Comet Simulants (MPACS) can be manufactured to cover a wide range of desired physical properties, such as density and cone penetration resistance, and exhibit a brittle fracture mode. The MPACS materials have an aggregated composite structure of weakly bonded grains of very small size (diameter ≤ 40 μm), which is most relevant to the structure of the surface of a comet nucleus.

  12. Influence of Specimen Preparation and Specimen Size on Composite Transverse Tensile Strength and Scatter

    NASA Technical Reports Server (NTRS)

    OBrien, T. Kevin; Chawan, Arun D.; DeMarco, Kevin; Paris, Isabelle

    2001-01-01

    The influence of specimen polishing, configuration, and size on the transverse tension strength of two glass-epoxy materials and one carbon-epoxy material, loaded in three- and four-point bending, was evaluated. Polishing machined edges, and/or tension-side failure surfaces, was detrimental to specimen strength characterization, rather than yielding a higher, more accurate strength as a result of removing inherent manufacturing and handling flaws. Transverse tension strength was typically lower for longer span lengths due to the classical weakest-link effect. However, strength was less sensitive to volume changes achieved by increasing specimen width. The Weibull scaling law typically over-predicted changes in transverse tension strength in three-point bend tests and under-predicted changes in transverse tension strength in four-point bend tests. Furthermore, the Weibull slope varied with specimen configuration, volume, and sample size. Hence, this scaling law was not adequate for predicting the transverse tension strength of heterogeneous, fiber-reinforced, polymer matrix composites.

  13. 30 CFR 75.513-1 - Electric conductor; size.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Electric conductor; size. 75.513-1 Section 75.513-1 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY... Electric conductor; size. An electric conductor is not of sufficient size to have adequate carrying...

  14. 30 CFR 75.513-1 - Electric conductor; size.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Electric conductor; size. 75.513-1 Section 75.513-1 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY... Electric conductor; size. An electric conductor is not of sufficient size to have adequate carrying...

  15. 30 CFR 75.513-1 - Electric conductor; size.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Electric conductor; size. 75.513-1 Section 75... AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.513-1 Electric conductor; size. An electric conductor is not of sufficient size to have adequate carrying...

  16. 30 CFR 75.513-1 - Electric conductor; size.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Electric conductor; size. 75.513-1 Section 75... AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.513-1 Electric conductor; size. An electric conductor is not of sufficient size to have adequate carrying...

  17. 30 CFR 75.513-1 - Electric conductor; size.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Electric conductor; size. 75.513-1 Section 75... AND HEALTH MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Electrical Equipment-General § 75.513-1 Electric conductor; size. An electric conductor is not of sufficient size to have adequate carrying...

  18. Photoactivity of N-doped ZnO nanoparticles in oxidative and reductive reactions

    NASA Astrophysics Data System (ADS)

    Oliveira, Jéssica A.; Nogueira, André E.; Gonçalves, Maria C. P.; Paris, Elaine C.; Ribeiro, Caue; Poirier, Gael Y.; Giraldi, Tania R.

    2018-03-01

    N-doped ZnO is a prospective material for photocatalytic reactions. However, only oxidative paths are well investigated in the literature. This paper describes a comparative study of the potential of ZnO and ZnO:N for oxidative and reductive reactions, probed by rhodamine B dye photodegradation and CO2 photoreduction. The materials were prepared by the polymeric precursor method, using urea as a nitrogen source, and different heat treatments were used to observe their effects on surface decontamination, crystallinity, particle sizes and shapes, and photocatalytic performance. ZnO and ZnO:N presented a wurtzite crystalline structure and nanometric-scale particles. Samples submitted to higher temperatures showed lower specific surface areas, but higher crystallinity and lower contents of species adsorbed on their surfaces. On the other hand, the photocatalysts annealed for shorter times presented smaller crystallite sizes and lower crystallinity. These factors influenced the photoactivity in both types of reaction, i.e., oxidation and reduction, under ultraviolet and visible light, indicating that structural factors governed the charge separation underlying the photocatalytic activity, since the as-synthesized samples were versatile photocatalysts in both redox reactions.

  19. Mesoscale spatial variability of selected aquatic invertebrate community metrics from a minimally impaired stream segment

    USGS Publications Warehouse

    Gebler, J.B.

    2004-01-01

    The related topics of spatial variability of aquatic invertebrate community metrics, implications of spatial patterns of metric values to distributions of aquatic invertebrate communities, and ramifications of natural variability to the detection of human perturbations were investigated. Four metrics commonly used for stream assessment were computed for 9 stream reaches within a fairly homogeneous, minimally impaired stream segment of the San Pedro River, Arizona. Metric variability was assessed for differing sampling scenarios using simple permutation procedures. Spatial patterns of metric values suggest that aquatic invertebrate communities are patchily distributed on subsegment and segment scales, which causes metric variability. Wide ranges of metric values resulted in wide ranges of metric coefficients of variation (CVs) and minimum detectable differences (MDDs), and both CVs and MDDs often increased as sample size (number of reaches) increased, suggesting that any particular set of sampling reaches could yield misleading estimates of population parameters and of the effects that can be detected. Mean metric variabilities were substantial, with the result that only fairly large differences in metrics would be declared significant at α = 0.05 and β = 0.20. The number of reaches required to obtain MDDs of 10% and 20% varied with significance level and power, and differed for different metrics, but was generally large, ranging into tens and hundreds of reaches. Study results suggest that metric values from one or a small number of stream reach(es) may not be adequate to represent a stream segment, depending on effect sizes of interest, and that larger sample sizes are necessary to obtain reasonable estimates of metrics and sample statistics. For bioassessment to progress, spatial variability may need to be investigated in many systems and should be considered when designing studies and interpreting data.
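    The "simple permutation procedures" described can be mimicked by enumerating reach subsets of each size and summarizing the coefficient of variation, as in this sketch with hypothetical metric values standing in for the 9 reaches:

```python
import numpy as np
from itertools import combinations

def cv_by_sample_size(metric):
    """Enumerate all reach subsets of each size and report the mean
    coefficient of variation across subsets, mimicking a simple
    permutation procedure over reach-level metric values."""
    metric = np.asarray(metric, dtype=float)
    out = {}
    for k in range(2, len(metric) + 1):
        cvs = [np.std(metric[list(c)], ddof=1) / np.mean(metric[list(c)])
               for c in combinations(range(len(metric)), k)]
        out[k] = float(np.mean(cvs))
    return out

# Hypothetical richness values for the 9 reaches
print(cv_by_sample_size([24, 31, 19, 28, 35, 22, 27, 30, 21]))
```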

  20. Knowledge and attitude towards total knee arthroplasty among the public in Saudi Arabia: a nationwide population-based study.

    PubMed

    Al-Mohrej, Omar A; Alshammari, Faris O; Aljuraisi, Abdulrahman M; Bin Amer, Lujain A; Masuadi, Emad M; Al-Kenani, Nader S

    2018-04-01

    Studies on total knee arthroplasty (TKA) in Saudi Arabia are scarce, and none have reported public knowledge of and attitudes towards the procedure. Our study aims to measure knowledge and attitudes regarding TKA among the adult Saudi population. To obtain a representative sample for this cross-sectional survey, all 13 administrative areas were used as ready-made geographical clusters. For each cluster, stratified random sampling was performed to maximize participation in the study. In each area, random samples of mobile phone numbers were selected with a probability proportional to the administrative area's population size. The sample size calculation was based on the assumption that 50% of the participants would have some level of knowledge, with a 2% margin of error and a 95% confidence level. To reach our intended sample size of 1540, we contacted 1722 participants, for a response rate of 89.4%. The expected percentage of public knowledge was 50%; however, the actual percentage revealed by this study was much lower (29.7%). A stepwise multiple logistic regression was used to assess the factors that positively affected the knowledge score regarding TKA. Age [P = 0.016 with OR of 0.47], higher income [P = 0.001 with OR of 0.52] and a positive history of TKA or knowing someone who underwent the surgery [P < 0.001 with OR of 0.15] had a positive impact on the total knowledge score. There are still misconceptions among the public in Saudi Arabia concerning TKA, its indications and its results. We recommend that doctors use the results of our survey to assess their conversations with their patients, and to determine whether the results of the procedure are adequately clarified.

  1. Cluster randomised crossover trials with binary data and unbalanced cluster sizes: application to studies of near-universal interventions in intensive care.

    PubMed

    Forbes, Andrew B; Akram, Muhammad; Pilcher, David; Cooper, Jamie; Bellomo, Rinaldo

    2015-02-01

    Cluster randomised crossover trials have been utilised in recent years in the health and social sciences. Methods for analysis have been proposed; however, for binary outcomes, these have received little assessment of their appropriateness. In addition, methods for determination of sample size are currently limited to balanced cluster sizes both between clusters and between periods within clusters. This article aims to extend this work to unbalanced situations and to evaluate the properties of a variety of methods for analysis of binary data, with a particular focus on the setting of potential trials of near-universal interventions in intensive care to reduce in-hospital mortality. We derive a formula for sample size estimation for unbalanced cluster sizes, and apply it to the intensive care setting to demonstrate the utility of the cluster crossover design. We conduct a numerical simulation of the design in the intensive care setting and for more general configurations, and we assess the performance of three cluster summary estimators and an individual-data estimator based on binomial-identity-link regression. For settings similar to the intensive care scenario involving large cluster sizes and small intra-cluster correlations, the sample size formulae developed and analysis methods investigated are found to be appropriate, with the unweighted cluster summary method performing well relative to the more optimal but more complex inverse-variance weighted method. More generally, we find that the unweighted and cluster-size-weighted summary methods perform well, with the relative efficiency of each largely determined systematically from the study design parameters. Performance of individual-data regression is adequate with small cluster sizes but becomes inefficient for large, unbalanced cluster sizes. When outcome prevalences are 6% or less and the within-cluster-within-period correlation is 0.05 or larger, all methods display sub-nominal confidence interval coverage, with the less prevalent the outcome the worse the coverage. As with all simulation studies, conclusions are limited to the configurations studied. We confined attention to detecting intervention effects on an absolute risk scale using marginal models and did not explore properties of binary random effects models. Cluster crossover designs with binary outcomes can be analysed using simple cluster summary methods, and sample size in unbalanced cluster size settings can be determined using relatively straightforward formulae. However, caution needs to be applied in situations with low prevalence outcomes and moderate to high intra-cluster correlations. © The Author(s) 2014.
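    As a concrete picture of the best-performing analysis in this record, the sketch below implements one common form of an unweighted cluster-summary estimator for a cluster crossover trial (one within-cluster difference of period proportions, then a one-sample t-test against zero); the ICU mortality figures are invented for illustration.

```python
import numpy as np
from scipy.stats import ttest_1samp

def unweighted_cluster_summary(p_treat, p_ctrl):
    """Unweighted cluster-summary analysis of a cluster crossover
    trial: compute each cluster's difference in event proportions
    (treatment period minus control period), then test the mean
    difference against zero with a one-sample t-test."""
    d = np.asarray(p_treat) - np.asarray(p_ctrl)
    return d.mean(), ttest_1samp(d, 0.0)

# Hypothetical in-hospital mortality proportions for 12 ICUs
p_treat = [0.052, 0.048, 0.061, 0.055, 0.049, 0.057,
           0.050, 0.046, 0.058, 0.054, 0.051, 0.047]
p_ctrl  = [0.056, 0.050, 0.066, 0.060, 0.050, 0.060,
           0.055, 0.049, 0.061, 0.057, 0.054, 0.050]
print(unweighted_cluster_summary(p_treat, p_ctrl))
```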

  2. Inactivation of Alicyclobacillus acidoterrestris ATCC 49025 spores in apple juice by pulsed light. Influence of initial contamination and required reduction levels.

    PubMed

    Ferrario, Mariana I; Guerrero, Sandra N

    The purpose of this study was to analyze the response of different initial contamination levels of Alicyclobacillus acidoterrestris ATCC 49025 spores in apple juice as affected by pulsed light treatment (PL, batch mode, xenon lamp, 3 pulses/s, 0-71.6 J/cm²). Biphasic and Weibull frequency distribution models were used to characterize the relationship between inoculum size and treatment time and the reductions achieved after PL exposure. Additionally, a second-order polynomial model was computed to relate the required PL processing time to inoculum size and the requested log reductions. PL treatment caused up to 3.0-3.5 log reductions, depending on the initial inoculum size. Inactivation curves corresponding to PL-treated samples were adequately characterized by both the Weibull and biphasic models (adjusted R² of 94-96%), and revealed that lower initial inoculum sizes were associated with higher inactivation rates. According to the polynomial model, the predicted time for PL treatment increased exponentially with inoculum size. Copyright © 2017 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.
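    The Weibull inactivation model the record refers to is commonly written as log10(N/N0) = -(t/δ)^p, with scale δ and shape p. A minimal fitting sketch, with made-up exposure times and log reductions standing in for the published data:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Weibull (Mafart-type) inactivation model:
    log10(N/N0) = -(t / delta) ** p."""
    return -(t / delta) ** p

# Hypothetical PL exposure times (s) and measured log reductions
t = np.array([5, 10, 20, 30, 40, 50, 60], dtype=float)
logs = np.array([-0.4, -0.8, -1.6, -2.2, -2.6, -2.9, -3.1])
(delta, p), _ = curve_fit(weibull_log_survival, t, logs,
                          p0=(15.0, 1.0), bounds=(1e-6, np.inf))
print(delta, p)  # p < 1 would indicate the tailing typical of
                 # resistant spore subpopulations
```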

  3. Genetic stock assessment of spawning arctic cisco (Coregonus autumnalis) populations by flow cytometric determination of DNA content.

    PubMed

    Lockwood, S F; Bickham, J W

    1991-01-01

    Intraspecific variation in cellular DNA content was measured in five Coregonus autumnalis spawning populations from the Mackenzie River drainage, Canada, using flow cytometry. The rivers assayed were the Peel, Arctic Red, Mountain, Carcajou, and Liard rivers. DNA content was determined from whole blood preparations of fish from all rivers except the Carcajou, for which kidney tissue was used. DNA content measurements of kidney and blood preparations of the same fish from the Mountain River revealed statistically indistinguishable results. Mosaicism was found in blood preparations from the Peel, Arctic Red, Mountain, and Liard rivers, but was not observed in kidney tissue preparations from the Mountain or Carcajou rivers. The Liard River sample had significantly elevated mean DNA content relative to the other four samples; all other samples were statistically indistinguishable. Significant differences in mean DNA content among spawning stocks of a single species reinforces the need for adequate sample sizes of both individuals and populations when reporting "C" values for a particular species.

  4. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort and unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980

  5. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study.

    PubMed

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort and unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.

  6. The art and science of choosing efficacy endpoints for rare disease clinical trials.

    PubMed

    Cox, Gerald F

    2018-04-01

    An important challenge in rare disease clinical trials is to demonstrate a clinically meaningful and statistically significant response to treatment. Selecting the most appropriate and sensitive efficacy endpoints for a treatment trial is part art and part science. The types of endpoints should align with the stage of development (e.g., proof of concept vs. confirmation of clinical efficacy). The patient characteristics and disease stage should reflect the treatment goal of improving disease manifestations or preventing disease progression. For rare diseases, regulatory approval requires demonstration of clinical benefit, defined as how a patient feels, functions, or survives, in at least one adequate and well-controlled pivotal study conducted according to Good Clinical Practice. In some cases, full regulatory approval can occur using a validated surrogate biomarker, while accelerated, or provisional, approval can occur using a biomarker that is likely to predict clinical benefit. Rare disease studies are small by necessity and require the use of endpoints with large effect sizes to demonstrate statistical significance. Understanding the quantitative factors that determine effect size, and its impact on powering the study with an adequate sample size, is key to the successful choice of endpoints. Interpreting the clinical meaningfulness of an observed change in an efficacy endpoint can be justified by statistical methods, regulatory precedence, and clinical context. Heterogeneous diseases that affect multiple organ systems may be better accommodated by endpoints that assess mean change across multiple endpoints within the same patient rather than mean change in an individual endpoint across all patients. © 2018 Wiley Periodicals, Inc.
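    The link between effect size and feasible sample size can be made concrete with the usual normal-approximation formula for a two-arm comparison of means, n per arm ≈ 2(z_{1-α/2} + z_{1-β})² / d². This is the standard textbook formula, not a calculation taken from the article:

```python
from scipy.stats import norm

def n_per_arm(d, power=0.80, alpha=0.05):
    """Normal-approximation sample size per arm for a two-sample
    comparison of means with standardized effect size d."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z / d) ** 2

# Large effects keep rare-disease trials feasible:
for d in (0.5, 0.8, 1.2):
    print(d, round(n_per_arm(d)))  # ~63, ~25, ~11 per arm
```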

  7. A dataset describing brooding in three species of South African brittle stars, comprising seven high-resolution, micro X-ray computed tomography scans.

    PubMed

    Landschoff, Jannes; Du Plessis, Anton; Griffiths, Charles L

    2015-01-01

    Brooding brittle stars have a special mode of reproduction whereby they retain their eggs and juveniles inside respiratory body sacs called bursae. In the past, studying this phenomenon required disturbance of the sample by dissecting the adult. This caused irreversible damage and made the sample unsuitable for future studies. Micro X-ray computed tomography (μCT) is a promising technique, not only to visualise juveniles inside the bursae, but also to keep the sample intact and make the dataset of the scan available for future reference. Seven μCT scans of five freshly fixed (70 % ethanol) individuals, representing three differently sized brittle star species, provided adequate image quality to determine the numbers, sizes and postures of internally brooded young, as well as anatomy and morphology of adults. No staining agents were necessary to achieve high-resolution, high-contrast images, which permitted visualisations of both calcified and soft tissue. The raw data (projection and reconstruction images) are publicly available for download from GigaDB. Brittle stars of all sizes are suitable candidates for μCT imaging. This explicitly adds a new technique to the suite of tools available for studying the development of internally brooded young. The purpose of applying the technique was to visualise juveniles inside the adult, but because of the universally good quality of the dataset, the images can also be used for anatomical or comparative morphology-related studies of adult structures.

  8. A systematic approach to designing statistically powerful heteroscedastic 2 × 2 factorial studies while minimizing financial costs.

    PubMed

    Jan, Show-Li; Shieh, Gwowen

    2016-08-31

    The 2 × 2 factorial design is widely used for assessing the existence of interaction and the extent of generalizability of two factors where each factor has only two levels. Accordingly, research problems associated with the main effects and interaction effects can be analyzed with the selected linear contrasts. To correct for the potential heterogeneity of variance structure, the Welch-Satterthwaite test is commonly used as an alternative to the t test for detecting the substantive significance of a linear combination of mean effects. This study concerns the optimal allocation of group sizes for the Welch-Satterthwaite test in order to minimize the total cost while maintaining adequate power. The existing method suggests that the optimal ratio of sample sizes is proportional to the ratio of the population standard deviations divided by the square root of the ratio of the unit sampling costs. Instead, a systematic approach using an optimization technique and a screening search is presented to find the optimal solution. Numerical assessments revealed that the current allocation scheme generally does not give the optimal solution. Alternatively, the suggested approaches to power and sample size calculations give accurate and superior results under various treatment and cost configurations. The proposed approach improves upon the current method in both its methodological soundness and overall performance. Supplementary algorithms are also developed to aid the usefulness and implementation of the recommended technique in planning 2 × 2 factorial designs.
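    A hedged sketch of the power calculation underlying this kind of design work: the Welch-Satterthwaite approximate degrees of freedom combined with a noncentral-t power for a linear contrast. This is a generic textbook construction, not the authors' algorithm; the allocation rule they critique would set each n_i proportional to sigma_i divided by the square root of its unit cost.

```python
import numpy as np
from scipy.stats import nct, t

def welch_contrast_power(means, sds, ns, contrast, alpha=0.05):
    """Approximate power of the Welch-Satterthwaite test for a linear
    contrast of group means under unequal variances."""
    c = np.asarray(contrast, float)
    sds = np.asarray(sds, float)
    ns = np.asarray(ns, float)
    var_terms = c**2 * sds**2 / ns
    se = np.sqrt(var_terms.sum())
    # Satterthwaite approximation to the degrees of freedom
    df = var_terms.sum()**2 / (var_terms**2 / (ns - 1)).sum()
    ncp = np.dot(c, means) / se  # noncentrality parameter
    t_crit = t.ppf(1 - alpha / 2, df)
    return nct.sf(t_crit, df, ncp) + nct.cdf(-t_crit, df, ncp)

# Interaction contrast in a 2 x 2 factorial with heteroscedastic groups
print(welch_contrast_power(means=[10, 8, 8, 8], sds=[4, 2, 2, 2],
                           ns=[40, 20, 20, 20], contrast=[1, -1, -1, 1]))
```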

  9. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
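    A minimal sketch of the conditional-inversion idea for a total-time-scale hazard, assuming a Weibull-type cumulative hazard Lambda(t) = c * exp(x*beta) * t**a. The published algorithm is an R script and additionally handles random effects and risk-free intervals, which this sketch omits.

```python
import numpy as np

def simulate_recurrent(c, a, beta, x, t_max, rng):
    """Simulate one subject's recurrent event times for a total-time-scale
    cumulative hazard Lambda(t) = c * exp(x * beta) * t**a, by inverting the
    conditional survival P(T > t | T > t_prev) = exp(-(Lambda(t) - Lambda(t_prev)))."""
    scale = c * np.exp(beta * x)
    times, t_prev = [], 0.0
    while True:
        u = rng.uniform()
        # Solve Lambda(t) = Lambda(t_prev) - log(u) on the total time scale
        t_next = (t_prev**a - np.log(u) / scale) ** (1.0 / a)
        if t_next > t_max:  # administrative censoring at end of follow-up
            return times
        times.append(t_next)
        t_prev = t_next

rng = np.random.default_rng(42)
print(simulate_recurrent(c=0.5, a=1.5, beta=-0.7, x=1, t_max=5.0, rng=rng))
```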

  10. 77 FR 39442 - Receipts-Based, Small Business Size Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... RIN 3150-AJ14 [NRC-2012-0062] Receipts-Based, Small Business Size Standard AGENCY: Nuclear Regulatory... business size standard from $6.5 million to $7 million to conform to the standard set by the Small Business... updating the receipts-based, small business size standard from $6.5 million to $7.0 million. Adequate...

  11. 46 CFR 160.054-2 - Type and size.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Inflatable Liferafts § 160.054-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight type... special consideration. (b) Size. First-aid kits shall be of a size adequate for packing 12 standard single...

  12. 46 CFR 160.054-2 - Type and size.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Inflatable Liferafts § 160.054-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight type... special consideration. (b) Size. First-aid kits shall be of a size adequate for packing 12 standard single...

  13. 46 CFR 160.054-2 - Type and size.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Inflatable Liferafts § 160.054-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight type... special consideration. (b) Size. First-aid kits shall be of a size adequate for packing 12 standard single...

  14. 46 CFR 160.054-2 - Type and size.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Inflatable Liferafts § 160.054-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight type... special consideration. (b) Size. First-aid kits shall be of a size adequate for packing 12 standard single...

  15. 46 CFR 160.054-2 - Type and size.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Inflatable Liferafts § 160.054-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight type... special consideration. (b) Size. First-aid kits shall be of a size adequate for packing 12 standard single...

  16. Lexical Threshold Revisited: Lexical Text Coverage, Learners' Vocabulary Size and Reading Comprehension

    ERIC Educational Resources Information Center

    Laufer, Batia; Ravenhorst-Kalovski, Geke C.

    2010-01-01

    We explore the relationship between second language (L2) learners' vocabulary size, lexical text coverage that their vocabulary provides and their reading comprehension. We also conceptualize "adequate reading comprehension" and look for the lexical threshold for such reading in terms of coverage and vocabulary size. Vocabulary size was…

  17. Size relationships between the parasitic copepod, Lernanthropus cynoscicola , and its fish host, Cynoscion guatucupa.

    PubMed

    Timi, J T; Lanfranchi, A L

    2006-02-01

    The effects of the size of Cynoscion guatucupa on the size and demographic parameters of their parasitic copepod Lernanthropus cynoscicola were evaluated. Prevalence of copepods increased with host size up to fish of intermediate length, then it decreased, probably because changes in size of gill filaments affect their attachment capability, enhancing the possibility of being detached by respiratory currents. Body length of copepods was significantly correlated with host length, indicating that only parasites of an 'adequate' size can be securely attached to a fish of a given size. The absence of a relationship between the coefficient of variability in copepod length and both host length and number of conspecifics, together with the host-size dependence of both male and juvenile female sizes, prevents interpreting this relationship as a phenomenon of developmental plasticity. Therefore, the observed peak of prevalence could reflect the distribution of size frequencies in the population of copepods, with more individuals near the average length. In conclusion, the 'optimum' host size for L. cynoscicola could merely be the adequate size for most individuals in the population, depending, therefore, on a population-level attribute of the parasites. However, its location along the host size range could be determined by a balance between fecundity and number of available hosts, which increases and decreases, respectively, with both host and parasite size.

  18. Evaluation of selected information on splitting devices for water samples

    USGS Publications Warehouse

    Capel, P.D.; Larson, S.J.

    1996-01-01

    Four devices for splitting water samples into representative aliquots are used by the U.S. Geological Survey's Water Resources Division. A thorough evaluation of these devices (14-liter churn, 8-liter churn, plastic cone, and Teflon cone) encompasses a wide variety of concerns, based on both chemical and physical considerations. This report surveys the existing data (as of April 1994) on cleaning efficiency and splitting capability of these devices and presents the data in a systematic framework for evaluation. From the existing data, some of these concerns are adequately or partially addressed, but the majority of concerns could not be addressed because of the lack of data. In general, the existing cleaning and transport protocols are adequate at the milligram per liter level, but the adequacy is largely unknown for trace elements and organic chemicals at lower concentrations. The existing data indicate that better results are obtained when the splitters are cleaned in the laboratory rather than in the field. Two conclusions that can be reached on the splitting capability of solids are that more work must be done with all four devices to characterize and quantify their limitations and range of usefulness, and that the 14-liter churn (and by association, the 8-liter churn) is not useful in obtaining representative splits of sand-sized particles.

  19. The Relationship between Organizational Culture Types and Innovation in Aerospace Companies

    NASA Astrophysics Data System (ADS)

    Nelson, Adaora N.

    Innovation in the aerospace industry has proven to be an effective strategy for competitiveness and sustainability. The organizational culture of the firm must be conducive to innovation. The problem was that although innovation is needed for aerospace companies to be competitive and sustainable, certain organizational culture issues might hinder leaders from successfully innovating (Emery, 2010; Ramanigopal, 2012). The purpose of this study was to assess the relationship between hierarchical, clan, adhocracy, and market organizational culture types and innovation in aerospace companies within the U.S. while controlling for company size and length of time in business. The non-experimental quantitative study included a random sample of 136 aerospace leaders in the U.S. There was a significant relationship between market organizational culture and innovation, F(1,132) = 4.559, p = .035. No significant relationships were found between hierarchical organizational culture and innovation and between clan culture and innovation. The relationship between adhocracy culture and innovation was not significant, possibly due to an inadequate sample size. Company size was shown to be a justifiable covariate in the study, due to a significant relationship with innovation (F(1, 130) = 4.66, p < .1, r = .19). Length of time in business had no relationship with innovation. The findings imply that market organizational cultures are more likely to result in innovative outcomes in the aerospace industry. Organizational leaders are encouraged to adopt a market culture and adopt smaller organizational structures. Recommendations for further research include investigating the relationship between adhocracy culture and innovation using an adequate sample size. Research is needed to determine other variables that predict innovation. This study should be repeated at periodic intervals and across other industrial sectors and countries.

  20. Pilot application study of corridor performance indicators

    DOT National Transportation Integrated Search

    1998-09-16

    The need for effective multimodal performance indicators (or measures) is becoming increasingly important for adequate planning in all sizes of transportation environments, including small and medium-size communities. These measures are essential for...

  1. [Comparative quality measurements part 3: funnel plots].

    PubMed

    Kottner, Jan; Lahmann, Nils

    2014-02-01

    Comparative quality measurements between organisations or institutions are common. Quality measures need to be standardised and risk adjusted. Random error must also be taken adequately into account. Rankings that do not consider precision lead to flawed interpretations and enhance "gaming". Application of confidence intervals is one possibility for taking chance variation into account. Funnel plots are modified control charts based on Statistical Process Control (SPC) theory. The quality measures are plotted against their sample size. Warning and control limits 2 or 3 standard deviations from the center line are added. With increasing group size the precision increases, so the control limits form a funnel. Data points within the control limits are considered to show common cause variation; data points outside them, special cause variation, without the distraction of spurious rankings. Funnel plots offer data-based information about how to evaluate institutional performance within quality management contexts.
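    A minimal sketch of such a funnel plot for a proportion-type quality measure, with 2 SD warning and 3 SD control limits around the overall mean (synthetic data; matplotlib assumed available):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic event proportions for 30 institutions of varying size
rng = np.random.default_rng(1)
n = rng.integers(20, 500, size=30)
p_obs = rng.binomial(n, 0.12) / n

p0 = p_obs.mean()  # center line (a weighted mean is common in practice)
m = np.linspace(n.min(), n.max(), 200)
se = np.sqrt(p0 * (1 - p0) / m)

plt.scatter(n, p_obs)
for z, style in [(2, "--"), (3, ":")]:  # 2 SD warning, 3 SD control limits
    plt.plot(m, p0 + z * se, style, color="gray")
    plt.plot(m, p0 - z * se, style, color="gray")
plt.axhline(p0, color="black")
plt.xlabel("sample size")
plt.ylabel("quality measure (proportion)")
plt.show()
```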

  2. Medication safety research by observational study design.

    PubMed

    Lao, Kim S J; Chui, Celine S L; Man, Kenneth K C; Lau, Wallis C Y; Chan, Esther W; Wong, Ian C K

    2016-06-01

    Observational studies have been recognised as essential for investigating the safety profile of medications. Numerous observational studies have been conducted on the platform of large population databases, which provide adequate sample size and follow-up length to detect infrequent and/or delayed clinical outcomes. Cohort and case-control designs are well-accepted traditional methodologies for hypothesis testing, while within-individual study designs are still developing and evolving, addressing previously known methodological limitations to reduce confounding and bias. Respective examples of observational studies of different study designs using medical databases are shown. The methodological characteristics, study assumptions, strengths and weaknesses of each method are discussed in this review.

  3. 46 CFR 160.041-2 - Type and size.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Merchant Vessels § 160.041-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight cabinet carrying... consideration. (b) Size. First-aid kits shall be of a size (approximately 9″ × 9″ × 2½″ inside) adequate for...

  4. 46 CFR 160.041-2 - Type and size.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Merchant Vessels § 160.041-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight cabinet carrying... consideration. (b) Size. First-aid kits shall be of a size (approximately 9″ × 9″ × 2½″ inside) adequate for...

  5. 46 CFR 160.041-2 - Type and size.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Merchant Vessels § 160.041-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight cabinet carrying... consideration. (b) Size. First-aid kits shall be of a size (approximately 9″ × 9″ × 2½″ inside) adequate for...

  6. 46 CFR 160.041-2 - Type and size.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Merchant Vessels § 160.041-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight cabinet carrying... consideration. (b) Size. First-aid kits shall be of a size (approximately 9″ × 9″ × 2½″ inside) adequate for...

  7. 46 CFR 160.041-2 - Type and size.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Kits, First-Aid, for Merchant Vessels § 160.041-2 Type and size. (a) Type. First-aid kits covered by this specification shall be of the water-tight cabinet carrying... consideration. (b) Size. First-aid kits shall be of a size (approximately 9″ × 9″ × 2½″ inside) adequate for...

  8. Benthic macroinvertebrate field sampling effort required to produce a sample adequate for the assessment of rivers and streams of Neuquén Province, Argentina

    EPA Science Inventory

    This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distribut...

  9. The Hartung-Knapp-Sidik-Jonkman method for random effects meta-analysis is straightforward and considerably outperforms the standard DerSimonian-Laird method

    PubMed Central

    2014-01-01

    Background The DerSimonian and Laird approach (DL) is widely used for random effects meta-analysis, but this often results in inappropriate type I error rates. The method described by Hartung, Knapp, Sidik and Jonkman (HKSJ) is known to perform better when trials of similar size are combined. However, evidence in realistic situations, where one trial might be much larger than the other trials, is lacking. We aimed to evaluate the relative performance of the DL and HKSJ methods when studies of different sizes are combined and to develop a simple method to convert DL results to HKSJ results. Methods We evaluated the performance of the HKSJ versus DL approach in simulated meta-analyses of 2–20 trials with varying sample sizes and between-study heterogeneity, and allowing trials to have various sizes, e.g. 25% of the trials being 10 times larger than the smaller trials. We also compared the number of “positive” (statistically significant at p < 0.05) findings using empirical data of recent meta-analyses with ≥ 3 studies of interventions from the Cochrane Database of Systematic Reviews. Results The simulations showed that the HKSJ method consistently resulted in more adequate error rates than the DL method. When the significance level was 5%, the HKSJ error rates at most doubled, whereas for DL they could be over 30%. DL, and, far less so, HKSJ had more inflated error rates when the combined studies had unequal sizes and between-study heterogeneity. The empirical data from 689 meta-analyses showed that 25.1% of the significant findings for the DL method were non-significant with the HKSJ method. DL results can be easily converted into HKSJ results. Conclusions Our simulations showed that the HKSJ method consistently results in more adequate error rates than the DL method, especially when the number of studies is small, and can easily be applied routinely in meta-analyses. Even with the HKSJ method, extra caution is needed when there are ≤ 5 studies of very unequal sizes. PMID:24548571
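    A hedged sketch of the two variance estimators being compared: DerSimonian-Laird tau² with a normal-based interval versus the HKSJ weighted residual variance with a t(k-1) interval. These are the standard formulas; the example data are made up.

```python
import numpy as np
from scipy.stats import norm, t

def dl_vs_hksj(y, v, alpha=0.05):
    """Random-effects pooling of effects y with within-study variances v:
    DerSimonian-Laird tau^2 and normal CI versus the HKSJ t-based CI."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v
    mu_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fe) ** 2)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    ws = 1.0 / (v + tau2)
    mu = np.sum(ws * y) / np.sum(ws)
    se_dl = np.sqrt(1.0 / np.sum(ws))  # DL: normal reference distribution
    ci_dl = mu + np.array([-1, 1]) * norm.ppf(1 - alpha / 2) * se_dl
    # HKSJ: weighted residual variance with a t(k-1) reference distribution
    se_hksj = np.sqrt(np.sum(ws * (y - mu) ** 2) / ((k - 1) * np.sum(ws)))
    ci_hksj = mu + np.array([-1, 1]) * t.ppf(1 - alpha / 2, k - 1) * se_hksj
    return mu, ci_dl, ci_hksj

print(dl_vs_hksj(y=[0.3, 0.1, 0.5, -0.2], v=[0.04, 0.01, 0.09, 0.02]))
```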

  10. A community trial of the impact of improved sexually transmitted disease treatment on the HIV epidemic in rural Tanzania: 2. Baseline survey results.

    PubMed

    Grosskurth, H; Mosha, F; Todd, J; Senkoro, K; Newell, J; Klokke, A; Changalucha, J; West, B; Mayaud, P; Gavyole, A

    1995-08-01

    To determine baseline HIV prevalence in a trial of improved sexually transmitted disease (STD) treatment, and to investigate risk factors for HIV. To assess comparability of intervention and comparison communities with respect to HIV/STD prevalence and risk factors. To assess adequacy of sample size. Twelve communities in Mwanza Region, Tanzania: one matched pair of roadside communities, four pairs of rural communities, and one pair of island communities. One community from each pair was randomly allocated to receive the STD intervention following the baseline survey. Approximately 1000 adults aged 15-54 years were randomly sampled from each community. Subjects were interviewed, and HIV and syphilis serology performed. Men with a positive leucocyte esterase dipstick test on urine, or reporting a current STD, were tested for urethral infections. A total of 12,534 adults were enrolled. Baseline HIV prevalences were 7.7% (roadside), 3.8% (rural) and 1.8% (islands). Associations were observed with marital status, injections, education, travel, history of STD and syphilis serology. Prevalence was higher in circumcised men, but not significantly after adjusting for confounders. Intervention and comparison communities were similar in the prevalence of HIV (3.8 versus 4.4%), active syphilis (8.7 versus 8.2%), and most recorded risk factors. Within-pair variability in HIV prevalence was close to the value assumed for sample size calculations. The trial cohort was successfully established. Comparability of intervention and comparison communities at baseline was confirmed for most factors. Matching appears to have achieved a trial of adequate sample size. The apparent lack of a protective effect of male circumcision contrasts with other studies in Africa.

  11. Impact of specimen adequacy on the assessment of renal allograft biopsy specimens.

    PubMed

    Cimen, S; Geldenhuys, L; Guler, S; Imamoglu, A; Molinari, M

    2016-01-01

    The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on suspected diagnosis and time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis among the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, and in 66.7 and 50% for minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and 16.7% of the unsatisfactory specimens. For the entire sample, full agreement was found in 71.4%, partial agreement in 20.4% and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25, showing that, probably due to the small sample size, the results were not statistically significant. Specimen adequacy may be a determinant of diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making while dealing with biopsy reports based on minimal or unsatisfactory specimens.
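    For readers unfamiliar with the agreement statistic used here, a two-line illustration of Cohen's kappa on hypothetical paired readings (the labels are invented; scikit-learn assumed available):

```python
from sklearn.metrics import cohen_kappa_score

# Invented paired readings of the same biopsy specimens
first_reading  = ["IA", "IB", "none", "IA", "IIA", "none"]
second_reading = ["IA", "IB", "none", "IB", "IIA", "none"]
print(cohen_kappa_score(first_reading, second_reading))
```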

  12. ELECTROFISHING DISTANCE NEEDED TO ESTIMATE FISH SPECIES RICHNESS IN RAFTABLE WESTERN USA RIVERS

    EPA Science Inventory

    A critical issue in river monitoring is the minimum amount of sampling distance required to adequately represent the fish assemblage of a reach. Determining adequate sampling distance is important because it affects estimates of fish assemblage integrity and diversity at local a...

  13. Apollo rocks, fines and soil cores

    NASA Astrophysics Data System (ADS)

    Allton, J.; Bevill, T.

    Apollo rocks and soils not only established basic lunar properties and ground truth for global remote sensing, they also provided important lessons for planetary protection (Adv. Space Res., 1998, v. 22, no. 3, pp. 373-382). The six Apollo missions returned 2196 samples weighing 381.7 kg, comprised of rocks, fines, soil cores and 2 gas samples. By examining which samples were allocated for scientific investigations, information was obtained on usefulness of sampling strategy, sampling devices and containers, sample types and diversity, and on size of sample needed by various disciplines. Diversity was increased by using rakes to gather small rocks on the Moon and by removing fragments >1 mm from soils by sieving in the laboratory. Breccias and soil cores are diverse internally. Per unit weight these samples were more often allocated for research. Apollo investigators became adept at wringing information from very small sample sizes. By pushing the analytical limits, the main concern was adequate size for representative sampling. Typical allocations for trace element analyses were 750 mg for rocks, 300 mg for fines and 70 mg for core subsamples. Age-dating and isotope systematics allocations were typically 1 g for rocks and fines, but only 10% of that amount for core depth subsamples. Historically, allocations for organics and microbiology were 4 g (10% for cores). Modern allocations for biomarker detection are 100 mg. Other disciplines supported have been cosmogenic nuclides, rock and soil petrology, sedimentary volatiles, reflectance, magnetics, and biohazard studies. Highly applicable to future sample return missions was the Apollo experience with organic contamination, estimated to be from 1 to 5 ng/g sample for Apollo 11 (Simoneit & Flory, 1970; Apollo 11, 12 & 13 Organic Contamination Monitoring History, U.C. Berkeley; Burlingame et al., 1970, Apollo 11 LSC, pp. 1779-1792). Eleven sources of contaminants, of which 7 are applicable to robotic missions, were identified and reduced, thus improving Apollo 12 samples to 0.1 ng/g. Apollo sample documentation preserves the parentage, orientation, location, packaging, handling, and environmental histories of each of the 90,000 subsamples currently curated. Active research on Apollo samples continues today, and because 80% by weight of the Apollo collection remains pristine, researchers have a reservoir of material to support studies well into the future.

  14. 46 CFR 148.250 - Direct reduced iron (DRI); hot-molded briquettes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... concentration of fines (pieces less than 6.35mm in size) in any one location in the cargo hold. (f) Adequate... hot-molded briquettes. (h) Radar and RDF scanners must be adequately protected against dust generated during cargo transfer operations of DRI hot-molded briquettes. (i) During final discharge only, a fine...

  15. 46 CFR 148.250 - Direct reduced iron (DRI); hot-molded briquettes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... concentration of fines (pieces less than 6.35mm in size) in any one location in the cargo hold. (f) Adequate... hot-molded briquettes. (h) Radar and RDF scanners must be adequately protected against dust generated during cargo transfer operations of DRI hot-molded briquettes. (i) During final discharge only, a fine...

  16. 46 CFR 148.250 - Direct reduced iron (DRI); hot-molded briquettes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... concentration of fines (pieces less than 6.35mm in size) in any one location in the cargo hold. (f) Adequate... hot-molded briquettes. (h) Radar and RDF scanners must be adequately protected against dust generated during cargo transfer operations of DRI hot-molded briquettes. (i) During final discharge only, a fine...

  17. 46 CFR 148.250 - Direct reduced iron (DRI); hot-molded briquettes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... concentration of fines (pieces less than 6.35mm in size) in any one location in the cargo hold. (f) Adequate... hot-molded briquettes. (h) Radar and RDF scanners must be adequately protected against dust generated during cargo transfer operations of DRI hot-molded briquettes. (i) During final discharge only, a fine...

  18. 7 CFR 1436.8 - Security for loan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... adequate size and value at the time of the application as determined by the county committee to adequately... such collateral except for prior liens on the underlying real estate that by operation of law attach to the collateral if it is or will become a fixture. If any such prior lien on the real estate will...

  19. 7 CFR 1436.8 - Security for loan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... adequate size and value at the time of the application as determined by the county committee to adequately... such collateral except for prior liens on the underlying real estate that by operation of law attach to the collateral if it is or will become a fixture. If any such prior lien on the real estate will...

  20. Designing clinical trials to test disease-modifying agents: application to the treatment trials of Alzheimer's disease.

    PubMed

    Xiong, Chengjie; van Belle, Gerald; Miller, J Philip; Morris, John C

    2011-02-01

    Therapeutic trials of disease-modifying agents on Alzheimer's disease (AD) require novel designs and analyses involving a switch of treatments for at least a portion of subjects enrolled. Randomized start and randomized withdrawal designs are two examples of such designs. Crucial design parameters such as sample size and the time of treatment switch are important to understand in designing such clinical trials. The purpose of this article is to provide methods to determine sample sizes and time of treatment switch as well as optimum statistical tests of treatment efficacy for clinical trials of disease-modifying agents on AD. A general linear mixed effects model is proposed to test the disease-modifying efficacy of novel therapeutic agents on AD. This model links the longitudinal growth from both the placebo arm and the treatment arm at the time of treatment switch for those in the delayed treatment arm or early withdrawal arm, and incorporates the potential correlation in the rate of cognitive change before and after the treatment switch. Sample sizes and the optimum time for treatment switch of such trials, as well as the optimum test statistic for the treatment efficacy, are determined according to the model. Assuming an evenly spaced longitudinal design over a fixed duration, the optimum treatment switching time in a randomized start or a randomized withdrawal trial is halfway through the trial. With the optimum test statistic for the treatment efficacy and over a wide spectrum of model parameters, the optimum sample size allocations are fairly close to the simplest design with a sample size ratio of 1:1:1 among the treatment arm, the delayed treatment or early withdrawal arm, and the placebo arm. The application of the proposed methodology to AD provides evidence that much larger sample sizes are required to adequately power disease-modifying trials when compared with those for symptomatic agents, even when the treatment switch time and efficacy test are optimally chosen. The proposed method assumes that the only and immediate effect of treatment switch is on the rate of cognitive change. Crucial design parameters for the clinical trials of disease-modifying agents on AD can be optimally chosen. Government and industry officials as well as academic researchers should consider the optimum use of the clinical trials design for disease-modifying agents on AD in their effort to search for treatments with the potential to modify the underlying pathophysiology of AD.

  1. Assessing the effect of self instructional module on knowledge of menopause & hormone replacement therapy for menopausal women in Moradabad (UP).

    PubMed

    Khalid, Mehvish; Chhuggani, Manju

    2014-01-01

    Objectives of the study were to identify the problems faced by menopausal women and to find out the remedial measures adopted by them, to assess the knowledge of menopausal women regarding menopause & hormone replacement therapy (HRT) before and after administration of a self-instructional module (SIM), and to find out the acceptability and utility of the SIM. An evaluative research approach with a pre-experimental, one-group pre-test post-test design was adopted. A purposive sampling technique was used to obtain an adequately sized sample. The sample comprised 100 menopausal women living in a selected community of Moradabad (UP). A knowledge questionnaire and an opinionnaire were administered, along with the SIM on menopause and HRT. It was found that there was a deficit in knowledge of menopausal women regarding menopause and HRT. Mean post-test knowledge scores were significantly higher than mean pre-test knowledge scores. The SIM was found highly acceptable and useful by menopausal women.

  2. The Importance and Role of Intracluster Correlations in Planning Cluster Trials

    PubMed Central

    Preisser, John S.; Reboussin, Beth A.; Song, Eun-Young; Wolfson, Mark

    2008-01-01

    There is increasing recognition of the critical role of intracluster correlations of health behavior outcomes in cluster intervention trials. This study examines the estimation, reporting, and use of intracluster correlations in planning cluster trials. We use an estimating equations approach to estimate the intracluster correlations corresponding to the multiple-time-point nested cross-sectional design. Sample size formulae incorporating 2 types of intracluster correlations are examined for the purpose of planning future trials. The traditional intracluster correlation is the correlation among individuals within the same community at a specific time point. A second type is the correlation among individuals within the same community at different time points. For a “time × condition” analysis of a pretest–posttest nested cross-sectional trial design, we show that statistical power considerations based upon a posttest-only design generally are not an adequate substitute for sample size calculations that incorporate both types of intracluster correlations. Estimation, reporting, and use of intracluster correlations are illustrated for several dichotomous measures related to underage drinking collected as part of a large nonrandomized trial to enforce underage drinking laws in the United States from 1998 to 2004. PMID:17879427
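    A hedged sketch of how the two intracluster correlations can enter a sample-size calculation for the time × condition effect. It assumes a common within-time ICC (rho1), a between-time ICC (rho2), and a community-level variance of the pre-post change of the form (2*sigma2/m)*(1 + (m-1)*rho1 - m*rho2); this is one plausible textbook-style form for a nested cross-sectional design, not necessarily the exact expression in the article.

```python
import math
from scipy.stats import norm

def communities_per_arm(delta, sigma2, m, rho1, rho2, alpha=0.05, power=0.80):
    """Communities per arm for a time-by-condition test in a pretest-posttest
    nested cross-sectional design, m individuals sampled per community per wave.
    rho1: ICC within a community at one time; rho2: ICC across time points."""
    var_change = (2 * sigma2 / m) * (1 + (m - 1) * rho1 - m * rho2)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil(2 * z**2 * var_change / delta**2)

# Ignoring the cross-time correlation (rho2 = 0), as a posttest-only
# calculation effectively does, changes the required number of communities:
print(communities_per_arm(delta=0.05, sigma2=0.25, m=100, rho1=0.02, rho2=0.01))
print(communities_per_arm(delta=0.05, sigma2=0.25, m=100, rho1=0.02, rho2=0.0))
```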

  3. Air-Q intubating laryngeal airway: A study of the second generation supraglottic airway device.

    PubMed

    Attarde, Viren Bhaskar; Kotekar, Nalini; Shetty, Sarika M

    2016-05-01

    Air-Q intubating laryngeal mask airway (ILA) is used as a supraglottic airway device and as a conduit for endotracheal intubation. This study aims to assess the efficacy of the Air-Q ILA regarding ease of insertion, adequacy of ventilation, rate of successful intubation, haemodynamic response and airway morbidity. Sixty patients presenting for elective surgery at our Medical College Hospital were selected. Following adequate premedication, baseline vital parameters, pulse rate and blood pressure were recorded. Air-Q size 3.5 was selected for patients weighing 50-70 kg and size 4.5 for those weighing 70-100 kg. After achieving adequate intubating conditions, the Air-Q ILA was introduced. After confirming adequate ventilation, an appropriately sized endotracheal tube was advanced through the Air-Q blindly to intubate the trachea. Placement of the endotracheal tube in the trachea was confirmed. The Air-Q ILA was successfully inserted in 88.3% of patients on the first attempt and in 11.7% of patients on the second attempt. Ventilation was adequate in 100% of patients. Intubation was successful in 76.7% of patients with the Air-Q ILA. 23.3% of patients were intubated by direct laryngoscopy following failure of two attempts using the Air-Q ILA. Post-intubation, the change in heart rate was statistically significant (P < 0.0001). 10% of patients were noted to have a sore throat and 5% of patients had mild airway trauma. The Air-Q ILA is a reliable device as a supraglottic airway ensuring adequate ventilation as well as a conduit for endotracheal intubation. It benefits the patient by avoiding the stress of direct laryngoscopy and is also a superior alternative device for use in a difficult airway.

  4. Comparative Analysis of Registered Nurses' and Nursing Students' Attitudes and Use of Nonpharmacologic Methods of Pain Management.

    PubMed

    Stewart, Malcolm; Cox-Davenport, Rebecca A

    2015-08-01

    Despite the benefits that nonpharmacologic methods of pain management have to offer, nurses cite barriers that inhibit their use in practice. The purpose of this research study was to compare the perceptions of prelicensed student nurses (SNs) and registered nurses (RNs) toward nonpharmacologic methods of pain management. A sample of 64 students and 49 RNs was recruited. Each participant completed a questionnaire about their use and perceptions of nonpharmacologic pain control methods. Sixty-nine percent of RNs reported a stronger belief that nonpharmacologic methods gave relief to their patients compared with 59% of SNs (p = .028). Seventy-five percent of student nurses felt they had adequate education about nonpharmacologic pain modalities, compared with 51% of RNs, who felt less than adequately educated (p = .016). These findings highlight the need for education about nonpharmacologic approaches to pain management. Applications of these findings may decrease barriers to the use of nonpharmacologic methods of pain management. Copyright © 2015 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  5. A comparison of selected MMPI-2 and MMPI-2-RF validity scales in assessing effort on cognitive tests in a military sample.

    PubMed

    Jones, Alvin; Ingram, M Victoria

    2011-10-01

    Using a relatively new statistical paradigm, Optimal Data Analysis (ODA; Yarnold & Soltysik, 2005), this research demonstrated that newly developed scales for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and MMPI-2 Restructured Form (MMPI-2-RF) specifically designed to assess over-reporting of cognitive and/or somatic symptoms were more effective than the MMPI-2 F-family of scales in predicting effort status on tests of cognitive functioning in a sample of 288 military members. ODA demonstrated that when all scales were performing at their theoretical maximum possible level of classification accuracy, the Henry Heilbronner Index (HHI), Response Bias Scale (RBS), Fake Bad Scale (FBS), and the Symptom Validity Scale (FBS-r) outperformed the F-family of scales on a variety of ODA indexes of classification accuracy, including an omnibus measure (effect strength total, EST) of the descriptive and prognostic utility of ODA models developed for each scale. Based on the guidelines suggested by Yarnold and Soltysik for evaluating effect strengths for ODA models, the newly developed scales had effect strengths that were moderate (37.66 to 45.68), whereas the F-family scales had effect strengths that ranged from weak to moderate (15.42 to 32.80). In addition, traditional analysis demonstrated that HHI, RBS, FBS, and FBS-r had large effect sizes (0.98 to 1.16) based on Cohen's (1988) suggested categorization of effect size when comparing mean scores for adequate versus inadequate effort groups, whereas the F-family of scales had small to medium effect sizes (0.25 to 0.76). The MMPI-2-RF Infrequent Somatic Responses Scale (F(S)) tended to perform in a fashion similar to F, the best performing F-family scale.

  6. Choosing a design to fit the situation: how to improve specificity and positive predictive values using Bayesian lot quality assurance sampling.

    PubMed

    Olives, Casey; Pagano, Marcello

    2013-02-01

    Lot Quality Assurance Sampling (LQAS) is a provably useful tool for monitoring health programmes. Although LQAS ensures acceptable Producer and Consumer risks, the literature alleges that the method suffers from poor specificity and positive predictive values (PPVs). We suggest that poor LQAS performance is due, in part, to variation in the true underlying distribution. However, until now the role of the underlying distribution in expected performance has not been adequately examined. We present Bayesian-LQAS (B-LQAS), an approach to incorporating prior information into the choice of the LQAS sample size and decision rule, and explore its properties through a numerical study. Additionally, we analyse vaccination coverage data from UNICEF's State of the World's Children in 1968-1989 and 2008 to exemplify the performance of LQAS and B-LQAS. Results of our numerical study show that the choice of LQAS sample size and decision rule is sensitive to the distribution of prior information, as well as to individual beliefs about the importance of correct classification. Application of the B-LQAS approach to the UNICEF data improves specificity and PPV in both time periods (1968-1989 and 2008) with minimal reductions in sensitivity and negative predictive value. LQAS is shown to be a robust tool that is not necessarily prone to poor specificity and PPV as previously alleged. In situations where prior or historical data are available, B-LQAS can lead to improvements in expected performance.
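    The classical LQAS backbone that B-LQAS builds on can be written as a small search over (n, d) pairs subject to the Producer and Consumer risks. The Bayesian step, which weights these risks by a prior distribution and is the article's contribution, is omitted from this sketch; the thresholds below are illustrative.

```python
from scipy.stats import binom

def lqas_design(p_lo, p_hi, alpha=0.10, beta=0.10, n_max=200):
    """Smallest classical LQAS design (n, d): classify as acceptable when at
    least d of n sampled subjects are 'successes', subject to
    consumer risk P(accept | p_lo) <= beta and
    producer risk P(reject | p_hi) <= alpha."""
    for n in range(1, n_max + 1):
        for d in range(n + 1):
            if (binom.sf(d - 1, n, p_lo) <= beta          # P(X >= d | p_lo)
                    and binom.cdf(d - 1, n, p_hi) <= alpha):  # P(X < d | p_hi)
                return n, d
    return None

print(lqas_design(p_lo=0.50, p_hi=0.80))  # e.g. coverage thresholds of 50% vs 80%
```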

  7. A Protocol to Preserve the Integrity of Stable Fly (Diptera: Muscidae) DNA for Long Distance Shipment

    USDA-ARS?s Scientific Manuscript database

    Population genetic studies on a global scale may be hampered by the ability to acquire quality samples from distant countries. Preservation methods must be adequate to prevent the samples from decay during shipping, so an adequate quantity of quality DNA can be extracted for analysis, and materials...

  8. Relationships between diatoms and tidal environments in Oregon and Washington, USA

    USGS Publications Warehouse

    Sawai, Yuki; Horton, Benjamin P.; Kemp, Andrew C.; Hawkes, Andrea D.; Nagumo, Tamostsu; Nelson, Alan R.

    2016-01-01

    A new regional dataset comprising 425 intertidal diatom taxa from 175 samples from 11 ecologically diverse Oregon and Washington estuaries illustrates the importance of compiling a large modern dataset from a range of sites. Cluster analyses and detrended correspondence analysis of the diatom assemblages identify distinct vertical zones within supratidal, intertidal and subtidal environments at six of the 11 study sites, but the abundance of some of the most common species varies widely among and within sites. Canonical correspondence analysis of the regional dataset shows relationships between diatom species and tidal exposure, salinity and substratum (grain size and organic content). Correspondence analyses of local datasets show higher values of explained variation than the analysis of the combined regional dataset. Our results emphasize that studies of the autecology of diatom species require many samples from a range of modern environments to adequately characterize species–environment relationships.

  9. Exploring the Factor Structure of Neurocognitive Measures in Older Individuals

    PubMed Central

    Santos, Nadine Correia; Costa, Patrício Soares; Amorim, Liliana; Moreira, Pedro Silva; Cunha, Pedro; Cotter, Jorge; Sousa, Nuno

    2015-01-01

    Here we focus on factor analysis from a best practices point of view, by investigating the factor structure of neuropsychological tests and using the results obtained to illustrate on choosing a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the “best fit” model via confirmatory factor analysis (CFA). For the exploratory step, three extraction (maximum likelihood, principal axis factoring and principal components) and two rotation (orthogonal and oblique) methods were used. The analysis methodology allowed exploring how different cognitive/psychological tests correlated/discriminated between dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distribution, reflective models with oblimin rotation might prove the most adequate. PMID:25880732
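    A minimal sketch of the exploratory step the authors describe (maximum-likelihood extraction with oblique oblimin rotation), using the third-party factor_analyzer package and synthetic stand-in data; the package and its API are assumptions on my part, not part of the study.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package, assumed installed

# Synthetic stand-in for the neurocognitive scores (n x p matrix)
rng = np.random.default_rng(0)
scores = pd.DataFrame(rng.normal(size=(500, 10)),
                      columns=[f"test_{i}" for i in range(10)])

# Maximum-likelihood extraction with oblique (oblimin) rotation, mirroring
# the combination the study found most adequate for data of this kind
fa = FactorAnalyzer(n_factors=3, method="ml", rotation="oblimin")
fa.fit(scores)
print(fa.loadings_)              # pattern matrix
print(fa.get_factor_variance())  # variance explained per factor
```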

  10. Uranium carbide fission target R&D for RIA - an update

    NASA Astrophysics Data System (ADS)

    Greene, J. P.; Levand, A.; Nolen, J.; Burtseva, T.

    2004-12-01

    For the Rare Isotope Accelerator (RIA) facility, ISOL targets employing refractory compounds of uranium are being developed to produce radioactive ions for post-acceleration. The availability of refractory uranium compounds in forms that have good thermal conductivity, relatively high density, and adequate release properties for short-lived isotopes remains an important issue. Investigations using commercially obtained uranium carbide material and prepared into targets involving various binder materials have been carried out at ANL. Thin sample pellets have been produced for measurements of thermal conductivity using a new method based on electron bombardment with the thermal radiation observed using a two-color optical pyrometer and performed on samples as a function of grain size, pressing pressure and sintering temperature. Manufacture of uranium carbide powder has now been achieved at ANL. Simulations have been carried out on the thermal behavior of the secondary target assembly incorporating various heat shield configurations.

  11. 78 FR 8961 - Special Conditions: Embraer S.A., Model EMB-550 Airplane; Hydrophobic Coatings in Lieu of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-07

    ..., airflow over the windshield may be necessary to remove moisture, but may not be adequate to maintain a... be necessary to remove moisture from the windshield, may not be adequate to maintain a sufficiently... dependent on water droplet size for effective precipitation removal. For example, precipitation in the form...

  12. Effects of pre-analytical variables on flow cytometric diagnosis of canine lymphoma: A retrospective study (2009-2015).

    PubMed

    Comazzi, S; Cozzi, M; Bernardi, S; Zanella, D R; Aresu, L; Stefanello, D; Marconato, L; Martini, V

    2018-02-01

    Flow cytometry (FC) is increasingly being used for immunophenotyping and staging of canine lymphoma. The aim of this retrospective study was to assess pre-analytical variables that might influence the diagnostic utility of FC of lymph node (LN) fine needle aspirate (FNA) specimens from dogs with lymphoproliferative diseases. The study included 987 cases with LN FNA specimens sent for immunophenotyping that were submitted to a diagnostic laboratory in Italy from 2009 to 2015. Cases were grouped into 'diagnostic' and 'non-diagnostic'. Pre-analytical factors analysed by univariate and multivariate analyses were animal-related factors (breed, age, sex, size), operator-related factors (year, season, shipping method, submitting veterinarian) and sample-related factors (type of sample material, cellular concentration, cytological smears, artefacts). The submitting veterinarian, sample material, sample cellularity and artefacts affected the likelihood of having a diagnostic sample. The availability of specimens from different sites and of cytological smears increased the odds of obtaining a diagnostic result. Major artefacts affecting diagnostic utility included poor cellularity and the presence of dead cells. Flow cytometry on LN FNA samples yielded conclusive results in more than 90% of cases with adequate sample quality and sampling conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. The Cost of Class Size Reduction: Advice for Policymakers. RAND Graduate School Dissertation.

    ERIC Educational Resources Information Center

    Reichardt, Robert E.

    This dissertation provides information to state-level policymakers that will help them avoid two implementation problems seen in the past in California's class-size-reduction (CSR) reform. The first problem was that flat, per student reimbursement did not adequately cover costs in districts with larger pre-CSR class-sizes or smaller schools. The…

  14. A novel synthesis of a new thorium (IV) metal organic framework nanostructure with well controllable procedure through ultrasound assisted reverse micelle method.

    PubMed

    Sargazi, Ghasem; Afzali, Daryoush; Mostafavi, Ali

    2018-03-01

    Reverse micelle (RM) and ultrasound assisted reverse micelle (UARM) were applied to the synthesis of novel thorium nanostructures as metal organic frameworks (MOFs). Characterization with different techniques showed that the Th-MOF sample synthesized by UARM method had higher thermal stability (354°C), smaller mean particle size (27nm), and larger surface area (2.02×10 3 m 2 /g). Besides, in this novel approach, the nucleation of crystals was found to carry out in a shorter time. The synthesis parameters of UARM method were designed by 2 k-1 factorial and the process control was systematically studied using analysis of variance (ANOVA) and response surface methodology (RSM). ANOVA showed that various factors, including surfactant content, ultrasound duration, temperature, ultrasound power, and interaction between these factors, considerably affected different properties of the Th-MOF samples. According to the 2 k-1 factorial design, the determination coefficient (R 2 ) of the model is 0.999, with no significant lack of fit. The F value of 5432, implied that the model was highly significant and adequate to represent the relationship between the responses and the independent variables, also the large R-adjusted value indicates a good relationship between the experimental data and the fitted model. RSM predicted that it would be possible to produce Th-MOF samples with the thermal stability of 407°C, mean particle size of 13nm, and surface area of 2.20×10 3 m 2 /g. The mechanism controlling the Th-MOF properties was considerably different from the conventional mechanisms. Moreover, the MOF sample synthesized using UARM exhibited higher capacity for nitrogen adsorption as a result of larger pore sizes. It is believed that the UARM method and systematic studies developed in the present work can be considered as a new strategy for their application in other nanoscale MOF samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Are fixed grain size ratios useful proxies for loess sedimentation dynamics? Experiences from Remizovka, Kazakhstan

    NASA Astrophysics Data System (ADS)

    Schulte, Philipp; Sprafke, Tobias; Rodrigues, Leonor; Fitzsimmons, Kathryn E.

    2018-04-01

    Loess-paleosol sequences (LPS) are sensitive terrestrial archives of past aeolian dynamics and paleoclimatic changes within the Quaternary. Grain size (GS) analysis is commonly used to interpret aeolian dynamics and climate influences on LPS, based on granulometric parameters such as specific GS classes, ratios of GS classes and statistical manipulation of GS data. However, the GS distribution of a loess sample is not solely a function of aeolian dynamics; rather, complex polygenetic depositional and post-depositional processes must be taken into account. This study assesses the reliability of fixed GS ratios as proxies for past sedimentation dynamics using the case study of Remizovka in southeast Kazakhstan. Continuous sampling of the upper 8 m of the profile, which shows extremely weak pedogenic alteration and is therefore dominated by primary aeolian activity, indicates that fixed GS ratios do not adequately serve as proxies for loess sedimentation dynamics. We find, through the calculation of single-value parameters, that "true" variations within sensitive GS classes are masked by relative changes of the more frequent classes. Heatmap signatures provide the visualization of GS variability within LPS without significant data loss within the measured classes of a sample, or across all measured samples. We also examine the effect of two different commonly used laser diffraction devices on GS ratio calculation by duplicate measurements, with the Beckman Coulter (LS13320) and a Malvern Mastersizer Hydro (MM2000), as well as the applicability and significance of the so-called "twin peak ratio" previously developed on samples from the same section. The LS13320 provides higher resolution results than the MM2000; nevertheless, the GS ratios related to variations in the silt-sized fraction were comparable. However, we could not detect a twin peak within the coarse silt as detected in the original study using the same device. Our GS measurements differ from previous work at Remizovka in several instances, calling into question the interpretation of paleoclimatic implications using GS data alone.

  16. Continuing education: online monitoring of haemodialysis dose.

    PubMed

    Vartia, Aarne

    2018-01-25

    Kt/V urea reflects the efficacy of haemodialysis scaled to patient size (urea distribution volume). The guidelines recommend monthly Kt/V measurements based on blood samples. Modern haemodialysis machines are equipped with accessories monitoring the dose online at every session, without extra costs, blood samples or computers. To describe the principles, devices, benefits and shortcomings of online monitoring of haemodialysis dose. A critical literature overview and discussion. UV absorbance methods measure Kt/V, while ionic dialysance yields Kt (the product of clearance and treatment time, i.e. the cleared volume without scaling to patient size). Both are easy and useful methods, but comparison is difficult due to problems in scaling the dialysis dose to the patient's size. The best dose estimation method is the one that predicts quality of life and survival most accurately. There is some evidence on the predictive value of ionic dialysance Kt, but more documentation is required on the UV method. Online monitoring is a useful tool in everyday quality assurance, but blood samples are still required for more accurate kinetic modelling. After reading this article the reader should be able to: Understand the elements of the Kt/V equation for dialysis dose. Compare and contrast different methods of measurement of dialysis dose. Reflect on the importance of adequate dialysis dose for patient survival and life quality. © 2018 European Dialysis and Transplant Nurses Association/European Renal Care Association.
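    For contrast with online monitoring, the blood-sample route the guidelines rely on is commonly the second-generation Daugirdas single-pool Kt/V formula; a sketch, with illustrative variable names and inputs:

```python
import math

def sp_ktv(pre_urea, post_urea, t_hours, uf_litres, weight_kg):
    """Single-pool Kt/V from pre/post blood urea via the second-generation
    Daugirdas formula (the usual blood-sample-based estimate)."""
    r = post_urea / pre_urea
    return -math.log(r - 0.008 * t_hours) + (4 - 3.5 * r) * uf_litres / weight_kg

# 4-hour session, urea falling from 25 to 8 mmol/L, 2 L ultrafiltration, 70 kg
print(sp_ktv(pre_urea=25, post_urea=8, t_hours=4, uf_litres=2, weight_kg=70))
```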

  17. Study of water-oil emulsion combustion in large pilot power plants for fine particle matter emission reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allouis, C.; Beretta, F.; L'Insalata, A.

    2007-04-15

    The combustion of heavy fuel oil for power generation is a great source of carbonaceous and inorganic particle emissions, even though combustion technologies and their efficiency are improving. The information about the size distribution function of the particles originated by trace metals present in the fuels is not adequate. In this paper, we focused our attention on the influence of oil-water emulsion on the larger distribution mode of both the carbonaceous and metallic particles. Isokinetic sampling was performed at the exhausts of flames of a low-sulphur-content heavy oil and its emulsion with water, produced in two large pilot plants. The samples were size-segregated by means of an 8-stage Andersen impactor. Further investigation performed on the samples using scanning electron microscopy (SEM) coupled with X-ray analysis (EDX) evidenced the presence of solid spherical particles (plerospheres) with typical dimensions ranging between 200 nm and 2-3 μm, whose atomic composition contains a large amount of the trace metals present in the parent oils (Fe, V, Ni, etc.). EDX analyses revealed that the metal concentration increases as the plerosphere dimension decreases. We also observed that the use of emulsion slightly reduces the emission of fine particles (D₅₀ < 8 μm) in the large-scale plant.

  18. Mercury in fishes from Wrangell-St. Elias National Park and Preserve, Alaska

    USGS Publications Warehouse

    Kowalski, Brandon M.; Willacker, James J.; Zimmerman, Christian E.; Eagles-Smith, Collin A.

    2014-01-01

    In this study, mercury (Hg) concentrations were examined in fishes from Wrangell-St. Elias National Park and Preserve, Alaska, the largest and one of the most remote units in the national park system. The goals of the study were to (1) examine the distribution of Hg in select lakes of Wrangell-St. Elias National Park and Preserve; (2) evaluate the differences in Hg concentrations among fish species and with fish age and size; and (3) assess the potential ecological risks of Hg to park fishes, wildlife, and human consumers by comparing Hg concentrations to a series of risk benchmarks. Total Hg concentrations ranged from 17.9 to 616.4 nanograms per gram wet weight (ng/g ww), with a mean (± standard error) of 180.0 ±17.9 across the 83 individuals sampled. Without accounting for the effects of size, Hg concentrations varied by a factor of 10.9 across sites and species. After accounting for the effects of size, Hg concentrations were even more variable, differing by a factor of as much as 13.2 within a single species sampled from two lakes. Such inter-site variation suggests that site characteristics play an important role in determining fish Hg concentrations and that more intensive sampling may be necessary to adequately characterize Hg contamination in the park. Size-normalized Hg concentrations also differed among three species sampled from Tanada Lake, and Hg concentrations were strongly correlated with age. Furthermore, potential risks to park fish, wildlife, and human users were variable across lakes and species. Although no fish from two of the lakes studied (Grizzly Lake and Summit Lake) had Hg concentrations exceeding any of the benchmarks used, concentrations in Copper Lake and Tanada Lake exceeded conservative benchmarks for bird (90 ng/g ww in whole-body) and human (150 ng/g ww in muscle) consumption. In Tanada Lake, concentrations in most fishes also exceeded benchmarks for risk to moderate- and low-sensitivity avian consumers (180 and 270 ng/g ww in whole-body, respectively), as well as the concentration at which Alaska State guidelines suggest at-risk groups limit fish consumption to 3 meals per week (320 ng/g). However, the relationship between Hg concentrations and fish size in Tanada Lake suggests that consumption of smaller-sized fishes could reduce Hg exposure in human consumers.
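
    An illustrative sketch of the size-normalization step the report describes: fit the Hg-length relationship within each lake and compare concentrations predicted at a common length. The data, the 400 mm standard length, and the log-linear model are assumptions, not the study's values.

        import numpy as np

        def size_normalized_hg(length_mm, hg_ng_g, standard_length=400.0):
            # log10(Hg) vs. length, a common model for Hg-size relationships
            slope, intercept = np.polyfit(length_mm, np.log10(hg_ng_g), 1)
            return 10 ** (intercept + slope * standard_length)

        lake_a = size_normalized_hg(np.array([310, 355, 420, 500]),
                                    np.array([60, 95, 150, 300]))
        lake_b = size_normalized_hg(np.array([300, 360, 430, 510]),
                                    np.array([25, 35, 60, 90]))
        print(f"Hg at 400 mm: {lake_a:.0f} vs {lake_b:.0f} ng/g ww")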

  19. THE EFFECTIVENESS OF QUADRATS FOR MEASURING VASCULAR PLANT DIVERSITY

    EPA Science Inventory

    Quadrats are widely used for measuring characteristics of vascular plant communities. It is well recognized that quadrat size affects measurements of frequency and cover. The ability of quadrats of varying sizes to adequately measure diversity has not been established. An exha...

  20. Myocardial Infarct Size by CMR in Clinical Cardioprotection Studies: Insights From Randomized Controlled Trials.

    PubMed

    Bulluck, Heerajnarain; Hammond-Haley, Matthew; Weinmann, Shane; Martinez-Macias, Roberto; Hausenloy, Derek J

    2017-03-01

    The aim of this study was to review randomized controlled trials (RCTs) using cardiac magnetic resonance (CMR) to assess myocardial infarct (MI) size in reperfused patients with ST-segment elevation myocardial infarction (STEMI). There is limited guidance on the use of CMR in clinical cardioprotection RCTs in patients with STEMI treated by primary percutaneous coronary intervention. All RCTs in which CMR was used to quantify MI size in patients with STEMI treated with primary percutaneous coronary intervention were identified and reviewed. Sixty-two RCTs (10,570 patients, January 2006 to November 2016) were included. One-third did not report CMR vendor or scanner strength, the contrast agent and dose used, and the MI size quantification technique. Gadopentetate dimeglumine was most commonly used, followed by gadoterate meglumine and gadobutrol at 0.20 mmol/kg each, with late gadolinium enhancement acquired at 10 min; in most RCTs, MI size was quantified manually, followed by the 5 standard deviation threshold; dropout rates were 9% for acute CMR only and 16% for paired acute and follow-up scans. Weighted mean acute and chronic MI sizes (≤12 h, initial TIMI [Thrombolysis in Myocardial Infarction] flow grade 0 to 3) from the control arms were 21 ± 14% and 15 ± 11% of the left ventricle, respectively, and could be used for future sample-size calculations. Pre-selecting patients most likely to benefit from the cardioprotective therapy (≤6 h, initial TIMI flow grade 0 or 1) reduced sample size by one-third. Other suggested recommendations for standardizing CMR in future RCTs included gadobutrol at 0.15 mmol/kg with late gadolinium enhancement at 15 min, manual or 6-SD threshold for MI quantification, performing acute CMR at 3 to 5 days and follow-up CMR at 6 months, and adequate reporting of the acquisition and analysis of CMR. There is significant heterogeneity in RCT design using CMR in patients with STEMI. The authors provide recommendations for standardizing the assessment of MI size using CMR in future clinical cardioprotection RCTs. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
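
    A hedged sketch of how the pooled control-arm figures could feed a future sample-size calculation, as the abstract suggests: a standard two-sample formula using the acute MI size of 21 ± 14% of the left ventricle. The 5%-of-LV target effect, alpha, and power are assumptions for illustration.

        from scipy.stats import norm

        def n_per_arm(sd, delta, alpha=0.05, power=0.80):
            """Two-sample sample size per arm (normal approximation)."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return 2 * (z * sd / delta) ** 2

        # detect a 5%-of-LV absolute reduction against 21 +/- 14%
        print(round(n_per_arm(sd=14, delta=5)))   # about 123 per arm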

  1. Puma (Puma concolor) epididymal sperm morphometry

    PubMed Central

    Cucho, Hernán; Alarcón, Virgilio; Ordóñez, César; Ampuero, Enrique; Meza, Aydee; Soler, Carles

    2016-01-01

    The Andean puma (Puma concolor) has not been widely studied, particularly in reference to its semen characteristics. The aim of the present study was to define the morphometry of puma sperm heads and classify their subpopulations by cluster analysis. Samples were recovered postmortem from two epididymides from one animal and prepared for morphological observation after staining with the Hemacolor kit. Morphometric data were obtained from 581 spermatozoa using a CASA-Morph system, rendering 13 morphometric parameters. The principal component (PC) analysis was performed followed by cluster analysis for the establishment of subpopulations. Two PC components were obtained, the first related to size and the second to shape. Three subpopulations were observed, corresponding to elongated and intermediate-size sperm heads and acrosomes, to large heads with large acrosomes, and to small heads with short acrosomes. In conclusion, puma spermatozoa showed no uniform sperm morphology but three clear subpopulations. These results should be used for future work in the establishment of an adequate germplasm bank of this species. PMID:27678466

  2. Puma (Puma concolor) epididymal sperm morphometry.

    PubMed

    Cucho, Hernán; Alarcón, Virgilio; Ordóñez, César; Ampuero, Enrique; Meza, Aydee; Soler, Carles

    2016-01-01

    The Andean puma (Puma concolor) has not been widely studied, particularly in reference to its semen characteristics. The aim of the present study was to define the morphometry of puma sperm heads and classify their subpopulations by cluster analysis. Samples were recovered postmortem from two epididymides from one animal and prepared for morphological observation after staining with the Hemacolor kit. Morphometric data were obtained from 581 spermatozoa using a CASA-Morph system, rendering 13 morphometric parameters. The principal component (PC) analysis was performed followed by cluster analysis for the establishment of subpopulations. Two PC components were obtained, the first related to size and the second to shape. Three subpopulations were observed, corresponding to elongated and intermediate-size sperm heads and acrosomes, to large heads with large acrosomes, and to small heads with short acrosomes. In conclusion, puma spermatozoa showed no uniform sperm morphology but three clear subpopulations. These results should be used for future work in the establishment of an adequate germplasm bank of this species.
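
    A minimal sketch of the analysis pipeline both records describe: PCA on the 13 morphometric parameters, then clustering of the component scores into subpopulations. Synthetic data stand in for the 581 CASA-Morph measurements, and k-means is used here for brevity; the study's exact clustering algorithm is not specified in the abstract.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        X = rng.normal(size=(581, 13))     # 13 morphometric parameters

        scores = PCA(n_components=2).fit_transform(
            StandardScaler().fit_transform(X))   # PC1 ~ size, PC2 ~ shape
        labels = KMeans(n_clusters=3, n_init=10,
                        random_state=0).fit_predict(scores)
        for k in range(3):
            print(f"subpopulation {k}: n = {np.sum(labels == k)}")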

  3. Determination of the cumulus size distribution from LANDSAT pictures

    NASA Technical Reports Server (NTRS)

    Karg, E.; Mueller, H.; Quenzel, H.

    1983-01-01

    Varying insolation causes undesirable thermal stress to the receiver of a solar power plant. The rapid change of insolation depends on the size distribution of the clouds; in order to quantify these changes, it is useful to determine typical cumulus size distributions, and LANDSAT images are adequate for this purpose. Several examples of cumulus size distributions are presented and their effects on the operation of a solar power plant are discussed.

  4. The peculiar behavior of the glass transition temperature of amorphous drug-polymer films coated on inert sugar spheres.

    PubMed

    Dereymaker, Aswin; Van Den Mooter, Guy

    2015-05-01

    Fluid bed coating has been proposed in the past as an alternative technology for manufacturing of drug-polymer amorphous solid dispersions, or so-called glass solutions. It has the advantage of being a one-step process, and thus omitting separate drying steps, addition of excipients, or manipulation of the dosage form. In search of an adequate sample preparation method for modulated differential scanning calorimetry analysis of beads coated with glass solutions, glass transition broadening and decrease of the glass transition temperature (Tg) were observed with increasing particle size of crushed coated beads and crushed isolated films of indomethacin (INDO) and polyvinylpyrrolidone (PVP). Substituting INDO with naproxen gave comparable results. When ketoconazole was probed or the solvent in INDO-PVP films was switched to dichloromethane (DCM) or a methanol-DCM mixture, two distinct Tg regions were observed. Small particle sizes had a glass transition in the high Tg region, and large particle sizes had a glass transition in the low Tg region. This particle size-dependent glass transition was ascribed to different residual solvent amounts in the bulk and at the surface of the particles. A correlation was observed between the deviation of the Tg from that calculated from the Gordon-Taylor equation and the amount of residual solvent at the Tg of particles with different sizes. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
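
    The Gordon-Taylor reference value the abstract mentions is straightforward to compute. The Tg inputs and the K constant below are rounded, assumed values for an INDO-PVP mixture, used only to show the calculation.

        def gordon_taylor(w1, tg1, tg2, k):
            """Tg (K) of a binary amorphous mixture; w1 = weight fraction of drug."""
            w2 = 1.0 - w1
            return (w1 * tg1 + k * w2 * tg2) / (w1 + k * w2)

        # e.g. indomethacin (Tg ~ 315 K) in PVP (Tg ~ 440 K), assumed K = 0.4
        for w_drug in (0.3, 0.5, 0.7):
            print(w_drug, round(gordon_taylor(w_drug, 315, 440, 0.4), 1))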

  5. Optimal detection pinhole for lowering speckle noise while maintaining adequate optical sectioning in confocal reflectance microscopes

    PubMed Central

    Rajadhyaksha, Milind

    2012-01-01

    Coherent speckle influences the resulting image when narrow spectral line-width and single spatial mode illumination are used, though these are the same light-source properties that provide the best radiance-to-cost ratio. However, a suitable size of the detection pinhole can be chosen to maintain adequate optical sectioning while making the probability density of the speckle noise more normal and reducing its effect. The result is a qualitatively better image with improved contrast, which is easier to read. With theoretical statistics and experimental results, we show that the detection pinhole size is a fundamental parameter for designing imaging systems for use in turbid media. PMID:23224184

  6. Fit Assessment of N95 Filtering-Facepiece Respirators in the U.S. Centers for Disease Control and Prevention Strategic National Stockpile.

    PubMed

    Bergman, Michael; Zhuang, Ziqing; Brochu, Elizabeth; Palmiero, Andrew

    National Institute for Occupational Safety and Health (NIOSH)-approved N95 filtering-facepiece respirators (FFR) are currently stockpiled by the U.S. Centers for Disease Control and Prevention (CDC) for emergency deployment to healthcare facilities in the event of a widespread emergency such as an influenza pandemic. This study assessed the fit of N95 FFRs purchased for the CDC Strategic National Stockpile. The study addresses the question of whether the fit achieved by specific respirator sizes relates to facial size categories as defined by two NIOSH fit test panels. Fit test data were analyzed from 229 test subjects who performed a nine-donning fit test on seven N95 FFR models using a quantitative fit test protocol. An initial respirator model selection process was used to determine if the subject could achieve an adequate fit on a particular model; subjects then tested the adequately fitting model for the nine-donning fit test. Only data for models which provided an adequate initial fit (through the model selection process) for a subject were analyzed for this study. For the nine-donning fit test, six of the seven respirator models accommodated the fit of subjects (as indicated by geometric mean fit factor > 100) for not only the intended NIOSH bivariate and PCA panel sizes corresponding to the respirator size, but also for other panel sizes which were tested for each model. The model which showed poor performance may not be accurately represented because only two subjects passed the initial selection criteria to use this model. Findings are supportive of the current selection of facial dimensions for the new NIOSH panels. The various FFR models selected for the CDC Strategic National Stockpile provide a range of sizing options to fit a variety of facial sizes.

  7. Fit Assessment of N95 Filtering-Facepiece Respirators in the U.S. Centers for Disease Control and Prevention Strategic National Stockpile

    PubMed Central

    Bergman, Michael; Zhuang, Ziqing; Brochu, Elizabeth; Palmiero, Andrew

    2016-01-01

    National Institute for Occupational Safety and Health (NIOSH)-approved N95 filtering-facepiece respirators (FFR) are currently stockpiled by the U.S. Centers for Disease Control and Prevention (CDC) for emergency deployment to healthcare facilities in the event of a widespread emergency such as an influenza pandemic. This study assessed the fit of N95 FFRs purchased for the CDC Strategic National Stockpile. The study addresses the question of whether the fit achieved by specific respirator sizes relates to facial size categories as defined by two NIOSH fit test panels. Fit test data were analyzed from 229 test subjects who performed a nine-donning fit test on seven N95 FFR models using a quantitative fit test protocol. An initial respirator model selection process was used to determine if the subject could achieve an adequate fit on a particular model; subjects then tested the adequately fitting model for the nine-donning fit test. Only data for models which provided an adequate initial fit (through the model selection process) for a subject were analyzed for this study. For the nine-donning fit test, six of the seven respirator models accommodated the fit of subjects (as indicated by geometric mean fit factor > 100) for not only the intended NIOSH bivariate and PCA panel sizes corresponding to the respirator size, but also for other panel sizes which were tested for each model. The model which showed poor performance may not be accurately represented because only two subjects passed the initial selection criteria to use this model. Findings are supportive of the current selection of facial dimensions for the new NIOSH panels. The various FFR models selected for the CDC Strategic National Stockpile provide a range of sizing options to fit a variety of facial sizes. PMID:26877587
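
    A small sketch of the pass criterion quoted in both records: the geometric mean of the fit factors must exceed 100. The nine donning values are invented.

        import math

        def geometric_mean(values):
            return math.exp(sum(math.log(v) for v in values) / len(values))

        fit_factors = [85, 120, 200, 150, 95, 310, 140, 180, 110]
        gmff = geometric_mean(fit_factors)
        print(f"GM fit factor = {gmff:.0f} -> "
              f"{'adequate' if gmff > 100 else 'inadequate'}")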

  8. Validation of internal controls for extraction and amplification of nucleic acids from enteric viruses in water samples.

    PubMed

    Hata, Akihiko; Katayama, Hiroyuki; Kitajima, Masaaki; Visvanathan, Chettiyappan; Nol, Chea; Furumai, Hiroaki

    2011-07-01

    Inhibitors that reduce viral nucleic acid extraction efficiency and interfere with cDNA synthesis and/or polymerase activity affect the molecular detection of viruses in aquatic environments. To overcome these significant problems, we developed a methodology for assessing nucleic acid yields and DNA amplification efficiencies for environmental water samples. This involved adding particles of adenovirus type 5 and murine norovirus, and newly developed primer-sharing controls, which are amplified with the same primer pairs and result in the same amplicon sizes as the targets, to these samples. We found that nucleic acid loss during the extraction process, rather than reverse transcription-PCR (RT-PCR) inhibition, contributed more significantly to underestimation of the presence of viral genomes in the environmental water samples tested in this study. Our success rate for satisfactorily amplifying viral RNAs and DNAs by RT-PCR was higher than that for obtaining adequate nucleic acid preparations. We found that inhibitory properties were greatest when we used larger sample volumes. A magnetic silica bead-based RNA extraction method effectively removed inhibitors that interfere with viral nucleic acid extraction and RT-PCR. To our knowledge, this is the first study to assess the inhibitory properties of environmental water samples by using both control virus particles and primer-sharing controls.

  9. Defensive platform size and survivability. [Platform survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canavan, Gregory H.

    1988-06-01

    This report discusses the survivability of space platforms, concentrating on space-based kinetic energy interceptors. It evaluates the efficacy of hardening, maneuver, self-defense, and deception in extending the survivability of platforms of varying sizes to expected threats, concluding that they should be adequate in the near and mid terms.

  10. Financing Class Size Reduction

    ERIC Educational Resources Information Center

    Achilles, C. M.

    2005-01-01

    Class size reduction has been shown to, among other things, improve academic achievement for all students and particularly for low-income and minority students. With the No Child Left Behind Act's heavy emphasis on scientifically based research, adequate yearly progress, and disaggregated results, one wonders why all children aren't enrolled in…

  11. A Bayesian Nonparametric Meta-Analysis Model

    ERIC Educational Resources Information Center

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  12. Reversed-phase HPLC analysis of levetiracetam in tablets using monolithic and conventional C18 silica columns.

    PubMed

    Can, Nafiz O; Arli, Goksel

    2010-01-01

    Development and validation of an RP-HPLC method for determination of levetiracetam in pharmaceutical tablets is described. The separation and quantification of levetiracetam and caffeine (internal standard) were performed using a single analytical procedure with two different types of stationary phases: conventional Phenomenex Gemini C18 (100 x 4.6 mm, 5 µm) and Merck Chromolith Performance RP18e (100 x 4.6 mm, macropore size 2 µm, micropore size 13 nm) monolithic silica. Five-microliter aliquots of samples were injected into the system and eluted using a water-acetonitrile (90 + 10, v/v) mobile phase pumped at a rate of 1 mL/min. The analyte peaks were detected at 200 nm using a diode array detector with adequate resolution. Validation studies were performed using the method recommended by the International Conference on Harmonization, the U.S. Pharmacopeia, and AOAC INTERNATIONAL, which includes accuracy, precision, range, limits, robustness, and system suitability parameters. Levetiracetam and caffeine were detected in about 7 min using the conventional column, whereas less than 5 min was required when the monolithic column was used. Calibration plots had r values close to unity in the range of 0.8-8.0 µg/mL. Assay of levetiracetam in a tablet formulation was demonstrated as an application to real samples.

  13. Rationale and design of the IMPACT EU-trial: improve management of heart failure with procalcitonin biomarkers in cardiology (BIC)-18.

    PubMed

    Möckel, Martin; Slagman, Anna; Vollert, Jörn Ole; Ebmeyer, Stefan; Wiemer, Jan C; Searle, Julia; Giannitsis, Evangelos; Kellum, John A; Maisel, Alan

    2018-02-01

    To evaluate the effectiveness of procalcitonin (PCT)-guided antibiotic treatment compared to current treatment practice in reducing 90-day all-cause mortality in emergency patients with shortness of breath (SOB) and suspected acute heart failure (AHF). Concomitant AHF and lower respiratory tract (or other bacterial) infection in emergency patients with dyspnea are common and can be difficult to diagnose. Early and adequate initiation of antibiotic therapy (ABX) significantly improves patient outcome, but superfluous prescription of ABX may be harmful. In a multicentre, prospective, randomized, controlled process trial with an open intervention, adult emergency patients with SOB and increased levels of natriuretic peptides will be randomized to either a standard-care group or a PCT-guided group with respect to the initiation of antibiotic treatment. In the PCT-guided group, the initiation of antibiotic therapy is based on the results of acute PCT measurements at admission, using a cut-off of 0.2 ng/ml. A two-stage sample-size adaptive design is used; an interim analysis was done after completion of 50% of patients and the final sample size remained unchanged. The primary endpoint is 90-day all-cause mortality. The current study will provide evidence as to whether the routine use of PCT in patients with suspected AHF improves outcome.

  14. Quantification of prairie restoration for phytostability at a remediated defense plant.

    PubMed

    Franson, Raymond L; Scholes, Chad M

    2011-01-01

    In June 2008 and 2009, cover, density, and species diversity were measured on two areas of the prairie at the U.S. Department of Energy Weldon Spring Site to begin quantification of prairie establishment and the effects of a prairie burn. Sampling began by testing for the most appropriate transect length (cover) and quadrat size (density) for quantification of vegetation. Total cover increased in the first growing season after burning. Conversely, total cover decreased in the unburned area in one year. The trend in litter cover is the opposite, with litter decreasing after burning but increasing in one year in the unburned area. Bare ground decreased in one year in the unburned area, but was unchanged after burning. Species diversity tripled after fire, but was unchanged in one year in the unburned area. The results show that litter and fire both affect plant cover. If land reclamation activities are to be an integral part of hazardous waste remediation at contaminated sites, then the success of reclamation efforts needs to be quantified along with success criteria for waste remediation of the sites. The results show that plant cover can be easily quantified, but that density measures are more biased, which makes it more difficult to achieve an adequate sample size for plant density.

  15. Exploring how to increase response rates to surveys of older people.

    PubMed

    Palonen, Mira; Kaunonen, Marja; Åstedt-Kurki, Päivi

    2016-05-01

    To address the special considerations that need to be taken into account when collecting data from older people in healthcare research. An objective of all research studies is to ensure an adequate sample size. The final sample size will be influenced by methods of recruitment and data collection, among other factors. There are some special considerations that need to be addressed when collecting data among older people. Quantitative surveys of people aged 60 or over in 2009-2014 were analysed using statistical methods. A quantitative study of patients aged 75 or over in an emergency department was used as an example. A methodological approach to analysing quantitative studies concerned with older people was adopted. The best way to ensure high response rates in surveys involving people aged 60 or over is to collect data in the presence of the researcher; response rates are lowest in posted surveys and settings where the researcher is not present when data are collected. Response rates do not seem to vary according to the database from which information about the study participants is obtained or according to who is responsible for recruitment to the survey. Implications for research/practice: To conduct coherent studies with older people, the data collection process should be carefully considered.

  16. Evaluating test-retest reliability in patient-reported outcome measures for older people: A systematic review.

    PubMed

    Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju

    2018-03-01

    This study aimed to evaluate the components of test-retest reliability, including time interval, sample size, and statistical methods, used in patient-reported outcome measures in older people and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. This systematic review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic review published by the National Evidence-based Healthcare Collaborating Agency in Korea. The methodological quality was assessed by the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 studies presented models for ICC calculations and 30 studies reported 95% confidence intervals of the ICCs. Additional analyses using 17 studies that reported a strong ICC (>0.9) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items. In particular, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the studies that report the results of test-retest reliability. Copyright © 2017 Elsevier Ltd. All rights reserved.
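
    A hedged sketch of the statistic the review found most common: a two-way random-effects, single-measure ICC computed from a subjects-by-sessions score matrix. The model choice (ICC(2,1)) and the data are assumptions; as the review notes, real reports should state the model and its 95% CI.

        import numpy as np

        def icc_2_1(scores):
            """ICC(2,1) from an (n subjects) x (k sessions) matrix."""
            n, k = scores.shape
            grand = scores.mean()
            ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            resid = (scores - scores.mean(axis=1, keepdims=True)
                     - scores.mean(axis=0, keepdims=True) + grand)
            ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                         + k * (ms_cols - ms_err) / n)

        test_retest = np.array([[8, 7], [5, 6], [9, 9],
                                [3, 4], [7, 6], [6, 6]])
        print(round(icc_2_1(test_retest), 2))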

  17. Choosing a design to fit the situation: how to improve specificity and positive predictive values using Bayesian lot quality assurance sampling

    PubMed Central

    Olives, Casey; Pagano, Marcello

    2013-01-01

    Background: Lot Quality Assurance Sampling (LQAS) is a provably useful tool for monitoring health programmes. Although LQAS ensures acceptable Producer and Consumer risks, the literature alleges that the method suffers from poor specificity and positive predictive values (PPVs). We suggest that poor LQAS performance is due, in part, to variation in the true underlying distribution. However, until now the role of the underlying distribution in expected performance has not been adequately examined. Methods: We present Bayesian-LQAS (B-LQAS), an approach to incorporating prior information into the choice of the LQAS sample size and decision rule, and explore its properties through a numerical study. Additionally, we analyse vaccination coverage data from UNICEF's State of the World's Children in 1968–1989 and 2008 to exemplify the performance of LQAS and B-LQAS. Results: Our numerical study shows that the choice of LQAS sample size and decision rule is sensitive to the distribution of prior information, as well as to individual beliefs about the importance of correct classification. Application of the B-LQAS approach to the UNICEF data improves specificity and PPV in both time periods (1968–1989 and 2008) with minimal reductions in sensitivity and negative predictive value. Conclusions: LQAS is shown to be a robust tool that is not necessarily prone to poor specificity and PPV as previously alleged. In situations where prior or historical data are available, B-LQAS can lead to improvements in expected performance. PMID:23378151
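
    For context, a sketch of the classical LQAS calculation that B-LQAS builds on: for a sample size n and decision rule d ("accept the area if more than d successes are observed"), the Producer and Consumer risks follow from the binomial distribution. The n = 19, d = 12 design and the 80%/50% thresholds are illustrative.

        from scipy.stats import binom

        n, d = 19, 12                    # sample size and decision rule
        p_upper, p_lower = 0.80, 0.50    # acceptable vs. unacceptable coverage

        producer_risk = binom.cdf(d, n, p_upper)      # rejecting a good area
        consumer_risk = 1 - binom.cdf(d, n, p_lower)  # accepting a bad area
        print(f"alpha = {producer_risk:.3f}, beta = {consumer_risk:.3f}")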

  18. Getting a scientific paper published in Epilepsia: an editor's perspective.

    PubMed

    Schwartzkroin, Philip A

    2013-11-01

    Getting a paper published in Epilepsia depends first and foremost on the quality of the work reported, and on the clarity and convincingness of the presentation. Papers should focus on important and interesting topics with clearly stated objectives and goals. The observations and findings are of greatest interest when they are novel and change our views on the mechanisms and/or treatment of an epileptic disease. Studies should be carefully designed to include adequate sample size, comparison groups, and statistical analyses. Critically, the data must be clearly presented and appropriately interpreted. If followed, these recommendations will improve an author's chances of having his/her paper accepted in a high quality journal like Epilepsia. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.

  19. First Surface-resolved Results with the Infrared Optical Telescope Array Imaging Interferometer: Detection of Asymmetries in Asymptotic Giant Branch Stars

    NASA Astrophysics Data System (ADS)

    Ragland, S.; Traub, W. A.; Berger, J.-P.; Danchi, W. C.; Monnier, J. D.; Willson, L. A.; Carleton, N. P.; Lacasse, M. G.; Millan-Gabet, R.; Pedretti, E.; Schloerb, F. P.; Cotton, W. D.; Townes, C. H.; Brewer, M.; Haguenauer, P.; Kern, P.; Labeye, P.; Malbet, F.; Malin, D.; Pearlman, M.; Perraut, K.; Souccar, K.; Wallace, G.

    2006-11-01

    We have measured nonzero closure phases for about 29% of our sample of 56 nearby asymptotic giant branch (AGB) stars, using the three-telescope Infrared Optical Telescope Array (IOTA) interferometer at near-infrared wavelengths (H band) and with angular resolutions in the range 5-10 mas. These nonzero closure phases can only be generated by asymmetric brightness distributions of the target stars or their surroundings. We discuss how these results were obtained and how they might be interpreted in terms of structures on or near the target stars. We also report measured angular sizes and hypothesize that most Mira stars would show detectable asymmetry if observed with adequate angular resolution.

  20. Designing a household survey to address seasonality in child care arrangements.

    PubMed

    Schmidt, Stefanie R; Wang, Kevin H; Sonenstein, Freya L

    2008-04-01

    In household telephone surveys, a long field period may be required to maximize the response rate and achieve adequate sample sizes. However, long field periods can be problematic when measures of seasonally affected behavior are sought. Surveys of child care use are one example because child care arrangements vary by season. Options include varying the questions posed about school-year and summer arrangements or posing retrospective questions about child care use for the school year only. This article evaluates the bias associated with the use of retrospective questions about school-year child care arrangements in the 1999 National Survey of America's Families. The authors find little evidence of bias and hence recommend that future surveys use the retrospective approach.

  1. Methodological Reporting Quality of Randomized Controlled Trials in 3 Leading Diabetes Journals From 2011 to 2013 Following CONSORT Statement: A System Review.

    PubMed

    Zhai, Xiao; Wang, Yiran; Mu, Qingchun; Chen, Xiao; Huang, Qin; Wang, Qijin; Li, Ming

    2015-07-01

    To appraise the current reporting methodological quality of randomized clinical trials (RCTs) in 3 leading diabetes journals. We systematically searched the literature for RCTs in Diabetes Care, Diabetes, and Diabetologia from 2011 to 2013. Characteristics were extracted based on the Consolidated Standards of Reporting Trials (CONSORT) statement. Generation of allocation, concealment of allocation, intention-to-treat (ITT) analysis, and handling of dropouts were defined as primary outcomes and "low risk of bias." Sample size calculation, type of intervention, country, number of patients, and funding source were also extracted and descriptively reported. Trials were compared among journals, study years, and other characteristics. A total of 305 RCTs were included in this study. One hundred eight (35.4%) trials reported adequate generation of allocation, 87 (28.5%) trials reported adequate concealment of allocation, 53 (23.8%) trials used ITT analysis, and 130 (58.3%) trials were adequate in handling of dropouts. Only 15 (4.9%) were "low risk of bias" trials. Studies at a large scale (n > 100) or from Europe presented with more "low risk of bias" trials than those at a small scale (n ≤ 100) or from other regions. No improvements were found across these 3 years. This study shows that the methodological reporting quality of RCTs in the major diabetes journals remains suboptimal. It can be further improved to meet and keep up with the standards of the CONSORT statement.

  2. Clonal growth: invasion or stability? A comparative study of clonal architecture and diversity in native and introduced lineages of Phragmites australis (Poaceae).

    PubMed

    Douhovnikoff, Vladimir; Hazelton, Eric L G

    2014-09-01

    • The characteristics of clonal growth that are advantageous in invasive plants can also result in native plants' ability to resist invasion. In Maine, we compared the clonal architecture and diversity of an invasive lineage (introduced Phragmites) and a noninvasive lineage (native Phragmites) present in much of North America. This study is the first on stand-scale diversity using a sample size and systematic spatial-sampling scheme adequate for characterizing clonal structure in Phragmites. Our questions included: (1) Does the structure and extent of clonal growth suggest that the potential for clonal growth contributes to the invasiveness of the introduced lineage? (2) Is clonal growth common in the native lineage, acting as a possible source of ecological resistance and resilience? • Microsatellite markers were used to measure clonal sizes, architecture, and diversity within each lineage in stands within four marshes in Maine. • Clonal diversity measures indicated that clonal growth was significantly greater in stands of the native lineage than in the introduced. While lineage was a consistent predictor of clonal diversity relative ranking, the marsh location was a much stronger predictor of the absolute range of these values. • Our results indicate an important role for clonal growth in the space consolidation of native Phragmites and could explain why the introduced lineage, with stronger competitive traits, has not replaced the native where they co-occur. These results with regard to clone size, size distributions, singleton occurrence, and clonal architecture provide some evidence for stand development that follows a genotypic initial floristics model. © 2014 Botanical Society of America, Inc.

  3. Sexual Functioning and Behavior of Men with Body Dysmorphic Disorder Concerning Penis Size Compared with Men Anxious about Penis Size and with Controls: A Cohort Study

    PubMed Central

    Veale, David; Miles, Sarah; Read, Julie; Troglia, Andrea; Wylie, Kevan; Muir, Gordon

    2015-01-01

    Introduction: Little is known about the sexual functioning and behavior of men anxious about the size of their penis and the means that they might use to try to alter the size of their penis. Aim: To compare sexual functioning and behavior in men with body dysmorphic disorder (BDD) concerning penis size, in men with small penis anxiety (SPA without BDD), and in a control group of men who do not have any concerns. Methods: An opportunistic sample of 90 men from the community were recruited and divided into three groups: BDD (n = 26); SPA (n = 31) and controls (n = 33). Main Outcome Measures: The Index of Erectile Function (IEF); sexual identity and history; and interventions to alter the size of their penis. Results: Men with BDD compared with controls had reduced erectile function, orgasmic function, intercourse satisfaction and overall satisfaction on the IEF. Men with SPA compared with controls had reduced intercourse satisfaction. There were no differences in sexual desire, the frequency of intercourse or masturbation across any of the three groups. Men with BDD and SPA were more likely than the controls to attempt to alter the shape or size of their penis (for example jelqing, vacuum pumps or stretching devices) with poor reported success. Conclusion: Men with BDD are more likely to have erectile dysfunction and less satisfaction with intercourse than controls but maintain their libido. Further research is required to develop and evaluate a psychological intervention for such men with adequate outcome measures. PMID:26468378

  4. Sexual Functioning and Behavior of Men with Body Dysmorphic Disorder Concerning Penis Size Compared with Men Anxious about Penis Size and with Controls: A Cohort Study.

    PubMed

    Veale, David; Miles, Sarah; Read, Julie; Troglia, Andrea; Wylie, Kevan; Muir, Gordon

    2015-09-01

    Little is known about the sexual functioning and behavior of men anxious about the size of their penis and the means that they might use to try to alter the size of their penis. To compare sexual functioning and behavior in men with body dysmorphic disorder (BDD) concerning penis size and in men with small penis anxiety (SPA without BDD) and in a control group of men who do not have any concerns. An opportunistic sample of 90 men from the community were recruited and divided into three groups: BDD (n = 26); SPA (n = 31) and controls (n = 33). The Index of Erectile Function (IEF), sexual identity and history; and interventions to alter the size of their penis. Men with BDD compared with controls had reduced erectile function, orgasmic function, intercourse satisfaction and overall satisfaction on the IEF. Men with SPA compared with controls had reduced intercourse satisfaction. There were no differences in sexual desire, the frequency of intercourse or masturbation across any of the three groups. Men with BDD and SPA were more likely than the controls to attempt to alter the shape or size of their penis (for example jelqing, vacuum pumps or stretching devices) with poor reported success. Men with BDD are more likely to have erectile dysfunction and less satisfaction with intercourse than controls but maintain their libido. Further research is required to develop and evaluate a psychological intervention for such men with adequate outcome measures.

  5. Factors Affecting the Presence of Adequately Iodized Salt at Home in Wolaita, Southern Ethiopia: Community Based Study.

    PubMed

    Kumma, Wondimagegn Paulos; Haji, Yusuf; Abdurahmen, Junayde; Mehretie Adinew, Yohannes

    2018-01-01

    Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97-7.01], and formal education of respondents [OR: 1.75; 95% CI: 1.14, 2.70] were found to be associated with the presence of adequately iodized salt at home. This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence here shows that there is a need to increase the supply of adequately iodized salt to meet the goal for monitoring progress towards sustainable elimination of IDD.

  6. Factors Affecting the Presence of Adequately Iodized Salt at Home in Wolaita, Southern Ethiopia: Community Based Study

    PubMed Central

    Abdurahmen, Junayde

    2018-01-01

    Background: Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. Methods: A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. Result: The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97–7.01], and formal education of respondents [OR: 1.75; 95% CI: 1.14, 2.70] were found to be associated with the presence of adequately iodized salt at home. Conclusion: This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence here shows that there is a need to increase the supply of adequately iodized salt to meet the goal for monitoring progress towards sustainable elimination of IDD. PMID:29765978

  7. Correlations of Apparent Cellulose Crystallinity Determined by XRD, NMR, IR, Raman, and SFG Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, David K; Lee, Christopher; Dazen, Kevin

    2015-07-04

    Although the cellulose crystallinity index (CI) is used widely, its limitations have not been adequately described. In this study, the CI values of a set of reference samples were determined from X-ray diffraction (XRD), nuclear magnetic resonance (NMR), and infrared (IR), Raman, and vibrational sum frequency generation (SFG) spectroscopies. The intensities of certain crystalline peaks in IR, Raman, and SFG spectra positively correlated with the amount of crystalline cellulose in the sample, but the correlation with XRD was nonlinear as a result of fundamental differences in detection sensitivity to crystalline cellulose and improper baseline corrections for amorphous contributions. It is demonstrated that the intensity and shape of the XRD signal is affected by both the amount of crystalline cellulose and crystal size, which makes XRD analysis complicated. It is clear that the methods investigated show the same qualitative trends for samples, but the absolute CI values differ depending on the determination method. This clearly indicates that the CI, as estimated by different methods, is not an absolute value and that for a given set of samples the CI values can be compared only as a qualitative measure.
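
    One common way an XRD crystallinity index is computed is the Segal peak-height method; whether these authors used Segal or a peak-deconvolution approach is not stated in the abstract, so the sketch below is a generic illustration with invented intensities.

        def segal_ci(i_200, i_am):
            """CI (%) from the (200) peak height and the amorphous minimum."""
            return 100.0 * (i_200 - i_am) / i_200

        print(round(segal_ci(i_200=1450.0, i_am=380.0), 1))   # 73.8 %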

  8. Correlations of Apparent Cellulose Crystallinity Determined by XRD, NMR, IR, Raman, and SFG Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Christopher M; Dazen, Kevin; Kafle, Kabindra

    2015-01-01

    Although the cellulose crystallinity index (CI) is used widely, its limitations have not been adequately described. In this study, the CI values of a set of reference samples were determined from X-ray diffraction (XRD), nuclear magnetic resonance (NMR), and infrared (IR), Raman, and vibrational sum frequency generation (SFG) spectroscopies. The intensities of certain crystalline peaks in IR, Raman, and SFG spectra positively correlated with the amount of crystalline cellulose in the sample, but the correlation with XRD was nonlinear as a result of fundamental differences in detection sensitivity to crystalline cellulose and improper baseline corrections for amorphous contributions. It is demonstrated that the intensity and shape of the XRD signal is affected by both the amount of crystalline cellulose and crystal size, which makes XRD analysis complicated. It is clear that the methods investigated show the same qualitative trends for samples, but the absolute CI values differ depending on the determination method. This clearly indicates that the CI, as estimated by different methods, is not an absolute value and that for a given set of samples the CI values can be compared only as a qualitative measure.

  9. Lake trout restoration in the Great Lakes: stock-size criteria for natural reproduction

    USGS Publications Warehouse

    Selgeby, James H.; Bronte, Charles R.; Brown, Edward H.; Hansen, Michael J.; Holey, Mark E.; VanAmberg, Jan P.; Muth, Kenneth M.; Makauskas, Daniel B.; Mckee, Patrick; Anderson, David M.; Ferreri, C. Paola; Schram, Stephen T.

    1995-01-01

    We examined the question of whether the lake trout restoration program in the Great Lakes has developed brood stocks of adequate size to sustain natural reproduction. Stock size criteria were developed from areas of the Great Lakes where natural reproduction has been successful (defined as detection of age-1 or older recruits by assessment fishing). We contrasted them with stocks in areas with no natural reproduction. Based on the relative abundance of spawners measured in the fall and the presence or absence of natural reproduction in 24 areas of the Great Lakes, we found three distinct sets of lake trout populations. In seven areas of successful natural reproduction, the catch-per-unit-effort (CPE) of spawners ranged from 17 to 135 fish/305 m of gillnet. Stock sizes in these areas were used as a gauge against which stocks in other areas were contrasted. We conclude that stock densities of 17-135 fish/305 m of gill net are adequate for natural reproduction, provided that all other requirements are met. No natural reproduction has been detected in seven other areas, where CPEs of spawners ranged from only 3 to 5 fish/305 m. We conclude that spawning stocks of only 3-5 fish/305 m of net are inadequate to develop measurable natural reproduction. Natural reproduction has also not been detected in ten areas where CPEs of spawners ranged from 43 to 195 fish/305 m of net. We conclude that spawning stocks in these ten areas were adequate to sustain natural reproduction, but that some factor other than parental stock size prevented recruitment of wild lake trout.

  10. Evaluation Studies of Robotic Rollators by the User Perspective: A Systematic Review.

    PubMed

    Werner, Christian; Ullrich, Phoebe; Geravand, Milad; Peer, Angelika; Hauer, Klaus

    2016-01-01

    Robotic rollators enhance the basic functions of established devices by technically advanced physical, cognitive, or sensory support to increase autonomy in persons with severe impairment. In the evaluation of such ambient assisted living solutions, both the technical and user perspectives are important to prove usability, effectiveness and safety, and to ensure adequate device application. The aim of this systematic review is to summarize the methodology of studies evaluating robotic rollators with focus on the user perspective and to give recommendations for future evaluation studies. A systematic literature search up to December 31, 2014, was conducted based on the Cochrane Review methodology using the electronic databases PubMed and IEEE Xplore. Articles were selected according to the following inclusion criteria: evaluation studies of robotic rollators documenting human-robot interaction, no case reports, published in English language. Twenty-eight studies were identified that met the predefined inclusion criteria. Large heterogeneity in the definitions of the target user group, study populations, study designs and assessment methods was found across the included studies. No generic methodology to evaluate robotic rollators could be identified. We found major methodological shortcomings related to insufficient sample descriptions and sample sizes, and lack of appropriate, standardized and validated assessment methods. Long-term use in habitual environment was also not evaluated. Apart from the heterogeneity, methodological deficits in most of the identified studies became apparent. Recommendations for future evaluation studies include: clear definition of target user group, adequate selection of subjects, inclusion of other assistive mobility devices for comparison, evaluation of the habitual use of advanced prototypes, adequate assessment strategy with established, standardized and validated methods, and statistical analysis of study results. Assessment strategies may additionally focus on specific functionalities of the robotic rollators allowing an individually tailored assessment of innovative features to document their added value. © 2016 S. Karger AG, Basel.

  11. Lithologic controls on AIRSAR signatures of bedrock and alluvium, at Lunar Crater, Nevada

    NASA Technical Reports Server (NTRS)

    Rivard, Benoit; Diorio, Marc; Budkewitsch, Paul

    1995-01-01

    Radar backscatter intensity as measured by calibrated synthetic aperture radar (SAR) systems is primarily controlled by three factors: local incidence angle, wavelength-scale roughness, and dielectric permittivity of surface materials. In order to make adequate use of radar observations for geological investigations of surface type, the relationships between lithology and the above characteristics must be adequately understood. In arid terrains weathering signatures (e.g. fracturing, debris grain size and shape, slope characteristics) are controlled to some extent by lithologic characteristics of the parent bedrock. These textural features of outcrops and their associated debris control radar backscatter to varying degrees. The quad-polarization JPL AIRSAR system allows sampling of textures at three distinct wavelength scales: C-band (5.66 cm), L-band (23.98 cm), and P-band (68.13 cm). This paper presents a discussion of AIRSAR data using recent field observations of weathered felsic and basaltic volcanic rock units exposed in the southern part of the Lunar Crater Volcanic Field, in the Pancake Range of central Nevada. The focus is on the relationship of radar backscatter at multiple wavelengths to weathering style and parent bedrock lithology.

  12. Agreement studies in radiology research.

    PubMed

    Farzin, B; Gentric, J-C; Pham, M; Tremblay-Paquet, S; Brosseau, L; Roy, C; Jamali, S; Chagnon, M; Darsaut, T E; Guilbert, F; Naggara, O; Raymond, J

    2017-03-01

    The goal of this study was to estimate the frequency and the quality of agreement studies published in diagnostic imaging journals. All studies published between January 2011 and December 2012 in four radiology journals were reviewed. Four trained readers evaluated agreement studies using a 24-item form that included the 15 items of the Guidelines for Reporting Reliability and Agreement Studies criteria. Of 2229 source titles, 280 studies (13%) reported agreement. The mean number of patients per study was 81±99 (SD) (range, 0-180). Justification for sample size was found in 9 studies (3%). The number of raters was≤2 in 226 studies (81%). No intra-observer study was performed in 212 (76%) articles. Confidence intervals and interpretation of statistical estimates were provided in 98 (35%) and 147 (53%) of the studies, respectively. In 168 studies (60%), the agreement study was not mentioned in the discussion section. In 8 studies (3%), reporting of the agreement study was judged to be adequate. Twenty studies (7%) were dedicated to agreement. Agreement studies are preliminary and not adequately reported. Studies dedicated to agreement are infrequent. They are research opportunities that should be promoted. Copyright © 2016 Éditions françaises de radiologie. Published by Elsevier Masson SAS. All rights reserved.
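
    For readers unfamiliar with the statistics such studies report, a minimal example of chance-corrected agreement between two raters (Cohen's kappa); the review does not say which estimator each study used, and the ratings below are invented.

        from collections import Counter

        def cohens_kappa(r1, r2):
            n = len(r1)
            p_obs = sum(a == b for a, b in zip(r1, r2)) / n
            c1, c2 = Counter(r1), Counter(r2)
            p_exp = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
            return (p_obs - p_exp) / (1 - p_exp)

        rater1 = ["pos", "neg", "pos", "pos", "neg", "neg", "pos", "neg"]
        rater2 = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "neg"]
        print(round(cohens_kappa(rater1, rater2), 2))   # 0.5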

  13. Sediment quantity and quality in three impoundments in Massachusetts

    USGS Publications Warehouse

    Zimmerman, Marc James; Breault, Robert F.

    2003-01-01

    As part of a study with an overriding goal of providing information that would assist State and Federal agencies in developing screening protocols for managing sediments impounded behind dams that are potential candidates for removal, the U.S. Geological Survey determined sediment quantity and quality at three locations: one on the French River and two on Yokum Brook, a tributary to the west branch of the Westfield River. Data collected with a global positioning system, a geographic information system, and sediment-thickness data aided in the creation of sediment maps and the calculation of sediment volumes at Perryville Pond on the French River in Webster, Massachusetts, and at the Silk Mill and Ballou Dams on Yokum Brook in Becket, Massachusetts. From these data the following sediment volumes were determined: Perryville Pond, 71,000 cubic yards; Silk Mill, 1,600 cubic yards; and Ballou, 800 cubic yards. Sediment characteristics were assessed in terms of grain size and concentrations of potentially hazardous organic compounds and metals. Assessment of the approaches and methods used at study sites indicated that ground-penetrating radar produced data that were extremely difficult and time-consuming to interpret for the three study sites. Because of these difficulties, a steel probe was ultimately used to determine sediment depth and extent for inclusion in the sediment maps. Use of these methods showed that, where sampling sites were accessible, a machine-driven coring device would be preferable to the physically exhausting, manual sediment-coring methods used in this investigation. Enzyme-linked immunosorbent assays were an effective tool for screening large numbers of samples for a range of organic contaminant compounds. An example calculation of the number of samples needed to characterize mean concentrations of contaminants indicated that the number of samples collected for most analytes was adequate; however, additional analyses for lead, copper, silver, arsenic, total petroleum hydrocarbons, and chlordane are needed to meet the criteria determined from the calculations. Particle-size analysis did not reveal a clear spatial distribution pattern at Perryville Pond. On average, less than 65 percent of each sample was greater in size than very fine sand. The sample with the highest percentage of clay-sized particles (24.3 percent) was collected just upstream from the dam and generally had the highest concentrations of contaminants determined here. In contrast, more than 90 percent of the sediment samples in the Becket impoundments had grain sizes larger than very fine sand; as determined by direct observation, rocks, cobbles, and boulders constituted a substantial amount of the material impounded at Becket. In general, the highest percentages of the finest particles, clays, occurred in association with the highest concentrations of contaminants. Enzyme-linked immunosorbent assays of the Perryville samples showed the widespread presence of petroleum hydrocarbons (16 out of 26 samples), polycyclic aromatic hydrocarbons (23 out of 26 samples), and chlordane (18 out of 26 samples); polychlorinated biphenyls were detected in five samples from four locations. Neither petroleum hydrocarbons nor polychlorinated biphenyls were detected at Becket, and chlordane was detected in only one sample. All 14 Becket samples contained polycyclic aromatic hydrocarbons. Replicate quality-control analyses revealed consistent results between paired samples.
Samples from throughout Perryville Pond contained a number of metals at potentially toxic concentrations. These metals included arsenic, cadmium, copper, lead, nickel, and zinc. At Becket, no metals were found in elevated concentrations. In general, most of the concentrations of organic compounds and metals detected in Perryville Pond exceeded standards for benthic organisms, but only rarely exceeded standards for human contact. The most highly contaminated samples were
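
    For illustration of the kind of sample-size calculation mentioned in this abstract (the number of samples needed to characterize a mean contaminant concentration), a minimal sketch follows; the standard deviation, margin of error, and confidence level are assumed values, not figures from the study.

        import math
        from scipy import stats

        def samples_for_mean(sd, margin, confidence=0.95):
            """Samples needed to estimate a mean concentration to within
            +/- margin at the given confidence, assuming near-normal data."""
            z = stats.norm.ppf(1 - (1 - confidence) / 2)
            return math.ceil((z * sd / margin) ** 2)

        # Illustrative values only: sd and margin in mg/kg.
        print(samples_for_mean(sd=12.0, margin=5.0))  # -> 23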

  14. An error criterion for determining sampling rates in closed-loop control systems

    NASA Technical Reports Server (NTRS)

    Brecher, S. M.

    1972-01-01

    The determination of an error criterion which will give a sampling rate for adequate performance of linear, time-invariant closed-loop, discrete-data control systems was studied. The proper modelling of the closed-loop control system for characterization of the error behavior, and the determination of an absolute error definition for performance of the two commonly used holding devices are discussed. The definition of an adequate relative error criterion as a function of the sampling rate and the parameters characterizing the system is established along with the determination of sampling rates. The validity of the expressions for the sampling interval was confirmed by computer simulations. Their application solves the problem of making a first choice in the selection of sampling rates.
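
    The report derives its error criterion analytically; as a rough numerical illustration of the underlying idea (zero-order-hold reconstruction error shrinking as the sampling interval shrinks), consider the sketch below. The first-order system and the candidate intervals are hypothetical.

        import numpy as np

        tau = 0.5  # hypothetical first-order time constant, seconds

        def step_response(t):
            return 1.0 - np.exp(-t / tau)

        t = np.linspace(0.0, 5 * tau, 100001)
        y = step_response(t)

        for T in (0.5, 0.1, 0.02):  # candidate sampling intervals, seconds
            # Zero-order hold: hold each sample until the next sampling instant.
            held = step_response(np.floor(t / T) * T)
            print(f"T = {T:4.2f} s  max ZOH error = {np.max(np.abs(y - held)):.3f}")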

  15. CELL NUMBER AND SIZE IN SELECTED ORGANS OF FETUSES OF RATS MALNOURISHED AND EXPOSED TO NITROFEN

    EPA Science Inventory

    The effects of maternal exposure to nitrofen or protein-energy malnutrition on the number and sizes of cells in selected organs of the fetal rat have been studied. Pregnant rats were fed either an adequate (CON) or protein-energy deficient diet (PEM) throughout gestation. Materna...

  16. 21 CFR 226.102 - Master-formula and batch-production records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...(s) produced on a batch or continuous operation basis, including mixing steps and mixing times that have been determined to yield an adequately mixed Type A medicated article(s); and in the case of Type... batch size, or of appropriate size in the case of continuous systems to be produced from the master...

  17. A comparison of geochemical exploration techniques and sample media within accretionary continental margins: an example from the Pacific Border Ranges, Southern Alaska, U.S.A.

    USGS Publications Warehouse

    Sutley, S.J.; Goldfarb, R.J.; O'Leary, R. M.; Tripp, R.B.

    1990-01-01

    The Pacific Border Ranges of the southern Alaskan Cordillera are composed of a number of allochthonous tectonostratigraphic terranes. Within these terranes are widespread volcanogenic, massive sulfide deposits in and adjacent to portions of accreted ophiolite complexes, bands and disseminations of chromite in accreted island-arc ultramafic rocks, and epigenetic, gold-bearing quartz veins in metamorphosed turbidite sequences. A geochemical pilot study was undertaken to determine the most efficient exploration strategy for locating these types of mineral deposits within the Pacific Border Ranges and other typical convergent continental margin environments. High-density sediment sampling was carried out in first- and second-order stream channels surrounding typical gold, chromite and massive sulfide occurrences. At each site, a stream-sediment and a panned-concentrate sample were collected. In the laboratory, the stream sediments were sieved into coarse-sand, fine- to medium-sand, and silt- to clay-size fractions prior to analysis. One split of the panned concentrates was retained for analysis; a second split was further concentrated by gravity separation in heavy liquids and then divided into magnetic, weakly magnetic and nonmagnetic fractions for analysis. A number of different techniques including atomic absorption spectrometry, inductively coupled plasma atomic emission spectrometry and semi-quantitative emission spectrography were used to analyze the various sample media. Comparison of the various types of sample media shows that in this tectonic environment it is most efficient to include a silt- to clay-size sediment fraction and a panned-concentrate sample. Even with the relatively low detection limits for many elements by plasma spectrometry and atomic absorption spectrometry, anomalies reflecting the presence of gold veins could not be identified in any of the stream-sediment fractions. Unseparated panned-concentrate samples should be analyzed by emission spectroscopy and atomic absorption spectrometry for Ag and Au. If, however, magnetic and nonmagnetic concentrate fractions are used in a reconnaissance program, semiquantitative emission spectrography is adequate for all analytical work. ?? 1990.

  18. Artemisia supplementation differentially affects the mucosal and luminal ileal microbiota of diet-induced obese mice

    PubMed Central

Wicks, Shawna; Taylor, Christopher M.; Luo, Meng; Blanchard, Eugene IV; Ribnicky, David; Cefalu, William T.; Mynatt, Randall L.; Welsh, David A.

    2014-01-01

    Objective The gut microbiome has been implicated in obesity and metabolic syndrome; however, most studies have focused on fecal or colonic samples. Several species of Artemisia have been reported to ameliorate insulin signaling both in vitro and in vivo. The aim of this study was to characterize the mucosal and luminal bacterial populations in the terminal ileum with or without supplementation with Artemisia extracts. Materials/Methods Following 4 weeks of supplementation with different Artemisia extracts (PMI 5011, Santa or Scopa), diet-induced obese mice were sacrificed and luminal and mucosal samples of terminal ileum were used to evaluate microbial community composition by pyrosequencing of 16S rDNA hypervariable regions. Results Significant differences in community structure and membership were observed between luminal and mucosal samples, irrespective of diet group. All Artemisia extracts increased the Bacteroidetes:Firmicutes ratio in mucosal samples. This effect was not observed in the luminal compartment. There was high inter-individual variability in the phylogenetic assessments of the ileal microbiota, limiting the statistical power of this pilot investigation. Conclusions Marked differences in bacterial communities exist dependent upon the biogeographic compartment in the terminal ileum. Future studies testing the effects of Artemisia or other botanical supplements require larger sample sizes for adequate statistical power. PMID:24985102

  19. Generalized estimators of avian abundance from count survey data

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture-recapture, multiple observer, removal sampling, and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
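
    The hierarchical structure described here can be illustrated with a small simulation: a metapopulation layer for latent local abundance plus a protocol-specific observation layer, here binomial point counts. All parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        n_sites, n_visits = 50, 3
        lam, p = 4.0, 0.4  # hypothetical mean abundance and detection probability

        # Metapopulation layer: latent local abundance at each sample location.
        N = rng.poisson(lam, size=n_sites)

        # Observation layer: repeated point counts y_ij | N_i ~ Binomial(N_i, p).
        y = rng.binomial(N[:, None], p, size=(n_sites, n_visits))

        # Ignoring detection, the naive mean count is biased low by the factor p.
        print(y.mean(), lam * p)  # both near 1.6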

  20. Between-litter variation in developmental studies of hormones and behavior: Inflated false positives and diminished power.

    PubMed

    Williams, Donald R; Carlsson, Rickard; Bürkner, Paul-Christian

    2017-10-01

Developmental studies of hormones and behavior often include littermates: rodent siblings that share early-life experiences and genes. Due to between-litter variation (i.e., litter effects), the statistical assumption of independent observations is untenable. In two literatures (natural variation in maternal care and prenatal stress), entire litters are categorized based on maternal behavior or experimental condition. Here, we (1) review both literatures; (2) simulate false positive rates for commonly used statistical methods in each literature; and (3) characterize small sample performance of multilevel models (MLM) and generalized estimating equations (GEE). We found that the assumption of independence was routinely violated (>85%), false positives (α=0.05) exceeded nominal levels (up to 0.70), and power (1-β) rarely surpassed 0.80 (even for optimistic sample and effect sizes). Additionally, we show that MLMs and GEEs have adequate performance for common research designs. We discuss implications for the extant literature and the field of behavioral neuroendocrinology, and provide recommendations. Copyright © 2017 Elsevier Inc. All rights reserved.
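
    A minimal simulation of the litter-effect problem (design and parameter values assumed, not those of the paper): treating pups from the same litter as independent observations inflates the false positive rate of a two-sample t-test even with no true treatment effect.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_litters, pups, icc = 10, 6, 0.5  # hypothetical design, ICC = 0.5
        n_sim, false_pos = 2000, 0

        for _ in range(n_sim):
            # Shared litter effects induce within-litter correlation.
            litter = rng.normal(0, np.sqrt(icc), size=n_litters)
            pup = rng.normal(0, np.sqrt(1 - icc), size=(n_litters, pups))
            y = litter[:, None] + pup                # no true treatment effect
            g1, g2 = y[:5].ravel(), y[5:].ravel()    # litters assigned by condition
            if stats.ttest_ind(g1, g2).pvalue < 0.05:
                false_pos += 1

        print(false_pos / n_sim)  # far above the nominal 0.05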

  1. High-quality unsaturated zone hydraulic property data for hydrologic applications

    USGS Publications Warehouse

    Perkins, Kimberlie; Nimmo, John R.

    2009-01-01

In hydrologic studies, especially those using dynamic unsaturated zone moisture modeling, calculations based on property transfer models informed by hydraulic property databases are often used in lieu of measured data from the site of interest. Reliance on database-informed predicted values has become increasingly common with the use of neural networks. High-quality data are needed for databases used in this way and for theoretical and property transfer model development and testing. Hydraulic properties predicted on the basis of existing databases may be adequate in some applications but not others. An obvious problem occurs when the available database has few or no data for samples that are closely related to the medium of interest. The data set presented in this paper includes saturated and unsaturated hydraulic conductivity, water retention, particle-size distributions, and bulk properties. All samples are minimally disturbed, all measurements were performed using the same state-of-the-art techniques, and the environments represented are diverse.

  2. SU-E-T-611: Effective Treatment Volume of the Small Size IORT Applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krechetov, A.S.; ASK Physics, Mountain View, CA; Goer, D.A.

    2014-06-01

Purpose Mobile electron linear accelerators are gaining more attention recently, providing a lower cost and simpler way to perform intraoperative treatment. However, the simplicity of the treatment process does not eliminate the need for proper attention to the technical aspects of the treatment. One of the potential pitfalls is incorrect selection of the appropriate applicator size to adequately cover the tumor bed to the prescription dose. When treating tumor beds in the pelvis, the largest applicator that fits into the pelvis is usually selected, as there is concern about microscopic extension of the disease along the sidewalls of the pelvis. But when treating early-stage breast tumors, there is a natural tendency to select an applicator as small as possible so as not to jeopardize cosmesis. Methods This investigation questions how much of the typical breast treatment volume gets adequate exposure and what the correct strategy is in selecting the proper applicator size. Actual data from isodose scans were analyzed. Results We found that typical treatment dose prescriptions can cover as much as 80% and as little as 20% of the nominal treatment volume, depending on the applicator size, the energy of the beam, and whether the dose is prescribed to the 80% or 90% isodose level. Treatment volume is defined as a cylinder with diameter equal to that of the applicator and height equal to the corresponding D80 or D90 depth. Conclusion If mobile linear accelerators are used, there can be a significant amount of “cold volume” depending on the applicator size, and this should be taken into account when selecting the applicator that is needed. Using too small an applicator could result in significant under-dosing of the tissue at risk. Long-term clinical data demonstrate that selecting an adequate field size results in good oncological control as well as excellent cosmesis. Intraop Medical Corp provided facilities and equipment for this research.

  3. Iodine Status of Women of Reproductive Age in Sierra Leone and Its Association with Household Coverage with Adequately Iodized Salt

    PubMed Central

    Rohner, Fabian; Wirth, James P.; Woodruff, Bradley A.; Chiwile, Faraja; Yankson, Hannah; Sesay, Fatmata; Koroma, Aminata S.; Petry, Nicolai; Pyne-Bailey, Solade; Dominguez, Elisa; Kupka, Roland; Hodges, Mary H.; de Onis, Mercedes

    2016-01-01

Salt iodization programs are a public health success in tackling iodine deficiency. Yet, a large proportion of the world’s population remains at risk for iodine deficiency. In a nationally representative cross-sectional survey in Sierra Leone, household salt samples and women’s urine samples were quantitatively analyzed for iodine content. Salt was collected from 1123 households, and urine samples from 817 non-pregnant and 154 pregnant women. Household coverage with adequately iodized salt (≥15 mg/kg iodine) was 80.7%. The median urinary iodine concentration (UIC) of pregnant women was 175.8 µg/L and of non-pregnant women 190.8 µg/L. Women living in households with adequately iodized salt had higher median UIC (for pregnant women: 180.6 µg/L vs. 100.8 µg/L, p < 0.05; and for non-pregnant women: 211.3 µg/L vs. 97.8 µg/L, p < 0.001). Differences in UIC by residence, region, household wealth, and women’s education were much smaller in women living in households with adequately iodized salt than in households without. Despite the high household coverage of iodized salt in Sierra Leone, it is important to reach the 20% of households not consuming adequately iodized salt. Salt iodization has the potential for increasing equity in iodine status even with the persistence of other risk factors for deficiency. PMID:26848685

  4. C-Arm Computed Tomography-Assisted Adrenal Venous Sampling Improved Right Adrenal Vein Cannulation and Sampling Quality in Primary Aldosteronism.

    PubMed

    Park, Chung Hyun; Hong, Namki; Han, Kichang; Kang, Sang Wook; Lee, Cho Rok; Park, Sungha; Rhee, Yumie

    2018-05-04

    Adrenal venous sampling (AVS) is a gold standard for subtype classification of primary aldosteronism (PA). However, this procedure has a high failure rate because of the anatomical difficulties in accessing the right adrenal vein. We investigated whether C-arm computed tomography-assisted AVS (C-AVS) could improve the success rate of adrenal sampling. A total of 156 patients, diagnosed with PA who underwent AVS from May 2004 through April 2017, were included. Based on the medical records, we retrospectively compared the overall, left, and right catheterization success rates of adrenal veins during the periods without C-AVS (2004 to 2010, n=32) and with C-AVS (2011 to 2016, n=134). The primary outcome was adequate bilateral sampling defined as a selectivity index (SI) >5. With C-AVS, the rates of adequate bilateral AVS increased from 40.6% to 88.7% (P<0.001), with substantial decreases in failure rates (43.7% to 0.8%, P<0.001). There were significant increases in adequate sampling rates from right (43.7% to 91.9%, P<0.001) and left adrenal veins (53.1% to 95.9%, P<0.001) as well as decreases in catheterization failure from right adrenal vein (9.3% to 0.0%, P<0.001). Net improvement of SI on right side remained significant after adjustment for left side (adjusted SI, 1.1 to 9.0; P=0.038). C-AVS was an independent predictor of adequate bilateral sampling in the multivariate model (odds ratio, 9.01; P<0.001). C-AVS improved the overall success rate of AVS, possibly as a result of better catheterization of right adrenal vein. Copyright © 2018 Korean Endocrine Society.

  5. Distinguishing between heating power and hyperthermic cell-treatment efficacy in magnetic fluid hyperthermia.

    PubMed

    Munoz-Menendez, Cristina; Conde-Leboran, Ivan; Serantes, David; Chantrell, Roy; Chubykalo-Fesenko, Oksana; Baldomir, Daniel

    2016-11-04

In the magnetic fluid hyperthermia (MFH) research field, it is usually assumed that achieving a uniform temperature enhancement (ΔT) of the entire tumour is a key point for treatment. However, various experimental works reported successful cell apoptosis via MFH without a noticeable ΔT of the system. A possible explanation of the success of these negligible-ΔT experiments is that a local ΔT restricted to the particle nanoenvironment (i.e. with no significant effect on the global temperature T) could be enough to trigger cell death. Shedding light on such a possibility requires accurate knowledge of heat dissipation at the local level in relation to the usually investigated global (average) one. Since size polydispersity is inherent to all synthesis techniques and the heat released is proportional to the particle size, heat dissipation spots with different performances, and thus different effects on the cells, will likely exist in every sample. In this work we pursue two objectives: (1) to emphasize the necessity to distinguish between the total dissipated heat and hyperthermia effectiveness, and (2) to suggest a theoretical approach for selecting, for a given size polydispersity, a more adequate average size so that most of the particles dissipate within a desired heating power range. The results are reported for Fe3O4 nanoparticles as a representative example.
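
    A sketch of the kind of selection suggested, under simplifying assumptions that are ours rather than the authors' (sizes lognormally distributed, dissipated power scaling with particle volume): choose the median diameter that maximizes the fraction of particles whose power falls inside a target window.

        import numpy as np
        from scipy import stats

        sigma = 0.25           # hypothetical lognormal width of the size distribution
        p_lo, p_hi = 0.5, 2.0  # desired heating power window, relative units

        def fraction_in_window(d_median):
            # Assumed scaling: power ~ d**3, so power is lognormal as well.
            power = stats.lognorm(s=3 * sigma, scale=d_median**3)
            return power.cdf(p_hi) - power.cdf(p_lo)

        candidates = np.linspace(0.7, 1.5, 81)  # candidate median diameters
        best = max(candidates, key=fraction_in_window)
        print(best, fraction_in_window(best))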

  6. Spectral solution of the inverse Mie problem

    NASA Astrophysics Data System (ADS)

    Romanov, Andrey V.; Konokhova, Anastasiya I.; Yastrebova, Ekaterina S.; Gilev, Konstantin V.; Strokotov, Dmitry I.; Chernyshev, Andrei V.; Maltsev, Valeri P.; Yurkin, Maxim A.

    2017-10-01

We developed a fast method to determine the size and refractive index of homogeneous spheres from the power Fourier spectrum of their light-scattering patterns (LSPs), measured with the scanning flow cytometer. Specifically, we used two spectral parameters: the location of the non-zero peak and the zero-frequency amplitude, and numerically inverted the map from the space of particle characteristics (size and refractive index) to the space of spectral parameters. The latter parameters can be reliably resolved only for particle size parameters greater than 11, and the inversion is unique only in a limited range of refractive index, with an upper limit between 1.1 and 1.25 (relative to the medium) depending on the size parameter and the particular definition of uniqueness. The developed method was tested on two experimental samples, milk fat globules and spherized red blood cells, and resulted in accuracy no worse than that of the reference method based on the least-square fit of the LSP with the Mie theory. Moreover, for particles with significant deviation from the spherical shape, the spectral method was much closer to the Mie-fit result than the estimated uncertainty of the latter. The spectral method also showed adequate results for synthetic LSPs of spheroids with aspect ratios up to 1.4. Overall, we present a general framework, which can be used to construct an inverse algorithm for any other experimental signals.
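
    The two spectral parameters can be extracted from an LSP with a plain FFT; the sketch below uses a synthetic signal and hypothetical angular sampling, not data from the scanning flow cytometer.

        import numpy as np

        theta = np.linspace(10, 70, 1024)  # scattering angle, degrees (hypothetical)
        # Synthetic stand-in for an LSP: an offset plus one oscillatory component.
        lsp = 2.0 + np.cos(2 * np.pi * 0.35 * theta)

        power = np.abs(np.fft.rfft(lsp)) ** 2
        freq = np.fft.rfftfreq(theta.size, d=theta[1] - theta[0])

        zero_freq_amp = power[0]                # first spectral parameter
        peak = freq[1 + np.argmax(power[1:])]   # non-zero peak, skipping the DC bin
        print(zero_freq_amp, peak)              # peak near 0.35 cycles/degree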

  7. Application of cluster and discriminant analyses to diagnose lithological heterogeneity of the parent material according to its particle-size distribution

    NASA Astrophysics Data System (ADS)

    Giniyatullin, K. G.; Valeeva, A. A.; Smirnova, E. V.

    2017-08-01

    Particle-size distribution in soddy-podzolic and light gray forest soils of the Botanical Garden of Kazan Federal University has been studied. The cluster analysis of data on the samples from genetic soil horizons attests to the lithological heterogeneity of the profiles of all the studied soils. It is probable that they are developed from the two-layered sediments with the upper colluvial layer underlain by the alluvial layer. According to the discriminant analysis, the major contribution to the discrimination of colluvial and alluvial layers is that of the fraction >0.25 mm. The results of canonical analysis show that there is only one significant discriminant function that separates alluvial and colluvial sediments on the investigated territory. The discriminant function correlates with the contents of fractions 0.05-0.01, 0.25-0.05, and >0.25 mm. Classification functions making it possible to distinguish between alluvial and colluvial sediments have been calculated. Statistical assessment of particle-size distribution data obtained for the plow horizons on ten plowed fields within the garden indicates that this horizon is formed from colluvial sediments. We conclude that the contents of separate fractions and their ratios cannot be used as a universal criterion of the lithological heterogeneity. However, adequate combination of the cluster and discriminant analyses makes it possible to give a comprehensive assessment of the lithology of soil samples from data on the contents of sand and silt fractions, which considerably increases the information value and reliability of the results.
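
    A minimal sketch of the combined workflow on synthetic data (the fraction means and spreads are invented): hierarchical clustering on the particle-size fractions, followed by linear discriminant analysis to see which fractions drive the separation.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        # Rows: samples; columns: size fractions (e.g., >0.25, 0.25-0.05,
        # 0.05-0.01 mm), percent by mass. Two synthetic sediment layers.
        colluvium = rng.normal([30, 40, 30], 3, size=(15, 3))
        alluvium = rng.normal([55, 30, 15], 3, size=(15, 3))
        X = np.vstack([colluvium, alluvium])

        # Cluster analysis suggests the number of lithological layers...
        labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

        # ...and discriminant analysis shows which fractions separate them.
        lda = LinearDiscriminantAnalysis().fit(X, labels)
        print(lda.coef_)  # largest weight flags the most discriminating fraction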

  8. What's in a name? The challenge of describing interventions in systematic reviews: analysis of a random sample of reviews of non-pharmacological stroke interventions

    PubMed Central

    Hoffmann, Tammy C; Walker, Marion F; Langhorne, Peter; Eames, Sally; Thomas, Emma; Glasziou, Paul

    2015-01-01

Objective To assess, in a sample of systematic reviews of non-pharmacological interventions, the completeness of intervention reporting, to identify the most frequently missing elements, and to assess review authors’ use of and beliefs about providing intervention information. Design Analysis of a random sample of systematic reviews of non-pharmacological stroke interventions; online survey of review authors. Data sources and study selection The Cochrane Library and PubMed were searched for potentially eligible systematic reviews, and a random sample of these was assessed for eligibility until 60 (30 Cochrane, 30 non-Cochrane) eligible reviews were identified. Data collection In each review, the completeness of the intervention description in each eligible trial (n=568) was assessed by 2 independent raters using the Template for Intervention Description and Replication (TIDieR) checklist. All review authors (n=46) were invited to complete a survey. Results Most reviews were missing intervention information for the majority of items. The most incompletely described items were: modifications, fidelity, materials, procedure and tailoring (missing from all interventions in 97%, 90%, 88%, 83% and 83% of reviews, respectively). Items that scored better, but were still incomplete for the majority of reviews, were: ‘when and how much’ (in 31% of reviews, adequate for all trials; in 57% of reviews, adequate for some trials); intervention mode (in 22% of reviews, adequate for all trials; in 38%, adequate for some trials); and location (in 19% of reviews, adequate for all trials). Of the 33 (71%) authors who responded, 58% reported having further intervention information but not including it, and 70% tried to obtain information. Conclusions Most focus on intervention reporting has been directed at trials. Poor intervention reporting in stroke systematic reviews is prevalent, compounded by poor trial reporting. Without adequate intervention descriptions, the conduct, usability and interpretation of reviews are restricted and therefore require action by trialists, systematic reviewers, peer reviewers and editors. PMID:26576811

  9. Toward a Singleton Undergraduate Computer Graphics Course in Small and Medium-Sized Colleges

    ERIC Educational Resources Information Center

    Shesh, Amit

    2013-01-01

    This article discusses the evolution of a single undergraduate computer graphics course over five semesters, driven by a primary question: if one could offer only one undergraduate course in graphics, what would it include? This constraint is relevant to many small and medium-sized colleges that lack resources, adequate expertise, and enrollment…

  10. New detection system and signal processing for the tokamak ISTTOK heavy ion beam diagnostic.

    PubMed

    Henriques, R B; Nedzelskiy, I S; Malaquias, A; Fernandes, H

    2012-10-01

The tokamak ISTTOK heavy ion beam diagnostic (HIBD) operates with a multiple cell array detector (MCAD) that allows for simultaneous measurements of the plasma density and the plasma density fluctuations at different sampling volumes across the plasma. To improve the capability of the plasma density fluctuation investigations, a new detection system and a new signal conditioning amplifier have been designed and tested. Improvements in the MCAD design are presented that allow for nearly complete suppression of the spurious plasma background signal by applying a biasing potential onto special electrodes incorporated into the MCAD. The new low-cost, small-size transimpedance amplifiers are described, with a bandwidth of 400 kHz, a transimpedance of 10^7 V/A, and an RMS noise of 0.4 nA, adequate for the plasma density fluctuation measurements.

  11. Effect of abdominopelvic abscess drain size on drainage time and probability of occlusion

    PubMed Central

    Rotman, Jessica A.; Getrajdman, George I.; Maybody, Majid; Erinjeri, Joseph P.; Yarmohammadi, Hooman; Sofocleous, Constantinos T.; Solomon, Stephen B.; Boas, F. Edward

    2016-01-01

Background The purpose of this study is to determine whether larger abdominopelvic abscess drains reduce the time required for abscess resolution, or the probability of tube occlusion. Methods 144 consecutive patients who underwent abscess drainage at a single institution were reviewed retrospectively. Results Larger initial drain size did not reduce drainage time, drain occlusion, or drain exchanges (p>0.05). Subgroup analysis did not find any type of collection that benefitted from larger drains. A multivariate model predicting drainage time showed that large collections (>200 ml) required 16 days longer drainage time than small collections (<50 ml). Collections with a fistula to bowel required 17 days longer drainage time than collections without a fistula. Initial drain size and the viscosity of the fluid in the collection had no significant effect on drainage time in the multivariate model. Conclusions 8 F drains are adequate for initial drainage of most serous and serosanguineous collections. 10 F drains are adequate for initial drainage of most purulent or bloody collections. PMID:27634422

  12. Evaluation of hygiene practices and microbiological quality of cooked meat products during slicing and handling at retail.

    PubMed

    Pérez-Rodríguez, F; Castro, R; Posada-Izquierdo, G D; Valero, A; Carrasco, E; García-Gimeno, R M; Zurera, G

    2010-10-01

Ready-to-eat cooked meat products are recognized to be contaminated during slicing, which, in recent years, has been associated with several outbreaks. This work aimed to identify possible relations between the hygiene practices during slicing of cooked meat products at retail points in small and medium-sized establishments (SMEs) and large-sized establishments (LEs) and the microbiological quality of the sliced products. To that end, a checklist was drawn up and completed by scoring handling practices during slicing in different establishments in Cordoba (southern Spain). In addition, sliced cooked meats were analyzed for different microbiological indicators and investigated for the presence of Listeria spp. and Listeria monocytogenes. Results indicated that SMEs showed more deficient handling practices than LEs. In spite of these differences, microbiological counts indicated similar microbiological quality in cooked meat samples for both types of establishments. On the other hand, Listeria monocytogenes and Listeria innocua were isolated from 7.35% (5/68) and 8.82% (6/68) of analyzed samples, respectively. Positive samples for Listeria spp. were found in establishments that showed acceptable hygiene levels, though contamination could be associated with the lack of exclusive use of slicers at retail points. Moreover, Listeria spp. presence could not be statistically linked to any microbiological parameter; however, seasonality significantly influenced (P<0.05) L. monocytogenes presence, with all positive samples found during the warm season (5/5). In conclusion, results suggested that more effort should be made to adequately educate handlers in food hygiene practices, focusing especially on SMEs. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  13. One size does not fit all: an examination of low birthweight disparities among a diverse set of racial/ethnic groups.

    PubMed

    Johnelle Sparks, P

    2009-11-01

To examine disparities in low birthweight using a diverse set of racial/ethnic categories and a nationally representative sample. This research explored the degree to which sociodemographic characteristics, health care access, maternal health status, and health behaviors influence birthweight disparities among seven racial/ethnic groups. Binary logistic regression models were estimated using a nationally representative sample of singleton, normal-for-gestational-age births from 2001 drawn from the ECLS-B, which has an approximate sample size of 7,800 infants. The multiple variable models examine disparities in low birthweight (LBW) for seven racial/ethnic groups, including non-Hispanic white, non-Hispanic black, U.S.-born Mexican-origin Hispanic, foreign-born Mexican-origin Hispanic, other Hispanic, Native American, and Asian mothers. Race-stratified logistic regression models were also examined. In the full sample models, only non-Hispanic black mothers have a LBW disadvantage compared to non-Hispanic white mothers. Maternal WIC usage was protective against LBW in the full models. Both no prenatal care and adequate-plus prenatal care increase the odds of LBW. In the race-stratified models, prenatal care adequacy and high maternal health risks are the only variables that influence LBW for all racial/ethnic groups. The race-stratified models highlight the different mechanisms important across the racial/ethnic groups in determining LBW. Differences in the distribution of maternal sociodemographic, health care access, health status, and behavior characteristics by race/ethnicity demonstrate that a single empirical framework may distort associations with LBW for certain racial and ethnic groups. More attention must be given to the specific mechanisms linking maternal risk factors to poor birth outcomes for specific racial/ethnic groups.

  14. From picture to porosity of river bed material using Structure-from-Motion with Multi-View-Stereo

    NASA Astrophysics Data System (ADS)

    Seitz, Lydia; Haas, Christian; Noack, Markus; Wieprecht, Silke

    2018-04-01

Common methods for in-situ determination of the porosity of river bed material are time- and effort-consuming. Although mathematical predictors can be used for estimation, they do not adequately represent porosities. The objective of this study was to assess a new approach for the determination of porosity of frozen sediment samples. The method is based on volume determination by applying Structure-from-Motion with Multi-View Stereo (SfM-MVS) to estimate a 3D volumetric model from overlapping imagery. The method was applied to artificial sediment mixtures as well as field samples. In addition, the commonly used water replacement method was applied to determine porosities in comparison with the SfM-MVS method. We examined a range of porosities from 0.16 to 0.46 that is representative of the wide range of porosities found in rivers. SfM-MVS performed well in determining the volumes of the sediment samples. A very good correlation (r = 0.998, p < 0.0001) was observed between the SfM-MVS and the water replacement method. Results further show that the water replacement method underestimated total sample volumes. A comparison with several mathematical predictors showed that for non-uniform samples the calculated porosity based on the standard deviation performed better than porosities based on the median grain size. None of the predictors were effective at estimating the porosity of the field samples.
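
    Once SfM-MVS yields the total volume of a sample, porosity follows from the dry mass of solids and an assumed particle density; a minimal sketch with illustrative values (2.65 g/cm3 is typical for quartz sand; the study's actual densities are not given here):

        def porosity(total_volume_cm3, dry_mass_g, particle_density=2.65):
            """Porosity = 1 - (volume of solids / total sample volume);
            particle_density in g/cm3."""
            solids_volume = dry_mass_g / particle_density
            return 1.0 - solids_volume / total_volume_cm3

        # Illustrative numbers only.
        print(porosity(total_volume_cm3=500.0, dry_mass_g=950.0))  # ~0.28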

  15. Applicability of empirical data currently used in predicting solid propellant exhaust plumes

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.; Greenwood, T.; Roberts, B. B.

    1977-01-01

    Theoretical and experimental approaches to exhaust plume analysis are compared. A two-phase model is extended to include treatment of reacting gas chemistry, and thermodynamical modeling of the gaseous phase of the flow field is considered. The applicability of empirical data currently available to define particle drag coefficients, heat transfer coefficients, mean particle size, and particle size distributions is investigated. Experimental and analytical comparisons are presented for subscale solid rocket motors operating at three altitudes with attention to pitot total pressure and stagnation point heating rate measurements. The mathematical treatment input requirements are explained. The two-phase flow field solution adequately predicts gasdynamic properties in the inviscid portion of two-phase exhaust plumes. It is found that prediction of exhaust plume gas pressures requires an adequate model of flow field dynamics.

  16. Progressive Sampling Technique for Efficient and Robust Uncertainty and Sensitivity Analysis of Environmental Systems Models: Stability and Convergence

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, R.; Hosseini, N.; Razavi, S.

    2016-12-01

Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models several times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides an increasingly improved coverage of the parameter space, while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; in contrast, PLHS generates a series of smaller sub-sets (also called 'slices') while: (1) each sub-set is Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive addition of sub-sets remains Latin hypercube; and thus (3) the entire sample set is Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over the existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by only running the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
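
    PLHS itself is more involved; for orientation, the one-stage LHS baseline it improves upon can be generated as follows (a generic sketch, not the authors' implementation).

        import numpy as np

        def latin_hypercube(n_samples, n_dims, rng=None):
            """One-stage LHS in the unit hypercube: each dimension is split
            into n_samples equal strata, one point drawn per stratum."""
            rng = rng or np.random.default_rng()
            u = rng.uniform(size=(n_samples, n_dims))
            samples = np.empty_like(u)
            for d in range(n_dims):
                perm = rng.permutation(n_samples)  # stratum order per dimension
                samples[:, d] = (perm + u[:, d]) / n_samples
            return samples

        x = latin_hypercube(10, 2, np.random.default_rng(3))
        # Each column has exactly one point in each interval [k/10, (k+1)/10).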

  17. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
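
    As a generic illustration only (not the workshop's actual framework), the per-group biospecimen count for a two-sample comparison of means can be computed from a standardized effect size, significance level, and power:

        import math
        from scipy import stats

        def n_per_group(effect_size, alpha=0.05, power=0.80):
            """Two-sample z-approximation: n per group to detect a
            standardized mean difference of effect_size."""
            z_a = stats.norm.ppf(1 - alpha / 2)
            z_b = stats.norm.ppf(power)
            return math.ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)

        print(n_per_group(0.5))  # -> 63 per group for a standardized difference of 0.5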

  18. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  19. Context factors in consultations of general practitioner trainees and their impact on communication assessment in the authentic setting.

    PubMed

    Essers, Geurt; van Dulmen, Sandra; van Es, Judy; van Weel, Chris; van der Vleuten, Cees; Kramer, Anneke

    2013-12-01

    Acquiring adequate communication skills is an essential part of general practice (GP) specialty training. In assessing trainee proficiency, the context in which trainees communicate is usually not taken into account. The present paper aims to explore what context factors can be found in regular GP trainee consultations and how these influence their communication performance. In a randomly selected sample of 44 videotaped, real-life GP trainee consultations, we searched for context factors previously identified in GP consultations and explored how trainee ratings change if context factors are taken into account. Trainee performance was rated twice using the MAAS-Global, first without and then with incorporating context factors. Item score differences were calculated using a paired samples t-test and effect sizes were computed. All previously identified context factors were again observed in GP trainee consultations. In communication assessment scores, we found a significant difference in 5 out of 13 MAAS-Global items, mostly in a positive direction. The effect size was moderate (0.57). GP trainee communication is influenced by contextual factors; they seem to adapt to context in a professional way. GP specialty training needs to focus on a context-specific application of communication skills. Communication raters need to be taught how to incorporate context factors into their assessments. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. GROUND WATER SAMPLING FOR VERTICAL PROFILING OF CONTAMINANTS

    EPA Science Inventory

    Accurate delineation of plume boundaries and vertical contaminant distribution are necessary in order to adequately characterize waste sites and determine remedial strategies to be employed. However, it is important to consider the sampling objectives, sampling methods, and sampl...

  1. Virus Characterization by FFF-MALS Assay

    NASA Astrophysics Data System (ADS)

    Razinkov, Vladimer

    2009-03-01

Adequate biophysical characterization of influenza virions is important for vaccine development. The influenza virus vaccines are produced from the allantoic fluid of developing chicken embryos. The process of viral replication produces a heterogeneous mixture of infectious and non-infectious viral particles with varying states of aggregation. The study of the relative distribution and behavior of different subpopulations and their inter-correlation can assist in the development of a robust process for a live virus vaccine. This report describes a field flow fractionation and multiangle light scattering (FFF-MALS) method optimized for the analysis of size distribution and total particle counts. A method using a combination of asymmetric flow field-flow fractionation (AFFFF) and multiangle light scattering (MALS) techniques has been shown to improve the estimation of virus particle counts and the amount of aggregated virus in laboratory samples. The FFF-MALS method was compared with several other methods such as transmission electron microscopy (TEM), atomic force microscopy (AFM), size exclusion chromatography followed by MALS (SEC-MALS), quantitative reverse transcription polymerase chain reaction (RT Q-PCR), the median tissue culture infectious dose (TCID50), and the fluorescent focus assay (FFA). The correlation between the various methods for determining total particle counts, infectivity, and size distribution is reported. The pros and cons of each of the analytical methods are discussed.

  2. Differences in hospital casemix, and the relationship between casemix and hospital costs.

    PubMed

    Söderlund, N; Milne, R; Gray, A; Raftery, J

    1995-03-01

    The aim of the study was to examine the relationship between hospital costs and casemix, and after adjustment for casemix differences, between cost and institutional size, number of specialties, occupancy and teaching status. A retrospective analysis of all admissions to nine acute-care NHS hospitals in the Oxford region during the 1991-1992 financial year was undertaken. All episodes were assigned to a diagnosis-related group (DRG) and a cost weight assigned accordingly. Costs per finished consultant episode, before and after adjustment for casemix differences, were analysed at the hospital and specialty level. Casemix differences were significant, and accounted for approximately 77 per cent of the difference in costs between providers. Costs per casemix-adjusted episode were not significantly associated with differences in hospital size, scope, occupancy levels or teaching status, but sample size was insufficient to investigate these relationships adequately. Specialty costs were poorly correlated with specialty casemix. This was probably due to poor apportionment of specialty costs in hospital accounting returns. Casemix differences need to be taken into account when comparing providers for the purposes of contracting, as unadjusted unit costs may be misleading. Although the methods used may currently be applied to most NHS hospitals, widespread use would be greatly facilitated by the development of indigenous cost weights and better routine hospital data coding and collection.

  3. Environmental Validation of Legionella Control in a VHA Facility Water System.

    PubMed

    Jinadatha, Chetan; Stock, Eileen M; Miller, Steve E; McCoy, William F

    2018-03-01

    OBJECTIVES We conducted this study to determine what sample volume, concentration, and limit of detection (LOD) are adequate for environmental validation of Legionella control. We also sought to determine whether time required to obtain culture results can be reduced compared to spread-plate culture method. We also assessed whether polymerase chain reaction (PCR) and in-field total heterotrophic aerobic bacteria (THAB) counts are reliable indicators of Legionella in water samples from buildings. DESIGN Comparative Legionella screening and diagnostics study for environmental validation of a healthcare building water system. SETTING Veterans Health Administration (VHA) facility water system in central Texas. METHODS We analyzed 50 water samples (26 hot, 24 cold) from 40 sinks and 10 showers using spread-plate cultures (International Standards Organization [ISO] 11731) on samples shipped overnight to the analytical lab. In-field, on-site cultures were obtained using the PVT (Phigenics Validation Test) culture dipslide-format sampler. A PCR assay for genus-level Legionella was performed on every sample. RESULTS No practical differences regardless of sample volume filtered were observed. Larger sample volumes yielded more detections of Legionella. No statistically significant differences at the 1 colony-forming unit (CFU)/mL or 10 CFU/mL LOD were observed. Approximately 75% less time was required when cultures were started in the field. The PCR results provided an early warning, which was confirmed by spread-plate cultures. The THAB results did not correlate with Legionella status. CONCLUSIONS For environmental validation at this facility, we confirmed that (1) 100 mL sample volumes were adequate, (2) 10× concentrations were adequate, (3) 10 CFU/mL LOD was adequate, (4) in-field cultures reliably reduced time to get results by 75%, (5) PCR provided a reliable early warning, and (6) THAB was not predictive of Legionella results. Infect Control Hosp Epidemiol 2018;39:259-266.

  4. How well do we know the infaunal biomass of the continental shelf?

    NASA Astrophysics Data System (ADS)

    Powell, Eric N.; Mann, Roger

    2016-03-01

    Benthic infauna comprise a wide range of taxa of varying abundances and sizes, but large infaunal taxa are infrequently recorded in community surveys of the shelf benthos. These larger, but numerically rare, species may contribute disproportionately to biomass, however. We examine the degree to which standard benthic sampling gear and survey design provide an adequate estimate of the biomass of large infauna using the Atlantic surfclam, Spisula solidissima, on the continental shelf off the northeastern coast of the United States as a test organism. We develop a numerical model that simulates standard survey designs, gear types, and sampling densities to evaluate the effectiveness of vertically-dropped sampling gear (e.g., boxcores, grabs) for estimating density of large species. Simulations of randomly distributed clams at a density of 0.5-1 m-2 within an 0.25-km2 domain show that lower sampling densities (1-5 samples per sampling event) resulted in highly inaccurate estimates of clam density with the presence of clams detected in less than 25% of the sampling events. In all cases in which patchiness was present in the simulated clam population, surveys were prone to very large errors (survey availability events) unless a dense (e.g., 100-sample) sampling protocol was imposed. Thus, commercial quantities of surfclams could easily go completely undetected by any standard benthic community survey protocol using vertically-dropped gear. Without recourse to modern high-volume sampling gear capable of sampling many meters at a swath, such as hydraulic dredges, biomass of the continental shelf will be grievously underestimated if large infauna are present even at moderate densities.
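
    The core of the undersampling argument can be reproduced with a short Poisson calculation; the gear footprint below is illustrative (boxcore and grab areas vary), not a value from the authors' model.

        import numpy as np

        area = 0.1      # sampler footprint, m^2 (illustrative)
        density = 0.5   # surfclams per m^2

        for n in (1, 5, 100):  # samples per sampling event
            p_miss = np.exp(-density * area * n)  # P(no clams in all n samples)
            print(f"{n:3d} samples: P(detect >= 1 clam) = {1 - p_miss:.2f}")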

  5. Stocking levels and underlying assumptions for uneven-aged Ponderosa Pine stands.

    Treesearch

    P.H. Cochran

    1992-01-01

    Potential Problems With Q-Values Many ponderosa pine stands have a limited number of size classes, and it may be desirable to carry very large trees through several cutting cycles. Large numbers of trees below commercial size are not needed to provide adequate numbers of future replacement trees. Under these conditions, application of stand density index (SDI) can have...

  6. Publications - Sales | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    datasets and large quantity publications orders we also offer a hard drive file transfer. We offer two options: (1) DGGS will purchase a new hard drive of adequate size that you will be billed for upon sized hard drive. You will be charged $56 per hour for data processing for any staff time in excess of

  7. Effect Size in Efficacy Trials of Women With Decreased Sexual Desire.

    PubMed

    Pyke, Robert E; Clayton, Anita H

    2018-03-22

    Regarding hypoactive sexual desire disorder (HSDD) in women, some reviewers judge the effect size small for medications vs placebo, but substantial for cognitive behavior therapy (CBT) or mindfulness meditation training (MMT) vs wait list. However, we lack comparisons of the effect sizes for the active intervention itself, for the control treatment, and for the differential between the two. For efficacy trials of HSDD in women, compare effect sizes for medications (testosterone/testosterone transdermal system, flibanserin, and bremelanotide) and placebo vs effect sizes for psychotherapy and wait-list control. We conducted a literature search for mean changes and SD on main measures of sexual desire and associated distress in trials of medications, CBT, or MMT. Effect size was used as it measures the magnitude of the intervention without confounding by sample size. Cohen d was used to determine effect sizes. For medications, mean (SD) effect size was 1.0 (0.34); for CBT and MMT, 1.0 (0.36); for placebo, 0.55 (0.16); and for wait list, 0.05 (0.26). Recommendations of psychotherapy over medication for treatment of HSDD are premature and not supported by data on effect sizes. Active participation in treatment conveys considerable non-specific benefits. Caregivers should attend to biological and psychosocial elements, and patient preference, to optimize response. Few clinical trials of psychotherapies were substantial in size or utilized adequate control paradigms. Medications and psychotherapies had similar, large effect sizes. Effect size of placebo was moderate. Effect size of wait-list control was very small, about one quarter that of placebo. Thus, a substantial non-specific therapeutic effect is associated with receiving placebo plus active care and evaluation. The difference in effect size between placebo and wait-list controls distorts the value of the subtraction of effect of the control paradigms to estimate intervention effectiveness. Pyke RE, Clayton AH. Effect Size in Efficacy Trials of Women With Decreased Sexual Desire. Sex Med Rev 2018;XX:XXX-XXX. Copyright © 2018 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
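
    Cohen d, used throughout this comparison, is the mean difference divided by the pooled standard deviation; a minimal implementation for two independent groups (the change scores below are simulated, not trial data):

        import numpy as np

        def cohens_d(x, y):
            """Standardized mean difference between two independent samples."""
            nx, ny = len(x), len(y)
            pooled_var = ((nx - 1) * np.var(x, ddof=1)
                          + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
            return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

        # Simulated change scores: active treatment vs wait list.
        rng = np.random.default_rng(7)
        print(cohens_d(rng.normal(1.0, 1, 80), rng.normal(0.05, 1, 80)))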

  8. Wastewater Biosolid Composting Optimization Based on UV-VNIR Spectroscopy Monitoring

    PubMed Central

    Temporal-Lara, Beatriz; Melendez-Pastor, Ignacio; Gómez, Ignacio; Navarro-Pedreño, Jose

    2016-01-01

    Conventional wastewater treatment generates large amounts of organic matter–rich sludge that requires adequate treatment to avoid public health and environmental problems. The mixture of wastewater sludge and some bulking agents produces a biosolid to be composted at adequate composting facilities. The composting process is chemically and microbiologically complex and requires an adequate aeration of the biosolid (e.g., with a turner machine) for proper maturation of the compost. Adequate (near) real-time monitoring of the compost maturity process is highly difficult and the operation of composting facilities is not as automatized as other industrial processes. Spectroscopic analysis of compost samples has been successfully employed for compost maturity assessment but the preparation of the solid compost samples is difficult and time-consuming. This manuscript presents a methodology based on a combination of a less time-consuming compost sample preparation and ultraviolet, visible and short-wave near-infrared spectroscopy. Spectroscopic measurements were performed with liquid compost extract instead of solid compost samples. Partial least square (PLS) models were developed to quantify chemical fractions commonly employed for compost maturity assessment. Effective regression models were obtained for total organic matter (residual predictive deviation—RPD = 2.68), humification ratio (RPD = 2.23), total exchangeable carbon (RPD = 2.07) and total organic carbon (RPD = 1.66) with a modular and cost-effective visible and near infrared (VNIR) spectroradiometer. This combination of a less time-consuming compost sample preparation with a versatile sensor system provides an easy-to-implement, efficient and cost-effective protocol for compost maturity assessment and near-real-time monitoring. PMID:27854280
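
    The RPD values quoted are the ratio of the reference data's standard deviation to the cross-validated prediction error; a sketch of the PLS-plus-RPD evaluation follows, with synthetic spectra and a hypothetical component count rather than the study's calibration data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(5)
        X = rng.normal(size=(60, 200))  # synthetic stand-in for UV-VNIR spectra
        y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=60)  # e.g., TOC

        pls = PLSRegression(n_components=5)  # hypothetical component count
        y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

        rmsep = np.sqrt(np.mean((y - y_cv) ** 2))
        rpd = np.std(y, ddof=1) / rmsep  # RPD > 2 is usually taken as good
        print(rpd)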

  9. Wastewater Biosolid Composting Optimization Based on UV-VNIR Spectroscopy Monitoring.

    PubMed

    Temporal-Lara, Beatriz; Melendez-Pastor, Ignacio; Gómez, Ignacio; Navarro-Pedreño, Jose

    2016-11-15

    Conventional wastewater treatment generates large amounts of organic matter-rich sludge that requires adequate treatment to avoid public health and environmental problems. The mixture of wastewater sludge and some bulking agents produces a biosolid to be composted at adequate composting facilities. The composting process is chemically and microbiologically complex and requires an adequate aeration of the biosolid (e.g., with a turner machine) for proper maturation of the compost. Adequate (near) real-time monitoring of the compost maturity process is highly difficult and the operation of composting facilities is not as automatized as other industrial processes. Spectroscopic analysis of compost samples has been successfully employed for compost maturity assessment but the preparation of the solid compost samples is difficult and time-consuming. This manuscript presents a methodology based on a combination of a less time-consuming compost sample preparation and ultraviolet, visible and short-wave near-infrared spectroscopy. Spectroscopic measurements were performed with liquid compost extract instead of solid compost samples. Partial least square (PLS) models were developed to quantify chemical fractions commonly employed for compost maturity assessment. Effective regression models were obtained for total organic matter (residual predictive deviation-RPD = 2.68), humification ratio (RPD = 2.23), total exchangeable carbon (RPD = 2.07) and total organic carbon (RPD = 1.66) with a modular and cost-effective visible and near infrared (VNIR) spectroradiometer. This combination of a less time-consuming compost sample preparation with a versatile sensor system provides an easy-to-implement, efficient and cost-effective protocol for compost maturity assessment and near-real-time monitoring.

  10. Genome-wide association analysis accounting for environmental factors through propensity-score matching: application to stressful live events in major depressive disorder.

    PubMed

    Power, Robert A; Cohen-Woods, Sarah; Ng, Mandy Y; Butler, Amy W; Craddock, Nick; Korszun, Ania; Jones, Lisa; Jones, Ian; Gill, Michael; Rice, John P; Maier, Wolfgang; Zobel, Astrid; Mors, Ole; Placentino, Anna; Rietschel, Marcella; Aitchison, Katherine J; Tozzi, Federica; Muglia, Pierandrea; Breen, Gerome; Farmer, Anne E; McGuffin, Peter; Lewis, Cathryn M; Uher, Rudolf

    2013-09-01

    Stressful life events are an established trigger for depression and may contribute to the heterogeneity within genome-wide association analyses. With depression cases showing an excess of exposure to stressful events compared to controls, it is difficult to distinguish between "true" cases and a "normal" response to a stressful environment. This potential contamination of cases, and that from genetically at-risk controls who have not yet experienced environmental triggers for onset, may reduce the power of studies to detect causal variants. In the RADIANT sample of 3,690 European individuals, we used propensity score matching to pair cases and controls on exposure to stressful life events. In 805 case-control pairs matched on stressful life events, we tested the influence of 457,670 common genetic variants on the propensity to depression under comparable levels of adversity with a sign test. While this analysis produced no significant findings after genome-wide correction for multiple testing, we outline a novel methodology and perspective for providing environmental context in genetic studies. We recommend contextualizing depression by incorporating environmental exposure into genome-wide analyses as a complementary approach to testing gene-environment interactions. Possible explanations for negative findings include a lack of statistical power due to small sample size and conditional effects, resulting from the low rate of adequate matching. Our findings underscore the importance of collecting information on environmental risk factors in studies of depression and other complex phenotypes, so that sufficient sample sizes are available to investigate their effect in genome-wide association analysis. Copyright © 2013 Wiley Periodicals, Inc.
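
    A minimal sketch of the matching-plus-sign-test idea on synthetic data and a single genetic variant. The covariates, caliper width, and greedy matching rule are illustrative assumptions, not the RADIANT protocol.

        import numpy as np
        from scipy.stats import binomtest
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 2000
        covariates = rng.normal(size=(n, 3))                         # e.g. age, sex, socioeconomic indicators
        stress = rng.binomial(1, 1 / (1 + np.exp(-covariates[:, 0])))  # exposure to stressful events
        case = rng.binomial(1, 0.4, size=n)                          # depression case/control status
        genotype = rng.binomial(2, 0.3, size=n)                      # risk-allele count at one variant

        # 1. Model the propensity of exposure from covariates.
        ps = LogisticRegression().fit(covariates, stress).predict_proba(covariates)[:, 1]

        # 2. Greedy nearest-neighbour matching of each case to an unused control on the score.
        cases = np.where(case == 1)[0]
        controls = list(np.where(case == 0)[0])
        pairs = []
        for i in cases:
            j = min(controls, key=lambda c: abs(ps[c] - ps[i]))
            if abs(ps[j] - ps[i]) < 0.01:          # caliper on the propensity score
                pairs.append((i, j))
                controls.remove(j)

        # 3. Sign test: within pairs, does the case carry more risk alleles than its control?
        diffs = [genotype[i] - genotype[j] for i, j in pairs]
        pos = sum(d > 0 for d in diffs)
        neg = sum(d < 0 for d in diffs)
        print(binomtest(pos, pos + neg, 0.5))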

  11. The Quality of Reporting of Measures of Precision in Animal Experiments in Implant Dentistry: A Methodological Study.

    PubMed

    Faggion, Clovis Mariano; Aranda, Luisiana; Diaz, Karla Tatiana; Shih, Ming-Chieh; Tu, Yu-Kang; Alarcón, Marco Antonio

    2016-01-01

    Information on precision of treatment-effect estimates is pivotal for understanding research findings. In animal experiments, which provide important information for supporting clinical trials in implant dentistry, inaccurate information may lead to biased clinical trials. The aim of this methodological study was to determine whether sample size calculation, standard errors, and confidence intervals for treatment-effect estimates are reported accurately in publications describing animal experiments in implant dentistry. MEDLINE (via PubMed), Scopus, and SciELO databases were searched to identify reports involving animal experiments with dental implants published from September 2010 to March 2015. Data from publications were extracted into a standardized form with nine items related to precision of treatment estimates and experiment characteristics. Data selection and extraction were performed independently and in duplicate, with disagreements resolved by discussion-based consensus. The chi-square and Fisher exact tests were used to assess differences in reporting according to study sponsorship type and impact factor of the journal of publication. The sample comprised reports of 161 animal experiments. Sample size calculation was reported in five (2%) publications. P values and confidence intervals were reported in 152 (94%) and 13 (8%) of these publications, respectively. Standard errors were reported in 19 (12%) publications. Confidence intervals were better reported in publications describing industry-supported animal experiments (P = .03) and in journals with a higher impact factor (P = .02). Information on precision of estimates is rarely reported in publications describing animal experiments in implant dentistry. This lack of information makes it difficult to evaluate whether the translation of animal research findings to clinical trials is adequate.
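
    For readers unfamiliar with the test used for these comparisons, the snippet below runs a Fisher exact test on a 2 x 2 table of confidence-interval reporting by sponsorship type; the counts are invented for illustration and are not the study's data.

        from scipy.stats import fisher_exact

        # Rows: industry-supported vs other sponsorship; columns: CI reported vs not.
        table = [[9, 41],     # hypothetical: 9 of 50 industry-supported reports gave CIs
                 [4, 107]]    # hypothetical: 4 of 111 other reports did
        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")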

  12. The Work Role Functioning Questionnaire v2.0 Showed Consistent Factor Structure Across Six Working Samples.

    PubMed

    Abma, Femke I; Bültmann, Ute; Amick Iii, Benjamin C; Arends, Iris; Dorland, Heleen F; Flach, Peter A; van der Klink, Jac J L; van de Ven, Hardy A; Bjørner, Jakob Bue

    2017-09-09

    Objective The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a person's health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands with mixed clinical conditions and job types, to evaluate the comparability of the scale structure. Methods Confirmatory factor and multi-group analyses were conducted in six cross-sectional working samples (total N = 2433) to evaluate and compare a five-factor model structure of the WRFQ (work scheduling demands, output demands, physical demands, mental and social demands, and flexibility demands). Model fit was evaluated against the criteria RMSEA ≤ 0.08 and CFI ≥ 0.95. After fitting the five-factor model, the multidimensional structure of the instrument was evaluated across samples using a second-order factor model. Results The factor structure was robust across samples and a multi-group model had adequate fit (RMSEA = 0.063, CFI = 0.972). In sample-specific analyses, minor modifications were necessary in three samples (final RMSEA 0.055-0.080, final CFI between 0.955 and 0.989). Applying the previous first-order specifications, a second-order factor model had adequate fit in all samples. Conclusion A five-factor model of the WRFQ showed consistent structural validity across samples. A second-order factor model showed adequate fit, but the second-order factor loadings varied across samples. Therefore, subscale scores are recommended when comparing across different clinical and working samples.
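
    One common large-sample point estimate of RMSEA, the fit index the criteria above refer to, can be computed from a model chi-square as sketched below; the example numbers are hypothetical, not the WRFQ results.

        import math

        def rmsea(chi2: float, df: int, n: int) -> float:
            """Point estimate of RMSEA from the model chi-square, its degrees of
            freedom, and the sample size (one standard large-sample formula)."""
            return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

        # Illustrative numbers only: chi2 = 461 on df = 80 fitted to N = 2433
        print(round(rmsea(461, 80, 2433), 3))   # ~0.044, below the 0.08 cut-off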

  13. SNPP VIIRS Spectral Bands Co-Registration and Spatial Response Characterization

    NASA Technical Reports Server (NTRS)

    Lin, Guoqing; Tilton, James C.; Wolfe, Robert E.; Tewari, Krishna P.; Nishihama, Masahiro

    2013-01-01

    The Visible Infrared Imager Radiometer Suite (VIIRS) instrument onboard the Suomi National Polar-orbiting Partnership (SNPP) satellite was launched on 28 October 2011. The VIIRS has 5 imagery spectral bands (I-bands), 16 moderate resolution spectral bands (M-bands) and a panchromatic day/night band (DNB). Performance of the VIIRS spatial response and band-to-band co-registration (BBR) was measured through intensive pre-launch tests. These measurements were made in the non-aggregated zones near the start (or end) of scan for the I-bands and M-bands and for a limited number of aggregation modes for the DNB in order to test requirement compliance. This paper presents results based on recently re-processed pre-launch test data. Sensor (detector) spatial impulse responses in the scan direction are parameterized in terms of ground dynamic field of view (GDFOV), horizontal spatial resolution (HSR), modulation transfer function (MTF), ensquared energy (EE) and integrated out-of-pixel (IOOP) spatial response. Results are presented for the non-aggregation, 2-sample and 3-sample aggregation zones for the I-bands and M-bands, and for a limited number of aggregation modes for the DNB. On-orbit GDFOVs measured for the 5 I-bands in the scan direction using a straight bridge are also presented. Band-to-band co-registration (BBR) is quantified using the prelaunch measured band-to-band offsets. These offsets may be expressed as fractions of horizontal sampling intervals (HSIs) or of the detector spatial response parameters GDFOV or HSR. BBR based on HSIs in the non-aggregation, 2-sample and 3-sample aggregation zones is presented. BBR matrices based on scan direction GDFOV and HSR are compared to the BBR matrix based on HSI in the non-aggregation zone. We demonstrate that BBR based on GDFOV is a better representation of footprint overlap and so this definition should be used in BBR requirement specifications. We propose that HSR not be used as the primary image quality indicator, since we show that it is neither an adequate representation of the size of sensor spatial response nor an adequate measure of imaging quality.

  14. Extent of genome-wide linkage disequilibrium in Australian Holstein-Friesian cattle based on a high-density SNP panel.

    PubMed

    Khatkar, Mehar S; Nicholas, Frank W; Collins, Andrew R; Zenger, Kyall R; Cavanagh, Julie A L; Barris, Wes; Schnabel, Robert D; Taylor, Jeremy F; Raadsma, Herman W

    2008-04-24

    The extent of linkage disequilibrium (LD) within a population determines the number of markers that will be required for successful association mapping and marker-assisted selection. Most studies on LD in cattle reported to date are based on microsatellite markers or small numbers of single nucleotide polymorphisms (SNPs) covering one or only a few chromosomes. This is the first comprehensive study on the extent of LD in cattle, analyzing data on 1,546 Holstein-Friesian bulls genotyped for 15,036 SNP markers covering all regions of all autosomes. Furthermore, most studies in cattle have used relatively small sample sizes and, consequently, may have had biased estimates of measures commonly used to describe LD. We examine minimum sample sizes required to estimate LD without bias and loss in accuracy. Finally, relatively little information is available on comparative LD structures including other mammalian species such as human and mouse, and we compare LD structure in cattle with public-domain data from both human and mouse. We computed three LD estimates, D', Dvol and r2, for 1,566,890 syntenic SNP pairs and a sample of 365,400 non-syntenic pairs. Mean D' is 0.189 among syntenic SNPs, and 0.105 among non-syntenic SNPs; mean r2 is 0.024 among syntenic SNPs and 0.0032 among non-syntenic SNPs. All three measures of LD for syntenic pairs decline with distance; the decline is much steeper for r2 than for D' and Dvol. The values of D' and Dvol are quite similar. Significant LD in cattle extends to 40 kb (when estimated as r2) and 8.2 Mb (when estimated as D'). The mean values for LD at large physical distances are close to those for non-syntenic SNPs. The minor allele frequency threshold affects the distribution and extent of LD. For unbiased and accurate estimates of LD across marker intervals spanning < 1 kb to > 50 Mb, minimum sample sizes of 400 (for D') and 75 (for r2) are required. The bias due to small sample sizes increases with the inter-marker interval. LD in cattle is much less extensive than in a mouse population created from crossing inbred lines, and more extensive than in humans. For association mapping in Holstein-Friesian cattle, for a given design, at least one SNP is required for each 40 kb, giving a total requirement of at least 75,000 SNPs for a low power whole-genome scan (median r2 > 0.19) and up to 300,000 markers at 10 kb intervals for a high power genome scan (median r2 > 0.62). For estimation of LD by D' and Dvol with sufficient precision, a sample size of at least 400 is required, whereas for r2 a minimum sample of 75 is adequate.
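
    The two standard LD measures reported above, D' and r2, can be computed from a haplotype frequency and the two allele frequencies as in this minimal sketch (biallelic loci assumed; the example frequencies are invented).

        def ld_measures(p_ab: float, p_a: float, p_b: float):
            """D' and r^2 for two biallelic loci from the A-B haplotype
            frequency p_ab and the allele frequencies p_a and p_b."""
            d = p_ab - p_a * p_b                      # raw disequilibrium coefficient
            if d >= 0:
                d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
            else:
                d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
            d_prime = abs(d) / d_max                  # D normalised by its maximum
            r2 = d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
            return d_prime, r2

        print(ld_measures(p_ab=0.28, p_a=0.4, p_b=0.5))   # -> (0.4, ~0.107)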

  15. Direct determination of potentially toxic elements in rice by SS-GF AAS: development of methods and applications.

    PubMed

    Silvestre, Daniel Menezes; Nomura, Cassiana Seimi

    2013-07-03

    The development of methods for direct determinations of Al, Cd, and Pb in rice by SS-GF AAS is presented. Heating program optimization associated with the use of an adequate chemical modifier containing Pd + Mg allowed direct analysis against aqueous calibrations. The obtained LOD values were 114.0, 3.0, and 16.0 μg kg⁻¹ for Al, Cd, and Pb, respectively. Important parameters associated with a solid sampling analysis were investigated, such as minimum and maximum sample mass and analyte segregation. Seventeen rice samples available in São Paulo City were analyzed, and all of them presented analyte mass fractions less than the maximum allowed by legislation. The influences of rice washing and the cooking procedure were also investigated. The washing procedure diminished the Al and Pb total mass fractions, indicating an exogenous grain contamination. The cooking procedure diminished the Cd total mass fraction. Rice cooking using an aluminum container did not cause a significant increase in the Al mass fraction in the rice, indicating no translocation of this element from container to food. In general, coarse rice presented higher levels of Al compared with polished or parboiled rice.

  16. Aerothermodynamic Environment Definition for the Genesis Sample Return Capsule

    NASA Technical Reports Server (NTRS)

    Cheatwood, F. McNeil; Merski, N. Ronald, Jr.; Riley, Christopher J.; Mitcheltree, Robert A.

    2001-01-01

    NASA's Genesis sample return mission will be the first to return material from beyond the Earth-Moon system. NASA Langley Research Center supported this mission with aerothermodynamic analyses of the sample return capsule. This paper provides an overview of that effort. The capsule is attached through its forebody to the spacecraft bus. When the attachment is severed prior to Earth entry, forebody cavities remain. The presence of these cavities could dramatically increase the heating environment in their vicinity and downstream. A combination of computational fluid dynamics calculations and wind tunnel phosphor thermography tests was employed to address this issue. These results quantify the heating environment in and around the cavities, and were a factor in the decision to switch forebody heat shield materials. A transition map is developed which predicts that the flow aft of the penetrations will still be laminar at the peak heating point of the trajectory. As the vehicle continues along the trajectory to the peak dynamic pressure point, fully turbulent flow aft of the penetrations could occur. The integrated heat load calculations show that a heat shield sized to the stagnation point levels will be adequate for the predicted environment aft of the penetrations.

  17. Shock wave plasma induced emission generated by low energy nanosecond Nd:YAG laser in open air and its application to quantitative Cr analysis of low alloy steel

    NASA Astrophysics Data System (ADS)

    Idris, Nasrullah; Pardede, Marincan; Kurniawan, Koo Hendrik; Kagawa, Kiichiro; Tjia, May On

    2018-05-01

    We report the result of an experimental study that shows the remarkable benefits of generating a micro shock wave plasma by low energy (800 μJ) nanosecond (ns) Nd:YAG laser irradiation on a solid target in open air and the efficient detection of the induced plasma emission. The very low irradiation power density of 0.8 MW/cm2 produced by the slightly defocused laser beam gives the additional advantage of a rather wide crater size of 400 μm on the sample surface, thus enabling analysis of the average composition and reducing the ion production responsible for the undesirable emission background as well as the Stark broadening effect, leading to largely improved spectral quality. This is corroborated by spectra measured from a number of metal samples, which display sharp emission lines with low background. Specifically, its application to Cr analysis of a series of low alloy steel samples with different Cr concentrations is shown to yield a linear calibration line of adequate dynamic range and an estimated detection limit of about 10 ppm.

  18. Vertical accretion sand proxies of gaged floods along the upper Little Tennessee River, Blue Ridge Mountains, USA

    NASA Astrophysics Data System (ADS)

    Leigh, David S.

    2018-02-01

    Understanding environmental hazards presented by river flooding has been enhanced by paleoflood analysis, which uses sedimentary records to document floods beyond historical records. Bottomland overbank deposits (e.g., natural levees, floodbasins, meander scars, low terraces) have the potential to serve as continuous paleoflood archives of flood frequency and magnitude, but they have been under-utilized because of uncertainty about whether flood-magnitude estimates can be derived from them. The purpose of this paper is to provide a case study that illuminates the tremendous potential of bottomland overbank sediments as reliable proxies of both flood frequency and magnitude. Methods involve correlation of particle-size measurements of the coarse tail of overbank deposits (> 0.25 mm sand) from three separate sites with historical flood discharge records for the upper Little Tennessee River in the Blue Ridge Mountains of the southeastern United States. Results show that essentially all floods larger than a 20% probability event can be detected by the coarse tail of particle-size distributions, especially if the temporal resolution of sampling is annual or sub-annual. Coarser temporal resolution (1.0 to 2.5 year sample intervals) provides an adequate record of large floods, but is unable to discriminate individual floods separated by only one to three years. Measurements of > 0.25 mm sand that are normalized against a smoothed trend line through the down-column data produce highly significant correlations (R2 values of 0.50 to 0.60 with p-values of 0.004 to < 0.001) between sand peak values and flood peak discharges, indicating that flood magnitude can be reliably estimated. In summary, bottomland overbank deposits can provide excellent continuous records of paleofloods when the following conditions are met: 1) Stable depositional sites should be chosen; 2) Analysis should concentrate on the coarse tails of particle-size distributions; 3) Sampling of sediment intervals should achieve annual or better resolution; 4) Time-series data of particle-size should be detrended to minimize variation from dynamic aspects of fluvial sedimentation that are not related to flood magnitude; and 5) Multiple sites should be chosen to allow for replication of findings.
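
    A minimal sketch of the detrending-and-correlation step described in condition 4 above, using synthetic data: each sand measurement is normalised against an 11-year moving-average trend before regression on peak discharge. The window length and all data are assumptions, not the study's choices.

        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(2)
        years = np.arange(1940, 2015)
        peak_q = rng.lognormal(mean=5.0, sigma=0.4, size=years.size)         # annual peak discharge
        sand = 2.0 + 0.01 * peak_q + rng.normal(scale=0.3, size=years.size)  # % sand > 0.25 mm

        # Detrend: normalise each sample against a smoothed down-column trend
        # (an 11-year moving average) to suppress non-flood sedimentation signals.
        kernel = np.ones(11) / 11
        trend = np.convolve(sand, kernel, mode="same")
        normalised = sand / trend

        fit = linregress(normalised, peak_q)
        print(f"R^2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.3g}")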

  19. Note: Small anaerobic chamber for optical spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chauvet, Adrien A. P., E-mail: adrien.chauvet@gmail.com; Chergui, Majed; Agarwal, Rachna

    2015-10-15

    The study of oxygen-sensitive biological samples requires an effective control of the atmosphere in which they are housed. To this aim, however, no commercial anaerobic chamber is both adequate to solely enclose the sample and small enough to fit in a compact spectroscopic system with which the analysis can be performed. Furthermore, spectroscopic analysis requires the probe beam to pass through the whole chamber, introducing a requirement for adequate windows. In response to these challenges, we present a 1 l anaerobic chamber that is suitable for broad-band spectroscopic analysis. This chamber has the advantage of (1) providing access to the sample via a septum and (2) allowing the sample position to be adjusted while keeping the chamber fixed and hermetic during the experiment.

  20. Assessing psychosocial correlates of parental safety behaviour using Protection Motivation Theory: stair gate presence and use among parents of toddlers.

    PubMed

    Beirens, T M J; Brug, J; van Beeck, E F; Dekker, R; den Hertog, P; Raat, H

    2008-08-01

    Unintentional injury due to falls is one of the main reasons for hospitalization among children 0-4 years of age. The goal of this study was to assess the psychosocial correlates of parental safety behaviours to prevent falls from a staircase due to the absence, or inadequate use, of a stair gate. Data were collected from a cross-sectional survey using self-administered questionnaires mailed to a population sample of 2470 parents with toddlers. Associations between self-reported habits on the presence and use of stair gates and family and psychosocial factors were analysed, using descriptive statistics and multiple regression models, based on Protection Motivation Theory. The presence of stair gates was associated with family situation, perceived vulnerability, response efficacy, social norms and descriptive norms. The use of stair gates was associated with family situation, response efficacy, self-efficacy and perceived advantages of safe behaviour. The full model explained 32 and 24% of the variance in the presence of stair gates and the use of stair gates, respectively, indicating a large and medium effect size. Programmes promoting the presence and adequate use of stair gates should address the family situation, personal cognitive factors as well as social factors.

  1. Assessing diet in populations at risk for konzo and neurolathyrism.

    PubMed

    Dufour, Darna L

    2011-03-01

    Although both konzo and neurolathyrism are diseases associated with diet, we know surprisingly little about the diets of the groups at risk. The objective of this paper is to discuss methods for assessing dietary intake in populations at risk for konzo and lathyrism. These methods include weighed food records and interview-based techniques like 24-h recalls and food frequency questionnaires (FFQs). Food records have the potential to provide accurate information on food quantities, and are generally the method of choice. Interview-based methods provide less precise information on the quantities of foods ingested, and are subject to recall bias, but may be useful in some studies or for surveillance. Sample size needs to be adequate to account for day-to-day and seasonal variability in food intake, and differences between age and sex groups. Adequate data on the composition of foods, as actually consumed, are needed to evaluate the food intake information. This is especially important in the case of cassava and grass pea, where the toxin content of the diet is a function of processing. Biomarkers for assessing the cyanogen exposure from cassava-based diets are available; biomarkers for the β-ODAP exposure from grass pea diets need development. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Assessing psychosocial correlates of parental safety behaviour using Protection Motivation Theory: stair gate presence and use among parents of toddlers

    PubMed Central

    Beirens, T. M. J.; Brug, J.; van Beeck, E. F.; Dekker, R.; den Hertog, P.; Raat, H.

    2008-01-01

    Unintentional injury due to falls is one of the main reasons for hospitalization among children 0–4 years of age. The goal of this study was to assess the psychosocial correlates of parental safety behaviours to prevent falls from a staircase due to the absence, or inadequate use, of a stair gate. Data were collected from a cross-sectional survey using self-administered questionnaires mailed to a population sample of 2470 parents with toddlers. Associations between self-reported habits on the presence and use of stair gates and family and psychosocial factors were analysed, using descriptive statistics and multiple regression models, based on Protection Motivation Theory. The presence of stair gates was associated with family situation, perceived vulnerability, response efficacy, social norms and descriptive norms. The use of stair gates was associated with family situation, response efficacy, self-efficacy and perceived advantages of safe behaviour. The full model explained 32 and 24% of the variance in the presence of stair gates and the use of stair gates, respectively, indicating a large and medium effect size. Programmes promoting the presence and adequate use of stair gates should address the family situation, personal cognitive factors as well as social factors. PMID:17947245

  3. Genetics of preeclampsia: what are the challenges?

    PubMed

    Bernard, Nathalie; Giguère, Yves

    2003-07-01

    Despite recent efforts to identify susceptibility genes of preeclampsia, the genetic determinants of the condition remain ill-defined, as is the case for most disorders with complex inheritance patterns. The angiotensinogen, factor V, and methylenetetrahydrofolate reductase genes have been investigated in different populations, as have other genes involved in blood pressure, vascular volume control, thrombophilia, lipid metabolism, oxidative stress, and endothelial dysfunction. The study of the genetics of complex traits is faced with both methodological and genetic issues; these include adequate sample sizes to allow identification of modest genetic effects and of gene-gene and gene-environment interactions, the study of adequate quantitative traits and extreme phenotypes, haplotype analyses, statistical genetics, genome-wide (hypothesis-free) versus candidate-gene (hypothesis-driven) approaches, and the validation of positive associations. The use of genetically well-characterized populations showing a founder effect, such as the French-Canadian population of Quebec, in genetic association studies, may help to unravel the susceptibility genes of disorders showing complex inheritance, such as preeclampsia. It is necessary to better evaluate the role of the fetal genome in the resulting predisposition to preeclampsia and its complications. Eventually, we may be able to integrate genetic information to better identify the women at risk of developing preeclampsia, and to improve the management of those suffering from this condition.

  4. "When you're in the hospital, you're in a sort of bubble." Understanding the high risk of self-harm and suicide following psychiatric discharge: a qualitative study.

    PubMed

    Owen-Smith, Amanda; Bennewith, Olive; Donovan, Jenny; Evans, Jonathan; Hawton, Keith; Kapur, Nav; O'Connor, Susan; Gunnell, David

    2014-01-01

    Individuals are at a greatly increased risk of suicide and self-harm in the months following discharge from psychiatric hospital, yet little is known about the reasons for this. This study aimed to investigate the lived experience of psychiatric discharge and to explore service users' experiences following discharge. In-depth interviews were undertaken with recently discharged service users (n = 10) in the UK to explore attitudes to discharge and experiences since leaving hospital. Informants had mixed attitudes to discharge, and those who had not felt adequately involved in discharge decisions, or disagreed with them, had experienced urges to self-harm since being discharged. Accounts revealed a number of factors that made the postdischarge period difficult; these included both the reemergence of stressors that existed prior to hospitalization and a number of stressors that were prompted or exacerbated by hospitalization. Although inferences that can be drawn from the study are limited by the small sample size, the results draw attention to a number of factors that could be investigated further to help explain the high risk of suicide and self-harm following psychiatric discharge. Findings emphasize the importance of adequate preparation for discharge and the maintenance of ongoing relationships with known service providers where possible.

  5. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  6. INSTRUMENTS MEASURING PERCEIVED RACISM/RACIAL DISCRIMINATION: REVIEW AND CRITIQUE OF FACTOR ANALYTIC TECHNIQUES

    PubMed Central

    Atkins, Rahshida

    2015-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis. PMID:25626225

  7. The Inheritance of Metabolic Flux: Expressions for the within-Sibship Mean and Variance Given the Parental Genotypes

    PubMed Central

    Ward, P. J.

    1990-01-01

    Recent developments have related quantitative trait expression to metabolic flux. The present paper investigates some implications of this for statistical aspects of polygenic inheritance. Expressions are derived for the within-sibship genetic mean and genetic variance of metabolic flux given a pair of parental, diploid, n-locus genotypes. These are exact and hold for arbitrary numbers of gene loci, arbitrary allelic values at each locus, and for arbitrary recombination fractions between adjacent gene loci. The within-sibship, genetic variance is seen to be simply a measure of parental heterozygosity plus a measure of the degree of linkage coupling within the parental genotypes. Approximations are given for the within-sibship phenotypic mean and variance of metabolic flux. These results are applied to the problem of attaining adequate statistical power in a test of association between allozymic variation and inter-individual variation in metabolic flux. Simulations indicate that statistical power can be greatly increased by augmenting the data with predictions and observations on progeny statistics in relation to parental allozyme genotypes. Adequate power may thus be attainable at small sample sizes, and when allozymic variation is scored at only a small fraction of the total set of loci whose catalytic products determine the flux. PMID:2379825

  8. A review of reporting of participant recruitment and retention in RCTs in six major journals

    PubMed Central

    Toerien, Merran; Brookes, Sara T; Metcalfe, Chris; de Salis, Isabel; Tomlin, Zelda; Peters, Tim J; Sterne, Jonathan; Donovan, Jenny L

    2009-01-01

    Background Poor recruitment and retention of participants in randomised controlled trials (RCTs) is problematic but common. Clear and detailed reporting of participant flow is essential to assess the generalisability and comparability of RCTs. Despite improved reporting since the implementation of the CONSORT statement, important problems remain. This paper aims: (i) to update and extend previous reviews evaluating reporting of participant recruitment and retention in RCTs; (ii) to quantify the level of participation throughout RCTs. Methods We reviewed all reports of RCTs of health care interventions and/or processes with individual randomisation, published July–December 2004 in six major journals. Short, secondary or interim reports, and Phase I/II trials were excluded. Data recorded were: general RCT details; inclusion of flow diagram; participant flow throughout trial; reasons for non-participation/withdrawal; target sample sizes. Results 133 reports were reviewed. Overall, 79% included a flow diagram, but over a third were incomplete. The majority reported the flow of participants at each stage of the trial after randomisation. However, 40% failed to report the numbers assessed for eligibility. Percentages of participants retained at each stage were high: for example, 90% of eligible individuals were randomised, and 93% of those randomised were outcome assessed. On average, trials met their sample size targets. However, there were some substantial shortfalls: for example 21% of trials reporting a sample size calculation failed to achieve adequate numbers at randomisation, and 48% at outcome assessment. Reporting of losses to follow up was variable and difficult to interpret. Conclusion The majority of RCTs reported the flow of participants well after randomisation, although only two-thirds included a complete flow chart and there was great variability over the definition of "lost to follow up". Reporting of participant eligibility was poor, making assessments of recruitment practice and external validity difficult. Reporting of participant flow throughout RCTs could be improved by small changes to the CONSORT chart. PMID:19591685

  9. A review of reporting of participant recruitment and retention in RCTs in six major journals.

    PubMed

    Toerien, Merran; Brookes, Sara T; Metcalfe, Chris; de Salis, Isabel; Tomlin, Zelda; Peters, Tim J; Sterne, Jonathan; Donovan, Jenny L

    2009-07-10

    Poor recruitment and retention of participants in randomised controlled trials (RCTs) is problematic but common. Clear and detailed reporting of participant flow is essential to assess the generalisability and comparability of RCTs. Despite improved reporting since the implementation of the CONSORT statement, important problems remain. This paper aims: (i) to update and extend previous reviews evaluating reporting of participant recruitment and retention in RCTs; (ii) to quantify the level of participation throughout RCTs. We reviewed all reports of RCTs of health care interventions and/or processes with individual randomisation, published July-December 2004 in six major journals. Short, secondary or interim reports, and Phase I/II trials were excluded. Data recorded were: general RCT details; inclusion of flow diagram; participant flow throughout trial; reasons for non-participation/withdrawal; target sample sizes. 133 reports were reviewed. Overall, 79% included a flow diagram, but over a third were incomplete. The majority reported the flow of participants at each stage of the trial after randomisation. However, 40% failed to report the numbers assessed for eligibility. Percentages of participants retained at each stage were high: for example, 90% of eligible individuals were randomised, and 93% of those randomised were outcome assessed. On average, trials met their sample size targets. However, there were some substantial shortfalls: for example 21% of trials reporting a sample size calculation failed to achieve adequate numbers at randomisation, and 48% at outcome assessment. Reporting of losses to follow up was variable and difficult to interpret. The majority of RCTs reported the flow of participants well after randomisation, although only two-thirds included a complete flow chart and there was great variability over the definition of "lost to follow up". Reporting of participant eligibility was poor, making assessments of recruitment practice and external validity difficult. Reporting of participant flow throughout RCTs could be improved by small changes to the CONSORT chart.

  10. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
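
    Under the no-bias assumption discussed above, the confidence intervals take a simple form; the sketch below shows one plausible construction for a new measurement and for change over time, with hypothetical numbers. It is a sketch of the general idea, not the paper's exact procedure.

        import math

        Z = 1.96  # 95% two-sided normal quantile

        def ci_new_measurement(x: float, within_sd: float):
            """CI for a patient's true biomarker value, assuming no bias and a
            known within-subject (test-retest) standard deviation."""
            return (x - Z * within_sd, x + Z * within_sd)

        def ci_change(x1: float, x2: float, within_sd: float):
            """CI for true change over time; the variance of a difference of two
            independent measurements is twice the within-subject variance."""
            half = Z * math.sqrt(2.0) * within_sd
            return (x2 - x1 - half, x2 - x1 + half)

        print(ci_new_measurement(12.0, within_sd=0.8))   # e.g., tumour volume in mL
        print(ci_change(12.0, 9.5, within_sd=0.8))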

  11. COMPARISON OF ECOLOGICAL COMMUNITIES: THE PROBLEM OF SAMPLE REPRESENTATIVENESS

    EPA Science Inventory

    Obtaining an adequate, representative sample of ecological communities to make taxon richness (TR) or compositional comparisons among sites is a continuing challenge. Sample representativeness literally means the similarity in species composition and relative abundance between a ...

  12. An improved correlation procedure for subsize and full-size Charpy impact specimen data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokolov, M.A.; Alexander, D.J.

    1997-03-01

    The possibility of using subsize specimens to monitor the properties of reactor pressure vessel steels is receiving increasing attention for light-water reactor plant life extension. This potential results from the possibility of cutting samples of small volume from the internal surface of the pressure vessel for determination of the actual properties of the operating pressure vessel. In addition, plant life extension will require supplemental data that cannot be provided by existing surveillance programs. Testing of subsize specimens manufactured from broken halves of previously tested surveillance Charpy specimens offers an attractive means of extending existing surveillance programs. Using subsize Charpy V-notch-type specimens requires the establishment of a specimen geometry that is adequate to obtain a ductile-to-brittle transition curve similar to that obtained from full-size specimens, and the development of correlations for transition temperature and upper-shelf energy (USE) level between subsize and full-size specimens. Five different geometries of subsize specimens were selected for testing and evaluation. The specimens were made from several types of pressure vessel steels with a wide range of yield strengths, transition temperatures, and USEs. The effects of specimen dimensions, including notch depth, angle, and radius, have been studied. The correlations of transition temperatures determined from different types of subsize specimens and the full-size specimens are presented. A new procedure for transforming data from subsize specimens is developed. The transformed data are in good agreement with data from full-size specimens for materials that have USE levels less than 200 J.

  13. Naming Speed in Dyslexia and Dyscalculia

    ERIC Educational Resources Information Center

    Willburger, Edith; Fussenegger, Barbara; Moll, Kristina; Wood, Guilherme; Landerl, Karin

    2008-01-01

    In four carefully selected samples of 8- to 10-year-old children with dyslexia (but age-adequate arithmetic skills), dyscalculia (but age-adequate reading skills), dyslexia/dyscalculia and controls, a domain-general deficit in rapid automatized naming (RAN) was found for both dyslexia groups. Dyscalculic children exhibited a domain-specific deficit…

  14. The Leap of a Provincial SME into the Global Market Using E-commerce: The Success of Adequate Planning

    NASA Astrophysics Data System (ADS)

    Sainz de Abajo, Beatriz; García Salcines, Enrique; Burón Fernández, F. Javier; López Coronado, Miguel; de Castro Lozano, Carlos

    The leap into the global market is not easy when it involves a provincial family business. This article demonstrates how adequate planning is fundamental in a small and medium-sized enterprise (SME), given the tight budget available, in order to differentiate itself in a highly competitive market, taking into account the benefits and risks involved. The Information Technology (IT) tools put in place will give the necessary support and allow for the possibility of increasing and improving the infrastructure as the company requires. An adequate strategy to increase future sales would combine e-marketing techniques with the current promotions that help diffuse the brand.

  15. Predicting Patients with Inadequate 24- or 48-Hour Urine Collections at Time of Metabolic Stone Evaluation.

    PubMed

    McGuire, Barry B; Bhanji, Yasin; Sharma, Vidit; Frainey, Brendan T; McClean, Megan; Dong, Caroline; Rimar, Kalen; Perry, Kent T; Nadler, Robert B

    2015-06-01

    We aimed to understand the characteristics of patients who are less likely to submit adequate urine collections at metabolic stone evaluation. Inadequate urine collection was defined using two definitions: (1) Reference ranges for 24-hour creatinine/kilogram (Cr/24) and (2) discrepancy in total 24-hour urine Cr between 24-hour urine collections. A total of 1502 patients with ≥1 kidney stone who performed a 24- or 48-hour urine collection at Northwestern Memorial Hospital between 1998 and 2014 were identified retrospectively. Multivariate analysis was performed to analyze predictor variables for adequate urine collection. A total of 2852 urine collections were analyzed. Mean age for males was 54.4 years (range 17-86), and for females was 50.2 years (range 8-90). One patient in the study was younger than 17 years old. (1) Analysis based on the Cr 24/kg definition: There were 50.7% of patients who supplied an inadequate sample. Females were nearly 50% less likely to supply an adequate sample compared with men, P<0.001. Diabetes (odds ratio [OR] 1.42 [1.04-1.94], P=0.026) and vitamin D supplementation (OR 0.64 [0.43-0.95], P=0.028) predicted receiving an adequate/inadequate sample, respectively. (2) Analysis based on differences between total urinary Cr: The model was stratified based on percentage differences between samples up to 50%. At 10%, 20%, 30%, 40%, and 50% differences, inadequate collections were achieved in 82.8%, 66.9%, 51.7%, 38.5%, and 26.4% of patients, respectively. Statistical significance was observed based on differences of ≥40%, and this was defined as the threshold for an inadequate sample. Female sex (OR 0.73 [0.54-0.98], P=0.037) predicted supplying inadequate samples. Adequate collections were more likely to be received on a Sunday (OR 1.6 [1.03-2.58], P=0.038) and by sedentary workers (OR 2.3 [1.12-4.72], P=0.023). Urine collections from patients during metabolic evaluation for nephrolithiasis may be considered inadequate based on two commonly used clinical definitions. This may have therapeutic or economic ramifications, and the propensity for females to supply inadequate samples should be investigated further.
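
    The two adequacy definitions can be expressed as simple checks, as sketched below. The per-kilogram creatinine reference ranges are commonly quoted clinical values inserted here as assumptions, not the paper's exact cut-offs; the 40% discrepancy threshold follows the abstract.

        def collection_adequate(urine_cr_mg: float, weight_kg: float, sex: str) -> bool:
            """First definition: is total 24-h urinary creatinine within a
            per-kilogram reference range?  Ranges below are assumed, commonly
            quoted values (mg/kg/day), not the study's published cut-offs."""
            cr_per_kg = urine_cr_mg / weight_kg
            low, high = (20.0, 25.0) if sex == "M" else (15.0, 20.0)
            return low <= cr_per_kg <= high

        def collections_consistent(cr_day1_mg: float, cr_day2_mg: float) -> bool:
            """Second definition: two 24-h collections whose total creatinine
            differs by >= 40% are treated as inadequate."""
            diff = abs(cr_day1_mg - cr_day2_mg) / max(cr_day1_mg, cr_day2_mg)
            return diff < 0.40

        print(collection_adequate(1500, 70, "M"))   # 21.4 mg/kg -> True (adequate)
        print(collections_consistent(1500, 850))    # ~43% apart -> False (inadequate)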

  16. 7 CFR Exhibit J to Subpart A of... - Manufactured Home Sites, Rental Projects and Subdivisions: Development, Installation and Set-Up

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... established frost line without exceeding the safe bearing capacity of the supporting soil. Set-Up. The work... architectural practices and shall provide for all utilities in a manner which allows adequate, economic, safe... residential environment which is an asset to the community in which it is located. 4. Lot Size. The size of...

  17. GOST: A generic ordinal sequential trial design for a treatment trial in an emerging pandemic.

    PubMed

    Whitehead, John; Horby, Peter

    2017-03-01

    Conducting clinical trials to assess experimental treatments for potentially pandemic infectious diseases is challenging. Since many outbreaks of infectious diseases last only six to eight weeks, there is a need for trial designs that can be implemented rapidly in the face of uncertainty. Outbreaks are sudden and unpredictable, so it is essential that as much planning as possible takes place in advance. Statistical aspects of such trial designs should be evaluated and discussed in readiness for implementation. This paper proposes a generic ordinal sequential trial design (GOST) for a randomised clinical trial comparing an experimental treatment for an emerging infectious disease with standard care. The design is intended as an off-the-shelf, ready-to-use, robust and flexible option. The primary endpoint is a categorisation of patient outcome according to an ordinal scale. A sequential approach is adopted, stopping as soon as it is clear that the experimental treatment has an advantage or that sufficient advantage is unlikely to be detected. The properties of the design are evaluated using large-sample theory and verified for moderate-sized samples using simulation. The trial is powered to detect a generic clinically relevant difference: namely an odds ratio of 2 for better rather than worse outcomes. Total sample sizes (across both treatments) of between 150 and 300 patients prove to be adequate in many cases, but the precise value depends on both the magnitude of the treatment advantage and the nature of the ordinal scale. An advantage of the approach is that any erroneous assumptions made at the design stage about the proportion of patients falling into each outcome category have little effect on the error probabilities of the study, although they can lead to inaccurate forecasts of sample size. It is important and feasible to pre-determine many of the statistical aspects of an efficient trial design in advance of a disease outbreak. The design can then be tailored to the specific disease under study once its nature is better understood.
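
    A small simulation in the spirit of the design's power evaluation: ordinal outcomes are generated under a proportional-odds shift of odds ratio 2, and a Mann-Whitney test stands in for the sequential procedure. The four-category outcome scale and trial size are illustrative assumptions, not the GOST specification.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(3)

        p_control = np.array([0.20, 0.30, 0.30, 0.20])   # ordinal scale, worst -> best
        OR = 2.0                                          # generic target: odds ratio 2

        # Shift the control distribution by a proportional-odds effect of log(OR).
        cum = np.cumsum(p_control)[:-1]
        cum_trt = cum / (cum + OR * (1 - cum))            # treated cumulative probabilities
        p_treat = np.diff(np.concatenate(([0.0], cum_trt, [1.0])))

        def one_trial(n_per_arm: int) -> bool:
            ctl = rng.choice(4, size=n_per_arm, p=p_control)
            trt = rng.choice(4, size=n_per_arm, p=p_treat)
            return mannwhitneyu(trt, ctl, alternative="greater").pvalue < 0.025

        power = np.mean([one_trial(150) for _ in range(2000)])
        print(f"estimated power at 300 patients total: {power:.2f}")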

  18. Detection probability in aerial surveys of feral horses

    USGS Publications Warehouse

    Ransom, Jason I.

    2011-01-01

    Observation bias pervades data collected during aerial surveys of large animals, and although some sources can be mitigated with informed planning, others must be addressed using valid sampling techniques that carefully model detection probability. Nonetheless, aerial surveys are frequently employed to count large mammals without applying such methods to account for heterogeneity in visibility of animal groups on the landscape. This often leaves managers and interest groups at odds over decisions that are not adequately informed. I analyzed detection of feral horse (Equus caballus) groups by dual independent observers from 24 fixed-wing and 16 helicopter flights using mixed-effect logistic regression models to investigate potential sources of observation bias. I accounted for observer skill, population location, and aircraft type in the model structure and analyzed the effects of group size, sun effect (position related to observer), vegetation type, topography, cloud cover, percent snow cover, and observer fatigue on detection of horse groups. The most important model-averaged effects for both fixed-wing and helicopter surveys included group size (fixed-wing: odds ratio = 0.891, 95% CI = 0.850–0.935; helicopter: odds ratio = 0.640, 95% CI = 0.587–0.698) and sun effect (fixed-wing: odds ratio = 0.632, 95% CI = 0.350–1.141; helicopter: odds ratio = 0.194, 95% CI = 0.080–0.470). Observer fatigue was also an important effect in the best model for helicopter surveys, with detection probability declining after 3 hr of survey time (odds ratio = 0.278, 95% CI = 0.144–0.537). Biases arising from sun effect and observer fatigue can be mitigated by pre-flight survey design. Other sources of bias, such as those arising from group size, topography, and vegetation can only be addressed by employing valid sampling techniques such as double sampling, mark–resight (batch-marked animals), mark–recapture (uniquely marked and identifiable animals), sightability bias correction models, and line transect distance sampling; however, some of these techniques may still only partially correct for negative observation biases.
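
    One of the corrections named above, a sightability-style adjustment, weights each observed group by the inverse of its modelled detection probability (a Horvitz-Thompson-type estimator). The logistic coefficients below are illustrative stand-ins, not the fitted values from this study.

        import numpy as np

        def p_detect(group_size: int, sun_glare: bool) -> float:
            """Detection probability from a hypothetical logistic model of
            detection on group size and sun effect (coefficients assumed)."""
            logit = 2.0 - 0.11 * group_size - 0.46 * sun_glare
            return 1.0 / (1.0 + np.exp(-logit))

        groups = [(3, False), (12, True), (7, False), (25, False)]  # (size, glare)
        p = np.array([p_detect(s, g) for s, g in groups])
        sizes = np.array([s for s, _ in groups])

        # Weight each seen group by 1/p to estimate total abundance,
        # including animals in groups that were likely missed.
        print("naive count:", sizes.sum())
        print("corrected estimate:", (sizes / p).sum().round(1))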

  19. Critical appraisal of fundamental items in approved clinical trial research proposals in Mashhad University of Medical Sciences

    PubMed Central

    Shakeri, Mohammad-Taghi; Taghipour, Ali; Sadeghi, Masoumeh; Nezami, Hossein; Amirabadizadeh, Ali-Reza; Bonakchi, Hossein

    2017-01-01

    Background: Writing, designing, and conducting a clinical trial research proposal has an important role in achieving valid and reliable findings. Thus, this study aimed at critically appraising fundamental information in approved clinical trial research proposals in Mashhad University of Medical Sciences (MUMS) from 2008 to 2014. Methods: This cross-sectional study was conducted on all 935 approved clinical trial research proposals in MUMS from 2008 to 2014. A valid, reliable, comprehensive, simple, and usable checklist consisting of 11 main items, developed in sessions with biostatisticians and methodologists, was used as the research tool. Agreement rate between the reviewers of the proposals, who were responsible for data collection, was assessed during 3 sessions, and the Kappa statistic calculated at the last session was 97%. Results: More than 60% of the research proposals had a methodologist consultant; moreover, the type of study or study design was specified in almost all of them (98%). Alignment of study aims with hypotheses was lacking in a significant number of research proposals (585 proposals, 62.6%). The required sample size for 66.8% of the approved proposals was based on a sample size formula; however, in 25% of the proposals, the sample size formula was not in accordance with the study design. Data collection tool was not selected appropriately in 55.2% of the approved research proposals. Type and method of randomization were unknown in 21% of the proposals and dealing with missing data had not been described in most of them (98%). Inclusion and exclusion criteria were fully and adequately explained in 92% of proposals. Moreover, 44% and 31% of the research proposals were ranked moderate and weak, respectively, with respect to the correctness of the statistical analysis methods. Conclusion: Findings of the present study revealed that a large portion of the approved proposals were highly biased or ambiguous with respect to randomization, blinding, dealing with missing data, data collection tool, sampling methods, and statistical analysis. Thus, it is essential to consult and collaborate with a methodologist in all parts of a proposal to control the possible and specific biases in clinical trials. PMID:29445703

  20. Critical appraisal of fundamental items in approved clinical trial research proposals in Mashhad University of Medical Sciences.

    PubMed

    Shakeri, Mohammad-Taghi; Taghipour, Ali; Sadeghi, Masoumeh; Nezami, Hossein; Amirabadizadeh, Ali-Reza; Bonakchi, Hossein

    2017-01-01

    Background: Writing, designing, and conducting a clinical trial research proposal has an important role in achieving valid and reliable findings. Thus, this study aimed at critically appraising fundamental information in approved clinical trial research proposals in Mashhad University of Medical Sciences (MUMS) from 2008 to 2014. Methods: This cross-sectional study was conducted on all 935 approved clinical trial research proposals in MUMS from 2008 to 2014. A valid, reliable, comprehensive, simple, and usable checklist consisting of 11 main items, developed in sessions with biostatisticians and methodologists, was used as the research tool. Agreement rate between the reviewers of the proposals, who were responsible for data collection, was assessed during 3 sessions, and the Kappa statistic calculated at the last session was 97%. Results: More than 60% of the research proposals had a methodologist consultant; moreover, the type of study or study design was specified in almost all of them (98%). Alignment of study aims with hypotheses was lacking in a significant number of research proposals (585 proposals, 62.6%). The required sample size for 66.8% of the approved proposals was based on a sample size formula; however, in 25% of the proposals, the sample size formula was not in accordance with the study design. Data collection tool was not selected appropriately in 55.2% of the approved research proposals. Type and method of randomization were unknown in 21% of the proposals and dealing with missing data had not been described in most of them (98%). Inclusion and exclusion criteria were fully and adequately explained in 92% of proposals. Moreover, 44% and 31% of the research proposals were ranked moderate and weak, respectively, with respect to the correctness of the statistical analysis methods. Conclusion: Findings of the present study revealed that a large portion of the approved proposals were highly biased or ambiguous with respect to randomization, blinding, dealing with missing data, data collection tool, sampling methods, and statistical analysis. Thus, it is essential to consult and collaborate with a methodologist in all parts of a proposal to control the possible and specific biases in clinical trials.

  1. Seasonal Progression of the Deposition of Black Carbon by Snowfall at Ny-Ålesund, Spitsbergen

    NASA Astrophysics Data System (ADS)

    Sinha, P. R.; Kondo, Y.; Goto-Azuma, K.; Tsukagawa, Y.; Fukuda, K.; Koike, M.; Ohata, S.; Moteki, N.; Mori, T.; Oshima, N.; Førland, E. J.; Irwin, M.; Gallet, J.-C.; Pedersen, C. A.

    2018-01-01

    Deposition of black carbon (BC) aerosol in the Arctic lowers snow albedo, thus contributing to warming in the region. However, the processes and impacts associated with BC deposition are poorly understood because of the scarcity and uncertainties of measurements of BC in snow with adequate spatiotemporal resolution. We sampled snowpack at two sites (11 m and 300 m above sea level) at Ny-Ålesund, Spitsbergen, in April 2013. We also collected falling snow near the surface with a windsock from September 2012 to April 2013. The size distribution of BC in snowpack and falling snow was measured using a single-particle soot photometer combined with a characterized nebulizer. The BC size distributions did not show significant variations with depth in the snowpack, suggesting stable size distributions in falling snow. The BC number and mass concentrations (CNBC and CMBC) at the two sites agreed to within 19% and 10%, respectively, despite the sites' different snow water equivalent (SWE) loadings. This indicates the small influence of the amount of SWE (or precipitation) on these quantities. Average CNBC and CMBC in snowpack and falling snow at nearly the same locations agreed to within 5% and 16%, after small corrections for artifacts associated with the sampling of the falling snow. This comparison shows that dry deposition was a small contributor to the total BC deposition. CMBC was highest (2.4 ± 3.0 μg L⁻¹) in December–February and lowest (1.2 ± 1.2 μg L⁻¹) in September–November.

  2. Methodological issues regarding power of classical test theory (CTT) and item response theory (IRT)-based approaches for the comparison of patient-reported outcomes in two groups of patients - a simulation study

    PubMed Central

    2010-01-01

    Background Patient-Reported Outcomes (PRO) are increasingly used in clinical and epidemiological research. Two main types of analytical strategies can be found for these data: classical test theory (CTT) based on the observed scores and models coming from Item Response Theory (IRT). However, whether IRT or CTT would be the most appropriate method to analyse PRO data remains unknown. The statistical properties of CTT and IRT, regarding power and corresponding effect sizes, were compared. Methods Two-group cross-sectional studies were simulated for the comparison of PRO data using IRT or CTT-based analysis. For IRT, different scenarios were investigated according to whether item or person parameters were assumed to be known (for item parameters, with good to poor precision) or unknown and therefore had to be estimated. The powers obtained with IRT or CTT were compared and the parameters having the strongest impact on them were identified. Results When person parameters were assumed to be unknown and item parameters to be either known or not, the powers achieved using IRT or CTT were similar and always lower than the expected power using the well-known sample size formula for normally distributed endpoints. The number of items had a substantial impact on power for both methods. Conclusion Without any missing data, IRT and CTT seem to provide comparable power. The classical sample size formula for CTT seems to be adequate under some conditions but is not appropriate for IRT. In IRT, it seems important to take account of the number of items to obtain an accurate formula. PMID:20338031
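
    For reference, the "well-known sample size formula for normally distributed endpoints" that the simulated powers were benchmarked against is the standard two-group formula. A minimal sketch, with the usual normal-approximation caveat:

```python
# A minimal sketch of the classical two-group formula for a normally
# distributed endpoint: n per group = 2 * (z_{1-a/2} + z_{1-b})^2 / d^2,
# where d is the standardized effect size (mean difference / SD).
import math
from scipy.stats import norm

def n_per_group(delta, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / delta ** 2)

# ~63 per group for a medium effect (normal approximation; a t-based
# calculation gives slightly more):
print(n_per_group(0.5))
```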

  3. Size distribution and clothing-air partitioning of polycyclic aromatic hydrocarbons generated by barbecue.

    PubMed

    Lao, Jia-Yong; Wu, Chen-Chou; Bao, Lian-Jun; Liu, Liang-Ying; Shi, Lei; Zeng, Eddy Y

    2018-10-15

    Barbecue (BBQ) is one of the most popular cooking activities with charcoal worldwide and produces abundant polycyclic aromatic hydrocarbons (PAHs) and particulate matter. Size distribution and clothing-air partitioning of particle-bound PAHs are significant for assessing potential health hazards to humans due to exposure to BBQ fumes, but have not been examined adequately. To address this issue, particle and gaseous samples were collected at 2-m and 10-m distances from a cluster of four BBQ stoves. Personal samplers and cotton clothes were carried by volunteers sitting near the BBQ stoves. Particle-bound PAHs (especially 4-6 rings) derived from BBQ fumes were mostly affiliated with fine particles in the size range of 0.18-1.8 μm. High molecular-weight PAHs were mostly unimodal, peaking in fine particles, and consequently had small geometric mean diameters and standard deviations. Source diagnostics indicated that particle-bound PAHs in BBQ fumes were generated primarily by combustion of charcoal, fat content in food, and oil. The influence of BBQ fumes on the occurrence of particle-bound PAHs decreased with increasing distance from the BBQ stoves, due to the increased impact of ambient sources, especially petrogenic sources, and to a lesser extent to wind speed and direction. Octanol-air and clothing-air partition coefficients of PAHs obtained from personal air samples were significantly correlated with each other. High molecular-weight PAHs had higher area-normalized clothing-air partition coefficients in cotton clothes, i.e., cotton fabrics may be a significant reservoir of higher molecular-weight PAHs. Particle-bound PAHs from barbecue fumes are generated largely from charcoal combustion and food-charred emissions and mainly affiliated with fine particles. Copyright © 2018. Published by Elsevier B.V.

  4. Clinical Trials of Potential Cognitive-Enhancing Drugs in Schizophrenia: What Have We Learned So Far?

    PubMed Central

    Keefe, Richard S. E.; Buchanan, Robert W.; Marder, Stephen R.; Schooler, Nina R.; Dugar, Ashish; Zivkov, Milana; Stewart, Michelle

    2013-01-01

    In light of the number of studies conducted to examine the treatment of cognitive impairment associated with schizophrenia (CIAS), we critically reviewed recent CIAS trials. Trials were identified through searches of the website “www.clinicaltrials.gov” using the terms “schizophrenia AND cognition,” “schizophrenia AND neurocognition,” “schizophrenia AND neurocognitive tests,” “schizophrenia AND MATRICS,” “schizophrenia AND MCCB,” “schizophrenia AND BACS,” “schizophrenia AND COGSTATE,” and “schizophrenia AND CANTAB” and “first-episode schizophrenia AND cognition.” The cutoff date was 20 April 2011. Trials were included if they were conducted in people with schizophrenia, examined the effect of a pharmacologically active substance, and assessed effects on cognition as either a primary or secondary outcome. Drug challenge, pharmacokinetic, pharmacodynamic, or prodrome of psychosis studies were excluded. We identified 118 trials, with 62% using an add-on parallel group design. The large majority of completed trials were underpowered to detect moderate effect sizes, had ≤8 weeks duration, and were performed in samples of participants with chronic stable schizophrenia. The ongoing add-on trials are longer, have larger sample sizes (with a number of them being adequately powered to detect moderate effect sizes), and are more likely to use a widely accepted standardized cognitive battery (eg, the MATRICS Consensus Cognitive Battery) and MATRICS guidelines. Ongoing studies performed in subjects with recent onset schizophrenia may help elucidate which subjects are most likely to show an effect in cognition. New insights into the demands of CIAS trial design and methodology may help increase the probability of identifying treatments with beneficial effect on cognitive impairment in schizophrenia. PMID:22114098

  5. What you see is not what you catch: a comparison of concurrently collected net, Optical Plankton Counter, and Shadowed Image Particle Profiling Evaluation Recorder data from the northeast Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Remsen, Andrew; Hopkins, Thomas L.; Samson, Scott

    2004-01-01

    Zooplankton and suspended particles were sampled in the upper 100 m of the Gulf of Mexico with the High Resolution Sampler. This towed platform can concurrently sample zooplankton with plankton nets, an Optical Plankton Counter (OPC) and the Shadowed Image Particle Profiling and Evaluation Recorder (SIPPER), a zooplankton imaging system. This allowed for direct comparison of mesozooplankton abundance, biomass, taxonomic composition and size distributions between simultaneously collected net samples, OPC data, and digital imagery. While the net data were numerically and taxonomically similar to those of previous studies in the region, analysis of the SIPPER imagery revealed that nets significantly underestimated larvacean, doliolid, protoctist and cnidarian/ctenophore abundance by 300%, 379%, 522% and 1200%, respectively. The inefficiency of the nets in sampling the fragile and gelatinous zooplankton groups led to a dry-weight biomass estimate less than half that of the SIPPER total and suggests that this component of the zooplankton assemblage is more important than previously determined for this region. Additionally, using the SIPPER data we determined that more than 29% of all mesozooplankton-sized particles occurred within 4 mm of another particle and therefore would not be separately counted by the OPC. This suggests that coincident counting is a major problem for the OPC even at the low zooplankton abundances encountered in low latitude oligotrophic systems like the Gulf. Furthermore, we found that the colonial cyanobacterium Trichodesmium was the most abundant recognizable organism in the SIPPER dataset, while it was difficult to quantify with the nets. For these reasons, the traditional method of using net samples to ground-truth OPC data would not adequately describe the particle assemblage observed here. Consequently, we suggest that in situ imaging sensors be included in any comprehensive study of mesozooplankton.

  6. Meta-analysis of workplace physical activity interventions.

    PubMed

    Conn, Vicki S; Hafdahl, Adam R; Cooper, Pamela S; Brown, Lori M; Lusk, Sally L

    2009-10-01

    Most adults do not achieve adequate physical activity levels. Despite the potential benefits of worksite health promotion, no previous comprehensive meta-analysis has summarized health and physical activity behavior outcomes from such programs. This comprehensive meta-analysis integrated the extant wide range of worksite physical activity intervention research. Extensive searching located published and unpublished intervention studies reported from 1969 through 2007. Results were coded from primary studies. Random-effects meta-analytic procedures, including moderator analyses, were completed in 2008. Effects on most variables were substantially heterogeneous because diverse studies were included. Standardized mean difference (d) effect sizes were synthesized across approximately 38,231 subjects. Significantly positive effects were observed for physical activity behavior (0.21); fitness (0.57); lipids (0.13); anthropometric measures (0.08); work attendance (0.19); and job stress (0.33). The significant effect size for diabetes risk (0.98) is less robust given small sample sizes. The mean effect size for fitness corresponds to a difference between treatment minus control subjects' means on VO2max of 3.5 mL/kg/min; for lipids, -0.2 on the ratio of total cholesterol to high-density lipoprotein; and for diabetes risk, -12.6 mg/dL on fasting glucose. These findings document that some workplace physical activity interventions can improve both health and important worksite outcomes. Effects were variable for most outcomes, reflecting the diversity of primary studies. Future primary research should compare interventions to confirm causal relationships and further explore heterogeneity.
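
    As an illustration of the random-effects pooling and the back-conversion to raw units described above, here is a minimal DerSimonian-Laird sketch; the effect sizes, variances, and the assumed SD are invented, not values from the review:

```python
# A minimal DerSimonian-Laird random-effects pooling sketch, the kind of
# procedure the abstract describes; the effect sizes and variances below
# are invented for illustration, not values from the review.
def dersimonian_laird(effects, variances):
    """Pool standardized mean differences under a random-effects model."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_star = [1 / (v + tau2) for v in variances]
    return sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)

d_pooled = dersimonian_laird([0.15, 0.30, 0.21], [0.01, 0.02, 0.015])
# Back-convert to raw units with an assumed pooled SD, as in the abstract's
# VO2max example: raw difference = d * SD (SD ~6 mL/kg/min is an assumption).
print(round(d_pooled, 3), round(d_pooled * 6.0, 2))
```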

  7. Latitudinal and stock-specific variation in size- and age-at-maturity of female winter flounder, Pseudopleuronectes americanus, as determined with gonad histology

    NASA Astrophysics Data System (ADS)

    McBride, Richard S.; Wuenschel, Mark J.; Nitschke, Paul; Thornton, Grace; King, Jeremy R.

    2013-01-01

    Female winter flounder were examined using gonad histology to determine the adequacy of routine macroscopic maturity classification methods and to determine the spatial variation in size and age of maturity in U.S. waters. Sampling occurred in spring and autumn, which was adequate to collect immature, mature, spawning-active, and non-active females. Females were collected in coastal waters from Delaware Bay, USA, to the Scotian Shelf, Canada, including in Long Island Sound and on Georges Bank, which covered all U.S. stock areas. Mature fish spawned in spring, when gonads comprised up to 30% of the total body weight. Direct comparisons of maturity assignment by macroscopic versus microscopic methods demonstrated that both schemes are compatible, but the more cost-effective macroscopic method had trouble distinguishing larger immature from smaller resting females. Spatial comparisons, using gonad histology only, supported the existence of three stocks in U.S. waters, but also revealed significant variation in age at maturity within the two coastal stocks. Age-at-maturity was more variable than size-at-maturity, which is consistent with known stock-specific patterns of growth rates and a postulated life history tradeoff to delay maturity until a size threshold is reached. The within-stock variation in median age at maturity, about one year for coastal stocks, warrants further investigation of the use of static, stock-specific maturity ogives to calculate reference points for management.

  8. Multivariate Meta-Analysis of Brain-Mass Correlations in Eutherian Mammals

    PubMed Central

    Steinhausen, Charlene; Zehl, Lyuba; Haas-Rioth, Michaela; Morcinek, Kerstin; Walkowiak, Wolfgang; Huggenberger, Stefan

    2016-01-01

    The general assumption that brain size differences are an adequate proxy for subtler differences in brain organization turned neurobiologists toward the question of why some groups of mammals such as primates, elephants, and whales have such remarkably large brains. In this meta-analysis, an extensive sample of eutherian mammals (115 species distributed in 14 orders) provided data about several different biological traits and measures of brain size such as absolute brain mass (AB), relative brain mass (RB; quotient from AB and body mass), and encephalization quotient (EQ). These data were analyzed by established multivariate statistics without taking specific phylogenetic information into account. Species with high AB tend to (1) feed on protein-rich nutrition, (2) have a long lifespan, (3) reach sexual maturity late, and (4) have long and infrequent pregnancies with small litter sizes. Animals with high RB usually (1) have a short life span, (2) reach sexual maturity early, and (3) have short and frequent gestations. Moreover, males of species with high RB also have few potential sexual partners. In contrast, animals with high EQs have (1) a high number of potential sexual partners, (2) delayed sexual maturity, and (3) rare gestations with small litter sizes. Based on these correlations, we conclude that Eutheria with either high AB or high EQ occupy positions at the top of the network of food chains (high trophic levels). Eutheria of low trophic levels can develop a high RB only if they have small body masses. PMID:27746724

  9. Vitamin D in corticosteroid-naïve and corticosteroid-treated Duchenne muscular dystrophy: what dose achieves optimal 25(OH) vitamin D levels?

    PubMed

    Alshaikh, Nahla; Brunklaus, Andreas; Davis, Tracey; Robb, Stephanie A; Quinlivan, Ros; Munot, Pinki; Sarkozy, Anna; Muntoni, Francesco; Manzur, Adnan Y

    2016-10-01

    Assessment of the efficacy of vitamin D replenishment and maintenance doses required to attain optimal levels in boys with Duchenne muscular dystrophy (DMD). 25(OH)-vitamin D levels and concurrent vitamin D dosage were collected from retrospective case-note review of boys with DMD at the Dubowitz Neuromuscular Centre. Vitamin D levels were stratified as deficient at <25 nmol/L, insufficient at 25-49 nmol/L, adequate at 50-75 nmol/L and optimal at >75 nmol/L. 617 vitamin D samples were available from 197 boys (range 2-18 years); 69% were from individuals on corticosteroids. Vitamin D-naïve boys (154 samples) showed deficiency in 28%, insufficiency in 42%, adequate levels in 24% and optimal levels in 6%. The vitamin D-supplemented group (463 samples) was tested while on different maintenance/replenishment doses. Three-month replenishment of daily 3000 IU (23 samples) or 6000 IU (37 samples) achieved optimal levels in 52% and 84%, respectively. 182 samples taken on 400 IU revealed deficiency in 19 (10%), insufficiency in 84 (47%), adequate levels in 67 (37%) and optimal levels in 11 (6%). 97 samples taken on 800 IU showed deficiency in 2 (2%), insufficiency in 17 (17%), adequate levels in 56 (58%) and optimal levels in 22 (23%). 81 samples were on 1000 IU and 14 samples on 1500 IU, with optimal levels in 35 (43%) and 9 (64%), respectively. No toxic level was seen (highest level 230 nmol/L). The prevalence of vitamin D deficiency and insufficiency in DMD is high. A 2-month replenishment regimen of 6000 IU and a maintenance regimen of 1000-1500 IU/day were associated with optimal vitamin D levels. These data have important implications for optimising vitamin D dosing in DMD. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. 9 CFR 354.221 - Rooms and compartments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Coolers and freezers. Coolers and freezers of adequate size and capacity shall be provided to reduce the... ventilated. (e) Storage and supply rooms. The storage and supply rooms shall be in good repair, kept dry, and...

  11. 9 CFR 354.221 - Rooms and compartments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Coolers and freezers. Coolers and freezers of adequate size and capacity shall be provided to reduce the... ventilated. (e) Storage and supply rooms. The storage and supply rooms shall be in good repair, kept dry, and...

  12. 9 CFR 354.221 - Rooms and compartments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Coolers and freezers. Coolers and freezers of adequate size and capacity shall be provided to reduce the... ventilated. (e) Storage and supply rooms. The storage and supply rooms shall be in good repair, kept dry, and...

  13. 9 CFR 354.221 - Rooms and compartments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Coolers and freezers. Coolers and freezers of adequate size and capacity shall be provided to reduce the... ventilated. (e) Storage and supply rooms. The storage and supply rooms shall be in good repair, kept dry, and...

  14. 9 CFR 354.221 - Rooms and compartments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Coolers and freezers. Coolers and freezers of adequate size and capacity shall be provided to reduce the... ventilated. (e) Storage and supply rooms. The storage and supply rooms shall be in good repair, kept dry, and...

  15. Effect of abdominopelvic abscess drain size on drainage time and probability of occlusion.

    PubMed

    Rotman, Jessica A; Getrajdman, George I; Maybody, Majid; Erinjeri, Joseph P; Yarmohammadi, Hooman; Sofocleous, Constantinos T; Solomon, Stephen B; Boas, F Edward

    2017-04-01

    The purpose of this study is to determine whether larger abdominopelvic abscess drains reduce the time required for abscess resolution or the probability of tube occlusion. 144 consecutive patients who underwent abscess drainage at a single institution were reviewed retrospectively. Larger initial drain size did not reduce drainage time, drain occlusion, or drain exchanges (P > .05). Subgroup analysis did not find any type of collection that benefitted from larger drains. A multivariate model predicting drainage time showed that large collections (>200 mL) required 16 days longer drainage time than small collections (<50 mL). Collections with a fistula to bowel required 17 days longer drainage time than collections without a fistula. Initial drain size and the viscosity of the fluid in the collection had no significant effect on drainage time in the multivariate model. 8 F drains are adequate for initial drainage of most serous and serosanguineous collections. 10 F drains are adequate for initial drainage of most purulent or bloody collections. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Analysis and design of randomised clinical trials involving competing risks endpoints.

    PubMed

    Tai, Bee-Choo; Wee, Joseph; Machin, David

    2011-05-19

    In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
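
    The sample size point can be made concrete with Schoenfeld's approximation for the number of events needed to detect a given hazard ratio; this is a generic sketch, not the paper's procedure, and the HR below simply echoes the magnitude reported for distant metastasis:

```python
# A hedged sketch of Schoenfeld's formula for the number of events needed
# to detect a hazard ratio in a two-arm trial with equal allocation; for a
# cause-specific hazard analysis only events of the cause of interest
# count, which is one reason it can require fewer subjects overall.
import math
from scipy.stats import norm

def required_events(hr, alpha=0.05, power=0.80, alloc=0.5):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil(z ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2))

# e.g., an effect of the size reported for distant metastasis (HR ~0.43):
print(required_events(0.43))  # ~45 events of the cause of interest
```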

  17. Acute Respiratory Distress Syndrome Measurement Error. Potential Effect on Clinical Study Results

    PubMed Central

    Cooke, Colin R.; Iwashyna, Theodore J.; Hofer, Timothy P.

    2016-01-01

    Rationale: Identifying patients with acute respiratory distress syndrome (ARDS) is a recognized challenge. Experts often have only moderate agreement when applying the clinical definition of ARDS to patients. However, no study has fully examined the implications of low reliability measurement of ARDS on clinical studies. Objectives: To investigate how the degree of variability in ARDS measurement commonly reported in clinical studies affects study power, the accuracy of treatment effect estimates, and the measured strength of risk factor associations. Methods: We examined the effect of ARDS measurement error in randomized clinical trials (RCTs) of ARDS-specific treatments and cohort studies using simulations. We varied the reliability of ARDS diagnosis, quantified as the interobserver reliability (κ-statistic) between two reviewers. In RCT simulations, patients identified as having ARDS were enrolled, and when measurement error was present, patients without ARDS could be enrolled. In cohort studies, risk factors as potential predictors were analyzed using reviewer-identified ARDS as the outcome variable. Measurements and Main Results: Lower reliability measurement of ARDS during patient enrollment in RCTs seriously degraded study power. Holding effect size constant, the sample size necessary to attain adequate statistical power increased by more than 50% as reliability declined, although the result was sensitive to ARDS prevalence. In a 1,400-patient clinical trial, the sample size necessary to maintain similar statistical power increased to over 1,900 when reliability declined from perfect to substantial (κ = 0.72). Lower reliability measurement diminished the apparent effectiveness of an ARDS-specific treatment from a 15.2% (95% confidence interval, 9.4–20.9%) absolute risk reduction in mortality to 10.9% (95% confidence interval, 4.7–16.2%) when reliability declined to moderate (κ = 0.51). In cohort studies, the effect on risk factor associations was similar. Conclusions: ARDS measurement error can seriously degrade statistical power and effect size estimates of clinical studies. The reliability of ARDS measurement warrants careful attention in future ARDS clinical studies. PMID:27159648
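
    The mechanism is easy to reproduce in a toy simulation: if a fraction of enrolled patients do not truly have ARDS (and so cannot benefit), the apparent treatment effect is diluted and power falls. The sketch below assumes, for simplicity, equal baseline mortality for misclassified patients; all rates and sizes are illustrative, not the study's parameters:

```python
# Toy simulation: enrolling non-ARDS patients (who cannot benefit from an
# ARDS-specific treatment) dilutes the effect and degrades power.
# Assumes misclassified patients have the same baseline mortality.
import random

def simulated_power(n_per_arm, p_control, risk_reduction, ppv, reps=2000):
    """Fraction of simulated trials with |z| > 1.96 (two-proportion test)."""
    hits = 0
    for _ in range(reps):
        # Treated patients benefit only if they truly have ARDS (prob ppv).
        treat = sum(
            random.random() < (p_control - risk_reduction
                               if random.random() < ppv else p_control)
            for _ in range(n_per_arm)
        )
        ctrl = sum(random.random() < p_control for _ in range(n_per_arm))
        p1, p2 = treat / n_per_arm, ctrl / n_per_arm
        p = (treat + ctrl) / (2 * n_per_arm)
        se = (2 * p * (1 - p) / n_per_arm) ** 0.5
        if se > 0 and abs(p1 - p2) / se > 1.96:
            hits += 1
    return hits / reps

print(simulated_power(200, 0.40, 0.15, ppv=1.0))  # ~0.9, perfect measurement
print(simulated_power(200, 0.40, 0.15, ppv=0.7))  # ~0.6 when 30% lack ARDS
```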

  18. The effects of density dependent resource limitation on size of wild reindeer.

    PubMed

    Skogland, Terje

    1983-11-01

    A density-dependent decrement in size for wild reindeer from 12 different Norwegian herds at 16 different densities was shown using lower jawbone length as the criterion of size. This criterion was tested and found to adequately predict body size of both bucks and does. Lactation in does did not affect jaw length but significantly affected dressed weights. A decrement in the size of does as a result of gross density was found. This size decrement was further analysed in relation to the habitat densities in winter (R² = 0.85) and in summer (R² = 0.75) separately, in order to estimate the relative effects of each factor. For herds with adequate food in winter (no signs of overgrazing of lichens), density in relation to summer habitat and mires yielded the highest predictive power in a multiple regression. For herds with adequate summer pastures, densities per winter habitat and lichen volumes likewise showed a highly significant correlation. The inclusion of the lichen volume data in the regression increased its predictive power. The major effect of resource limitation was to delay the time of calving, because a maternal carry-over effect allowed the calf a shorter period of growth to be completed during its first summer. Neonate size at birth was highly correlated with maternal size regardless of the mean calving date, although the latter was significantly delayed for small-sized does in food resource-limited herds. Likewise, the postnatal growth rate of calves was not significantly different during the first 50 days postpartum, regardless of maternal winter feeding conditions. The summer growth rates of bucks ≥1 year old did not vary significantly between herds. The age of maturity of food resource-limited does was delayed by one year, and growth ceased after the initiation of reproduction. This shows that under conditions of limited resources, does with delayed births of calves allocated less energy to body growth simply because they had less time to replenish body reserves once they were freed of the energetic demands of lactation. The overriding effect of such limitation of food resources is thus to produce a time lag for the completion of all the important life-history events, such as growth, maintenance, and reproduction. From a theoretical point of view, i.e. according to the reproductive effort model, their only option is to try to overcome this time limitation to reproductive success.

  19. Evaluation of the use of nonesterified fatty acids and β-hydroxybutyrate concentrations in pooled serum samples for herd-based detection of subclinical ketosis in dairy cows during the first week after parturition.

    PubMed

    Borchardt, Stefan; Staufenbiel, Rudolf

    2012-04-15

    To evaluate the use of nonesterified fatty acids (NEFA) and β-hydroxybutyrate (BHBA) concentrations in pooled serum samples for herd-based detection of subclinical ketosis (SCK) in dairy cows after calving. Cross-sectional study. 1,100 dairy cows from 110 herds. Blood samples were collected from 10 healthy cows/herd in the first week after parturition. Aliquots of serum were mixed to create a pooled sample. Concentrations of NEFA and BHBA were measured to estimate prevalence of SCK. Pooled sample test results were compared with those obtained for individual samples. Linear regression and receiver-operating characteristic curve analysis were performed; Bland-Altman plots were used to evaluate agreement between methods. Overall prevalence of SCK was 30.7%, 19.3%, and 13.6%, as determined by use of BHBA threshold concentrations of 1,000, 1,200, and 1,400 μmol/L, respectively. Pooled sample concentrations of NEFA and BHBA were significantly correlated (r = 0.98 and 0.97, respectively) with individual sample means and with the number of cows that had NEFA (R(2) range, 0.81 to 0.84) or BHBA (R(2) range, 0.65 to 0.76) concentrations above predefined thresholds. Pooled sample concentrations of NEFA and BHBA were very accurate to highly accurate for herd-based detection of SCK. Analysis of NEFA and BHBA concentrations in pooled serum samples was useful for herd-based detection of SCK. A sample size of 10 cows/herd was deemed adequate for monitoring dairy herds for SCK. Reference criteria specific to pooled samples should be used for this type of herd-based testing.

  20. Dynamic sample size detection in learning command line sequence for continuous authentication.

    PubMed

    Traore, Issa; Woungang, Isaac; Nakkabi, Youssef; Obaidat, Mohammad S; Ahmed, Ahmed Awad E; Khalilian, Bijan

    2012-10-01

    Continuous authentication (CA) consists of authenticating the user repetitively throughout a session with the goal of detecting and protecting against session hijacking attacks. While the accuracy of the detector is central to the success of CA, the detection delay or length of an individual authentication period is important as well since it is a measure of the window of vulnerability of the system. However, high accuracy and small detection delay are conflicting requirements that need to be balanced for optimum detection. In this paper, we propose the use of sequential sampling technique to achieve optimum detection by trading off adequately between detection delay and accuracy in the CA process. We illustrate our approach through CA based on user command line sequence and naïve Bayes classification scheme. Experimental evaluation using the Greenberg data set yields encouraging results consisting of a false acceptance rate (FAR) of 11.78% and a false rejection rate (FRR) of 1.33%, with an average command sequence length (i.e., detection delay) of 37 commands. When using the Schonlau (SEA) data set, we obtain FAR = 4.28% and FRR = 12%.
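
    The sequential sampling technique referred to is in the spirit of Wald's sequential probability ratio test (SPRT): evidence accumulates command by command until a decision boundary derived from the target error rates is crossed, so the detection delay adapts to how informative the observed commands are. A minimal sketch under toy assumptions (crudely smoothed per-command probabilities standing in for a trained naïve Bayes user model):

```python
# A minimal SPRT sketch in the spirit of the paper's sequential sampling;
# the per-command probabilities are toy stand-ins for a trained naive
# Bayes model, and the far/frr-to-boundary mapping is the standard Wald
# approximation, not the paper's exact procedure.
import math

def sprt(commands, p_genuine, p_impostor, far=0.01, frr=0.01):
    """Return ('genuine'|'impostor'|'undecided', commands consumed)."""
    upper = math.log((1 - frr) / far)   # cross above: decide impostor
    lower = math.log(frr / (1 - far))   # cross below: decide genuine
    llr = 0.0
    for i, cmd in enumerate(commands, start=1):
        # Accumulate log-likelihood ratio; 1e-6 is crude smoothing for
        # commands unseen in either model.
        llr += math.log(p_impostor.get(cmd, 1e-6) / p_genuine.get(cmd, 1e-6))
        if llr >= upper:
            return "impostor", i
        if llr <= lower:
            return "genuine", i
    return "undecided", len(commands)

p_user = {"ls": 0.3, "cd": 0.3, "vim": 0.2, "make": 0.2}
p_other = {"ls": 0.25, "cd": 0.25, "rm": 0.25, "wget": 0.25}
print(sprt(["ls", "cd", "vim", "make"] * 10, p_user, p_other))
```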

  1. Goldeye, Hiodon alosoides, in Lake Oahe: abundance, age, growth, maturity, food, and the fishery, 1963-69

    USGS Publications Warehouse

    Miller, Grant L.; Nelson, William R.

    1974-01-01

    Reproductive success was relatively consistent, and adequate to maintain species abundance at a nearly constant level, during 1963-69. Both abundance and growth in length increased from the lower to the upper portion of the reservoir. In most characteristics -- growth in length, length-weight relation, age at maturity, and food -- goldeye in Lake Oahe were similar to those from other Missouri River impoundments. Experimental gill nets sampled all lengths of goldeye (range, 80-460 mm; median, 320 mm); bottom trawls sampled mostly small fish (median, 215 mm) and trap nets large ones (median, 345 mm). Commercial gill nets were highly size selective (median, 375 mm); fish of ages IV-VII made up 90% of the catch. Survival rates ranged from 57 to 52% for ages II-X. Estimated survival rates for ages V-IX declined from 44 to 35% after the inception of the commercial fishery in 1966. The peak commercial catch was 151,432 kg (1.2 kg/hectare) in 1969. Unless recruitment declines, the population can support a fishery of that magnitude.

  2. Detection of early changes in lung-cell cytology by flow-systems analysis techniques. Progress report, January 1--June 30, 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinkamp, J. A.; Hansen, K. M.; Wilson, J. S.

    1976-08-01

    This report summarizes results of preliminary experiments to develop cytological and biochemical indicators for estimating damage to respiratory epithelium exposed to toxic agents associated with the by-products of nonnuclear energy production using advanced flow-systems cell-analysis technologies. Since initiation of the program one year ago, progress has been made in obtaining adequate numbers of exfoliated lung cells from the Syrian hamster for flow analysis; cytological techniques developed on human exfoliated gynecological samples have been adapted to hamster lung epithelium for obtaining single-cell suspensions; and lung-cell samples have been initially characterized based on DNA content, total protein, nuclear and cytoplasmic size, and multiangle light-scatter measurements. Preliminary results from measurements of the above parameters, which recently became available, are described in this report. As the flow-systems technology is adapted further to analysis of exfoliated lung cells, measurements of changes in physical and biochemical cellular properties as a function of exposure to toxic agents will be performed.

  3. The Relation Between Inflation in Type-I and Type-II Error Rate and Population Divergence in Genome-Wide Association Analysis of Multi-Ethnic Populations.

    PubMed

    Derks, E M; Zwinderman, A H; Gamazon, E R

    2017-05-01

    Population divergence impacts the degree of population stratification in Genome-Wide Association Studies. We aim to: (i) investigate type-I error rate as a function of population divergence (FST) in multi-ethnic (admixed) populations; (ii) evaluate the statistical power and effect size estimates; and (iii) investigate the impact of population stratification on the results of gene-based analyses. Quantitative phenotypes were simulated. Type-I error rate was investigated for Single Nucleotide Polymorphisms (SNPs) with varying levels of FST between the ancestral European and African populations. Type-II error rate was investigated for a SNP characterized by a high value of FST. In all tests, genomic MDS components were included to correct for population stratification. Type-I and type-II error rates were adequately controlled in a population that included two distinct ethnic populations but not in admixed samples. Statistical power was reduced in the admixed samples. Gene-based tests showed no residual inflation in type-I error rate.
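
    A toy simulation illustrates the underlying phenomenon: when subpopulations differ in allele frequency (higher FST) and also in phenotype mean, an uncorrected association test rejects far more often than the nominal 5%. This sketch deliberately omits the MDS correction step and uses invented parameters:

```python
# Toy illustration of stratification-driven type-I error inflation: a SNP
# with divergent allele frequencies between two subpopulations appears
# associated with a phenotype that merely differs in mean between them.
# No correction (MDS/PCs) is applied; all parameters are invented.
import random
from scipy import stats

def type1_rate(n=500, p1=0.2, p2=0.5, shift=0.5, reps=500):
    rejections = 0
    for _ in range(reps):
        geno, pheno = [], []
        for _ in range(n):
            pop = random.random() < 0.5           # two equal subpopulations
            p = p2 if pop else p1                 # divergent allele frequency
            g = (random.random() < p) + (random.random() < p)
            y = random.gauss(shift if pop else 0.0, 1.0)  # null SNP effect
            geno.append(g)
            pheno.append(y)
        _, pval = stats.pearsonr(geno, pheno)
        rejections += pval < 0.05
    return rejections / reps

print(type1_rate(shift=0.0))  # ~0.05: no phenotype stratification
print(type1_rate(shift=0.5))  # >>0.05: confounding inflates type-I error
```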

  4. Iodine Nutritional Status among Adolescent Girls in Uttarakhand, India.

    PubMed

    Kapil, Umesh; Sareen, Neha; Nambiar, Vanisha S; Khenduja, Preetika; Sofi, Nighat Yaseen

    2016-02-01

    Uttarakhand (UK) state is a known endemic region for iodine deficiency. To assess iodine nutritional status among adolescent girls in districts Udham Singh Nagar (USN), Nainital (N) and Pauri (P) of UK state. In each district, 30 clusters (schools) were identified using population-proportionate-to-size cluster sampling. In each school, 60 girls (12-18 years) attending the school were included. A total of 5430 girls from USN (1823), N (1811) and P (1796) were studied. Clinical examination of the thyroid of each girl was conducted. From each cluster, spot urine and salt samples were collected. Total goiter rate was found to be 6.8% (USN), 8.2% (N) and 5.6% (P). Median urinary iodine concentration levels were 250 μg/l (USN), 200 μg/l (N) and 183 μg/l (P). Findings of the study documented that adolescent girls had adequate iodine nutritional status in the three districts of UK. © The Author [2015]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Visualizing 3D Food Microstructure Using Tomographic Methods: Advantages and Disadvantages.

    PubMed

    Wang, Zi; Herremans, Els; Janssen, Siem; Cantre, Dennis; Verboven, Pieter; Nicolaï, Bart

    2018-03-25

    X-ray micro-computed tomography (micro-CT) provides the unique ability to capture intact internal microstructure data without significant preparation of the sample. The fundamentals of micro-CT technology are briefly described along with a short introduction to basic image processing, quantitative analysis, and derivative computational modeling. The applications and limitations of micro-CT in industries such as meat, dairy, postharvest, and bread/confectionary are discussed to serve as a guideline to the plausibility of utilizing the technique for detecting features of interest. Component volume fractions, their respective size/shape distributions, and connectivity, for example, can be utilized for product development, manufacturing process tuning and/or troubleshooting. In addition to determining structure-function relations, micro-CT can be used for foreign material detection to further ensure product quality and safety. In most usage scenarios, micro-CT in its current form is perfectly adequate for determining microstructure in a wide variety of food products. However, in low-contrast and low-stability samples, emphasis is placed on the shortcomings of the current systems to set realistic expectations for the intended users.

  6. Controlled trials in children: quantity, methodological quality and descriptive characteristics of pediatric controlled trials published 1948-2006.

    PubMed

    Thomson, Denise; Hartling, Lisa; Cohen, Eyal; Vandermeer, Ben; Tjosvold, Lisa; Klassen, Terry P

    2010-09-30

    The objective of this study was to describe randomized controlled trials (RCTs) and controlled clinical trials (CCTs) in child health published between 1948 and 2006, in terms of quantity, methodological quality, and publication and trial characteristics. We used the Trials Register of the Cochrane Child Health Field for overall trends and a sample from this to explore trial characteristics in more detail. We extracted descriptive data on a random sample of 578 trials. Ninety-six percent of the trials were published in English; the percentage of child-only trials was 90.5%. The most frequent diagnostic categories were infectious diseases (13.2%), behavioural and psychiatric disorders (11.6%), neonatal critical care (11.4%), respiratory disorders (8.9%), non-critical neonatology (7.9%), and anaesthesia (6.5%). There were significantly fewer child-only studies (i.e., more mixed child and adult studies) over time (P = 0.0460). The proportion of RCTs to CCTs increased significantly over time (P<0.0001), as did the proportion of multicentre trials (P = 0.002). Significant increases over time were found in methodological quality (Jadad score) (P<0.0001), the proportion of double-blind studies (P<0.0001), and studies with adequate allocation concealment (P<0.0001). Additionally, we found an improvement in reporting over time: adequate description of withdrawals and losses to follow-up (P<0.0001), sample size calculations (P<0.0001), and intention-to-treat analysis (P<0.0001). However, many trials still do not describe their level of blinding, and allocation concealment was inadequately reported in the majority of studies across the entire time period. The proportion of studies with industry funding decreased slightly over time (P = 0.003), and these studies were more likely to report positive conclusions (P = 0.028). The quantity and quality of pediatric controlled trials has increased over time; however, much work remains to be done, particularly in improving methodological issues around conduct and reporting of trials.

  7. Context factors in general practitioner-patient encounters and their impact on assessing communication skills--an exploratory study.

    PubMed

    Essers, Geurt; Kramer, Anneke; Andriesse, Boukje; van Weel, Chris; van der Vleuten, Cees; van Dulmen, Sandra

    2013-05-22

    Assessment of medical communication performance usually focuses on rating generically applicable, well-defined communication skills. However, in daily practice, communication is determined by (specific) context factors, such as acquaintance with the patient, or the presented problem. Merely valuing the presence of generic skills may not do justice to the doctor's proficiency. Our aim was to perform an exploratory study on how assessment of general practitioner (GP) communication performance changes if context factors are explicitly taken into account. We used a mixed method design to explore how ratings would change. A random sample of 40 everyday GP consultations was used to see if previously identified context factors could be observed again. The sample was rated twice using a widely used assessment instrument (the MAAS-Global), first in the standard way and secondly after context factors were explicitly taken into account, by using a context-specific rating protocol to assess communication performance in the workplace. Between the first and second ratings, the presence of context factors was established. Item score differences were calculated using paired sample t-tests. In 38 out of 40 consultations, context factors prompted application of the context-specific rating protocol. Mean overall score on the 7-point MAAS-Global scale increased from 2.98 in the standard rating to 3.66 in the context-specific rating (p < 0.001); the effect size for the total mean score was 0.84. In earlier research the minimum standard score for adequate communication was set at 3.17. Applying the protocol, the mean overall score rose above the level set in an earlier study for the MAAS-Global scores to represent 'adequate GP communication behaviour'. Our findings indicate that incorporating context factors in communication assessment makes a meaningful difference and shows that context factors should be considered as 'signal' instead of 'noise' in GP communication assessment. Explicating context factors leads to a more deliberate and transparent rating of GP communication performance.

  8. A visual basic program to generate sediment grain-size statistics and to extrapolate particle distributions

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.; Hastings, M.E.

    2004-01-01

    Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next to last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and nearshore marine sediments, samples from many deeper water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft Visual Basic 6.0 and provides a window to facilitate program execution. The input for the sediment fractions is weight percentages in whole-phi notation (Krumbein, 1934; Inman, 1952), and the program permits the user to select output in either method of moments or inclusive graphics statistics (Fig. 1). Users select options primarily with mouse-click events, or through interactive dialogue boxes.
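
    As an illustration of the inclusive graphics computation that such programs automate, the Folk (1974) statistics are built from phi percentiles interpolated off the cumulative weight-percent curve. A minimal sketch with a hypothetical sand sample (not GSSTAT code):

```python
# A minimal sketch of Folk's inclusive graphics statistics: phi
# percentiles are linearly interpolated from the cumulative weight-percent
# curve and combined into the graphic mean and inclusive graphic standard
# deviation (sorting). Input data are a hypothetical sand sample.
def phi_percentile(phis, cum_pct, p):
    """Linearly interpolate the phi value at cumulative percent p."""
    for i in range(1, len(cum_pct)):
        if cum_pct[i] >= p:
            frac = (p - cum_pct[i - 1]) / (cum_pct[i] - cum_pct[i - 1])
            return phis[i - 1] + frac * (phis[i] - phis[i - 1])
    raise ValueError("curve does not reach percentile")

def folk_ward(phis, weights):
    total = sum(weights)
    cum, running = [], 0.0
    for w in weights:
        running += 100.0 * w / total
        cum.append(running)
    f = lambda p: phi_percentile(phis, cum, p)
    mean = (f(16) + f(50) + f(84)) / 3.0                    # graphic mean
    sorting = (f(84) - f(16)) / 4.0 + (f(95) - f(5)) / 6.6  # incl. std dev
    return mean, sorting

# Whole-phi weight percentages for a hypothetical sand sample:
print(folk_ward([0, 1, 2, 3, 4], [5, 25, 40, 20, 10]))
```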

  9. Using Transom Jack in the Human Engineering Analysis of the Materials Science Research Rack-1 and Quench Module Insert

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Alves, Jeffrey R.; Hutchinson, Sonya L.

    1999-01-01

    This paper describes the human engineering analysis performed on the Materials Science Research Rack-1 and Quench Module Insert (MSRR-1/QMI) using Transom Jack (Jack) software. The Jack software was used to model a virtual environment consisting of the MSRR-1/QMI hardware configuration and human figures representing the 95th percentile male and 5th percentile female. The purpose of the simulation was to assess the human interfaces in the design for their ability to meet the requirements of the Pressurized Payloads Interface Requirements Document - International Space Program, Revision C (SSP 57000). Jack was used in the evaluation because of its ability to correctly model anthropometric body measurements and the physical behavior of astronauts working in microgravity, which is referred to as the neutral body posture. The Jack model allows evaluation of crewmember interaction with hardware through task simulation including but not limited to collision avoidance behaviors, hand/eye coordination, reach path planning, and automatic grasping to part contours. Specifically, this virtual simulation depicts the human figures performing the QMI installation and check-out, sample cartridge insertion and removal, and gas bottle drawer removal. These tasks were evaluated in terms of adequate clearance in reach envelopes, adequate accessibility in work envelopes, appropriate line of sight in visual envelopes, and accommodation of the full range of male and female stature for maneuverability. The results of the human engineering analysis virtual simulation indicate that most of the associated requirements of SSP 57000 were met. However, some hardware design considerations and crew procedures modifications are recommended to improve accessibility, provide an adequate work envelope, reduce awkward body posture, and eliminate permanent protrusions.

  10. The Relationship between Adequate Yearly Progress and the Quality of Professional Development

    ERIC Educational Resources Information Center

    Wolff, Lori A.; McClelland, Susan S.; Stewart, Stephanie E.

    2010-01-01

    Based on publicly available data, the study examined the relationship between adequate yearly progress status and teachers' perceptions of the quality of their professional development. The sample included responses of 5,558 teachers who completed the questionnaire in the 2005-2006 school year. Results of the statistical analysis show a…

  11. A comparison of the genetic basis of wing size divergence in three parallel body size clines of Drosophila melanogaster.

    PubMed Central

    Gilchrist, A S; Partridge, L

    1999-01-01

    Body size clines in Drosophila melanogaster have been documented in both Australia and South America, and may exist in Southern Africa. We crossed flies from the northern and southern ends of each of these clines to produce F(1), F(2), and first backcross generations. Our analysis of generation means for wing area and wing length produced estimates of the additive, dominance, epistatic, and maternal effects underlying divergence within each cline. For both females and males of all three clines, the generation means were adequately described by these parameters, indicating that linkage and higher order interactions did not contribute significantly to wing size divergence. Marked differences were apparent between the clines in the occurrence and magnitude of the significant genetic parameters. No cline was adequately described by a simple additive-dominance model, and significant epistatic and maternal effects occurred in most, but not all, of the clines. Generation variances were also analyzed. Only one cline was described sufficiently by a simple additive variance model, indicating significant epistatic, maternal, or linkage effects in the remaining two clines. The diversity in genetic architecture of the clines suggests that natural selection has produced similar phenotypic divergence by different combinations of gene action and interaction. PMID:10581284

  12. Geotechnical Data Inventory, Southern California Coastal Zone, Cape San Martin (Monterey County) to Mexican Border.

    DTIC Science & Technology

    1985-12-01

    [Garbled table excerpt; only fragments are recoverable:] Several moderate to small-sized creeks and streams drain the Santa Ynez Mts.; the largest potential sediment source is La Honda Canyon. Drainage basins and sediment rates: Santa Ynez River (large; 48,000 cu. yds./yr.; Ref: 66) and Honda Ck (small). Heavy minerals (Refs: 4A, 56A): hematite-ilmenite, epidote, augite, hornblende, chlorite, and opaques, with percentages reported for Los Angeles cliffs and Laguna Beach samples.

  13. 46 CFR 169.672 - Wiring for power and lighting circuits.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Wiring for power and lighting circuits must have copper conductors, of 14 AWG or larger, and— (1) Meet... must have stranded conductors. (c) Conductors must be sized so that— (1) They are adequate for the...

  14. Evaluation of longitudinal joint tie bar system.

    DOT National Transportation Integrated Search

    2011-09-01

    "An adequate longitudinal joint tie bar system is essential in the overall performance of concrete pavement. Excessive : longitudinal joint openings are believed to be caused by either inadequate tie bar size or spacing or improper tie bar : installa...

  15. The all-on-four treatment concept: Systematic review

    PubMed Central

    Soto-Penaloza, David; Zaragozí-Alonso, Regino; Penarrocha-Diago, María

    2017-01-01

    Objectives To systematically review the literature on the “all-on-four” treatment concept regarding its indications, surgical procedures, prosthetic protocols and technical and biological complications after at least three years in function. Study Design The three major electronic databases were screened: MEDLINE (via PubMed), EMBASE, and the Cochrane Library of the Cochrane Collaboration (CENTRAL). In addition, electronic screening was made of the ‘grey literature’ using the System for Information on Grey Literature in Europe - Open Grey, covering the period from January 2005 up to and including April 2016. Results A total of 728 articles were obtained from the initial screening process. Of these articles, 24 fulfilled the inclusion criteria. Methodological quality assessment showed sample size calculation to be reported by only one study, and follow-up did not include a large number of participants - a fact that may introduce bias and lead to misleading interpretations of the study results. Conclusions The all-on-four treatment concept offers a predictable way to treat the atrophic jaw in patients who prefer to avoid regenerative procedures, which increase morbidity and treatment costs. The results obtained indicate a survival rate of 99.8% at more than 24 months of follow-up. However, current evidence is limited due to the scarcity of information on methodological quality, a lack of adequate follow-up, and sample attrition. Biological complications (e.g., peri-implantitis) are reported in few patients after a mean follow-up of two years. Adequate definition of the success/survival criteria is thus necessary, due to the high prevalence of peri-implant diseases. Key words:All-on-four, all-on-4, tilted implants, dental prostheses, immediate loading. PMID:28298995

  16. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handle large samples in test of fit analysis have been developed. One strategy to handle the sample size problem may be to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and to compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample size down to the order of 5,000, the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit underestimated using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
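
    The two strategies can be contrasted in a few lines. The rescaling below is a deliberately crude linear adjustment for illustration; the paper's actual adjustment function may differ, and all numeric values are invented:

```python
# Contrast of the two strategies the paper compares: (1) rescale an
# observed chi-square toward a smaller target sample size (a crude linear
# adjustment; the paper's exact function may differ), versus (2) recompute
# the fit statistic on an actual random subsample. Values are invented.
import random
from scipy.stats import chi2

def adjusted_chi_square(chi_sq, n, n_target):
    # Misfit (noncentrality) grows roughly linearly with N, motivating a
    # simple rescaling by the sample size ratio.
    return chi_sq * n_target / n

def subsample_chi_square(data, n_target, fit_statistic):
    # The alternative strategy: draw a random subsample and refit.
    return fit_statistic(random.sample(data, n_target))

chi_obs, n, df = 840.0, 21000, 50
for n_target in (5000, 1000):
    adj = adjusted_chi_square(chi_obs, n, n_target)
    print(n_target, round(adj, 1), round(chi2.sf(adj, df), 4))
```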

  17. Comparison of Dorsal Intercostal Artery Perforator Propeller Flaps and Bilateral Rotation Flaps in Reconstruction of Myelomeningocele Defects.

    PubMed

    Tenekeci, Goktekin; Basterzi, Yavuz; Unal, Sakir; Sari, Alper; Demir, Yavuz; Bagdatoglu, Celal; Tasdelen, Bahar

    2018-04-09

    Bilateral rotation flaps are considered the workhorse flaps in reconstruction of myelomeningocele defects. Since the introduction of perforator flaps in the field of reconstructive surgery, perforator flaps have been used increasingly in the reconstruction of various soft tissue defects all over the body because of their appreciated advantages. The aim of this study was to compare the complications and surgical outcomes between bilateral rotation flaps and dorsal intercostal artery perforator (DICAP) flaps in the soft tissue reconstruction of myelomeningocele defects. Between January 2005 and February 2017, we studied 47 patients who underwent reconstruction of myelomeningocele defects. Patient demographics, operative data, and postoperative data were reviewed retrospectively and are included in the study. We found no statistically significant differences in patient demographics and surgical complications between these two groups; this may be due to small sample size. With regard to complications (partial flap necrosis, cerebrospinal fluid [CSF] leakage, necessity for reoperation, and wound infection), DICAP propeller flaps were clinically superior to rotation flaps. Partial flap necrosis was associated with CSF leakage and wound infection, and CSF leakage was associated with wound dehiscence. Although surgical outcomes obtained with DICAP propeller flaps were clinically superior to those obtained with rotation flaps, there was no statistically significant difference between the two patient groups. A well-designed comparative study with adequate sample size is needed. Nonetheless, we suggest using DICAP propeller flaps for reconstruction of large myelomeningocele defects.

  18. Variability of the raindrop size distribution at small spatial scales

    NASA Astrophysics Data System (ADS)

    Berne, A.; Jaffrain, J.

    2010-12-01

    Because of the interactions between atmospheric turbulence and cloud microphysics, the raindrop size distribution (DSD) is strongly variable in space and time. The spatial variability of the DSD at small spatial scales (below a few km) is not well documented and not well understood, mainly because of a lack of adequate measurements at the appropriate resolutions. A network of 16 disdrometers (Parsivels) has been designed and set up over the EPFL campus in Lausanne, Switzerland. This network covers a typical operational weather radar pixel of 1 × 1 km². The question of the significance of the variability of the DSD at such small scales is relevant for radar remote sensing of rainfall because the DSD is often assumed to be uniform within a radar sample volume and because the Z-R relationships used to convert the measured radar reflectivity Z into rain rate R are usually derived from point measurements. Thanks to the number of disdrometers, it was possible to quantify the spatial variability of the DSD at the radar pixel scale and to show that it can be significant. In this contribution, we show that the variability of the total drop concentration, of the median volume diameter and of the rain rate are significant, taking into account the sampling uncertainty associated with disdrometer measurements. The influence of this variability on the Z-R relationship can be non-negligible. Finally, the spatial structure of the DSD is quantified using a geostatistical tool, the variogram, and indicates high spatial correlation within a radar pixel.
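
    As an illustration of the geostatistical tool mentioned, an empirical variogram bins half the squared differences between station pairs by their separation distance. A minimal sketch with invented disdrometer coordinates and rain rates:

```python
# A minimal empirical variogram sketch of the kind used to quantify the
# spatial structure of DSD quantities across a disdrometer network:
# gamma(h) = mean of 0.5 * squared differences between stations separated
# by about h. Coordinates (m) and rain rates below are invented.
import math
from collections import defaultdict

def empirical_variogram(coords, values, bin_width=100.0):
    """Return {lag_bin_center_m: semivariance} over all station pairs."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            (x1, y1), (x2, y2) = coords[i], coords[j]
            h = math.hypot(x2 - x1, y2 - y1)
            b = int(h // bin_width)
            sums[b] += 0.5 * (values[i] - values[j]) ** 2
            counts[b] += 1
    return {(b + 0.5) * bin_width: sums[b] / counts[b] for b in sums}

coords = [(0, 0), (150, 80), (300, 300), (520, 100), (700, 650), (900, 900)]
rain = [4.1, 4.3, 3.6, 3.9, 2.8, 2.5]  # mm/h at each station (made up)
print(empirical_variogram(coords, rain, bin_width=300.0))
```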

  19. Characteristics of individuals who make impulsive suicide attempts.

    PubMed

    Spokas, Megan; Wenzel, Amy; Brown, Gregory K; Beck, Aaron T

    2012-02-01

    Previous research has identified only a few variables that have been associated with making an impulsive suicide attempt. The aim of the current study was to compare individuals who made an impulsive suicide attempt with those who made a premeditated attempt on both previously examined and novel characteristics. Participants were classified as making an impulsive or premeditated attempt based on the Suicide Intent Scale (Beck et al., 1974a) and were compared on a number of characteristics relevant to suicidality, psychiatric history, and demographics. Individuals who made an impulsive attempt expected that their attempts would be less lethal; yet the actual lethality of both groups' attempts was similar. Those who made an impulsive attempt were less depressed and hopeless than those who made a premeditated attempt. Participants who made an impulsive attempt were less likely to report a history of childhood sexual abuse and more likely to be diagnosed with an alcohol use disorder than those who made a premeditated attempt. Although the sample size was adequate for bivariate statistics, future studies using larger sample sizes will allow for multivariate analyses of characteristics that differentiate individuals who make impulsive and premeditated attempts. Clinicians should not minimize the significance of impulsive attempts, as they are associated with a similar level of lethality as premeditated attempts. Focusing mainly on depression and hopelessness as indicators of suicide risk has the potential to under-identify those who are at risk for making impulsive attempts. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. A Systematic Review and Meta-Analysis Estimating the Expected Dropout Rates in Randomized Controlled Trials on Yoga Interventions.

    PubMed

    Cramer, Holger; Haller, Heidemarie; Dobos, Gustav; Lauche, Romy

    2016-01-01

    A reasonable estimation of expected dropout rates is vital for adequate sample size calculations in randomized controlled trials (RCTs). Underestimating expected dropout rates increases the risk of false negative results, while overestimating rates results in overly large sample sizes, raising both ethical and economic issues. To estimate expected dropout rates in RCTs on yoga interventions, MEDLINE/PubMed, Scopus, IndMED, and the Cochrane Library were searched through February 2014; a total of 168 RCTs were meta-analyzed. Overall dropout rate was 11.42% (95% confidence interval [CI] = 10.11%, 12.73%) in the yoga groups; rates were comparable in usual care and psychological control groups and were slightly higher in exercise control groups (rate = 14.53%; 95% CI = 11.56%, 17.50%; odds ratio = 0.82; 95% CI = 0.68, 0.98; p = 0.03). For RCTs with durations above 12 weeks, dropout rates in yoga groups increased to 15.23% (95% CI = 11.79%, 18.68%). The upper border of 95% CIs for dropout rates commonly was below 20% regardless of study origin, health condition, gender, age groups, and intervention characteristics; however, it exceeded 40% for studies on HIV patients or heterogeneous age groups. In conclusion, dropout rates can be expected to be less than 15 to 20% for most RCTs on yoga interventions. Yet dropout rates beyond 40% are possible depending on the participants' sociodemographic and health condition.
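
    In planning terms, the practical use of such dropout estimates is to inflate the calculated sample size so that the expected number of completers still meets the analysis requirement, i.e. n_adjusted = n / (1 - d). A minimal sketch (the target of 64 completers per arm is illustrative, not from the review):

    ```python
    import math

    def inflate_for_dropout(n_required, dropout_rate):
        """Inflate a per-arm sample size so the expected number of
        completers still meets the analysis requirement."""
        return math.ceil(n_required / (1.0 - dropout_rate))

    print(inflate_for_dropout(64, 0.15))  # ~15% dropout -> 76 per arm
    print(inflate_for_dropout(64, 0.40))  # conservative 40% -> 107 per arm
    ```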

  1. A Systematic Review and Meta-Analysis Estimating the Expected Dropout Rates in Randomized Controlled Trials on Yoga Interventions

    PubMed Central

    Haller, Heidemarie; Dobos, Gustav; Lauche, Romy

    2016-01-01

    A reasonable estimation of expected dropout rates is vital for adequate sample size calculations in randomized controlled trials (RCTs). Underestimating expected dropout rates increases the risk of false negative results, while overestimating rates results in overly large sample sizes, raising both ethical and economic issues. To estimate expected dropout rates in RCTs on yoga interventions, MEDLINE/PubMed, Scopus, IndMED, and the Cochrane Library were searched through February 2014; a total of 168 RCTs were meta-analyzed. Overall dropout rate was 11.42% (95% confidence interval [CI] = 10.11%, 12.73%) in the yoga groups; rates were comparable in usual care and psychological control groups and were slightly higher in exercise control groups (rate = 14.53%; 95% CI = 11.56%, 17.50%; odds ratio = 0.82; 95% CI = 0.68, 0.98; p = 0.03). For RCTs with durations above 12 weeks, dropout rates in yoga groups increased to 15.23% (95% CI = 11.79%, 18.68%). The upper border of 95% CIs for dropout rates commonly was below 20% regardless of study origin, health condition, gender, age groups, and intervention characteristics; however, it exceeded 40% for studies on HIV patients or heterogeneous age groups. In conclusion, dropout rates can be expected to be less than 15 to 20% for most RCTs on yoga interventions. Yet dropout rates beyond 40% are possible depending on the participants' sociodemographic and health condition. PMID:27413387

  2. Deliberate self-harm in older adults: a review of the literature from 1995 to 2004.

    PubMed

    Chan, Jenifer; Draper, Brian; Banerjee, Sube

    2007-08-01

    The prevention of suicide is a national and international policy priority. Old age is an important predictor of completed suicide. Suicide rates in old age differ markedly from country to country, but there is a general trend towards increasing rates with increasing age. In 1996, Draper critically reviewed the evidence on attempted suicide in old age in the 10 years between 1985 and 1994. The review highlighted a need for prospective controlled studies in older people with more representative samples, as well as studies examining the interaction of risk factors, precipitants, motivations, psychopathology and response to treatment. The aim of this paper is to update this review and to summarise the advances in our understanding of deliberate self-harm (DSH) in later life. We have critically reviewed relevant studies published between 1995 and 2004 to summarise the advances in our understanding of factors associated with DSH in later life. The main advances in understanding have been to clarify the effect of personality and cultural factors, service utilisation pre- and post-attempt, and the (lesser) impact of socio-economic status and physical illness. Methodological weaknesses continue to include inadequate sample sizes, highly selected populations, inconsistent age criteria and a lack of informant data in studies on the role of personality. Future studies should include prospective, population-based, cross-cultural research with adequate sample sizes. Such approaches might confirm or refute the results generated to date and improve knowledge on factors such as the biological correlates of deliberate self-harm, service utilisation, costs and barriers to health care, and the interaction of these factors. Intervention studies to elucidate the impact of modifying these factors and of specific treatment packages are also needed.

  3. Effect of maternal body mass index on hormones in breast milk: a systematic review.

    PubMed

    Andreas, Nicholas J; Hyde, Matthew J; Gale, Chris; Parkinson, James R C; Jeffries, Suzan; Holmes, Elaine; Modi, Neena

    2014-01-01

    Maternal Body Mass Index (BMI) is positively associated with infant obesity risk. Breast milk contains a number of hormones that may influence infant metabolism during the neonatal period; these may have additional downstream effects on infant appetite regulatory pathways, thereby influencing propensity towards obesity in later life. To conduct a systematic review of studies examining the association between maternal BMI and the concentration of appetite-regulating hormones in breast milk. PubMed was searched for studies reporting the association between maternal BMI and leptin, adiponectin, insulin, ghrelin, resistin, obestatin, Peptide YY and Glucagon-Like Peptide 1 in breast milk. Twenty-six studies were identified and included in the systematic review. There was a high degree of variability between studies with regard to collection, preparation and analysis of breast milk samples. Eleven of fifteen studies reporting breast milk leptin found a positive association between maternal BMI and milk leptin concentration. Two of nine studies investigating adiponectin found an association between maternal BMI and breast milk adiponectin concentration; however, significance was lost in one study following adjustment for time post-partum. No association was seen between maternal BMI and milk adiponectin in the other seven studies identified. Evidence for an association between other appetite regulating hormones and maternal BMI was either inconclusive, or lacking. A positive association between maternal BMI and breast milk leptin concentration is consistently found in most studies, despite variable methodology. Evidence for such an association with breast milk adiponectin concentration, however, is lacking, with additional research needed for other hormones including insulin, ghrelin, resistin, obestatin, peptide YY and glucagon-like peptide-1. As most current studies have been conducted with small sample sizes, future studies should ensure adequate sample sizes and standardized methodology.

  4. Respondent driven sampling is an effective method for engaging methamphetamine users in HIV prevention research in South Africa

    PubMed Central

    Kimani, Stephen M.; Watt, Melissa H.; Merli, M. Giovanna; Skinner, Donald; Myers, Bronwyn; Pieterse, Desiree; MacFarlane, Jessica C.; Meade, Christina S.

    2014-01-01

    Background South Africa, in the midst of the world’s largest HIV epidemic, has a growing methamphetamine problem. Respondent driven sampling (RDS) is a useful tool for recruiting hard-to-reach populations in HIV prevention research, but its use with methamphetamine smokers in South Africa has not been described. This study examined the effectiveness of RDS as a method for engaging methamphetamine users in a Cape Town township into HIV behavioral research. Methods Standard RDS procedures were used to recruit active methamphetamine smokers from a racially diverse peri-urban township in Cape Town. Effectiveness of RDS was determined by examining social network characteristics (network size, homophily, and equilibrium) of recruited participants. Results Beginning with 8 seeds, 345 methamphetamine users were enrolled over 6 months, with a coupon return rate of 67%. The sample included 197 men and 148 women who were racially diverse (73% Coloured, 27% Black African) and had a mean age of 28.8 years (SD=7.2). Social networks were adequate (mean network size >5) and mainly composed of close social ties. Equilibrium on race was reached after 11 waves of recruitment, and after ≤3 waves for all other variables of interest. There was little to moderate preference for either in- or out-group recruiting in all subgroups. Conclusions Results suggest that RDS is an effective method for engaging methamphetamine users into HIV prevention research in South Africa. Additionally, RDS may be a useful strategy for seeking high-risk methamphetamine users for HIV testing and linkage to HIV care in this and other low resource settings. PMID:25128957

  5. Chemical release from single-PMMA microparticles monitored by CARS microscopy

    NASA Astrophysics Data System (ADS)

    Enejder, Annika; Svedberg, Fredrik; Nordstierna, Lars; Nydén, Magnus

    2011-03-01

    Microparticles loaded with antigens, proteins, DNA, fungicides, and other functional agents emerge as ideal vehicles for vaccine, drug delivery, genetic therapy, and surface and crop protection. The microscopic size of the particles and their collective large specific surface area enable highly active and localized release of the functional substance. In order to develop designs with release profiles optimized for the specific application, it is desirable to map the distribution of the active substance within the particle and how parameters such as size, material and morphology affect release rates at the single-particle level. Current imaging techniques are limited in resolution, sensitivity, image acquisition time, or sample treatment, excluding dynamic studies of active agents in microparticles. Here, we demonstrate that the combination of CARS and THG microscopy can successfully be used to map the spatial distribution and release rates of the fungicide and food preservative IPBC from different designs of PMMA microparticles at the single-particle level. By fitting a radial diffusion model to the experimental data, single-particle diffusion coefficients can be determined. We show that release rates are highly dependent on the size and morphology of the particles. Hence, CARS and THG microscopy provide adequate sensitivity and spatial resolution for quantitative studies on how single-particle properties affect the diffusion of active agents at the microscopic level. This will aid the design of innovative microencapsulating systems for controlled release.
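
    As a rough illustration of the radial diffusion modelling step, the sketch below fits Crank's series solution for Fickian release from a homogeneous sphere to a hypothetical single-particle release curve; the particle radius, time points, and release fractions are invented, not taken from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sphere_release(t, D, R=5e-6, n_terms=50):
        """Fractional release M_t/M_inf from a homogeneous sphere of
        radius R (m) by Fickian diffusion (Crank's series solution)."""
        n = np.arange(1, n_terms + 1)[:, None]
        s = np.sum(np.exp(-n**2 * np.pi**2 * D * t[None, :] / R**2) / n**2,
                   axis=0)
        return 1.0 - (6.0 / np.pi**2) * s

    # hypothetical release fractions measured at a single particle
    t_obs = np.array([0, 600, 1800, 3600, 7200, 14400], float)  # seconds
    f_obs = np.array([0.0, 0.18, 0.35, 0.52, 0.71, 0.86])
    (D_fit,), _ = curve_fit(sphere_release, t_obs, f_obs, p0=[1e-16])
    print(f"single-particle diffusion coefficient ~ {D_fit:.2e} m^2/s")
    ```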

  6. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for the standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for the generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
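
    For orientation, organ volume estimation by point counting typically follows the Cavalieri principle, V = t · (a/p) · ΣP, where t is the section spacing, a/p the area per grid point, and P the points hitting the structure. A minimal sketch with hypothetical counts:

    ```python
    def cavalieri_volume(points_per_section, area_per_point_mm2,
                         section_spacing_mm):
        """Cavalieri estimator: V = t * (a/p) * sum(P), where P are the
        grid points hitting the structure on each systematic section."""
        return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

    # hypothetical counts from 8 systematic sections of a parenchymal organ
    print(cavalieri_volume([12, 18, 23, 27, 25, 19, 14, 6],
                           area_per_point_mm2=16.0,   # grid constant a/p
                           section_spacing_mm=4.0))   # -> volume in mm^3
    ```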

  7. The need of adequate information to achieve total compliance of mass drug administration in Pekalongan

    NASA Astrophysics Data System (ADS)

    Ginandjar, Praba; Saraswati, Lintang Dian; Taufik, Opik; Nurjazuli; Widjanarko, Bagoes

    2017-02-01

    World Health Organization (WHO) initiated The Global Program to Eliminate Lymphatic Filariasis (LF) through mass drug administration (MDA). Pekalongan started MDA in 2011, yet the LF prevalence in 2015 still exceeded the 1% threshold. This study aimed to describe the inhibiting factors related to compliance with MDA at the community level. This was a rapid survey with a cross-sectional approach. Two-stage random sampling was used in this study. In the first stage, 25 clusters were randomly selected from 27 villages with probability proportional to population size (PPS) methods (C-Survey). In the second stage, 10 subjects were randomly selected from each cluster, giving 250 respondents from the 25 selected clusters. Variables consisted of MDA coverage, practice of taking medication during MDA, and enabling and inhibiting factors for MDA at the community level. The results showed that most respondents had poor knowledge of filariasis, which limited awareness of the disease. Health-illness perception, not receiving the drugs, lactation, side effects, and the size of the drugs were the dominant factors in non-compliance with MDA. Better MDA information and community empowerment are needed to improve MDA coverage. A further study to explore an appropriate model of socialization would support the success of the MDA program.
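
    A minimal sketch of the two-stage design described above (PPS selection of 25 clusters, then 10 subjects per cluster). Note that C-Survey uses systematic PPS selection, whereas this illustration draws clusters with replacement; the village sizes are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def two_stage_pps(village_sizes, n_clusters=25, n_per_cluster=10):
        """Stage 1: draw clusters with probability proportional to size (PPS).
        Stage 2: simple random sample of subjects within each cluster."""
        p = village_sizes / village_sizes.sum()
        clusters = rng.choice(len(village_sizes), size=n_clusters,
                              replace=True, p=p)     # PPS with replacement
        return [(c, rng.choice(int(village_sizes[c]), size=n_per_cluster,
                               replace=False))       # subject indices
                for c in clusters]

    sizes = rng.integers(500, 5000, size=27).astype(float)  # 27 villages
    sample = two_stage_pps(sizes)
    print(len(sample), "clusters,", sum(len(s[1]) for s in sample), "subjects")
    ```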

  8. The reliability of the Australasian Triage Scale: a meta-analysis

    PubMed Central

    Ebrahimi, Mohsen; Heydari, Abbas; Mazlom, Reza; Mirhaghi, Amir

    2015-01-01

    BACKGROUND: Although the Australasian Triage Scale (ATS) was developed two decades ago, its reliability has not been defined; therefore, we present a meta-analysis of the reliability of the ATS in order to reveal to what extent the ATS is reliable. DATA SOURCES: Electronic databases were searched to March 2014. The included studies were those that reported sample sizes, reliability coefficients, and an adequate description of the ATS reliability assessment. The guidelines for reporting reliability and agreement studies (GRRAS) were used. Two reviewers independently examined abstracts and extracted data. The effect size was obtained by the z-transformation of reliability coefficients. Data were pooled with random-effects models, and meta-regression was done based on the method of moments estimator. RESULTS: Six studies were ultimately included. The pooled coefficient for the ATS was substantial at 0.428 (95% CI 0.340–0.509). The rate of mis-triage was less than fifty percent. Agreement on the adult version was higher than on the pediatric version. CONCLUSION: The ATS has shown an acceptable level of overall reliability in the emergency department, but it needs more development to reach an almost perfect agreement. PMID:26056538
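
    A minimal sketch of the pooling approach described above: reliability coefficients are Fisher z-transformed, combined under a DerSimonian-Laird random-effects model, and back-transformed. Treating each coefficient like a correlation (variance 1/(n-3)) is a common approximation; the coefficients and sample sizes below are hypothetical, not the six included studies.

    ```python
    import numpy as np

    def pool_reliability(r, n):
        """Random-effects pooling of reliability coefficients via Fisher's
        z-transform (DerSimonian-Laird between-study variance)."""
        r, n = np.asarray(r, float), np.asarray(n, float)
        z = np.arctanh(r)                    # Fisher z-transformation
        v = 1.0 / (n - 3.0)                  # within-study variance of z
        w = 1.0 / v
        z_fixed = np.sum(w * z) / np.sum(w)
        q = np.sum(w * (z - z_fixed) ** 2)   # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(r) - 1)) / c)
        w_re = 1.0 / (v + tau2)              # random-effects weights
        z_re = np.sum(w_re * z) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
        return np.tanh([z_re, lo, hi])       # back-transform to r scale

    # hypothetical coefficients and sample sizes from six studies
    print(pool_reliability([0.35, 0.42, 0.51, 0.40, 0.46, 0.38],
                           [120, 90, 200, 150, 60, 110]))
    ```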

  9. Transitioning from adequate to inadequate sleep duration associated with higher smoking rate and greater nicotine dependence in a population sample

    PubMed Central

    Patterson, Freda; Grandner, Michael A.; Lozano, Alicia; Satti, Aditi; Ma, Grace

    2017-01-01

    Introduction Inadequate sleep (≤6 or ≥9 h) is more prevalent in smokers than non-smokers, but the extent to which sleep duration in smokers relates to smoking behaviors and cessation outcomes is not yet clear. To begin to address this knowledge gap, we investigated the extent to which sleep duration predicted smoking behaviors and quitting intention in a population sample. Methods Data from current smokers who completed the baseline (N=635) and 5-year follow-up (N=477) assessment in the United Kingdom Biobank cohort study were analyzed. Multivariable regression models using smoking behavior outcomes (cigarettes per day, time to first cigarette, difficulty not smoking for a day, quitting intention) and sleep duration (adequate (7–8 h) versus inadequate (≤6 or ≥9 h)) as the predictor were generated. All models adjusted for age, sex, race, and education. Results Worsening sleep duration (adequate to inadequate) predicted more than three-fold higher odds of increased cigarettes per day (OR=3.18; 95% CI=1.25–8.06), more than three-fold higher odds of not smoking for the day remaining difficult (OR=3.90; 95% CI=1.27–12.01), and more than eight-fold higher odds of higher nicotine dependence (OR=8.98; 95% CI=2.81–28.66). Improving sleep duration (i.e., inadequate to adequate sleep) did not predict reduced cigarette consumption or nicotine dependence in this population sample. Conclusion Transitioning from adequate to inadequate sleep duration may be a risk factor for developing a more “hard-core” smoking profile. The extent to which achieving healthy sleep may promote or optimize smoking cessation treatment response warrants investigation. PMID:28950118

  10. Impact of sampling techniques on measured stormwater quality data for small streams

    USDA-ARS?s Scientific Manuscript database

    Science-based sampling methodologies are needed to enhance water quality characterization for developing Total Maximum Daily Loads (TMDLs), setting appropriate water quality standards, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water qual...

  11. Modeling of human operator dynamics in simple manual control utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second-order dynamic system in both pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
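
    A least-squares fit of a second-order ARX model is one simple way to reproduce this kind of identification step. The sketch below uses synthetic tracking data, and its residual-variance ratio is only a stand-in for the paper's normalized residual criterion.

    ```python
    import numpy as np

    def fit_arx(u, y, na=2, nb=2):
        """Least-squares fit of a second-order ARX model:
        y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2] + e[k]."""
        k0 = max(na, nb)
        rows = [np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
                for k in range(k0, len(y))]
        X, t = np.asarray(rows), y[k0:]
        theta, *_ = np.linalg.lstsq(X, t, rcond=None)
        resid = t - X @ theta
        return theta, resid.var() / t.var()   # normalized residual variance

    # synthetic tracking data sampled every 100 msec
    rng = np.random.default_rng(1)
    u = rng.normal(size=500)                  # target (input) signal
    y = np.zeros(500)
    for k in range(2, 500):                   # "operator" with 2nd-order dynamics
        y[k] = (1.2*y[k-1] - 0.5*y[k-2]
                + 0.3*u[k-1] + 0.1*u[k-2] + 0.05*rng.normal())
    theta, nrv = fit_arx(u, y)
    print(theta, nrv)                         # theta ~ [1.2, -0.5, 0.3, 0.1]
    ```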

  12. Accounting for patient size in the optimization of dose and image quality of pelvis cone beam CT protocols on the Varian OBI system.

    PubMed

    Wood, Tim J; Moore, Craig S; Horsfield, Carl J; Saunderson, John R; Beavis, Andrew W

    2015-01-01

    The purpose of this study was to develop size-based radiotherapy kilovoltage cone beam CT (CBCT) protocols for the pelvis. Image noise was measured in an elliptical phantom of varying size for a range of exposure factors. Based on a previously defined "small pelvis" reference patient and CBCT protocol, appropriate exposure factors for small, medium, large and extra-large patients were derived which approximate the image noise behaviour observed on a Philips CT scanner (Philips Medical Systems, Best, Netherlands) with automatic exposure control (AEC). Selection criteria, based on maximum tube current-time product per rotation selected during the radiotherapy treatment planning scan, were derived based on an audit of patient size. It has been demonstrated that 110 kVp yields acceptable image noise for reduced patient dose in pelvic CBCT scans of small, medium and large patients, when compared with manufacturer's default settings (125 kVp). Conversely, extra-large patients require increased exposure factors to give acceptable images. 57% of patients in the local population now receive much lower radiation doses, whereas 13% require higher doses (but now yield acceptable images). The implementation of size-based exposure protocols has significantly reduced radiation dose to the majority of patients with no negative impact on image quality. Increased doses are required on the largest patients to give adequate image quality. The development of size-based CBCT protocols that use the planning CT scan (with AEC) to determine which protocol is appropriate ensures adequate image quality whilst minimizing patient radiation dose.

  13. Evaluation of the US Army Institute of Public Health Destination Monitoring Program, a food safety surveillance program.

    PubMed

    Rapp-Santos, Kamala; Havas, Karyn; Vest, Kelly

    2015-01-01

    The Destination Monitoring Program, operated by the US Army Public Health Command (APHC), is one component that supports the APHC Veterinary Service's mission to ensure the safety and quality of food procured for the Department of Defense (DoD). This program relies on retail product testing to ensure compliance of production facilities and distributors that supply food to the DoD. The program was assessed for validity and timeliness by specifically evaluating whether the sample size of items collected was adequate, whether food samples collected were representative of risk, and whether the program returned results in a timely manner. Data were collected from the US Army Veterinary Services Lotus Notes database, including all food samples collected and submitted from APHC Region-North for the purposes of destination monitoring from January 1, 2013 to December 31, 2013. For most food items, only one sample was submitted for testing. The ability to correctly identify a contaminated food lot may be limited by reliance on test results from only one sample, as the level of confidence in a negative test result is low. The food groups most frequently sampled by APHC correlated with the commodities implicated in foodborne illness in the United States. Food items to be submitted were equally distributed among districts and branches, but sections within large branches submitted relatively few food samples compared to sections within smaller branches and districts. Finally, laboratory results were not available for about half the food items prior to their respective expiration dates.
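
    The low confidence in a negative result from a single sample follows directly from the binomial detection model P(detect) = 1 - (1 - p)^n. A minimal sketch (the 5% contamination rate is illustrative, not from the report):

    ```python
    import math

    def detection_probability(p_contaminated, n_samples):
        """P(at least one positive) when each sampled unit is independently
        contaminated with probability p and tests are perfect."""
        return 1.0 - (1.0 - p_contaminated) ** n_samples

    def samples_needed(p_contaminated, confidence=0.95):
        """Smallest n giving the requested confidence of detection."""
        return math.ceil(math.log(1.0 - confidence)
                         / math.log(1.0 - p_contaminated))

    print(detection_probability(0.05, 1))   # ~0.05 with a single sample
    print(samples_needed(0.05))             # -> 59 samples for 95% confidence
    ```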

  14. Improving the Yield of Histological Sampling in Patients With Suspected Colorectal Cancer During Colonoscopy by Introducing a Colonoscopy Quality Assurance Program.

    PubMed

    Gado, Ahmed; Ebeid, Basel; Abdelmohsen, Aida; Axon, Anthony

    2011-08-01

    Masses discovered by clinical examination, imaging or endoscopic studies that are suspicious for malignancy typically require biopsy confirmation before treatment is initiated. Biopsy specimens may fail to yield a definitive diagnosis if the lesion is extensively ulcerated or otherwise necrotic and viable tumor tissue is not obtained on sampling. The diagnostic yield is improved when multiple biopsy samples (BSs) are taken. A colonoscopy quality-assurance program (CQAP) was instituted in 2003 in our institution. The aim of this study was to determine the effect of instituting a CQAP on the yield of histological sampling in patients with suspected colorectal cancer (CRC) during colonoscopy. Initial assessment of colonoscopy practice was performed in 2003. A total of five patients with suspected CRC during colonoscopy were documented in 2003. BSs confirmed CRC in three (60%) patients and were nondiagnostic in two (40%). A quality-improvement process was instituted which required a minimum of six BSs of adequate size from any suspected CRC during colonoscopy. A total of 37 patients for the period 2004-2010 were prospectively assessed. The diagnosis of CRC was confirmed with histological examination of BSs obtained during colonoscopy in 63% of patients in 2004, 60% in 2005, 50% in 2006, 67% in 2007, 100% in 2008, 67% in 2009 and 100% in 2010. The yield of histological sampling increased significantly (p < 0.02) from 61% in 2004-2007 to 92% in 2008-2010. The implementation of a quality assurance and improvement program increased the yield of histological sampling in patients with suspected CRC during colonoscopy.

  15. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    NASA Astrophysics Data System (ADS)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-01

    Small scale characterization experiments using only 1-5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.
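
    The paper's calibrated parameters are not reproduced here, but for orientation, ignition and growth models of this kind are commonly written in the three-term Lee-Tarver form shown below (λ is the reacted mass fraction, μ = ρ/ρ0 - 1 is the compression, p is pressure, and I, G1, G2 and the exponents are calibrated constants):

    ```latex
    % Generic three-term Lee-Tarver ignition-and-growth rate law
    \frac{d\lambda}{dt} =
        I\,(1-\lambda)^{b}\,(\mu - a)^{x}              % ignition
      + G_{1}\,(1-\lambda)^{c}\,\lambda^{d}\,p^{y}     % early growth
      + G_{2}\,(1-\lambda)^{e}\,\lambda^{g}\,p^{z}     % completion
    ```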

  16. Xpert MTB/Rif for the diagnosis of extrapulmonary tuberculosis--an experience from a tertiary care centre in South India.

    PubMed

    Suzana, Shirly; Ninan, Marilyn M; Gowri, Mahasampath; Venkatesh, Krishnan; Rupali, Priscilla; Michael, Joy S

    2016-03-01

    The Xpert MTB/Rif, with a detection limit of 131 CFU/ml, plays a valuable role in the diagnosis of extrapulmonary tuberculosis, both drug-susceptible and drug-resistant. This study aimed to evaluate the Xpert MTB/Rif for this purpose at a tertiary care centre in south India, assessing it against both culture and a composite gold standard (CGS). We tested consecutive samples from patients suspected of extrapulmonary tuberculosis with Xpert MTB/Rif and evaluated its sensitivity and specificity against solid and/or liquid culture and the CGS. An individual analysis of the different sample types (tissue biopsies, fluids, pus, lymph node biopsies and CSF), where sample sizes were adequate, against both culture and the CGS was also performed. In total, 494 samples were analysed against culture. Compared to culture, the sensitivity of Xpert MTB/Rif was 89% (95% CI 0.81-0.94) and its specificity was 74% (95% CI 0.70-0.78). When Xpert MTB/Rif was compared to the CGS, pooled sensitivity was 62% (95% CI 0.56-0.67) and specificity was 100% (95% CI 0.91-1.00). This assay performs better than the currently available conventional laboratory methods. The rapidity with which results are obtained is an added advantage, and its integration into a routine diagnostic protocol must be considered. © 2015 John Wiley & Sons Ltd.
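
    A minimal sketch of how such diagnostic accuracy figures are computed with exact (Clopper-Pearson) confidence intervals. The 2x2 counts below are hypothetical values chosen to roughly reproduce the reported sensitivity (89%) and specificity (74%) against culture.

    ```python
    from statsmodels.stats.proportion import proportion_confint

    def sens_spec(tp, fn, tn, fp, method="beta"):
        """Sensitivity and specificity with Clopper-Pearson 95% CIs."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method=method)
        spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method=method)
        return (sens, sens_ci), (spec, spec_ci)

    # hypothetical 2x2 counts for 494 extrapulmonary samples vs. culture
    print(sens_spec(tp=89, fn=11, tn=291, fp=103))
    ```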

  17. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F.

    2016-08-14

    Small scale characterization experiments using only 1–5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.

  18. Quantitative Analysis of Organophosphate and Pyrethroid Insecticides, PyrethroidTransformation Products, Polybrominated Diphenyl Ethers and Bisphenol A in Residential Surface Wipe Samples

    EPA Science Inventory

    Surface wipe sampling is a frequently used technique for measuring persistent pollutants in residential environments. One characteristic of this form of sampling is the need to extract the entire wipe sample to achieve adequate sensitivity and to ensure representativeness. Most s...

  19. General Constraints on Sampling Wildlife on FIA Plots

    Treesearch

    Larissa L. Bailey; John R. Sauer; James D. Nichols; Paul H. Geissler

    2005-01-01

    This paper reviews the constraints to sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species...

  20. Review of sampling hard-to-reach and hidden populations for HIV surveillance.

    PubMed

    Magnani, Robert; Sabin, Keith; Saidel, Tobi; Heckathorn, Douglas

    2005-05-01

    Adequate surveillance of hard-to-reach and 'hidden' subpopulations is crucial to containing the HIV epidemic in low prevalence settings and in slowing the rate of transmission in high prevalence settings. For a variety of reasons, however, conventional facility and survey-based surveillance data collection strategies are ineffective for a number of key subpopulations, particularly those whose behaviors are illegal or illicit. This paper critically reviews alternative sampling strategies for undertaking behavioral or biological surveillance surveys of such groups. Non-probability sampling approaches such as facility-based sentinel surveillance and snowball sampling are the simplest to carry out, but are subject to a high risk of sampling/selection bias. Most of the probability sampling methods considered are limited in that they are adequate only under certain circumstances and for some groups. One relatively new method, respondent-driven sampling, an adaptation of chain-referral sampling, appears to be the most promising for general applications. However, as its applicability to HIV surveillance in resource-poor settings has yet to be established, further field trials are needed before a firm conclusion can be reached.

  1. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu Huijun; Gordon, J. James; Siebers, Jeffrey V.

    2011-02-15

    Purpose: A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D_v exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Methods: Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals ω (e.g., ω = 1°, 2°, 5°, 10°, 20°). Isotropic samples were uniformly distributed on the unit sphere, resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs, and accordingly characterized by the effective angular increment ω_eff. In each direction, the DM was calculated by moving the structure in radial steps of size δ (= 0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy ΔQ was quantified as a function of the sampling parameters ω or ω_eff and δ. Results: The accuracy of coverage estimates depends on the angular and radial DMD sampling parameters ω or ω_eff and δ, as well as the employed sampling technique. Target |ΔQ| < 1% and OAR |ΔQ| < 3% can be achieved with sampling parameters ω or ω_eff = 20°, δ = 1 mm. Better accuracy (target |ΔQ| < 0.5% and OAR |ΔQ| < ~1%) can be achieved with ω or ω_eff = 10°, δ = 0.5 mm. As the number of sampling points decreases, the isotropic sampling method maintains better accuracy than fixed angular sampling. Conclusions: Coverage estimates for post-planning evaluation are essential since coverage values of targets and OARs often differ from the values implied by the static margin-based plans. Finer sampling of the DMD enables more accurate assessment of the effect of geometric uncertainties on coverage estimates prior to treatment. DMD sampling with ω or ω_eff = 10° and δ = 0.5 mm should be adequate for planning purposes.
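
    A minimal sketch of the isotropic variant of this sampling scheme: directions drawn uniformly on the unit sphere, with the DM found by stepping the structure radially until the isodose surface is crossed. The geometry is a toy example (a spherical target inside a spherical isodose surface), not a treatment plan.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def isotropic_directions(n):
        """Directions approximately uniform on the unit sphere."""
        v = rng.normal(size=(n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    def dosimetric_margins(inside_isodose, surface_pts, directions,
                           step=0.5, max_shift=30.0):
        """For each direction, shift the structure surface in radial steps
        of `step` (mm) until some point crosses the isodose surface; the
        travelled distance is the dosimetric margin in that direction."""
        dms = []
        for d in directions:
            shift = 0.0
            while (shift < max_shift
                   and inside_isodose(surface_pts + shift * d).all()):
                shift += step
            dms.append(shift)
        return np.array(dms)

    # toy case: spherical target (r=20 mm) inside a spherical isodose (r=28 mm)
    inside = lambda pts: np.linalg.norm(pts, axis=1) < 28.0
    theta = rng.uniform(0, np.pi, 200); phi = rng.uniform(0, 2*np.pi, 200)
    surf = 20.0 * np.c_[np.sin(theta)*np.cos(phi),
                        np.sin(theta)*np.sin(phi), np.cos(theta)]
    dmd = dosimetric_margins(inside, surf, isotropic_directions(100))
    print(dmd.mean(), dmd.min(), dmd.max())   # DMs cluster near 8 mm
    ```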

  2. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters.

    PubMed

    Xu, Huijun; Gordon, J James; Siebers, Jeffrey V

    2011-02-01

    A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D_v exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals ω (e.g., ω = 1°, 2°, 5°, 10°, 20°). Isotropic samples were uniformly distributed on the unit sphere, resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs, and accordingly characterized by the effective angular increment ω_eff. In each direction, the DM was calculated by moving the structure in radial steps of size δ (= 0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy ΔQ was quantified as a function of the sampling parameters ω or ω_eff and δ. The accuracy of coverage estimates depends on the angular and radial DMD sampling parameters ω or ω_eff and δ, as well as the employed sampling technique. Target |ΔQ| < 1% and OAR |ΔQ| < 3% can be achieved with sampling parameters ω or ω_eff = 20°, δ = 1 mm. Better accuracy (target |ΔQ| < 0.5% and OAR |ΔQ| < ~1%) can be achieved with ω or ω_eff = 10°, δ = 0.5 mm. As the number of sampling points decreases, the isotropic sampling method maintains better accuracy than fixed angular sampling. Coverage estimates for post-planning evaluation are essential since coverage values of targets and OARs often differ from the values implied by the static margin-based plans. Finer sampling of the DMD enables more accurate assessment of the effect of geometric uncertainties on coverage estimates prior to treatment. DMD sampling with ω or ω_eff = 10° and δ = 0.5 mm should be adequate for planning purposes.

  3. Evaluation of fish sampling using rotenone in a navigation lock

    USGS Publications Warehouse

    Margraf, F.J.; Knight, C.T.

    2002-01-01

    Annual sampling in locks with rotenone has been a principal means of assessing fish populations in the commercially navigable portions of the Ohio River. Despite extensive use, sampling in locks with rotenone and interpretation of the data obtained have not been adequately evaluated. The purpose of our study was to determine the degree of inter- and intraseasonal variation in lock samples, estimate correction factors (CFs) for fish recovery rates, and compare lock samples to other fish collections from the navigational pools above and below the lock. Lock samples from all seasons had a greater proportion of pelagic and demersal fish than samples from the navigational pools, which contained greater proportions of littoral species. CFs for non-recovery of fish were determined. Spring and summer lock collections yielded several more species, and estimates of overall fish biomass were an order of magnitude higher than in fall collections. Within-season variation between lock samples was relatively low; however, variation in lock samples among seasons was high, equivalent to that seen among the annual samples from the 1980s. Thus, single-season sampling may not be adequate, and fall may be the least preferred season.

  4. Comparison of Performance Characteristics of Oval Cup Forceps Versus Serrated Jaw Forceps in Gastric Biopsy.

    PubMed

    Sussman, Daniel A; Deshpande, Amar R; Shankar, Uday; Barkin, Jodie A; Medina, Ana Maria; Poppiti, Robert J; Cubeddu, Luigi X; Barkin, Jamie S

    2016-08-01

    Obtaining quality endoscopic biopsy specimens is vital in making successful histological diagnoses. The influence of forceps cup shape and size on quality of biopsy specimens is unclear. To identify whether oval cup or two different serrated jaw biopsy forceps could obtain specimens of superior size. Secondary endpoints were tissue adequacy, depth of tissue acquisition, and crush artifact. A single-center, prospective, pathologist-masked, randomized controlled trial was performed. In total 136 patients with a clinical indication for esophagogastroduodenoscopy with biopsy were randomized to receive serial biopsies with a large-capacity serrated forceps with jaw diameter 2.2 mm (SER1) and either a large-capacity oval forceps with jaw diameter 2.4 mm (OVL) or large-capacity serrated biopsy forceps with jaw diameter 2.4 mm (SER2) in two parallel groups. SER2 provided significantly larger specimens than did the other forceps (SER2 3.26 ± 1.09 vs. SER1 2.92 ± 0.88 vs. OVL 2.92 ± 0.76; p = 0.026), with an average size difference of 0.34 mm greater with SER2 compared to SER1 and OVL. OVL provided significantly deeper biopsies compared to SER1 and SER2 (p = 0.02), with 31 % of OVL biopsies reaching the submucosa. SER2 had significantly less crush artifact than SER1 and OVL (p < 0.0001). Serrated forceps provided larger samples compared to oval jaw forceps of the same size, with SER2 providing the largest specimen size. Oval cup forceps had deeper penetration of epithelium, while the larger jaw diameter serrated jaw forceps had less crush artifact. All three forceps provided specimens adequate for diagnostic purposes.

  5. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL{sub 95%}) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL{sub 95%}) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL{sub 95%} was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
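
    A minimal sketch of the UCL95% computation described above, mean + t(0.95, n-1) · s / sqrt(n); the six concentrations are hypothetical, not the Tank 18F data.

    ```python
    import numpy as np
    from scipy import stats

    def ucl95(x):
        """One-sided upper 95% confidence limit on the mean concentration:
        UCL95 = mean + t(0.95, n-1) * s / sqrt(n)."""
        x = np.asarray(x, float)
        n = len(x)
        return x.mean() + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)

    # hypothetical analyte concentrations from six scrape samples (ug/g)
    print(ucl95([12.1, 9.8, 11.4, 10.6, 13.0, 10.2]))
    ```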

  6. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  7. Sediment transport simulation in an armoured stream

    USGS Publications Warehouse

    Milhous, Robert T.; Bradley, Jeffrey B.; Loeffler, Cindy L.

    1986-01-01

    Improved methods of calculating bed material stability and transport must be developed for a gravel bed stream having an armoured surface in order to use the HEC-6 model to examine channel change. Good possibilities exist for use of a two-layer model based on the Schoklitsch and the Einstein-Brown transport equations. In Einstein-Brown, the D35 of the armour is used for stability and the D50 of the bed (sub-surface) is used for transport. Data on the armour and sub-surface size distributions need to be obtained as part of a bed material study in a gravel bed river; a "shovel" sample is not adequate. The Meyer-Peter and Müller equation should not be applied to a gravel bed stream with an armoured surface to estimate the initiation of transport or for calculation of transport at low effective bed shear stress.

  8. A mechanistic review on plant-derived natural compounds as dietary supplements for prevention of inflammatory bowel disease.

    PubMed

    Farzaei, Mohammad Hosein; Bahramsoltani, Roodabeh; Abdolghaffari, Amir Hossein; Sodagari, Hamid Reza; Esfahani, Shadi A; Rezaei, Nima

    2016-06-01

    Inflammatory bowel disease (IBD) is a recurrent idiopathic inflammatory condition, characterized by disruption of the gut mucosal barrier. This mechanistic review aims to highlight the significance of plant-derived natural compounds as dietary supplements, which can be used in addition to restricted conventional options for the prevention of IBD and the induction of remission. Various clinical trials have confirmed the effectiveness and tolerability of natural supplements in patients with IBD. Mounting evidence suggests that these natural compounds exert their protective and therapeutic effects on IBD through numerous molecular mechanisms, including anti-inflammatory, immunoregulatory, and anti-oxidative-stress activity, modulation of intracellular signaling transduction pathways, and improvement of the gut microbiota. In conclusion, natural products can be considered dietary supplements with therapeutic potential for IBD, provided that their safety and efficacy are confirmed in future well-designed clinical trials with adequate sample sizes.

  9. Treatments for somnambulism in adults: assessing the evidence.

    PubMed

    Harris, Melanie; Grunstein, Ronald R

    2009-08-01

    Somnambulism, or sleepwalking, is a parasomnia of non-rapid eye movement (NREM) sleep where movement behaviours usually confined to wakefulness are displayed during sleep. Generally, if sleepwalking is causing distress or danger in spite of safety measures, medical or psychological treatment is indicated. Clinicians will need to assess the evidence for treatment options. MEDLINE, EMBASE, PsycINFO and the Ovid Evidence-Based Medicine Reviews (EBM) multifile databases were searched. No properly powered rigorous controlled trials were found for treatment of sleepwalking in adults. Seven reports described small trials with some kind of control arm, or retrospective case series which included 30 or more patients. With no high quality evidence to underpin recommendations for treatments of somnambulism, full discussion with patients is advised. Adequately powered, well-designed clinical trials are now needed, and multi-centre collaborations may be required to obtain the sample sizes required.

  10. Invited article: Neurology education research.

    PubMed

    Stern, Barney J; Lowenstein, Daniel H; Schuh, Lori A

    2008-03-11

    There is a need to rigorously study the neurologic education of medical students, neurology residents, and neurologists to determine the effectiveness of our educational efforts. We review the status of neurologic education research as it pertains to the groups of interest. We identify opportunities and impediments for education research. The introduction of the Accreditation Council for Graduate Medical Education core competencies, the Accreditation Council of Continuing Medical Education requirement to link continuing medical education to improved physician behavior and patient care, and the American Board of Medical Specialties/American Board of Psychiatry and Neurology-mandated maintenance of certification program represent research opportunities. Challenges include numerous methodologic issues such as definition of the theoretical framework of the study, adequate sample size ascertainment, and securing research funding. State-of-the-art education research will require multidisciplinary research teams and innovative funding strategies. The central goal of all concerned should be defining educational efforts that improve patient outcomes.

  11. Age-related invariance of abilities measured with the Wechsler Adult Intelligence Scale-IV.

    PubMed

    Sudarshan, Navaneetham J; Bowden, Stephen C; Saklofske, Donald H; Weiss, Lawrence G

    2016-11-01

    Assessment of measurement invariance across populations is essential for meaningful comparison of test scores, and is especially relevant where repeated measurements are required for educational assessment or clinical diagnosis. Establishing measurement invariance legitimizes the assumption that test scores reflect the same psychological trait in different populations or across different occasions. Examination of Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) U.S. standardization samples revealed that a first-order 5-factor measurement model was best fitting across 9 age groups from 16 years to 69 years. Strong metric invariance was found for 3 of 5 factors and partial intercept invariance for the remaining 2. Pairwise comparisons of adjacent age groups supported the inference that cognitive-trait group differences are manifested by group differences in the test scores. In educational and clinical settings these findings provide theoretical and empirical support to interpret changes in the index or subtest scores as reflecting changes in the corresponding cognitive abilities. Further, where clinically relevant, the subtest score composites can be used to compare changes in respective cognitive abilities. The model was supported in the Canadian standardization data with pooled age groups but the sample sizes were not adequate for detailed examination of separate age groups in the Canadian sample. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Multiscale dispersion-state characterization of nanocomposites using optical coherence tomography

    PubMed Central

    Schneider, Simon; Eppler, Florian; Weber, Marco; Olowojoba, Ganiu; Weiss, Patrick; Hübner, Christof; Mikonsaari, Irma; Freude, Wolfgang; Koos, Christian

    2016-01-01

    Nanocomposite materials represent a success story of nanotechnology. However, development of nanomaterial fabrication still suffers from the lack of adequate analysis tools. In particular, achieving and maintaining well-dispersed particle distributions is a key challenge, both in material development and industrial production. Conventional methods like optical or electron microscopy need laborious, costly sample preparation and do not permit fast extraction of nanoscale structural information from statistically relevant sample volumes. Here we show that optical coherence tomography (OCT) represents a versatile tool for nanomaterial characterization, both in a laboratory and in a production environment. The technique does not require sample preparation and is applicable to a wide range of solid and liquid material systems. Large particle agglomerates can be directly found by OCT imaging, whereas dispersed nanoparticles are detected by model-based analysis of depth-dependent backscattering. Using a model system of polystyrene nanoparticles, we demonstrate nanoparticle sizing with high accuracy. We further prove the viability of the approach by characterizing highly relevant material systems based on nanoclays or carbon nanotubes. The technique is perfectly suited for in-line metrology in a production environment, which is demonstrated using a state-of-the-art compounding extruder. These experiments represent the first demonstration of multiscale nanomaterial characterization using OCT. PMID:27557544

  13. Multiscale dispersion-state characterization of nanocomposites using optical coherence tomography.

    PubMed

    Schneider, Simon; Eppler, Florian; Weber, Marco; Olowojoba, Ganiu; Weiss, Patrick; Hübner, Christof; Mikonsaari, Irma; Freude, Wolfgang; Koos, Christian

    2016-08-25

    Nanocomposite materials represent a success story of nanotechnology. However, development of nanomaterial fabrication still suffers from the lack of adequate analysis tools. In particular, achieving and maintaining well-dispersed particle distributions is a key challenge, both in material development and industrial production. Conventional methods like optical or electron microscopy need laborious, costly sample preparation and do not permit fast extraction of nanoscale structural information from statistically relevant sample volumes. Here we show that optical coherence tomography (OCT) represents a versatile tool for nanomaterial characterization, both in a laboratory and in a production environment. The technique does not require sample preparation and is applicable to a wide range of solid and liquid material systems. Large particle agglomerates can be directly found by OCT imaging, whereas dispersed nanoparticles are detected by model-based analysis of depth-dependent backscattering. Using a model system of polystyrene nanoparticles, we demonstrate nanoparticle sizing with high accuracy. We further prove the viability of the approach by characterizing highly relevant material systems based on nanoclays or carbon nanotubes. The technique is perfectly suited for in-line metrology in a production environment, which is demonstrated using a state-of-the-art compounding extruder. These experiments represent the first demonstration of multiscale nanomaterial characterization using OCT.
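
    As an illustration of model-based analysis of depth-dependent backscattering, the sketch below fits a single-scattering decay model to a synthetic OCT A-scan to recover a scattering coefficient; the data and parameter values are invented, not taken from the study, and relating the coefficient to particle size and concentration would require an additional scattering model such as Mie theory.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def single_scatter_model(z, i0, mu_s):
        """Single-scattering OCT depth profile: I(z) = I0 * exp(-2*mu_s*z).
        The factor 2 accounts for the round trip of the probe light."""
        return i0 * np.exp(-2.0 * mu_s * z)

    # synthetic averaged A-scan of a nanoparticle suspension
    z = np.linspace(0.0, 1.0, 50)                 # depth in mm
    rng = np.random.default_rng(3)
    signal = single_scatter_model(z, 1.0, 1.8) * (1 + 0.03*rng.normal(size=50))
    popt, _ = curve_fit(single_scatter_model, z, signal, p0=[1.0, 1.0])
    print(f"scattering coefficient ~ {popt[1]:.2f} 1/mm")
    ```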

  14. Precision and relative effectiveness of a purse seine for sampling age-0 river herring in lakes

    USGS Publications Warehouse

    Devine, Matthew T.; Roy, Allison; Whiteley, Andrew R.; Gahagan, Benjamin I.; Armstrong, Michael P.; Jordaan, Adrian

    2018-01-01

    Stock assessments for anadromous river herring, collectively Alewife Alosa pseudoharengus and Blueback Herring A. aestivalis, lack adequate demographic information, particularly with respect to early life stages. Although sampling adult river herring is increasingly common throughout their range, currently no standardized, field‐based, analytical methods exist for estimating juvenile abundance in freshwater lakes. The objective of this research was to evaluate the relative effectiveness and sampling precision of a purse seine for estimating densities of age‐0 river herring in freshwater lakes. We used a purse seine to sample age‐0 river herring in June–September 2015 and June–July 2016 in 16 coastal freshwater lakes in the northeastern USA. Sampling effort varied from two seine hauls to more than 50 seine hauls per lake. Catch rates were highest in June and July, and sampling precision was maximized in July. Sampling at night (versus day) in open water (versus littoral areas) was most effective for capturing newly hatched larvae and juveniles up to ca. 100 mm TL. Bootstrap simulation results indicated that sampling precision of CPUE estimates increased with sampling effort, and there was a clear threshold beyond which increased effort resulted in negligible increases in precision. The effort required to produce precise CPUE estimates, as determined by the CV, was dependent on lake size; river herring densities could be estimated with up to 10 purse‐seine hauls (one‐two nights) in a small lake (<50 ha) and 15–20 hauls (two‐three nights) in a large lake (>50 ha). Fish collection techniques using a purse seine as described in this paper are likely to be effective for estimating recruit abundance of river herring in freshwater lakes across their range.
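    The bootstrap exercise described above is easy to mimic. The following sketch uses simulated, hypothetical haul counts rather than the study's data: it resamples individual seine hauls and tracks how the CV of mean CPUE falls as effort grows, illustrating the diminishing-returns threshold the authors report.

    ```python
    import numpy as np

    # Hypothetical catch-per-haul data; overdispersed counts mimic patchy schools.
    rng = np.random.default_rng(1)
    hauls = rng.negative_binomial(n=2, p=0.05, size=50)

    for effort in (5, 10, 15, 20, 30):
        # Bootstrap: resample `effort` hauls many times, examine spread of mean CPUE.
        boot_means = [rng.choice(hauls, size=effort, replace=True).mean()
                      for _ in range(2000)]
        cv = np.std(boot_means) / np.mean(boot_means)
        print(f"{effort:2d} hauls -> bootstrap CV of CPUE ~ {cv:.2f}")
    # Precision improves quickly at first and then flattens, the threshold
    # behaviour behind the recommendation of ~10 hauls for small lakes and
    # 15-20 hauls for large ones.
    ```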

  15. The Power of Low Back Pain Trials: A Systematic Review of Power, Sample Size, and Reporting of Sample Size Calculations Over Time, in Trials Published Between 1980 and 2012.

    PubMed

    Froud, Robert; Rajendran, Dévan; Patel, Shilpa; Bright, Philip; Bjørkli, Tom; Eldridge, Sandra; Buchbinder, Rachelle; Underwood, Martin

    2017-06-01

    A systematic review of nonspecific low back pain trials published between 1980 and 2012. To explore what proportion of trials have been powered to detect different bands of effect size; whether there is evidence that sample size in low back pain trials has been increasing; what proportion of trial reports include a sample size calculation; and whether the likelihood of reporting sample size calculations has increased. Clinical trials should have a sample size sufficient to detect a minimally important difference for a given power and type I error rate. An underpowered trial is one in which the probability of type II error is too high. Meta-analyses do not mitigate underpowered trials. Reviewers independently abstracted data on sample size at point of analysis, whether a sample size calculation was reported, and year of publication. Descriptive analyses were used to explore the ability to detect effect sizes, and regression analyses to explore the relationship between sample size, or reporting sample size calculations, and time. We included 383 trials. One-third were powered to detect a standardized mean difference of less than 0.5, and 5% were powered to detect less than 0.3. The average sample size was 153 people, which increased only slightly (∼4 people/yr) from 1980 to 2000, and declined slightly (∼4.5 people/yr) from 2005 to 2011 (P < 0.00005). Sample size calculations were reported in 41% of trials. The odds of reporting a sample size calculation (compared to not reporting one) increased until 2005 and then declined (Equation is included in full-text article.). Sample sizes in back pain trials and the reporting of sample size calculations may need to be increased. It may be justifiable to power a trial to detect only large effects in the case of novel interventions. Level of Evidence: 3.
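    The power statements above follow from the standard two-arm sample size formula. As a rough check, assuming a two-sided z-test at alpha = 0.05 and 80% power (a simplification of the review's methods), the required group size for a standardized mean difference d is n = 2((z_{1-alpha/2} + z_{1-beta})/d)^2:

    ```python
    from scipy.stats import norm

    def n_per_arm(smd, power=0.80, alpha=0.05):
        """Approximate n per arm to detect a standardized mean difference
        with a two-sided z-test (normal approximation)."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return 2 * (z / smd) ** 2

    for smd in (0.3, 0.5, 0.8):
        print(f"SMD {smd}: ~{n_per_arm(smd):.0f} per arm")
    # SMD 0.3 needs roughly 175 per arm (~350 total), so an average total
    # sample of 153 leaves most trials powered only for large effects.
    ```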

  16. Accommodating oversize and overweight loads : technical report.

    DOT National Transportation Integrated Search

    2012-07-01

    Adequate management of oversize/overweight (OS/OW) permit loads throughout the state of Texas is critical to maintaining a vibrant state economy. The growth in the number and size of permit loads in recent years is clear evidence that new tools a...

  17. Challenges of DNA-based mark-recapture studies of American black bears

    USGS Publications Warehouse

    Settlage, K.E.; Van Manen, F.T.; Clark, J.D.; King, T.L.

    2008-01-01

    We explored whether genetic sampling would be feasible to provide a region-wide population estimate for American black bears (Ursus americanus) in the southern Appalachians, USA. Specifically, we determined whether adequate capture probabilities (p >0.20) and population estimates with a low coefficient of variation (CV <20%) could be achieved given typical agency budget and personnel constraints. We extracted DNA from hair collected from baited barbed-wire enclosures sampled over a 10-week period on 2 study areas: a high-density black bear population in a portion of Great Smoky Mountains National Park and a lower density population on National Forest lands in North Carolina, South Carolina, and Georgia. We identified individual bears by their unique genotypes obtained from 9 microsatellite loci. We sampled 129 and 60 different bears in the National Park and National Forest study areas, respectively, and applied closed mark–recapture models to estimate population abundance. Capture probabilities and precision of the population estimates were acceptable only for sampling scenarios for which we pooled weekly sampling periods. We detected capture heterogeneity biases, probably because of inadequate spatial coverage by the hair-trapping grid. The logistical challenges of establishing and checking a sufficiently high density of hair traps make DNA-based estimates of black bears impractical for the southern Appalachian region. Alternatives are to estimate population size for smaller areas, estimate population growth rates or survival using mark–recapture methods, or use independent marking and recapturing techniques to reduce capture heterogeneity.
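    To see why low capture probabilities undermine precision, consider the simplest closed-population estimator. The sketch below uses Chapman's bias-corrected Lincoln-Petersen estimator with made-up numbers in the range reported above; the study itself fit richer closed mark-recapture models, so this is only an illustration of the CV problem.

    ```python
    # Chapman's bias-corrected Lincoln-Petersen estimator (illustrative only).
    def chapman(n1, n2, m2):
        """n1 marked in session 1, n2 caught in session 2, m2 recaptured."""
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    def chapman_var(n1, n2, m2):
        return ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
                / ((m2 + 1) ** 2 * (m2 + 2)))

    # Hypothetical counts: 60 bears marked, 55 caught later, 12 recaptures.
    n_hat = chapman(60, 55, 12)
    cv = chapman_var(60, 55, 12) ** 0.5 / n_hat
    print(f"N_hat = {n_hat:.0f}, CV = {cv:.0%}")
    # Few recaptures (low capture probability) push the CV above the 20%
    # target, matching the feasibility problem described in the abstract.
    ```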

  18. SU-E-J-188: Theoretical Estimation of Margin Necessary for Markerless Motion Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, R; Block, A; Harkenrider, M

    2015-06-15

    Purpose: To estimate the margin necessary to adequately cover the target using markerless motion tracking (MMT) of lung lesions, given the uncertainty in tracking and the size of the target. Methods: Simulations were developed in Matlab to determine the effect of tumor size and tracking uncertainty on the margin necessary to achieve adequate coverage of the target. For simplicity, the lung tumor was approximated by a circle on a 2D radiograph. The tumor diameter was varied from 0.1 to 30 mm in increments of 0.1 mm. From our previous studies using dual-energy markerless motion tracking, we estimated tracking uncertainties in x and y to have a standard deviation of 2 mm. A Gaussian was used to simulate the deviation between the tracked location and the true target location. For each tumor size, 100,000 deviations were randomly generated, and the margin necessary to achieve at least 95% coverage 95% of the time was recorded. Additional simulations were run for varying uncertainties to demonstrate the effect of tracking accuracy on margin size. Results: The simulations showed an inverse relationship between tumor size and the margin necessary to achieve 95% coverage 95% of the time using the MMT technique; the margin decreased exponentially with target size. An increase in tracking accuracy, as expected, led to a decrease in margin size as well. Conclusion: In our clinic a 5 mm expansion of the internal target volume (ITV) is used to define the planning target volume (PTV). These simulations show that for tracking accuracies in x and y better than 2 mm, the margin required is less than 5 mm. This simple simulation can provide physicians with a guideline estimate of the margin necessary for clinical use of MMT, based on the accuracy of their tracking and the size of the tumor.
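    The abstract's Monte Carlo procedure can be reproduced in outline. The sketch below (Python rather than the authors' Matlab, with assumed details: circular 2D target, independent Gaussian tracking errors of sigma = 2 mm in x and y) searches for the smallest margin that gives at least 95% area coverage in at least 95% of simulated tracked positions; it reproduces the inverse relationship between target size and required margin.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def overlap_area(r1, r2, d):
        """Area of intersection of two circles with radii r1, r2, centres d apart."""
        if d >= r1 + r2:
            return 0.0
        if d <= abs(r2 - r1):
            return np.pi * min(r1, r2) ** 2
        a1 = r1**2 * np.arccos((d**2 + r1**2 - r2**2) / (2 * d * r1))
        a2 = r2**2 * np.arccos((d**2 + r2**2 - r1**2) / (2 * d * r2))
        k = 0.5 * np.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
        return a1 + a2 - k

    def required_margin(radius, sigma=2.0, trials=5_000, step=0.1):
        xy = rng.normal(0.0, sigma, size=(2, trials))
        miss = np.hypot(xy[0], xy[1])          # tracked vs true distance [mm]
        tumour_area = np.pi * radius**2
        for m in np.arange(0.0, 10.0 + step, step):   # scan candidate margins
            frac_ok = np.mean([overlap_area(radius, radius + m, d) / tumour_area >= 0.95
                               for d in miss])
            if frac_ok >= 0.95:
                return m
        return float("nan")

    for r in (1.0, 5.0, 15.0):                 # tumour radius [mm]
        print(f"tumour radius {r:4.1f} mm -> margin ~ {required_margin(r):.1f} mm")
    ```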

  19. Meta-Analysis of Workplace Physical Activity Interventions

    PubMed Central

    Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Brown, Lori M.; Lusk, Sally L.

    2009-01-01

    Context Most adults do not achieve adequate physical activity. Despite the potential benefits of worksite health promotion, no previous comprehensive meta-analysis has summarized health and physical activity behavior outcomes from these programs. This comprehensive meta-analysis integrated the extant wide range of worksite physical activity intervention research. Evidence acquisition Extensive searching located published and unpublished intervention studies reported from 1969 through 2007. Results were coded from primary studies. Random-effects meta-analytic procedures, including moderator analyses, were completed in 2008. Evidence synthesis Effects on most variables were substantially heterogeneous because diverse studies were included. Standardized mean difference (d) effect sizes were synthesized across approximately 38,231 subjects. Significantly positive effects were observed for physical activity behavior (0.21), fitness (0.57), lipids (0.13), anthropometric measures (0.08), work attendance (0.19), and job stress (0.33). The significant effect size for diabetes risk (0.98) is more tentative given small sample sizes. Significant heterogeneity indicates that intervention effects varied across studies. The mean effect size for fitness corresponds to a difference between treatment minus control subjects' means on VO2max of 3.5 mL/kg/min; for lipids, −0.2 on total cholesterol:HDL; and for diabetes risk, −12.6 mg/dL on fasting glucose. Conclusions These findings document that some workplace physical activity interventions can improve both health and important worksite outcomes. Effects were variable for most outcomes, reflecting the diversity of primary studies. Future primary research should compare interventions to confirm causal relationships and further explore heterogeneity. PMID:19765506

  20. Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size

    PubMed Central

    Kühberger, Anton; Fritz, Astrid; Scherndl, Thomas

    2014-01-01

    Background The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent from sample size. Yet this may not hold true empirically: non-independence could indicate publication bias. Methods We investigate whether effect size is independent from sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes of all empirical papers, and calculated the correlation between effect size and sample size, and investigated the distribution of p values. Results We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings. Conclusion The negative correlation between effect size and samples size, and the biased distribution of p values indicate pervasive publication bias in the entire field of psychology. PMID:25192357
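    A small simulation makes the diagnostic logic concrete. The code below is not the authors' analysis; it assumes a small true effect (d = 0.2) and a selection rule under which only significant results are published, and shows that the published record then exhibits a strong negative correlation between effect size and sample size:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    true_d, published = 0.2, []
    while len(published) < 1000:
        n = int(rng.integers(10, 200))              # per-group sample size
        g1 = rng.normal(true_d, 1.0, n)
        g2 = rng.normal(0.0, 1.0, n)
        if stats.ttest_ind(g1, g2).pvalue < 0.05:   # selective publication
            sd = np.sqrt((g1.var(ddof=1) + g2.var(ddof=1)) / 2)
            published.append(((g1.mean() - g2.mean()) / sd, 2 * n))

    d_obs, n_tot = zip(*published)
    r, _ = stats.pearsonr(d_obs, n_tot)
    print(f"r(effect size, sample size) = {r:.2f}")  # strongly negative
    ```

    Under full publication the same simulation yields a correlation near zero, which is why an observed r of -.45 is read as evidence of pervasive bias.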

  2. Does routine screening for breast cancer raise anxiety? Results from a three wave prospective study in England.

    PubMed Central

    Sutton, S; Saidi, G; Bickler, G; Hunter, J

    1995-01-01

    OBJECTIVE--To investigate whether mammography raises anxiety in routinely screened women who receive a negative result. DESIGN--Prospective design in which women completed questionnaires at three key points in the breast screening process: at baseline (before being sent their invitation for breast screening), at the screening clinic immediately before or after screening, and at follow up, about nine months after baseline. Information was obtained from non-attenders as well as from attenders. SETTING--Bromley District Health Authority, served by the South East London Breast Screening Service. PARTICIPANTS--Two overlapping samples were used. Sample A comprised 1500 women aged 50-64 who were due to be called for first round screening at a mobile screening unit. Altogether 1021 (68%) returned a usable questionnaire and 795 of these (78%) also provided adequate information at nine month follow up: there were 695 attenders (including 24 women who received false positive results) and 100 non-attenders. Sample B consisted of 868 women who attended the screening unit in a three month period, 732 (84%) of whom provided adequate data. A total of 306 attenders (including 10 who received false positive results) occurred in both samples and provided adequate information on all occasions. The main analyses were based on these 306 women plus the 100 non-attenders. The analysis of retrospective anxiety took advantage of the larger sample size of 695 attenders. MAIN RESULTS--On average, the women were not unduly anxious at any of the three points in the screening process. Among attenders, there was no difference between anxiety levels immediately before and immediately after screening. Anxiety was lowest at the clinic and highest at baseline but the changes were very small in absolute terms. Anxiety did not predict attendance: there were no differences in anxiety levels between attenders and non-attenders at baseline. As expected, women who received false positive results recalled feeling extremely anxious after they had received the referral letter but their retrospective anxiety was also higher than in the negative screenees at earlier stages in the breast screening process. They also reported having experienced more pain and discomfort during the x ray. CONCLUSIONS--Anxiety does not seem to be an important problem in routinely screened women who receive a negative result. This finding is very reassuring in relation to a major criticism of breast screening programmes. Thus, apart from maintaining current procedures such as keeping waiting times to a minimum, there seems to be no need to introduce special anxiety reducing interventions into the national programme. On the other hand, the findings for women who received false positive results suggest that there are aspects of the experience of being recalled for assessment after an abnormal mammogram that warrant further attention. The relationship between contemporaneous and retrospective anxiety should also be studied. PMID:7650466

  3. Economic evaluation of crop acreage estimation by multispectral remote sensing. [Michigan

    NASA Technical Reports Server (NTRS)

    Manderscheid, L. V.; Nalepka, R. F. (Principal Investigator); Myers, W.; Safir, G.; Ilhardt, D.; Morgenstern, J. P.; Sarno, J.

    1976-01-01

    The author has identified the following significant results. Photointerpretation of S190A and S190B imagery showed significantly better resolution with the S190B system. A small tendency to underestimate acreage was observed. This averaged 6 percent and varied with field size. The S190B system had adequate resolution for acreage measurement, but the color film did not provide adequate contrast to allow detailed classification of ground cover from imagery of a single date. In total, 78 percent of the fields were correctly classified, but with only 56 percent correct for the major crop, corn.

  4. Aerial estimation of the size of gull breeding colonies

    USGS Publications Warehouse

    Kadlec, J.A.; Drury, W.H.

    1968-01-01

    Counts on photographs and visual estimates of the numbers of territorial gulls are usually reliable indicators of the number of gull nests, but single visual estimates are not adequate to measure the number of nests in individual colonies. Properly interpreting gull counts requires that several islands with known numbers of nests be photographed to establish the ratio of gulls to nests applicable for a given local census. Visual estimates are adequate to determine total breeding gull numbers by region. Neither visual estimates nor photography will reliably detect annual changes of less than about 2.5 percent.

  5. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    PubMed

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
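    For the untrimmed special case, the cost-optimal split has a classical closed form: allocate n1/n2 = (sigma1/sigma2) * sqrt(c2/c1), where c1 and c2 are per-subject costs. The sketch below applies this result with a normal-approximation power formula; it is a simplified analogue of the paper's trimmed-mean formulas, with hypothetical inputs.

    ```python
    import math
    from scipy.stats import norm

    def allocate(sigma1, sigma2, c1, c2, delta, alpha=0.05, power=0.80):
        """Cost-optimal group sizes for a two-sample z-test on means."""
        ratio = (sigma1 / sigma2) * math.sqrt(c2 / c1)   # n1 / n2
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        # Power constraint with n1 = ratio * n2:
        # delta^2 = z^2 * (sigma1^2 / n1 + sigma2^2 / n2)
        n2 = (z / delta) ** 2 * (sigma1**2 / ratio + sigma2**2)
        return math.ceil(ratio * n2), math.ceil(n2)

    # Hypothetical case: group 1 twice as variable, group 2 twice as costly.
    n1, n2 = allocate(sigma1=2.0, sigma2=1.0, c1=1.0, c2=2.0, delta=0.5)
    print(n1, n2)   # unequal allocation beats a 1:1 split at the same cost
    ```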

  6. Sponge and skin excision sampling for recovery of Salmonella and Campylobacter from defeathered broiler carcasses

    USDA-ARS?s Scientific Manuscript database

    Introduction: Salmonella and Campylobacter contamination of broiler carcass skin increases during feather removal. There are several methods for sampling carcasses including sponging or swabbing of skin surface and skin excision. It is unclear whether sponge sampling is adequate to remove bacteri...

  8. Continuous microbial cultures maintained by electronically-controlled device

    NASA Technical Reports Server (NTRS)

    Eisler, W. J., Jr.; Webb, R. B.

    1967-01-01

    Photocell-controlled instrument maintains microbial culture. It uses commercially available chemostat glassware, provides adequate aeration through bubbling of the culture, maintains the population size and density, continuously records growth rates over small increments of time, and contains a simple, sterilizable nutrient control mechanism.

  9. Comparison of the Cytobrush®, dermatological curette and oral CDx® brush test as methods for obtaining samples of RNA for molecular analysis of oral cytology.

    PubMed

    Reboiras-López, M D; Pérez-Sayáns, M; Somoza-Martín, J M; Gayoso-Diz, P; Barros-Angueira, F; Gándara-Rey, J M; García-García, A

    2012-06-01

    Interest in oral exfoliative cytology has increased with the availability of molecular markers that may lead to the earlier diagnosis of oral squamous cell carcinoma. This research aims to compare the efficacy of three different instruments (Cytobrush, curette and Oral CDx brush) in providing adequate material for molecular analysis. One hundred and four cytological samples obtained from volunteer healthy subjects were analysed using all three instruments. The clinical and demographical variables under study were age, sex and smoking habits. The three instruments were compared for their ability to obtain adequate samples and for the amount of RNA obtained using quantitative real-time polymerase chain reaction (PCR-qRT) analysis of the Abelson (ABL) housekeeping gene. RNA of the ABL gene has been quantified by number of copies. Adequate samples were more likely to be obtained with a curette (90.6%) or Oral CDx (80.0%) than a Cytobrush (48.6%); P < 0.001. Similarly, the RNA quantification was 17.64 ± 21.10 with a curette, 16.04 ± 15.81 with Oral CDx and 6.82 ± 6.71 with a Cytobrush. There were statistically significant differences between the Cytobrush and curette (P = 0.008) and between the Cytobrush and OralCDx (P = 0.034). There was no difference according to the demographical variables. Oral exfoliative cytology is a simple, non-invasive technique that provides sufficient RNA to perform studies on gene expression. Although material was obtained with all three instruments, adequate samples were more likely to be obtained with the curette or Oral CDx than with a Cytobrush. The Oral CDx is a less aggressive instrument than the curette, so could be a useful tool in a clinical setting. © 2011 Blackwell Publishing Ltd.

  10. Design and rationale of the Cardiovascular Health and Text Messaging (CHAT) Study and the CHAT-Diabetes Mellitus (CHAT-DM) Study: two randomised controlled trials of text messaging to improve secondary prevention for coronary heart disease and diabetes

    PubMed Central

    Huo, Xiqian; Spatz, Erica S; Ding, Qinglan; Horak, Paul; Zheng, Xin; Masters, Claire; Zhang, Haibo; Irwin, Melinda L; Yan, Xiaofang; Guan, Wenchi; Li, Jing; Li, Xi; Spertus, John A; Masoudi, Frederick A; Krumholz, Harlan M; Jiang, Lixin

    2017-01-01

    Introduction Mobile health interventions have the potential to promote risk factor management and lifestyle modification, and are a particularly attractive approach for scaling across healthcare systems with limited resources. We are conducting two randomised trials to evaluate the efficacy of text message-based health messages in improving secondary coronary heart disease (CHD) prevention among patients with or without diabetes. Methods and analysis The Cardiovascular Health And Text Messaging (CHAT) Study and the CHAT-Diabetes Mellitus (CHAT-DM) Study are multicentre, single-blind, randomised controlled trials of text messaging versus standard treatment with 6 months of follow-up conducted in 37 hospitals throughout 17 provinces in China. The intervention group receives six text messages per week which target blood pressure control, medication adherence, physical activity, smoking cessation (when appropriate), glucose monitoring and lifestyle recommendations including diet (in CHAT-DM). The text messages were developed based on behavioural change techniques, using models such as the information-motivation-behavioural skills model, goal setting and provision of social support. A total sample size of 800 patients would be adequate for the CHAT Study, and a total sample size of 500 patients would be adequate for the CHAT-DM Study. In CHAT, the primary outcome is the change in systolic blood pressure (SBP) at 6 months. Secondary outcomes include a change in the proportion of patients achieving an SBP <140 mm Hg, low-density lipoprotein cholesterol (LDL-C), physical activity, medication adherence, body mass index (BMI) and smoking cessation. In CHAT-DM, the primary outcome is the change in glycated haemoglobin (HbA1C) at 6 months. Secondary outcomes include a change in the proportion of patients achieving HbA1C <7%, fasting blood glucose, SBP, LDL-C, BMI, physical activity and medication adherence. Ethics and dissemination The central ethics committee at the China National Center for Cardiovascular Disease and the Yale University Institutional Review Board approved the CHAT and CHAT-DM studies. Results will be disseminated via usual scientific forums including peer-reviewed publications. Trial registration number CHAT (NCT02888769) and CHAT-DM (NCT02883842); Pre-results. PMID:29273661

  11. The ANTOP study: focal psychodynamic psychotherapy, cognitive-behavioural therapy, and treatment-as-usual in outpatients with anorexia nervosa - a randomized controlled trial

    PubMed Central

    Wild, Beate; Friederich, Hans-Christoph; Gross, Gaby; Teufel, Martin; Herzog, Wolfgang; Giel, Katrin E; de Zwaan, Martina; Schauenburg, Henning; Schade-Brittinger, Carmen; Schäfer, Helmut; Zipfel, Stephan

    2009-01-01

    Background Anorexia nervosa is a serious eating disorder leading to high morbidity and mortality as a result of both malnutrition and suicide. The seriousness of the disorder requires extensive knowledge of effective treatment options. However, evidence for treatment efficacy in this area is remarkably weak. A recent Cochrane review states that there is an urgent need for large, well-designed treatment studies for patients with anorexia nervosa. The aim of this particular multi-centre study is to evaluate the efficacy of two standardized outpatient treatments for patients with anorexia nervosa: focal psychodynamic (FPT) and cognitive behavioural therapy (CBT). Each therapeutic approach is compared to a "treatment-as-usual" control group. Methods/Design 237 patients meeting eligibility criteria are randomly and evenly assigned to the three groups – two intervention groups (CBT and FPT) and one control group. The treatment period for each intervention group is 10 months, consisting of 40 sessions respectively. Body weight, eating disorder related symptoms, and variables of therapeutic alliance are measured during the course of treatment. Psychotherapy sessions are audiotaped for adherence monitoring. The treatment in the control group, both the dosage and type of therapy, is not regulated in the study protocol, but rather reflects the current practice of established outpatient care. The primary outcome measure is the body mass index (BMI) at the end of the treatment (10 months after randomization). Discussion The study design surmounts the disadvantages of previous studies in that it provides a randomized controlled design, a large sample size, adequate inclusion criteria, an adequate treatment protocol, and a clear separation of the treatment conditions in order to avoid contamination. Nevertheless, the study has to deal with difficulties specific to the psychopathology of anorexia nervosa. The treatment protocol allows for dealing with the typically occurring medical complications without dropping patients from the protocol. However, because patients are difficult to recruit and often ambivalent about treatment, a drop-out rate of 30% is assumed for sample size calculation. Due to the ethical problem of denying active treatment to patients with anorexia nervosa, the control group is defined as "treatment-as-usual". Trial registration Current Controlled Trials ISRCTN72809357 PMID:19389245

  13. Report: EPA Needs to Fulfill Its Designated Responsibilities to Ensure Effective BioWatch Program

    EPA Pesticide Factsheets

    Report #2005-P-00012, March 23, 2005. EPA did not provide adequate oversight of the sampling operations to ensure quality assurance guidance was adhered to, potentially affecting the quality of the samples taken.

  14. Evaluating Simulated Tropical Convective Cores using HAIC-HIWC Microphysics and Dynamics Observations

    NASA Astrophysics Data System (ADS)

    Stanford, M.; Varble, A.; Zipser, E. J.; Strapp, J. W.; Leroy, D.; Schwarzenboeck, A.; Korolev, A.; Potts, R.

    2016-12-01

    A model intercomparison study is conducted to identify biases in simulated tropical convective core microphysical properties using two popular bulk parameterization schemes (Thompson and Morrison) and the Fast Spectral Bin Microphysics (FSBM) scheme. In-situ aircraft measurements of total condensed water content (TWC) and particle size distributions are compared with output from high-resolution WRF simulations of 4 mesoscale convective system (MCS) cases during the High Altitude Ice Crystals-High Ice Water Content (HAIC-HIWC) field campaign conducted in Darwin, Australia in 2014 and Cayenne, French Guiana in 2015. Observations of TWC collected using an isokinetic evaporator probe (IKP) optimized for high IWC measurements in conjunction with particle image processing from two optical array probes aboard the Falcon-20 research aircraft were used to constrain mass-size relationships in the observational dataset. Hydrometeor mass size distributions are compared between retrievals and simulations providing insight into the well-known high bias in simulated convective radar reflectivity. For TWC > 1 g m-3 between -10 and -40°C, simulations generally produce significantly greater median mass diameters (MMDs). Observations indicate that a sharp particle size mode occurs at 300 μm for large TWC values (> 2 g m-3) regardless of temperature. All microphysics schemes fail to reproduce this feature, and relative contributions of different hydrometeor species to this size bias vary between schemes. Despite far greater sample sizes, simulations also fail to produce high TWC conditions with very little of the mass contributed by large particles for a range of temperatures, despite such conditions being observed. Considering vapor grown particles alone in comparison with observations fails to correct the bias present in all schemes. Decreasing horizontal resolution from 1 km to 333 m shifts graupel and rain size distributions to slightly smaller sizes, but increased resolution alone will clearly not eliminate model biases. Results instead indicate that biases in both hydrometeor size distribution assumptions and parameterized processes also exist and need to be addressed before cloud and precipitation properties of convective systems can be adequately predicted.

  15. Adjusting for multiple prognostic factors in the analysis of randomised trials

    PubMed Central

    2013-01-01

    Background When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata, formed by all combinations of the prognostic factors (stratified analysis), when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) the best method of adjustment in terms of type I error rate and power, irrespective of the randomisation method. Methods We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) to compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results Stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal number of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes, however treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size. PMID:23898993

  16. Accounting for patient size in the optimization of dose and image quality of pelvis cone beam CT protocols on the Varian OBI system

    PubMed Central

    Moore, Craig S; Horsfield, Carl J; Saunderson, John R; Beavis, Andrew W

    2015-01-01

    Objective: The purpose of this study was to develop size-based radiotherapy kilovoltage cone beam CT (CBCT) protocols for the pelvis. Methods: Image noise was measured in an elliptical phantom of varying size for a range of exposure factors. Based on a previously defined “small pelvis” reference patient and CBCT protocol, appropriate exposure factors for small, medium, large and extra-large patients were derived which approximate the image noise behaviour observed on a Philips CT scanner (Philips Medical Systems, Best, Netherlands) with automatic exposure control (AEC). Selection criteria, based on maximum tube current–time product per rotation selected during the radiotherapy treatment planning scan, were derived based on an audit of patient size. Results: It has been demonstrated that 110 kVp yields acceptable image noise for reduced patient dose in pelvic CBCT scans of small, medium and large patients, when compared with manufacturer's default settings (125 kVp). Conversely, extra-large patients require increased exposure factors to give acceptable images. 57% of patients in the local population now receive much lower radiation doses, whereas 13% require higher doses (but now yield acceptable images). Conclusion: The implementation of size-based exposure protocols has significantly reduced radiation dose to the majority of patients with no negative impact on image quality. Increased doses are required on the largest patients to give adequate image quality. Advances in knowledge: The development of size-based CBCT protocols that use the planning CT scan (with AEC) to determine which protocol is appropriate ensures adequate image quality whilst minimizing patient radiation dose. PMID:26419892

  17. Optimal flexible sample size design with robust power.

    PubMed

    Zhang, Lanju; Cui, Lu; Yang, Bo

    2016-08-30

    It is well recognized that sample size determination is challenging because of the uncertainty on the treatment effect size. Several remedies are available in the literature. Group sequential designs start with a sample size based on a conservative (smaller) effect size and allow early stop at interim looks. Sample size re-estimation designs start with a sample size based on an optimistic (larger) effect size and allow sample size increase if the observed effect size is smaller than planned. Different opinions favoring one type over the other exist. We propose an optimal approach using an appropriate optimality criterion to select the best design among all the candidate designs. Our results show that (1) for the same type of designs, for example, group sequential designs, there is room for significant improvement through our optimization approach; (2) optimal promising zone designs appear to have no advantages over optimal group sequential designs; and (3) optimal designs with sample size re-estimation deliver the best adaptive performance. We conclude that to deal with the challenge of sample size determination due to effect size uncertainty, an optimal approach can help to select the best design that provides most robust power across the effect size range of interest. Copyright © 2016 John Wiley & Sons, Ltd.

  18. 7 CFR 58.427 - Paraffin tanks.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 58.427 Paraffin tanks. The metal tank should be adequate in size, have wood rather than metal racks to...

  19. 7 CFR 58.427 - Paraffin tanks.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 58.427 Paraffin tanks. The metal tank should be adequate in size, have wood rather than metal racks to...

  20. 7 CFR 58.427 - Paraffin tanks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 58.427 Paraffin tanks. The metal tank should be adequate in size, have wood rather than metal racks to...

  1. 7 CFR 58.427 - Paraffin tanks.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    § 58.427 Paraffin tanks. The metal tank should be adequate in size, have wood rather than metal racks to...

  2. 7 CFR 58.427 - Paraffin tanks.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 58.427 Paraffin tanks. The metal tank should be adequate in size, have wood rather than metal racks to...

  3. 40 CFR 241.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... manner that adequately prevents releases or other hazards to human health and the environment considering... subdivision of a state, or any interstate body. Processing means any operations that transform discarded non.... Minimal operations that result only in modifying the size of the material by shredding do not constitute...

  4. Vocabulary Levels and Size of Malaysian Undergraduates

    ERIC Educational Resources Information Center

    Harji, Madhubala Bava; Balakrishnan, Kavitha; Bhar, Sareen Kaur; Letchumanan, Krishnaveni

    2015-01-01

    Vocabulary is a fundamental requirement of language acquisition, and its competence enables independent reading and effective language acquisition. Effective language use requires adequate level of vocabulary knowledge; therefore, efforts must be made to identify students' vocabulary base for greater efficiency and competency in the language.…

  5. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.

  6. Sample Size Estimation: The Easy Way

    ERIC Educational Resources Information Center

    Weller, Susan C.

    2015-01-01

    This article presents a simple approach to making quick sample size estimates for basic hypothesis tests. Although there are many sources available for estimating sample sizes, methods are not often integrated across statistical tests, levels of measurement of variables, or effect sizes. A few parameters are required to estimate sample sizes and…

  7. Counting at low concentrations: the statistical challenges of verifying ballast water discharge standards

    USGS Publications Warehouse

    Frazier, Melanie; Miller, A. Whitman; Lee, Henry; Reusser, Deborah A.

    2013-01-01

    Discharge from the ballast tanks of ships is one of the primary vectors of nonindigenous species in marine environments. To mitigate this environmental and economic threat, international, national, and state entities are establishing regulations to limit the concentration of living organisms that may be discharged from the ballast tanks of ships. The proposed discharge standards have ranged from zero detectable organisms to 3. If standard sampling methods are used, verifying whether ballast discharge complies with these stringent standards will be challenging due to the inherent stochasticity of sampling. Furthermore, at low concentrations, very large volumes of water must be sampled to find enough organisms to accurately estimate concentration. Despite these challenges, adequate sampling protocols comprise a critical aspect of establishing standards because they help define the actual risk level associated with a standard. A standard that appears very stringent may be effectively lax if it is paired with an inadequate sampling protocol. We describe some of the statistical issues associated with sampling at low concentrations to help regulators understand the uncertainties of sampling as well as to inform the development of sampling protocols that ensure discharge standards are adequately implemented.
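    The core difficulty is Poisson counting statistics: if organisms are randomly distributed at concentration c, a sampled volume v detects at least one organism with probability 1 - exp(-c*v). A short calculation (illustrative, not from the paper) shows how quickly the required volume grows as standards tighten:

    ```python
    import math

    # Under Poisson sampling, P(detect >= 1 organism) = 1 - exp(-c * v).
    def volume_for_detection(c, p_detect=0.95):
        """Volume [m^3] needed to see at least one organism with prob p_detect."""
        return -math.log(1 - p_detect) / c

    for c in (10.0, 1.0, 0.1):                 # organisms per m^3
        v = volume_for_detection(c)
        print(f"c = {c:>4} /m^3 -> need ~{v:.1f} m^3 for 95% detection")
    # At 0.1 organisms/m^3, roughly 30 m^3 must be sampled just to see one
    # organism with 95% probability -- the core verification challenge.
    ```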

  8. The Relationship between Sample Sizes and Effect Sizes in Systematic Reviews in Education

    ERIC Educational Resources Information Center

    Slavin, Robert; Smith, Dewi

    2009-01-01

    Research in fields other than education has found that studies with small sample sizes tend to have larger effect sizes than those with large samples. This article examines the relationship between sample size and effect size in education. It analyzes data from 185 studies of elementary and secondary mathematics programs that met the standards of…

  9. Generalized Variance Function Applications in Forestry

    Treesearch

    James Alegria; Charles T. Scott

    1991-01-01

    Adequately predicting the sampling errors of tabular data can reduce printing costs by eliminating the need to publish separate sampling error tables. Two generalized variance functions (GVFs) found in the literature and three GVFs derived for this study were evaluated for their ability to predict the sampling error of tabular forestry estimates. The recommended GVFs...

  10. 21 CFR 2.10 - Examination and investigation samples.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of analytical results, to be adequate to establish the respects, if any, in which the article is... shipment or other lot of the article from which such sample was collected was introduced or delivered for... article from which the sample is collected. (b) When an officer or employee of the Department collects an...

  11. Phylogenetic effective sample size.

    PubMed

    Bartoszek, Krzysztof

    2016-10-21

    In this paper I address the question: how large is a phylogenetic sample? I propose a definition of a phylogenetic effective sample size for Brownian motion and Ornstein-Uhlenbeck processes: the regression effective sample size. I discuss how mutual information can be used to define an effective sample size in the non-normal process case and compare these two definitions to an already present concept of effective sample size (the mean effective sample size). Through a simulation study I find that the AICc is robust if one corrects for the number of species or effective number of species. Lastly, I discuss how the concept of the phylogenetic effective sample size can be useful for biodiversity quantification, identification of interesting clades and deciding on the importance of phylogenetic correlations. Copyright © 2016 Elsevier Ltd. All rights reserved.
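    A back-of-the-envelope version of the mean effective sample size idea (a standard design-effect formula, simpler than the regression ESS proposed in the paper): n tips sharing a common pairwise correlation rho are worth about n / (1 + (n - 1) rho) independent observations.

    ```python
    def effective_n(n, rho):
        """Effective sample size of n equally correlated observations."""
        return n / (1 + (n - 1) * rho)

    for rho in (0.0, 0.1, 0.5, 0.9):
        print(f"rho = {rho:.1f}: 100 tips ~ {effective_n(100, rho):5.1f} "
              "independent samples")
    # Strong phylogenetic correlation can shrink a 100-species clade to the
    # informational equivalent of just a few independent data points.
    ```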

  12. Evaluation of an aerosol photometer for monitoring welding fume levels in a shipyard.

    PubMed

    Glinsmann, P W; Rosenthal, F S

    1985-07-01

    A direct reading aerosol photometer (Sibata P-5 Digital Dust Indicator) was used to assess fume levels from welding and burning operations in a shipyard. The photometer was calibrated with gravimetric analysis of filter samples collected simultaneously with instrument readings. A six-fold difference between calibration factors for personal and area samples was found. This difference can be explained by expected changes in particle size distributions in welding fume. Monitoring of various work situations was performed in order to assess the value of the photometer for the measurement of fume. Measurements categorized by enclosure of space and quality of ventilation indicated the presence of high fume levels in semi-enclosed and enclosed spaces. The build-up of welding fume in an enclosed space occurred over several minutes after the arc was struck. Decay likewise required several minutes. During welding, wide fluctuations of fume concentrations were found. Thus, a single reading was not adequate to characterize average fume levels. Although this type of instrument is useful for locating areas with high fume levels and monitoring the effectiveness of ventilation, the uncertainty in calibration factors makes accurate determinations of fume levels difficult.
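    The calibration step described above amounts to regressing gravimetric filter concentrations on simultaneous photometer readings. The sketch below uses synthetic numbers (all values hypothetical) to show how a calibration factor is derived, and why personal and area samples, with different particle size distributions, need separate factors:

    ```python
    import numpy as np

    # Hypothetical paired data: photometer readings and gravimetric results.
    rng = np.random.default_rng(4)
    readings = rng.uniform(0.5, 8.0, 20)            # photometer output (a.u.)
    true_factor = 1.8                               # assumed mg/m^3 per unit
    gravimetric = true_factor * readings * rng.lognormal(0.0, 0.15, 20)

    # Slope of the fit is the calibration factor for this sample type.
    factor, intercept = np.polyfit(readings, gravimetric, 1)
    print(f"calibration factor ~ {factor:.2f} mg/m^3 per unit reading")
    # A six-fold difference between personal and area factors, as found in
    # the study, means one global factor would badly mis-estimate exposure.
    ```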

  13. Multiplexing of miniaturized planar antibody arrays for serum protein profiling--a biomarker discovery in SLE nephritis.

    PubMed

    Petersson, Linn; Dexlin-Mellby, Linda; Bengtsson, Anders A; Sturfelt, Gunnar; Borrebaeck, Carl A K; Wingren, Christer

    2014-06-07

    In the quest to decipher disease-associated biomarkers, miniaturized and multiplexed antibody arrays may play a central role in generating protein expression profiles, or protein maps, of crude serum samples. In this conceptual study, we explored a novel, 4-times larger pen design, enabling us to, in a unique manner, simultaneously print 48 different reagents (antibodies) as individual 78.5 μm² spots (10 μm in diameter) at a density of 38,000 spots cm⁻² using dip-pen nanolithography technology. The antibody array set-up was interfaced with a high-resolution fluorescent-based scanner for sensitive sensing. The performance and applicability of this novel 48-plex recombinant antibody array platform design was demonstrated in a first clinical application targeting SLE nephritis, a severe chronic autoimmune connective tissue disorder, as the model disease. To this end, crude, directly biotinylated serum samples were targeted. The results showed that the miniaturized and multiplexed array platform displayed adequate performance, and that SLE-associated serum biomarker panels reflecting the disease process could be deciphered, outlining the use of miniaturized antibody arrays for disease proteomics and biomarker discovery.

  14. Feasibility of using recipients of health promotional newsletters for post-marketing surveillance.

    PubMed

    Mead, L A; Ford, D E; Roht, L H; Beach, C L; Klag, M J

    2000-06-01

    Achieving an adequate sample size is one of the major difficulties in performing post-marketing observational studies of health outcomes in persons taking specific drug preparations. We assessed the feasibility of recruiting participants for such a study of Cardizem CD from approximately 400,000 U.S. recipients of a health promotion newsletter. A three-page questionnaire was sent to a 2.5% random sample (n = 10,000) of recipients, stratified by geographic region. After two mailings, 2779 (28%) returned the questionnaire. Of the 2779 respondents, 2132 (77%) reported having high blood pressure. Eighty-seven percent indicated a willingness to participate in a long-term prospective study. In a multivariate model, calcium channel blocker (CCB) use was associated with a history of coronary heart disease, duration of hypertension medication use greater than 1 year, a rating of good or excellent hypertension care, higher systolic blood pressure, higher education level, family history of cardiovascular disease, and history of smoking. These results indicate that self-reported CCB users may be at greater risk of cardiovascular heart disease and that it is feasible to use health promotion newsletters as a source of participants in prospective studies of cardiovascular disease.

  15. [Identification of novel therapeutically effective antibiotics using silkworm infection model].

    PubMed

    Hamamoto, Hiroshi; Urai, Makoto; Paudel, Atmika; Horie, Ryo; Murakami, Kazuhisa; Sekimizu, Kazuhisa

    2012-01-01

    Most antibiotics with antibacterial activity obtained by in vitro screening have inappropriate properties as medicines due to their toxicity and pharmacodynamics in animal bodies. Thus, evaluation of the therapeutic effects of these samples using animal models is essential at the crude stage. Mammals are not suitable for therapeutic evaluation of a large number of samples due to high costs and ethical issues. We propose the use of silkworms (Bombyx mori) as model animals for screening therapeutically effective antibiotics. Silkworms are infected by various pathogenic bacteria and are effectively treated by clinically used antibiotics at similar ED50 values. Furthermore, the drug metabolism pathways, such as cytochrome P450 and conjugation systems, are similar between silkworms and mammals. Silkworms have many advantages compared with other infection models, such as their 1) low cost, 2) few associated ethical problems, 3) adequate body size for easy handling, and 4) easier separation of organs and hemolymph. These features of the silkworm allow for efficient screening of therapeutically effective antibiotics. In this review, we discuss the advantages of the silkworm model in the early stages of drug development and the screening results of some antibiotics using the silkworm infection model.

  16. Rapid and Efficient Method for the Detection of Microplastic in the Gastrointestinal Tract of Fishes.

    PubMed

    Roch, Samuel; Brinker, Alexander

    2017-04-18

    The rising evidence of microplastic pollution impacts on aquatic organisms in both marine and freshwater ecosystems highlights a pressing need for adequate and comparable detection methods. Available tissue digestion protocols are time-consuming (>10 h) and/or require several procedural steps, during which materials can be lost and contaminants introduced. This novel approach comprises an accelerated digestion step using sodium hydroxide and nitric acid in combination to digest all organic material within 1 h plus an additional separation step using sodium iodide which can be used to reduce mineral residues in samples where necessary. This method yielded a microplastic recovery rate of ≥95%, and all tested polymer types were recovered with only minor changes in weight, size, and color with the exception of polyamide. The method was also shown to be effective on field samples from two benthic freshwater fish species, revealing a microplastic burden comparable to that indicated in the literature. As a consequence, the present method saves time, minimizes the loss of material and the risk of contamination, and facilitates the identification of plastic particles and fibers, thus providing an efficient method to detect and quantify microplastics in the gastrointestinal tract of fishes.

  17. Sampling of radical prostatectomy specimens. How much is adequate?

    PubMed

    Cohen, M B; Soloway, M S; Murphy, W M

    1994-03-01

    Prostate glands from 52 patients with clinical stage B carcinoma were examined using two sampling techniques. After fixation and conization of the apical portions, each gland was serially sectioned with sections mounted whole on oversized glass slides and examined for pathologic features of prognostic importance. A second examination was subsequently conducted on the same tissue using only alternate sections. No differences in tumor type, grade, Gleason score, multiplicity, or capsular penetration were detected in 75% of cases. The discrepancies that did occur were most often minor variations in multiplicity and Gleason score. Of the 20 glands with capsular penetration observed with the serial sectioning method, 17 (85%) were detected using alternate sectioning. The surgical margin was involved in two of the three invasive foci that would have been missed. Although the topography is better displayed, the authors' examinations indicated no significant advantage to whole mount sections compared with sections mounted on standard-sized glass slides. Considering the most effective use of resources, as well as the current modalities available for patient monitoring, the results support the use of an alternate sectioning method for pathologic examination of specimens removed for clinically localized prostate cancer.

  18. Mental health of patients from different cultures in Germany.

    PubMed

    Wittig, U; Lindert, J; Merbach, M; Brähler, E

    2008-01-01

    Empirical studies on migration and the mental health of migrants are still rare. In Germany they are often characterised by small sample sizes and are limited to certain diseases and geographical areas (the old federal states), which limits the comparability of their results. Nonetheless, the assessment of migrants' health is necessary for adequate medical and psychosocial care for this target group. The aim was to provide data on the mental health of migrants from Poland and from Vietnam in Germany. We assessed a random sample of migrants from Poland (n=140) and from Vietnam (n=88) using the Giessen Subjective Complaints List - 24 (GSCL-24) and the Hospital Anxiety and Depression Scale (HADS). Additionally, we asked migrants about their knowledge of health care institutions in case of psychosocial problems, their demands, and the existing barriers to health care utilisation. Migrants from Poland and Vietnam have a higher general score of physical ill-health complaints and higher anxiety and depression values than Germans, and they visit psychosocial and medical institutions less often. Further analytical studies are needed to clarify health differences between these groups. Migrants are a heterogeneous group, and only group-specific investigations will clarify associations between countries of origin, health status and use of health care institutions.

  19. The Association of Health Literacy with the Management of Type 2 Diabetes

    NASA Astrophysics Data System (ADS)

    Kumar, Samita

    Introduction: Type 2 Diabetes (T2D) is a chronic metabolic disease characterized by high blood glucose levels. It is associated with microvascular and macrovascular complications which can lead to amputations and even death. The irony of the disease is that these complications are preventable with appropriate treatment and self-management. The Emergency Medicine Department (ED) at the University of Southwestern Medical Center conducted this study to assess health literacy in Parkland Memorial Hospital patients with T2D. The objective of the research study was to assess the association of health literacy with management of T2D. Methods: This was a prospective study with collection of personal health information (PHI) and 30-day follow-up for ED recidivism for patients with T2D presenting to the ED with diabetic complications. Eligibility was assessed by pre-screening via EPIC (the electronic medical record system for Parkland). The tool for measuring health literacy was the Short Assessment of Health Literacy (SAHL). The cut-off used for the SAHL to determine adequate or inadequate health literacy was 15; low health literacy is defined as a score of <15 on the SAHL scale. Results: The total number of subjects enrolled was 23, with 43.48% males and 56.52% females who spoke either Spanish or English. Mean age of the subjects was 50 years with a standard deviation of 10 years. About 74% were white Hispanic males. According to the data collected, 30% of the patients demonstrated inadequate health literacy based on the SAHL score. The total number of subjects required for adequate power was 400. Since the study could not reach adequate power due to low enrollment, no significant associations could be drawn from this small sample size. Conclusions: Given the low enrollment to date, the recommendation is to continue collecting data so that a larger sample size can afford the observation of statistically relevant associations. If any statistically significant associations are found, future studies will focus on improving diabetes outcomes through the development of educational tools at the individual patient's appropriate literacy level. There are many reasons to improve diabetes care and explore all possible factors that contribute to poor outcomes. Millions of people are living with uncontrolled diabetes, and the burden falls not only on the patient but also on the community as a whole. Quality care should aim for improved benchmarks for patients with diabetes and their knowledge about the disease, such as 1) HbA1c levels below 8%, 2) blood pressure in the normal range, 3) regular foot exams to check for any developing signs of pressure sores, and 4) most importantly, regular dilated eye exams.

  20. Alignment of Iron Nanoparticles in a Magnetic Field Due to Shape Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radhakrishnan, Balasubramaniam; Nicholson, Don M; Eisenbach, Markus

    2015-07-09

    During high magnetic field processing there is evidence for alignment of non-spherical metallic particles above the Curie temperature in alloys with negligible magneto-crystalline anisotropy. The main driving force for alignment is the magnetic shape anisotropy. Current understanding of the phenomenon is not adequate to quantify the effect of particle size, aspect ratio, temperature and the magnetic field on particle alignment. We demonstrate a Monte Carlo approach coupled with size scaling to show the conditions under which alignment is possible.

  1. Pyramid algorithms as models of human cognition

    NASA Astrophysics Data System (ADS)

    Pizlo, Zygmunt; Li, Zheng

    2003-06-01

    There is a growing body of experimental evidence showing that human perception and cognition involve mechanisms that can be adequately modeled by pyramid algorithms. The main aspect of these mechanisms is hierarchical clustering of information: visual images, spatial relations, and states as well as transformations of a problem. In this paper we review prior psychophysical and simulation results on visual size transformation, size discrimination, speed-accuracy tradeoff, figure-ground segregation, and the traveling salesman problem. We also present our new results on graph search and on the 15-puzzle.

  2. Characterization and modelling of the hollow beam produced by a real conical lens

    NASA Astrophysics Data System (ADS)

    Dépret, Benoı̂t; Verkerk, Philippe; Hennequin, Daniel

    2002-10-01

    The properties of the hollow beam produced by a conical lens are studied in detail. In particular, the impact of a rounded vertex is examined. It is shown that it could lead to drastic changes in the transverse distribution of the hollow beam, determined by the ratio between the transverse size of the incident beam and the size of the blunt area. An adequate choice for this ratio allows us to either minimize the losses or optimize the distribution symmetry.

  3. SAMPLING AND ANALYSIS OF MERCURY IN CRUDE OIL

    EPA Science Inventory

    Sampling and analytical procedures used to determine total mercury content in crude oils were examined. Three analytical methods were compared with respect to accuracy, precision and detection limit. The combustion method and a commercial extraction method were found adequate to...

  4. Validation of the Weight Concerns Scale Applied to Brazilian University Students.

    PubMed

    Dias, Juliana Chioda Ribeiro; da Silva, Wanderson Roberto; Maroco, João; Campos, Juliana Alvares Duarte Bonini

    2015-06-01

    The aim of this study was to evaluate the validity and reliability of the Portuguese version of the Weight Concerns Scale (WCS) when applied to Brazilian university students. The scale was completed by 1084 university students from Brazilian public education institutions. A confirmatory factor analysis was conducted. The stability of the model in independent samples was assessed through multigroup analysis, and the invariance was estimated. Convergent, concurrent, divergent, and criterion validities as well as internal consistency were estimated. Results indicated that the one-factor model presented an adequate fit to the sample and adequate convergent validity values. The concurrent validity with the Body Shape Questionnaire and divergent validity with the Maslach Burnout Inventory for Students were adequate. Internal consistency was adequate, and the factorial structure was invariant in independent subsamples. The results present a simple and short instrument capable of precisely and accurately assessing concerns with weight among Brazilian university students. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Toward a functional definition of a "rare disease" for regulatory authorities and funding agencies.

    PubMed

    Clarke, Joe T R; Coyle, Doug; Evans, Gerald; Martin, Janet; Winquist, Eric

    2014-12-01

    The designation of a disease as "rare" is associated with some substantial benefits for companies involved in new drug development, including expedited review by regulatory authorities and relaxed criteria for reimbursement. How "rare disease" is defined therefore has major financial implications, both for pharmaceutical companies and for insurers or public drug reimbursement programs. All existing definitions are based, somewhat arbitrarily, on disease incidence or prevalence. What is proposed here is a functional definition of rare based on an assessment of the feasibility of measuring the efficacy of a new treatment in conventional randomized controlled trials, to inform regulatory authorities and funding agencies charged with assessing new therapies being considered for public funding. It entails a five-step process, involving significant negotiations between patient advocacy groups, pharmaceutical companies, physicians, and public drug reimbursement programs, designed to establish the feasibility of carrying out a randomized controlled trial with sufficient statistical power to show a clinically significant treatment effect. The steps are as follows: 1) identification of a specific disease, including appropriate genetic definition; 2) identification of clinically relevant outcomes to evaluate efficacy; 3) establishment of the inherent variability of measurements of clinically relevant outcomes; 4) calculation of the sample size required to assess the efficacy of a new treatment with acceptable statistical power; and 5) estimation of the difficulty of recruiting an adequate sample size given the estimated prevalence or incidence of the disorder in the population and the inclusion criteria to be used. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  6. Epidemiology of periodontal diseases in Indian population since last decade

    PubMed Central

    Chandra, Anuja; Yadav, Om Prakash; Narula, Sugandha; Dutta, Angel

    2016-01-01

    Objective: India suffers from many disparities in oral health care, and 95% of the Indian population suffers from periodontal disease. The aim of this review is to estimate the risk factors for periodontal diseases, as well as their prevalence over the last decade, in an attempt to inform the formulation of an effective oral health care policy in India. Materials and Methods: Keywords such as "prevalence of periodontal diseases," "epidemiology," "periodontitis in India," and "oral hygiene status in India" were searched for appropriate studies to obtain a bibliographic database. The references of selected articles and relevant reviews were searched for any missed publications that included studies conducted in India estimating periodontal diseases with adequate sample size. Clinical parameters, sample size, and findings for each study from 2006 to 2015 (till September 15, 2015) were tabulated in chronological order to observe the prevalence and epidemiology of periodontal disease in India. Results: The projected burden of periodontal disease is disturbing. In addition, the majority of studies used the Community Periodontal Index of Treatment Needs (CPITN) as the epidemiological tool, which can grossly underestimate the presence of deep pockets. Conclusion: Current knowledge has shown that periodontitis does not follow a linear progression and is not age-dependent. Moreover, its distribution and severity are strongly influenced by host susceptibility and risk factors. A structured, all-inclusive survey of all districts of the states is a prerequisite for the constitution of an apt and cogent health care policy in our country. PMID:27114945

  7. A low cost virtual reality system for home based rehabilitation of the arm following stroke: a randomised controlled feasibility trial

    PubMed Central

    Standen, PJ; Threapleton, K; Richardson, A; Connell, L; Brown, DJ; Battersby, S; Platts, F; Burton, A

    2016-01-01

    Objective: To assess the feasibility of conducting a randomised controlled trial of a home-based virtual reality system for rehabilitation of the arm following stroke. Design: Two group feasibility randomised controlled trial of intervention versus usual care. Setting: Patients’ homes. Participants: Patients aged 18 or over, with residual arm dysfunction following stroke and no longer receiving any other intensive rehabilitation. Interventions: Eight weeks’ use of a low cost home-based virtual reality system employing infra-red capture to translate the position of the hand into game play or usual care. Main measures: The primary objective was to collect information on the feasibility of a trial, including recruitment, collection of outcome measures and staff support required. Patients were assessed at three time points using the Wolf Motor Function Test, Nine-Hole Peg Test, Motor Activity Log and Nottingham Extended Activities of Daily Living. Results: Over 15 months only 47 people were referred to the team. Twenty seven were randomised and 18 (67%) of those completed final outcome measures. Sample size calculation based on data from the Wolf Motor Function Test indicated a requirement for 38 per group. There was a significantly greater change from baseline in the intervention group on midpoint Wolf Grip strength and two subscales of the final Motor Activity Log. Training in the use of the equipment took a median of 230 minutes per patient. Conclusions: To achieve the required sample size, a definitive home-based trial would require additional strategies to boost recruitment rates and adequate resources for patient support. PMID:27029939

  8. A low cost virtual reality system for home based rehabilitation of the arm following stroke: a randomised controlled feasibility trial.

    PubMed

    Standen, P J; Threapleton, K; Richardson, A; Connell, L; Brown, D J; Battersby, S; Platts, F; Burton, A

    2017-03-01

    To assess the feasibility of conducting a randomised controlled trial of a home-based virtual reality system for rehabilitation of the arm following stroke. Two group feasibility randomised controlled trial of intervention versus usual care. Patients' homes. Patients aged 18 or over, with residual arm dysfunction following stroke and no longer receiving any other intensive rehabilitation. Eight weeks' use of a low cost home-based virtual reality system employing infra-red capture to translate the position of the hand into game play or usual care. The primary objective was to collect information on the feasibility of a trial, including recruitment, collection of outcome measures and staff support required. Patients were assessed at three time points using the Wolf Motor Function Test, Nine-Hole Peg Test, Motor Activity Log and Nottingham Extended Activities of Daily Living. Over 15 months only 47 people were referred to the team. Twenty seven were randomised and 18 (67%) of those completed final outcome measures. Sample size calculation based on data from the Wolf Motor Function Test indicated a requirement for 38 per group. There was a significantly greater change from baseline in the intervention group on midpoint Wolf Grip strength and two subscales of the final Motor Activity Log. Training in the use of the equipment took a median of 230 minutes per patient. To achieve the required sample size, a definitive home-based trial would require additional strategies to boost recruitment rates and adequate resources for patient support.

  9. Evaluating propagation method performance over time with Bayesian updating: An application to incubator testing

    USGS Publications Warehouse

    Converse, Sarah J.; Chandler, J. N.; Olsen, Glenn H.; Shafer, C. C.; Hartup, Barry K.; Urbanek, Richard P.

    2010-01-01

    In captive-rearing programs, small sample sizes can limit the quality of information on performance of propagation methods. Bayesian updating can be used to increase information on method performance over time. We demonstrate an application to incubator testing at USGS Patuxent Wildlife Research Center. A new type of incubator was purchased for use in the whooping crane (Grus americana) propagation program, which produces birds for release. We tested the new incubator for reliability, using sandhill crane (Grus canadensis) eggs as surrogates. We determined that the new incubator should result in hatching rates no more than 5% lower than the available incubators, with 95% confidence, before it would be used to incubate whooping crane eggs. In 2007, 5 healthy chicks hatched from 12 eggs in the new incubator, and 2 hatched from 5 in an available incubator, for a median posterior difference of <1%, but with a large 95% credible interval (-41%, 43%). In 2008, we implemented a double-blind evaluation method, where a veterinarian determined whether eggs produced chicks that, at hatching, had no apparent health problems that would impede future release. We used the 2007 estimates as priors in the 2008 analysis. In 2008, 7 normal chicks hatched from 15 eggs in the new incubator, and 11 hatched from 15 in an available incubator, for a median posterior difference of 19%, with 95% credible interval (-8%, 44%). The increased sample size has increased our understanding of incubator performance. While additional data will be collected, at this time the new incubator does not appear adequate for use with whooping crane eggs.
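
    As a worked illustration of the updating scheme described in this record, here is a minimal beta-binomial sketch in Python. It assumes flat Beta(1, 1) priors in 2007 and treats each egg as an independent Bernoulli trial; the counts come from the abstract, and under these assumptions the posterior median difference lands near the reported 19%.

```python
# Beta-binomial Bayesian updating for the two incubators (a sketch, assuming
# flat Beta(1, 1) priors in 2007 and independent eggs).
import numpy as np

rng = np.random.default_rng(0)

def posterior(successes, failures, prior=(1.0, 1.0)):
    """Return Beta posterior parameters after observing binomial data."""
    return prior[0] + successes, prior[1] + failures

new_2007 = posterior(5, 7)   # 2007: 5 of 12 hatched in the new incubator
old_2007 = posterior(2, 3)   # 2007: 2 of 5 hatched in the available incubator

# 2008: the 2007 posteriors become the priors, as in the study.
new_2008 = posterior(7, 8, prior=new_2007)    # 7 of 15 normal chicks
old_2008 = posterior(11, 4, prior=old_2007)   # 11 of 15 normal chicks

# Monte Carlo posterior for the difference in hatching rates (old - new).
draws = 100_000
diff = rng.beta(*old_2008, size=draws) - rng.beta(*new_2008, size=draws)
print(f"median difference: {np.median(diff):.1%}")
print(f"95% credible interval: ({np.percentile(diff, 2.5):.0%}, "
      f"{np.percentile(diff, 97.5):.0%})")
# The study's acceptance criterion: new incubator no more than 5% worse,
# with 95% confidence.
print(f"P(new within 5% of old): {np.mean(diff <= 0.05):.2f}")
```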

  10. The endothelial sample size analysis in corneal specular microscopy clinical examinations.

    PubMed

    Abib, Fernando C; Holzchuh, Ricardo; Schaefer, Artur; Schaefer, Tania; Godois, Ronialci

    2012-05-01

    To evaluate endothelial cell sample size and statistical error in corneal specular microscopy (CSM) examinations. One hundred twenty examinations were conducted with 4 types of corneal specular microscopes: 30 each with the Bio-Optics, CSO, Konan, and Topcon instruments. All endothelial image data were analyzed by the respective instrument software and also by the Cells Analyzer software, with a method developed in our lab. A reliability degree (RD) of 95% and a relative error (RE) of 0.05 were used as cut-off values in analyzing the images of counted endothelial cells, here called samples. The sample size mean was the number of cells evaluated on the images obtained with each device. Only examinations with RE < 0.05 were considered statistically correct and suitable for comparisons with future examinations. The Cells Analyzer software was used to calculate the RE and a customized sample size for all examinations. Bio-Optics: sample size, 97 ± 22 cells; RE, 6.52 ± 0.86; only 10% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 162 ± 34 cells. CSO: sample size, 110 ± 20 cells; RE, 5.98 ± 0.98; only 16.6% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 157 ± 45 cells. Konan: sample size, 80 ± 27 cells; RE, 10.6 ± 3.67; none of the examinations had a sufficient endothelial cell quantity (RE > 0.05); customized sample size, 336 ± 131 cells. Topcon: sample size, 87 ± 17 cells; RE, 10.1 ± 2.52; none of the examinations had a sufficient endothelial cell quantity (RE > 0.05); customized sample size, 382 ± 159 cells. A very high number of CSM examinations had sampling errors according to the Cells Analyzer software. The endothelial sample size (per examination) needs to include more cells to be reliable and reproducible. The Cells Analyzer tutorial routine will be useful for CSM examination reliability and reproducibility.
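
    The Cells Analyzer algorithm is not described in this abstract, but the relationship between cell count and relative error can be illustrated with a standard large-sample approximation: treating measured cells as independent, RE ≈ z · CV / √n, so hitting a target RE of 0.05 at 95% reliability needs about n = (z · CV / 0.05)² cells. The CV value below is a hypothetical assumption, not a figure from the study.

```python
# Hypothetical sketch of a relative-error-driven "customized sample size",
# assuming i.i.d. cell measurements with a known coefficient of variation.
from math import ceil, sqrt

def required_cells(cv, re_target=0.05, z=1.96):
    """Cells needed so that z * cv / sqrt(n) <= re_target."""
    return ceil((z * cv / re_target) ** 2)

def relative_error(cv, n, z=1.96):
    """Approximate relative error of the mean from n cells."""
    return z * cv / sqrt(n)

# Example: a between-cell CV of 0.30 needs ~139 cells for RE <= 0.05,
# while a typical 97-cell exam yields RE of roughly 0.06.
print(required_cells(cv=0.30))             # -> 139
print(round(relative_error(0.30, 97), 3))  # -> 0.06
```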

  11. Accounting for twin births in sample size calculations for randomised trials.

    PubMed

    Yelland, Lisa N; Sullivan, Thomas R; Collins, Carmel T; Price, David J; McPhee, Andrew J; Lee, Katherine J

    2018-05-04

    Including twins in randomised trials leads to non-independence or clustering in the data. Clustering has important implications for sample size calculations, yet few trials take this into account. Estimates of the intracluster correlation coefficient (ICC), or the correlation between outcomes of twins, are needed to assist with sample size planning. Our aims were to provide ICC estimates for infant outcomes, describe the information that must be specified in order to account for clustering due to twins in sample size calculations, and develop a simple tool for performing sample size calculations for trials including twins. ICCs were estimated for infant outcomes collected in four randomised trials that included twins. The information required to account for clustering due to twins in sample size calculations is described. A tool that calculates the sample size based on this information was developed in Microsoft Excel and in R as a Shiny web app. ICC estimates ranged between -0.12, indicating a weak negative relationship, and 0.98, indicating a strong positive relationship between outcomes of twins. Example calculations illustrate how the ICC estimates and sample size calculator can be used to determine the target sample size for trials including twins. Clustering among outcomes measured on twins should be taken into account in sample size calculations to obtain the desired power. Our ICC estimates and sample size calculator will be useful for designing future trials that include twins. Publication of additional ICCs is needed to further assist with sample size planning for future trials. © 2018 John Wiley & Sons Ltd.
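
    A minimal sketch of the kind of adjustment the authors' calculator performs, assuming the usual design-effect formula for clusters of size two: the independent-observations sample size is inflated by 1 + ICC when every infant is a twin, or approximately 1 + p·ICC when only a proportion p of infants belong to a twin pair (a negative ICC deflates it). The authors' Excel/Shiny tool may handle refinements not shown here.

```python
# Design-effect adjustment for twin clustering (a sketch under the stated
# assumptions; clusters have at most 2 members).
from math import ceil

def adjusted_n(n_independent, icc, prop_twins=1.0):
    """Inflate (or deflate, for negative ICC) a sample size for twins."""
    deff = 1.0 + prop_twins * icc
    return ceil(n_independent * deff)

# 400 infants needed under independence, ICC = 0.5, 20% of infants are twins:
print(adjusted_n(400, icc=0.5, prop_twins=0.2))  # -> 440
```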

  12. Sample size determination for mediation analysis of longitudinal data.

    PubMed

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required in grant applications and peer-reviewed publications but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample sizes required to achieve 80% power, obtained by simulation under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation): a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been published for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method outperform Sobel's method, but the product method is recommended for practice because of its lower computational load compared with bootstrapping. An R package has been developed implementing the product method for sample size determination in longitudinal mediation study design.
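
    The following is a simplified, single-level illustration of the simulation approach the authors describe; their models are multilevel and longitudinal, so the structure and parameter values below are illustrative assumptions only. It simulates X → M → Y data, applies the Sobel test, and reports empirical power.

```python
# Empirical power of the Sobel test by simulation (single-level sketch).
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)

def slope_se(x, y):
    """OLS slope and its standard error for simple regression with intercept."""
    xc = x - x.mean()
    beta = (xc @ y) / (xc @ xc)
    resid = y - y.mean() - beta * xc
    se = np.sqrt((resid @ resid) / (len(x) - 2) / (xc @ xc))
    return beta, se

def sobel_power(n, a=0.3, b=0.3, reps=2000, alpha=0.05):
    crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)   # mediator model
        y = b * m + rng.normal(size=n)   # outcome model (x omitted for brevity)
        a_hat, sa = slope_se(x, m)
        b_hat, sb = slope_se(m, y)
        se_ab = np.sqrt(b_hat**2 * sa**2 + a_hat**2 * sb**2)  # Sobel SE
        hits += abs(a_hat * b_hat) / se_ab > crit
    return hits / reps

for n in (50, 100, 200):
    print(n, sobel_power(n))
```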

  13. Reduction of Specimen Size for the Full Simultaneous Characterization of Thermoelectric Performance

    NASA Astrophysics Data System (ADS)

    Vasilevskiy, D.; Simard, J.-M.; Masut, R. A.; Turenne, S.

    2017-05-01

    The successful implementation of thermoelectric (TE) materials for waste heat recovery depends strongly on our ability to increase their performance. This challenge continues to generate renewed interest in novel high-performance TE compounds. The technological difficulties of producing homogeneous ingots of new compounds or alloys, with regular shape and a size large enough to prepare the several samples usually needed for separate measurements of all TE parameters, are well known. This creates a situation in which material performance can be critically over- or under-evaluated in the first stages of research on a new material; both cases lead equally to negative consequences. Thus, minimizing the specimen size while keeping it adequate for accurate material characterization becomes extremely important. In this work we report the experimental validation of reliable simultaneous measurements of the four most relevant TE parameters on a single bismuth telluride alloy based specimen of 4 mm × 4 mm × 1.4 mm. This translates into roughly 140 mg for one of the heaviest TE materials, as used in this study, and <100 mg for most others. Our validation is based on comparative measurements performed with a Harman apparatus (ZT-Scanner) on a series of differently sized specimens of hot extruded bismuth telluride based alloys. The Seebeck coefficient, electrical resistivity, thermal conductivity and figure of merit were simultaneously assessed from 300 K to 440 K with increments of 20 K, 15 K, 10 K, 5 K, and 1 K. A well-known homogeneous material was chosen to increase measurement reliability and accuracy, but the results are expected to be valid for the full TE characterization of any unknown material. These results show a way to significantly decrease specimen sizes, which has the potential to accelerate the investigation of novel TE materials for large-scale waste heat recovery.

  14. Exposure of miners to diesel exhaust particulates in underground nonmetal mines.

    PubMed

    Cohen, H J; Borak, J; Hall, T; Sirianni, G; Chemerynski, S

    2002-01-01

    A study was initiated to examine worker exposures in seven underground nonmetal mines and to examine the precision of the National Institute for Occupational Safety and Health (NIOSH) 5040 sampling and analytical method for diesel exhaust, which has recently been adopted for compliance monitoring by the Mine Safety and Health Administration (MSHA). Approximately 1000 air samples using cyclones were taken on workers and in areas throughout the mines. Results indicated that worker exposures were consistently above the MSHA final limit of 160 micrograms/m3 (time-weighted average; TWA) for total carbon as determined by the NIOSH 5040 method, and greater than the proposed American Conference of Governmental Industrial Hygienists TLV of 20 micrograms/m3 (TWA) for elemental carbon. A number of difficulties were documented when sampling for diesel exhaust using organic carbon: high and variable blank values from filters, high variability (+/- 20%) between duplicate punches from the same sampling filter, a consistent positive interference (+26%) when open-faced monitors were sampled side-by-side with cyclones, poor correlation (r² = 0.38) with elemental carbon levels, and an interference from limestone that could not be adequately corrected by acid-washing of filters. The sampling and analytical precision (relative standard deviation) was approximately 11% for elemental carbon, 17% for organic carbon, and 11% for total carbon. A hypothesis is presented, and supported with data, that gaseous organic carbon constituents of diesel exhaust adsorb not only onto the submicron elemental carbon particles found in diesel exhaust, but also onto mining ore dusts. Such mining dusts are mostly nonrespirable and should not be considered equivalent to submicron diesel particulates in their potential for adverse pulmonary effects. It is recommended that size-selective sampling be employed, rather than open-faced monitoring, when using the NIOSH 5040 method.

  15. Public Opinion Polls, Chicken Soup and Sample Size

    ERIC Educational Resources Information Center

    Nguyen, Phung

    2005-01-01

    Cooking and tasting chicken soup in three pots of very different sizes serves to demonstrate that it is the absolute sample size that matters most in determining the accuracy of a poll's findings, not the relative sample size, i.e., the size of the sample in relation to its population.
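
    The statistical point is easy to verify: the 95% margin of error for a proportion is about 1.96 · √(p(1−p)/n), which involves the absolute sample size n but not the population size; the finite-population correction only matters when the sample is a sizeable fraction of the population. A short sketch:

```python
# Margin of error for a proportion, with an optional finite-population
# correction, showing that population size barely matters for typical polls.
from math import sqrt

def margin_of_error(p, n, population=None, z=1.96):
    moe = z * sqrt(p * (1 - p) / n)
    if population is not None:               # finite-population correction
        moe *= sqrt((population - n) / (population - 1))
    return moe

# The same n = 1000 gives (almost) the same accuracy for a town or a nation:
print(round(margin_of_error(0.5, 1000, population=50_000), 4))
print(round(margin_of_error(0.5, 1000, population=300_000_000), 4))
```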

  16. Sample size in studies on diagnostic accuracy in ophthalmology: a literature survey.

    PubMed

    Bochmann, Frank; Johnson, Zoe; Azuara-Blanco, Augusto

    2007-07-01

    To assess the sample sizes used in studies on diagnostic accuracy in ophthalmology. Design and sources: a survey of literature published in 2005. The frequency of reporting sample size calculations and the sizes of the samples were extracted from the published literature. A manual search of the five leading clinical journals in ophthalmology with the highest impact factors (Investigative Ophthalmology and Visual Science, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology and British Journal of Ophthalmology) was conducted by two independent investigators. A total of 1698 articles were identified, of which 40 were studies on diagnostic accuracy. One study reported that the sample size was calculated before initiating the study. Another study reported consideration of sample size without calculation. The mean (SD) sample size of all diagnostic studies was 172.6 (218.9). The median prevalence of the target condition was 50.5%. Only a few studies considered sample size in their methods. Inadequate sample sizes in diagnostic accuracy studies may result in misleading estimates of test accuracy. An improvement over the current standards on the design and reporting of diagnostic studies is warranted.
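
    For reference, here is a minimal sketch of the kind of calculation the surveyed studies rarely reported, in the style of Buderer's method: choose the number of diseased cases so that the 95% confidence interval around the expected sensitivity has half-width d, then inflate by the prevalence to get the total recruitment target. The example values are illustrative assumptions.

```python
# Buderer-style sample size for a diagnostic accuracy study (sensitivity arm).
from math import ceil

def diagnostic_n(sensitivity, prevalence, d=0.05, z=1.96):
    """Total subjects so the sensitivity CI has half-width d."""
    cases = z ** 2 * sensitivity * (1 - sensitivity) / d ** 2
    return ceil(cases / prevalence)

# Expecting 90% sensitivity, 50% prevalence, +/- 5% precision:
print(diagnostic_n(0.90, 0.50))  # -> 277 subjects
```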

  17. Effects of newborn characteristics and length of colostrum feeding period on passive immune transfer in goat kids.

    PubMed

    Castro, N; Capote, J; Morales-Delanuez, A; Rodríguez, C; Argüello, A

    2009-04-01

    Majorera goat kids (n = 200) were used to evaluate the effects of litter size, birth body weight, sex, and suckling duration on serum IgG concentrations. Kids were assigned to 1 of 3 experimental groups: litter size and sex were equally distributed in each group. In the first group, kids (n = 67) stayed with their dams for 24 h; in the second group, kids (n = 66) stayed with their dams for 48 h; and in the third group, kids (n = 67) stayed with their dams for 120 h. Blood samples were obtained every 24 h for 5 d, and serum IgG concentration was measured using radial immunodiffusion. In litter sizes of 1 to 2 kids, IgG blood serum concentration was significantly higher (18.30 +/- 5.40 mg/mL) than in litters of 3 kids (9.85 +/- 4.23 mg/mL). Kid sex did not affect IgG blood serum concentrations. Suckling duration did not affect kid serum IgG concentrations. In conclusion, kids with low birth body weight (<2.8 kg) or from litters of 3 may need special attention. If newborn goat kids are allowed to suckle colostrum for at least 24 h from their dams, this seems to be sufficient time to ingest enough IgG from colostrum to achieve an adequate serum IgG concentration and passive immune protection to avoid failure of passive immune transfer.

  18. 14 CFR 27.733 - Tires.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS: NORMAL CATEGORY ROTORCRAFT Design and Construction Landing Gear § 27.733 Tires. (a) Each landing... gravity. (c) Each tire installed on a retractable landing gear system must, at the maximum size of the tire type expected in service, have a clearance to surrounding structure and systems that is adequate...

  19. 40 CFR 221.1 - Applications for permits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Applications for general, special, emergency, and research permits under section 102 of the Act may be filed... equipment; (c) Adequate physical and chemical description of material to be dumped, including results of tests necessary to apply the Criteria, and the number, size, and physical configuration of any...

  20. 40 CFR 221.1 - Applications for permits.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Applications for general, special, emergency, and research permits under section 102 of the Act may be filed... equipment; (c) Adequate physical and chemical description of material to be dumped, including results of tests necessary to apply the Criteria, and the number, size, and physical configuration of any...

  1. 40 CFR 221.1 - Applications for permits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for general, special, emergency, and research permits under section 102 of the Act may be filed with... equipment; (c) Adequate physical and chemical description of material to be dumped, including results of tests necessary to apply the Criteria, and the number, size, and physical configuration of any...

  2. 40 CFR 221.1 - Applications for permits.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Applications for general, special, emergency, and research permits under section 102 of the Act may be filed... equipment; (c) Adequate physical and chemical description of material to be dumped, including results of tests necessary to apply the Criteria, and the number, size, and physical configuration of any...

  3. 40 CFR 221.1 - Applications for permits.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Applications for general, special, emergency, and research permits under section 102 of the Act may be filed... equipment; (c) Adequate physical and chemical description of material to be dumped, including results of tests necessary to apply the Criteria, and the number, size, and physical configuration of any...

  4. 7 CFR 987.145 - Withholding obligation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... obligation for any variety of dates for which free and restricted percentages have been established by having an adequate quantity of that variety inspected and certified as meeting the applicable grade, size... to reflect any increase in weight. (d) Dates for deferment of withholding. Any handler may defer his...

  5. Anatomy of an Evidence Base

    ERIC Educational Resources Information Center

    Malouf, David B.; Taymans, Juliana M.

    2016-01-01

    An analysis was conducted of the What Works Clearinghouse (WWC) research evidence base on the effectiveness of replicable education interventions. Most interventions were found to have little or no support from technically adequate research studies, and intervention effect sizes were of questionable magnitude to meet education policy goals. These…

  6. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) An existing permanent fire resistant structure of adequate size and strength will shield the proposed..., or in between the potential hazard and the proposed project. (d) The structure and outdoor areas used... potential hazard (e.g., the project is of masonry and steel or reinforced concrete and steel construction). ...

  7. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) An existing permanent fire resistant structure of adequate size and strength will shield the proposed..., or in between the potential hazard and the proposed project. (d) The structure and outdoor areas used... potential hazard (e.g., the project is of masonry and steel or reinforced concrete and steel construction). ...

  8. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) An existing permanent fire resistant structure of adequate size and strength will shield the proposed..., or in between the potential hazard and the proposed project. (d) The structure and outdoor areas used... potential hazard (e.g., the project is of masonry and steel or reinforced concrete and steel construction). ...

  9. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) An existing permanent fire resistant structure of adequate size and strength will shield the proposed..., or in between the potential hazard and the proposed project. (d) The structure and outdoor areas used... potential hazard (e.g., the project is of masonry and steel or reinforced concrete and steel construction). ...

  10. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) An existing permanent fire resistant structure of adequate size and strength will shield the proposed..., or in between the potential hazard and the proposed project. (d) The structure and outdoor areas used... potential hazard (e.g., the project is of masonry and steel or reinforced concrete and steel construction). ...

  11. Novel Diffusion-Weighted MRI for High-Grade Prostate Cancer Detection

    DTIC Science & Technology

    2016-10-01

    in image resolution and scale. This process is critical for evaluating new imaging modalities. Our initial findings illustrate the potential of the... Eligible for analysis as determined by adequate pathologic processing and MR images deemed to be of adequate quality by the study team. The... Histology samples have been requested from the UIC biorepository for digitization. All MR images have been collected and prepared for image processing.

  12. An empirical determination of the minimum number of measurements needed to estimate the mean random vitrinite reflectance of disseminated organic matter

    USGS Publications Warehouse

    Barker, C.E.; Pawlewicz, M.J.

    1993-01-01

    In coal samples, published recommendations based on statistical methods suggest 100 measurements are needed to estimate the mean random vitrinite reflectance (Rv-r) to within ±2%. Our survey of published thermal maturation studies indicates that those using dispersed organic matter (DOM) mostly have an objective of acquiring 50 reflectance measurements. This smaller objective size for DOM versus coal samples poses a statistical contradiction, because the standard deviations of DOM reflectance distributions are typically larger, indicating that a greater sample size is needed to accurately estimate Rv-r in DOM. However, in studies of thermal maturation using DOM, even 50 measurements can be an unrealistic requirement given the small amount of vitrinite often found in such samples. Furthermore, there is generally a reduced need for assuring precision like that needed for coal applications. Therefore, a key question in thermal maturation studies using DOM is how many measurements of Rv-r are needed to adequately estimate the mean. Our empirical approach to this problem is to compute the reflectance distribution statistics (mean, standard deviation, skewness, and kurtosis) in increments of 10 measurements. This study compares these intermediate computations of Rv-r statistics with a final one computed using all measurements for that sample. Vitrinite reflectance was measured on mudstone and sandstone samples taken from borehole M-25 in the Cerro Prieto, Mexico geothermal system, which was selected because the rocks have a wide range of thermal maturation and comparable humic DOM with depth. The results of this study suggest that after only 20-30 measurements the mean Rv-r is generally known to within 5%, and always to within 12%, of the mean Rv-r calculated using all of the measured particles. Thus, even in the worst case, the precision after measuring only 20-30 particles is in good agreement with the general precision of one decimal place recommended for mean Rv-r measurements on DOM. The coefficient of variation (V = standard deviation/mean) is proposed as a statistic to indicate the reliability of mean Rv-r estimates made at n ≥ 20. This preliminary study suggests that a V below about 0.2 indicates a reliable mean, whereas a larger V suggests an unreliable mean in such small samples. © 1993.
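
    A minimal sketch of the incremental procedure described above: recompute the running mean and coefficient of variation every 10 measurements and flag the estimate as reliable once V falls below the proposed cut-off. The synthetic reflectance data and the exact stopping logic are illustrative assumptions, not the authors' data.

```python
# Incremental mean Rv-r with a CV-based reliability check (a sketch).
import numpy as np

def incremental_rv_r(measurements, step=10, v_cutoff=0.2):
    measurements = np.asarray(measurements, dtype=float)
    for n in range(step, len(measurements) + 1, step):
        batch = measurements[:n]
        mean, sd = batch.mean(), batch.std(ddof=1)
        v = sd / mean                  # coefficient of variation
        print(f"n={n:3d}  mean Rv-r={mean:.3f}  V={v:.2f}")
        if n >= 20 and v < v_cutoff:
            return mean                # mean judged reliable
    return measurements.mean()

rng = np.random.default_rng(2)
incremental_rv_r(rng.normal(0.8, 0.12, size=50))  # synthetic reflectances
```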

  13. Dispersion and sampling of adult Dermacentor andersoni in rangeland in Western North America.

    PubMed

    Rochon, K; Scoles, G A; Lysyk, T J

    2012-03-01

    A fixed precision sampling plan was developed for off-host populations of adult Rocky Mountain wood tick, Dermacentor andersoni (Stiles) based on data collected by dragging at 13 locations in Alberta, Canada; Washington; and Oregon. In total, 222 site-date combinations were sampled. Each site-date combination was considered a sample, and each sample ranged in size from 86 to 250 10 m2 quadrats. Analysis of simulated quadrats ranging in size from 10 to 50 m2 indicated that the most precise sample unit was the 10 m2 quadrat. Samples taken when abundance < 0.04 ticks per 10 m2 were more likely to not depart significantly from statistical randomness than samples taken when abundance was greater. Data were grouped into ten abundance classes and assessed for fit to the Poisson and negative binomial distributions. The Poisson distribution fit only data in abundance classes < 0.02 ticks per 10 m2, while the negative binomial distribution fit data from all abundance classes. A negative binomial distribution with common k = 0.3742 fit data in eight of the 10 abundance classes. Both the Taylor and Iwao mean-variance relationships were fit and used to predict sample sizes for a fixed level of precision. Sample sizes predicted using the Taylor model tended to underestimate actual sample sizes, while sample sizes estimated using the Iwao model tended to overestimate actual sample sizes. Using a negative binomial with common k provided estimates of required sample sizes closest to empirically calculated sample sizes.
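
    As an illustration of the fixed-precision machinery used here, the sketch below computes the required number of quadrats from Taylor's power law, s² = a·m^b, for a target precision D (standard error divided by the mean): n = a·m^(b−2)/D². The coefficients are placeholders, not the values fitted in this study.

```python
# Fixed-precision sample size from Taylor's power law (a sketch with
# placeholder coefficients).
def taylor_sample_size(mean_density, a, b, precision=0.25):
    """Quadrats needed so SE/mean <= precision, given s^2 = a * m^b."""
    return a * mean_density ** (b - 2.0) / precision ** 2

# e.g. a = 2.0, b = 1.4, mean of 0.05 ticks per 10-m2 quadrat:
n = taylor_sample_size(0.05, a=2.0, b=1.4, precision=0.25)
print(round(n))  # number of 10-m2 quadrats to drag
```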

  14. ELECTROFISHING EFFORT REQUIREMENTS FOR ASSESSING SPECIES RICHNESS AND BIOTIC INTEGRITY IN WESTERN OREGON STREAMS

    EPA Science Inventory

    We empirically examined the sampling effort required to adequately represent species richness and proportionate abundance when backpack electrofishing western Oregon streams. When sampling, we separately recorded data for each habitat unit. In data analyses, we repositioned each...

  15. Simple, Defensible Sample Sizes Based on Cost Efficiency

    PubMed Central

    Bacchetti, Peter; McCulloch, Charles E.; Segal, Mark R.

    2009-01-01

    Summary The conventional approach of choosing sample size to provide 80% or greater power ignores the cost implications of different sample size choices. Costs, however, are often impossible for investigators and funders to ignore in actual practice. Here, we propose and justify a new approach for choosing sample size based on cost efficiency, the ratio of a study’s projected scientific and/or practical value to its total cost. By showing that a study’s projected value exhibits diminishing marginal returns as a function of increasing sample size for a wide variety of definitions of study value, we are able to develop two simple choices that can be defended as more cost efficient than any larger sample size. The first is to choose the sample size that minimizes the average cost per subject. The second is to choose sample size to minimize total cost divided by the square root of sample size. This latter method is theoretically more justifiable for innovative studies, but also performs reasonably well and has some justification in other cases. For example, if projected study value is assumed to be proportional to power at a specific alternative and total cost is a linear function of sample size, then this approach is guaranteed either to produce more than 90% power or to be more cost efficient than any sample size that does. These methods are easy to implement, based on reliable inputs, and well justified, so they should be regarded as acceptable alternatives to current conventional approaches. PMID:18482055
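
    A minimal sketch of the two proposed rules, assuming an illustrative total cost function with a fixed start-up cost, a per-subject cost, and a mild diseconomy-of-scale term (the rules apply to whatever cost function a study actually faces): rule 1 picks the n minimizing average cost per subject, c(n)/n; rule 2 picks the n minimizing c(n)/√n.

```python
# Cost-efficiency-based sample size choices (a sketch; the cost function
# and its parameters are illustrative assumptions).
from math import sqrt

def cost(n, fixed=100_000, per_subject=500, congestion=0.05):
    # congestion term: per-subject costs rise beyond site capacity
    return fixed + per_subject * n + congestion * n ** 2

def best_n(objective, n_max=5000):
    return min(range(1, n_max + 1), key=objective)

n1 = best_n(lambda n: cost(n) / n)        # rule 1: min average cost/subject
n2 = best_n(lambda n: cost(n) / sqrt(n))  # rule 2: min cost per sqrt(n)
print(n1, n2)
```

    For a purely linear cost function, rule 2 reduces to the n at which the variable cost equals the fixed cost, which is one reason it is the more defensible choice for innovative studies.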

  16. RnaSeqSampleSize: real data based sample size estimation for RNA sequencing.

    PubMed

    Zhao, Shilin; Li, Chung-I; Guo, Yan; Sheng, Quanhu; Shyr, Yu

    2018-05-30

    One of the most important and often neglected components of a successful RNA sequencing (RNA-Seq) experiment is sample size estimation. A few negative binomial model-based methods have been developed to estimate sample size based on the parameters of a single gene. However, thousands of genes are quantified and tested for differential expression simultaneously in RNA-Seq experiments. Thus, additional issues should be carefully addressed, including the false discovery rate for multiple statistical tests and the widely distributed read counts and dispersions of different genes. To solve these issues, we developed a sample size and power estimation method named RnaSeqSampleSize, based on the distributions of gene average read counts and dispersions estimated from real RNA-Seq data. Datasets from previous, similar experiments, such as The Cancer Genome Atlas (TCGA), can be used as a point of reference. Read counts and their dispersions are estimated from the reference's distribution; using that information, the power and sample size are estimated and summarized. RnaSeqSampleSize is implemented in the R language and can be installed from the Bioconductor website. A user-friendly web graphic interface is provided at http://cqs.mc.vanderbilt.edu/shiny/RnaSeqSampleSize/ . RnaSeqSampleSize provides a convenient and powerful way to estimate power and sample size for an RNA-Seq experiment. It is also equipped with several unique features, including estimation for genes or pathways of interest, power curve visualization, and parameter optimization.
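
    RnaSeqSampleSize itself works from empirical distributions of counts and dispersions and controls the false discovery rate; for orientation only, here is a rough single-gene calculation using a commonly cited negative-binomial approximation, per-group n ≈ 2(z₁₋α/₂ + z₁₋β)²(1/μ + d)/ln²(Δ), with mean count μ, dispersion d, and fold change Δ. This simplified sketch ignores both of the refinements the package provides.

```python
# Per-gene, per-group sample size under a negative-binomial approximation
# (a sketch; not the package's algorithm).
from math import log, ceil
from statistics import NormalDist

def nb_sample_size(mu, dispersion, fold_change=2.0, alpha=0.05, power=0.8):
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    return ceil(2 * (za + zb) ** 2 * (1 / mu + dispersion)
                / log(fold_change) ** 2)

# Average read count 10, dispersion 0.1, detect a 2-fold change:
print(nb_sample_size(mu=10, dispersion=0.1, fold_change=2.0))  # per group
```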

  17. The special case of the 2 × 2 table: asymptotic unconditional McNemar test can be used to estimate sample size even for analysis based on GEE.

    PubMed

    Borkhoff, Cornelia M; Johnston, Patrick R; Stephens, Derek; Atenafu, Eshetu

    2015-07-01

    Aligning the method used to estimate sample size with the planned analytic method ensures the sample size needed to achieve the planned power. When using generalized estimating equations (GEE) to analyze a paired binary primary outcome with no covariates, many use an exact McNemar test to calculate sample size. We reviewed the approaches to sample size estimation for paired binary data and compared the sample size estimates on the same numerical examples. We used the hypothesized sample proportions for the 2 × 2 table to calculate the correlation between the marginal proportions and estimate sample size based on GEE. We solved for the inside (cell) proportions based on the correlation and the marginal proportions to estimate sample size based on the exact McNemar, asymptotic unconditional McNemar, and asymptotic conditional McNemar tests. The asymptotic unconditional McNemar test is a good approximation of the GEE method of Pan. The exact McNemar test is too conservative and yields unnecessarily larger sample size estimates than all other methods. In the special case of a 2 × 2 table, even when a GEE approach to binary logistic regression is the planned analytic method, the asymptotic unconditional McNemar test can be used to estimate sample size. We do not recommend using an exact McNemar test. Copyright © 2015 Elsevier Inc. All rights reserved.
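
    A minimal sketch of the asymptotic unconditional McNemar sample size favored here, written in terms of the two discordant cell probabilities of the 2 × 2 table (p10 and p01); the example probabilities are illustrative assumptions.

```python
# Pairs required by the asymptotic unconditional McNemar test (a sketch).
from math import sqrt, ceil
from statistics import NormalDist

def mcnemar_n(p10, p01, alpha=0.05, power=0.8):
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    psi, delta = p10 + p01, p10 - p01          # discordance and its imbalance
    n = (za * sqrt(psi) + zb * sqrt(psi - delta ** 2)) ** 2 / delta ** 2
    return ceil(n)

print(mcnemar_n(0.25, 0.10))  # pairs required for these discordant rates
```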

  18. Validity and reliability of Patient-Reported Outcomes Measurement Information System (PROMIS) Instruments in Osteoarthritis

    PubMed Central

    Broderick, Joan E.; Schneider, Stefan; Junghaenel, Doerte U.; Schwartz, Joseph E.; Stone, Arthur A.

    2013-01-01

    Objective: Evaluation of known-group validity, ecological validity, and test-retest reliability of four domain instruments from the Patient-Reported Outcomes Measurement Information System (PROMIS) in osteoarthritis (OA) patients. Methods: Recruitment of an osteoarthritis sample and a comparison general population (GP) through an Internet survey panel. Pain intensity, pain interference, physical functioning, and fatigue were assessed for 4 consecutive weeks with PROMIS short forms on a daily basis and compared with same-domain Computer Adaptive Test (CAT) instruments that use a 7-day recall. Known-group validity (comparison of OA and GP), ecological validity (comparison of aggregated daily measures with CATs), and test-retest reliability were evaluated. Results: The recruited samples matched (age, sex, race, ethnicity) the demographic characteristics of the U.S. sample for arthritis and the 2009 Census for the GP. Compliance with repeated measurements was excellent: > 95%. Known-group validity for CATs was demonstrated with large effect sizes (pain intensity: 1.42, pain interference: 1.25, and fatigue: .85). Ecological validity was also established through high correlations between aggregated daily measures and weekly CATs (≥ .86). Test-retest reliability (7-day) was very good (≥ .80). Conclusion: PROMIS CAT instruments demonstrated known-group and ecological validity in a comparison of osteoarthritis patients with a general population sample. Adequate test-retest reliability was also observed. These data provide encouraging initial evidence on the utility of these PROMIS instruments for clinical and research outcomes in osteoarthritis patients. PMID:23592494

  19. CT-guided transthoracic core needle biopsy for small pulmonary lesions: diagnostic performance and adequacy for molecular testing.

    PubMed

    Tian, Panwen; Wang, Ye; Li, Lei; Zhou, Yongzhao; Luo, Wenxin; Li, Weimin

    2017-02-01

    Computed tomography (CT)-guided transthoracic needle biopsy is a well-established, minimally invasive diagnostic tool for pulmonary lesions. Few large studies have been conducted on the diagnostic performance and adequacy for molecular testing of transthoracic core needle biopsy (TCNB) for small pulmonary lesions. This study included CT-guided TCNB with 18-gauge cutting needles in 560 consecutive patients with small (≤3 cm) pulmonary lesions from January 2012 to January 2015. There were 323 males and 237 females, aged 51.8±12.7 years. The size of the pulmonary lesions was 1.8±0.6 cm. The sensitivity, specificity, accuracy and complications of the biopsies were investigated. The risk factors of diagnostic failure were assessed using univariate and multivariate analyses. The sample's adequacy for molecular testing of non-small cell lung cancer (NSCLC) was analyzed. The overall sensitivity, specificity, and accuracy for diagnosis of malignancy were 92.0% (311/338), 98.6% (219/222), and 94.6% (530/560), respectively. The incidence of bleeding complications was 22.9% (128/560), and the incidence of pneumothorax was 10.4% (58/560). Logistic multivariate regression analysis showed that the independent risk factors for diagnostic failure were a lesion size ≤1 cm [odds ratio (OR), 3.95; P=0.007], lower lobe lesions (OR, 2.83; P=0.001), and pneumothorax (OR, 1.98; P=0.004). Genetic analysis was successfully performed on 95.45% (168/176) of specimens diagnosed as NSCLC. At least 96.8% of samples with two or more passes from a lesion were sufficient for molecular testing. The diagnostic yield of small pulmonary lesions by CT-guided TCNB is high, and the procedure is relatively safe. A lesion size ≤1 cm, lower lobe lesions, and pneumothorax are independent risk factors for biopsy diagnostic failure. TCNB specimens could provide adequate tissues for molecular testing.

  20. Reporting of sample size calculations in analgesic clinical trials: ACTTION systematic review.

    PubMed

    McKeown, Andrew; Gewandter, Jennifer S; McDermott, Michael P; Pawlowski, Joseph R; Poli, Joseph J; Rothstein, Daniel; Farrar, John T; Gilron, Ian; Katz, Nathaniel P; Lin, Allison H; Rappaport, Bob A; Rowbotham, Michael C; Turk, Dennis C; Dworkin, Robert H; Smith, Shannon M

    2015-03-01

    Sample size calculations determine the number of participants required to have sufficiently high power to detect a given treatment effect. In this review, we examined the reporting quality of sample size calculations in 172 publications of double-blind randomized controlled trials of noninvasive pharmacologic or interventional (ie, invasive) pain treatments published in European Journal of Pain, Journal of Pain, and Pain from January 2006 through June 2013. Sixty-five percent of publications reported a sample size calculation but only 38% provided all elements required to replicate the calculated sample size. In publications reporting at least 1 element, 54% provided a justification for the treatment effect used to calculate sample size, and 24% of studies with continuous outcome variables justified the variability estimate. Publications of clinical pain condition trials reported a sample size calculation more frequently than experimental pain model trials (77% vs 33%, P < .001) but did not differ in the frequency of reporting all required elements. No significant differences in reporting of any or all elements were detected between publications of trials with industry and nonindustry sponsorship. Twenty-eight percent included a discrepancy between the reported number of planned and randomized participants. This study suggests that sample size calculation reporting in analgesic trial publications is usually incomplete. Investigators should provide detailed accounts of sample size calculations in publications of clinical trials of pain treatments, which is necessary for reporting transparency and communication of pre-trial design decisions. In this systematic review of analgesic clinical trials, sample size calculations and the required elements (eg, treatment effect to be detected; power level) were incompletely reported. A lack of transparency regarding sample size calculations may raise questions about the appropriateness of the calculated sample size. Copyright © 2015 American Pain Society. All rights reserved.

  1. Treatment effect on biases in size estimation in spider phobia.

    PubMed

    Shiban, Youssef; Fruth, Martina B; Pauli, Paul; Kinateder, Max; Reichenberger, Jonas; Mühlberger, Andreas

    2016-12-01

    The current study investigates biases in size estimations made by spider-phobic and healthy participants before and after treatment. Forty-one spider-phobic and 20 healthy participants received virtual reality (VR) exposure treatment and were then asked to rate the size of a real spider immediately before and, on average, 15 days after the treatment. During the VR exposure treatment, skin conductance response was assessed. Prior to the treatment, both groups tended to overestimate the size of the spider, but this size estimation bias was significantly larger in the phobic group than in the control group. The VR exposure treatment reduced this bias, which was reflected in a significantly smaller size rating post treatment. However, the size estimation bias was unrelated to the skin conductance response. Our results confirm the hypothesis that size estimation by spider-phobic patients is biased. This bias is not stable over time and can be decreased with adequate treatment. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Translation, adaptation, and validation of the Stanford Hypnotic Clinical Scale in Puerto Rico.

    PubMed

    Deynes-Exclusa, Yazmin; Sayers-Montalvo, Sean K; Martinez-Taboas, Alfonso

    2011-04-01

    The only hypnotizability scale that has been translated and validated for the Puerto Rican population is the Barber Suggestibility Scale (BSS). In this article, the Stanford Hypnotic Clinical Scale (SHCS) was translated and validated for this population. The translated SHCS ("Escala Stanford de Hipnosis Clinica" [ESHC]) was administered individually to 100 Puerto Rican college students. No significant differences were found between the norms of the original SHCS samples and the Spanish version of the SHCS, and both samples showed similar distributions. Both the Spanish version's internal reliability and its item discrimination index were adequate. The authors conclude that the ESHC is an adequate instrument to measure hypnotizability in the Puerto Rican population.
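
    The two psychometric checks reported here, internal reliability and item discrimination, are straightforward to compute from an item-response matrix. A minimal sketch, assuming dichotomously scored items; the data below are random placeholders, not ESHC responses.

        import numpy as np

        def cronbach_alpha(X):
            """Internal-consistency reliability; X is respondents x items."""
            k = X.shape[1]
            item_var = X.var(axis=0, ddof=1).sum()
            total_var = X.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        def item_discrimination(X):
            """Corrected item-total correlation for each item."""
            r = []
            for j in range(X.shape[1]):
                rest = np.delete(X, j, axis=1).sum(axis=1)  # total w/o item j
                r.append(np.corrcoef(X[:, j], rest)[0, 1])
            return np.array(r)

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(100, 12))   # placeholder: 100 students
        print(cronbach_alpha(X), item_discrimination(X).round(2))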

  3. Prospects of Fine-Mapping Trait-Associated Genomic Regions by Using Summary Statistics from Genome-wide Association Studies.

    PubMed

    Benner, Christian; Havulinna, Aki S; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ripatti, Samuli; Pirinen, Matti

    2017-10-05

    During the past few years, various novel statistical methods have been developed for fine-mapping with the use of summary statistics from genome-wide association studies (GWASs). Although these approaches require information about the linkage disequilibrium (LD) between variants, there has not been a comprehensive evaluation of how estimation of the LD structure from reference genotype panels performs in comparison with that from the original individual-level GWAS data. Using population genotype data from Finland and the UK Biobank, we show here that a reference panel of 1,000 individuals from the target population is adequate for a GWAS cohort of up to 10,000 individuals, whereas smaller panels, such as those from the 1000 Genomes Project, should be avoided. We also show, both theoretically and empirically, that the size of the reference panel needs to scale with the GWAS sample size; this has important consequences for the application of these methods in ongoing GWAS meta-analyses and large biobank studies. We conclude by providing software tools and by recommending practices for sharing LD information to more efficiently exploit summary statistics in genetics research. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
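
    The statistical reason the reference panel must scale with the GWAS can be seen in a toy simulation: the sampling error of a pairwise LD (correlation) estimate shrinks only as 1/sqrt(n_ref), so larger GWASs, whose summary statistics are more precise, need proportionally better LD estimates. The two-variant setup and the true LD value below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        TRUE_R = 0.6                        # assumed true LD (correlation)

        def ld_rms_error(n_ref, reps=2000):
            """RMS error of sample LD estimated from n_ref reference people."""
            cov = [[1.0, TRUE_R], [TRUE_R, 1.0]]
            sq_errs = []
            for _ in range(reps):
                g = rng.multivariate_normal([0.0, 0.0], cov, size=n_ref)
                r_hat = np.corrcoef(g[:, 0], g[:, 1])[0, 1]
                sq_errs.append((r_hat - TRUE_R) ** 2)
            return float(np.sqrt(np.mean(sq_errs)))

        for n in (100, 400, 1600):          # 4x the panel, ~half the error
            print(n, round(ld_rms_error(n), 4))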

  4. Geriatric dentistry education and context in a selection of countries in 5 continents.

    PubMed

    Marchini, Leonardo; Ettinger, Ronald; Chen, Xi; Kossioni, Anastassia; Tan, Haiping; Tada, Sayaka; Ikebe, Kazunori; Dosumu, Elizabeth Bosede; Oginni, Fadekemi O; Akeredolu, Patricia Adetokunbo; Butali, Azeez; Donnelly, Leeann; Brondani, Mario; Fritzsch, Bernd; Adeola, Henry A

    2018-05-01

    To summarize and discuss how geriatric dentistry has been addressed in dental schools of different countries regarding to (1) teaching students at the predoctoral level; (2) advanced training, and (3) research. A convenience sample of faculty members from a selection of high, upper-middle and lower-middle income countries were recruited to complete the survey. The survey had 5 open-ended main topics, and asked about (1) the size of their elderly population, (2) general information about dental education; (3) the number of dental schools teaching geriatric dentistry, and their teaching methods; (4) advanced training in geriatric dentistry; (5) scholarship/research in geriatric dentistry. (1) There is great variation in the size of elderly population; (2) duration of training and content of dental education curriculum varies; (3) geriatric dentistry has not been established as a standalone course in dental schools in the majority of the countries, (4) most countries, with the exception of Japan, lack adequate number of dentists trained in geriatric dentistry as well as training programs, and (5) geriatric dentistry-related research has increased in recent years in scope and content, although the majority of these papers are not in English. © 2018 Special Care Dentistry Association and Wiley Periodicals, Inc.

  5. A fresh look at crater scaling laws for normal and oblique hypervelocity impacts

    NASA Technical Reports Server (NTRS)

    Watts, A. J.; Atkinson, D. R.; Rieco, S. R.; Brandvold, J. B.; Lapin, S. L.; Coombs, C. R.

    1993-01-01

    With the concomitant increase in the amount of man-made debris and the ever increasing use of space satellites, the issue of accidental collisions with particles becomes more severe. While the natural micrometeoroid population is unavoidable and assumed constant, continued launches increase the debris population at a steady rate. Debris currently includes items ranging in size from microns to meters which originated from spent satellites and rocket cases. To understand and model these environments, impact damage in the form of craters and perforations must be analyzed. Returned spacecraft materials, such as those from LDEF and Solar Max, have provided such a testbed. From these space-aged samples, various impact parameters (i.e., particle size, particle and target material, particle shape, relative impact speed, etc.) may be determined. These types of analyses require generic analytic scaling laws that can adequately describe the impact effects. Currently, most existing analytic scaling laws are little more than curve fits to limited data; because they are not based on physics, they are not generically applicable over a wide range of impact parameters. During this study, a series of physics-based scaling laws for normal and oblique crater and perforation formation was generated for two target materials: aluminum and Teflon.
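
    Scaling laws of the kind the study derives are typically power laws in impactor size, density ratio, and the normal component of impact velocity. The sketch below shows only this generic functional form; the constant and exponents are placeholders, not the coefficients fitted in this work.

        import math

        def crater_depth(d_p, v, theta_deg, rho_p, rho_t,
                         K=1.0, a=0.5, b=2.0 / 3.0):
            """Generic power-law crater depth (hypothetical coefficients).

            d_p          -- impactor diameter
            v            -- impact speed
            theta_deg    -- impact angle measured from the surface normal
            rho_p, rho_t -- projectile and target densities
            """
            v_n = v * math.cos(math.radians(theta_deg))  # oblique impacts
            return K * d_p * (rho_p / rho_t) ** a * v_n ** b

        # Illustrative: 100-micron Al sphere on Al at 10 km/s, 45 degrees.
        print(crater_depth(d_p=100e-6, v=10.0, theta_deg=45.0,
                           rho_p=2.7, rho_t=2.7))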

  6. Relationship between organochlorine pesticides and stress indicators in hawksbill sea turtle (Eretmochelys imbricata) nesting at Punta Xen (Campeche), Southern Gulf of Mexico.

    PubMed

    Tremblay, Nelly; Ortíz Arana, Alejandro; González Jáuregui, Mauricio; Rendón-von Osten, Jaime

    2017-03-01

    Data on the impact of environmental pollution on the homeostasis of sea turtles remain scarce, particularly in the Southern Gulf of Mexico. As many municipalities along the coastline of the Yucatan Peninsula lack a waste treatment plant, these organisms could be particularly vulnerable. We searched for relationships between the presence of organochlorine pesticides (OCPs) and the levels of several oxidative and pollutant stress indicators in the hawksbill sea turtle (Eretmochelys imbricata) during the 2010 nesting season at Punta Xen (Campeche, Mexico). Of the 30 sampled sea turtles, the endosulfan, aldrin-related (aldrin, endrin, dieldrin, endrin ketone, endrin aldehyde) and dichlorodiphenyltrichloroethane (DDT) families were detected in 17, 21, and 26 individuals, respectively. Significant correlations were found between sea turtle size and methoxychlor concentration, between plasma cholinesterase activity and the heptachlor family, and between catalase activity and the hexachlorocyclohexane family. Cholinesterase activity in washed erythrocytes and lipid peroxidation were positively correlated with glutathione reductase activity. Antioxidant enzyme action appears adequate, as no lipid damage was correlated with any OCP. Future studies are necessary to evaluate the effect of OCPs on males of the area, given the notable detection of methoxychlor, which targets endocrine functioning and whose concentration increases with sea turtle size.

  7. Assessment and measurement of patient-centered medical home implementation: the BCBSM experience.

    PubMed

    Alexander, Jeffrey A; Paustian, Michael; Wise, Christopher G; Green, Lee A; Fetters, Michael D; Mason, Margaret; El Reda, Darline K

    2013-01-01

    Our goal was to describe an approach to patient-centered medical home (PCMH) measurement based on delineating the desired properties of the measurement relative to assumptions about the PCMH and the uses of the measure by Blue Cross Blue Shield of Michigan (BCBSM) and health services researchers. We developed and validated an approach to assess 13 functional domains of PCMHs and 128 capabilities within those domains. A measure of PCMH implementation was constructed using data from the validated self-assessment and then tested on a large sample of primary care practices in Michigan. Our results suggest that the measure adequately addresses the specific requirements and assumptions underlying the BCBSM PCMH program: the ability to assess change in level of implementation; the ability to compare across practices regardless of size, affiliation, or payer mix; and the ability to assess implementation of the PCMH through different sequencing of capabilities and domains. Our experience illustrates that approaches to measuring PCMH should be driven by the measures' intended use(s) and users, and that a one-size-fits-all approach may not be appropriate. Rather than promoting the BCBSM PCMH measure as the gold standard, our study highlights the challenges, strengths, and limitations of developing a standardized approach to PCMH measurement.

  8. Multimodal neuroimaging of male and female brain structure in health and disease across the life span

    PubMed Central

    Thompson, Paul M.

    2016-01-01

    Sex differences in brain development and aging are important to identify, as they may help to understand risk factors and outcomes in brain disorders that are more prevalent in one sex compared with the other. Brain imaging techniques have advanced rapidly in recent years, yielding detailed structural and functional maps of the living brain. Even so, studies are often limited in sample size, and inconsistent findings emerge, one example being varying findings regarding sex differences in the size of the corpus callosum. More recently, large-scale neuroimaging consortia such as the Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium have formed, pooling together expertise, data, and resources from hundreds of institutions around the world to ensure adequate power and reproducibility. These initiatives are helping us to better understand how brain structure is affected by development, disease, and potential modulators of these effects, including sex. This review highlights some established and disputed sex differences in brain structure across the life span, as well as pitfalls related to interpreting sex differences in health and disease. We also describe sex-related findings from the ENIGMA Consortium, and ongoing efforts to better understand sex differences in brain circuitry. © 2016 The Authors. Journal of Neuroscience Research Published by Wiley Periodicals, Inc. PMID:27870421

  9. Determination of the optimal sample size for a clinical trial accounting for the population size.

    PubMed

    Stallard, Nigel; Miller, Frank; Day, Simon; Hee, Siew Wan; Madan, Jason; Zohar, Sarah; Posch, Martin

    2017-07-01

    The problem of choosing a sample size for a clinical trial is a very common one. In some settings, such as rare diseases or other small populations, the large sample sizes usually associated with the standard frequentist approach may be infeasible, suggesting that the sample size chosen should reflect the size of the population under consideration. Incorporation of the population size is possible in a decision-theoretic approach, either explicitly by assuming that the population size is fixed and known, or implicitly through geometric discounting of the gain from future patients reflecting the expected population size. This paper develops such approaches. Building on previous work, an asymptotic expression is derived for the sample size of single- and two-arm clinical trials in the general case of a primary endpoint with a distribution of one-parameter exponential family form, optimizing a utility function that quantifies the cost and gain per patient as a continuous function of this parameter. It is shown that as the population size N, or the expected size N* in the case of geometric discounting, becomes large, the optimal trial size is O(N^(1/2)) or O(N*^(1/2)). The sample size obtained from the asymptotic expression is also compared with the exact optimal sample size in examples with Bernoulli- and Poisson-distributed responses, showing that the asymptotic approximation can be reasonable even at relatively small sample sizes. © 2016 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
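
    A toy model (not the paper's exponential-family utility) shows where the square-root scaling comes from: if each of the N future patients incurs an expected loss proportional to the variance of the estimated treatment effect, 2*sigma^2/n for n patients per arm, while each trial patient carries a fixed cost c, the total loss is minimized at n* = sigma*sqrt(kN/c) = O(N^(1/2)). A brute-force check with illustrative constants:

        import numpy as np

        SIGMA, K_LOSS, COST = 1.0, 1.0, 1.0   # illustrative constants

        def total_loss(n, N):
            """Estimation loss for N future patients plus trial cost."""
            return N * K_LOSS * 2 * SIGMA ** 2 / n + COST * 2 * n

        for N in (1_000, 10_000, 100_000):
            ns = np.arange(1, N)
            n_opt = ns[np.argmin(total_loss(ns, N))]
            print(N, n_opt, int(np.sqrt(N)))   # n_opt tracks sqrt(N)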

  10. Controlled Trials in Children: Quantity, Methodological Quality and Descriptive Characteristics of Pediatric Controlled Trials Published 1948-2006

    PubMed Central

    Thomson, Denise; Hartling, Lisa; Cohen, Eyal; Vandermeer, Ben; Tjosvold, Lisa; Klassen, Terry P.

    2010-01-01

    Background The objective of this study was to describe randomized controlled trials (RCTs) and controlled clinical trials (CCTs) in child health published between 1948 and 2006, in terms of quantity, methodological quality, and publication and trial characteristics. We used the Trials Register of the Cochrane Child Health Field for overall trends and a sample from this to explore trial characteristics in more detail. Methodology/Principal Findings We extracted descriptive data on a random sample of 578 trials. Ninety-six percent of the trials were published in English; the percentage of child-only trials was 90.5%. The most frequent diagnostic categories were infectious diseases (13.2%), behavioural and psychiatric disorders (11.6%), neonatal critical care (11.4%), respiratory disorders (8.9%), non-critical neonatology (7.9%), and anaesthesia (6.5%). There were significantly fewer child-only studies (i.e., more mixed child and adult studies) over time (P = 0.0460). The proportion of RCTs to CCTs increased significantly over time (P<0.0001), as did the proportion of multicentre trials (P = 0.002). Significant increases over time were found in methodological quality (Jadad score) (P<0.0001), the proportion of double-blind studies (P<0.0001), and studies with adequate allocation concealment (P<0.0001). Additionally, we found an improvement in reporting over time: adequate description of withdrawals and losses to follow-up (P<0.0001), sample size calculations (P<0.0001), and intention-to-treat analysis (P<0.0001). However, many trials still do not describe their level of blinding, and allocation concealment was inadequately reported in the majority of studies across the entire time period. The proportion of studies with industry funding decreased slightly over time (P = 0.003), and these studies were more likely to report positive conclusions (P = 0.028). Conclusions/Significance The quantity and quality of pediatric controlled trials has increased over time; however, much work remains to be done, particularly in improving methodological issues around conduct and reporting of trials. PMID:20927344

  11. Identification of Particles in Parenteral Drug Raw Materials.

    PubMed

    Lee, Kathryn; Lankers, Markus; Valet, Oliver

    2018-04-18

    Particulate contamination in drug products poses a safety risk and is therefore regulated. These particles can originate at the very beginning of the manufacturing process, in the raw materials. To prevent them, it is important to understand what they are and where they come from, so that raw material quality, processing, and shipping can be improved; correct identification of particles seen in raw materials is therefore essential. Raw materials need to be of a certain quality with respect to physical and chemical composition, and must be free of particulate contaminants that could carry over into the product or indicate that the raw materials are not pure enough to make a good quality product. Particles are often noticed during handling of raw materials because their color, size, or shape differs from the bulk material. Because particles may appear to the eye to be very different things than they actually are, microscopic, chemical, and elemental analyses are required for proper identification. This paper shows how three different spectroscopy tools, used correctly and together, can identify extrinsic, intrinsic, and inherent particles. Sources of particles can be humans and the environment (extrinsic), the process itself (intrinsic), or the formulation (inherent). Microscope versions of Raman spectroscopy, laser-induced breakdown spectroscopy (LIBS), and IR spectroscopy are excellent tools for identifying particles because they are fast and accurate techniques needing minimal sample preparation that can provide chemical composition as well as images for identification. Their microanalysis capabilities allow easy analysis of different portions of a sample, so multiple components can be identified and sample preparation can be reduced. Using just one of these techniques may not give identification results adequate for tracing the source of contamination; the complementarity of the techniques provides the advantage of identifying chemical and molecular components, together with elemental and image analyses. Correct interpretation of the results from these techniques is also very important. Copyright © 2018, Parenteral Drug Association.

  12. Seed particle response and size characterization in high speed flows

    NASA Technical Reports Server (NTRS)

    Rudoff, Roger C.; Bachalo, William D.

    1991-01-01

    The response of seed particles ranging between 0.7 and 8.7 micron is determined using a phase Doppler particle analyzer which simultaneously measures particle size and velocity. The stagnant seed particles are entrained into a high speed free jet at velocities ranging from 40 to 300 m/s. The size-mean axial velocity correlation and size-rms velocity correlations are used to determine the particle response to the sudden acceleration. It was determined that at the lower speeds, seed particles up to approximately 5 microns are adequate, but as velocities approach 300 m/s only particles on the order of one micron are suitable. The ability to determine size and velocity simultaneously is essential if seeding with polydispersions is used since it allows the rejection of data which will not accurately represent the flow field.
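
    The reported size cutoff follows from the Stokes-drag response time, tau = rho_p * d_p^2 / (18 * mu): a particle tracks flow fluctuations only on time scales longer than tau. A quick estimate assuming unit-density seed particles in room-temperature air (both assumptions, since the seed material is not specified here):

        import math

        MU_AIR = 1.8e-5                 # dynamic viscosity of air, Pa*s

        def stokes_response_time(d_p, rho_p=1000.0):
            """Particle response time, tau = rho_p * d_p^2 / (18 * mu)."""
            return rho_p * d_p ** 2 / (18.0 * MU_AIR)

        for d_um in (0.7, 1.0, 5.0, 8.7):
            tau = stokes_response_time(d_um * 1e-6)
            f_c = 1.0 / (2.0 * math.pi * tau)   # rough frequency limit
            print(f"{d_um:4.1f} um: tau = {tau:.1e} s, f_c = {f_c:.0e} Hz")

    Under these assumptions a 1-micron particle responds within a few microseconds, whereas an 8.7-micron particle is nearly two orders of magnitude slower, consistent with the finding that only micron-sized seeds are suitable as velocities approach 300 m/s.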

  13. 30 CFR 77.503 - Electric conductors; capacity and insulation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Electric conductors; capacity and insulation... UNDERGROUND COAL MINES Electrical Equipment-General § 77.503 Electric conductors; capacity and insulation. Electric conductors shall be sufficient in size and have adequate current carrying capacity and be of such...

  14. 30 CFR 77.503 - Electric conductors; capacity and insulation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Electric conductors; capacity and insulation... UNDERGROUND COAL MINES Electrical Equipment-General § 77.503 Electric conductors; capacity and insulation. Electric conductors shall be sufficient in size and have adequate current carrying capacity and be of such...

  15. 24 CFR 3280.609 - Water distribution systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Water distribution systems. 3280....609 Water distribution systems. (a) Water supply—(1) Supply piping. Piping systems shall be sized to provide an adequate quantity of water to each plumbing fixture at a flow rate sufficient to keep the...

  16. 24 CFR 3280.609 - Water distribution systems.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 5 2012-04-01 2012-04-01 false Water distribution systems. 3280....609 Water distribution systems. (a) Water supply—(1) Supply piping. Piping systems shall be sized to provide an adequate quantity of water to each plumbing fixture at a flow rate sufficient to keep the...

  17. 9 CFR 3.54 - Feeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... otherwise might be required to provide adequate veterinary care. The food shall be free from contamination... for the condition and size of the rabbit. (b) Food receptacles shall be accessible to all rabbits in a primary enclosure and shall be located so as to minimize contamination by excreta. All food receptacles...

  18. 21 CFR 111.20 - What design and construction requirements apply to your physical plant?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... surfaces, with microorganisms, chemicals, filth, or other extraneous material. Your physical plant must have, and you must use, separate or defined areas of adequate size or other control systems, such as computerized inventory controls or automated systems of separation, to prevent contamination and mixups of...

  19. 21 CFR 111.20 - What design and construction requirements apply to your physical plant?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... surfaces, with microorganisms, chemicals, filth, or other extraneous material. Your physical plant must have, and you must use, separate or defined areas of adequate size or other control systems, such as computerized inventory controls or automated systems of separation, to prevent contamination and mixups of...

  20. A landscape approach for assessing the ecological feasibility of a black bear population recovery

    EPA Science Inventory

    There is great interest in recovering populations of large carnivores in locations where they previously were extirpated or severely reduced in size as a result of human activity. Determining the ecological feasibility (i.e., is adequate habitat available?) of a species is diffi...

  1. Space Guidelines for Planning Educational Facilities. Planning for Education.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City.

    In 1983 the Oklahoma Legislature adopted facility guidelines for the purpose of defining, organizing, and encouraging the planning of adequate environments for education. The guidelines contained in this booklet have been designed to allow for the requirements of all Oklahoma school districts regardless of size or educational program. The…

  2. 24 CFR 3280.609 - Water distribution systems.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 5 2013-04-01 2013-04-01 false Water distribution systems. 3280....609 Water distribution systems. (a) Water supply—(1) Supply piping. Piping systems shall be sized to provide an adequate quantity of water to each plumbing fixture at a flow rate sufficient to keep the...

  3. 46 CFR 177.700 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) CONSTRUCTION AND ARRANGEMENT Crew Spaces § 177.700 General requirements. (a) A crew accommodation space and a work space must be of sufficient size, adequate construction, and with suitable equipment to provide for the safe operation of the vessel and the protection and accommodation of the crew in a manner...

  4. 46 CFR 116.700 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Spaces § 116.700 General requirements. (a) A crew accommodation space and a work space must be of sufficient size, adequate construction, and with suitable equipment to provide for the safe operation of the..., service, route, speed, and modes of operation of the vessel. (b) The deck above a crew accommodation space...

  5. 46 CFR 116.700 - General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Spaces § 116.700 General requirements. (a) A crew accommodation space and a work space must be of sufficient size, adequate construction, and with suitable equipment to provide for the safe operation of the..., service, route, speed, and modes of operation of the vessel. (b) The deck above a crew accommodation space...

  6. 46 CFR 177.700 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) CONSTRUCTION AND ARRANGEMENT Crew Spaces § 177.700 General requirements. (a) A crew accommodation space and a work space must be of sufficient size, adequate construction, and with suitable equipment to provide for the safe operation of the vessel and the protection and accommodation of the crew in a manner...

  7. Planning the School Library.

    ERIC Educational Resources Information Center

    Babcock, Ruth E.; And Others

    This report consists of recommendations for library facilities in either new or existing school buildings. Suggestions are made for the location and size of the library. Also included are considerations for library acoustics, heating and interior finish. It is considered to be of critical importance that adequate and separate space should be…

  8. 7 CFR 58.210 - Dry storage of product.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Dry storage of product. 58.210 Section 58.210 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards....210 Dry storage of product. Storage rooms for the dry storage of product shall be adequate in size...

  9. 7 CFR 58.210 - Dry storage of product.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Dry storage of product. 58.210 Section 58.210 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards....210 Dry storage of product. Storage rooms for the dry storage of product shall be adequate in size...

  10. 7 CFR 58.210 - Dry storage of product.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Dry storage of product. 58.210 Section 58.210 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards....210 Dry storage of product. Storage rooms for the dry storage of product shall be adequate in size...

  11. 7 CFR 58.210 - Dry storage of product.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Dry storage of product. 58.210 Section 58.210 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards....210 Dry storage of product. Storage rooms for the dry storage of product shall be adequate in size...

  12. The Stoner-Wohlfarth Model of Ferromagnetism

    ERIC Educational Resources Information Center

    Tannous, C.; Gieraltowski, J.

    2008-01-01

    The Stoner-Wohlfarth (SW) model is the simplest model that describes adequately the physics of fine magnetic grains, the magnetization of which can be used in digital magnetic storage (floppies, hard disks and tapes). Magnetic storage density is presently increasing steadily in almost the same way as electronic device size and circuitry are…
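
    In reduced units, a single-domain grain in the SW model has energy e(theta) = 1/4 - (1/4)cos 2(theta - psi) - h cos(theta), where theta is the magnetization angle, psi the angle between the applied field and the easy axis, and h = H/H_K the reduced field. A minimal numerical sketch traces a hysteresis loop by relaxing into the local minimum as the field sweeps; the descent step size and field grid are arbitrary choices.

        import numpy as np

        PSI = np.radians(45.0)      # field at 45 degrees to the easy axis

        def de_dtheta(theta, h):
            """Derivative of e = 1/4 - cos(2(theta - PSI))/4 - h*cos(theta)."""
            return 0.5 * np.sin(2.0 * (theta - PSI)) + h * np.sin(theta)

        def relax(theta, h, lr=0.2, steps=1000):
            """Overdamped descent into the nearest local energy minimum."""
            for _ in range(steps):
                theta = theta - lr * de_dtheta(theta, h)
            return theta

        theta, loop = PSI, []
        sweep = np.concatenate([np.linspace(2, -2, 400),
                                np.linspace(-2, 2, 400)])
        for h in sweep:
            theta = relax(theta, h)
            loop.append((h, np.cos(theta)))  # magnetization along the field
        # For PSI = 45 deg the branches switch at |h| = 0.5 (the SW astroid);
        # for PSI = 0 the loop is square, with switching at |h| = 1.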

  13. 36 CFR 251.23 - Experimental areas and research natural areas.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... research natural areas. 251.23 Section 251.23 Parks, Forests, and Public Property FOREST SERVICE... and research natural areas. The Chief of the Forest Service shall establish and permanently record a... a series of research natural areas, sufficient in number and size to illustrate adequately or typify...

  14. 36 CFR 251.23 - Experimental areas and research natural areas.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... research natural areas. 251.23 Section 251.23 Parks, Forests, and Public Property FOREST SERVICE... and research natural areas. The Chief of the Forest Service shall establish and permanently record a... a series of research natural areas, sufficient in number and size to illustrate adequately or typify...

  15. Requirements for Minimum Sample Size for Sensitivity and Specificity Analysis

    PubMed Central

    Adnan, Tassha Hilda

    2016-01-01

    Sensitivity and specificity analysis is commonly used for screening and diagnostic tests. The main issue researchers face is determining sample sizes sufficient for screening and diagnostic studies. Although formulas for sample size calculation are available, the majority of researchers are not mathematicians or statisticians, so the calculation might not be easy for them. This review paper provides sample size tables for sensitivity and specificity analysis. The tables were derived from the formulation of the sensitivity and specificity test using Power Analysis and Sample Size (PASS) software, based on the desired type I error, power, and effect size. Approaches for using the tables are also discussed. PMID:27891446
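
    The calculation behind such tables can be reproduced directly. A commonly used version (Buderer's normal-approximation formula, shown here as an illustration; the paper's PASS-derived tables may differ in detail) chooses n so that the sensitivity or specificity estimate achieves a confidence interval of half-width d:

        import math
        from scipy.stats import norm

        def n_for_sensitivity(sens, d, prevalence, alpha=0.05):
            """Total participants so the sensitivity estimate has a
            100(1 - alpha)% CI of half-width d (Buderer's formula)."""
            z = norm.ppf(1 - alpha / 2)
            n_cases = z ** 2 * sens * (1 - sens) / d ** 2
            return math.ceil(n_cases / prevalence)   # scale up for non-cases

        def n_for_specificity(spec, d, prevalence, alpha=0.05):
            z = norm.ppf(1 - alpha / 2)
            n_controls = z ** 2 * spec * (1 - spec) / d ** 2
            return math.ceil(n_controls / (1 - prevalence))

        # Illustrative: sensitivity 0.90, half-width 0.05, prevalence 0.10
        print(n_for_sensitivity(0.90, 0.05, 0.10))    # -> 1383 participants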

  16. Street trees reduce the negative effects of urbanization on birds.

    PubMed

    Pena, João Carlos de Castro; Martello, Felipe; Ribeiro, Milton Cezar; Armitage, Richard A; Young, Robert J; Rodrigues, Marcos

    2017-01-01

    The effect of streets on biodiversity is an important aspect of urban ecology, but it has been neglected worldwide. Several vegetation attributes (e.g. street tree density and diversity) have important effects on biodiversity and ecological processes. In this study, we evaluated the influence of urban vegetation, represented by characteristics of street trees (canopy size, proportion of native tree species and tree species richness), characteristics of the landscape (distance to parks and vegetation quantity), and human impacts (human population size and exposure to noise) on taxonomic data and functional diversity indices of the bird community inhabiting streets. The study area was the southern region of Belo Horizonte (Minas Gerais, Brazil), a largely urbanized city in the understudied Neotropical region. Bird data were collected at 60 point-count locations distributed across the streets of the landscape. We used a series of competing GLM models, ranked with Akaike's information criterion corrected for small sample sizes (AICc), to assess the relative contribution of the different sets of variables in explaining the observed patterns. Seventy-three bird species were observed exploiting the streets: native species were the most abundant and frequent throughout this landscape. The bird community's functional richness and Rao's quadratic entropy presented values lower than 0.5; this landscape therefore favored few functional traits. Exposure to noise was the most limiting factor for this bird community. However, the average size of arboreal patches and, especially, the characteristics of street trees were able to reduce the negative effects of noise on the bird community. These results show the importance of adequately planning the urban afforestation process: increasing tree species richness, preserving large trees and planting more native tree species in the streets are management practices that will increase bird species richness, abundance and community functional aspects, and consequently improve human wellbeing and quality of life.
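
    The model-selection procedure described (competing GLMs ranked by AICc) follows a standard recipe. A minimal sketch with statsmodels, in which the response and predictors are random placeholders rather than the study's bird and street-tree variables:

        import numpy as np
        import statsmodels.api as sm

        def aicc(fit, n):
            """AICc = AIC + 2k(k + 1)/(n - k - 1), k = fitted parameters."""
            k = len(fit.params)
            return fit.aic + 2.0 * k * (k + 1) / (n - k - 1)

        rng = np.random.default_rng(0)
        n = 60                               # e.g., 60 point-count sites
        noise = rng.normal(size=n)           # placeholder predictors
        trees = rng.normal(size=n)
        richness = rng.poisson(np.exp(1.5 - 0.3 * noise + 0.2 * trees))

        models = {
            "noise":       sm.add_constant(noise[:, None]),
            "trees":       sm.add_constant(trees[:, None]),
            "noise+trees": sm.add_constant(np.column_stack([noise, trees])),
        }
        scores = {name: aicc(sm.GLM(richness, X,
                                    family=sm.families.Poisson()).fit(), n)
                  for name, X in models.items()}
        print(min(scores, key=scores.get), scores)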

  17. MORPHOLOGICAL PROPERTIES OF Lyα EMITTERS AT REDSHIFT 4.86 IN THE COSMOS FIELD: CLUMPY STAR FORMATION OR MERGER?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, Masakazu A. R.; Taniguchi, Yoshiaki; Kajisawa, Masaru

    2016-03-01

    We investigate morphological properties of 61 Lyα emitters (LAEs) at z = 4.86 identified in the COSMOS field, based on Hubble Space Telescope Advanced Camera for Surveys (ACS) imaging data in the F814W band. Out of the 61 LAEs, we find the ACS counterparts for 54 LAEs. Eight LAEs show double-component structures with a mean projected separation of 0.″63 (∼4.0 kpc at z = 4.86). Considering the faintness of these ACS sources, we carefully evaluate their morphological properties, that is, size and ellipticity. While some of them are compact and indistinguishable from the point-spread function (PSF) half-light radius of 0.″07 (∼0.45 kpc), the others are clearly larger than the PSF size and spatially extended up to 0.″3 (∼1.9 kpc). We find that the ACS sources show a positive correlation between ellipticity and size and that ACS sources with large size and round shape are absent. Our Monte Carlo simulation suggests that the correlation can be explained by (1) deformation effects via PSF broadening and shot noise or (2) source blending, in which two or more sources with small separation are blended in our ACS image and detected as a single elongated source. Therefore, the 46 single-component LAEs could contain sources that consist of double (or multiple) components with small spatial separation (i.e., ≲0.″3 or 1.9 kpc). Further observation with high angular resolution at longer wavelengths (e.g., rest-frame wavelengths of ≳4000 Å) is needed to decipher which interpretation is adequate for our LAE sample.

  18. Interventions to improve mental health nurses' skills, attitudes, and knowledge related to people with a diagnosis of borderline personality disorder: Systematic review.

    PubMed

    Dickens, Geoffrey L; Hallett, Nutmeg; Lamont, Emma

    2016-04-01

    There is some evidence that mental health nurses have poor attitudes towards people with a diagnosis of borderline personality disorder and that this might impact negatively on the development of helpful therapeutic relationships. We aimed to collate the current evidence about interventions that have been devised to improve the responses of mental health nurses towards this group of people. Systematic review in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Comprehensive terms were used to search CINAHL, PsycINFO, Medline, Biomedical Reference Collection: Comprehensive, Web of Science, ASSIA, Cochrane Library, EMBASE, ProQuest [including Dissertations/Theses], and Google Scholar for relevant studies. Included studies were those that described an intervention whose aim was to improve attitudes towards, knowledge about or responses to people with a diagnosis of borderline personality disorder. The sample described had to include mental health nurses. Information about study characteristics, intervention content and mode of delivery was extracted. Study quality was assessed, and effect sizes of interventions and potential moderators of those interventions were extracted and converted to Cohen's d to aid comparison. The search strategy yielded a total of eight studies, half of which were judged methodologically weak, with the remaining four judged to be of moderate quality. Only one study employed a control group. The largest effect sizes were found for changes related to cognitive attitudes including knowledge; smaller effect sizes were found in relation to changes in affective outcomes. Self-reported behavioural change in the form of increased use of components of Dialectical Behaviour Therapy following training in this treatment was associated with moderate effect sizes. The largest effect sizes were found among those with poorer baseline attitudes and without previous training about borderline personality disorder. There is a dearth of high quality evidence about the attitudes of mental health nurses towards people with a diagnosis of borderline personality disorder. This is an important gap, since nurses hold the poorest attitudes among the professional disciplines involved in the care of this group. Further work is needed to ascertain the most effective elements of training programmes; this should involve trials of interventions in samples that are compared against adequately matched control groups. Copyright © 2015 Elsevier Ltd. All rights reserved.
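
    Converting heterogeneous study outcomes to Cohen's d, as done in this review, uses standard formulas. A minimal sketch for two common inputs; the numbers are illustrative, not taken from the included studies.

        import math

        def cohens_d(m1, sd1, n1, m2, sd2, n2):
            """Standardized mean difference with the pooled SD."""
            s_pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                                 / (n1 + n2 - 2))
            return (m1 - m2) / s_pooled

        def d_from_t(t, n1, n2):
            """Recover d from an independent-samples t statistic."""
            return t * math.sqrt(1.0 / n1 + 1.0 / n2)

        # Illustrative group summaries (e.g., post- vs pre-training scores):
        print(cohens_d(4.2, 1.1, 30, 3.5, 1.3, 28))
        print(d_from_t(2.5, 30, 28))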

  19. Measurement of total Zn and Zn isotope ratios by quadrupole ICP-MS for evaluation of Zn uptake in gills of brown trout (Salmo trutta) and rainbow trout (Oncorhynchus mykiss)

    USGS Publications Warehouse

    Wolf, R.E.; Todd, A.S.; Brinkman, S.; Lamothe, P.J.; Smith, K.S.; Ranville, J.F.

    2009-01-01

    This study evaluates the potential use of stable zinc isotopes in toxicity studies measuring zinc uptake by the gills of brown trout (Salmo trutta) and rainbow trout (Oncorhynchus mykiss). The use of stable isotopes in such studies has several advantages over the use of radioisotopes, including cost, ease of handling, elimination of permit requirements, and waste disposal. A pilot study using brown trout was performed to evaluate sample preparation methods and the ability of a quadrupole inductively coupled plasma mass spectrometer (ICP-MS) system to successfully measure changes in the 67Zn/66Zn ratios for planned exposure levels and duration. After completion of the pilot study, a full-scale zinc exposure study using rainbow trout was performed. The results of these studies indicate that there are several factors that affect the precision of the measured 67Zn/66Zn ratios in the sample digests, including variations in sample size, endogenous zinc levels, and zinc uptake rates by individual fish. However, since these factors were incorporated in the calculation of the total zinc accumulated by the gills during the exposures, the data obtained were adequate for their intended use in calculating zinc binding and evaluating the influences of differences in water quality parameters.
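
    The tracer arithmetic the authors describe reduces to two-source mixing on the measured ratio. A minimal sketch, assuming exposure water spiked with 67Zn and a natural 67Zn/66Zn ratio of about 0.146 (both stated assumptions; the study's actual spike composition is not given here):

        def spike_fraction(r_meas, r_nat=0.146, r_spike=10.0):
            """Fraction of gill 66Zn attributable to the spiked exposure
            water, by two-source mixing on the measured 67Zn/66Zn ratio.
            r_nat is the natural ratio; r_spike is an assumed spike ratio."""
            return (r_meas - r_nat) / (r_spike - r_nat)

        # Approximation: treat the 66Zn fraction as the total-Zn fraction.
        total_gill_zn_ug = 12.0      # illustrative total Zn in a gill sample
        new_zn_ug = total_gill_zn_ug * spike_fraction(r_meas=0.35)
        print(round(new_zn_ug, 2))   # Zn accumulated during the exposure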

  20. Low energy cyclotron for radiocarbon dating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, J.J.

    1984-12-01

    The measurement of naturally occurring radioisotopes whose half lives are less than a few hundred million years but more than a few years provides information about the temporal behavior of geologic and climatic processes, the temporal history of meteoritic bodies, as well as the production mechanisms of these radioisotopes. A new extremely sensitive technique for measuring these radioisotopes at tandem Van de Graaff and cyclotron facilities has been very successful, though the high cost and limited availability have been discouraging. We have built and tested a low energy cyclotron for radiocarbon dating similar in size to a conventional mass spectrometer. These tests clearly show that with the addition of a conventional ion source, the low energy cyclotron can perform the extremely high sensitivity ¹⁴C measurements that are now done at accelerator facilities. We found that no significant background is present when the cyclotron is tuned to accelerate ¹⁴C negative ions and that the transmission efficiency is adequate to perform radiocarbon dating on milligram samples of carbon. The internal ion source used did not produce sufficient current to detect ¹⁴C directly at modern concentrations. We show how a conventional carbon negative ion source, located outside the cyclotron magnet, would produce sufficient beam and provide for quick sampling, making radiocarbon dating of milligram samples with a modest laboratory instrument feasible.
