Sample records for methods sample summary

  1. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    PubMed

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
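
    A minimal Python sketch of the core idea, assuming hypothetical arrays of per-SNP summary statistics: take the mode of the per-instrument Wald ratios, located via a kernel-smoothed density. The published MBE additionally offers weighted variants and a bandwidth tuning parameter.

    import numpy as np
    from scipy.stats import gaussian_kde

    def mode_based_estimate(beta_exp, beta_out, grid_points=10_000):
        """Modal causal effect across instruments (simple unweighted variant)."""
        ratios = beta_out / beta_exp          # per-SNP Wald ratio estimates
        kde = gaussian_kde(ratios)            # kernel-smoothed density of ratios
        grid = np.linspace(ratios.min(), ratios.max(), grid_points)
        return grid[np.argmax(kde(grid))]     # location of the highest density

    # Toy data: 7 of 10 instruments are valid (true effect 0.5), 3 are invalid;
    # the mode tracks the largest group of similar estimates, near 0.5.
    rng = np.random.default_rng(1)
    beta_exp = rng.uniform(0.1, 0.3, 10)
    beta_out = np.r_[0.5 * beta_exp[:7], 1.5 * beta_exp[7:]] + rng.normal(0, 0.005, 10)
    print(mode_based_estimate(beta_exp, beta_out))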

  2. A Simple Method for Evaluating Within Sample Prognostic Balance Achieved by Published Comorbidity Summary Measures.

    PubMed

    Egleston, Brian L; Uzzo, Robert G; Beck, J Robert; Wong, Yu-Ning

    2015-08-01

    To demonstrate how a researcher can investigate the appropriateness of a published comorbidity summary measure for use with a given sample, using Surveillance, Epidemiology, and End Results (SEER) data linked to Medicare claims. We examined Kaplan-Meier estimated survival curves for four diseases within strata of a comorbidity summary measure, the Charlson Comorbidity Index. We identified individuals with early-stage kidney cancer diagnosed from 1995 to 2009. We recorded comorbidities present in the year before diagnosis. The use of many comorbidity summary measures is valid under appropriate conditions. One condition is that the relationships of the comorbidities with the outcome of interest in a researcher's own population are comparable to the relationships in a published algorithm's population. The original comorbidity weights from the Charlson Comorbidity Index seemed adequate for three of the diseases in our sample. We found evidence that the Charlson Comorbidity Index might underestimate the impact of one disease in our sample. Examination of survival curves within strata defined by a comorbidity summary measure can be a useful tool for determining whether a published method appropriately accounts for comorbidities. A comorbidity score is only as good as the variables included. © Health Research and Educational Trust.
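
    The diagnostic described above can be sketched with the lifelines package; the data frame and its column names are hypothetical.

    import pandas as pd
    import matplotlib.pyplot as plt
    from lifelines import KaplanMeierFitter

    df = pd.read_csv("cohort.csv")  # hypothetical columns: time, event, charlson_stratum

    ax = plt.subplot(111)
    kmf = KaplanMeierFitter()
    for stratum, grp in df.groupby("charlson_stratum"):
        kmf.fit(grp["time"], event_observed=grp["event"], label=f"Charlson {stratum}")
        kmf.plot_survival_function(ax=ax)
    # Curves that separate differently than the published weights imply suggest
    # the index mis-weights a comorbidity in this particular sample.
    plt.show()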

  3. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment

    PubMed Central

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.

    2014-01-01

    Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu. Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:24990607
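
    The Gaussian imputation at the heart of the method can be sketched as the conditional mean of a multivariate normal, with a ridge term standing in for the paper's adjustment for the reference panel's finite sample size; array names are hypothetical.

    import numpy as np

    def impute_z(z_obs, ld_oo, ld_to, ridge=0.1):
        """z_obs: z-scores at typed SNPs; ld_oo: reference-panel LD among typed
        SNPs; ld_to: LD between untyped (target) and typed SNPs."""
        k = ld_oo.shape[0]
        return ld_to @ np.linalg.solve(ld_oo + ridge * np.eye(k), z_obs)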

  4. Methods for Determining Particle Size Distributions from Nuclear Detonations.

    DTIC Science & Technology

    1987-03-01

    Only table-of-contents fragments survive extraction; recoverable entries include: ...Debris; IV. Summary of Sample Preparation Method; V. Set Parameters for PCS; VI. Analysis by Vendors; XV. Results From Brookhaven Analysis Using the Method of Cumulants; XVI. Results From Brookhaven Analysis of Sample R-3 Using the Histogram Method; XVII. Results From Brookhaven Analysis of Sample R-8 Using the Histogram Method; XVIII. TEM Particle...

  5. RCRA Facility investigation report for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Volume 5, Technical Memorandums 06-09A, 06-10A, and 06-12A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1, activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities of WAG 6, to date, is given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachments 1 and 2 provide analytical results for selected RFI groundwater samples and ORNL sampling event. Attachment 3 provides a summary of the contaminants detected in each well sampled for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits and Resource Conservation and Recovery Act (RCRA) quarterly compliance monitoring (1988--1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits (for the 1990 RCRA semi-annual compliance monitoring).

  6. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment.

    PubMed

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P; Patterson, Nick; Price, Alkes L

    2014-10-15

    Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1-5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case-control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu. Supplementary materials are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--COMPENDIUM OF METHOD SUMMARIES FOR COLLECTION AND ANALYSIS OF METALS AND VOCS IN BLOOD AND URINE (CDC-COMPENDIUM)

    EPA Science Inventory

    This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It provide...

  8. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE--COMPENDIUM OF METHOD SUMMARIES FOR COLLECTION AND ANALYSIS OF METALS, PESTICIDE METABOLITES, AND VOCS IN BLOOD AND URINE (CDC-COMPENDIUM)

    EPA Science Inventory

    This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE--COMPENDIUM OF METHOD SUMMARIES FOR COLLECTION AND ANALYSIS OF METALS, PESTICIDE METABOLITES, AND VOC IN BLOOD AND URINE (CDC-COMPENDIUM)

    EPA Science Inventory

    This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...

  10. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE--COMPENDIUM OF METHOD SUMMARIES FOR COLLECTION AND ANALYSIS OF METALS, PESTICIDE METABOLITES, AND VOC IN BLOOD AND URINE (CDC-COMPENDIUM)

    EPA Science Inventory

    This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). The pr...

  11. Across-cohort QC analyses of GWAS summary statistics from complex traits.

    PubMed

    Chen, Guo-Bo; Lee, Sang Hong; Robinson, Matthew R; Trzaskowski, Maciej; Zhu, Zhi-Xiang; Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Kutalik, Zoltán; Loos, Ruth J F; Frayling, Timothy M; Hirschhorn, Joel N; Yang, Jian; Wray, Naomi R; Visscher, Peter M

    2016-01-01

    Genome-wide association studies (GWASs) have been successful in discovering SNP trait associations for many quantitative traits and common diseases. Typically, the effect sizes of SNP alleles are very small and this requires large genome-wide association meta-analyses (GWAMAs) to maximize statistical power. A trend towards ever-larger GWAMA is likely to continue, yet dealing with summary statistics from hundreds of cohorts increases logistical and quality control problems, including unknown sample overlap, and these can lead to both false positive and false negative findings. In this study, we propose four metrics and visualization tools for GWAMA, using summary statistics from cohort-level GWASs. We propose methods to examine the concordance between demographic information and summary statistics, and methods to investigate sample overlap. (I) We use the population genetics Fst statistic to verify the genetic origin of each cohort and their geographic location, and demonstrate using GWAMA data from the GIANT Consortium that geographic locations of cohorts can be recovered and outlier cohorts can be detected. (II) We conduct principal component analysis based on reported allele frequencies, and are able to recover the ancestral information for each cohort. (III) We propose a new statistic that uses the reported allelic effect sizes and their standard errors to identify significant sample overlap or heterogeneity between pairs of cohorts. (IV) To quantify unknown sample overlap across all pairs of cohorts, we propose a method that uses randomly generated genetic predictors that does not require the sharing of individual-level genotype data and does not breach individual privacy.

  12. Across-cohort QC analyses of GWAS summary statistics from complex traits

    PubMed Central

    Chen, Guo-Bo; Lee, Sang Hong; Robinson, Matthew R; Trzaskowski, Maciej; Zhu, Zhi-Xiang; Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Kutalik, Zoltán; Loos, Ruth J F; Frayling, Timothy M; Hirschhorn, Joel N; Yang, Jian; Wray, Naomi R; Visscher, Peter M

    2017-01-01

    Genome-wide association studies (GWASs) have been successful in discovering SNP trait associations for many quantitative traits and common diseases. Typically, the effect sizes of SNP alleles are very small and this requires large genome-wide association meta-analyses (GWAMAs) to maximize statistical power. A trend towards ever-larger GWAMA is likely to continue, yet dealing with summary statistics from hundreds of cohorts increases logistical and quality control problems, including unknown sample overlap, and these can lead to both false positive and false negative findings. In this study, we propose four metrics and visualization tools for GWAMA, using summary statistics from cohort-level GWASs. We propose methods to examine the concordance between demographic information and summary statistics, and methods to investigate sample overlap. (I) We use the population genetics Fst statistic to verify the genetic origin of each cohort and their geographic location, and demonstrate using GWAMA data from the GIANT Consortium that geographic locations of cohorts can be recovered and outlier cohorts can be detected. (II) We conduct principal component analysis based on reported allele frequencies, and are able to recover the ancestral information for each cohort. (III) We propose a new statistic that uses the reported allelic effect sizes and their standard errors to identify significant sample overlap or heterogeneity between pairs of cohorts. (IV) To quantify unknown sample overlap across all pairs of cohorts, we propose a method that uses randomly generated genetic predictors that does not require the sharing of individual-level genotype data and does not breach individual privacy. PMID:27552965
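
    Metric (II), principal component analysis of the reported allele frequencies, can be sketched in a few lines; freq is a hypothetical cohorts-by-SNPs matrix.

    import numpy as np

    def cohort_pcs(freq, n_pcs=2):
        centred = freq - freq.mean(axis=0)              # centre each SNP column
        u, s, _ = np.linalg.svd(centred, full_matrices=False)
        return u[:, :n_pcs] * s[:n_pcs]                 # cohort coordinates on top PCs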

  13. A simple web-based tool to compare freshwater fish data collected using AFS standard methods

    USGS Publications Warehouse

    Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill

    2016-01-01

    The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.

  14. Laboratory Studies on Surface Sampling of Bacillus anthracis Contamination: Summary, Gaps, and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Hu, Rebecca

    2011-11-28

    This report summarizes previous laboratory studies to characterize the performance of methods for collecting, storing/transporting, processing, and analyzing samples from surfaces contaminated by Bacillus anthracis or related surrogates. The focus is on plate culture and count estimates of surface contamination for swab, wipe, and vacuum samples of porous and nonporous surfaces. Summaries of the previous studies and their results were assessed to identify gaps in information needed as inputs to calculate key parameters critical to risk management in biothreat incidents. One key parameter is the number of samples needed to make characterization or clearance decisions with specified statistical confidence. Other key parameters include the ability to calculate, following contamination incidents, the (1) estimates of Bacillus anthracis contamination, as well as the bias and uncertainties in the estimates, and (2) confidence in characterization and clearance decisions for contaminated or decontaminated buildings. Gaps in knowledge and understanding identified during the summary of the studies are discussed and recommendations are given for future studies.

  15. On using summary statistics from an external calibration sample to correct for covariate measurement error.

    PubMed

    Guo, Ying; Little, Roderick J; McConnell, Daniel S

    2012-01-01

    Covariate measurement error is common in epidemiologic studies. Current methods for correcting measurement error with information from external calibration samples are insufficient to provide valid adjusted inferences. We consider the problem of estimating the regression of an outcome Y on covariates X and Z, where Y and Z are observed, X is unobserved, but a variable W that measures X with error is observed. Information about measurement error is provided in an external calibration sample where data on X and W (but not Y and Z) are recorded. We describe a method that uses summary statistics from the calibration sample to create multiple imputations of the missing values of X in the regression sample, so that the regression coefficients of Y on X and Z and associated standard errors can be estimated using simple multiple imputation combining rules, yielding valid statistical inferences under the assumption of a multivariate normal distribution. The proposed method is shown by simulation to provide better inferences than existing methods, namely the naive method, classical calibration, and regression calibration, particularly for correction for bias and achieving nominal confidence levels. We also illustrate our method with an example using linear regression to examine the relation between serum reproductive hormone concentrations and bone mineral density loss in midlife women in the Michigan Bone Health and Metabolism Study. Existing methods fail to adjust appropriately for bias due to measurement error in the regression setting, particularly when measurement error is substantial. The proposed method corrects this deficiency.
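
    A rough sketch of the proposed workflow under a multivariate normal model: draw calibration parameters, impute X from W, refit the outcome regression per imputation, and pool with Rubin's rules. All names are hypothetical and the parameter draws are a crude approximation of the authors' exact algorithm.

    import numpy as np

    def mi_external_calibration(y, w, z, alpha, beta, sigma2,
                                se_alpha, se_beta, m=20, rng=None):
        """alpha, beta, sigma2 (with standard errors) summarize the calibration
        sample's regression of X on W; y, w, z come from the main sample."""
        if rng is None:
            rng = np.random.default_rng()
        n = len(y)
        ests, variances = [], []
        for _ in range(m):
            a = rng.normal(alpha, se_alpha)          # draw calibration parameters
            b = rng.normal(beta, se_beta)
            x_imp = a + b * w + rng.normal(0.0, np.sqrt(sigma2), n)
            design = np.column_stack([np.ones(n), x_imp, z])
            coef, *_ = np.linalg.lstsq(design, y, rcond=None)
            resid = y - design @ coef
            s2 = resid @ resid / (n - design.shape[1])
            cov = np.linalg.inv(design.T @ design) * s2
            ests.append(coef[1])                     # coefficient of X
            variances.append(cov[1, 1])
        qbar = float(np.mean(ests))                  # Rubin's rules
        ubar = float(np.mean(variances))
        between = float(np.var(ests, ddof=1))
        return qbar, np.sqrt(ubar + (1 + 1 / m) * between)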

  16. Near-shore and off-shore habitat use by endangered juvenile Lost River and Shortnose Suckers in Upper Klamath Lake, Oregon: 2006 data summary

    USGS Publications Warehouse

    Burdick, Summer M.; Wilkens, Alexander X.; VanderKooi, Scott P.

    2008-01-01

    We continued sampling juvenile suckers in 2006 as part of an effort to develop bioenergetics models for juvenile Lost River and shortnose suckers. This study required us to collect fish to determine growth rates and energy content of juvenile suckers. We followed the sampling protocols and methods described by Hendrixson et al. (2007b) to maintain continuity and facilitate comparisons with data collected in recent years, but sampled at a reduced level of effort compared to previous years (approximately one-third) due to limited funding. Here we present a summary of catch data collected in 2006. Bioenergetics models will be reported separately.

  17. Thermal conductivity of particulate materials: A summary of measurements taken at the Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Fountain, J. A.

    1973-01-01

    Thermal conductivity measurements of particulate materials in vacuum are presented in summary. Particulate basalt and soda lime glass beads of various size ranges were used as samples. The differentiated line heat source method was used for the measurements. A comprehensive table is shown giving all pertinent experimental conditions. Least-squares curve fits to the data are presented.

  18. Manufacturing Methods and Technology Project Summary Reports

    DTIC Science & Technology

    1984-12-01

    are used. The instrument chosen provides a convenient method of artificially aging a propellant sample while automatically analyzing for evolved oxides...and aging. Shortly after the engineering sample run, a change in REMBASS requirements eliminated the crystal high shock requirements. This resulted...material with minimum outgassing in a precision vacuum QXFF. Minimal outgassing reduces aging in the finished unit. A fixture was also developed to

  19. METHOD 415.3 - MEASUREMENT OF TOTAL ORGANIC CARBON, DISSOLVED ORGANIC CARBON AND SPECIFIC UV ABSORBANCE AT 254 NM IN SOURCE WATER AND DRINKING WATER

    EPA Science Inventory

    2.0 SUMMARY OF METHOD

    2.1 In both TOC and DOC determinations, organic carbon in the water sample is oxidized to form carbon dioxide (CO2), which is then measured by a detection system. There are two different approaches for the oxidation of organic carbon in water sample...

  20. MEASUREMENT OF PYRETHROID RESIDUES IN ENVIRONMENTAL AND FOOD SAMPLES BY ENHANCED SOLVENT EXTRACTION/SUPERCRITICAL FLUID EXTRACTION COUPLED WITH GAS CHROMATOGRAPHY-TANDEM MASS SPECTROMETRY

    EPA Science Inventory

    The abstract summarizes pyrethroid methods development research. It provides a summary of sample preparation and analytical techniques such as supercritical fluid extraction, enhanced solvent extraction, gas chromatography and tandem mass spectrometry.

  1. Cluster randomised crossover trials with binary data and unbalanced cluster sizes: application to studies of near-universal interventions in intensive care.

    PubMed

    Forbes, Andrew B; Akram, Muhammad; Pilcher, David; Cooper, Jamie; Bellomo, Rinaldo

    2015-02-01

    Cluster randomised crossover trials have been utilised in recent years in the health and social sciences. Methods for analysis have been proposed; however, for binary outcomes, these have received little assessment of their appropriateness. In addition, methods for determination of sample size are currently limited to balanced cluster sizes both between clusters and between periods within clusters. This article aims to extend this work to unbalanced situations and to evaluate the properties of a variety of methods for analysis of binary data, with a particular focus on the setting of potential trials of near-universal interventions in intensive care to reduce in-hospital mortality. We derive a formula for sample size estimation for unbalanced cluster sizes, and apply it to the intensive care setting to demonstrate the utility of the cluster crossover design. We conduct a numerical simulation of the design in the intensive care setting and for more general configurations, and we assess the performance of three cluster summary estimators and an individual-data estimator based on binomial-identity-link regression. For settings similar to the intensive care scenario involving large cluster sizes and small intra-cluster correlations, the sample size formulae developed and analysis methods investigated are found to be appropriate, with the unweighted cluster summary method performing well relative to the more optimal but more complex inverse-variance weighted method. More generally, we find that the unweighted and cluster-size-weighted summary methods perform well, with the relative efficiency of each largely determined systematically from the study design parameters. Performance of individual-data regression is adequate with small cluster sizes but becomes inefficient for large, unbalanced cluster sizes. When outcome prevalences are 6% or less and the within-cluster-within-period correlation is 0.05 or larger, all methods display sub-nominal confidence interval coverage, with the less prevalent the outcome the worse the coverage. As with all simulation studies, conclusions are limited to the configurations studied. We confined attention to detecting intervention effects on an absolute risk scale using marginal models and did not explore properties of binary random effects models. Cluster crossover designs with binary outcomes can be analysed using simple cluster summary methods, and sample size in unbalanced cluster size settings can be determined using relatively straightforward formulae. However, caution needs to be applied in situations with low prevalence outcomes and moderate to high intra-cluster correlations. © The Author(s) 2014.
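
    The unweighted cluster summary method can be sketched as a t-test on per-cluster differences in event proportions between treated and control periods; the arrays are hypothetical.

    import numpy as np
    from scipy import stats

    def unweighted_cluster_summary(p_treat, p_ctrl):
        """p_treat / p_ctrl: per-cluster event proportions in the treated and
        control periods (one entry per cluster, regardless of cluster size)."""
        d = p_treat - p_ctrl                    # within-cluster period difference
        est = d.mean()                          # unweighted average over clusters
        se = d.std(ddof=1) / np.sqrt(len(d))
        p_value = 2 * stats.t.sf(abs(est / se), df=len(d) - 1)
        return est, se, p_value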

  2. A Summary Score for the Framingham Heart Study Neuropsychological Battery

    PubMed Central

    Downer, Brian; Fardo, David W.; Schmitt, Frederick A.

    2015-01-01

    Objective To calculate three summary scores of the Framingham Heart Study neuropsychological battery and determine which score best differentiates between subjects classified as having normal cognition, test-based impaired learning and memory, test-based multidomain impairment, and dementia. Method The final sample included 2,503 participants. Three summary scores were assessed: (a) composite score that provided equal weight to each subtest, (b) composite score that provided equal weight to each cognitive domain assessed by the neuropsychological battery, and (c) abbreviated score comprised of subtests for learning and memory. Receiver operating characteristic analysis was used to determine which summary score best differentiated between the four cognitive states. Results The summary score that provided equal weight to each subtest best differentiated between the four cognitive states. Discussion A summary score that provides equal weight to each subtest is an efficient way to utilize all of the cognitive data collected by a neuropsychological battery. PMID:25804903
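
    The best-performing score (equal weight to each subtest) can be sketched directly; scores is a hypothetical participants-by-subtests table.

    import pandas as pd

    def equal_weight_composite(scores: pd.DataFrame) -> pd.Series:
        z = (scores - scores.mean()) / scores.std(ddof=1)  # standardize each subtest
        return z.mean(axis=1)                              # average across subtests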

  3. MAP3S precipitation chemistry network. Third periodic summary report, July 1978-December 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-05-01

    The MAP3S Precipitation Chemistry Network consists of eight collection sites in the northeastern United States. Precipitation event samples are collected by cooperating site operators, using specially developed sampling equipment. In this, the third periodic summary report, are listed field and concentration data for the period July 1, 1978 to December 31, 1979. Over three years' samples have been collected at most of the sites, which went into operation between September 1976 and October 1978. Samples are chemically analyzed at a central laboratory for 13 pollutant species. Weekly samples in addition to event samples were collected over a 1 1/2 year period at three sites. Analysis of one year's results indicates that there is little difference between the concentrations collected by the two methods in terms of seasonal precipitation-weighted means for all species except dissolved SO2. Event samples tend to average about 25% higher in SO2 than weekly samples.

  4. Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.

    PubMed

    Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira

    2016-01-01

    Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated sampling properties of visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, a global sampling model with sampling noise, and a limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.

  5. Using regression equations built from summary data in the psychological assessment of the individual case: extension to multiple regression.

    PubMed

    Crawford, John R; Garthwaite, Paul H; Denham, Annie K; Chelune, Gordon J

    2012-12-01

    Regression equations have many useful roles in psychological assessment. Moreover, there is a large reservoir of published data that could be used to build regression equations; these equations could then be employed to test a wide variety of hypotheses concerning the functioning of individual cases. This resource is currently underused because (a) not all psychologists are aware that regression equations can be built not only from raw data but also using only basic summary data for a sample, and (b) the computations involved are tedious and prone to error. In an attempt to overcome these barriers, Crawford and Garthwaite (2007) provided methods to build and apply simple linear regression models using summary statistics as data. In the present study, we extend this work to set out the steps required to build multiple regression models from sample summary statistics and the further steps required to compute the associated statistics for drawing inferences concerning an individual case. We also develop, describe, and make available a computer program that implements these methods. Although there are caveats associated with the use of the methods, these need to be balanced against pragmatic considerations and against the alternative of either entirely ignoring a pertinent data set or using it informally to provide a clinical "guesstimate." Upgraded versions of earlier programs for regression in the single case are also provided; these add the point and interval estimates of effect size developed in the present article.
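
    The construction rests on classical identities relating summary statistics to regression coefficients; the sketch below is generic and is not the authors' program.

    import numpy as np

    def regression_from_summary(R, r, sd_x, sd_y, mean_x, mean_y):
        """R: predictor intercorrelation matrix; r: predictor-criterion
        correlations; remaining arguments are sample means and SDs."""
        beta_std = np.linalg.solve(R, r)    # standardized coefficients
        b = beta_std * sd_y / sd_x          # unstandardized slopes
        intercept = mean_y - b @ mean_x
        return intercept, b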

  6. 40 CFR Table E-1 to Subpart E of... - Summary of Test Requirements for Reference and Class I Equivalent Methods for PM 2.5 and PM 10-2.5

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Only fragments of the table survive extraction. Recoverable content: Table E-1 to Subpart E of Part 53 (Ambient Air Monitoring Reference and Equivalent Methods; Procedures for Testing Physical (Design) and Performance...) summarizes test requirements for reference and Class I equivalent methods for PM 2.5 and PM 10-2.5, including filter temperature control accuracy, sampling and non-sampling (2 °C; 2 °C; not more than 5 °C...).

  7. 40 CFR Table E-1 to Subpart E of... - Summary of Test Requirements for Reference and Class I Equivalent Methods for PM2.5 and PM10-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Only fragments of the table survive extraction. Recoverable content: Table E-1 to Subpart E of Part 53 (Ambient Air Monitoring Reference and Equivalent Methods; Procedures for Testing Physical (Design) and Performance...) summarizes test requirements for reference and Class I equivalent methods for PM2.5 and PM10-2.5, including filter temperature control accuracy, sampling and non-sampling (2 °C; 2 °C; not more than 5 °C...).

  8. 40 CFR Table E-1 to Subpart E of... - Summary of Test Requirements for Reference and Class I Equivalent Methods for PM2.5 and PM10-2.5

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Only fragments of the table survive extraction. Recoverable content: Table E-1 to Subpart E of Part 53 (Ambient Air Monitoring Reference and Equivalent Methods; Procedures for Testing Physical (Design) and Performance...) summarizes test requirements for reference and Class I equivalent methods for PM2.5 and PM10-2.5, including filter temperature control accuracy, sampling and non-sampling (2 °C; 2 °C; not more than 5 °C...).

  9. 40 CFR Table E-1 to Subpart E of... - Summary of Test Requirements for Reference and Class I Equivalent Methods for PM 2.5 and PM 10-2.5

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Only fragments of the table survive extraction. Recoverable content: Table E-1 to Subpart E of Part 53 (Ambient Air Monitoring Reference and Equivalent Methods; Procedures for Testing Physical (Design) and Performance...) summarizes test requirements for reference and Class I equivalent methods for PM 2.5 and PM 10-2.5, including filter temperature control accuracy, sampling and non-sampling (2 °C; 2 °C; not more than 5 °C...).

  10. US Fish and Wildlife Service biomonitoring operations manual, Appendices A--K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianotto, D.F.; Rope, R.C.; Mondecar, M.

    1993-04-01

    Volume 2 contains Appendices and Summary Sheets for the following areas: A-Legislative Background and Key to Relevant Legislation, B-Biomonitoring Operations Workbook, C-Air Monitoring, D-Introduction to the Flora and Fauna for Biomonitoring, E-Decontamination Guidance Reference Field Methods, F-Documentation Guidance, Sample Handling, and Quality Assurance/Quality Control Standard Operating Procedures, G-Field Instrument Measurements Reference Field Methods, H-Ground Water Sampling Reference Field Methods, I-Sediment Sampling Reference Field Methods, J-Soil Sampling Reference Field Methods, K-Surface Water Reference Field Methods. Appendix B explains how to set up a strategy for entering information in the 'disk workbook'. Appendix B is enhanced by DE97006389, an on-line workbook that lets users make revisions to their own biomonitoring data.

  11. The characterization of organic contaminants during the development of the Space Station water reclamation and management system

    NASA Technical Reports Server (NTRS)

    Cole, H.; Habercom, M.; Crenshaw, M.; Johnson, S.; Manuel, S.; Martindale, W.; Whitman, G.; Traweek, M.

    1991-01-01

    Examples of the application of various methods for characterizing samples for alcohols, fatty acids, detergents, and volatile/semivolatile basic, neutral, and phenolic acid contaminants are presented. Data, applications, and interpretations are given for a variety of methods including sample preparation/cleanup procedures, ion chromatography, and gas chromatography with various detectors. Summaries of the major organic contaminants that contribute to the total organic carbon content are presented.

  12. Looking for trees in the forest: summary tree from posterior samples

    PubMed Central

    2013-01-01

    Background: Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. Results: We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the “truth”, i.e. to the tree used to generate the sequences. Conclusions: Our simulations indicate that no single method is “best”. Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree. PMID:24093883

  13. Looking for trees in the forest: summary tree from posterior samples.

    PubMed

    Heled, Joseph; Bouckaert, Remco R

    2013-10-04

    Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the "truth", i.e. to the tree used to generate the sequences. Our simulations indicate that no single method is "best". Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree.
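
    One member of the distance-based family of summary-tree methods can be sketched generically; tree_distance is a placeholder for any rooted-tree metric rather than a specific library call.

    def median_tree(trees, tree_distance):
        """Return the sampled tree minimizing total squared distance to the rest."""
        def cost(candidate):
            return sum(tree_distance(candidate, other) ** 2 for other in trees)
        return min(trees, key=cost)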

  14. Rapid Radiochemical Method for Isotopic Uranium in Building ...

    EPA Pesticide Factsheets

    Technical Fact Sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: uranium-234, uranium-235, and uranium-238 in concrete and brick samples. Method selected for: SAM lists this method for qualitative analysis of uranium-234, uranium-235, and uranium-238 in concrete or brick building materials. This is a summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  15. The State of the Union: Sexual Health Disparities in a National Sample of US College Students

    ERIC Educational Resources Information Center

    Buhi, Eric R.; Marhefka, Stephanie L.; Hoban, Mary T.

    2010-01-01

    Objective: To examine sexual health disparities between blacks and whites in a national sample of US college students. Participants and Method Summary: Analyses utilized secondary data from 44,165 nonmarried undergraduates (aged 18-24; M = 20.1) responding to the Spring 2007 American College Health Association-National College Health Assessment;…

  16. Constrained Maximum Likelihood Estimation for Model Calibration Using Summary-level Information from External Big Data Sources

    PubMed Central

    Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.

    2016-01-01

    Information from various public and private data sources of extremely large sample sizes is now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323

  17. How-To-Do-It: Measuring Vegetation Biomass and Production.

    ERIC Educational Resources Information Center

    Collins, Don; Weaver, T.

    1988-01-01

    Describes a lab exercise used to demonstrate the measurement of biomass in a three-layered forest. Discusses sampling, estimation methods, and the analysis of results. Presents an example of a summary sheet for this activity. (CW)

  18. Determining detection sensitivity and methods for invertebrate sampling

    EPA Science Inventory

    This meeting is intended to communicate Great Lakes invasive species early detection science to state management agencies to assist them in implementing monitoring. My presentation summarizes lessons learned concerning invertebrate monitoring in the course of ORD research on earl...

  19. Unconstrained Enhanced Sampling for Free Energy Calculations of Biomolecules: A Review

    PubMed Central

    Miao, Yinglong; McCammon, J. Andrew

    2016-01-01

    Free energy calculations are central to understanding the structure, dynamics and function of biomolecules. Yet insufficient sampling of biomolecular configurations is often regarded as one of the main sources of error. Many enhanced sampling techniques have been developed to address this issue. Notably, enhanced sampling methods based on biasing collective variables (CVs), including the widely used umbrella sampling, adaptive biasing force and metadynamics, have been discussed in a recent excellent review (Abrams and Bussi, Entropy, 2014). Here, we aim to review enhanced sampling methods that do not require predefined system-dependent CVs for biomolecular simulations and as such do not suffer from the hidden energy barrier problem as encountered in the CV-biasing methods. These methods include, but are not limited to, replica exchange/parallel tempering, self-guided molecular/Langevin dynamics, essential energy space random walk and accelerated molecular dynamics. While it is overwhelming to describe all details of each method, we provide a summary of the methods along with the applications and offer our perspectives. We conclude with challenges and prospects of the unconstrained enhanced sampling methods for accurate biomolecular free energy calculations. PMID:27453631

  20. Unconstrained Enhanced Sampling for Free Energy Calculations of Biomolecules: A Review.

    PubMed

    Miao, Yinglong; McCammon, J Andrew

    Free energy calculations are central to understanding the structure, dynamics and function of biomolecules. Yet insufficient sampling of biomolecular configurations is often regarded as one of the main sources of error. Many enhanced sampling techniques have been developed to address this issue. Notably, enhanced sampling methods based on biasing collective variables (CVs), including the widely used umbrella sampling, adaptive biasing force and metadynamics, have been discussed in a recent excellent review (Abrams and Bussi, Entropy, 2014). Here, we aim to review enhanced sampling methods that do not require predefined system-dependent CVs for biomolecular simulations and as such do not suffer from the hidden energy barrier problem as encountered in the CV-biasing methods. These methods include, but are not limited to, replica exchange/parallel tempering, self-guided molecular/Langevin dynamics, essential energy space random walk and accelerated molecular dynamics. While it is overwhelming to describe all details of each method, we provide a summary of the methods along with the applications and offer our perspectives. We conclude with challenges and prospects of the unconstrained enhanced sampling methods for accurate biomolecular free energy calculations.
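
    As one concrete example from the review, the Metropolis criterion used to swap neighbouring replicas in replica exchange/parallel tempering can be sketched as follows; the energies and inverse temperatures are hypothetical reduced units.

    import numpy as np

    def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng):
        """Accept an i<->j configuration swap with probability
        min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
        delta = (beta_i - beta_j) * (energy_i - energy_j)
        return rng.random() < min(1.0, np.exp(delta))

    rng = np.random.default_rng(0)
    print(attempt_swap(1.0, 0.8, -105.0, -98.0, rng))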

  1. Formal Functional Test Designs: Bridging the Gap Between Test Requirements and Test Specifications

    NASA Technical Reports Server (NTRS)

    Hops, Jonathan

    1993-01-01

    This presentation describes the testing life cycle, the purpose of the test design phase, and test design methods and gives an example application. Also included is a description of Test Representation Language (TRL), a summary of the language, and an example of an application of TRL. A sample test requirement and sample test design are included.

  2. HAPRAP: a haplotype-based iterative method for statistical fine mapping using GWAS summary statistics.

    PubMed

    Zheng, Jie; Rodriguez, Santiago; Laurin, Charles; Baird, Denis; Trela-Larsen, Lea; Erzurumluoglu, Mesut A; Zheng, Yi; White, Jon; Giambartolomei, Claudia; Zabaneh, Delilah; Morris, Richard; Kumari, Meena; Casas, Juan P; Hingorani, Aroon D; Evans, David M; Gaunt, Tom R; Day, Ian N M

    2017-01-01

    Fine mapping is a widely used approach for identifying the causal variant(s) at disease-associated loci. Standard methods (e.g. multiple regression) require individual level genotypes. Recent fine mapping methods using summary-level data require the pairwise correlation coefficients (r2) of the variants. However, haplotypes, rather than pairwise r2, are the true biological representation of linkage disequilibrium (LD) among multiple loci. In this article, we present an empirical iterative method, HAPlotype Regional Association analysis Program (HAPRAP), that enables fine mapping using summary statistics and haplotype information from an individual-level reference panel. Simulations with individual-level genotypes show that the results of HAPRAP and multiple regression are highly consistent. In simulation with summary-level data, we demonstrate that HAPRAP is less sensitive to poor LD estimates. In a parametric simulation using Genetic Investigation of ANthropometric Traits height data, HAPRAP performs well with a small training sample size (N < 2000) while other methods become suboptimal. Moreover, HAPRAP's performance is not affected substantially by single nucleotide polymorphisms (SNPs) with low minor allele frequencies. We applied the method to existing quantitative trait and binary outcome meta-analyses (human height, QTc interval and gallbladder disease); all previous reported association signals were replicated and two additional variants were independently associated with human height. Due to the growing availability of summary level data, the value of HAPRAP is likely to increase markedly for future analyses (e.g. functional prediction and identification of instruments for Mendelian randomization). The HAPRAP package and documentation are available at http://apps.biocompute.org.uk/haprap/. Contact: jie.zheng@bristol.ac.uk or tom.gaunt@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  3. MStern Blotting-High Throughput Polyvinylidene Fluoride (PVDF) Membrane-Based Proteomic Sample Preparation for 96-Well Plates.

    PubMed

    Berger, Sebastian T; Ahmed, Saima; Muntel, Jan; Cuevas Polo, Nerea; Bachur, Richard; Kentsis, Alex; Steen, Judith; Steen, Hanno

    2015-10-01

    We describe a 96-well plate compatible membrane-based proteomic sample processing method, which enables the complete processing of 96 samples (or multiples thereof) within a single workday. This method uses a large-pore hydrophobic PVDF membrane that efficiently adsorbs proteins, resulting in fast liquid transfer through the membrane and significantly reduced sample processing times. Low liquid transfer speeds have prevented the useful 96-well plate implementation of FASP as a widely used membrane-based proteomic sample processing method. We validated our approach on whole-cell lysate and urine and cerebrospinal fluid as clinically relevant body fluids. Without compromising peptide and protein identification, our method uses a vacuum manifold and circumvents the need for digest desalting, making our processing method compatible with standard liquid handling robots. In summary, our new method maintains the strengths of FASP and simultaneously overcomes one of the major limitations of FASP without compromising protein identification and quantification. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  4. MStern Blotting–High Throughput Polyvinylidene Fluoride (PVDF) Membrane-Based Proteomic Sample Preparation for 96-Well Plates*

    PubMed Central

    Berger, Sebastian T.; Ahmed, Saima; Muntel, Jan; Cuevas Polo, Nerea; Bachur, Richard; Kentsis, Alex; Steen, Judith; Steen, Hanno

    2015-01-01

    We describe a 96-well plate compatible membrane-based proteomic sample processing method, which enables the complete processing of 96 samples (or multiples thereof) within a single workday. This method uses a large-pore hydrophobic PVDF membrane that efficiently adsorbs proteins, resulting in fast liquid transfer through the membrane and significantly reduced sample processing times. Low liquid transfer speeds have prevented the useful 96-well plate implementation of FASP as a widely used membrane-based proteomic sample processing method. We validated our approach on whole-cell lysate and urine and cerebrospinal fluid as clinically relevant body fluids. Without compromising peptide and protein identification, our method uses a vacuum manifold and circumvents the need for digest desalting, making our processing method compatible with standard liquid handling robots. In summary, our new method maintains the strengths of FASP and simultaneously overcomes one of the major limitations of FASP without compromising protein identification and quantification. PMID:26223766

  5. The space of ultrametric phylogenetic trees.

    PubMed

    Gavryushkin, Alex; Drummond, Alexei J

    2016-08-21

    The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space and formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce and that the choice between metric spaces requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees from the sample might be different for different parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
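
    The first parameterisation suggests a simple sketch: represent a time-tree by its vector of inter-coalescent interval lengths and compare trees (here assumed to share a topology) by Euclidean distance. This illustrates the idea only, not the paper's full construction.

    import numpy as np

    def interval_vector(divergence_times):
        t = np.sort(np.asarray(divergence_times))   # times of divergence events
        return np.diff(np.r_[0.0, t])               # inter-coalescent intervals

    def interval_distance(times_a, times_b):
        return np.linalg.norm(interval_vector(times_a) - interval_vector(times_b))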

  6. Effective dimension reduction for sparse functional data

    PubMed Central

    YAO, F.; LEI, E.; WU, Y.

    2015-01-01

    Summary: We propose a method of effective dimension reduction for functional data, emphasizing the sparse design where one observes only a few noisy and irregular measurements for some or all of the subjects. The proposed method borrows strength across the entire sample and provides a way to characterize the effective dimension reduction space, via functional cumulative slicing. Our theoretical study reveals a bias-variance trade-off associated with the regularizing truncation and decaying structures of the predictor process and the effective dimension reduction space. A simulation study and an application illustrate the superior finite-sample performance of the method. PMID:26566293

  7. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I2 statistic.

    PubMed

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I2 statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it IGX2. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of IGX2), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We demonstrate our proposed approach for a two-sample summary data MR analysis to estimate the causal effect of low-density lipoprotein on heart disease risk. A high value of IGX2 close to 1 indicates that dilution does not materially affect the standard MR-Egger analyses for these data. Care must be taken to assess the NOME assumption via the IGX2 statistic before implementing standard MR-Egger regression in the two-sample summary data context. If IGX2 is sufficiently low (less than 90%), inferences from the method should be interpreted with caution and adjustment methods considered. © The Author 2016. Published by Oxford University Press on behalf of the International Epidemiological Association.
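
    A sketch of the diagnostic: the meta-analysis I2 statistic computed over the SNP-exposure estimates, which the paper adapts as IGX2; the arrays are hypothetical.

    import numpy as np

    def i_squared_gx(beta_gx, se_gx):
        w = 1.0 / se_gx**2
        beta_bar = np.sum(w * beta_gx) / np.sum(w)   # inverse-variance-weighted mean
        q = np.sum(w * (beta_gx - beta_bar) ** 2)    # Cochran's Q
        k = len(beta_gx)
        return max(0.0, (q - (k - 1)) / q)           # values below ~0.9 warrant caution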

  8. Food safety issues: a summary report of a panel session addressing pre- and post-harvest strategies to improve public health.

    PubMed

    Kirkhorn, Steven R

    2008-01-01

    The paper presents a summary of a panel presentation by animal scientists and public health veterinarians on food safety methods to protect the health of the public consuming beef and poultry. Pre-harvest methods discussed include vaccination to decrease shedding of Escherichia coli O157:H7, direct-fed microbials (DFMs), calving methods, and responsible use of antimicrobials. Post-harvest methods discussed include increased sampling and use of hazard analysis and critical control point (HACCP) methods, test-and-hold of meat products prior to release for public consumption, development of attainment categories, the use of pulsed field gel electrophoresis (PFGE) for genotyping and serotyping, and an 11-step surveillance program. The public health concerns and financial consequences of contamination with E. coli O157:H7 are discussed. A "carrot and stick" approach for both producers and processors to increase vaccination and product testing is recommended.

  9. How powerful are summary-based methods for identifying expression-trait associations under different genetic architectures?

    PubMed

    Veturi, Yogasudha; Ritchie, Marylyn D

    2018-01-01

    Transcriptome-wide association studies (TWAS) have recently been employed as an approach that draws upon the advantages of genome-wide association studies (GWAS) and gene expression studies to identify genes associated with complex traits. Unlike standard GWAS, summary-level data suffice for TWAS and offer improved statistical power. Two popular TWAS methods involve either (a) imputing the cis genetic component of gene expression from smaller studies (using multi-SNP prediction, or MP) into the much larger effective sample sizes afforded by GWAS (TWAS-MP) or (b) using summary-based Mendelian randomization (TWAS-SMR). Although these methods have been effective at detecting functional variants, it remains unclear how extensive variability in the genetic architecture of complex traits and diseases impacts TWAS results. Our goal was to investigate the different scenarios under which these methods yielded enough power to detect significant expression-trait associations. In this study, we conducted extensive simulations based on 6000 randomly chosen, unrelated Caucasian males from Geisinger's MyCode population to compare the power to detect cis expression-trait associations (within 500 kb of a gene) using the above-described approaches. To test TWAS across varying genetic backgrounds, we simulated gene expression and phenotype using different numbers of quantitative trait loci per gene and different cis-expression/trait heritabilities under genetic models that differentiate the effect of causality from that of pleiotropy. For each gene, on a training set ranging from 100 to 1000 individuals, we either (a) estimated regression coefficients with gene expression as the response using five different methods: LASSO, elastic net, Bayesian LASSO, Bayesian spike-and-slab, and Bayesian ridge regression, or (b) performed eQTL analysis. We then sampled with replacement 50,000, 150,000, and 300,000 individuals, respectively, from the testing set of the remaining 5000 individuals and conducted GWAS on each set. Subsequently, we integrated the GWAS summary statistics derived from the testing set with the weights (or eQTLs) derived from the training set to identify expression-trait associations using (a) TWAS-MP, (b) TWAS-SMR, (c) eQTL-based GWAS, or (d) standalone GWAS. Finally, we examined the power to detect functionally relevant genes using the different approaches under the considered simulation scenarios. In general, we observed great similarities among TWAS-MP methods, although the Bayesian methods resulted in improved power in comparison to LASSO and elastic net as the trait architecture grew more complex while training sample sizes and expression heritability remained small. Finally, we observed high power under causality but very low to moderate power under pleiotropy.
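
    As context for the two-stage TWAS-MP design described above, the sketch below trains an elastic net expression model in a small panel and tests the imputed expression against a trait. All data and names are hypothetical, and for brevity stage 2 uses individual-level genotypes, whereas TWAS-MP proper combines the trained weights with GWAS summary statistics and LD:

      import numpy as np
      from sklearn.linear_model import ElasticNetCV
      from scipy import stats

      rng = np.random.default_rng(0)
      n_train, n_test, n_snps = 500, 5000, 50

      # simulate cis-genotypes, expression (2 causal eQTLs), and a trait
      X_tr = rng.binomial(2, 0.3, (n_train, n_snps)).astype(float)
      X_te = rng.binomial(2, 0.3, (n_test, n_snps)).astype(float)
      w_true = np.zeros(n_snps)
      w_true[[3, 17]] = 0.5
      expr_tr = X_tr @ w_true + rng.normal(0, 1, n_train)
      trait_te = (X_te @ w_true) * 0.2 + rng.normal(0, 1, n_test)  # mediated effect

      # stage 1: learn expression weights in the training (eQTL) panel
      enet = ElasticNetCV(cv=5).fit(X_tr, expr_tr)

      # stage 2: impute expression into the GWAS sample and test the association
      expr_hat = X_te @ enet.coef_
      r, p = stats.pearsonr(expr_hat, trait_te)
      print(f"expression-trait association: r={r:.3f}, p={p:.2e}")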

  10. METHOD 200.5 - DETERMINATION OF TRACE ELEMENTS IN DRINKING WATER BY AXIALLY VIEWED INDUCTIVELY COUPLED PLASMA-ATOMIC EMISSION SPECTROMETRY

    EPA Science Inventory

    2.0 SUMMARY OF METHOD
    2.1. A 50 mL aliquot of a well-mixed, non-filtered, acid-preserved aqueous sample is accurately transferred to a clean 50-mL plastic disposable digestion tube containing a mixture of nitric and hydrochloric acids. The aliquot is heated to 95 degrees C (+ o...

  11. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean) is provided.

  12. Multi-trait analysis of genome-wide association summary statistics using MTAG.

    PubMed

    Turley, Patrick; Walters, Raymond K; Maghzian, Omeed; Okbay, Aysu; Lee, James J; Fontana, Mark Alan; Nguyen-Viet, Tuan Anh; Wedow, Robbee; Zacher, Meghan; Furlotte, Nicholas A; Magnusson, Patrik; Oskarsson, Sven; Johannesson, Magnus; Visscher, Peter M; Laibson, David; Cesarini, David; Neale, Benjamin M; Benjamin, Daniel J

    2018-02-01

    We introduce multi-trait analysis of GWAS (MTAG), a method for joint analysis of summary statistics from genome-wide association studies (GWAS) of different traits, possibly from overlapping samples. We apply MTAG to summary statistics for depressive symptoms (Neff = 354,862), neuroticism (N = 168,105), and subjective well-being (N = 388,538). As compared to the 32, 9, and 13 genome-wide significant loci identified in the single-trait GWAS (most of which are themselves novel), MTAG increases the number of associated loci to 64, 37, and 49, respectively. Moreover, association statistics from MTAG yield more informative bioinformatics analyses and increase the variance explained by polygenic scores by approximately 25%, matching theoretical expectations.

  13. Evaluation and application of summary statistic imputation to discover new height-associated loci.

    PubMed

    Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán

    2018-05-01

    As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and practical utility has not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01 and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
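
    At its core, summary statistics imputation predicts the z-score of an untyped variant as a linear combination of z-scores at typed variants, with weights given by the conditional multivariate normal distribution under reference-panel LD. A minimal numpy sketch with illustrative names (the variable-sample-size adjustment described above is omitted):

      import numpy as np

      def impute_z(z_typed, R_tt, r_ut, lam=0.1):
          """Impute an untyped variant's z-score from typed neighbours.
          z_typed : z-scores at typed SNPs
          R_tt    : LD (correlation) matrix among typed SNPs
          r_ut    : LD between the untyped SNP and each typed SNP
          lam     : ridge term guarding against noisy reference-panel LD
          """
          R_reg = R_tt + lam * np.eye(len(z_typed))
          w = np.linalg.solve(R_reg, r_ut)   # conditional-MVN weights
          z_hat = w @ z_typed
          r2_info = r_ut @ w                 # imputation quality (INFO-like)
          return z_hat, r2_info

      # toy example: 3 typed SNPs in moderate LD with the target variant
      R_tt = np.array([[1.0, 0.4, 0.2], [0.4, 1.0, 0.3], [0.2, 0.3, 1.0]])
      r_ut = np.array([0.6, 0.5, 0.2])
      z_hat, info = impute_z(np.array([4.1, 3.5, 1.2]), R_tt, r_ut)
      print(f"imputed z = {z_hat:.2f}, quality = {info:.2f}")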

  14. Evaluation and application of summary statistic imputation to discover new height-associated loci

    PubMed Central

    2018-01-01

    As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and practical utility has not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01 and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression. PMID:29782485

  15. Appendix 3 Summary of Field Sampling and Analytical Methods with Bibliography

    EPA Science Inventory

    Conductivity and Specific conductance are measures of the ability of water to conduct an electric current, and are a general measure of stream-water quality. Conductivity is affected by temperature, with warmer water having a greater conductivity. Specific conductance is the te...

  16. Partitioning heritability by functional annotation using genome-wide association summary statistics.

    PubMed

    Finucane, Hilary K; Bulik-Sullivan, Brendan; Gusev, Alexander; Trynka, Gosia; Reshef, Yakir; Loh, Po-Ru; Anttila, Verneri; Xu, Han; Zang, Chongzhi; Farh, Kyle; Ripke, Stephan; Day, Felix R; Purcell, Shaun; Stahl, Eli; Lindstrom, Sara; Perry, John R B; Okada, Yukinori; Raychaudhuri, Soumya; Daly, Mark J; Patterson, Nick; Neale, Benjamin M; Price, Alkes L

    2015-11-01

    Recent work has demonstrated that some functional categories of the genome contribute disproportionately to the heritability of complex diseases. Here we analyze a broad set of functional elements, including cell type-specific elements, to estimate their polygenic contributions to heritability in genome-wide association studies (GWAS) of 17 complex diseases and traits with an average sample size of 73,599. To enable this analysis, we introduce a new method, stratified LD score regression, for partitioning heritability from GWAS summary statistics while accounting for linked markers. This new method is computationally tractable at very large sample sizes and leverages genome-wide information. Our findings include a large enrichment of heritability in conserved regions across many traits, a very large immunological disease-specific enrichment of heritability in FANTOM5 enhancers and many cell type-specific enrichments, including significant enrichment of central nervous system cell types in the heritability of body mass index, age at menarche, educational attainment and smoking behavior.
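
    The univariate version of LD score regression, which the stratified method generalizes by splitting each SNP's LD score across functional categories, regresses GWAS chi-square statistics on LD scores; the slope is proportional to SNP heritability and the intercept absorbs confounding. A toy sketch under simplifying assumptions (no regression weights, no block jackknife; names illustrative):

      import numpy as np

      def ldsc_h2(chi2, ld_scores, N, M):
          """Univariate LD score regression: E[chi2_j] = N*h2*l_j/M + N*a + 1.
          Returns the slope-implied SNP heritability and the intercept.
          (The stratified version replaces l_j by per-category LD scores.)"""
          X = np.column_stack([ld_scores, np.ones_like(ld_scores)])
          slope, intercept = np.linalg.lstsq(X, chi2, rcond=None)[0]
          return slope * M / N, intercept

      # toy data consistent with h2 = 0.3, M SNPs, N samples
      rng = np.random.default_rng(2)
      M, N = 20000, 50000
      l = rng.gamma(5.0, 20.0, M)                       # LD scores
      chi2 = 1.0 + N * 0.3 * l / M + rng.normal(0, 1.0, M)
      h2, c = ldsc_h2(chi2, l, N, M)
      print(f"estimated h2 = {h2:.3f}, intercept = {c:.3f}")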

  17. Installation Restoration Program Stage 3. McClellan Air Force Base, California. Remedial Investigation/Feasibility Study Groundwater Sampling and Analysis Program Data Summary

    DTIC Science & Technology

    1988-12-01

    and do not refer to monitoring zones at McClellan AFB. b Priority pollutant metals analyses also included U.S. EPA Methods 206.2, 245.1 and 270.2. EW a...sampling protocol, and the laboratory is audited routinely. Therefore, no corrective action other than good training and supervision is necessary. The same

  18. A Review and Comparison of Methods for Recreating Individual Patient Data from Published Kaplan-Meier Survival Curves for Economic Evaluations: A Simulation Study

    PubMed Central

    Wan, Xiaomin; Peng, Liubao; Li, Yuanjian

    2015-01-01

    Background: In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, 1) the least squares method and 2) the graphical method; and two recently proposed methods, by 3) Hoyle and Henley and 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. Methods: A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. Results: All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more biases were identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty compared with the Hoyle and Henley method. Conclusions: The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for a fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method. PMID:25803659
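
    Of the four approaches, the traditional least squares method is the simplest to illustrate: survival probabilities are read off the published curve and a parametric model is fitted by linear least squares. A hedged sketch for a Weibull model, with hypothetical digitized points (the Guyot et al. method instead reconstructs pseudo-IPD from the curve plus the numbers at risk):

      import numpy as np
      from scipy.special import gamma

      # (time, survival probability) points digitized from a published KM curve
      t = np.array([6.0, 12.0, 18.0, 24.0, 36.0])
      S = np.array([0.81, 0.63, 0.52, 0.41, 0.27])

      # linearize the Weibull: log(-log S(t)) = k*log(t) - k*log(lam)
      y = np.log(-np.log(S))
      X = np.column_stack([np.log(t), np.ones_like(t)])
      k, b = np.linalg.lstsq(X, y, rcond=None)[0]
      lam = np.exp(-b / k)

      mean_survival = lam * gamma(1.0 + 1.0 / k)   # Weibull mean, used in the CEA
      print(f"shape = {k:.2f}, scale = {lam:.1f}, "
            f"mean survival = {mean_survival:.1f} months")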

  19. A UNIFIED FRAMEWORK FOR VARIANCE COMPONENT ESTIMATION WITH SUMMARY STATISTICS IN GENOME-WIDE ASSOCIATION STUDIES.

    PubMed

    Zhou, Xiang

    2017-12-01

    Linear mixed models (LMMs) are among the most commonly used tools for genetic association studies. However, the standard method for estimating variance components in LMMs, the restricted maximum likelihood estimation method (REML), suffers from several important drawbacks: REML requires individual-level genotypes and phenotypes from all samples in the study, is computationally slow, and produces downward-biased estimates in case-control studies. To remedy these drawbacks, we present an alternative framework for variance component estimation, which we refer to as MQS. MQS is based on the method of moments (MoM) and the minimal norm quadratic unbiased estimation (MINQUE) criterion, and brings two seemingly unrelated methods, the renowned Haseman-Elston (HE) regression and the recent LD score regression (LDSC), into the same unified statistical framework. With this new framework, we provide an alternative but mathematically equivalent form of HE that allows for the use of summary statistics. We provide an exact estimation form of LDSC to yield unbiased and statistically more efficient estimates. A key feature of our method is its ability to pair marginal z-scores computed using all samples with SNP correlation information computed using a small random subset of individuals (or individuals from a proper reference panel), while producing estimates that can be almost as accurate as if both quantities were computed using the full data. As a result, our method produces unbiased and statistically efficient estimates, and makes use of summary statistics, while remaining computationally efficient for large data sets. Using simulations and applications to 37 phenotypes from 8 real data sets, we illustrate the benefits of our method for estimating and partitioning SNP heritability in population studies as well as for heritability estimation in family studies. Our method is implemented in the GEMMA software package, freely available at www.xzlab.org/software.html.
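
    The Haseman-Elston regression that MQS builds on can be written as an ordinary least-squares regression of phenotype cross-products on genetic relatedness. A small simulation sketch of the individual-level form (illustrative only; the paper derives a summary-statistic form of the same estimator):

      import numpy as np

      def he_regression(y, K):
          """Haseman-Elston (cross-product) regression: regress y_i*y_j on
          relatedness K_ij over pairs i<j; with standardized y the slope
          estimates SNP heritability."""
          iu = np.triu_indices(len(y), k=1)            # off-diagonal pairs
          yy = np.outer(y, y)[iu]
          kk = K[iu]
          return np.sum(kk * yy) / np.sum(kk**2)       # slope through origin

      # toy data: GRM from simulated genotypes, phenotype with h2 = 0.5
      rng = np.random.default_rng(3)
      n, m = 800, 2000
      G = rng.binomial(2, 0.4, (n, m)).astype(float)
      Z = (G - G.mean(0)) / G.std(0)
      K = Z @ Z.T / m
      y = Z @ rng.normal(0, np.sqrt(0.5 / m), m) + rng.normal(0, np.sqrt(0.5), n)
      y = (y - y.mean()) / y.std()
      print(f"HE estimate of h2: {he_regression(y, K):.2f}")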

  20. A review and comparison of methods for recreating individual patient data from published Kaplan-Meier survival curves for economic evaluations: a simulation study.

    PubMed

    Wan, Xiaomin; Peng, Liubao; Li, Yuanjian

    2015-01-01

    In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, 1) the least squares method and 2) the graphical method; and two recently proposed methods, by 3) Hoyle and Henley and 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more biases were identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for a fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method.

  1. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an example of a graphical summary of NTE emission results: ER14JN05.002 ...

  2. Rapid Method for Sodium Hydroxide Fusion of Concrete and ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete and brick samples Method Selected for: SAM lists this method for qualitative analysis of americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.

  3. Evaluation of hot mix asphalt moisture sensitivity using the Nottingham Asphalt test equipment : tech transfer summary, March 2010.

    DOT National Transportation Integrated Search

    2010-03-01

    Objectives: Evaluate the usefulness of the dynamic modulus and flow number tests in moisture-susceptibility evaluation, Compare the results to those achieved using the AASHTO T 283 test, Study the effect of different methods of sample conditioning an...

  4. 77 FR 72361 - Agency Information Collection Activities: Proposed Collection; Comment Request; Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... calculations that justify the proposed sample size, the expected response rate, methods for assessing potential... Qualitative Feedback on Agency Service Delivery SUMMARY: As part of a Federal Government-wide effort to... Information Collection Request (Generic ICR): ``Generic Clearance for the Collection of Qualitative Feedback...

  5. REST: a computer system for estimating logging residue by using the line-intersect method

    Treesearch

    A. Jeff Martin

    1975-01-01

    A computer program was designed to accept logging-residue measurements obtained by line-intersect sampling and transform them into summaries useful for the land manager. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckley, D.A.; Stites, J. Jr.

    The objective was to characterize several lots of materials used for carbon/carbon and carbon/phenolic product manufacture. Volume one is organized into testing categories based on raw material or product form. Each category contains a discussion of the sampling plan, comments and observations on each test method utilized, and a summary of the results obtained for each category.

  7. Correlation of the summary method with learning styles.

    PubMed

    Sarikcioglu, Levent; Senol, Yesim; Yildirim, Fatos B; Hizay, Arzu

    2011-09-01

    The summary is the last part of the lesson but one of the most important. We aimed to study the relationship between the preference for the summary method (video demonstration, question-answer, or brief review of slides) and learning styles. A total of 131 students were included in the present study. An inventory was prepared to understand the students' learning styles, and a satisfaction questionnaire was provided to determine the summary method selection. The questionnaire and inventory were collected and analyzed. A comparison of the data revealed that the summary method with video demonstration received the highest score among all the methods tested. Additionally, preference for the video-demonstration summary did not differ significantly across learning styles. We suggest that such a summary method should be incorporated into neuroanatomy lessons. Since anatomy relies on a large amount of visual material, we think that it is ideally suited for this summary method.

  8. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Summary Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
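
    For context, the simplest design-based RDS estimator (RDS-II, after Volz and Heckathorn) that the model-assisted approach above improves upon weights each respondent by the inverse of their reported network degree. A minimal sketch with toy data (not the authors' estimator):

      import numpy as np

      def rds_ii(y, degrees):
          """Volz-Heckathorn (RDS-II) estimator: an inverse-degree-weighted
          mean, treating inclusion probability as proportional to degree."""
          w = 1.0 / np.asarray(degrees, dtype=float)
          return np.sum(w * y) / np.sum(w)

      # toy sample: HIV status (0/1) and self-reported degree for 8 respondents
      y = np.array([1, 0, 1, 1, 0, 0, 1, 0])
      d = np.array([10, 3, 25, 8, 4, 6, 30, 5])
      print(f"RDS-II prevalence estimate: {rds_ii(y, d):.3f}")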

  9. Median nitrate concentrations in groundwater in the New Jersey Highlands Region estimated using regression models and land-surface characteristics

    USGS Publications Warehouse

    Baker, Ronald J.; Chepiga, Mary M.; Cauller, Stephen J.

    2015-01-01

    The Kaplan-Meier method of estimating summary statistics from left-censored data was applied in order to include nondetects in median nitrate-concentration calculations. Median concentrations also were determined using three alternative methods of handling nondetects. Treatment of the 23 percent of samples that were nondetects had little effect on estimated median nitrate concentrations because method detection limits were mostly less than median values.
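
    A common way to apply Kaplan-Meier to nondetects is to flip the left-censored concentrations into right-censored "survival" times and fit the usual estimator. A hedged sketch with hypothetical data, using the lifelines package; this illustrates the general trick, not the authors' exact implementation:

      import numpy as np
      from lifelines import KaplanMeierFitter

      # nitrate concentrations (mg/L); detected=False marks nondetects at the
      # reporting limit (left-censored observations)
      conc = np.array([0.2, 0.5, 1.1, 2.4, 3.0, 0.4, 5.2, 0.2])
      detected = np.array([False, True, True, True, True, False, True, False])

      # flip: left-censoring in conc becomes right-censoring in (M - conc)
      M = conc.max() + 1.0
      kmf = KaplanMeierFitter().fit(M - conc, event_observed=detected)
      print(f"K-M median nitrate: {M - kmf.median_survival_time_:.2f} mg/L")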

  10. Summary and Synthesis: How to Present a Research Proposal.

    PubMed

    Setia, Maninder Singh; Panda, Saumya

    2017-01-01

    This concluding module attempts to synthesize the key learning points discussed during the course of the previous ten sets of modules on methodology and biostatistics. The objective of this module is to discuss how to present a model research proposal, based on the material discussed in the preceding modules. The lynchpin of a research proposal is the protocol, and the key component of a protocol is the study design. However, one must not neglect the other areas, be it the project summary through which one catches the eye of the reviewer of the proposal, or the background and the literature review, or the aims and objectives of the study. Two critical areas in the "methods" section that cannot be emphasized enough are the sampling strategy and a formal estimation of sample size. Without a legitimate sample size, none of the conclusions based on the statistical analysis would be valid. Finally, the ethical parameters of the study should be well understood by the researchers, and that should be reflected in the proposal.

  11. Beyond Description in Interpersonal Construct Validation: Methodological Advances in the Circumplex Structural Summary Approach.

    PubMed

    Zimmermann, Johannes; Wright, Aidan G C

    2017-01-01

    The interpersonal circumplex is a well-established structural model that organizes interpersonal functioning within the two-dimensional space marked by dominance and affiliation. The structural summary method (SSM) was developed to evaluate the interpersonal nature of other constructs and measures outside the interpersonal circumplex. To date, this method has been primarily descriptive, providing no way to draw inferences when comparing SSM parameters across constructs or groups. We describe a newly developed resampling-based method for deriving confidence intervals, which allows for SSM parameter comparisons. In a series of five studies, we evaluated the accuracy of the approach across a wide range of possible sample sizes and parameter values, and demonstrated its utility for posing theoretical questions on the interpersonal nature of relevant constructs (e.g., personality disorders) using real-world data. As a result, the SSM is strengthened for its intended purpose of construct evaluation and theory building. © The Author(s) 2015.
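
    The SSM projects a construct's correlations with the eight circumplex octants onto a cosine curve, yielding elevation, amplitude and angular displacement; confidence intervals then come from resampling subjects. A sketch of both steps (illustrative names and toy data; the published method includes further parameters such as a model-fit index):

      import numpy as np

      def ssm_params(r):
          """Structural summary parameters from correlations with 8 octants."""
          theta = np.deg2rad(np.arange(8) * 45.0)
          e = r.mean()                                  # elevation
          x = (2.0 / 8) * np.sum(r * np.cos(theta))
          y = (2.0 / 8) * np.sum(r * np.sin(theta))
          a = np.hypot(x, y)                            # amplitude
          delta = np.rad2deg(np.arctan2(y, x)) % 360    # angular displacement
          return e, a, delta

      def ssm_boot_ci(octants, construct, n_boot=500, seed=0):
          """Percentile bootstrap CI for the amplitude, resampling subjects.
          octants: (n_subjects, 8) octant scores; construct: (n_subjects,)."""
          rng = np.random.default_rng(seed)
          n = len(construct)
          amps = []
          for _ in range(n_boot):
              idx = rng.integers(0, n, n)
              r = np.array([np.corrcoef(octants[idx, j], construct[idx])[0, 1]
                            for j in range(8)])
              amps.append(ssm_params(r)[1])
          return np.percentile(amps, [2.5, 97.5])

      rng = np.random.default_rng(4)
      oct_scores = rng.normal(size=(200, 8))
      construct = 0.5 * oct_scores[:, 0] + rng.normal(size=200)  # peaks at 0 deg
      r = np.array([np.corrcoef(oct_scores[:, j], construct)[0, 1]
                    for j in range(8)])
      print(ssm_params(r), ssm_boot_ci(oct_scores, construct))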

  12. 40 CFR Appendix I to Subpart T of... - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Emission Results I Appendix I to Subpart T of Part 86 Protection of Environment ENVIRONMENTAL PROTECTION... Appendix I to Subpart T of Part 86—Sample Graphical Summary of NTE Emission Results The following figure shows an example of a graphical summary of NTE emission results: ER14JN05.002 ...

  13. Nonparametric and Semiparametric Regression Estimation for Length-biased Survival Data

    PubMed Central

    Shen, Yu; Ning, Jing; Qin, Jing

    2016-01-01

    For the past several decades, nonparametric and semiparametric modeling for conventional right-censored survival data has been investigated intensively under a noninformative censoring mechanism. However, these methods may not be applicable for analyzing right-censored survival data that arise from prevalent cohorts when the failure times are subject to length-biased sampling. This review article is intended to provide a summary of some newly developed methods as well as established methods for analyzing length-biased data. PMID:27086362

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korte, Andrew R

    This thesis presents efforts to improve the methodology of matrix-assisted laser desorption ionization-mass spectrometry imaging (MALDI-MSI) as a method for analysis of metabolites from plant tissue samples. The first chapter consists of a general introduction to the technique of MALDI-MSI, and the sixth and final chapter provides a brief summary and an outlook on future work.

  15. 40 CFR Appendix I to Subpart T of... - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Emission Results I Appendix I to Subpart T of Part 86 Protection of Environment ENVIRONMENTAL PROTECTION..., App. I Appendix I to Subpart T of Part 86—Sample Graphical Summary of NTE Emission Results The following figure shows an example of a graphical summary of NTE emission results: ER14JN05.002 ...

  16. Development of NASA's Sample Cartridge Assembly: Summary of GEDS Design, Development Testing, and Thermal Analyses

    NASA Technical Reports Server (NTRS)

    O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn

    2017-01-01

    Outline: Background of the ISS (International Space Station) Materials Science Research Rack; NASA SCA (Sample Cartridge Assembly) design; GEDS (Gravitational Effects in Distortion in Sintering) experiment ampoule design; development testing summary; thermal modeling and analysis. Summary: GEDS design development was challenging (the GEDS ampoule design was developed through MUGS testing; short-duration transient sample processing; sample temperatures could not be measured). MUGS development testing was used to gather data (actual LGF (Low Gradient Furnace)-like furnace response; provided samples for sintering evaluation). A transient thermal model was integral to a successful GEDS experiment (development testing provided the furnace response; PI evaluation of sintering anchored the model's evaluation of processing durations; the transient thermal model was used to determine flight SCA sample processing profiles).

  17. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
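
    The simplest calculation described above, inflating an individually randomized sample size by the design effect DE = 1 + (m - 1)*ICC for clusters of average size m, is a two-liner. A sketch with hypothetical inputs:

      import math

      def crt_sample_size(n_individual, m, icc):
          """Inflate an individually randomized sample size by the design
          effect and convert to clusters per arm (two-arm parallel design)."""
          de = 1.0 + (m - 1) * icc
          n_total = math.ceil(n_individual * de)
          clusters_per_arm = math.ceil(n_total / (2 * m))
          return n_total, clusters_per_arm

      # e.g. 260 participants needed under individual randomization,
      # clusters of 20, ICC = 0.05 -> DE = 1.95
      print(crt_sample_size(260, 20, 0.05))   # (507, 13)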

  18. Cortisol Awakening Response in Elite Military Men: Summary Parameters, Stability Measurement, and Effect of Compliance.

    PubMed

    Taylor, Marcus K; Hernández, Lisa M; Fuller, Shiloah A; Sargent, Paul; Padilla, Genieleah A; Harris, Erica

    2016-11-01

    The cortisol awakening response (CAR) holds promise as a clinically important marker of health status. However, CAR research is routinely challenged by its innate complexity, sensitivity to confounds, and methodological inconsistencies. In this unprecedented characterization of CAR in elite military men (N = 58), we established summary parameters, evaluated sampling stability across two consecutive days, and explored the effect of subject compliance. Average salivary cortisol concentrations increased nearly 60% within 30 minutes of waking, followed by a swift recovery to waking values at 60 minutes. Approximately one in six were classified as negative responders (i.e., <0% change from waking to 30-minute postawakening). Three summary parameters of magnitude, as well as three summary parameters of pattern, were computed. Consistent with our hypothesis, summary parameters of magnitude displayed superior stability compared with summary parameters of pattern in the total sample. As expected, compliance with target sampling times was relatively good; average deviations of self-reported morning sampling times in relation to actigraph-derived wake times across both days were within ±5 minutes, and nearly two-thirds of the sample was classified as CAR compliant across both days. Although compliance had equivocal effects on some measures of magnitude, it substantially improved the stability of summary parameters of pattern. The first of its kind, this study established the foundation for a program of CAR research in a profoundly resilient yet chronically stressed population. Building from this, our forthcoming research will evaluate demographic, biobehavioral, and clinical determinants of CAR in this unique population. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
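
    Typical CAR magnitude parameters can be computed with the trapezoid rule following the widely used Pruessner et al. formulas. A brief sketch with hypothetical sampling times and values (not the study's analysis code):

      import numpy as np

      def car_summary(cortisol, times):
          """Common CAR magnitude parameters:
          AUCg  - total area under the curve (trapezoid rule)
          AUCi  - area relative to the waking sample
          MnInc - mean increase over the waking value"""
          cortisol = np.asarray(cortisol, float)
          times = np.asarray(times, float)
          auc_g = np.trapz(cortisol, times)
          auc_i = auc_g - cortisol[0] * (times[-1] - times[0])
          mn_inc = cortisol[1:].mean() - cortisol[0]
          return auc_g, auc_i, mn_inc

      # samples at waking, +30 and +60 minutes (nmol/L)
      print(car_summary([12.0, 19.0, 12.5], [0, 30, 60]))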

  19. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    PubMed

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power for identifying these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundred or thousand). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing the statistical power of identifying risk variants and improving the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform an integrative analysis of Crohn's disease data from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and to improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS. Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  20. LD Hub: a centralized database and web interface to perform LD score regression that maximizes the potential of summary level GWAS data for SNP heritability and genetic correlation analysis.

    PubMed

    Zheng, Jie; Erzurumluoglu, A Mesut; Elsworth, Benjamin L; Kemp, John P; Howe, Laurence; Haycock, Philip C; Hemani, Gibran; Tansey, Katherine; Laurin, Charles; Pourcain, Beate St; Warrington, Nicole M; Finucane, Hilary K; Price, Alkes L; Bulik-Sullivan, Brendan K; Anttila, Verneri; Paternoster, Lavinia; Gaunt, Tom R; Evans, David M; Neale, Benjamin M

    2017-01-15

    LD score regression is a reliable and efficient method of using genome-wide association study (GWAS) summary-level results data to estimate the SNP heritability of complex traits and diseases, partition this heritability into functional categories, and estimate the genetic correlation between different phenotypes. Because the method relies on summary-level results data, LD score regression is computationally tractable even for very large sample sizes. However, publicly available GWAS summary-level data are typically stored in different databases and have different formats, making it difficult to apply LD score regression to estimate genetic correlations across many different traits simultaneously. In this manuscript, we describe LD Hub, a centralized database of summary-level GWAS results for 173 diseases/traits from different publicly available resources/consortia, and a web interface that automates the LD score regression analysis pipeline. To demonstrate functionality and validate our software, we replicated previously reported LD score regression analyses of 49 traits/diseases using LD Hub, and estimated SNP heritability and the genetic correlation across the different phenotypes. We also present new results obtained by uploading a recent atopic dermatitis GWAS meta-analysis to examine the genetic correlation between the condition and other potentially related traits. In response to the growing availability of publicly accessible GWAS summary-level results data, our database and the accompanying web interface will ensure maximal uptake of the LD score regression methodology, provide a useful database for the public dissemination of GWAS results, and provide a method for easily screening hundreds of traits for overlapping genetic aetiologies. The web interface and instructions for using LD Hub are available at http://ldsc.broadinstitute.org/. Contact: jie.zheng@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  1. How much do you need: a randomised experiment of whether readers can understand the key messages from summaries of Cochrane Reviews without reading the full review

    PubMed Central

    Clarke, Mike

    2014-01-01

    Objective: We explored whether readers can understand key messages without having to read the full review, and whether there were differences in understanding between various types of summary. Design: A randomised experiment of review summaries which compared understanding of a key outcome. Participants: Members of university staff (n = 36). Setting: Universities on the island of Ireland. Method: The Cochrane Review chosen examines the health impacts of the use of electric fans during heat waves. Participants were asked their expectation of the effect these would have on mortality. They were then randomly assigned a summary of the review (i.e. abstract, plain language summary, podcast or podcast transcription) and asked to spend a short time reading/listening to the summary. After this they were again asked about the effects of electric fans on mortality and to indicate if they would want to read the full Review. Main outcome measure: Correct identification of a key review outcome. Results: Just over half (53%) of the participants identified its key message on mortality after engaging with their summary. The figures were 33% for the abstract group, 50% for both the plain language and transcript groups and 78% for the podcast group. Conclusions: The differences between the groups were not statistically significant but suggest that the audio summary might improve knowledge transfer compared to written summaries. These findings should be explored further using a larger sample size and with other reviews. PMID:25341445

  2. TRAFFIC AND METEOROLOGICAL IMPACTS ON NEAR-ROAD AIR QUALITY: SUMMARY OF METHODS AND TRENDS FROM THE RALEIGH NEAR-ROAD STUDY

    EPA Science Inventory

    There are adverse health effects in populations living, working or going to school near major roadways. A study was designed to assess traffic emissions impacts on air quality and particle toxicity near a heavily-traveled highway. Several real-time and time-integrated sampling d...

  3. TRANDESNF: A computer program for transonic airfoil design and analysis in nonuniform flow

    NASA Technical Reports Server (NTRS)

    Chang, J. F.; Lan, C. Edward

    1987-01-01

    The use of a transonic airfoil code for analysis, inverse design, and direct optimization of an airfoil immersed in a propfan slipstream is described. A summary of the theoretical method, program capabilities, input format, output variables, and program execution is presented. Input data for sample test cases and the corresponding output are given.

  4. Statistical characterization of carbon phenolic prepreg materials, volume 1

    NASA Technical Reports Server (NTRS)

    Beckley, Don A.; Stites, John, Jr.

    1988-01-01

    The objective was to characterize several lots of materials used for carbon/carbon and carbon/phenolic product manufacture. Volume one is organized into testing categories based on raw material or product form. Each category contains a discussion of the sampling plan, comments and observations on each test method utilized, and a summary of the results obtained for each category.

  5. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR EXPLORATORY DATA ANALYSIS AND SUMMARY STATISTICS (D05)

    EPA Science Inventory

    This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...

  6. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    PubMed

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC method performs well. However, the Wan et al. method is best for estimating standard deviation under the normal distribution. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
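
    The essence of the approach is ABC rejection: draw candidate (mu, sigma) pairs from priors, simulate a sample of the reported size, and keep the draws whose simulated summary statistics fall closest to the published ones. A minimal sketch for the (median, min, max) case, using a closest-draws acceptance rule (illustrative; the authors' implementation details differ):

      import numpy as np

      def abc_mean_sd(obs_stats, n, n_sim=20000, top=200, seed=0):
          """ABC rejection for (median, min, max): flat priors bounded by the
          observed range; keep the `top` closest simulations."""
          rng = np.random.default_rng(seed)
          obs = np.asarray(obs_stats, float)
          mu = rng.uniform(obs[1], obs[2], n_sim)
          sigma = rng.uniform(1e-6, obs[2] - obs[1], n_sim)
          d = np.empty(n_sim)
          for i in range(n_sim):
              x = rng.normal(mu[i], sigma[i], n)
              d[i] = (abs(np.median(x) - obs[0])
                      + abs(x.min() - obs[1]) + abs(x.max() - obs[2]))
          keep = np.argsort(d)[:top]          # accept the closest draws
          return mu[keep].mean(), sigma[keep].mean()

      # observed: median 10, min 4, max 17, from a study of n = 50
      print(abc_mean_sd([10.0, 4.0, 17.0], n=50))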

  7. Joint Inference of Population Assignment and Demographic History

    PubMed Central

    Choi, Sang Chul; Hey, Jody

    2011-01-01

    A new approach to assigning individuals to populations using genetic data is described. Most existing methods work by maximizing Hardy–Weinberg and linkage equilibrium within populations, neither of which will apply for many demographic histories. By including a demographic model, within a likelihood framework based on coalescent theory, we can jointly study demographic history and population assignment. Genealogies and population assignments are sampled from a posterior distribution using a general isolation-with-migration model for multiple populations. A measure of partition distance between assignments facilitates not only the summary of a posterior sample of assignments, but also the estimation of the posterior density for the demographic history. It is shown that joint estimates of assignment and demographic history are possible, including estimation of population phylogeny for samples from three populations. The new method is compared to results of a widely used assignment method, using simulated and published empirical data sets. PMID:21775468

  8. 16 CFR Appendix B to Part 436 - Sample Item 20(1) Table-Systemwide Outlet Summary

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Sample Item 20(1) Table-Systemwide Outlet Summary B Appendix B to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. B Appendix B to Part 436—Sample...

  9. 16 CFR Appendix A to Part 436 - Sample Item 10 Table-Summary of Financing Offered

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sample Item 10 Table-Summary of Financing Offered A Appendix A to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. A Appendix A to Part 436—Sample...

  10. 16 CFR Appendix B to Part 436 - Sample Item 20(1) Table-Systemwide Outlet Summary

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sample Item 20(1) Table-Systemwide Outlet Summary B Appendix B to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. B Appendix B to Part 436—Sample...

  11. Comparison of several analytical methods for the determination of tin in geochemical samples as a function of tin speciation

    USGS Publications Warehouse

    Kane, J.S.; Evans, J.R.; Jackson, J.C.

    1989-01-01

    Accurate and precise determinations of tin in geological materials are needed for fundamental studies of tin geochemistry and for tin prospecting purposes. Achieving the required accuracy is difficult because of the different matrices in which Sn can occur (i.e. sulfides, silicates and cassiterite), and because of the variability of literature values for Sn concentrations in geochemical reference materials. We have evaluated three methods for the analysis of samples for Sn concentration: graphite furnace atomic absorption spectrometry (HGA-AAS) following iodide extraction, inductively coupled plasma atomic emission spectrometry (ICP-OES), and energy-dispersive X-ray fluorescence (EDXRF) spectrometry. Two of these methods (HGA-AAS and ICP-OES) required sample decomposition, either by acid digestion or by fusion, while the third (EDXRF) was performed directly on the powdered sample. Analytical details of all three methods, their potential errors, and the steps necessary to correct these errors were investigated. Results showed that similar accuracy was achieved by all methods for unmineralized samples, which contain no known Sn-bearing phase. For mineralized samples, which contain Sn-bearing minerals (either cassiterite or stannous sulfides), only the EDXRF and fusion ICP-OES methods provided acceptable accuracy. This summary of our study provides information that helps to assure correct interpretation of databases for underlying geochemical processes, regardless of the method of data collection and its inherent limitations. © 1989.

  12. Sensitive diagnosis of cutaneous leishmaniasis by lesion swab sampling coupled to qPCR

    PubMed Central

    ADAMS, EMILY R.; GOMEZ, MARIA ADELAIDA; SCHESKE, LAURA; RIOS, RUBY; MARQUEZ, RICARDO; COSSIO, ALEXANDRA; ALBERTINI, AUDREY; SCHALLIG, HENK; SARAVIA, NANCY GORE

    2015-01-01

    SUMMARY Variation in clinical accuracy of molecular diagnostic methods for cutaneous leishmaniasis (CL) is commonly observed depending on the sample source, the method of DNA recovery and the molecular test. Few attempts have been made to compare these variables. Two swab and aspirate samples from lesions of patients with suspected CL (n = 105) were evaluated alongside standard diagnosis by microscopic detection of amastigotes or culture of parasites from lesion material. Three DNA extraction methods were compared: Qiagen on swab and aspirate specimens, Isohelix on swabs and Boil/Spin of lesion aspirates. Recovery of Leishmania DNA was evaluated for each sample type by real-time polymerase chain reaction detection of parasitic 18S rDNA, and the diagnostic accuracy of the molecular method determined. Swab sampling combined with Qiagen DNA extraction was the most efficient recovery method for Leishmania DNA, and was the most sensitive (98%; 95% CI: 91–100%) and specific (84%; 95% CI: 64–95%) approach. Aspirated material was less sensitive at 80% (95% CI: 70–88%) and 61% (95% CI: 50–72%) when coupled to Qiagen or Boil-Spin DNA extraction, respectively. Swab sampling of lesions was painless, simple to perform and coupled with standardized DNA extraction enhances the feasibility of molecular diagnosis of CL. PMID:25111885

  13. Seasonal comparison of moss bag technique against vertical snow samples for monitoring atmospheric pollution.

    PubMed

    Salo, Hanna; Berisha, Anna-Kaisa; Mäkinen, Joni

    2016-03-01

    This is the first study seasonally applying Sphagnum papillosum moss bags and vertical snow samples for monitoring atmospheric pollution. Moss bags, exposed in January, were collected together with snow samples by early March 2012 near the Harjavalta Industrial Park in southwest Finland. Magnetic, chemical, scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX), K-means clustering, and Tomlinson pollution load index (PLI) data showed parallel spatial trends of pollution dispersal for both materials. The results strengthen previous findings that concentrate- and slag-handling activities were important (dust) emission sources, while the impact from the Cu-Ni smelter's pipe remained secondary at closer distances. Statistically significant correlations existed between the variables of snow and moss bags. In summary, both methods work well for sampling and are efficient pollutant accumulators. Moss bags can also be used in winter conditions, and they provide a more homogeneous and better controlled sampling method than snow samples. Copyright © 2015. Published by Elsevier B.V.
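
    The Tomlinson pollution load index mentioned above is the geometric mean of element-wise contamination factors relative to background. A one-function sketch with hypothetical concentrations:

      import numpy as np

      def tomlinson_pli(conc, background):
          """Tomlinson pollution load index: geometric mean of contamination
          factors CF_i = C_i / C_background; PLI > 1 indicates a pollution load."""
          cf = np.asarray(conc, float) / np.asarray(background, float)
          return float(np.exp(np.mean(np.log(cf))))

      # e.g. Cu, Ni, Pb, Zn in a moss bag vs. regional background levels
      print(f"PLI = {tomlinson_pli([85, 40, 12, 150], [10, 8, 9, 60]):.2f}")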

  14. Palladium-based Mass-Tag Cell Barcoding with a Doublet-Filtering Scheme and Single Cell Deconvolution Algorithm

    PubMed Central

    Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.

    2015-01-01

    SUMMARY Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
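
    The doublet-filtering idea can be sketched in a few lines: with a k-of-n barcode scheme every valid code has exactly k "on" channels, so a cell is assigned only when its k brightest barcode channels match a valid code and are well separated from the (k+1)-th channel. The toy example below uses a 3-of-6 scheme with illustrative thresholds and data, not the published software:

      import numpy as np
      from itertools import combinations

      # 3-of-6 doublet-filtering scheme: every valid barcode has exactly three
      # "on" palladium channels, so doublets (typically >3 on-channels) fail
      CODES = [set(c) for c in combinations(range(6), 3)]   # 20 valid barcodes

      def debarcode(cell, min_sep=1.0):
          """Assign one cell's 6 barcode-channel intensities to a sample.
          Returns the barcode index, or None when the separation between the
          3rd and 4th brightest channels is too small (doublet/poor staining)."""
          order = np.argsort(cell)[::-1]
          sep = cell[order[2]] - cell[order[3]]             # separation criterion
          if sep < min_sep:
              return None
          return CODES.index(set(order[:3]))

      cell = np.array([8.2, 0.1, 7.9, 0.3, 9.1, 0.2])       # channels 0, 2, 4 on
      print(debarcode(cell))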

  15. Information Transfer and the Hospital Discharge Summary: National Primary Care Provider Perspectives of Challenges and Opportunities.

    PubMed

    Robelia, Paul M; Kashiwagi, Deanne T; Jenkins, Sarah M; Newman, James S; Sorita, Atsushi

    2017-01-01

    The hospital discharge summary (HDS) serves as a critical method of patient information transfer between hospitalist and primary care provider (PCP). This study was designed to increase our understanding of PCP preferences for, and perceived deficiencies in, the discharge summary. We designed a mail survey that was sent to a random sample of 800 American Academy of Family Physicians members nationally. The survey response rate was 59%. We analyzed the availability of summaries at hospital follow-up, whether all desired information was contained in the summary, and whether certain specific items were completed. Provider subgroup analysis was performed. The strongest predictor of discharge summary availability at posthospital follow-up is direct access to inpatient data. 27.5% of respondents had a summary available 0% to 40% of the time, 41.4% noted availability 41% to 80% of the time, and 31.1% >80% of the time; if providers had access to inpatient data, they tended to have a discharge summary available to them (P < .0001). Providers also described significant content deficits: 26.5% of providers noted the summary contained all information needed 0% to 40% of the time, 48.5% of providers noted this 41% to 80% of the time, and only 25% >80% of the time. Specific summary items considered "very important" by providers included the medication list (94% of respondents), diagnosis list (89%), and treatment provided (87%). Opportunities remain in timely delivery of a complete HDS to the PCP. Further multifaceted practice redesign should be directed at optimizing this critical information transfer tool, potentially encompassing electronic medical record utilization and specific training for clinicians preparing summaries. Initial efforts should focus on ensuring availability of a complete summary (containing items deemed important by PCPs, including the medication list, diagnosis list, and treatment provided) at the posthospital follow-up visit. © Copyright 2017 by the American Board of Family Medicine.

  16. 40 CFR Appendix A to Part 80 - Test for the Determination of Phosphorus in Gasoline

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Specification for Filter Paper for Use in Chemical Analysis. 3. Summary of method. 3.1 Organic matter in the...) during the entire period of sample heating. Note 1: If the temperature of the hot water bath drops below... 100-ml volumetric flasks submerged to the mark in ice water. 4.4 Filter Paper, for quantitative...

  17. 40 CFR Appendix A to Part 80 - Test for the Determination of Phosphorus in Gasoline

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Specification for Filter Paper for Use in Chemical Analysis. 3. Summary of method. 3.1 Organic matter in the...) during the entire period of sample heating. Note 1: If the temperature of the hot water bath drops below... 100-ml volumetric flasks submerged to the mark in ice water. 4.4 Filter Paper, for quantitative...

  18. 40 CFR Appendix A to Part 80 - Test for the Determination of Phosphorus in Gasoline

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Specification for Filter Paper for Use in Chemical Analysis. 3. Summary of method. 3.1 Organic matter in the...) during the entire period of sample heating. Note 1: If the temperature of the hot water bath drops below... 100-ml volumetric flasks submerged to the mark in ice water. 4.4 Filter Paper, for quantitative...

  19. Missouri Ozark Forest Ecosystem Project: site history, soils, landforms, woody and herbaceous vegetation, down wood, and inventory methods for the landscape experiment.

    Treesearch

    Stephen R. Shifley; Brian L., eds. Brookshire

    2000-01-01

    Describes vegetation and physical site conditions at the initiation (1991-1995) of the Missouri Ozark Forest Ecosystem Project (MOFEP) in the southeastern Missouri Ozarks. Provides detailed information on sampling protocols and summarizes initial conditions of the landscape experiment prior to harvest treatments. Summaries are by plot, by ~800-acre...

  20. Forest Fire History... A Computer Method of Data Analysis

    Treesearch

    Romain M. Meese

    1973-01-01

    A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...

  1. Evaluation of sampling and storage procedures on preserving the community structure of stool microbiota: A simple at-home toilet-paper collection method.

    PubMed

    Al, Kait F; Bisanz, Jordan E; Gloor, Gregory B; Reid, Gregor; Burton, Jeremy P

    2018-01-01

    The increasing interest in the impact of the gut microbiota on health and disease has resulted in the emergence of numerous human microbiome studies. However, multiple sampling methods are being used, making cross-comparison of results difficult. To avoid additional clinic visits and increase patient recruitment to these studies, there is the potential to utilize at-home stool sampling. The aim of this pilot study was to compare simple self-sampling collection and storage methods. To simulate storage conditions, stool samples from three volunteers were freshly collected, placed on toilet tissue, and stored at four temperatures (-80, 7, 22 and 37°C), either dry or in the presence of a stabilization agent (RNAlater®), for 3 or 7 days. Using 16S rRNA gene sequencing by Illumina, the effect of storage variations for each sample was compared to a reference community from fresh, unstored counterparts. Fastq files may be accessed in the NCBI Sequence Read Archive: Bioproject ID PRJNA418287. Microbial diversity and composition were not significantly altered by any storage method. Samples were always separable based on participant, regardless of storage method, suggesting there was no need for sample preservation by a stabilization agent. In summary, if immediate sample processing is not feasible, short-term storage of unpreserved stool samples on toilet paper offers a reliable way to assess the microbiota composition by 16S rRNA gene sequencing. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Application of Liquid Chromatography/Ion Trap Mass Spectrometry Technique to Determine Ergot Alkaloids in Grain Products

    PubMed Central

    Szymczyk, Krystyna; Jędrzejczak, Renata; Roszko, Marek

    2015-01-01

    Summary A liquid chromatography/ion trap mass spectrometry-based method to determine six ergot alkaloids and their isomers is presented. The samples were cleaned on neutral alumina-based solid-phase extraction cartridges. The following method parameters were obtained (depending on the analyte and spiking level): method recovery from 63.0 to 104.6%, relative standard deviation below 18%, linear range from 1 to 325 µg/kg, linear correlation coefficient not less than 0.98. The developed analytical procedure was applied to determine the levels of ergot alkaloids in 65 samples of selected rye-based food products (flour – 34 samples, bran – 12 samples, rye – 18 samples, flakes – 1 sample). Measurable levels of alkaloids were found in the majority of the analysed samples, particularly in rye flour. Additionally, alkaloids were determined in ergot sclerotia isolated from rye grains. Total content was nearly 0.01% (97.9 mg/kg). However, the alkaloid profile was dominated by ergocristine at 45.6% (44.7 mg/kg), an alkaloid not commonly found in the tested food products. Ergocorninine at 0.2% (0.2 mg/kg) was the least abundant alkaloid. PMID:27904328

  3. Computerized summary scoring: crowdsourcing-based latent semantic analysis.

    PubMed

    Li, Haiying; Cai, Zhiqiang; Graesser, Arthur C

    2017-11-03

    In this study we developed and evaluated a crowdsourcing-based latent semantic analysis (LSA) approach to computerized summary scoring (CSS). LSA is a frequently used mathematical component in CSS, where LSA similarity represents the extent to which the to-be-graded target summary is similar to a model summary or a set of exemplar summaries. Researchers have proposed different formulations of the model summary in previous studies, such as pregraded summaries, expert-generated summaries, or source texts. The former two methods, however, require substantial human time, effort, and costs in order to either grade or generate summaries. Using source texts does not require human effort, but it also does not predict human summary scores well. With human summary scores as the gold standard, in this study we evaluated the crowdsourcing LSA method by comparing it with seven other LSA methods that used sets of summaries from different sources (either experts or crowdsourced) of differing quality, along with source texts. Results showed that crowdsourcing LSA predicted human summary scores as well as expert-good and crowdsourcing-good summaries, and better than the other methods. A series of analyses with different numbers of crowdsourcing summaries demonstrated that the number (from 10 to 100) did not significantly affect performance. These findings imply that crowdsourcing LSA is a promising approach to CSS, because it saves human effort in generating the model summary while still yielding comparable performance. This approach to small-scale CSS provides a practical solution for instructors in courses, and also advances research on automated assessments in which student responses are expected to semantically converge on subject matter content.
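
    As a rough illustration of the LSA similarity idea above: project summaries into a reduced semantic space and score a target summary by its cosine similarity to a set of model summaries. The tiny corpus and two-dimensional space below are placeholders, not the authors' pipeline, which builds the space from large sets of crowdsourced summaries.

```python
# Hedged sketch of LSA-based summary scoring; corpus and dimensionality are toy choices.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

crowd_summaries = [
    "The water cycle moves water between oceans, air, and land.",
    "Evaporation and precipitation circulate water around the earth.",
    "Water evaporates, forms clouds, and returns as rain.",
]
target = "Rain and evaporation keep water moving through the environment."

X = TfidfVectorizer().fit_transform(crowd_summaries + [target])
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

# score the target by its mean similarity to the crowdsourced model summaries
score = cosine_similarity(Z[-1:], Z[:-1]).mean()
print(round(float(score), 3))
```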

  4. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data.

    PubMed

    O'Reilly, Joseph E; Donoghue, Philip C J

    2018-03-01

    Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data.
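
    The majority-rule idea is simple enough to sketch: count how often each clade appears across the posterior sample and keep those above 50% support. Trees are represented below as bare sets of clades (taxon subsets); real tooling would parse Newick strings.

```python
# Minimal majority-rule consensus (MRC) sketch over a toy posterior tree sample.
from collections import Counter

def majority_rule(tree_sample, threshold=0.5):
    counts = Counter(clade for tree in tree_sample for clade in tree)
    n = len(tree_sample)
    return {clade: counts[clade] / n
            for clade in counts if counts[clade] / n > threshold}

# toy posterior sample of three 4-taxon trees, each given as its set of clades
trees = [
    {frozenset("AB"), frozenset("ABC")},
    {frozenset("AB"), frozenset("ABD")},
    {frozenset("AB"), frozenset("ABC")},
]
for clade, support in majority_rule(trees).items():
    print(sorted(clade), round(support, 2))   # AB at 1.0, ABC at 0.67; ABD dropped
```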

  5. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data

    PubMed Central

    O’Reilly, Joseph E; Donoghue, Philip C J

    2018-01-01

    Abstract Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data. PMID:29106675

  6. 19 CFR 151.64 - Extra copy of entry summary.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.64 Extra copy of entry summary. One extra copy of the entry summary covering wool or hair subject to duty at a rate per...

  7. 19 CFR 151.64 - Extra copy of entry summary.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.64 Extra copy of entry summary. One extra copy of the entry summary covering wool or hair subject to duty at a rate per...

  8. 19 CFR 151.64 - Extra copy of entry summary.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.64 Extra copy of entry summary. One extra copy of the entry summary covering wool or hair subject to duty at a rate per...

  9. 19 CFR 151.64 - Extra copy of entry summary.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.64 Extra copy of entry summary. One extra copy of the entry summary covering wool or hair subject to duty at a rate per...

  10. 19 CFR 151.64 - Extra copy of entry summary.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.64 Extra copy of entry summary. One extra copy of the entry summary covering wool or hair subject to duty at a rate per...

  11. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  12. Methodological integrative review of the work sampling technique used in nursing workload research.

    PubMed

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research that used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  13. Sample Size Methods for Estimating HIV Incidence from Cross-Sectional Surveys

    PubMed Central

    Brookmeyer, Ron

    2015-01-01

    Summary Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this paper we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this paper at the Biometrics website on Wiley Online Library. PMID:26302040
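
    A back-of-envelope sketch of the estimator described above, plus a crude precision-based sample size calculation that treats the early-stage count as Poisson; this simplification ignores the duration-uncertainty term the paper accounts for, and all numbers are invented.

```python
# Hedged sketch: cross-sectional incidence estimate and an approximate survey size.
import math

def incidence_estimate(n_early, n_uninfected, mean_duration_years):
    # incidence ~ (# in early biomarker stage) / (# uninfected * mean stage duration)
    return n_early / (n_uninfected * mean_duration_years)

def sample_size_for_cv(incidence, mean_duration_years, prevalence, target_cv):
    """Approximate total survey size so the estimator's coefficient of variation
    reaches target_cv, treating the early-stage count as Poisson (a simplification
    that omits the duration-uncertainty term handled in the paper)."""
    p_early = incidence * mean_duration_years * (1 - prevalence)  # P(early stage)
    n_early_needed = 1 / target_cv**2        # CV of a Poisson count is 1/sqrt(count)
    return math.ceil(n_early_needed / p_early)

print(incidence_estimate(25, 9000, 0.5))           # ~0.0056, i.e. ~0.56% per year
print(sample_size_for_cv(0.005, 0.5, 0.10, 0.25))  # survey size for a 25% CV
```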

  14. Random vs. systematic sampling from administrative databases involving human subjects.

    PubMed

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes of n (50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics summaries of four known factors (gender, average age, number (%) of chiropractors in each province and years in practice), between- and within-method chi-square tests and unpaired t tests were performed to determine whether any of the differences (descriptively greater than 7% or 7 yr) were also statistically significant. The strength of agreement between the provincial distributions was quantified by calculating the percent agreement for each province (pairwise comparisons between methods). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
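
    For reference, the two designs compared above differ only in how indices are drawn from the membership list; a toy sketch with an invented frame and sample size:

```python
# Toy contrast of simple random sampling (SRS) and systematic sampling (SS).
import random

members = [f"member_{i:04d}" for i in range(5000)]   # stand-in alphabetical listing

def srs(frame, n, seed=0):
    rng = random.Random(seed)
    return rng.sample(frame, n)                      # n draws without replacement

def systematic(frame, n, seed=0):
    step = len(frame) // n
    start = random.Random(seed).randrange(step)      # random start, fixed stride
    return frame[start::step][:n]

print(srs(members, 5))
print(systematic(members, 5))
```

    As the abstract's comparison suggests, the stride-based draw is sound when the frame carries no periodic ordering; an alphabetical listing by surname rarely induces one.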

  15. Use of Bayesian Methods to Analyze and Visualize Content Uniformity Capability Versus United States Pharmacopeia and ASTM Standards.

    PubMed

    Hofer, Jeffrey D; Rauk, Adam P

    2017-02-01

    The purpose of this work was to develop a straightforward and robust approach to analyze and summarize the ability of content uniformity data to meet different criteria. A robust Bayesian statistical analysis methodology is presented which provides a concise and easily interpretable visual summary of the content uniformity analysis results. The visualization displays individual batch analysis results and shows whether there is high confidence that different content uniformity criteria could be met a high percentage of the time in the future. The 3 tests assessed are as follows: (a) United States Pharmacopeia Uniformity of Dosage Units <905>, (b) a specific ASTM E2810 Sampling Plan 1 criterion to potentially be used for routine release testing, and (c) another specific ASTM E2810 Sampling Plan 2 criterion to potentially be used for process validation. The approach shown here could readily be used to create similar result summaries for other potential criteria. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  16. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air. Part 2. Sorbent selection and other aspects of optimizing air monitoring methods.

    PubMed

    Woolfenden, Elizabeth

    2010-04-16

    Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Applications range from atmospheric research and ambient air monitoring (indoor and outdoor) to occupational hygiene (personal exposure assessment) and measuring chemical emission levels. Part 1 of this paper reviewed the main sorbent-based air sampling strategies including active (pumped) tube monitoring, diffusive (passive) sampling onto sorbent tubes/cartridges plus sorbent trapping/focusing of whole air samples that are either collected in containers (such as canisters or bags) or monitored online. Options for subsequent extraction and transfer to GC(MS) analysis were also summarised and the trend to thermal desorption (TD)-based methods and away from solvent extraction was explained. As a result of this trend, demand for TD-compatible sorbents (alternatives to traditional charcoal) is growing. Part 2 of this paper therefore continues with a summary of TD-compatible sorbents, their respective advantages and limitations and considerations for sorbent selection. Other analytical considerations for optimizing sorbent-based air monitoring methods are also discussed together with recent technical developments and sampling accessories which have extended the application range of sorbent trapping technology generally. Copyright 2010 Elsevier B.V. All rights reserved.

  17. The Effect of Primary School Students' Writing Attitudes and Writing Self-Efficacy Beliefs on Their Summary Writing Achievement

    ERIC Educational Resources Information Center

    Bulut, Pinar

    2017-01-01

    In this study, the effect of writing attitude and writing self-efficacy beliefs on the summarization achievement of 4th grade primary school students was examined using structural equation modeling. The study employed the relational survey model. The study group, constructed by means of the simple random sampling method, comprises 335…

  18. Proceedings from the Annual Army Environmental R&D Symposium (16th) Held 23-25 June 1992 at Fort Magruder Inn and Conference Center, Williamsburg, Virginia

    DTIC Science & Technology

    1992-06-01

    methods of selecting sites, monitoring flow, and sampling runoff. Also, there are some observations on storm water quality findings and some...turning off the flow meters until a rain event is imminent. Make sure you pack plenty of flashlights for night rains. 6. STORM WATER QUALITY SUMMARY

  19. Quantitation of polycyclic aromatic hydrocarbons (PAH4) in cocoa and chocolate samples by an HPLC-FD method.

    PubMed

    Raters, Marion; Matissek, Reinhard

    2014-11-05

    As a consequence of the maximum levels for PAH4 (the sum of four polycyclic aromatic hydrocarbons: benzo[a]anthracene, chrysene, benzo[b]fluoranthene, and benzo[a]pyrene) permitted in cocoa beans and derived products as of 2013, a high-performance liquid chromatography method with fluorescence detection (HPLC-FD) was developed and adapted to the complex cocoa butter matrix to enable simultaneous determination of PAH4. The resulting analysis method was subsequently successfully validated. This method meets the requirements of Regulation (EU) No. 836/2011 regarding criteria for analysis methods for determining PAH4 and is hence well suited for monitoring observance of the maximum levels applicable under Regulation (EU) No. 835/2011. Within the scope of this work, a total of 218 samples of raw cocoa, cocoa masses, and cocoa butter from several sample years (1999-2012), of various origins and treatments, as well as cocoa and chocolate products, were analyzed for the occurrence of PAH4. In summary, the current PAH contamination level of cocoa products can be deemed very low overall.

  20. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  1. Evidence and Clinical Trials.

    NASA Astrophysics Data System (ADS)

    Goodman, Steven N.

    1989-11-01

    This dissertation explores the use of a mathematical measure of statistical evidence, the log likelihood ratio, in clinical trials. The methods and thinking behind the use of an evidential measure are contrasted with traditional methods of analyzing data, which depend primarily on a p-value as an estimate of the statistical strength of an observed data pattern. It is contended that neither the behavioral dictates of Neyman-Pearson hypothesis testing methods nor the coherency dictates of Bayesian methods are realistic models on which to base inference. The use of the likelihood alone is applied to four aspects of trial design or conduct: the calculation of sample size, the monitoring of data, testing for the equivalence of two treatments, and meta-analysis - the combining of results from different trials. Finally, a more general model of statistical inference, using belief functions, is used to see if it is possible to separate the assessment of evidence from our background knowledge. It is shown that traditional and Bayesian methods can be modeled as two ends of a continuum of structured background knowledge, with methods that summarize evidence at the point of maximum likelihood assuming no structure and Bayesian methods assuming complete knowledge. Both schools are seen to be missing a concept of ignorance - uncommitted belief. This concept provides the key to understanding the problem of sampling to a foregone conclusion and the role of frequency properties in statistical inference. The conclusion is that statistical evidence cannot be defined independently of background knowledge, and that frequency properties of an estimator are an indirect measure of uncommitted belief. Several likelihood summaries need to be used in clinical trials, with the quantitative disparity between summaries being an indirect measure of our ignorance. This conclusion is linked with parallel ideas in the philosophy of science and cognitive psychology.
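
    As a toy illustration of the evidential measure discussed above, here is the log likelihood ratio for binomial trial data under two hypothesized response rates; the counts and rates are invented:

```python
# Log likelihood ratio for binomial data under two point hypotheses.
from scipy.stats import binom

successes, n = 14, 40
p_null, p_alt = 0.25, 0.50
log_lr = binom.logpmf(successes, n, p_alt) - binom.logpmf(successes, n, p_null)
print(round(log_lr, 3))   # >0 favors p_alt; magnitude measures strength of evidence
```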

  2. Getting more from accuracy and response time data: methods for fitting the linear ballistic accumulator.

    PubMed

    Donkin, Chris; Averell, Lee; Brown, Scott; Heathcote, Andrew

    2009-11-01

    Cognitive models of the decision process provide greater insight into response time and accuracy than do standard ANOVA techniques. However, such models can be mathematically and computationally difficult to apply. We provide instructions and computer code for three methods for estimating the parameters of the linear ballistic accumulator (LBA), a new and computationally tractable model of decisions between two or more choices. These methods - a Microsoft Excel worksheet, scripts for the statistical program R, and code for implementation of the LBA into the Bayesian sampling software WinBUGS - vary in their flexibility and user accessibility. We also provide scripts in R that produce a graphical summary of the data and model predictions. In a simulation study, we explored the effect of sample size on parameter recovery for each method. The materials discussed in this article may be downloaded as a supplement from http://brm.psychonomic-journals.org/content/supplemental.
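
    A generic parameter-recovery harness in the spirit of that simulation study, using a shifted-lognormal response-time model as a stand-in for the LBA (whose density is more involved): simulate at a given sample size, refit by maximum likelihood, and compare estimates to the generating values.

```python
# Parameter-recovery sketch with a simple stand-in response-time model.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(0)
true_shift, true_mu, true_sigma = 0.2, -0.5, 0.4

def neg_log_lik(params, rt):
    shift, mu, sigma = params
    if sigma <= 0 or shift < 0 or np.any(rt <= shift):
        return np.inf                              # outside the parameter space
    return -np.sum(lognorm.logpdf(rt - shift, s=sigma, scale=np.exp(mu)))

for n in (50, 200, 1000):
    rt = true_shift + rng.lognormal(true_mu, true_sigma, n)   # simulated RTs
    fit = minimize(neg_log_lik, x0=[0.1, 0.0, 0.5], args=(rt,),
                   method="Nelder-Mead")
    print(n, np.round(fit.x, 3))                   # estimates tighten as n grows
```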

  3. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Britt, Phillip F

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. Summaries of conclusions, analytical processes, and analytical results. Analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico in support of the WIPP Technical Assessment Team (TAT) activities to determine to the extent feasible the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, described below, and draws conclusions based on those results. Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003, Pacific Northwest National Laboratory, December 2014. Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory, December 2014. Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory, January 2015. This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  4. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

    The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the use of cross references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  5. Performance of biometric quality measures.

    PubMed

    Grother, Patrick; Tabassi, Elham

    2007-04-01

    We document methods for the quantitative evaluation of systems that produce a scalar summary of a biometric sample's quality. We are motivated by a need to test claims that quality measures are predictive of matching performance. We regard a quality measurement algorithm as a black box that converts an input sample to an output scalar. We evaluate it by quantifying the association between those values and observed matching results. We advance detection error trade-off and error versus reject characteristics as metrics for the comparative evaluation of sample quality measurement algorithms. We precede this with a definition of sample quality and a description of the operational use of quality measures. We emphasize the performance goal by including a procedure for annotating the samples of a reference corpus with quality values derived from empirical recognition scores.
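
    An error-versus-reject characteristic of the kind advanced above can be sketched directly: sort samples by quality score, reject the lowest-quality fraction, and recompute the false non-match rate (FNMR) on the remainder. The scores and threshold below are synthetic.

```python
# Error-versus-reject sketch on synthetic quality and genuine match scores.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
quality = rng.uniform(0, 1, n)
# synthetic genuine match scores that improve with sample quality
match_score = 0.5 * quality + rng.normal(0, 0.15, n)
threshold = 0.35                                  # scores below this fail to match

order = np.argsort(quality)                       # worst quality first
for reject_frac in (0.0, 0.05, 0.10, 0.20):
    kept = order[int(reject_frac * n):]           # drop the lowest-quality fraction
    fnmr = np.mean(match_score[kept] < threshold)
    print(f"reject {reject_frac:4.0%} -> FNMR {fnmr:.3f}")
```

    If the quality measure is genuinely predictive, the FNMR falls as more low-quality samples are rejected, which is exactly the association the paper sets out to test.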

  6. Earth recovery mode analysis for a Martian sample return mission

    NASA Technical Reports Server (NTRS)

    Green, J. P.

    1978-01-01

    The analysis has concerned itself with evaluating alternative methods of recovering a sample module from a trans-earth trajectory originating in the vicinity of Mars. The major modes evaluated are: (1) direct atmospheric entry from trans-earth trajectory; (2) earth orbit insertion by retropropulsion; and (3) atmospheric braking to a capture orbit. In addition, the question of guided vs. unguided entry vehicles was considered, as well as alternative methods of recovery after orbit insertion for modes (2) and (3). A summary of results and conclusions is presented. Analytical results for aerodynamic and propulsive maneuvering vehicles are discussed. System performance requirements and alternatives for inertial systems implementation are also discussed. Orbital recovery operations and further studies required to resolve the recovery mode issue are described.

  7. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... (CONTINUED) Manufacturer-Run In-Use Testing Program for Heavy-Duty Diesel Engines Pt. 86, Subpt. T, App. I Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an...

  8. 40 CFR Appendix I to Subpart T - Sample Graphical Summary of NTE Emission Results

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Emission Results I Appendix I to Subpart T Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... (CONTINUED) Manufacturer-Run In-Use Testing Program for Heavy-Duty Diesel Engines Pt. 86, Subpt. T, App. I Appendix I to Subpart T—Sample Graphical Summary of NTE Emission Results The following figure shows an...

  9. Meta-analysis of correlated traits via summary statistics from GWASs with an application in hypertension.

    PubMed

    Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O; Liang, Jingjing; Young, J Hunter; Franceschini, Nora; Smith, Jennifer A; Yanek, Lisa R; Sun, Yan V; Edwards, Todd L; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K; Chakravati, Aravinda; Cooper, Richard S; Redline, Susan

    2015-01-08

    Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple - even distinct - traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systematically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, either correlated, independent, continuous, or binary traits, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10(-8)) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10(-7)) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study a cross-phenotype (CP) association by using summary statistics from GWASs of multiple phenotypes. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
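
    One standard way to combine correlated per-variant z-scores across traits, shown here as a hedged sketch that illustrates the general idea rather than the paper's exact statistic: T = z' R^{-1} z, referred to a chi-square distribution with k degrees of freedom, where R is the correlation among the trait summary statistics (estimable from genome-wide null variants).

```python
# Combining correlated trait z-scores at one variant via a quadratic form.
import numpy as np
from scipy.stats import chi2

z = np.array([2.1, 1.8, 2.4])                 # z-scores for one SNP across 3 traits
R = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.4],
              [0.3, 0.4, 1.0]])               # correlation of the summary statistics
T = z @ np.linalg.solve(R, z)                 # T = z' R^{-1} z
p = chi2.sf(T, df=len(z))
print(f"T = {T:.2f}, p = {p:.3g}")
```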

  10. 16 CFR 1610.3 - Summary of test method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Summary of test method. 1610.3 Section 1610.3 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FLAMMABLE FABRICS ACT REGULATIONS STANDARD FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.3 Summary of test method. The Standard...

  11. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
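
    Two of the simple approximations mentioned above have well-known textbook forms, stated here as assumptions since the review evaluates several variants: SD ≈ range/4, and mean ≈ (q1 + median + q3)/3.

```python
# Plug-in approximations for a missing SD (from the range) and a missing
# mean (from the quartiles); the specific forms are common choices, assumed here.
def sd_from_range(minimum, maximum):
    return (maximum - minimum) / 4            # crude range-based SD approximation

def mean_from_quartiles(q1, median, q3):
    return (q1 + median + q3) / 3             # quartile-based mean approximation

print(sd_from_range(2.0, 18.0))               # -> 4.0
print(mean_from_quartiles(5.0, 8.0, 14.0))    # -> 9.0
```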

  12. A Robust Bayesian Random Effects Model for Nonlinear Calibration Problems

    PubMed Central

    Fong, Y.; Wakefield, J.; De Rosa, S.; Frahm, N.

    2013-01-01

    Summary In the context of a bioassay or an immunoassay, calibration means fitting a curve, usually nonlinear, through the observations collected on a set of samples containing known concentrations of a target substance, and then using the fitted curve and observations collected on samples of interest to predict the concentrations of the target substance in these samples. Recent technological advances have greatly improved our ability to quantify minute amounts of substance from a tiny volume of biological sample. This has in turn led to a need to improve statistical methods for calibration. In this paper, we focus on developing calibration methods robust to dependent outliers. We introduce a novel normal mixture model with dependent error terms to model the experimental noise. In addition, we propose a re-parameterization of the five parameter logistic nonlinear regression model that allows us to better incorporate prior information. We examine the performance of our methods with simulation studies and show that they lead to a substantial increase in performance measured in terms of mean squared error of estimation and a measure of the average prediction accuracy. A real data example from the HIV Vaccine Trials Network Laboratory is used to illustrate the methods. PMID:22551415

  13. Comparison of electrical conductivity calculation methods for natural waters

    USGS Publications Warehouse

    McCleskey, R. Blaine; Nordstrom, D. Kirk; Ryan, Joseph N.

    2012-01-01

    The capability of eleven methods to calculate the electrical conductivity of a wide range of natural waters from their chemical composition was investigated. A brief summary of each method is presented including equations to calculate the conductivities of individual ions, the ions incorporated, and the method's limitations. The ability of each method to reliably predict the conductivity depends on the ions included, effective accounting of ion pairing, and the accuracy of the equation used to estimate the ionic conductivities. The performances of the methods were evaluated by calculating the conductivity of 33 environmentally important electrolyte solutions, 41 U.S. Geological Survey standard reference water samples, and 1593 natural water samples. The natural waters tested include acid mine waters, geothermal waters, seawater, dilute mountain waters, and river water impacted by municipal waste water. The three most recent conductivity methods predict the conductivity of natural waters better than other methods. Two of the recent methods can be used to reliably calculate the conductivity for samples with pH values greater than about 3 and temperatures between 0 and 40°C. One method is applicable to a variety of natural water types with a range of pH from 1 to 10, temperature from 0 to 95°C, and ionic strength up to 1 m.
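
    The simplest class of method reviewed above treats conductivity as a concentration-weighted sum of limiting ionic conductivities. The sketch below uses standard 25 °C table values and deliberately omits the ion-pairing and ionic-strength corrections that distinguish the better-performing methods, which is exactly why this form degrades for concentrated or complex waters.

```python
# Naive conductivity estimate from ionic composition (no ion-pairing correction).
# Table values are standard limiting molar conductivities in S cm^2/mol at 25 C.
LIMITING_CONDUCTIVITY = {"Na+": 50.1, "K+": 73.5, "Ca2+": 119.0,
                         "Cl-": 76.3, "SO4^2-": 160.0, "HCO3-": 44.5}

def conductivity_uS_per_cm(molarities):
    """molarities: dict of ion -> mol/L. Returns conductivity in uS/cm."""
    return sum(LIMITING_CONDUCTIVITY[ion] * m * 1000
               for ion, m in molarities.items())

# dilute NaCl solution; the estimate runs slightly high versus measurement
print(conductivity_uS_per_cm({"Na+": 0.01, "Cl-": 0.01}))   # ~1264 uS/cm
```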

  14. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    PubMed

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
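
    The common recipe such dFNC frameworks build on can be sketched briefly: sliding-window correlations between region time courses, then k-means clustering of the windowed connectivity patterns into recurring FC states. The data, window length, and number of states below are synthetic and illustrative.

```python
# Sliding-window dFNC with k-means FC states on synthetic time courses.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
T, R = 300, 10                                  # time points, regions
ts = rng.normal(size=(T, R))                    # stand-in for fMRI time courses
win, step = 40, 5

windows = []
for start in range(0, T - win + 1, step):
    C = np.corrcoef(ts[start:start + win].T)    # R x R windowed FC matrix
    iu = np.triu_indices(R, k=1)
    windows.append(C[iu])                       # vectorize the upper triangle
windows = np.array(windows)

states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(windows)
print(np.bincount(states))                      # windows assigned to each FC state
```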

  15. Replicability of time-varying connectivity patterns in large resting state fMRI samples

    PubMed Central

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L.; Stephen, Julia M.; Claus, Eric D.; Mayer, Andrew R.; Calhoun, Vince D.

    2018-01-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain’s inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. PMID:28916181

  16. Mining of Business-Oriented Conversations at a Call Center

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nasukawa, Tetsuya; Watanabe, Hideo

    Recently it has become feasible to transcribe textual records of telephone conversations at call centers by using automatic speech recognition. In this research, we extended a text mining system for call summary records and constructed a conversation mining system for business-oriented conversations at a call center. To acquire useful business insights from conversational data through a text mining system, it is critical to identify appropriate textual segments and expressions as the viewpoints to focus on. In the analysis of call summary data, experts defined the viewpoints for the analysis by looking at sample records and by preparing dictionaries based on frequent keywords in the sample dataset. With conversations, however, it is difficult to identify such viewpoints manually and in advance, because the target data consist of complete transcripts that are often lengthy and redundant. In this research, we defined a model of business-oriented conversations and proposed a mining method that identifies segments that affect the outcomes of the conversations and then extracts useful expressions from each of these identified segments. In the experiment, we processed real datasets from a car rental service center and constructed a mining system. With this system, we show the effectiveness of the method based on the defined conversation model.

  17. What is the use? Application of the short form (SF) questionnaires for the evaluation of treatment effects.

    PubMed

    Pelle, Aline J; Kupper, Nina; Mols, Floortje; de Jonge, Peter

    2013-08-01

    Health status has evolved into a clinical outcome measure of great interest in medical care. However, there is still debate about the appropriateness of scoring algorithms for the often-used short form (SF) questionnaires. Therefore, our aim was to evaluate the consequences of the traditional scoring procedure based on orthogonal factor rotation for clinical applications by (a) re-evaluating the results of randomized controlled trials (RCTs) on the effectiveness of antidepressants in improving health status in cardiac patients and (b) comparing empirical evidence on depression and health status using orthogonal and oblique factor rotation (an alternative scoring method) in a community sample and a heart failure (HF) sample. This is a systematic literature review and cross-sectional analysis among 1,598 community sample participants and 282 HF patients. Orthogonal rotation artificially forces the mental component summary (MCS) and physical component summary (PCS) to be unrelated, which is illustrated in two of the three included RCTs. Two RCTs showed improvements in MCS but no improvement in PCS over time. Cross-sectional analysis of the two datasets showed that employing the alternative scoring algorithm resulted in stronger negative correlations of MCS and PCS with depression, and a gradual decline in MCS with each decile of decline in PCS. Our data showed that appropriate care is needed when calculating and interpreting summary scores. The traditional scoring algorithm seems inappropriate for objectively evaluating the effects of interventions on both the MCS and the PCS. Awareness in the design and evaluation of interventions using these outcomes is warranted.

  18. 19 CFR 151.63 - Information on entry summary.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.63 Information on entry summary. Each entry summary covering wool or hair subject to duty at a rate per clean... to each lot of wool or hair covered thereby, in addition to other information required, the total...

  19. 19 CFR 151.63 - Information on entry summary.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.63 Information on entry summary. Each entry summary covering wool or hair subject to duty at a rate per clean... to each lot of wool or hair covered thereby, in addition to other information required, the total...

  20. 19 CFR 151.63 - Information on entry summary.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.63 Information on entry summary. Each entry summary covering wool or hair subject to duty at a rate per clean... to each lot of wool or hair covered thereby, in addition to other information required, the total...

  1. 19 CFR 151.63 - Information on entry summary.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.63 Information on entry summary. Each entry summary covering wool or hair subject to duty at a rate per clean... to each lot of wool or hair covered thereby, in addition to other information required, the total...

  2. 19 CFR 151.63 - Information on entry summary.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.63 Information on entry summary. Each entry summary covering wool or hair subject to duty at a rate per clean... to each lot of wool or hair covered thereby, in addition to other information required, the total...

  3. Internal pilots for a class of linear mixed models with Gaussian and compound symmetric data

    PubMed Central

    Gurka, Matthew J.; Coffey, Christopher S.; Muller, Keith E.

    2015-01-01

    SUMMARY An internal pilot design uses interim sample size analysis, without interim data analysis, to adjust the final number of observations. The approach helps to choose a sample size sufficiently large (to achieve the statistical power desired) but not too large (which would waste money and time). We report on recent research in cerebral vascular tortuosity (curvature in three dimensions), which would benefit greatly from internal pilots due to uncertainty in the parameters of the covariance matrix used for study planning. Unfortunately, observations correlated across the four regions of the brain and small sample sizes preclude the use of existing methods. However, as in a wide range of medical imaging studies, tortuosity data have no missing or mistimed data, a factorial within-subject design, the same between-subject design for all responses, and a Gaussian distribution with compound symmetry. For such restricted models, we extend exact, small-sample univariate methods for internal pilots to linear mixed models with any between-subject design (not just two groups). Planning a new tortuosity study illustrates how the new methods help to avoid sample sizes that are too small or too large while still controlling the type I error rate. PMID:17318914
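
    The core internal-pilot move can be sketched for a simple two-group comparison with the standard normal-approximation formula (the paper's exact small-sample mixed-model machinery is more involved): re-estimate the variance at the interim look, without comparing treatments, and recompute the final sample size.

```python
# Internal-pilot sketch: sample size re-estimation from an interim SD estimate.
import math
from scipy.stats import norm

def n_per_group(sd, delta, alpha=0.05, power=0.90):
    # standard normal-approximation sample size for a two-group mean comparison
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil(2 * (z * sd / delta) ** 2)

planned = n_per_group(sd=10.0, delta=5.0)       # design-stage guess of the SD
interim_sd = 13.2                               # SD re-estimated from pilot data only
revised = n_per_group(sd=interim_sd, delta=5.0)
print(planned, revised)                         # 85 -> 147 per group
```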

  4. Multi-mycotoxin stable isotope dilution LC-MS/MS method for Fusarium toxins in beer.

    PubMed

    Habler, Katharina; Gotthardt, Marina; Schüler, Jan; Rychlik, Michael

    2017-03-01

    A stable isotope dilution LC-MS/MS multi-mycotoxin method was developed for 12 different Fusarium toxins, including modified mycotoxins, in beer (deoxynivalenol-3-glucoside, deoxynivalenol, 3-acetyldeoxynivalenol, 15-acetyldeoxynivalenol, HT2-toxin, T2-toxin, enniatin B, B1, A1, A, beauvericin and zearalenone). For sample preparation and purification of the beer, a combined solid-phase extraction for trichothecenes, enniatins, beauvericin and zearalenone was developed for the first time. The validation of the new method gave satisfying results: intra-day and inter-day precision and recoveries were 1-5%, 2-8% and 72-117%, respectively. In total, 61 different organic and conventional beer samples from Germany and all over the world were analyzed using the newly developed multi-mycotoxin method. In summary, deoxynivalenol, deoxynivalenol-3-glucoside, 3-acetyldeoxynivalenol and enniatin B were quantified at rather low levels in the investigated beer samples. None of the other monitored Fusarium toxins (15-acetyldeoxynivalenol, HT2- and T2-toxin, zearalenone, enniatin B1, A1, A or beauvericin) were detectable. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Prospects of Fine-Mapping Trait-Associated Genomic Regions by Using Summary Statistics from Genome-wide Association Studies.

    PubMed

    Benner, Christian; Havulinna, Aki S; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ripatti, Samuli; Pirinen, Matti

    2017-10-05

    During the past few years, various novel statistical methods have been developed for fine-mapping with the use of summary statistics from genome-wide association studies (GWASs). Although these approaches require information about the linkage disequilibrium (LD) between variants, there has not been a comprehensive evaluation of how estimation of the LD structure from reference genotype panels performs in comparison with that from the original individual-level GWAS data. Using population genotype data from Finland and the UK Biobank, we show here that a reference panel of 1,000 individuals from the target population is adequate for a GWAS cohort of up to 10,000 individuals, whereas smaller panels, such as those from the 1000 Genomes Project, should be avoided. We also show, both theoretically and empirically, that the size of the reference panel needs to scale with the GWAS sample size; this has important consequences for the application of these methods in ongoing GWAS meta-analyses and large biobank studies. We conclude by providing software tools and by recommending practices for sharing LD information to more efficiently exploit summary statistics in genetics research. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
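
    As a rough illustration of the role of the reference panel (not the authors' software), the sketch below estimates an LD matrix from a simulated panel. The `genotypes` matrix, panel size, and variant count are invented; the point is that sampling noise in the estimated correlations shrinks roughly as 1/sqrt(panel size), which is why the panel must keep pace with the GWAS sample size.

        # Estimate pairwise variant correlations (LD) from a reference panel.
        import numpy as np

        rng = np.random.default_rng(0)
        n_ref, n_variants = 1000, 50          # e.g. a 1,000-person reference panel
        freqs = rng.uniform(0.05, 0.5, n_variants)
        # Hypothetical 0/1/2 allele-dosage matrix, individuals x variants.
        genotypes = rng.binomial(2, freqs, size=(n_ref, n_variants)).astype(float)

        R = np.corrcoef(genotypes, rowvar=False)   # LD (correlation) matrix
        print(R.shape, round(R[0, 1], 3))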

  6. Application of surface enhanced Raman scattering and competitive adaptive reweighted sampling on detecting furfural dissolved in transformer oil

    NASA Astrophysics Data System (ADS)

    Chen, Weigen; Zou, Jingxin; Wan, Fu; Fan, Zhou; Yang, Dingkun

    2018-03-01

Detecting furfural dissolved in mineral oil is an essential technical method for evaluating the ageing condition of oil-paper insulation and the degradation of its mechanical properties. Compared with traditional detection methods, Raman spectroscopy is markedly more convenient and time-saving in operation. This study explored the application of surface enhanced Raman scattering (SERS) to quantitative analysis of furfural dissolved in oil. Oil solutions with different concentrations of furfural were prepared and calibrated by high-performance liquid chromatography. Confocal laser Raman spectroscopy (CLRS) and SERS were employed to acquire Raman spectral data. Monte Carlo cross validation (MCCV) was used to eliminate outliers in the sample set, and competitive adaptive reweighted sampling (CARS) was then applied to select an optimal combination of informative variables that best reflect the chemical properties of concern. Based on the selected Raman spectral features, a support vector machine (SVM) combined with particle swarm optimization (PSO) was used to build a furfural quantitative analysis model. Finally, the generalization ability and prediction precision of the established method were verified with samples prepared in the laboratory. In summary, a new spectral method is proposed to quickly detect furfural in oil, which lays a foundation for evaluating the ageing of oil-paper insulation in oil-immersed electrical equipment.
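
    A hedged sketch of the SVM-plus-particle-swarm step on synthetic data follows; the CARS variable selection and the real Raman spectra are not reproduced, and the array shapes, PSO constants, and search ranges are all illustrative choices.

        # Tune an SVR model's (C, gamma) with a minimal particle swarm,
        # minimizing 5-fold cross-validated MSE on stand-in "spectral" data.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        X = rng.normal(size=(60, 40))                    # stand-in Raman features
        y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=60)   # stand-in furfural level

        def cv_error(log_c, log_g):
            model = SVR(C=10.0 ** log_c, gamma=10.0 ** log_g)
            return -cross_val_score(model, X, y, cv=5,
                                    scoring="neg_mean_squared_error").mean()

        n_particles, n_iter = 12, 25
        pos = rng.uniform([-1, -4], [3, 0], size=(n_particles, 2))  # (log10 C, log10 gamma)
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_err = np.array([cv_error(*p) for p in pos])
        gbest = pbest[pbest_err.argmin()].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = pos + vel
            err = np.array([cv_error(*p) for p in pos])
            improved = err < pbest_err
            pbest[improved], pbest_err[improved] = pos[improved], err[improved]
            gbest = pbest[pbest_err.argmin()].copy()
        print("best (log10 C, log10 gamma):", gbest)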

  7. 16 CFR 1610.3 - Summary of test method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Summary of test method. 1610.3 Section 1610... FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.3 Summary of test method. The Standard... surface, and held in a special apparatus at an angle of 45°. A standardized flame shall be applied to the...

  8. 16 CFR 1610.3 - Summary of test method.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Summary of test method. 1610.3 Section 1610... FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.3 Summary of test method. The Standard... surface, and held in a special apparatus at an angle of 45°. A standardized flame shall be applied to the...

  9. 16 CFR 1610.3 - Summary of test method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Summary of test method. 1610.3 Section 1610... FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.3 Summary of test method. The Standard... surface, and held in a special apparatus at an angle of 45°. A standardized flame shall be applied to the...

  10. Nutrient concentrations in Upper and Lower Echo, Fallen Leaf, Spooner, and Marlette Lakes and associated outlet streams, California and Nevada, 2002-03

    USGS Publications Warehouse

    Lico, Michael S.

    2004-01-01

Five lakes and their outlet streams in the Lake Tahoe Basin were sampled for nutrients during 2002-03. The lakes and streams sampled included Upper Echo, Lower Echo, Fallen Leaf, Spooner, and Marlette Lakes and Echo, Taylor, and Marlette Creeks. Water samples were collected to determine seasonal and spatial concentrations of dissolved nitrite plus nitrate, dissolved ammonia, total Kjeldahl nitrogen, dissolved orthophosphate, total phosphorus, and total bioreactive iron. These data will be used by the Tahoe Regional Planning Agency in revising threshold values for waters within the Lake Tahoe Basin. Standard U.S. Geological Survey methods of sample collection and analysis were used and are detailed herein. Data collected during this study and summary statistics are presented in graphical and tabular form.

  11. Summary of nutrient and biomass data from two aspen sites in western United States

    Treesearch

    Robert S. Johnston; Dale L. Bartos

    1977-01-01

    Summary tables are presented for aboveground biomass and nutrient concentrations for 20 aspen trees (Populus tremuloides Michx.) that were sampled at two study sites in Utah and Wyoming. Trees were divided into seven components - leaves, current twigs, old twigs, deadwood (branches), branches, bark, and bole wood. Samples from each component were analyzed for nitrogen...

  12. Fabrication Methods and Luminescent Properties of ZnO Materials for Light-Emitting Diodes

    PubMed Central

    Lee, Ching-Ting

    2010-01-01

Zinc oxide (ZnO) is a potential candidate material for optoelectronic applications, especially for blue to ultraviolet light emitting devices, due to its fundamental advantages, such as a direct wide band gap of 3.37 eV, a large exciton binding energy of 60 meV, and a high optical gain of 320 cm−1 at room temperature. Its luminescent properties have been intensively investigated for samples in the form of bulk, thin film, or nanostructure, prepared by various methods and doped with different impurities. In this paper, we first briefly review the recent progress in this field. Then a comprehensive summary of the research carried out in our laboratory on ZnO preparation and its luminescent properties is presented; the samples involved include ZnO films and nanorods prepared by different methods and doped with n-type or p-type impurities. The results of ZnO-based LEDs are also discussed.

  13. Estimating the sample mean and standard deviation from the sample size, median, range and/or interquartile range.

    PubMed

    Wan, Xiang; Wang, Wenqian; Liu, Jiming; Tong, Tiejun

    2014-12-19

In systematic reviews and meta-analyses, researchers often pool the results of the sample mean and standard deviation from a set of similar clinical trials. A number of trials, however, report their results using the median, the minimum and maximum values, and/or the first and third quartiles. Hence, in order to combine results, one may have to estimate the sample mean and standard deviation for such trials. In this paper, we propose to improve the existing literature in several directions. First, we show that the sample standard deviation estimation in Hozo et al.'s method (BMC Med Res Methodol 5:13, 2005) has some serious limitations and is always less satisfactory in practice. Inspired by this, we propose a new estimation method by incorporating the sample size. Second, we systematically study the sample mean and standard deviation estimation problem under several other interesting settings where the interquartile range is also available for the trials. We demonstrate the performance of the proposed methods through simulation studies for the three frequently encountered scenarios, respectively. For the first two scenarios, our method greatly improves existing methods and provides a nearly unbiased estimate of the true sample standard deviation for normal data and a slightly biased estimate for skewed data. For the third scenario, our method still performs very well for both normal data and skewed data. Furthermore, we compare the estimators of the sample mean and standard deviation under all three scenarios and present some suggestions on which scenario is preferred in real-world applications. In this paper, we discuss different approximation methods in the estimation of the sample mean and standard deviation and propose some new estimation methods to improve the existing literature. We conclude our work with a summary table (an Excel spreadsheet including all formulas) that serves as a comprehensive guide for performing meta-analysis in different situations.
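
    For orientation, the sketch below implements two estimators of the kind the paper tabulates (one for the min/median/max scenario, one using quartiles), based on the standard normal order-statistic approximation; consult the paper's summary table for the full set of formulas and their ranges of validity.

        # Estimate a trial's mean and SD from reported summary statistics.
        from scipy.stats import norm

        def mean_sd_from_range(a, m, b, n):
            """From min (a), median (m), max (b), and sample size n."""
            mean = (a + 2 * m + b) / 4
            sd = (b - a) / (2 * norm.ppf((n - 0.375) / (n + 0.25)))
            return mean, sd

        def mean_sd_from_iqr(q1, m, q3, n):
            """From first/third quartiles, median (m), and sample size n."""
            mean = (q1 + m + q3) / 3
            sd = (q3 - q1) / (2 * norm.ppf((0.75 * n - 0.125) / (n + 0.25)))
            return mean, sd

        print(mean_sd_from_range(10, 30, 70, 50))   # a trial reporting min/median/max
        print(mean_sd_from_iqr(22, 30, 41, 50))     # a trial reporting quartiles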

  14. Comparing and Combining Data across Multiple Sources via Integration of Paired-sample Data to Correct for Measurement Error

    PubMed Central

    Huang, Yunda; Huang, Ying; Moodie, Zoe; Li, Sue; Self, Steve

    2014-01-01

Summary In biomedical research such as the development of vaccines for infectious diseases or cancer, measures from the same assay are often collected from multiple sources or laboratories. Measurement error that may vary between laboratories needs to be adjusted for when combining samples across laboratories. We incorporate such adjustment in comparing and combining independent samples from different labs via integration of external data, collected on paired samples from the same two laboratories. We propose: 1) normalization of individual-level data from two laboratories to the same scale via the expectation of true measurements conditioning on the observed; 2) comparison of mean assay values between two independent samples in the Main study accounting for inter-source measurement error; and 3) sample size calculations for the paired-sample study so that hypothesis testing error rates are appropriately controlled in the Main study comparison. Because the goal is not to estimate the true underlying measurements but to combine data on the same scale, our proposed methods do not require that the true values of the error-prone measurements be known in the external data. Simulation results under a variety of scenarios demonstrate satisfactory finite sample performance of our proposed methods when measurement errors vary. We illustrate our methods using real ELISpot assay data generated by two HIV vaccine laboratories. PMID:22764070
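
    A minimal sketch of the normalization idea, under a simple linear measurement-error model: paired external samples measured by both laboratories are used to fit the conditional expectation of Lab A's reading given Lab B's, which then maps independent Lab B measurements onto Lab A's scale. All data and coefficients below are invented.

        # Calibrate Lab B readings to Lab A's scale using paired external data.
        import numpy as np

        rng = np.random.default_rng(3)
        true = rng.normal(5, 1, 200)                        # unobserved true values
        lab_a = true + rng.normal(0, 0.2, 200)              # paired external data, Lab A
        lab_b = 0.8 * true + 1.5 + rng.normal(0, 0.3, 200)  # Lab B: scaled/shifted + error

        # Fit E[Lab A reading | Lab B reading] on the paired external data ...
        slope, intercept = np.polyfit(lab_b, lab_a, 1)

        # ... then apply it to new, independent Lab B measurements in the Main study.
        new_lab_b = 0.8 * rng.normal(5.4, 1, 50) + 1.5 + rng.normal(0, 0.3, 50)
        normalized = slope * new_lab_b + intercept
        print(round(normalized.mean(), 3))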

  15. Gravimetric Analysis of Particulate Matter using Air Samplers Housing Internal Filtration Capsules

    PubMed Central

    O'Connor, Sean; O'Connor, Paula Fey; Feng, H. Amy

    2015-01-01

Summary An evaluation was carried out to investigate the suitability of polyvinyl chloride (PVC) internal capsules, housed within air sampling devices, for gravimetric analysis of airborne particles collected in workplaces. Experiments were carried out using blank PVC capsules and PVC capsules spiked with 0.1–4 mg of National Institute of Standards and Technology Standard Reference Material® (NIST SRM) 1648 (Urban Particulate Matter) and Arizona Road Dust (Air Cleaner Test Dust). The capsules were housed within plastic closed-face cassette samplers (CFCs). A method detection limit (MDL) of 0.075 mg per sample was estimated. Precision (Sr) at 0.5–4 mg per sample was 0.031 and the estimated bias was 0.058. Weight stability over 28 days was verified for both blanks and spiked capsules. Independent laboratory testing on blanks and field samples verified long-term weight stability as well as sampling and analysis precision and bias estimates. An overall precision estimate (Ŝrt) of 0.059 was obtained. An accuracy measure of ±15.5% was found for the gravimetric method using PVC internal capsules. PMID:26435581

  16. 16 CFR 1610.3 - Summary of test method.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Summary of test method. 1610.3 Section 1610... STANDARD FOR THE FLAMMABILITY OF CLOTHING TEXTILES The Standard § 1610.3 Summary of test method. The... surface, and held in a special apparatus at an angle of 45°. A standardized flame shall be applied to the...

  17. On summary measure analysis of linear trend repeated measures data: performance comparison with two competing methods.

    PubMed

    Vossoughi, Mehrdad; Ayatollahi, S M T; Towhidi, Mina; Ketabchi, Farzaneh

    2012-03-22

The summary measure approach (SMA) is sometimes the only applicable tool for the analysis of repeated measurements in medical research, especially when the number of measurements is relatively large. This study aimed to describe techniques based on summary measures for the analysis of linear trend repeated measures data and then to compare the performance of SMA, the linear mixed model (LMM), and the unstructured multivariate approach (UMA). Practical guidelines based on the least squares regression slope and mean of response over time for each subject were provided to test time, group, and interaction effects. Through Monte Carlo simulation studies, the efficacy of SMA vs. LMM and traditional UMA, under different types of covariance structures, was illustrated. All the methods were also employed to analyze two real data examples. Based on the simulation and example results, it was found that the SMA completely dominated the traditional UMA and performed convincingly close to the best-fitting LMM in testing all the effects. However, the LMM was not often robust and led to non-sensible results when the covariance structure for errors was misspecified. The results emphasized discarding the UMA, which often yielded extremely conservative inferences for such data. It was shown that the summary measure is a simple, safe and powerful approach in which the loss of efficiency compared to the best-fitting LMM was generally negligible. The SMA is recommended as the first choice to reliably analyze linear trend data with a moderate to large number of measurements and/or small to moderate sample sizes.
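
    The core of the SMA for linear-trend data fits in a few lines: reduce each subject's repeated measures to a least-squares slope, then compare the slopes between groups with an ordinary two-sample test. The sketch below uses simulated data with illustrative group sizes and effect sizes.

        # Summary-measure approach: per-subject slopes, then a two-sample t-test.
        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(4)
        times = np.arange(6)                       # 6 repeated measurements

        def simulate(n, slope):
            return slope * times + rng.normal(0, 1, size=(n, len(times)))

        group1 = simulate(15, 0.5)                 # e.g. treatment: steeper trend
        group2 = simulate(15, 0.2)                 # e.g. control

        # Per-subject summary measure: the regression slope over time.
        slopes1 = np.polyfit(times, group1.T, 1)[0]
        slopes2 = np.polyfit(times, group2.T, 1)[0]
        print(ttest_ind(slopes1, slopes2))         # tests the group-by-time interaction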

  18. Tracking the release of IPCC AR5 on Twitter: Users, comments, and sources following the release of the Working Group I Summary for Policymakers.

    PubMed

    Newman, Todd P

    2017-10-01

Using the immediate release of the Working Group I Summary for Policymakers of the Intergovernmental Panel on Climate Change Fifth Assessment Report as a case study, this article seeks to describe what types of actors were most active, the substance of the most propagated tweets, and the media sources that attracted the most attention during the summary release. The results from the study suggest that non-elite actors, such as individual bloggers and concerned citizens, accounted for the majority of the most propagated tweets in the sample. This study also finds that the majority of the most propagated tweets in the sample focused on public understanding of the report. Finally, while mainstream media sources were the most frequently discussed media sources, a number of new media and science news and information sources compete for audience attention.

  19. A guide to the proper selection and use of federally approved sediment and water-quality samplers

    USGS Publications Warehouse

    Davis, Broderick E.; ,

    2005-01-01

As interest in the health of rivers and streams increases, and new water-quality regulations are promulgated, interest in sediment and water-quality sampling equipment and technologies has increased. While much information on the subject exists, a comprehensive summary document of sediment sampling equipment and technology is lacking. This report seeks to provide such a summary.

  20. Final report: survey and removal of radioactive surface contamination at environmental restoration sites, Sandia National Laboratories/New Mexico. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, K.A.; Mitchell, M.M.; Jean, D.

    1997-09-01

    This report contains the Appendices A-L including Voluntary Corrective Measure Plans, Waste Management Plans, Task-Specific Health and Safety Plan, Analytical Laboratory Procedures, Soil Sample Results, In-Situ Gamma Spectroscopy Results, Radionuclide Activity Summary, TCLP Soil Sample Results, Waste Characterization Memoranda, Waste Drum Inventory Data, Radiological Risk Assessment, and Summary of Site-Specific Recommendations.

  1. A Method for Extracting Important Segments from Documents Using Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Suzuki, Daisuke; Utsumi, Akira

    In this paper we propose an extraction-based method for automatic summarization. The proposed method consists of two processes: important segment extraction and sentence compaction. The process of important segment extraction classifies each segment in a document as important or not by Support Vector Machines (SVMs). The process of sentence compaction then determines grammatically appropriate portions of a sentence for a summary according to its dependency structure and the classification result by SVMs. To test the performance of our method, we conducted an evaluation experiment using the Text Summarization Challenge (TSC-1) corpus of human-prepared summaries. The result was that our method achieved better performance than a segment-extraction-only method and the Lead method, especially for sentences only a part of which was included in human summaries. Further analysis of the experimental results suggests that a hybrid method that integrates sentence extraction with segment extraction may generate better summaries.
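
    As a toy sketch of the segment-classification step (not the TSC-1 setup or the authors' feature set), the snippet below represents each segment with bag-of-words features and trains a linear SVM to label it important or not; the segments and labels are invented placeholders.

        # Classify document segments as important/unimportant with an SVM.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline

        segments = ["the committee approved the budget",
                    "in other news",
                    "the budget increases research funding",
                    "as mentioned earlier"]
        important = [1, 0, 1, 0]                  # hypothetical human annotations

        clf = make_pipeline(TfidfVectorizer(), LinearSVC())
        clf.fit(segments, important)
        print(clf.predict(["the committee increases funding"]))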

  2. Production of plutonium, yttrium and strontium tracers for using in environmental research

    NASA Astrophysics Data System (ADS)

    Arzumanov, A.; Batischev, V.; Berdinova, N.; Borissenko, A.; Chumikov, G.; Lukashenko, S.; Lysukhin, S.; Popov, Yu.; Sychikov, G.

    2001-12-01

A summary is given of cyclotron methods for producing the 237Pu (45.2 d), 88Y (106.65 d) and 85Sr (64.84 d) tracers via nuclear reactions with protons and alphas on 235U, 88Sr and 85Rb targets over a wide energy range. Chemical methods for separating and purifying the tracers from the irradiated uranium, strontium and rubidium targets are described. The tracers were used for the determination of Pu (239-240), Sr-90 and Am-241 in samples (soil, plants, underground waters) from the Semipalatinsk Test Site. The results obtained are discussed.

  3. QUANTIFYING ALTERNATIVE SPLICING FROM PAIRED-END RNA-SEQUENCING DATA.

    PubMed

    Rossell, David; Stephan-Otto Attolini, Camille; Kroiss, Manuel; Stöcker, Almond

    2014-03-01

RNA-sequencing has revolutionized biomedical research and, in particular, our ability to study gene alternative splicing. The problem has important implications for human health, as alternative splicing may be involved in malfunctions at the cellular level and in multiple diseases. However, the high-dimensional nature of the data and the existence of experimental biases pose serious data analysis challenges. We find that the standard data summaries used to study alternative splicing are severely limited, as they ignore a substantial amount of valuable information. Current data analysis methods are based on such summaries and are hence sub-optimal. Further, they have limited flexibility in accounting for technical biases. We propose novel data summaries and a Bayesian modeling framework that overcome these limitations and determine biases in a non-parametric, highly flexible manner. These summaries adapt naturally to the rapid improvements in sequencing technology. We provide efficient point estimates and uncertainty assessments. The approach allows the study of alternative splicing patterns for individual samples and can also serve as the basis for downstream analyses. We found a several-fold improvement in estimation mean square error compared with popular approaches in simulations, and substantially higher consistency between replicates in experimental data. Our findings indicate the need to adjust the routine summarization and analysis of alternative-splicing RNA-seq studies. We provide a software implementation in the R package casper.

  4. Statistical grand rounds: a review of analysis and sample size calculation considerations for Wilcoxon tests.

    PubMed

    Divine, George; Norton, H James; Hunt, Ronald; Dienemann, Jacqueline

    2013-09-01

    When a study uses an ordinal outcome measure with unknown differences in the anchors and a small range such as 4 or 7, use of the Wilcoxon rank sum test or the Wilcoxon signed rank test may be most appropriate. However, because nonparametric methods are at best indirect functions of standard measures of location such as means or medians, the choice of the most appropriate summary measure can be difficult. The issues underlying use of these tests are discussed. The Wilcoxon-Mann-Whitney odds directly reflects the quantity that the rank sum procedure actually tests, and thus it can be a superior summary measure. Unlike the means and medians, its value will have a one-to-one correspondence with the Wilcoxon rank sum test result. The companion article appearing in this issue of Anesthesia & Analgesia ("Aromatherapy as Treatment for Postoperative Nausea: A Randomized Trial") illustrates these issues and provides an example of a situation for which the medians imply no difference between 2 groups, even though the groups are, in fact, quite different. The trial cited also provides an example of a single sample that has a median of zero, yet there is a substantial shift for much of the nonzero data, and the Wilcoxon signed rank test is quite significant. These examples highlight the potential discordance between medians and Wilcoxon test results. Along with the issues surrounding the choice of a summary measure, there are considerations for the computation of sample size and power, confidence intervals, and multiple comparison adjustment. In addition, despite the increased robustness of the Wilcoxon procedures relative to parametric tests, some circumstances in which the Wilcoxon tests may perform poorly are noted, along with alternative versions of the procedures that correct for such limitations. 
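
    A short sketch of the summary measure in question: theta = P(X > Y) + 0.5 P(X = Y) is estimated from the Mann-Whitney U statistic, and the Wilcoxon-Mann-Whitney odds are theta/(1 - theta). The ordinal scores below are simulated placeholders.

        # Wilcoxon-Mann-Whitney odds from two samples of ordinal scores.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(5)
        x = rng.integers(0, 5, 30)                # ordinal 0-4 scores, group X
        y = rng.integers(1, 6, 30)                # group Y, shifted upward

        u, p = mannwhitneyu(x, y, alternative="two-sided")
        theta = u / (len(x) * len(y))             # estimates P(X > Y) + 0.5 P(X = Y)
        wmw_odds = theta / (1 - theta)
        print(f"theta = {theta:.3f}, WMW odds = {wmw_odds:.3f}, p = {p:.3f}")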

  5. Conceptual models in exploration geochemistry-The Basin and Range Province of the Western United States and Northern Mexico

    USGS Publications Warehouse

    Lovering, T.G.; McCarthy, J.H.

    1978-01-01

This summary of geochemical exploration in the Basin and Range Province is another in the series of reviews of geochemical-exploration applications covering a large region; this series began in 1975 with a summary for the Canadian Cordillera and Canadian Shield, and was followed in 1976 by a similar summary for Scandinavia (Norden). Rather than adhering strictly to the type of conceptual models applied in those papers, we have made use of generalized landscape geochemistry models related to the nature of concealment of ore deposits. This study is part of a continuing effort to examine and evaluate geochemical-exploration practices in different areas of the world. Twenty case histories of the application of geochemical exploration in both district and regional settings illustrate recent developments in techniques and approaches. Along with other published reports these case histories, exemplifying generalized models of concealed deposits, provide data used to evaluate geochemical-exploration programs and specific sample media. Because blind deposits are increasingly sought in the Basin and Range Province, the use of new sample media or anomaly-enhancement techniques is a necessity. Analysis of vapors or gases emanating from blind deposits is a promising new technique. Certain fractions of stream sediments show anomalies that are weak or not detected in conventional minus 80-mesh fractions. Multi-element analysis of mineralized bedrock may show zoning patterns that indicate depth or direction of ore. Examples of the application of these and other, more conventional methods are indicated in the case histories. The final section of this paper contains a brief evaluation of the applications of all types of sample media to geochemical exploration in the arid environment of the Basin and Range Province. © 1978.

  6. Quality of Life for Saudi Patients With Heart Failure: A Cross-Sectional Correlational Study

    PubMed Central

    AbuRuz, Mohannad Eid; Alaloul, Fawwaz; Saifan, Ahmed; Masa’Deh, Rami; Abusalem, Said

    2016-01-01

Introduction: Heart failure is a major public health issue and a growing concern in developing countries, including Saudi Arabia. Most related research was conducted in Western cultures and may have limited applicability for individuals in Saudi Arabia. Thus, this study assesses the quality of life of Saudi patients with heart failure. Materials and Methods: A cross-sectional correlational design was used on a convenience sample of 103 patients with heart failure. Data were collected using the Short Form-36 and the Medical Outcomes Study-Social Support Survey. Results: Overall, the patients’ scores were low for all domains of Quality of Life. The Physical Component Summary and Mental Component Summary mean scores (±SD) were 36.7±12.4 and 48.8±6.5, respectively, indicating poor Quality of Life. Left ventricular ejection fraction was the strongest predictor of both physical and mental summaries. Conclusion: Identifying factors that impact quality of life for Saudi heart failure patients is important in identifying and meeting their physical and psychosocial needs. PMID:26493415

  7. METHOD 415.3 - MEASUREMENT OF TOTAL ORGANIC ...

    EPA Pesticide Factsheets

2.0 SUMMARY OF METHOD. 2.1 In both TOC and DOC determinations, organic carbon in the water sample is oxidized to form carbon dioxide (CO2), which is then measured by a detection system. There are two different approaches for the oxidation of organic carbon in water samples to carbon dioxide gas: (a) combustion in an oxidizing gas and (b) UV-promoted or heat-catalyzed chemical oxidation with a persulfate solution. Carbon dioxide, which is released from the oxidized sample, is detected by a conductivity detector or by a nondispersive infrared (NDIR) detector. Instruments using any combination of the above technologies may be used in this method. 2.2 Settleable solids and floating matter may cause plugging of valves, tubing, and the injection needle port. The TOC procedure allows the removal of settleable solids and floating matter. The suspended matter is considered part of the sample. The resulting water sample is then considered a close approximation of the original whole water sample for the purpose of TOC measurement. 2.3 The DOC procedure requires that the sample be passed through a 0.45 µm filter prior to analysis. 2.4 The TOC and DOC procedures require that all inorganic carbon be removed from the sample before the sample is analyzed for organic carbon content. If the inorganic carbon (IC) is not completely removed, significant error will occur. The inorganic carbon interference is removed by converting the mineralized IC to CO2 by acidification and...

  8. Quality-assurance procedures: Method 5G determination of particulate emissions from wood heaters from a dilution tunnel sampling location

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, T.E.; Hartman, M.W.; Olin, R.C.

    1989-06-01

Quality-assurance procedures are contained in this comprehensive document intended to be used as an aid for wood-heater manufacturers and testing laboratories in performing particulate matter sampling of wood heaters according to EPA protocol, Method 5G. These procedures may be used in research and development, and as an aid in auditing and certification testing. A detailed, step-by-step quality assurance guide is provided to aid in the procurement and assembly of testing apparatus, to clearly describe the procedures, and to facilitate data collection and reporting. Suggested data sheets are supplied that can be used as an aid for both recordkeeping and certification applications. Throughout the document, activity matrices are provided to serve as a summary reference. Checklists are also supplied that can be used by testing personnel. Finally, for the purposes of ensuring data quality, procedures are outlined for apparatus operation, maintenance, and traceability. These procedures combined with the detailed description of the sampling and analysis protocol will help ensure the accuracy and reliability of Method 5G emission-testing results.

  9. DISSCO: direct imputation of summary statistics allowing covariates

    PubMed Central

    Xu, Zheng; Duan, Qing; Yan, Song; Chen, Wei; Li, Mingyao; Lange, Ethan; Li, Yun

    2015-01-01

    Background: Imputation of individual level genotypes at untyped markers using an external reference panel of genotyped or sequenced individuals has become standard practice in genetic association studies. Direct imputation of summary statistics can also be valuable, for example in meta-analyses where individual level genotype data are not available. Two methods (DIST and ImpG-Summary/LD), that assume a multivariate Gaussian distribution for the association summary statistics, have been proposed for imputing association summary statistics. However, both methods assume that the correlations between association summary statistics are the same as the correlations between the corresponding genotypes. This assumption can be violated in the presence of confounding covariates. Methods: We analytically show that in the absence of covariates, correlation among association summary statistics is indeed the same as that among the corresponding genotypes, thus serving as a theoretical justification for the recently proposed methods. We continue to prove that in the presence of covariates, correlation among association summary statistics becomes the partial correlation of the corresponding genotypes controlling for covariates. We therefore develop direct imputation of summary statistics allowing covariates (DISSCO). Results: We consider two real-life scenarios where the correlation and partial correlation likely make practical difference: (i) association studies in admixed populations; (ii) association studies in presence of other confounding covariate(s). Application of DISSCO to real datasets under both scenarios shows at least comparable, if not better, performance compared with existing correlation-based methods, particularly for lower frequency variants. For example, DISSCO can reduce the absolute deviation from the truth by 3.9–15.2% for variants with minor allele frequency <5%. Availability and implementation: http://www.unc.edu/∼yunmli/DISSCO. Contact: yunli@med.unc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25810429
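
    The paper's central observation can be checked numerically: with a shared confounding covariate, the raw genotype correlation is inflated, while the partial correlation (computed here by correlating regression residuals) recovers the conditional relationship. The toy data below assume a single continuous covariate.

        # Partial correlation of two "genotypes" given a confounding covariate.
        import numpy as np

        rng = np.random.default_rng(6)
        covariate = rng.normal(size=2000)               # e.g. an ancestry component
        g1 = 0.6 * covariate + rng.normal(size=2000)    # two variants, both
        g2 = 0.6 * covariate + rng.normal(size=2000)    # confounded by it

        def residualize(v, c):
            design = np.column_stack([np.ones_like(c), c])
            beta, *_ = np.linalg.lstsq(design, v, rcond=None)
            return v - design @ beta

        raw = np.corrcoef(g1, g2)[0, 1]
        partial = np.corrcoef(residualize(g1, covariate),
                              residualize(g2, covariate))[0, 1]
        print(f"raw r = {raw:.3f}, partial r = {partial:.3f}")  # partial is near 0 here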

  10. Summary of selected U.S. Geological survey data on domestic well water quality for the Centers for Disease Control's National Environmental Public Health Tracking Program

    USGS Publications Warehouse

    Bartholomay, Roy C.; Carter, Janet M.; Qi, Sharon L.; Squillace, Paul J.; Rowe, Gary L.

    2007-01-01

About 10 to 30 percent of the population in most States uses domestic (private) water supply. In many States, the total number of people served by domestic supplies can be in the millions. The water quality of domestic supplies is inconsistently regulated and generally not well characterized. The U.S. Geological Survey (USGS) has two water-quality data sets in the National Water Information System (NWIS) database that can be used to help define the water quality of domestic-water supplies: (1) data from the National Water-Quality Assessment (NAWQA) Program, and (2) USGS State data. Data from domestic wells from the NAWQA Program were collected to meet one of the Program's objectives, which was to define the water quality of major aquifers in the United States. These domestic wells were located primarily in rural areas. Water-quality conditions in these major aquifers as defined by the NAWQA data can be compared because of the consistency of the NAWQA sampling design, sampling protocols, and water-quality analyses. The NWIS database is a repository of USGS water data collected for a variety of projects; consequently, project objectives and analytical methods vary. This variability can bias statistical summaries of contaminant occurrence and concentrations; nevertheless, these data can be used to define the geographic distribution of contaminants. Maps created using NAWQA and USGS State data in NWIS can show geographic areas where contaminant concentrations may be of potential human-health concern by showing concentrations relative to human-health water-quality benchmarks. On the basis of national summaries of detection frequencies and concentrations relative to U.S. Environmental Protection Agency (USEPA) human-health benchmarks for trace elements, pesticides, and volatile organic compounds, 28 water-quality constituents were identified as contaminants of potential human-health concern. From this list, 11 contaminants were selected for summarization of water-quality data in 16 States (grantee States) that were funded by the Environmental Public Health Tracking (EPHT) Program of the Centers for Disease Control and Prevention (CDC). Only data from domestic-water supplies were used in this summary because samples from these wells are most relevant to human exposure for the targeted population. Using NAWQA data, the concentrations of the 11 contaminants were compared to USEPA human-health benchmarks. Using NAWQA and USGS State data in NWIS, the geographic distributions of the contaminants were mapped for the 16 grantee States. Radon, arsenic, manganese, nitrate, strontium, and uranium had the largest percentages of samples with concentrations greater than their human-health benchmarks. In contrast, organic compounds (pesticides and volatile organic compounds) had the lowest percentages of samples with concentrations greater than human-health benchmarks. Results of data retrievals and spatial analysis were compiled for each of the 16 States and are presented in State summaries for each State. Example summary tables, graphs, and maps based on USGS data for New Jersey are presented to illustrate how USGS water-quality and associated ancillary geospatial data can be used by the CDC to address goals and objectives of the EPHT Program.

  11. The Comprehensive Longitudinal Evaluation of the Milwaukee Parental Choice Program: Summary of Final Reports. SCDP Milwaukee Evaluation Report #36

    ERIC Educational Resources Information Center

    Wolf, Patrick J.

    2012-01-01

    This report contains a summary of the findings from the various topical reports that comprise the author's comprehensive longitudinal study. As a summary, it does not include extensive details regarding the study samples and scientific methodologies employed in those topical studies. The research revealed a pattern of school choice results that…

  12. Controlled microaspiration for high-pressure freezing: a new method for ultrastructural preservation of fragile and sparse tissues for TEM and electron tomography

    PubMed Central

    Triffo, W. J.; Palsdottir, H.; McDonald, K. L.; Lee, J. K.; Inman, J. L.; Bissell, M. J.; Raphael, R. M.; Auer, M.

    2009-01-01

    Summary High-pressure freezing is the preferred method to prepare thick biological specimens for ultrastructural studies. However, the advantages obtained by this method often prove unattainable for samples that are difficult to handle during the freezing and substitution protocols. Delicate and sparse samples are difficult to manipulate and maintain intact throughout the sequence of freezing, infiltration, embedding and final orientation for sectioning and subsequent transmission electron microscopy. An established approach to surmount these difficulties is the use of cellulose microdialysis tubing to transport the sample. With an inner diameter of 200 µm, the tubing protects small and fragile samples within the thickness constraints of high-pressure freezing, and the tube ends can be sealed to avoid loss of sample. Importantly, the transparency of the tubing allows optical study of the specimen at different steps in the process. Here, we describe the use of a micromanipulator and microinjection apparatus to handle and position delicate specimens within the tubing. We report two biologically significant examples that benefit from this approach, 3D cultures of mammary epithelial cells and cochlear outer hair cells. We illustrate the potential for correlative light and electron microscopy as well as electron tomography. PMID:18445158

  13. Identification of the Properties of Gum Arabic Used as a Binder in 7.62-mm Ammunition Primers

    DTIC Science & Technology

    2010-06-01

[No abstract available; the indexed text consists of table-of-contents and list-of-figures fragments, covering LCC and ballistic testing (ATK Tasks 700 and 800), elemental analysis, moisture loss and friability, SDT summaries for the Hummel and Quadra samples, particle size analysis of gum arabic samples, SEM imaging of Colony gum arabic, and color and gel analyses of the Colony, Hummel, and Brenntag samples.]

  14. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    PubMed

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real life examples.
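
    As one generic illustration of the CI constructions compared in such studies (not a specific method endorsed by the paper), the sketch below computes the Mann-Whitney estimate of the AUC and a nonparametric percentile-bootstrap interval on a deliberately small sample.

        # AUC point estimate and percentile-bootstrap CI on a small sample.
        import numpy as np

        rng = np.random.default_rng(7)
        diseased = rng.normal(1.0, 1.0, 15)       # marker values, cases
        healthy = rng.normal(0.0, 1.0, 15)        # marker values, controls

        def auc(pos, neg):
            # Mann-Whitney estimate of P(case marker > control marker), ties half.
            diff = pos[:, None] - neg[None, :]
            return (diff > 0).mean() + 0.5 * (diff == 0).mean()

        boots = []
        for _ in range(2000):
            b_pos = rng.choice(diseased, size=diseased.size, replace=True)
            b_neg = rng.choice(healthy, size=healthy.size, replace=True)
            boots.append(auc(b_pos, b_neg))
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"AUC = {auc(diseased, healthy):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")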

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogel, R.E.

This document is in two parts: the first is the data package entitled "Concrete Samples for Organic Samples" and the second is entitled "Concrete Samples for Organic Samples -- Addendum 1A," which is the 222-S validation summary report.

  16. Molecular Testing for Clinical Diagnosis and Epidemiological Investigations of Intestinal Parasitic Infections

    PubMed Central

    Stensvold, C. Rune

    2014-01-01

    SUMMARY Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies. PMID:24696439

  17. Flight summaries and temperature climatology at airliner cruise altitudes from GASP (Global Atmospheric Sampling Program) data

    NASA Technical Reports Server (NTRS)

    Nastrom, G. D.; Jasperson, W. H.

    1983-01-01

Temperature data obtained by the Global Atmospheric Sampling Program (GASP) during the period March 1975 to July 1979 are compiled to form flight summaries of static air temperature and a geographic temperature climatology. The flight summaries include the height and location of the coldest observed temperature and the mean flight level, temperature, and the standard deviation of temperature for each flight as well as for flight segments. These summaries are ordered by route and month. The temperature climatology was computed for all statistically independent temperature data for each flight. The grid used consists of 5 deg latitude, 30 deg longitude and 2000 feet vertical resolution from FL270 to FL430 for each month of the year. The number of statistically independent observations, their mean, standard deviation and the empirical 98, 50, 16, 2, and 0.3 probability percentiles are presented.

  18. Atmospheric Fluoroform (CHF3, HFC-23) at Cape Grim, Tasmania (1978-1995)

    DOE Data Explorer

    Oram, D. E. [University of East Anglia, Norwich, United Kingdom; Sturges, W. T. [University of East Anglia, Norwich, United Kingdom; Penkett, S. A. [University of East Anglia, Norwich, United Kingdom; McCulloch, A. [ICI Chemicals and Polymers, Ltd., Cheshire, United Kingdom; Fraser, P. J. [CRC for Southern Hemisphere Meteorology, Victoria, Australia

    2000-10-01

The sampling and analytical methods are described more fully in Oram et al. (1998). In summary, air samples were taken from the archive of Cape Grim, Tasmania (41°S, 145°E) air samples collected from 1978 through 1995. Comparisons of CFC-11, CFC-12, CFC-113, CH3CCl3, and CH4 data between archive samples and corresponding in-situ samples for the same dates confirm that the archive samples are both representative and stable over time. Samples were analyzed by gas chromatography-mass spectrometry (GC-MS), using a KCl-passivated alumina PLOT column. Fluoroform was monitored on mass 69 (CF3+). The analytical precision (one standard deviation of the mean) for two or three replicate analyses was typically ± 1% of the mean measured value. The overall uncertainty of the observed data is ± 10%, taking into account uncertainties in the preparation of the primary standards, the purity of the fluoroform used to make the primary standards, as well as the analytical precision.

  19. Approximate Bayesian Computation Using Markov Chain Monte Carlo Simulation: Theory, Concepts, and Applications

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Vrugt, J. A.

    2013-12-01

    The ever increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding has stimulated the development of increasingly complex hydrologic models that simulate soil moisture flow, groundwater recharge, surface runoff, root water uptake, and river discharge at increasingly finer spatial and temporal scales. Reconciling these system models with field and remote sensing data is a difficult task, particularly because average measures of model/data similarity inherently lack the power to provide a meaningful comparative evaluation of the consistency in model form and function. The very construction of the likelihood function - as a summary variable of the (usually averaged) properties of the error residuals - dilutes and mixes the available information into an index having little remaining correspondence to specific behaviors of the system (Gupta et al., 2008). The quest for a more powerful method for model evaluation has inspired Vrugt and Sadegh [2013] to introduce "likelihood-free" inference as vehicle for diagnostic model evaluation. This class of methods is also referred to as Approximate Bayesian Computation (ABC) and relaxes the need for an explicit likelihood function in favor of one or multiple different summary statistics rooted in hydrologic theory that together have a much stronger and compelling diagnostic power than some aggregated measure of the size of the error residuals. Here, we will introduce an efficient ABC sampling method that is orders of magnitude faster in exploring the posterior parameter distribution than commonly used rejection and Population Monte Carlo (PMC) samplers. Our methodology uses Markov Chain Monte Carlo simulation with DREAM, and takes advantage of a simple computational trick to resolve discontinuity problems with the application of set-theoretic summary statistics. We will also demonstrate a set of summary statistics that are rather insensitive to errors in the forcing data. This enhances prospects of detecting model structural deficiencies.
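
    For readers new to ABC, here is a sketch of the basic rejection mechanics that the proposed MCMC sampler accelerates: draw a parameter from the prior, simulate data, and keep the draw when chosen summary statistics fall within a tolerance of the observed ones. The model, summaries, and tolerance below are illustrative; this is not the DREAM-based algorithm itself.

        # Rejection ABC on a toy model with mean/std as summary statistics.
        import numpy as np

        rng = np.random.default_rng(8)
        observed = rng.normal(2.0, 1.0, 100)                  # stand-in "field" data
        obs_summary = np.array([observed.mean(), observed.std()])

        def simulate(mu):
            return rng.normal(mu, 1.0, 100)

        accepted = []
        for _ in range(20000):
            mu = rng.uniform(-5, 5)                           # draw from the prior
            sim = simulate(mu)
            sim_summary = np.array([sim.mean(), sim.std()])
            if np.linalg.norm(sim_summary - obs_summary) < 0.2:   # tolerance epsilon
                accepted.append(mu)
        print(len(accepted), np.mean(accepted))               # approximate posterior draws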

  20. POTENTIALLY PATHOGENIC FREE-LIVING AMOEBAE IN SOME FLOOD-AFFECTED AREAS DURING 2011 CHIANG MAI FLOOD

    PubMed Central

    Wannasan, Anchalee; Uparanukraw, Pichart; Songsangchun, Apichart; Morakote, Nimit

    2013-01-01

SUMMARY The survey was carried out to investigate the presence of potentially pathogenic free-living amoebae (FLA) during the flood in Chiang Mai, Thailand in 2011. From different crisis flood areas, seven water samples were collected and tested for the presence of amoebae using culture and molecular methods. By monoxenic culture, FLA were detected in all samples at 37 °C incubation. The FLA growing at 37 °C were morphologically identified as Acanthamoeba spp., Naegleria spp. and some unidentified amoebae. Only three samples (42.8%), defined as thermotolerant FLA, continued to grow at 42 °C. By molecular methods, two non-thermotolerant FLA were shown to have 99% identity to Acanthamoeba sp. and 98% identity to Hartmannella vermiformis, while the two thermotolerant FLA were identified as Echinamoeba exundans (100% identity) and Hartmannella sp. (99% identity). This first report of the occurrence of FLA in water during the flood disaster provides information to make the public aware of potentially pathogenic FLA. PMID:24213194

  1. A step forward in the study of the electroerosion by optical methods

    NASA Astrophysics Data System (ADS)

    Aparicio, R.; Gale, M. F. Ruiz; Hogert, E. N.; Landau, M. R.; Gaggioli, y. N. G.

    2003-05-01

This work develops two theoretical models of surfaces to explain the behavior of light scattered by samples that undergo some alteration. In the first model, the mean intensity scattered by the sample is evaluated, analyzing the different curves obtained as a function of the eroded-to-total surface ratio. The theoretical results are compared with those obtained experimentally, showing a strong relation between the electroerosion level and the light scattered by the sample. The second model analyzes a surface with random changes in its roughness. A translucent surface whose roughness changes in a controlled way is studied; the variation of the correlation coefficient as a function of the roughness variation is then determined by the transmission speckle correlation method. The experimental values obtained are compared with those predicted by this model. In summary, it is shown that the first- and second-order statistical properties of the light transmitted or reflected by a sample with variable topography can be used as parameters to analyze these morphologic changes.

  2. Correlation of the Summary Method with Learning Styles

    ERIC Educational Resources Information Center

    Sarikcioglu, Levent; Senol, Yesim; Yildirim, Fatos B.; Hizay, Arzu

    2011-01-01

    The summary is the last part of the lesson but one of the most important. We aimed to study the relationship between the preference of the summary method (video demonstration, question-answer, or brief review of slides) and learning styles. A total of 131 students were included in the present study. An inventory was prepared to understand the…

  3. A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.

    PubMed

    Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut

    2017-08-01

Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Even today, their quantification and taxonomic classification pose several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods: the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single cell survey called aliquot PCR (aPCR). All these methods have been tested using either aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique, detection of living cells to the morphospecies level is possible. Fixation of cells and staining methods are advantageous because they allow long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools may compensate for LAM's inability to detect non-cultivable flagellates. In summary, we propose a combination of several investigation techniques to reduce the impact of the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.

  4. Linked Micromaps: Statistical Summaries in a Spatial Context

    EPA Science Inventory

    Communicating summaries of spatial data to decision makers and the public is challenging. We present a graphical method that provides both a geographic context and a statistical summary for such spatial data. Monitoring programs have a need for such geographical summaries. For ...

  5. Ab initio quantum direct dynamics simulations of ultrafast photochemistry with Multiconfigurational Ehrenfest approach

    NASA Astrophysics Data System (ADS)

    Makhov, Dmitry V.; Symonds, Christopher; Fernandez-Alberti, Sebastian; Shalashilin, Dmitrii V.

    2017-08-01

The Multiconfigurational Ehrenfest (MCE) method is a quantum dynamics technique which allows treatment of a large number of quantum nuclear degrees of freedom. This paper presents a review of MCE and its recent applications, providing a summary of the formalisms, including its ab initio direct dynamics versions, and a summary of recent results. Firstly, we describe the Multiconfigurational Ehrenfest version 2 (MCEv2) method and its applicability to direct dynamics, and report new calculations which show that the approach converges to the exact result in model systems with tens of degrees of freedom. Secondly, we review previous "on the fly" ab initio Multiple Cloning (AIMC-MCE) dynamics results obtained for systems of a similar size, in which the calculations treat every electron and every nucleus of a polyatomic molecule on a fully quantum basis. We also review the Time Dependent Diabatic Basis (TDDB) version of the technique and give an example of its application. We summarise the details of the sampling techniques and interpolations used for calculation of the matrix elements, which make our approach efficient. Future directions of work are outlined.

  6. Audit of lymphadenectomy in lung cancer resections using a specimen collection kit and checklist

    PubMed Central

    Osarogiagbon, Raymond U.; Sareen, Srishti; Eke, Ransome; Yu, Xinhua; McHugh, Laura M.; Kernstine, Kemp H.; Putnam, Joe B.; Robbins, Edward T.

    2014-01-01

    Background Audits of operative summaries and pathology reports reveal wide discordance in identifying the extent of lymphadenectomy performed (the communication gap). We tested the ability of a pre-labeled lymph node specimen collection kit and checklist to narrow the communication gap between operating surgeons, pathologists, and auditors of surgeons’ operation notes. Methods We conducted a prospective single cohort study of lung cancer resections performed with a lymph node collection kit from November 2010 to January 2013. We used the kappa statistic to compare surgeon claims on a checklist of lymph node stations harvested intraoperatively, to pathology reports, and an independent audit of surgeons’ operative summaries. Lymph node collection procedures were classified into 4 groups based on the anatomic origin of resected lymph nodes: mediastinal lymph node dissection, systematic sampling, random sampling and no sampling. Results From the pathology report, 73% of 160 resections had a mediastinal lymph node dissection or systematic sampling procedure, 27% had random sampling. The concordance with surgeon claims was 80% (kappa statistic 0.69 [CI 0.60 – 0.79]). Concordance between independent audits of the operation notes and either the pathology report (kappa 0.14 [0.04 – 0.23]), or surgeon claims (kappa 0.09 [0.03 – 0.22]), was poor. Conclusion A pre-labeled specimen collection kit and checklist significantly narrowed the communication gap between surgeons and pathologists in identifying the extent of lymphadenectomy. Audit of surgeons’ operation notes did not accurately reflect the procedure performed, bringing its value for quality improvement work into question. PMID:25530090

  7. Counting glomeruli and podocytes: rationale and methodologies

    PubMed Central

    Puelles, Victor G.; Bertram, John F.

    2015-01-01

Purpose of review There is currently much interest in the numbers of both glomeruli and podocytes. This interest stems from greater understanding of the effects of suboptimal fetal events on nephron endowment, the associations between low nephron number and chronic cardiovascular and kidney disease in adults, and the emergence of the podocyte depletion hypothesis. Recent findings Obtaining accurate and precise estimates of glomerular and podocyte number has proven surprisingly difficult. When whole kidneys or large tissue samples are available, design-based stereological methods are considered gold-standard because they are based on principles that negate systematic bias. However, these methods are often tedious and time-consuming, and oftentimes inapplicable when dealing with small samples such as biopsies. Therefore, novel methods suitable for small tissue samples, and innovative approaches to facilitate high-throughput measurements, such as magnetic resonance imaging (MRI) to estimate glomerular number and flow cytometry to estimate podocyte number, have recently been described. Summary This review describes current gold-standard methods for estimating glomerular and podocyte number, as well as methods developed in the past 3 years. We are now better placed than ever before to accurately and precisely estimate glomerular and podocyte number, and to examine relationships between these measurements and kidney health and disease. PMID:25887899

  8. Methods for determination of radioactive substances in water and fluvial sediments

    USGS Publications Warehouse

    Thatcher, Leland Lincoln; Janzer, Victor J.; Edwards, Kenneth W.

    1977-01-01

    Analytical methods for the determination of some of the more important components of fission or neutron activation product radioactivity and of natural radioactivity found in water are reported. The report for each analytical method includes conditions for application of the method, a summary of the method, interferences, required apparatus and reagents, analytical procedures, calculations, reporting of results, and estimation of precision. The fission product isotopes considered are cesium-137, strontium-90, and ruthenium-106. The natural radioelements and isotopes considered are uranium, lead-210, radium-226, radium-228, tritium, and carbon-14. A gross radioactivity survey method and a uranium isotope ratio method are given. When two analytical methods are in routine use for an individual isotope, both methods are reported with identification of the specific areas of application of each. Techniques for the collection and preservation of water samples to be analyzed for radioactivity are discussed.

  9. Tissue Preservation Assessment Preliminary Results

    NASA Technical Reports Server (NTRS)

    Globus, Ruth; Costes, Sylvain

    2017-01-01

    Pre-flight ground-based testing was done to prepare for the first Rodent Research mission validation flight, RR1 (Choi et al., 2016, PLoS One). We purified RNA and measured RIN values to assess the quality of the samples. For protein, we measured liver enzyme activities. We tested the preservation protocols and methods used to date. Here we present an overview of results related to tissue preservation from the RR1 validation mission and a summary of findings to date from investigators who received RR1 tissues via the Biospecimen Sharing Program.

  10. Computer applications in scientific balloon quality control

    NASA Astrophysics Data System (ADS)

    Seely, Loren G.; Smith, Michael S.

    Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. Inexpensive and powerful data-processing tools can serve as the basis of an analysis that discerns quality trends in products. The results of one such analysis are presented here in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of the overall and long-term effects of these methods on product quality.

  11. Efficient computation of the joint sample frequency spectra for multiple populations.

    PubMed

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
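
    The joint multi-population computation implemented in momi is substantially more involved than can be shown here; as a single-population baseline only (our illustration, not momi's API), the sketch below computes an observed SFS from a toy 0/1 haplotype matrix and the expected neutral SFS, E[xi_i] = theta/i, under the standard coalescent.

        import numpy as np

        def observed_sfs(genotypes):
            """Observed site frequency spectrum from a 0/1 haplotype matrix
            (rows = sampled sequences, columns = polymorphic sites)."""
            n = genotypes.shape[0]
            counts = genotypes.sum(axis=0)                    # derived-allele count per site
            return np.bincount(counts, minlength=n + 1)[1:n]  # entries for i = 1..n-1

        def expected_neutral_sfs(n, theta):
            """E[xi_i] = theta / i under the standard neutral coalescent."""
            return theta / np.arange(1, n)

        rng = np.random.default_rng(0)
        haps = (rng.random((10, 200)) < 0.15).astype(int)     # toy data, not real sequences
        print(observed_sfs(haps))
        print(np.round(expected_neutral_sfs(10, theta=5.0), 2))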

  12. Efficient computation of the joint sample frequency spectra for multiple populations

    PubMed Central

    Kamm, John A.; Terhorst, Jonathan; Song, Yun S.

    2016-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity. PMID:28239248

  13. Methods for meta-analysis of multiple traits using GWAS summary statistics.

    PubMed

    Ray, Debashree; Boehnke, Michael

    2018-03-01

    Genome-wide association studies (GWAS) for complex diseases have focused primarily on single-trait analyses for disease status and disease-related quantitative traits. For example, GWAS on risk factors for coronary artery disease analyze genetic associations of plasma lipids such as total cholesterol, LDL-cholesterol, HDL-cholesterol, and triglycerides (TGs) separately. However, traits are often correlated and a joint analysis may yield increased statistical power for association over multiple univariate analyses. Recently several multivariate methods have been proposed that require individual-level data. Here, we develop metaUSAT (where USAT is unified score-based association test), a novel unified association test of a single genetic variant with multiple traits that uses only summary statistics from existing GWAS. Although the existing methods either perform well when most correlated traits are affected by the genetic variant in the same direction or are powerful when only a few of the correlated traits are associated, metaUSAT is designed to be robust to the association structure of correlated traits. metaUSAT does not require individual-level data and can test genetic associations of categorical and/or continuous traits. One can also use metaUSAT to analyze a single trait over multiple studies, appropriately accounting for overlapping samples, if any. metaUSAT provides an approximate asymptotic P-value for association and is computationally efficient for implementation at a genome-wide level. Simulation experiments show that metaUSAT maintains proper type-I error at low error levels. It has similar and sometimes greater power to detect association across a wide array of scenarios compared to existing methods, which are usually powerful for some specific association scenarios only. When applied to plasma lipids summary data from the METSIM and the T2D-GENES studies, metaUSAT detected genome-wide significant loci beyond the ones identified by univariate analyses. Evidence from larger studies suggests that the variants additionally detected by our test are, indeed, associated with lipid levels in humans. In summary, metaUSAT can provide novel insights into the genetic architecture of common diseases and traits. © 2017 WILEY PERIODICALS, INC.
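
    metaUSAT's data-adaptive combination is not reproduced here; the sketch below illustrates only the generic ingredient such methods build on: a chi-square statistic for a single variant against k correlated traits, computed from summary Z-scores and a trait correlation matrix (which, in practice, is estimable from genome-wide summary statistics under the null). All numbers are hypothetical.

        import numpy as np
        from scipy import stats

        def multitrait_score_test(z, R):
            """Generic multi-trait test from summary statistics: under H0 the
            Z-score vector is N(0, R), so T = z' R^{-1} z ~ chi-square with
            k degrees of freedom. This is not metaUSAT's adaptive statistic."""
            z = np.asarray(z, dtype=float)
            T = z @ np.linalg.solve(R, z)
            return T, stats.chi2.sf(T, df=len(z))

        # Hypothetical Z-scores for four lipid traits and their null correlations.
        z = [2.1, -1.8, 2.5, 0.4]
        R = np.array([[1.0, 0.3, 0.2, 0.1],
                      [0.3, 1.0, 0.4, 0.2],
                      [0.2, 0.4, 1.0, 0.3],
                      [0.1, 0.2, 0.3, 1.0]])
        T, p = multitrait_score_test(z, R)
        print(f"T = {T:.2f}, p = {p:.3g}")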

  14. The impact of three discharge coding methods on the accuracy of diagnostic coding and hospital reimbursement for inpatient medical care.

    PubMed

    Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C

    2018-07-01

    Coding of diagnoses is important for patient care, hospital management and research. However, coding accuracy is often poor and may reflect methods of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary and secondary diagnoses, Healthcare Resource Group (HRG) and estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58% respectively, p < 0.0001) or coded from the discharge summary with medical support (70% vs 60% respectively, p < 0.0001). When compared with the gold standard, the percentage of incorrect HRGs was 42% for discharge summary alone, 31% for coding with case notes, and 35% for coding with medical support. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. On the joint inversion of geophysical data for models of the coupled core-mantle system

    NASA Technical Reports Server (NTRS)

    Voorhies, Coerte V.

    1991-01-01

    Joint inversion of magnetic, earth rotation, geoid, and seismic data for a unified model of the coupled core-mantle system is proposed and shown to be possible. A sample objective function is offered and simplified by targeting results from independent inversions and summary travel time residuals instead of original observations. These data are parameterized in terms of a very simple, closed model of the topographically coupled core-mantle system. Minimization of the simplified objective function leads to a nonlinear inverse problem; an iterative method for solution is presented. Parameterization and method are emphasized; numerical results are not presented.

  16. Using Electronic Data Interchange to Report Product Quality

    DTIC Science & Technology

    1993-03-01

    [Flattened excerpt from an EDI transaction-set segment table; the recoverable entries include segments DTM Date/Time Reference, REF Reference Numbers, STA Statistics, MEA Measurements, and 140 SPS Sampling Parameters for Summary Statistics, each with occurrence counts.]

  17. Data Summary Report for the Remedial Investigation of Hanford Site Releases to the Columbia River, Hanford Site, Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulstrom, L.

    2011-02-07

    This data summary report summarizes the investigation results to evaluate the nature and distribution of Hanford Site-related contaminants present in the Columbia River. As detailed in DOE/RL-2008-11, more than 2,000 environmental samples were collected from the Columbia River between 2008 and 2010. These samples consisted of island soil, sediment, surface water, groundwater upwelling (pore water, surface water, and sediment), and fish tissue.

  18. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589

  19. Measuring health-related quality of life: psychometric evaluation of the Tunisian version of the SF-12 health survey.

    PubMed

    Younsi, Moheddine; Chakroun, Mohamed

    2014-09-01

    The 12-item short-form health survey (SF-12) was developed as a shorter alternative to the SF-36 for use in large-scale studies as an applicable instrument for measuring health-related quality of life. The main purpose of this study was to evaluate the psychometric properties of the Tunisian version of the SF-12. A stratified representative sample (N = 3,582) of the general Tunisian population aged 18 years and over was interviewed. SF-12 summary scores were derived using the standard US algorithm. Factor analysis was used to confirm the hypothesized component structure of the SF-12 items. Reliability was estimated using internal consistency, and construct validity was investigated with "known groups" validity testing and via convergent and divergent validity. SF-12 summary scores distinguished well, and in the expected manner, between groups of respondents on the basis of gender, age, education and socioeconomic status, thus providing evidence of construct validity. Mean scores in the total sample were 50.11 (SD 8.53) for the physical component summary (PCS) score and 47.96 (SD 9.82) for the mental component summary (MCS) score. The results showed satisfactory internal consistency and acceptable convergent validity for both summary scores. Cronbach's α coefficient for PCS-12 and MCS-12 was 0.73 and 0.72, respectively. Known groups comparison showed that the SF-12 discriminated well between groups of respondents on the basis of gender, age, education and socioeconomic status. In addition, no floor or ceiling effects at baseline were observed. The principal component analysis (PCA) confirmed the two-factor structure of the SF-12 items. Items belonging to the physical component correlated more strongly with the PCS-12 than with the MCS-12. Similarly, items belonging to the mental component correlated more strongly with the MCS-12 than with the PCS-12. The findings suggest that the SF-12 appears to be a valid and reliable measure that can be used for measuring population health status. However, for optimal measurement, modifications to traditional scoring methods for the SF-12 should be considered.
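
    The internal-consistency coefficients reported above are Cronbach's alpha values. A minimal sketch of the computation on simulated item responses (not the Tunisian survey data) follows.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x k_items) response matrix:
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(1)
        latent = rng.normal(size=(100, 1))                         # hypothetical respondents
        responses = latent + rng.normal(scale=1.0, size=(100, 6))  # 6 correlated items
        print(round(cronbach_alpha(responses), 2))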

  20. DISSCO: direct imputation of summary statistics allowing covariates.

    PubMed

    Xu, Zheng; Duan, Qing; Yan, Song; Chen, Wei; Li, Mingyao; Lange, Ethan; Li, Yun

    2015-08-01

    Imputation of individual-level genotypes at untyped markers using an external reference panel of genotyped or sequenced individuals has become standard practice in genetic association studies. Direct imputation of summary statistics can also be valuable, for example in meta-analyses where individual-level genotype data are not available. Two methods (DIST and ImpG-Summary/LD) that assume a multivariate Gaussian distribution for the association summary statistics have been proposed for imputing association summary statistics. However, both methods assume that the correlations between association summary statistics are the same as the correlations between the corresponding genotypes. This assumption can be violated in the presence of confounding covariates. We analytically show that in the absence of covariates, correlation among association summary statistics is indeed the same as that among the corresponding genotypes, thus serving as a theoretical justification for the recently proposed methods. We further prove that in the presence of covariates, correlation among association summary statistics becomes the partial correlation of the corresponding genotypes controlling for covariates. We therefore develop direct imputation of summary statistics allowing covariates (DISSCO). We consider two real-life scenarios where the correlation and partial correlation likely make a practical difference: (i) association studies in admixed populations; (ii) association studies in the presence of other confounding covariate(s). Application of DISSCO to real datasets under both scenarios shows at least comparable, if not better, performance compared with existing correlation-based methods, particularly for lower frequency variants. For example, DISSCO can reduce the absolute deviation from the truth by 3.9-15.2% for variants with minor allele frequency <5%. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
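
    The paper's central observation is that, with covariates, the relevant quantity becomes the partial correlation of genotypes controlling for those covariates. A sketch of that computation via residualization follows; the data are toys, with a continuous covariate standing in for, e.g., an admixture proportion.

        import numpy as np

        def partial_corr(g1, g2, C):
            """Partial correlation of two genotype vectors controlling for a
            covariate matrix C (n x p): correlate the least-squares residuals
            after regressing each genotype on [1, C]."""
            X = np.column_stack([np.ones(len(g1)), C])
            def resid(y):
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                return y - X @ beta
            r1, r2 = resid(np.asarray(g1, float)), resid(np.asarray(g2, float))
            return (r1 @ r2) / np.sqrt((r1 @ r1) * (r2 @ r2))

        rng = np.random.default_rng(2)
        ancestry = rng.normal(size=500)                  # hypothetical confounding covariate
        g1 = rng.binomial(2, 0.3, 500) + 0.5 * ancestry  # toy "genotypes" shifted by ancestry
        g2 = rng.binomial(2, 0.3, 500) + 0.5 * ancestry
        print(round(np.corrcoef(g1, g2)[0, 1], 3))            # inflated by the shared covariate
        print(round(partial_corr(g1, g2, ancestry[:, None]), 3))  # closer to zero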

  1. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file summarize the investigations and results of previous chamber and controlled studies to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. This report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summarizing data from other studies and more condensed summary tables of data) is underway.

  2. HLLV avionics requirements study and electronic filing system database development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This final report provides a summary of achievements and activities performed under Contract NAS8-39215. The contract's objective was to explore a new way of delivering, storing, accessing, and archiving study products and information and to define top level system requirements for Heavy Lift Launch Vehicle (HLLV) avionics that incorporate Vehicle Health Management (VHM). This report includes technical objectives, methods, assumptions, recommendations, sample data, and issues as specified by DPD No. 772, DR-3. The report is organized into two major subsections, one specific to each of the two tasks defined in the Statement of Work: the Index Database Task and the HLLV Avionics Requirements Task. The Index Database Task resulted in the selection and modification of a commercial database software tool to contain the data developed during the HLLV Avionics Requirements Task. All summary information is addressed within each task's section.

  3. 1993 annual status report: a summary of fish data in six reaches of the upper Mississippi River system

    USGS Publications Warehouse

    Gutreuter, Steve; Burkhardt, Randy W.; Stopyro, Mark; Bartels, Andrew; Kramer, Eric; Bowler, Melvin C.; Cronin, Frederick A.; Soergel, Dirk W.; Petersen, Michael D.; Herzog, David P.; Raibley, Paul T.; Irons, Kevin S.; O'Hara, Timothy M.

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 1,994 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1993. Collection methods included day and night electrofishing, hoop netting, fyke netting (two net sizes), gill netting, seining, and trawling in select aquatic area classes. The six LTRMP study reaches are Pools 4 (excluding Lake Pepin), 8, 13, and 26 of the Upper Mississippi River, an unimpounded reach of the Mississippi River near Cape Girardeau, Missouri, and the La Grange Pool of the Illinois River. A total of 62-78 fish species were detected in each study reach. For each of the six LTRMP study reaches, this report contains summaries of: (1) sampling efforts in each combination of gear type and aquatic area class, (2) total catches of each species from each gear type, (3) mean catch-per-unit of gear effort statistics and standard errors for common species from each combination of aquatic area class and selected gear type, and (4) length distributions of common species from selected gear types.

  4. 1994 annual status report: a summary of fish data in six reaches of the upper Mississippi River system

    USGS Publications Warehouse

    Gutreuter, Steve; Burkhardt, Randy W.; Stopyro, Mark; Bartels, Andrew; Kramer, Eric; Bowler, Melvin C.; Cronin, Frederick A.; Soergel, Dirk W.; Petersen, Michael D.; Herzog, David P.; Raibley, Paul T.; Irons, Kevin S.; O'Hara, Timothy M.

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,653 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1994. Collection methods included day and night electrofishing, hoop netting, fyke netting (two net sizes), gill netting, seining, and trawling in select aquatic area classes. The six LTRMP study areas are Pools 4 (excluding Lake Pepin), 8, 13, and 26 of the Upper Mississippi River, an unimpounded reach of the Mississippi River near Cape Girardeau, Missouri, and the La Grange Pool of the Illinois River. A total of 61-79 fish species were detected in each study area. For each of the six LTRMP study areas, this report contains summaries of (1) sampling efforts in each combination of gear type and aquatic area class, (2) total catches of each species from each gear type, (3) mean catch-per-unit of gear effort statistics and standard errors for common species from each combination of aquatic area class and selected gear type, and (4) length distributions of common species from selected gear types.

  5. Accuracy of recommended sampling and assay methods for the determination of plasma-free and urinary fractionated metanephrines in the diagnosis of pheochromocytoma and paraganglioma: a systematic review.

    PubMed

    Därr, Roland; Kuhn, Matthias; Bode, Christoph; Bornstein, Stefan R; Pacak, Karel; Lenders, Jacques W M; Eisenhofer, Graeme

    2017-06-01

    To determine the accuracy of biochemical tests for the diagnosis of pheochromocytoma and paraganglioma. A search of the PubMed database was conducted for English-language articles published between October 1958 and December 2016 on the biochemical diagnosis of pheochromocytoma and paraganglioma using immunoassay methods or high-performance liquid chromatography with coulometric/electrochemical or tandem mass spectrometric detection for measurement of fractionated metanephrines in 24-h urine collections or plasma-free metanephrines obtained under seated or supine blood sampling conditions. Application of the Standards for Reporting of Diagnostic Studies Accuracy Group criteria yielded 23 suitable articles. Summary receiver operating characteristic analysis revealed sensitivities/specificities of 94/93% and 91/93% for measurement of plasma-free metanephrines and urinary fractionated metanephrines using high-performance liquid chromatography or immunoassay methods, respectively. Partial areas under the curve were 0.947 vs. 0.911. Irrespective of the analytical method, sensitivity was significantly higher for supine compared with seated sampling, 95 vs. 89% (p < 0.02), while specificity was significantly higher for supine sampling compared with 24-h urine, 95 vs. 90% (p < 0.03). Partial areas under the curve were 0.942, 0.913, and 0.932 for supine sampling, seated sampling, and urine. Test accuracy increased linearly from 90 to 93% for 24-h urine at prevalence rates of 0.0-1.0, decreased linearly from 94 to 89% for seated sampling and was constant at 95% for supine conditions. Current tests for the biochemical diagnosis of pheochromocytoma and paraganglioma show excellent diagnostic accuracy. Supine sampling conditions and measurement of plasma-free metanephrines using high-performance liquid chromatography with coulometric/electrochemical or tandem mass spectrometric detection provides the highest accuracy at all prevalence rates.

  6. Detecting microbial dysbiosis associated with Pediatric Crohn’s disease despite the high variability of the gut microbiota

    PubMed Central

    Wang, Feng; Kaplan, Jess L.; Gold, Benjamin D.; Bhasin, Manoj K.; Ward, Naomi L.; Kellermayer, Richard; Kirschner, Barbara S.; Heyman, Melvin B.; Dowd, Scot E.; Cox, Stephen B.; Dogan, Haluk; Steven, Blaire; Ferry, George D.; Cohen, Stanley A.; Baldassano, Robert N.; Moran, Christopher J.; Garnett, Elizabeth A.; Drake, Lauren; Otu, Hasan H.; Mirny, Leonid A.; Libermann, Towia A.; Winter, Harland S.; Korolev, Kirill

    2016-01-01

    SUMMARY The relationship between the host and its microbiota is challenging to understand because both microbial communities and their environment are highly variable. We developed a set of techniques to address this challenge based on population dynamics and information theory. These methods identified additional bacterial taxa associated with pediatric Crohn's disease and could detect significant changes in microbial communities with fewer samples than previous statistical approaches. We also substantially improved the accuracy of the diagnosis based on the microbiota from stool samples and found that the ecological niche of a microbe predicts its role in Crohn’s disease. Bacteria typically residing in the lumen of healthy patients decrease in disease while bacteria typically residing on the mucosa of healthy patients increase in disease. Our results also show that the associations with Crohn’s disease are evolutionarily conserved and provide a mutual-information-based method to visualize dysbiosis. PMID:26804920

  7. Score Estimating Equations from Embedded Likelihood Functions under Accelerated Failure Time Model

    PubMed Central

    NING, JING; QIN, JING; SHEN, YU

    2014-01-01

    SUMMARY The semiparametric accelerated failure time (AFT) model is one of the most popular models for analyzing time-to-event outcomes. One appealing feature of the AFT model is that the observed failure time data can be transformed to independent and identically distributed random variables without covariate effects. We describe a class of estimating equations based on the score functions for the transformed data, which are derived from the full likelihood function under commonly used semiparametric models such as the proportional hazards or proportional odds model. The methods of estimating regression parameters under the AFT model can be applied to traditional right-censored survival data as well as more complex time-to-event data subject to length-biased sampling. We establish the asymptotic properties and evaluate the small sample performance of the proposed estimators. We illustrate the proposed methods through applications in two examples. PMID:25663727

  8. Application of a Multivariant, Caucasian-Specific, Genotyped Donor Panel for Performance Validation of MDmulticard®, ID-System®, and Scangel® RhD/ABO Serotyping

    PubMed Central

    Gassner, Christoph; Rainer, Esther; Pircher, Elfriede; Markut, Lydia; Körmöczi, Günther F.; Jungbauer, Christof; Wessin, Dietmar; Klinghofer, Roswitha; Schennach, Harald; Schwind, Peter; Schönitzer, Diether

    2009-01-01

    Summary Background Validations of routinely used serological typing methods require intense performance evaluations, typically including large numbers of samples, before routine application. However, such evaluations could be improved considering information about the frequency of standard blood groups and their variants. Methods Using RHD and ABO population genetic data, a Caucasian-specific donor panel was compiled for a performance comparison of the three RhD and ABO serological typing methods MDmulticard (Medion Diagnostics), ID-System (DiaMed) and ScanGel (Bio-Rad). The final test panel included standard and variant RHD and ABO genotypes, e.g. RhD categories, partial and weak RhDs, RhD DELs, and ABO samples, mainly to interpret weak serological reactivity for blood group A specificity. All samples were from individuals recorded in our local DNA blood group typing database. Results For ‘standard’ blood groups, results of performance were clearly interpretable for all three serological methods compared. However, when focusing on specific variant phenotypes, pronounced differences in reaction strengths and specificities were observed between them. Conclusions A genetically and ethnically predefined donor test panel consisting of only 93 individual samples delivered highly significant results for serological performance comparisons. Such small panels offer impressive representative power, greater than that achievable through statistical chance and large sample numbers alone. PMID:21113264

  9. Mixed-Methods Research in Nutrition and Dietetics.

    PubMed

    Zoellner, Jamie; Harris, Jeffrey E

    2017-05-01

    This work focuses on mixed-methods research (MMR) and is the 11th in a series exploring the importance of research design, statistical analysis, and epidemiologic methods as applied to nutrition and dietetics research. MMR is an investigative technique that applies both quantitative and qualitative data. The purpose of this article is to define MMR; describe its history and nature; provide reasons for its use; describe and explain the six different MMR designs; describe sample selection; and provide guidance in data collection, analysis, and inference. MMR concepts are applied and integrated with nutrition-related scenarios in real-world research contexts and summary recommendations are provided. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  10. National accident sampling system sample design, phases 2 and 3 : executive summary

    DOT National Transportation Integrated Search

    1979-11-01

    This report describes the Phase 2 and 3 sample design for the : National Accident Sampling System (NASS). It recommends a procedure : for the first-stage selection of Primary Sampling Units (PSU's) and : the second-stage design for the selection of a...

  11. Neutrino oscillation parameter sampling with MonteCUBES

    NASA Astrophysics Data System (ADS)

    Blennow, Mattias; Fernandez-Martinez, Enrique

    2010-01-01

    We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: The first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling.

    Program summary
    Program title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator)
    Catalogue identifier: AEFJ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 69 634
    No. of bytes in distributed program, including test data, etc.: 3 980 776
    Distribution format: tar.gz
    Programming language: C
    Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed
    Operating system: 32 bit and 64 bit Linux
    RAM: Typically a few MBs
    Classification: 11.1
    External routines: GLoBES [1,2] and routines/libraries used by GLoBES
    Subprograms used: Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439
    Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those used in GLoBES [1,2].
    Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES.
    Additional comments: A Matlab GUI for interpretation of results is included in the distribution.
    Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours.
    References:
    [1] P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333.
    [2] P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187.
    [3] S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].
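
    MonteCUBES itself is a C plug-in for GLoBES; the sketch below is only a generic, minimal random-walk Metropolis sampler over a toy two-parameter posterior, illustrating the kind of MCMC sampling the package performs. All names and numbers are hypothetical.

        import numpy as np

        def log_posterior(theta):
            """Toy correlated-Gaussian stand-in for an oscillation-parameter posterior."""
            d = theta - np.array([0.5, 1.0])
            cov_inv = np.linalg.inv(np.array([[0.04, 0.01], [0.01, 0.09]]))
            return -0.5 * d @ cov_inv @ d

        def metropolis(logp, start, step, n_samples, seed=0):
            rng = np.random.default_rng(seed)
            chain, current, lp = [], np.asarray(start, float), logp(np.asarray(start, float))
            for _ in range(n_samples):
                proposal = current + rng.normal(scale=step, size=current.shape)
                lp_new = logp(proposal)
                if np.log(rng.random()) < lp_new - lp:   # accept with prob min(1, ratio)
                    current, lp = proposal, lp_new
                chain.append(current.copy())
            return np.array(chain)

        chain = metropolis(log_posterior, start=[0.0, 0.0], step=0.15, n_samples=20000)
        print(chain[5000:].mean(axis=0))   # posterior means after burn-in, close to [0.5, 1.0]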

  12. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously ignored for limited availability of theory and methods.
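
    A generic sketch of posterior predictive simulation with a summary statistic follows, using a toy conjugate Poisson-Gamma model in place of the paper's coalescent models: draw parameters from the posterior, simulate replicate data per draw, and locate the observed statistic within the replicate distribution.

        import numpy as np

        rng = np.random.default_rng(3)
        data = rng.poisson(4.0, size=50)          # toy observed data
        stat = lambda x: x.var() / x.mean()       # summary statistic (dispersion index)

        # Posterior for a Poisson rate with a Gamma(1, 1) prior is Gamma(1 + sum, 1 + n).
        post_draws = rng.gamma(1 + data.sum(), 1 / (1 + len(data)), size=2000)

        # Posterior predictive: one replicate dataset per posterior draw.
        rep_stats = np.array([stat(rng.poisson(lam, size=len(data))) for lam in post_draws])
        ppp = (rep_stats >= stat(data)).mean()    # posterior predictive p-value
        print(round(ppp, 3))                      # values near 0 or 1 indicate misfit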

  13. Delimiting Coalescence Genes (C-Genes) in Phylogenomic Data Sets

    PubMed Central

    Springer, Mark S.; Gatesy, John

    2018-01-01

    Summary coalescence methods have emerged as a popular alternative for inferring species trees with large genomic datasets, because these methods explicitly account for incomplete lineage sorting. However, statistical consistency of summary coalescence methods is not guaranteed unless several model assumptions are true, including the critical assumption that recombination occurs freely among but not within coalescence genes (c-genes), which are the fundamental units of analysis for these methods. Each c-gene has a single branching history, and large sets of these independent gene histories should be the input for genome-scale coalescence estimates of phylogeny. By contrast, numerous studies have reported the results of coalescence analyses in which complete protein-coding sequences are treated as c-genes even though exons for these loci can span more than a megabase of DNA. Empirical estimates of recombination breakpoints suggest that c-genes may be much shorter, especially when large clades with many species are the focus of analysis. Although this idea has been challenged recently in the literature, the inverse relationship between c-gene size and increased taxon sampling in a dataset—the ‘recombination ratchet’—is a fundamental property of c-genes. For taxonomic groups characterized by genes with long intron sequences, complete protein-coding sequences are likely not valid c-genes and are inappropriate units of analysis for summary coalescence methods unless they occur in recombination deserts that are devoid of incomplete lineage sorting (ILS). Finally, it has been argued that coalescence methods are robust when the no-recombination within loci assumption is violated, but recombination must matter at some scale because ILS, a by-product of recombination, is the raison d’etre for coalescence methods. That is, extensive recombination is required to yield the large number of independently segregating c-genes used to infer a species tree. If coalescent methods are powerful enough to infer the correct species tree for difficult phylogenetic problems in the anomaly zone, where concatenation is expected to fail because of ILS, then there should be a decreasing probability of inferring the correct species tree using longer loci with many intralocus recombination breakpoints (i.e., increased levels of concatenation). PMID:29495400

  14. Trend analysis and selected summary statistics of annual mean streamflow for 38 selected long-term U.S. Geological Survey streamgages in Texas, water years 1916-2012

    USGS Publications Warehouse

    Asquith, William H.; Barbie, Dana L.

    2014-01-01

    Selected summary statistics (L-moments) and estimates of respective sampling variances were computed for the 35 streamgages lacking statistically significant trends. From the L-moments and estimated sampling variances, weighted means or regional values were computed for each L-moment. An example application is included demonstrating how the L-moments could be used to evaluate the magnitude and frequency of annual mean streamflow.
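
    As a brief illustration of the summary statistics involved, the sketch below computes the first three sample L-moments and the L-skewness ratio from unbiased probability-weighted moments (l1 = b0, l2 = 2b1 - b0, l3 = 6b2 - 6b1 + b0). The flows are simulated, not the Texas streamgage data.

        import numpy as np

        def sample_lmoments(x):
            """First three sample L-moments and the L-skewness ratio t3,
            from unbiased probability-weighted moments of the ordered sample."""
            x = np.sort(np.asarray(x, float))
            n = len(x)
            i = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((i - 1) / (n - 1) * x) / n
            b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
            l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2

        rng = np.random.default_rng(4)
        flows = rng.gamma(3.0, 200.0, size=80)    # hypothetical annual mean streamflows
        l1, l2, t3 = sample_lmoments(flows)
        print(f"l1={l1:.1f}  l2={l2:.1f}  t3={t3:.3f}")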

  15. Simple, Defensible Sample Sizes Based on Cost Efficiency

    PubMed Central

    Bacchetti, Peter; McCulloch, Charles E.; Segal, Mark R.

    2009-01-01

    Summary The conventional approach of choosing sample size to provide 80% or greater power ignores the cost implications of different sample size choices. Costs, however, are often impossible for investigators and funders to ignore in actual practice. Here, we propose and justify a new approach for choosing sample size based on cost efficiency, the ratio of a study’s projected scientific and/or practical value to its total cost. By showing that a study’s projected value exhibits diminishing marginal returns as a function of increasing sample size for a wide variety of definitions of study value, we are able to develop two simple choices that can be defended as more cost efficient than any larger sample size. The first is to choose the sample size that minimizes the average cost per subject. The second is to choose sample size to minimize total cost divided by the square root of sample size. This latter method is theoretically more justifiable for innovative studies, but also performs reasonably well and has some justification in other cases. For example, if projected study value is assumed to be proportional to power at a specific alternative and total cost is a linear function of sample size, then this approach is guaranteed either to produce more than 90% power or to be more cost efficient than any sample size that does. These methods are easy to implement, based on reliable inputs, and well justified, so they should be regarded as acceptable alternatives to current conventional approaches. PMID:18482055
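
    For the second rule with a linear cost function C(n) = c0 + c1*n, minimizing C(n)/sqrt(n) has the closed-form solution n* = c0/c1 (set the derivative to zero). A short numerical check under hypothetical fixed and per-subject costs:

        import numpy as np

        # Cost-efficiency heuristic: with total cost C(n) = c0 + c1*n, choose the
        # n minimizing C(n)/sqrt(n). The grid search below confirms the closed form.
        c0, c1 = 50_000.0, 250.0            # hypothetical fixed cost and cost per subject
        n = np.arange(10, 2001)
        objective = (c0 + c1 * n) / np.sqrt(n)
        print("grid minimum:", n[objective.argmin()])   # 200
        print("closed form :", c0 / c1)                 # 200.0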

  16. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  17. Probabilistic assessment method of the non-monotonic dose-responses-Part I: Methodological approach.

    PubMed

    Chevillotte, Grégoire; Bernard, Audrey; Varret, Clémence; Ballet, Pascal; Bodin, Laurent; Roudot, Alain-Claude

    2017-08-01

    More and more studies aim to characterize non-monotonic dose-response curves (NMDRCs). The greatest difficulty is to assess the statistical plausibility of NMDRCs from previously conducted dose-response studies. This difficulty is linked to the fact that these studies present (i) few doses tested, (ii) a low sample size per dose, and (iii) the absence of any raw data. In this study, we propose a new methodological approach to probabilistically characterize NMDRCs. The methodology is composed of three main steps: (i) sampling from summary data to cover all the possibilities that may be presented by the responses measured by dose and to obtain a new raw database, (ii) statistical analysis of each sampled dose-response curve to characterize the slopes and their signs, and (iii) characterization of these dose-response curves according to the variation of the sign in the slope. This method allows characterizing all types of dose-response curves and can be applied both to continuous data and to discrete data. The aim of this study is to present the general principle of this probabilistic method, which allows assessment of non-monotonic dose-response curves, and to present some results. Copyright © 2017 Elsevier Ltd. All rights reserved.
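
    A hedged sketch of step (i), under an added normality assumption that the paper itself does not require: resample plausible raw responses per dose from reported means, SDs, and sample sizes, then count slope-sign changes across adjacent doses. All summary numbers are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        # Hypothetical summary data per dose: (dose, mean response, SD, n).
        summary = [(0.0, 10.0, 2.0, 8), (0.1, 13.0, 2.2, 8),
                   (1.0, 16.0, 2.5, 8), (10.0, 12.0, 2.4, 8)]

        def sign_changes(one_resample):
            """Mean response per dose, then count sign changes of adjacent slopes."""
            means = [r.mean() for r in one_resample]
            slopes = np.sign(np.diff(means))
            return int(np.sum(np.abs(np.diff(slopes)) > 0))

        n_iter = 5000
        counts = np.zeros(n_iter, dtype=int)
        for k in range(n_iter):
            resample = [rng.normal(m, s, size=n) for _, m, s, n in summary]
            counts[k] = sign_changes(resample)

        # Proportion of resampled curves that are non-monotonic (>= 1 sign change).
        print(round((counts >= 1).mean(), 3))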

  18. Inference from Samples of DNA Sequences Using a Two-Locus Model

    PubMed Central

    Griffiths, Robert C.

    2011-01-01

    Abstract Performing inference on contemporary samples of DNA sequence data is an important and challenging task. Computationally intensive methods such as importance sampling (IS) are attractive because they make full use of the available data, but in the presence of recombination the large state space of genealogies can be prohibitive. In this article, we make progress by developing an efficient IS proposal distribution for a two-locus model of sequence data. We show that the proposal developed here leads to much greater efficiency, outperforming existing IS methods that could be adapted to this model. Among several possible applications, the algorithm can be used to find maximum likelihood estimates for mutation and crossover rates, and to perform ancestral inference. We illustrate the method on previously reported sequence data covering two loci either side of the well-studied TAP2 recombination hotspot. The two loci are themselves largely non-recombining, so we obtain a gene tree at each locus and are able to infer in detail the effect of the hotspot on their joint ancestry. We summarize this joint ancestry by introducing the gene graph, a summary of the well-known ancestral recombination graph. PMID:21210733

  19. Monitoring of occupational and environmental aeroallergens-- EAACI Position Paper. Concerted action of the EAACI IG Occupational Allergy and Aerobiology & Air Pollution.

    PubMed

    Raulf, M; Buters, J; Chapman, M; Cecchi, L; de Blay, F; Doekes, G; Eduard, W; Heederik, D; Jeebhay, M F; Kespohl, S; Krop, E; Moscato, G; Pala, G; Quirce, S; Sander, I; Schlünssen, V; Sigsgaard, T; Walusiak-Skorupa, J; Wiszniewska, M; Wouters, I M; Annesi-Maesano, I

    2014-10-01

    Exposure to high molecular weight sensitizers of biological origin is an important risk factor for the development of asthma and rhinitis. Most of the causal allergens have been defined based on their reactivity with IgE antibodies, and in many cases, the molecular structure and function of the allergens have been established. Significant information on allergen levels that cause sensitization and allergic symptoms for several major environmental and occupational allergens has been reported. Monitoring of high molecular weight allergens and allergen carrier particles is an important part of the management of allergic respiratory diseases and requires standardized allergen assessment methods for occupational and environmental (indoor and outdoor) allergen exposure. The aim of this EAACI task force was to review the essential points for monitoring environmental and occupational allergen exposure including sampling strategies and methods, processing of dust samples, allergen analysis, and quantification. The paper includes a summary of different methods for sampling and allergen quantification, as well as their pros and cons for various exposure settings. Recommendations are being made for different exposure scenarios. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Implementing Keyword and Question Generation Approaches in Teaching EFL Summary Writing

    ERIC Educational Resources Information Center

    Chou, Mu-hsuan

    2012-01-01

    Summary writing has been considered an important aspect of academic writing. However, writing summaries can be a challenging task for the majority of English as a Foreign Language (EFL) learners. Research into teaching summary writing has focused on different processes to teach EFL learners. The present study adopted two methods--keyword and…

  1. prepare_taxa_charts.py: A Python program to automate generation of publication ready taxonomic pie chart images from QIIME.

    PubMed

    Lakhujani, Vijay; Badapanda, Chandan

    2017-06-01

    QIIME (Quantitative Insights Into Microbial Ecology) is one of the most popular open-source bioinformatics suites for performing metagenome, 16S rRNA amplicon and Internal Transcribed Spacer (ITS) data analysis. Although it is a very comprehensive and powerful tool, it lacks a method to provide publication-ready taxonomic pie charts. The script plot_taxa_summary.py bundled with QIIME generates an HTML file and a folder containing the taxonomic pie chart and legend as separate images. The images have randomly generated alphanumeric names. Therefore, it is difficult to associate the pie chart with the legend and the corresponding sample identifier. Even if the option to have the legend within the HTML file is selected while executing plot_taxa_summary.py, it is very tedious to crop a complete image (having both the pie chart and the legend) due to unequal image sizes. It requires a lot of time to manually prepare the pie charts for multiple samples for publication purposes. Moreover, there are chances of error while identifying the pie chart and legend pair due to the random alphanumeric names of the images. To bypass all these bottlenecks and make this process efficient, we have developed a Python-based program, prepare_taxa_charts.py, to automate the renaming, cropping and merging of taxonomic pie chart and corresponding legend image into a single, good quality publication-ready image. This program not only augments the functionality of plot_taxa_summary.py but is also very fast in terms of CPU time and user friendly.
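
    The program's own source is not reproduced here; the sketch below illustrates only the core merging step with Pillow, under hypothetical file names. prepare_taxa_charts.py additionally handles the renaming, cropping, and the pairing of images to sample identifiers.

        # Sketch of the pie-chart/legend merging step using Pillow (file names
        # hypothetical; not prepare_taxa_charts.py's actual pairing logic).
        from PIL import Image

        def merge_chart_and_legend(chart_path, legend_path, out_path):
            chart = Image.open(chart_path)
            legend = Image.open(legend_path)
            # Canvas wide enough for both images, tall enough for the taller one.
            canvas = Image.new("RGB",
                               (chart.width + legend.width, max(chart.height, legend.height)),
                               "white")
            canvas.paste(chart, (0, 0))
            canvas.paste(legend, (chart.width, 0))
            canvas.save(out_path, dpi=(300, 300))   # publication-ready resolution

        merge_chart_and_legend("sample1_pie.png", "sample1_legend.png", "sample1_taxa.png")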

  2. Multiple animal studies for medical chemical defense program in soldier/patient decontamination and drug development on task 85-17: Validation of an analytical method for the detection of soman (GD), mustard (HD), tabun (GA), and VX in wastewater samples. Final report, 13 October 1985-1 January 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joiner, R.L.; Hayes, L.; Rust, W.

    1989-05-01

    The following report summarizes the development and validation of an analytical method for the analyses of soman (GD), mustard (HD), VX, and tabun (GA) in wastewater. The need for an analytical method that can detect GD, HD, VX, and GA with the necessary sensitivity (<20 parts per billion (ppb)) and selectivity is essential to Medical Research and Evaluation Facility (MREF) operations. The analytical data were generated using liquid-liquid extraction of the wastewater, with the extract being concentrated and analyzed by gas chromatography (GC) methods. The sample preparation and analysis methods were developed in support of ongoing activities within the MREF. We have documented the precision and accuracy of the analytical method through an expected working calibration range (3.0 to 60 ppb). The analytical method was statistically evaluated over a range of concentrations to establish a detection limit and quantitation limit for the method. Whenever the true concentration is 8.5 ppb or above, the probability is at least 99.9 percent that the measured concentration will be 6 ppb or above. Thus, 6 ppb could be used as a lower reliability limit for detecting concentrations in excess of 8.5 ppb. In summary, the proposed sample extraction and analysis methods are suitable for quantitative analyses to determine the presence of GD, HD, VX, and GA in wastewater samples. Our findings indicate that we can detect any of these chemical surety materiel (CSM) in water at or below the established U.S. Army Surgeon General's safety levels in drinking water.
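
    The relationship between the 8.5 ppb concentration, the 6 ppb reliability limit, and the 99.9 percent probability can be back-calculated under a Gaussian measurement-error assumption; this is our illustrative assumption, not necessarily the report's own derivation.

        from scipy import stats

        # Back-of-envelope check of the reliability-limit logic, assuming Gaussian
        # measurement error around the true concentration.
        true_conc, limit, prob = 8.5, 6.0, 0.999
        z = stats.norm.ppf(prob)                  # ~3.09
        sigma = (true_conc - limit) / z
        print(f"implied measurement SD ~ {sigma:.2f} ppb")
        # Verify: P(measured >= 6 | true = 8.5, sd = sigma) ~ 0.999
        print(round(stats.norm.sf(limit, loc=true_conc, scale=sigma), 4))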

  3. Sequim Marine Research Laboratory routine environmental measurements during CY-1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, J.J.; Blumer, P.J.

    1977-05-01

    Beginning in 1976, a routine environmental program was established at the Marine Research Laboratory (MRL) at Sequim, Washington. The program is designed, primarily, to determine levels of radioactivity present in selected biota in Sequim Bay. The biota were selected because of their presence near the laboratory and their capacity to concentrate trace elements. Other samples were obtained to determine the radionuclides in Sequim Bay and laboratory drinking water, as well as the ambient radiation exposure levels and surface deposition of fallout radionuclides for the laboratory area. A summary of the analytical methods used is included. The present document includes data obtained during CY 1976, the first year of the program. Radionuclides present in samples are attributed to fallout. Data are included on content of oil and Cu in seawater samples.

  4. Introduction to Field Water-Quality Methods for the Collection of Metals - 2007 Project Summary

    USGS Publications Warehouse

    Allen, Monica L.

    2008-01-01

    The U.S. Geological Survey (USGS), Region VI of the U.S. Environmental Protection Agency (USEPA), and the Osage Nation presented three 3-day workshops, in June-August 2007, entitled "Introduction to Field Water-Quality Methods for the Collection of Metals." The purpose of the workshops was to provide instruction to tribes within USEPA Region VI on various USGS surface-water measurement methods and water-quality sampling protocols for the collection of surface-water samples for metals analysis. Workshop attendees included members from over 22 tribes and pueblos. USGS instructors came from Oklahoma, New Mexico, and Georgia. Workshops were held in eastern and south-central Oklahoma and New Mexico and covered many topics including presampling preparation, water-quality monitors, and sampling for metals in surface water. Attendees spent one full classroom day learning the field methods used by the USGS Water Resources Discipline and learning about the complexity of obtaining valid water-quality and quality-assurance data. Lectures included (1) a description of metal contamination sources in surface water; (2) introduction on how to select field sites, equipment, and laboratories for sample analysis; (3) collection of sediment in surface water; and (4) utilization of proper protocol and methodology for sampling metals in surface water. Attendees also were provided USGS sampling equipment for use during the field portion of the class so they had actual "hands-on" experience to take back to their own organizations. The final 2 days of the workshop consisted of field demonstrations of current USGS water-quality sample-collection methods. The hands-on training ensured that attendees were exposed to and experienced proper sampling procedures. Attendees learned integrated-flow techniques during sample collection, field-property documentation, and discharge measurements and calculations. They also used enclosed chambers for sample processing and collected quality-assurance samples to verify their techniques. Benefits of integrated water-quality sample-collection methods are varied. Tribal environmental programs now have the ability to collect data that are comparable across watersheds. The use of consistent sample collection, manipulation, and storage techniques will provide consistent quality data that will enhance the understanding of local water resources. The improved data quality also will help the USEPA better document the condition of the region's water. Ultimately, these workshops equipped tribes to use uniform sampling methods and to provide consistent quality data that are comparable across the region.

  5. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    PubMed

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires the data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing the LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison on these methods was conducted. As a result, 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as methods of the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as a useful guidance to the selection of suitable normalization methods in analyzing the LC/MS based metabolomics data.
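
    Of the well-performing methods named above, probabilistic quotient normalization (PQN) is the simplest to sketch. One common variant scales each sample by the median of its feature-wise quotients against a median reference spectrum; the intensities below are simulated, not the benchmark data.

        import numpy as np

        def pqn_normalize(X):
            """Probabilistic quotient normalization of an (n_samples x n_features)
            intensity matrix: scale each sample by the median of its feature-wise
            quotients against the median reference spectrum."""
            X = np.asarray(X, dtype=float)
            reference = np.median(X, axis=0)            # median spectrum across samples
            quotients = X / reference                   # feature-wise quotients per sample
            dilution = np.median(quotients, axis=1)     # one scale factor per sample
            return X / dilution[:, None]

        rng = np.random.default_rng(6)
        base = rng.lognormal(mean=2.0, sigma=0.5, size=(1, 200))
        dilutions = rng.uniform(0.5, 2.0, size=(20, 1))            # toy per-sample dilution
        X = base * dilutions * rng.lognormal(0, 0.05, (20, 200))   # toy intensities
        Xn = pqn_normalize(X)
        # Coefficient of variation of total intensity, near zero after normalization.
        print(np.round(Xn.sum(axis=1).std() / Xn.sum(axis=1).mean(), 3))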

  6. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis

    PubMed Central

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-01-01

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample-size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study can serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data. PMID:27958387

  7. Rapid and sensitive analysis of 27 underivatized free amino acids, dipeptides, and tripeptides in fruits of Siraitia grosvenorii Swingle using HILIC-UHPLC-QTRAP(®)/MS(2) combined with chemometrics methods.

    PubMed

    Zhou, Guisheng; Wang, Mengyue; Li, Yang; Peng, Ying; Li, Xiaobo

    2015-08-01

    In the present study, a new strategy based on chemical analysis and chemometrics methods was proposed for the comprehensive analysis and profiling of underivatized free amino acids (FAAs) and small peptides among various Luo-Han-Guo (LHG) samples. Firstly, the ultrasound-assisted extraction (UAE) parameters were optimized using Plackett-Burman (PB) screening and Box-Behnken designs (BBD), and the following optimal UAE conditions were obtained: ultrasound power of 280 W, extraction time of 43 min, and solid-liquid ratio of 302 mL/g. Secondly, a rapid and sensitive analytical method was developed for simultaneous quantification of 24 FAAs and 3 active small peptides in LHG at trace levels using hydrophilic interaction ultra-performance liquid chromatography coupled with triple-quadrupole linear ion-trap tandem mass spectrometry (HILIC-UHPLC-QTRAP(®)/MS(2)). The analytical method was validated in terms of matrix effects, linearity, LODs, LOQs, precision, repeatability, stability, and recovery. Thirdly, the optimal UAE conditions and the analytical method were applied to the measurement of LHG samples, which showed that LHG is rich in essential amino acids, beneficial nutrients for human health. Finally, based on the contents of the 27 analytes, the chemometrics methods of unsupervised principal component analysis (PCA) and supervised counter-propagation artificial neural network (CP-ANN) were applied to differentiate and classify the 40 batches of LHG samples from different cultivated forms, regions, and varieties. As a result, these samples were mainly clustered into three groups, illustrating the cultivation disparity among the samples. In summary, the presented strategy has potential for the investigation of edible plants and agricultural products containing FAAs and small peptides.

  8. Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses

    PubMed Central

    Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah

    2015-01-01

    Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
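
    The core idea, approximating the study population's local LD as a mixture of reference panels instead of a single 'best guess' panel, can be sketched as a constrained least-squares fit. In this illustration the target LD matrix is assumed to be available from somewhere (the actual method learns its weights from the summary data themselves), and all names are ours.

        import numpy as np
        from scipy.optimize import nnls

        def mix_panels(panel_lds, target_ld):
            # panel_lds: list of (m, m) LD matrices, one per panel.
            # target_ld: (m, m) estimate of the study population's LD
            # (an illustrative assumption, not how Adapt-Mix gets it).
            A = np.column_stack([P.ravel() for P in panel_lds])
            w, _ = nnls(A, target_ld.ravel())   # nonnegative least squares
            w = w / w.sum()                     # convex combination
            mixed = sum(wi * P for wi, P in zip(w, panel_lds))
            return w, mixed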

  9. Inferring brain-computational mechanisms with models of activity measurements

    PubMed Central

    Diedrichsen, Jörn

    2016-01-01

    High-resolution functional imaging is providing increasingly rich measurements of brain activity in animals and humans. A major challenge is to leverage such data to gain insight into the brain's computational mechanisms. The first step is to define candidate brain-computational models (BCMs) that can perform the behavioural task in question. We would then like to infer which of the candidate BCMs best accounts for measured brain-activity data. Here we describe a method that complements each BCM by a measurement model (MM), which simulates the way the brain-activity measurements reflect neuronal activity (e.g. local averaging in functional magnetic resonance imaging (fMRI) voxels or sparse sampling in array recordings). The resulting generative model (BCM-MM) produces simulated measurements. To avoid having to fit the MM to predict each individual measurement channel of the brain-activity data, we compare the measured and predicted data at the level of summary statistics. We describe a novel particular implementation of this approach, called probabilistic representational similarity analysis (pRSA) with MMs, which uses representational dissimilarity matrices (RDMs) as the summary statistics. We validate this method by simulations of fMRI measurements (locally averaging voxels) based on a deep convolutional neural network for visual object recognition. Results indicate that the way the measurements sample the activity patterns strongly affects the apparent representational dissimilarities. However, modelling of the measurement process can account for these effects, and different BCMs remain distinguishable even under substantial noise. The pRSA method enables us to perform Bayesian inference on the set of BCMs and to recognize the data-generating model in each case. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574316
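
    Since the summary statistics here are representational dissimilarity matrices, a compact sketch of how an RDM is built and how two RDMs are compared may help; this is the generic correlation-distance construction, not the paper's full pRSA machinery.

        import numpy as np
        from scipy.stats import spearmanr

        def rdm(patterns):
            # patterns: (n_conditions, n_channels) activity estimates.
            # Dissimilarity = 1 - Pearson correlation between the
            # activity patterns of each pair of conditions.
            return 1.0 - np.corrcoef(patterns)

        def compare_rdms(rdm_a, rdm_b):
            # RDMs are symmetric with zero diagonals, so compare only
            # the upper triangles (diagonal excluded).
            iu = np.triu_indices_from(rdm_a, k=1)
            return spearmanr(rdm_a[iu], rdm_b[iu]).correlation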

  10. Child Welfare Research; Summaries of Research Conducted at the Child Welfare League of America.

    ERIC Educational Resources Information Center

    Child Welfare League of America, Inc., New York, NY.

    These summaries of research relating to child welfare are intended to give sufficient information about the objectives, methods, and findings of each research project to enable the reader to judge whether the full report would be of interest. Bibliographical references are included with each summary. Summaries encompass the areas of adoption…

  11. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
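
    The quantity being controlled can be illustrated in one dimension with the standard two-group empirical-Bayes estimate of the local false discovery rate (the paper's Jlfdr is the joint, multi-study analogue); taking pi0 as an input rather than estimating it is our simplifying assumption.

        import numpy as np
        from scipy.stats import norm, gaussian_kde

        def local_fdr(z, pi0=1.0):
            # lfdr(z) = pi0 * f0(z) / f(z): theoretical null density
            # over a kernel estimate of the marginal density of z.
            f = gaussian_kde(z)(z)
            return np.clip(pi0 * norm.pdf(z) / f, 0.0, 1.0)

        def reject_at_fdr(lfdr, q=0.05):
            # Reject the smallest-lfdr hypotheses while the running
            # mean lfdr of the rejected set stays below the target q.
            order = np.argsort(lfdr)
            running_mean = (np.cumsum(lfdr[order])
                            / np.arange(1, len(lfdr) + 1))
            k = int(np.searchsorted(running_mean, q, side="right"))
            return order[:k]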

  12. Tank Vapor Characterization Project: Tank 241-S-102 fourth temporal study: Headspace gas and vapor characterization results from samples collected on December 19, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pool, K.H.; Evans, J.C.; Olsen, K.B.

    1997-08-01

    This report presents the results from analyses of samples taken from the headspace of waste storage tank 241-S-102 (Tank S-102) at the Hanford Site in Washington State. Tank headspace samples collected by SGN Eurisys Service Corporation (SESC) were analyzed by Pacific Northwest National Laboratory (PNNL) to determine headspace concentrations of selected non-radioactive analytes. Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Vapor concentrations from sorbent trap samples are based on measured sample volumes provided by SESC. Ammonia was determined to be above the immediate notification limit of 150 ppm as specified by the sampling and analysis plan (SAP). Hydrogen was the principal flammable constituent of the Tank S-102 headspace, determined to be present at approximately 2.410% of its lower flammability limit (LFL). Total headspace flammability was estimated to be <2.973% of the LFL. Average measured concentrations of targeted gases, inorganic vapors, and selected organic vapors are provided in Table S.1. A summary of experimental methods, including sampling methodology, analytical procedures, and quality assurance and control methods, is presented in Section 2.0. Detailed descriptions of the analytical results are provided in Section 3.0.

  13. Service and methods demonstration program annual report - executive summary.

    DOT National Transportation Integrated Search

    1979-08-01

    This report contains a summary of the contents of the Service and Methods Demonstration Program Annual Report for Fiscal Year 1978. Program activities and accomplishments discussed in the Annual Report are reviewed including findings and insights fro...

  14. ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP): WESTERN STREAMS AND RIVERS STATISTICAL SUMMARY

    EPA Science Inventory

    This statistical summary reports data from the Environmental Monitoring and Assessment Program (EMAP) Western Pilot (EMAP-W). EMAP-W was a sample survey (or probability survey, often simply called 'random') of streams and rivers in 12 states of the western U.S. (Arizona, Californ...

  15. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    ERIC Educational Resources Information Center

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…

  16. Comparability of HbA1c and lipids measured with dried blood spot versus venous samples: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background Levels of haemoglobin A1c (HbA1c) and blood lipids are important determinants of risk in patients with diabetes. Standard analysis methods based upon venous blood samples can be logistically challenging in resource-poor settings where much of the diabetes epidemic is occurring. Dried blood spots (DBS) provide a simple alternative method for sample collection but the comparability of data from analyses based on DBS is not well established. Methods We conducted a systematic review and meta-analysis to define the association of findings for HbA1c and blood lipids for analyses based upon standard methods compared to DBS. The Cochrane, Embase and Medline databases were searched for relevant reports and summary regression lines were estimated. Results 705 abstracts were found by the initial electronic search with 6 further reports identified by manual review of the full papers. 16 studies provided data for one or more outcomes of interest. There was close agreement between the results for HbA1c assays based on venous and DBS samples (DBS = 0.9858 × venous + 0.3809), except for assays based upon affinity chromatography. Significant adjustment was required for assays of total cholesterol (DBS = 0.6807 × venous + 1.151), but results for triglycerides (DBS = 0.9557 × venous + 0.1427) were directly comparable. Conclusions For HbA1c and selected blood lipids, assays based on DBS samples are clearly associated with assays based on standard venous samples. There are, however, significant uncertainties about the nature of these associations, and there is a need for standardisation of the sample collection, transportation, storage and analysis methods before the technique can be considered mainstream. This should be a research priority because better elucidation of metabolic risks in resource-poor settings, where venous sampling is infeasible, will be key to addressing the global epidemic of cardiovascular diseases. PMID:25045323
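
    The summary regression lines quoted above can be inverted to map a DBS result to a venous-equivalent value. The sketch below only illustrates how such calibration lines are applied; it is not a clinically validated converter, and unit handling is the caller's responsibility.

        # Summary regression lines from the review (DBS = a*venous + b).
        LINES = {
            "hba1c":        (0.9858, 0.3809),
            "cholesterol":  (0.6807, 1.1510),
            "triglyceride": (0.9557, 0.1427),
        }

        def venous_equivalent(analyte, dbs_value):
            # Invert DBS = a * venous + b: venous = (DBS - b) / a.
            a, b = LINES[analyte]
            return (dbs_value - b) / a

        # Example: a DBS total cholesterol of 4.5 maps to roughly
        # (4.5 - 1.151) / 0.6807 ≈ 4.9 venous-equivalent units.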

  17. Data and statistical summaries of background concentrations of metals in soils and streambed sediments in part of Big Soos Creek drainage basin, King County, Washington

    USGS Publications Warehouse

    Prych, E.A.; Kresch, D.L.; Ebbert, J.C.; Turney, G.L.

    1995-01-01

    Twenty-nine soil samples from 14 holes at 9 sites in part of the Big Soos Creek drainage basin in southwest King County, Washington, were collected and analyzed to obtain data on the magnitude and variability of background concentrations of metals in soils. Seven streambed-sediment samples and three streamwater samples from three sites also were collected and analyzed. These data are needed by regulating government agencies to determine if soils at sites of suspected contamination have elevated concentrations of metals, and to evaluate the effectiveness of remediation at sites with known contamination. Concentrations of 43 metals were determined by a total method, and concentrations of 17 metals were determined by a total-recoverable method and two different leaching methods. Metals analyzed by all methods included most of those on the U.S. Environmental Protection Agency list of priority pollutants, plus aluminum, iron, and manganese. Ranges of concentrations of metals determined by the total method are within ranges found by others for the conterminous United States. Concentrations of mercury, manganese, phosphorus, lead, selenium, antimony, and zinc as determined by the total method, and of some of these plus other metals as determined by the other methods, were larger in shallow soil (less than 12 inches deep) than in deep soil (greater than 12 inches). Concentrations of metals in streambed sediments were more typical of shallow than deep soils.

  18. A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines

    PubMed Central

    Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua

    2018-01-01

    The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxins in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provide a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques, including sampling, extraction, cleanup, and detection, for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for qualitative and quantitative mycotoxin analysis, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides good insight into the advanced research that has been done and closes with an indication of future demand for these emerging technologies. PMID:29393905

  19. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points

    PubMed Central

    ACHCAR, J. A.; MARTINEZ, E. Z.; RUFFINO-NETTO, A.; PAULINO, C. D.; SOARES, P.

    2008-01-01

    SUMMARY We considered a Bayesian analysis for the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled the dataset using non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data was carried out using Markov chain Monte Carlo methods, with simulated Gibbs samples for the parameters of interest obtained using the WinBUGS software. PMID:18346287

  20. In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.

    PubMed

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat

    2014-11-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells. Copyright © 2014 Elsevier B.V. All rights reserved.
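
    The recommended summary statistic, and the normal-approximation power calculation that underlies curves like those in the paper, are easy to sketch. The variance components below are placeholders for mixed-model estimates, and the +1 offset guarding zero scores is our assumption.

        import numpy as np
        from scipy.stats import norm

        def summarize_sample(tail_dna_percent):
            # Median of the log-transformed % tail DNA for one sample,
            # the statistic the study found most model-consistent.
            x = np.asarray(tail_dna_percent, dtype=float)
            return np.median(np.log(x + 1.0))   # +1: assumption for zeros

        def power_two_groups(delta, var_animal, var_gel,
                             n_animals, n_gels, alpha=0.05):
            # Approximate power for dose vs control under an
            # animal/gel random-effects model (normal approximation).
            se2 = 2.0 * (var_animal / n_animals
                         + var_gel / (n_animals * n_gels))
            z = abs(delta) / np.sqrt(se2) - norm.ppf(1.0 - alpha / 2.0)
            return norm.cdf(z)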

  1. Using the Halstead-Reitan Battery to diagnose brain damage: a comparison of the predictive power of traditional techniques to Rohling's Interpretive Method.

    PubMed

    Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L

    2003-11-01

    The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.

  2. Adherence to UK national guidance for discharge information: an audit in primary care

    PubMed Central

    Hammad, Eman A; Wright, David John; Walton, Christine; Nunney, Ian; Bhattacharya, Debi

    2014-01-01

    Aims Poor communication of clinical information between healthcare settings is associated with patient harm. In 2008, the UK National Prescribing Centre (NPC) issued guidance regarding the minimum information to be communicated upon hospital discharge. This study evaluates the extent of adherence to this guidance and identifies predictors of adherence. Methods This was an audit of discharge summaries received by medical practices in one UK primary care trust of patients hospitalized for 24 h or longer. Each discharge summary was scored against the applicable NPC criteria which were organized into: ‘patient, admission and discharge’, ‘medicine’ and ‘therapy change’ information. Results Of 3444 discharge summaries audited, 2421 (70.3%) were from two teaching hospitals and 906 (26.3%) from three district hospitals. Unplanned admissions accounted for 2168 (63.0%) of the audit sample and 74.6% (2570) of discharge summaries were electronic. Mean (95% CI) adherence to the total NPC minimum dataset was 71.7% [70.2, 73.2]. Adherence to patient, admission and discharge information was 77.3% (95% CI 77.0, 77.7), 67.2% (95% CI 66.3, 68.2) for medicine information and 48.9% (95% CI 47.5, 50.3) for therapy change information. Allergy status, co-morbidities, medication history and rationale for therapy change were the most frequent omissions. Predictors of adherence included quality of the discharge template, electronic discharge summaries and smaller numbers of prescribed medicines. Conclusions Despite clear guidance regarding the content of discharge information, omissions are frequent. Adherence to the NPC minimum dataset might be improved by using comprehensive electronic discharge templates and implementing effective medicines reconciliation on both sides of the hospital and primary care interface. PMID:25041244

  3. A Summary Score for the Framingham Heart Study Neuropsychological Battery.

    PubMed

    Downer, Brian; Fardo, David W; Schmitt, Frederick A

    2015-10-01

    To calculate three summary scores of the Framingham Heart Study neuropsychological battery and determine which score best differentiates between subjects classified as having normal cognition, test-based impaired learning and memory, test-based multidomain impairment, and dementia. The final sample included 2,503 participants. Three summary scores were assessed: (a) composite score that provided equal weight to each subtest, (b) composite score that provided equal weight to each cognitive domain assessed by the neuropsychological battery, and (c) abbreviated score comprised of subtests for learning and memory. Receiver operating characteristic analysis was used to determine which summary score best differentiated between the four cognitive states. The summary score that provided equal weight to each subtest best differentiated between the four cognitive states. A summary score that provides equal weight to each subtest is an efficient way to utilize all of the cognitive data collected by a neuropsychological battery. © The Author(s) 2015.
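
    A composite that "provides equal weight to each subtest" is simply the mean of the per-subtest z-scores; a minimal sketch follows (standardizing against the sample itself is our assumption, as cohorts often standardize against published norms instead).

        import numpy as np

        def composite_score(scores):
            # scores: (n_subjects, n_subtests) raw test scores.
            # Standardize each subtest, then average, so that every
            # subtest contributes equally to the composite.
            z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)
            return z.mean(axis=1)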

  4. Statistical methods to detect novel genetic variants using publicly available GWAS summary data.

    PubMed

    Guo, Bin; Wu, Baolin

    2018-03-01

    We propose statistical methods to detect novel genetic variants using only genome-wide association study (GWAS) summary data, without access to raw genotype and phenotype data. With more and more summary data being posted for public access in the post-GWAS era, the proposed methods are very useful in practice for identifying additional interesting genetic variants and shedding light on the underlying disease mechanism. We illustrate the utility of our proposed methods with application to GWAS meta-analysis results of fasting glucose from the international MAGIC consortium. We found several novel genome-wide significant loci that are worth further study. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Field Summary Report for Remedial Investigation of Hanford Site Releases to the Columbia River, Hanford Site, Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L.C. Hulstrom

    2010-09-28

    This report documents field activity associated with the collection, preparation, and shipment of fish samples. The purpose of the report is to describe the sampling locations, identify samples collected, and describe any modifications and additions made to the sampling and analysis plan.

  6. 77 FR 70835 - Centennial Challenges 2013 Sample Return Robot Challenge

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-27

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION Centennial Challenges 2013 Sample Return Robot...). SUMMARY: This notice is issued in accordance with 51 U.S.C. 20144(c). The 2013 Sample Return Robot.... The 2013 Sample Return Robot Challenge is a prize competition designed to encourage development of new...

  7. Overview of computational structural methods for modern military aircraft

    NASA Technical Reports Server (NTRS)

    Kudva, J. N.

    1992-01-01

    Computational structural methods are essential for designing modern military aircraft. This briefing deals with computational structural methods (CSM) currently used. First a brief summary of modern day aircraft structural design procedures is presented. Following this, several ongoing CSM related projects at Northrop are discussed. Finally, shortcomings in this area, future requirements, and summary remarks are given.

  8. Bias of shear wave elasticity measurements in thin layer samples and a simple correction strategy.

    PubMed

    Mo, Jianqiang; Xu, Hao; Qiang, Bo; Giambini, Hugo; Kinnick, Randall; An, Kai-Nan; Chen, Shigao; Luo, Zongping

    2016-01-01

    Shear wave elastography (SWE) is an emerging technique for measuring biological tissue stiffness. However, the application of SWE in thin-layer tissues is limited by bias due to the influence of geometry on measured shear wave speed. In this study, we investigated the bias of Young's modulus measured by SWE in thin-layer gelatin-agar phantoms and compared the results with finite-element and Lamb-wave model simulations. The results indicated that the Young's modulus measured by SWE decreased continuously as the sample thickness decreased, and this effect was more pronounced at smaller thicknesses. We proposed a new empirical formula that can conveniently correct the bias without the need for complicated mathematical modeling. In summary, we confirmed the nonlinear relation between thickness and Young's modulus measured by SWE in thin-layer samples, and offered a simple and practical correction strategy that is convenient for clinicians to use.
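
    For context, SWE devices convert shear wave speed to Young's modulus with the standard bulk-tissue relation E = 3ρc²; the thin-layer bias arises because the measured wave becomes a guided (Lamb-type) wave, for which this relation no longer holds. The paper's empirical correction formula is not reproduced here.

        def youngs_modulus_bulk(c_shear, rho=1000.0):
            # Standard conversion for bulk, incompressible, isotropic
            # soft tissue: E [Pa] = 3 * rho [kg/m^3] * c^2 [(m/s)^2].
            # In thin layers this underestimates the true modulus.
            return 3.0 * rho * c_shear ** 2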

  9. Application of solid phase microextraction on dental composite resin analysis.

    PubMed

    Wang, Ven-Shing; Chang, Ta-Yuan; Lai, Chien-Chen; Chen, San-Yue; Huang, Long-Chen; Chao, Keh-Ping

    2012-08-15

    A direct immersion solid phase microextraction (DI-SPME) method was developed for the analysis of dentin monomers in saliva. Dentin monomers, such as triethylene glycol dimethacrylate (TEGDMA), urethane dimethacrylate (UDMA) and 2,2-bis-[4-(2-hydroxy-3-methacryloyloxypropoxy) phenyl]-propane (Bis-GMA), have a high molecular weight and a low vapor pressure. The polydimethylsiloxane/divinylbenzene (PDMS/DVB) fiber, with a medium polarity, was employed for DI-SPME, and a detection wavelength of 215 nm was found to be optimal for the HPLC measurement. The calibration range for DI-SPME was 0.30-300 μg/mL, with correlation coefficients (r) greater than 0.998 for each analyte. The DI-SPME method achieved good accuracy (recovery 96.1-101.2%) and precision (2.30-8.15% CV) for both intra- and inter-day assays of quality control samples for the three target compounds. Method validation was performed on standards dissolved in blank saliva, and there was no significant difference (p>0.2) between the DI-SPME method and the liquid injection method. However, the detection limit of DI-SPME was as low as 0.03, 0.27 and 0.06 μg/mL for TEGDMA, UDMA and Bis-GMA, respectively. Real-sample analyses were performed on commercial dentin products after curing, to measure leaching. In summary, DI-SPME is a more sensitive method that requires fewer sample pretreatment steps to measure the resin materials leached in saliva. Copyright © 2012 Elsevier B.V. All rights reserved.
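
    The accuracy and precision figures quoted (percent recovery, %CV) follow from standard formulas, sketched here for completeness; the function names are ours.

        import numpy as np

        def recovery_and_cv(measured, spiked_conc):
            # Percent recovery of quality-control replicates and the
            # coefficient of variation of those replicates.
            measured = np.asarray(measured, dtype=float)
            recovery = 100.0 * measured.mean() / spiked_conc
            cv = 100.0 * measured.std(ddof=1) / measured.mean()
            return recovery, cv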

  10. Salivary Cortisol as a Biomarker of Stress in Mothers and their Low Birth Weight Infants and Sample Collecting Challenges

    PubMed Central

    Vujičić, Ana Đorđević; Đukić, Svjetlana Maglajić

    2016-01-01

    Summary Background Salivary cortisol measurement is a non-invasive method suitable for use in neonatal research. Mother-infant separation after birth represents stress and skin-to-skin contact (SSC) has numerous benefits. The aim of the study was to measure salivary cortisol in mothers and newborns before and after SSC in order to assess the effect of SSC on mothers’ and infants’ stress and to estimate the efficacy of collecting small saliva samples in newborns. Methods Salivary cortisol was measured in 35 mother-infant pairs before and after the first and the fifth SSC in small saliva samples (50 μL) using the high sensitivity Quantitative ELISA-Kit (0.0828 nmol/L) for low cortisol levels detection. Samples were collected with an eye sponge over 3 to 5 minutes. Results Cortisol level in mothers decreased after SSC: the highest levels were measured before and the lowest after SSC and the differences in values were significant during both the first (p<0.001) and the fifth SSC (p<0.001). During the first SSC a cortisol level decrease was detected in 14 (40%) and an increase in 21 (60%) newborns, and during the fifth SSC a decrease was detected in 16 (45.7%) and an increase in 19 (54.3%) newborns, without confirmed significance of the difference. Saliva sampling efficacy using the eye sponge was 75%. Conclusions The cortisol level decrease in mothers demonstrates stress reduction during SSC, while variable cortisol levels in infants do not indicate stress reduction and imply the need for further research. The sampling method used appeared to be among the most suitable with respect to sample volume, sampling time, and efficacy. PMID:28356870

  11. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    PubMed Central

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
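
    A fixed-effects version of the sufficient-summary-statistic idea weights each subject's mean by the inverse of its within-subject sampling variance; the approach reviewed in the paper also accounts for between-subject variance, which this sketch omits.

        import numpy as np
        from scipy.stats import norm

        def weighted_group_test(subject_means, subject_vars, ns):
            # subject_means: per-subject effect estimates.
            # subject_vars: within-subject variances of the raw data.
            # ns: observations per subject.  Weight = 1 / Var(mean).
            w = np.asarray(ns, float) / np.asarray(subject_vars, float)
            z = np.sum(w * subject_means) / np.sqrt(np.sum(w))
            return z, 2.0 * norm.sf(abs(z))     # two-sided p-value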

  12. Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.

    PubMed

    Deng, Yangqing; Pan, Wei

    2017-12-01

    There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical, and other, reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and, for these patients, it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the working independence model for robust inference. We provide numerical examples based on both simulated and real data, including two large lipid GWAS summary association datasets based on ∼100,000 and ∼189,000 samples, respectively, to demonstrate the difference between marginal and conditional analyses, as well as the effectiveness of our new approach. Copyright © 2017 by the Genetics Society of America.
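
    To make the marginal-versus-conditional distinction concrete, one widely used summary-statistics approximation adjusts one trait's z-score for a correlated trait via their phenotypic correlation r; this is shown for illustration only and is not the paper's exact estimator.

        import numpy as np

        def conditional_z(z_trait, z_adjust, r):
            # Approximate z for trait 1 adjusted for trait 2, where r
            # is the phenotypic correlation (itself estimable from
            # genome-wide summary statistics under the null).
            return (z_trait - r * z_adjust) / np.sqrt(1.0 - r ** 2)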

  13. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample

    PubMed Central

    Wang, Ching-Yun; Song, Xiao

    2017-01-01

    SUMMARY Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women’s Health Initiative. PMID:27546625

  14. Extraction of features from ultrasound acoustic emissions: a tool to assess the hydraulic vulnerability of Norway spruce trunkwood?

    PubMed Central

    Rosner, Sabine; Klein, Andrea; Wimmer, Rupert; Karlsson, Bo

    2011-01-01

    Summary • The aim of this study was to assess the hydraulic vulnerability of Norway spruce (Picea abies) trunkwood by extraction of selected features of acoustic emissions (AEs) detected during dehydration of standard size samples. • The hydraulic method was used as the reference method to assess the hydraulic vulnerability of trunkwood of different cambial ages. Vulnerability curves were constructed by plotting the percentage loss of conductivity vs an overpressure of compressed air. • Differences in hydraulic vulnerability were very pronounced between juvenile and mature wood samples; therefore, useful AE features, such as peak amplitude, duration and relative energy, could be filtered out. The AE rates of signals clustered by amplitude and duration ranges and the AE energies differed greatly between juvenile and mature wood at identical relative water losses. • Vulnerability curves could be constructed by relating the cumulated amount of relative AE energy to the relative loss of water and to xylem tension. AE testing in combination with feature extraction offers a readily automated and easy to use alternative to the hydraulic method. PMID:16771986

  15. Trade-offs in sensitivity and sampling depth in bimodal atomic force microscopy and comparison to the trimodal case

    PubMed Central

    Eslami, Babak; Ebeling, Daniel

    2014-01-01

    Summary This paper presents experiments on Nafion® proton exchange membranes and numerical simulations illustrating the trade-offs between the optimization of compositional contrast and the modulation of tip indentation depth in bimodal atomic force microscopy (AFM). We focus on the original bimodal AFM method, which uses amplitude modulation to acquire the topography through the first cantilever eigenmode, and drives a higher eigenmode in open-loop to perform compositional mapping. This method is attractive due to its relative simplicity, robustness and commercial availability. We show that this technique offers the capability to modulate tip indentation depth, in addition to providing sample topography and material property contrast, although there are important competing effects between the optimization of sensitivity and the control of indentation depth, both of which strongly influence the contrast quality. Furthermore, we demonstrate that the two eigenmodes can be highly coupled in practice, especially when highly repulsive imaging conditions are used. Finally, we also offer a comparison with a previously reported trimodal AFM method, where the above competing effects are minimized. PMID:25161847

  16. Anchoring quartet-based phylogenetic distances and applications to species tree reconstruction.

    PubMed

    Sayyari, Erfan; Mirarab, Siavash

    2016-11-11

    Inferring species trees from gene trees using coalescent-based summary methods has been the subject of much attention, yet new scalable and accurate methods are needed. We introduce DISTIQUE, a new statistically consistent summary method for inferring species trees from gene trees under the coalescent model. We generalize our results to arbitrary phylogenetic inference problems; we show that two arbitrarily chosen leaves, called anchors, can be used to estimate relative distances between all other pairs of leaves by inferring relevant quartet trees. This results in a family of distance-based tree inference methods, with running times ranging from quadratic to quartic in the number of leaves. We show in simulation studies that DISTIQUE has accuracy comparable to leading coalescent-based summary methods and reduced running times.

  17. Text Summarization Model based on Facility Location Problem

    NASA Astrophysics Data System (ADS)

    Takamura, Hiroya; Okumura, Manabu

We propose a novel multi-document generic summarization model based on the budgeted median problem, which is a facility location problem. The summarization method based on our model is an extractive method, which selects sentences from the given document cluster and generates a summary. Each sentence in the document cluster is assigned to one of the selected sentences, where the former sentence is represented by the latter. Our method selects sentences to generate a summary that yields a good sentence assignment and hence covers the whole content of the document cluster. An advantage of this method is that it can incorporate asymmetric relations between sentences, such as textual entailment. Through experiments, we show that the proposed method yields good summaries on the DUC'04 dataset.
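
    Because the budgeted median objective assigns every sentence to its most similar selected sentence, the objective value is a facility-location sum, for which a cost-scaled greedy is the usual baseline solver. The sketch below illustrates that baseline; the paper's exact optimization procedure may differ.

        import numpy as np

        def greedy_summary(sim, costs, budget):
            # sim: (n, n) sentence-similarity matrix; costs: sentence
            # lengths; budget: maximum total summary length.
            n = sim.shape[0]
            selected, spent = [], 0.0
            covered = np.zeros(n)           # best similarity so far
            while True:
                best, best_gain = None, 0.0
                for j in range(n):
                    if j in selected or spent + costs[j] > budget:
                        continue
                    gain = (np.maximum(covered, sim[:, j]).sum()
                            - covered.sum()) / costs[j]
                    if gain > best_gain:
                        best, best_gain = j, gain
                if best is None:
                    break
                selected.append(best)
                spent += costs[best]
                covered = np.maximum(covered, sim[:, best])
            return selected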

  18. Examination of wrist and hip actigraphy using a novel sleep estimation procedure

    PubMed Central

    Ray, Meredith A.; Youngstedt, Shawn D.; Zhang, Hongmei; Robb, Sara Wagner; Harmon, Brook E.; Jean-Louis, Girardin; Cai, Bo; Hurley, Thomas G.; Hébert, James R.; Bogan, Richard K.; Burch, James B.

    2014-01-01

    Objective Improving and validating sleep scoring algorithms for actigraphs enhances their usefulness in clinical and research applications. The MTI® device (ActiGraph, Pensacola, FL) had not been previously validated for sleep. The aims were to (1) compare the accuracy of sleep metrics obtained via wrist- and hip-mounted MTI® actigraphs with polysomnographic (PSG) recordings in a sample that included both normal sleepers and individuals with presumed sleep disorders; and (2) develop a novel sleep scoring algorithm using spline regression to improve the correspondence between the actigraphs and PSG. Methods Original actigraphy data were amplified and their pattern was estimated using a penalized spline. The magnitude of amplification and the spline were estimated by minimizing the difference in sleep efficiency between wrist- (hip-) actigraphs and PSG recordings. Sleep measures using both the original and spline-modified actigraphy data were compared to PSG using the following: mean sleep summary measures; Spearman rank-order correlations of summary measures; percent of minute-by-minute agreement; sensitivity and specificity; and Bland–Altman plots. Results The original wrist actigraphy data showed modest correspondence with PSG, and much less correspondence was found between hip actigraphy and PSG. The spline-modified wrist actigraphy produced better approximations of interclass correlations, sensitivity, and mean sleep summary measures relative to PSG than the original wrist actigraphy data. The spline-modified hip actigraphy provided improved correspondence, but sleep measures were still not representative of PSG. Discussion The results indicate that with some refinement, the spline regression method has the potential to improve sleep estimates obtained using wrist actigraphy. PMID:25580202
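
    A minimal sketch of the spline idea: smooth the minute-by-minute counts with a penalized smoothing spline and score a minute as sleep when the smoothed profile is low. The smoothing factor and threshold here are illustrative assumptions, not the fitted values from the paper (which chose the amplification and spline by matching PSG sleep efficiency).

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        def spline_sleep_score(counts, smooth=None, threshold=20.0):
            # counts: minute-by-minute actigraphy activity counts.
            t = np.arange(len(counts), dtype=float)
            spl = UnivariateSpline(t, counts, s=smooth)  # penalized fit
            smoothed = np.clip(spl(t), 0.0, None)
            return smoothed < threshold      # True = scored as sleep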

  19. The application of prototype point processes for the summary and description of California wildfires

    USGS Publications Warehouse

    Nichols, K.; Schoenberg, F.P.; Keeley, J.E.; Bray, A.; Diez, D.

    2011-01-01

    A method for summarizing repeated realizations of a space-time marked point process, known as prototyping, is discussed and applied to catalogues of wildfires in California. Prototype summaries are constructed for varying time intervals using California wildfire data from 1990 to 2006. Previous work on prototypes for temporal and space-time point processes is extended here to include methods for computing prototypes with marks and the incorporation of prototype summaries into hierarchical clustering algorithms, the latter of which is used to delineate fire seasons in California. Other results include summaries of patterns in the spatial-temporal distribution of wildfires within each wildfire season. © 2011 Blackwell Publishing Ltd.

  20. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule itself were adequately described in only 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980

  1. An Adaptive Association Test for Multiple Phenotypes with GWAS Summary Statistics.

    PubMed

    Kim, Junghi; Bai, Yun; Pan, Wei

    2015-12-01

    We study the problem of testing for single marker-multiple phenotype associations based on genome-wide association study (GWAS) summary statistics without access to individual-level genotype and phenotype data. For most published GWASs, because obtaining summary data is substantially easier than accessing individual-level phenotype and genotype data, while often multiple correlated traits have been collected, the problem studied here has become increasingly important. We propose a powerful adaptive test and compare its performance with some existing tests. We illustrate its applications to analyses of a meta-analyzed GWAS dataset with three blood lipid traits and another with sex-stratified anthropometric traits, and further demonstrate its potential power gain over some existing methods through realistic simulation studies. We start from the situation with only one set of (possibly meta-analyzed) genome-wide summary statistics, then extend the method to meta-analysis of multiple sets of genome-wide summary statistics, each from one GWAS. We expect the proposed test to be useful in practice as more powerful than or complementary to existing methods. © 2015 WILEY PERIODICALS, INC.
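
    An adaptive test of this kind can be sketched as a sum-of-powered-scores (SPU/aSPU-style) procedure on the vector of trait z-scores for a single SNP, with the null simulated from the z-scores' correlation matrix (estimable from summary data). The final adaptive p-value requires a second layer of Monte Carlo calibration for the minimum p-value, which this sketch omits.

        import numpy as np

        def spu_pvalues(z, R, gammas=(1, 2, 3, 4, 8),
                        n_mc=10000, seed=0):
            # z: (k,) observed z-statistics across k traits for one SNP.
            # R: (k, k) null correlation of the z's; null draws N(0, R).
            rng = np.random.default_rng(seed)
            null = rng.multivariate_normal(np.zeros(len(z)), R, size=n_mc)
            pvals = {}
            for g in gammas:
                t_obs = np.sum(np.asarray(z, float) ** g)
                t_null = np.sum(null ** g, axis=1)
                pvals[g] = (((np.abs(t_null) >= abs(t_obs)).sum() + 1)
                            / (n_mc + 1))
            # Adaptive statistic: min over gammas (its own p-value needs
            # Monte Carlo calibration, omitted here).
            return pvals, min(pvals.values())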

  2. Design and Hospital-Wide Implementation of a Standardized Discharge Summary in an Electronic Health Record

    PubMed Central

    Dean, Shannon M; Gilmore-Bykovskyi, Andrea; Buchanan, Joel; Ehlenfeldt, Brad; Kind, Amy JH

    2016-01-01

    Background The hospital discharge summary is the primary method used to communicate a patient's plan of care to the next provider(s). Despite the existence of regulations and guidelines outlining the optimal content for the discharge summary and its importance in facilitating an effective transition to post-hospital care, incomplete discharge summaries remain a common problem that may contribute to poor post-hospital outcomes. Electronic health records (EHRs) are regularly used as a platform upon which standardization of content and format can be implemented. Objective We describe here the design and hospital-wide implementation of a standardized discharge summary using an EHR. Methods We employed the evidence-based Replicating Effective Programs implementation strategy to guide the development and implementation during this large-scale project. Results Within 18 months, 90% of all hospital discharge summaries were written using the standardized format. Hospital providers found the template helpful and easy to use, and recipient providers perceived an improvement in the quality of discharge summaries compared to those sent from our hospital previously. Conclusions Discharge summaries can be standardized and implemented hospital-wide with both author and recipient provider satisfaction, especially if evidence-based implementation strategies are employed. The use of EHR tools to guide clinicians in writing comprehensive discharge summaries holds promise in improving the existing deficits in communication at transitions of care. PMID:28334559

  3. Parallel evaluation of broad virus detection methods.

    PubMed

    Modrof, Jens; Berting, Andreas; Kreil, Thomas R

    2014-01-01

    The testing for adventitious viruses is of critical importance during development and production of biological products. The recent emergence and ongoing development of broad virus detection methods calls for an evaluation of whether these methods can appropriately be implemented into current adventitious agent testing procedures. To assess the suitability of several broad virus detection methods, a comparative experimental study was conducted: four virus preparations, which were spiked at two different concentrations each into two different cell culture media, were sent to four investigators in a blinded fashion for analysis with broad virus detection methods such as polymerase chain reaction-electrospray ionization mass spectrometry (PCR-ESI/MS), microarray, and two approaches utilizing massively parallel sequencing. The results that were reported by the investigators revealed that all methods were able to identify the majority of samples correctly (mean 83%), with a surprisingly narrow range among the methods, that is, between 72% (PCR-ESI/MS) and 95% (microarray). In addition to the correct results, a variety of unexpected assignments were reported for a minority of samples, again with little variation regarding the methods used (range 20-45%), while false negatives were reported for 0-25% of the samples. Regarding assay sensitivity, the viruses were detected by all methods included in this study at concentrations of about 4-5 log10 quantitative PCR copies/mL, and probably with higher sensitivity in some cases. In summary, the broad virus detection methods investigated were shown to be suitable even for detection of relatively low virus concentrations. However, there is also some potential for the production of false-positive as well as false-negative assignments, which indicates the requirement for further improvements before these methods can be considered for routine use. © PDA, Inc. 2014.

  4. Asteroid Redirect Mission: EVA and Sample Collection

    NASA Technical Reports Server (NTRS)

    Abell, Paul; Stich, Steve

    2015-01-01

    Asteroid Redirect Mission (ARM) Overview (1) Notional Development Schedule, (2) ARV Crewed Mission Accommodations; Asteroid Redirect Crewed Mission (ARCM) Mission Summary; ARCM Accomplishments; Sample collection/curation plan (1) CAPTEM Requirements; SBAG Engagement Plan

  5. Water-Quality, Bed-Sediment, and Biological Data (October 2004 through September 2005) and Statistical Summaries of Data for Streams in the Upper Clark Fork Basin, Montana

    USGS Publications Warehouse

    Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica

    2006-01-01

    Water, bed sediment, and biota were sampled in streams from Butte to below Missoula as part of a long-term monitoring program, conducted in cooperation with the U.S. Environmental Protection Agency, to characterize aquatic resources in the upper Clark Fork basin of western Montana. Sampling sites were located on the Clark Fork, six major tributaries, and three smaller tributaries. Water-quality samples were collected periodically at 18 sites during October 2004 through September 2005 (water year 2005). Bed-sediment and biological samples were collected once in August 2005. The primary constituents analyzed were trace elements associated with tailings from historical mining and smelting activities. This report summarizes the results of water-quality, bed-sediment, and biota samples collected in water year 2005 and provides statistical summaries of data collected since 1985. Water-quality data for samples collected periodically from streams include concentrations of selected major ions, trace elements, and suspended sediment. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for three sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Quality-assurance data are reported for analytical results of water, bed sediment, and biota. Statistical summaries of water-quality, bed-sediment, and biological data are provided for the period of record since 1985 for each site.

  6. The Effects of Process Oriented Guided Inquiry Learning on Secondary Student ACT Science Scores

    NASA Astrophysics Data System (ADS)

    Judd, William Lindsey

    The purpose of this study was to examine any significant difference on secondary school chemistry students' ACT Science Test scores between students taught by the Process Oriented Guided Inquiry Learning (POGIL) method versus students taught by traditional, teacher-centered pedagogy. This study also examined any difference between students taught by the POGIL method versus students taught by traditional, teacher-centered pedagogy in regard to the three different types of questions on the ACT Science Test: data representation, research summaries, and conflicting viewpoints. The sample consisted of sophomore-level students at two private, suburban Christian schools. A pretest-posttest design was used to compare the mean difference in scores from ACT issued sample test booklets before and after each group had received instruction via the POGIL method or more traditional methods. This study found that there was no significant difference in the mean difference of test scores between the two groups. This study also found that there was not a significant difference in the mean difference of scores in regard to the three different types of questions on the ACT Science Test. Further implications of this study are discussed.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume II of the Site Environmental Report for 2006 is provided by Ernest Orlando Lawrence Berkeley National Laboratory as a supplemental appendix to Volume I, which contains the body of the report. Volume II contains the environmental monitoring and sampling data used to generate summary results of routine and nonroutine activities at the Laboratory (except for groundwater sampling data, which may be found in the reports referred to in Chapter 4). Volume I summarizes the results from analyses of the data. The results from sample collections are more comprehensive in Volume II than in Volume I: For completeness, all results from sample collections that began or ended in calendar year (CY) 2006 are included in this volume. However, the samples representing CY 2005 data have not been used in the summary results that are reported in Volume I. (For example, although ambient air samples collected on January 2, 2006, are presented in Volume II, they represent December 2005 data and are not included in Table 4-2 in Volume I.)

  8. Summary of the 1987 soil sampling effort at the Idaho National Engineering Laboratory Test Reactor Area Paint Shop Ditch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, T.R.; Knight, J.L.; Hertzler, C.L.

    1989-08-01

    Sampling of the Test Reactor Area (TRA) Paint Shop Ditch at the Idaho National Engineering Laboratory was initiated in compliance with the Interim Agreement between the Department of Energy (DOE) and the Environmental Protection Agency (EPA). Sampling of the TRA Paint Shop Ditch was done as part of the Action Plan to achieve and maintain compliance with the Resource Conservation and Recovery Act (RCRA) and applicable regulations. The purpose of this document is to provide a summary of the July 6, 1987 sampling activities that occurred in the ditch west of Building TRA-662, which housed the TRA Paint Shop in 1987. This report gives a narrative description of the field activities and the locations of collected samples, and discusses the sampling procedures and the chemical analyses. The report also brings together data and reports on the TRA Paint Shop Ditch for archival purposes. 6 refs., 10 figs., 8 tabs.

  9. [Identification of antler powder components based on DNA barcoding technology].

    PubMed

    Jia, Jing; Shi, Lin-chun; Xu, Zhi-chao; Xin, Tian-yi; Song, Jing-yuan; Chen, Shi-lin

    2015-10-01

    In order to authenticate the components of antler powder on the market, DNA barcoding technology coupled with a cloning method was used. Cytochrome c oxidase subunit I (COI) sequences were obtained according to the DNA barcoding standard operation procedure (SOP). For antler powder with possible mixed components, the cloning method was used to obtain each COI sequence. 65 COI sequences were successfully obtained from commercial antler powders via sequencing of PCR products. The results indicate that only 38% of these samples were derived from Cervus nippon Temminck or Cervus elaphus Linnaeus, which are recorded in the 2010 edition of the "Chinese Pharmacopoeia", while 62% were derived from other species. Rangifer tarandus Linnaeus was the most frequent species among the adulterants. Further analysis showed that some samples, collected from different regions and companies and at different prices, contained adulterants. Analysis of 36 COI sequences obtained by the cloning method showed that C. elaphus and C. nippon were the main components. In addition, some samples were clearly labeled as antler powder, yet C. elaphus or R. tarandus were their main components. In summary, DNA barcoding can accurately and efficiently distinguish the exact contents of commercial antler powder, providing a new technique to ensure clinical safety and improve quality control of traditional Chinese medicine.

  10. Evaluation of automatic video summarization systems

    NASA Astrophysics Data System (ADS)

    Taskiran, Cuneyt M.

    2006-01-01

    Compact representations of video data, or video summaries, greatly enhance efficient video browsing. However, rigorous evaluation of video summaries generated by automatic summarization systems is a complicated process. In this paper we examine the summary evaluation problem. Text summarization is the oldest and most successful summarization domain. We show some parallels between these two domains and introduce methods and terminology. Finally, we present results from a comprehensive summary evaluation that we have performed.

  11. Comparison of protocols for cryopreservation of rhesus monkey spermatozoa by post-thaw motility recovery and hyperactivation.

    PubMed

    Nichols, S M; Bavister, B D

    2006-01-01

    Cryopreservation of spermatozoa is useful for gene banking and for in vitro fertilisation (IVF). This study compared several published cryopreservation techniques to find the most efficient for rhesus macaques. Effectiveness was assessed by sperm longevity (post-thaw motility % and duration) and ability to hyperactivate in response to chemical activators (caffeine, dibutyryl cyclic AMP). Each ejaculate from three males was treated with four published cryopreservation protocols (Seier et al. 1993; Sanchez-Partida et al. 2000; Si et al. 2000; Isachenko et al. 2005). Upon thawing, each sub-sample was incubated either at 37 °C in 5% CO2 in air with or without activators or at approximately 22 °C in atmospheric air without activators for 0-24 h. Samples cryopreserved using one method showed zero motility and were not included in the 2 × 2 G-test statistical analysis. The other methods all demonstrated good immediate post-thaw motility rates (68%, 73% and 62%, respectively) and underwent capacitation after exposure to activators. Sperm motility in each treatment decreased over time at both temperatures but overall, incubation at 22 °C preserved motility better in all three methods. In summary, cryopreservation of rhesus spermatozoa using the method published by Sanchez-Partida et al. or Seier et al. appeared best, potentially supporting gene banking as well as allowing for multiple IVF uses from the same sample.

  12. Ab initio folding of proteins using all-atom discrete molecular dynamics

    PubMed Central

    Ding, Feng; Tsao, Douglas; Nie, Huifen; Dokholyan, Nikolay V.

    2008-01-01

    Summary Discrete molecular dynamics (DMD) is a rapid sampling method used in protein folding and aggregation studies. Until now, DMD has been used to perform simulations of simplified protein models in conjunction with structure-based force fields. Here, we develop an all-atom protein model and a transferable force field featuring packing, solvation, and environment-dependent hydrogen bond interactions. Using the replica exchange method, we perform folding simulations of six small proteins (20–60 residues) with distinct native structures. In all cases, native or near-native states are reached in simulations. For three small proteins, multiple folding transitions are observed and the computationally characterized thermodynamics are in quantitative agreement with experiments. The predictive power of all-atom DMD highlights the importance of environment-dependent hydrogen bond interactions in modeling protein folding. The developed approach can be used for accurate and rapid sampling of conformational spaces of proteins and protein-protein complexes, and applied to protein engineering and design of protein-protein interactions. PMID:18611374
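
    The replica exchange step mentioned above can be illustrated generically. A minimal sketch of the Metropolis swap criterion used in replica-exchange (parallel tempering) simulations follows, in arbitrary reduced units; this is a generic illustration, not the authors' DMD engine:

    ```python
    # Minimal sketch of a replica-exchange (parallel tempering) swap step, as
    # used generically in folding simulations; energies and temperatures are
    # hypothetical reduced units, not values from the study.
    import math, random

    def attempt_swap(energy_i, energy_j, temp_i, temp_j):
        """Metropolis criterion for exchanging replicas i and j."""
        beta_i, beta_j = 1.0 / temp_i, 1.0 / temp_j
        delta = (beta_i - beta_j) * (energy_i - energy_j)
        return delta >= 0 or random.random() < math.exp(delta)

    # Example: neighbouring replicas on a temperature ladder
    print(attempt_swap(energy_i=-120.0, energy_j=-118.5, temp_i=1.0, temp_j=1.1))
    ```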

  13. Lead exposure in US worksites: A literature review and development of an occupational lead exposure database from the published literature

    PubMed Central

    Koh, Dong-Hee; Locke, Sarah J.; Chen, Yu-Cheng; Purdue, Mark P.; Friesen, Melissa C.

    2016-01-01

    Background Retrospective exposure assessment of occupational lead exposure in population-based studies requires historical exposure information from many occupations and industries. Methods We reviewed published US exposure monitoring studies to identify lead exposure measurement data. We developed an occupational lead exposure database from the 175 identified papers containing 1,111 sets of lead concentration summary statistics (21% area air, 47% personal air, 32% blood). We also extracted ancillary exposure-related information, including job, industry, task/location, year collected, sampling strategy, control measures in place, and sampling and analytical methods. Results Measurements were published between 1940 and 2010 and represented 27 2-digit standardized industry classification codes. The majority of the measurements were related to lead-based paint work, joining or cutting metal using heat, primary and secondary metal manufacturing, and lead acid battery manufacturing. Conclusions This database can be used in future statistical analyses to characterize differences in lead exposure across time, jobs, and industries. PMID:25968240
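
    A sketch of what one record in such a database might look like, mirroring the ancillary fields listed above; the field names, types, and example values are illustrative assumptions, not the authors' schema:

    ```python
    # Sketch of one record in an occupational lead-exposure database, mirroring
    # the fields described above; names and values are illustrative assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LeadMeasurementSummary:
        job: str                     # job title as reported in the source paper
        industry: str                # 2-digit standardized industry classification
        task_or_location: Optional[str]
        year_collected: int
        sample_type: str             # "area air", "personal air", or "blood"
        sampling_strategy: Optional[str]
        controls_in_place: Optional[str]
        n_measurements: int
        mean_concentration: float    # summary statistic from the source paper

    record = LeadMeasurementSummary(
        job="battery assembler", industry="36", task_or_location="plate stacking",
        year_collected=1978, sample_type="personal air", sampling_strategy=None,
        controls_in_place=None, n_measurements=12, mean_concentration=55.0)
    ```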

  14. Forensic Investigation of AC and PCC Pavements with Extended Service Life : Volume 2 : Petrographic Examination of PCC Core Samples at Lankard Materials Laboratory ; Executive Summary Report

    DOT National Transportation Integrated Search

    2010-09-01

    The overall purpose of this research project as described in the Executive Summary Report for Volume 1 (FHWA/OH-2010/04A) is to identify flexible, rigid and composite pavements that have not received any structural maintenance since construct...

  15. Neuropsychological tests for predicting cognitive decline in older adults

    PubMed Central

    Baerresen, Kimberly M; Miller, Karen J; Hanson, Eric R; Miller, Justin S; Dye, Richelin V; Hartman, Richard E; Vermeersch, David; Small, Gary W

    2015-01-01

    Summary Aim To determine neuropsychological tests likely to predict cognitive decline. Methods A sample of nonconverters (n = 106) was compared with those who declined in cognitive status (n = 24). Significant univariate logistic regression prediction models were used to create multivariate logistic regression models to predict decline based on initial neuropsychological testing. Results Rey–Osterrieth Complex Figure Test (RCFT) Retention predicted conversion to mild cognitive impairment (MCI) while baseline Buschke Delay predicted conversion to Alzheimer’s disease (AD). Due to group sample size differences, additional analyses were conducted using a subsample of demographically matched nonconverters. Analyses indicated RCFT Retention predicted conversion to MCI and AD, and Buschke Delay predicted conversion to AD. Conclusion Results suggest RCFT Retention and Buschke Delay may be useful in predicting cognitive decline. PMID:26107318

  16. Water-quality data (October 1988 through September 1989) and statistical summaries (March 1985 through September 1989) for the Clark Fork and selected tributaries from Galen to Missoula, Montana

    USGS Publications Warehouse

    Lambing, J.H.

    1990-01-01

    Water quality sampling was conducted at eight sites on the Clark Fork and selected tributaries from Galen to Missoula, from October 1988 through September 1989. This report presents tabulations and statistical summaries of the water quality data. Included are tabulations of streamflow, onsite water quality, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water quality data collected from March 1985 through September 1989. Selected data are illustrated by graphs showing median concentrations of trace elements in water, relation of trace-element concentrations to suspended-sediment concentrations, and median concentrations of trace elements in suspended sediment. (USGS)

  17. Water-quality data (October 1987 through September 1988) and statistical summaries (March 1985 through September 1988) for the Clark Fork and selected tributaries from Galen to Missoula, Montana

    USGS Publications Warehouse

    Lambing, John H.

    1989-01-01

    Water quality sampling was conducted at eight sites on the Clark Fork and selected tributaries from Galen to Missoula, Mont., from October 1987 through September 1988. This report presents tabulations and statistical summaries of the water quality data. Included in this report are tabulations of streamflow, onsite water quality, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water quality data collected from March 1985 through September 1988. Selected data are illustrated by graphs showing median concentrations of trace elements in water, relation of trace-element concentrations to suspended-sediment concentrations, and median concentrations of trace elements in suspended sediment. (USGS)

  18. What’s Driving Uncertainty? The Model or the Model Parameters (What’s Driving Uncertainty? The influences of model and model parameters in data analysis)

    DOE PAGES

    Anderson-Cook, Christine Michaela

    2017-03-01

    Here, one of the substantial improvements to the practice of data analysis in recent decades is the change from reporting just a point estimate for a parameter or characteristic, to now including a summary of uncertainty for that estimate. Understanding the precision of the estimate for the quantity of interest provides better understanding of what to expect and how well we are able to predict future behavior from the process. For example, when we report a sample average as an estimate of the population mean, it is good practice to also provide a confidence interval (or credible interval, if you are doing a Bayesian analysis) to accompany that summary. This helps to calibrate what ranges of values are reasonable given the variability observed in the sample and the amount of data that were included in producing the summary.
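
    A minimal worked example of the practice described here, reporting a sample mean together with a t-based 95% confidence interval; the data values are invented:

    ```python
    # Reporting a sample mean together with a t-based 95% confidence interval,
    # as the passage recommends; the data values are invented for illustration.
    import math, statistics
    from scipy import stats

    data = [9.8, 10.1, 10.4, 9.6, 10.2, 10.0, 9.9, 10.3]
    n = len(data)
    mean = statistics.mean(data)
    sem = statistics.stdev(data) / math.sqrt(n)      # standard error of the mean
    t_crit = stats.t.ppf(0.975, df=n - 1)            # two-sided 95% critical value
    low, high = mean - t_crit * sem, mean + t_crit * sem
    print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
    ```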

  19. OAST Space Theme Workshop. Volume 2: Theme summary. 4: Solar system exploration (no. 10). A: Statement of theme: B. 26 April 1976 Presentation. C. Summary. D. Initiative actions (form 5)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Major strategies for exploring the solar system focus on the return of information and the return of matter. The planetary exploration facility, an orbiting automated space station, and the sample return and exploration facility all have similar requirements. The single most essential need to enable intensive study of the outer solar system is nuclear propulsion and power capability. New initiatives in 1978 related to the reactor, data and sample acquisition and return, navigation, and environmental protection are examined.

  20. Ground truth crop proportion summaries for US segments, 1976-1979

    NASA Technical Reports Server (NTRS)

    Horvath, R. (Principal Investigator); Rice, D.; Wessling, T.

    1981-01-01

    The original ground truth data were collected, digitized, and registered to LANDSAT data for use in the LACIE and AgRISTARS projects. The numerous ground truth categories were consolidated into fewer classes of crops or crop conditions, and occurrences of these classes were counted for each segment. Tables are presented in which the individual entries are the percentage of total segment area assigned to a given class. The ground truth summaries were prepared from a 20% sample of the scene. An analysis indicates that a sample of this size provides sufficient accuracy for use of the data in initial segment screening.

  1. On the development of a theory of traveler attitude-behavior interrelationships. Volume 3 : executive summary : overview of methods, results, and conclusions

    DOT National Transportation Integrated Search

    1978-08-01

    The executive summary of this Final Report offers an overview of methods, results, and conclusions which support the development of a theory of traveler attitude-behavior interrelationships. Such a theory will be useful in the design of transpo...

  2. A comprehensive review of arsenic levels in the semiconductor manufacturing industry.

    PubMed

    Park, Donguk; Yang, Haengsun; Jeong, Jeeyeon; Ha, Kwonchul; Choi, Sangjun; Kim, Chinyon; Yoon, Chungsik; Park, Dooyong; Paek, Domyung

    2010-11-01

    This paper presents a summary of arsenic level statistics from air and wipe samples taken from studies conducted in fabrication operations. The main objectives of this study were not only to describe arsenic measurement data but also, through a literature review, to categorize fabrication workers in accordance with observed arsenic levels. All airborne arsenic measurements reported were included in the summary statistics for analysis of the measurement data. The arithmetic mean was estimated assuming a lognormal distribution from the geometric mean and the geometric standard deviation or the range. In addition, weighted arithmetic means (WAMs) were calculated based on the number of measurements reported for each mean. Analysis of variance (ANOVA) was employed to compare arsenic levels classified according to several categories such as the year, sampling type, location sampled, operation type, and cleaning technique. Nine papers were found reporting airborne arsenic measurement data from maintenance workers or maintenance areas in semiconductor chip-making plants. A total of 40 statistical summaries from seven articles were identified that represented a total of 423 airborne arsenic measurements. Arsenic exposure levels taken during normal operating activities in implantation operations (WAM = 1.6 μg m⁻³, no. of samples = 77, no. of statistical summaries = 2) were found to be lower than exposure levels of engineers who were involved in maintenance works (7.7 μg m⁻³, no. of samples = 181, no. of statistical summaries = 19). The highest level (WAM = 218.6 μg m⁻³) was associated with various maintenance works performed inside an ion implantation chamber. ANOVA revealed no significant differences in the WAM arsenic levels among the categorizations based on operation and sampling characteristics. Arsenic levels (56.4 μg m⁻³) recorded during maintenance works performed in dry conditions were found to be much higher than those from maintenance works in wet conditions (0.6 μg m⁻³). Arsenic levels from wipe samples in process areas after maintenance activities ranged from non-detectable to 146 μg cm⁻², indicating the potential for dispersion into the air and hence inhalation. We conclude that workers who are regularly or occasionally involved in maintenance work have higher potential for occupational exposure than other employees who are in charge of routine production work. In addition, fabrication workers can be classified into two groups based on the reviewed arsenic exposure levels: operators with potential for low levels of exposure and maintenance engineers with high levels of exposure. These classifications could be used as a basis for a qualitative ordinal ranking of exposure in an epidemiological study.
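
    Two of the computations described above can be sketched directly: recovering an arithmetic mean from a geometric mean (GM) and geometric standard deviation (GSD) under the stated lognormal assumption, and combining study-level means into a weighted arithmetic mean (WAM). The concentrations below are hypothetical:

    ```python
    # (1) Arithmetic mean from GM and GSD under a lognormal assumption;
    # (2) weighted arithmetic mean (WAM) across statistical summaries.
    # All concentrations below are hypothetical, not values from the review.
    import math

    def am_from_gm_gsd(gm, gsd):
        """E[X] for lognormal X with median gm and geometric SD gsd."""
        return gm * math.exp(0.5 * math.log(gsd) ** 2)

    summaries = [  # (GM in ug/m3, GSD, number of measurements)
        (1.2, 2.5, 77),
        (5.0, 3.1, 181),
    ]
    ams = [(am_from_gm_gsd(gm, gsd), n) for gm, gsd, n in summaries]
    wam = sum(am * n for am, n in ams) / sum(n for _, n in ams)
    print(f"WAM = {wam:.2f} ug/m3")
    ```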

  3. Measurement of cation exchange capacity (CEC) on natural zeolite by percolation method

    NASA Astrophysics Data System (ADS)

    Wiyantoko, Bayu; Rahmah, Nafisa

    2017-12-01

    The cation exchange capacity (CEC) measurement was carried out on natural zeolite by the percolation method. The natural zeolite samples used for the CEC measurement were activated beforehand, both physically and chemically. Physical activation was done by calcination at 600 °C for 4 hours; chemical activation used sodium hydroxide with refluxing at 60-80 °C for 3 hours. In summary, the CEC determination was performed by percolation, distillation and titration processes. Based on the measurements, the CEC values of the physically and chemically activated natural zeolite were 181.90 cmol(+)/kg and 901.49 cmol(+)/kg, respectively.
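
    A hedged back-of-envelope version of the final calculation, using the standard equivalence 1 meq per 100 g = 1 cmol(+)/kg; the titration values are hypothetical, and the actual distillation/titration protocol follows the paper, not this sketch:

    ```python
    # Back-of-envelope CEC calculation from a titration, using the equivalence
    # 1 meq per 100 g = 1 cmol(+)/kg. The numbers are hypothetical; the exact
    # percolation/distillation/titration protocol is the paper's, not this.
    def cec_cmol_per_kg(titrant_ml, normality, sample_g):
        meq_exchanged = titrant_ml * normality   # milliequivalents of NH4+ displaced
        return meq_exchanged / sample_g * 100    # meq/100 g equals cmol(+)/kg

    print(f"CEC = {cec_cmol_per_kg(titrant_ml=18.2, normality=0.1, sample_g=1.0):.1f} cmol(+)/kg")
    ```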

  4. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  5. Summary Fracturing of Coso Samples with StimuFrac

    DOE Data Explorer

    Carlos Fernandez

    2016-09-06

    Lab-scale stimulation was performed on Coso samples obtained from a single core (1623 feet TVD, reservoir Coso CGC 18-27) using StimuFrac and a control fluid without the stimuli-responsive polymer.

  6. Development and Validation of a Porcine (Sus scrofa) Sepsis Model

    DTIC Science & Technology

    2018-03-01

    Protocol summary (extract): Since the last IACUC approval, no methods have been identified to reduce the number of live animals used in this protocol. Materials and methods: Animals were anesthetized and instrumented for cardiovascular monitoring. Lipopolysaccharide (LPS, a large molecule present on the...

  7. Studying the Microanatomy of the Heart in Three Dimensions: A Practical Update

    PubMed Central

    Jarvis, Jonathan C.; Stephenson, Robert

    2013-01-01

    The structure and function of the heart needs to be understood in three dimensions. We give a brief historical summary of the methods by which such an understanding has been sought, and some practical details of the relatively new technique of micro-CT with iodine contrast enhancement in samples from rat and rabbit. We discuss how the improved anatomical detail available in fixed cadaveric hearts will enhance our ability to model and to understand the integrated function of the cardiomyocytes, conducting tissues, and fibrous supporting structures that generate the pumping function of the heart. PMID:24400272

  8. The mean and variance of phylogenetic diversity under rarefaction

    PubMed Central

    Matsen, Frederick A.

    2013-01-01

    Summary Phylogenetic diversity (PD) depends on sampling depth, which complicates the comparison of PD between samples of different depth. One approach to dealing with differing sample depth for a given diversity statistic is to rarefy, which means to take a random subset of a given size of the original sample. Exact analytical formulae for the mean and variance of species richness under rarefaction have existed for some time, but no such solution exists for PD. We have derived exact formulae for the mean and variance of PD under rarefaction. We confirm that these formulae are correct by comparing the exact solutions for the mean and variance to those calculated by repeated random (Monte Carlo) subsampling of a dataset of stem counts of woody shrubs of Toohey Forest, Queensland, Australia. We also demonstrate the application of the method using two examples: identifying hotspots of mammalian diversity in Australasian ecoregions, and characterising the human vaginal microbiome. There is a very high degree of correspondence between the analytical and random subsampling methods for calculating the mean and variance of PD under rarefaction, although the Monte Carlo method requires a large number of random draws to converge on the exact solution for the variance. Rarefaction of mammalian PD of ecoregions in Australasia to a common standard of 25 species reveals very different rank orderings of ecoregions, indicating quite different hotspots of diversity than those obtained for unrarefied PD. The application of these methods to the vaginal microbiome shows that a classical score used to quantify bacterial vaginosis is correlated with the shape of the rarefaction curve. The analytical formulae for the mean and variance of PD under rarefaction are both exact and more efficient than repeated subsampling. Rarefaction of PD allows for many applications where comparisons of samples of different depth are required. PMID:23833701
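
    A minimal sketch of the repeated random (Monte Carlo) subsampling baseline that the exact formulae are validated against, using a toy tree in which each species is mapped to the branches (with lengths) on its path to the root; the tree, stem counts, and rarefaction depth are all invented:

    ```python
    # Monte Carlo rarefaction of phylogenetic diversity (PD): the repeated-
    # subsampling baseline the exact formulae are checked against. The toy tree
    # encodes, for each species, the branches (with lengths) on its root path.
    import random, statistics

    # Hypothetical 4-species tree ((A,B),(C,D)); b1 and b2 are shared branches.
    branches = {"A": {("b1", 1.0), ("tA", 0.5)},
                "B": {("b1", 1.0), ("tB", 0.7)},
                "C": {("b2", 1.2), ("tC", 0.4)},
                "D": {("b2", 1.2), ("tD", 0.6)}}
    sample = ["A"] * 10 + ["B"] * 5 + ["C"] * 3 + ["D"] * 2   # stem counts

    def pd(species):
        """Sum of branch lengths in the union of root paths of the species."""
        return sum(length for _, length in set().union(*(branches[s] for s in species)))

    def rarefied_pd(m, draws=10000):
        vals = [pd(set(random.sample(sample, m))) for _ in range(draws)]
        return statistics.mean(vals), statistics.variance(vals)

    print(rarefied_pd(m=5))   # Monte Carlo mean and variance of PD at depth 5
    ```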

  9. Meta-Analyses of Diagnostic Accuracy in Imaging Journals: Analysis of Pooling Techniques and Their Effect on Summary Estimates of Diagnostic Accuracy.

    PubMed

    McGrath, Trevor A; McInnes, Matthew D F; Korevaar, Daniël A; Bossuyt, Patrick M M

    2016-10-01

    Purpose To determine whether authors of systematic reviews of diagnostic accuracy studies published in imaging journals used recommended methods for meta-analysis, and to evaluate the effect of traditional methods on summary estimates of sensitivity and specificity. Materials and Methods Medline was searched for published systematic reviews that included meta-analysis of test accuracy data limited to imaging journals published from January 2005 to May 2015. Two reviewers independently extracted study data and classified methods for meta-analysis as traditional (univariate fixed- or random-effects pooling or summary receiver operating characteristic curve) or recommended (bivariate model or hierarchic summary receiver operating characteristic curve). Use of methods was analyzed for variation with time, geographical location, subspecialty, and journal. Results from reviews in which study authors used traditional univariate pooling methods were recalculated with a bivariate model. Results Three hundred reviews met the inclusion criteria, and in 118 (39%) of those, authors used recommended meta-analysis methods. No change in the method used was observed with time (r = 0.54, P = .09); however, there was geographic (χ(2) = 15.7, P = .001), subspecialty (χ(2) = 46.7, P < .001), and journal (χ(2) = 27.6, P < .001) heterogeneity. Fifty-one univariate random-effects meta-analyses were reanalyzed with the bivariate model; the average change in the summary estimate was -1.4% (P < .001) for sensitivity and -2.5% (P < .001) for specificity. The average change in width of the confidence interval was 7.7% (P < .001) for sensitivity and 9.9% (P ≤ .001) for specificity. Conclusion Recommended methods for meta-analysis of diagnostic accuracy in imaging journals are used in a minority of reviews; this has not changed significantly with time. Traditional (univariate) methods allow overestimation of diagnostic accuracy and provide narrower confidence intervals than do recommended (bivariate) methods. © RSNA, 2016. Online supplemental material is available for this article.
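
    For concreteness, a sketch of the "traditional" univariate random-effects pooling of sensitivity on the logit scale (DerSimonian-Laird), the kind of method the review cautions against; the recommended bivariate model jointly models sensitivity and specificity and is not reproduced here. The study counts are invented:

    ```python
    # "Traditional" univariate DerSimonian-Laird pooling of sensitivity on the
    # logit scale; shown only to illustrate the approach the review cautions
    # against. Study counts are invented.
    import math

    studies = [(45, 5), (30, 10), (60, 8), (25, 7)]   # (true positives, false negatives)

    ys, vs = [], []
    for tp, fn in studies:
        p = tp / (tp + fn)
        ys.append(math.log(p / (1 - p)))              # logit sensitivity
        vs.append(1 / tp + 1 / fn)                    # delta-method variance

    w = [1 / v for v in vs]                           # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
    tau2 = max(0.0, (q - (len(ys) - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_star = [1 / (v + tau2) for v in vs]             # random-effects weights
    pooled_logit = sum(wi * yi for wi, yi in zip(w_star, ys)) / sum(w_star)
    print(f"pooled sensitivity = {1 / (1 + math.exp(-pooled_logit)):.3f}")
    ```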

  10. An Alternative Method for Teaching and Testing Reading Comprehension.

    ERIC Educational Resources Information Center

    Courchene, Robert

    1995-01-01

    The summary cloze technique offers an alternative to multiple choice. Summary cloze exercises are prepared by summarizing the content of the original text. The shortened text is transformed into a rational cloze exercise. The learner completes the summary text using the list of choices provided. This technique is a good measure of reading…

  11. An Optimized Trichloroacetic Acid/Acetone Precipitation Method for Two-Dimensional Gel Electrophoresis Analysis of Qinchuan Cattle Longissimus Dorsi Muscle Containing High Proportion of Marbling.

    PubMed

    Hao, Ruijie; Adoligbe, Camus; Jiang, Bijie; Zhao, Xianlin; Gui, Linsheng; Qu, Kaixing; Wu, Sen; Zan, Linsen

    2015-01-01

    Longissimus dorsi muscle (LD) proteomics provides a novel opportunity to reveal the molecular mechanism behind intramuscular fat deposition. Unfortunately, the vast amounts of lipids and nucleic acids in this tissue hamper LD proteomics analysis. Trichloroacetic acid (TCA)/acetone precipitation is a widely used method to remove contaminants from protein samples. However, the high-speed centrifugation employed in this method produces hard precipitates, which restrict contaminant elimination and protein re-dissolution. To address the problem, in the present study the centrifugation precipitates were first ground with a glass tissue grinder and then washed with 90% acetone (TCA/acetone-G-W). According to our results, this treatment of the solid precipitate facilitated non-protein contaminant removal and protein re-dissolution, ultimately improving two-dimensional gel electrophoresis (2-DE) analysis. Additionally, we also evaluated the effect of sample drying on the 2-DE profile as well as protein yield. It was found that 30 min of air-drying did not result in significant protein loss, but reduced horizontal streaking and smearing on 2-DE gels compared with 10 min. In summary, we developed an optimized TCA/acetone precipitation method for protein extraction of LD, in which the modifications improved the effectiveness of the TCA/acetone method.

  12. An Optimized Trichloroacetic Acid/Acetone Precipitation Method for Two-Dimensional Gel Electrophoresis Analysis of Qinchuan Cattle Longissimus Dorsi Muscle Containing High Proportion of Marbling

    PubMed Central

    Hao, Ruijie; Adoligbe, Camus; Jiang, Bijie; Zhao, Xianlin; Gui, Linsheng; Qu, Kaixing; Wu, Sen; Zan, Linsen

    2015-01-01

    Longissimus dorsi muscle (LD) proteomics provides a novel opportunity to reveal the molecular mechanism behind intramuscular fat deposition. Unfortunately, the vast amounts of lipids and nucleic acids in this tissue hamper LD proteomics analysis. Trichloroacetic acid (TCA)/acetone precipitation is a widely used method to remove contaminants from protein samples. However, the high-speed centrifugation employed in this method produces hard precipitates, which restrict contaminant elimination and protein re-dissolution. To address the problem, in the present study the centrifugation precipitates were first ground with a glass tissue grinder and then washed with 90% acetone (TCA/acetone-G-W). According to our results, this treatment of the solid precipitate facilitated non-protein contaminant removal and protein re-dissolution, ultimately improving two-dimensional gel electrophoresis (2-DE) analysis. Additionally, we also evaluated the effect of sample drying on the 2-DE profile as well as protein yield. It was found that 30 min of air-drying did not result in significant protein loss, but reduced horizontal streaking and smearing on 2-DE gels compared with 10 min. In summary, we developed an optimized TCA/acetone precipitation method for protein extraction of LD, in which the modifications improved the effectiveness of the TCA/acetone method. PMID:25893432

  13. 75 FR 43045 - Pistachios Grown in California, Arizona, and New Mexico; Modification of the Aflatoxin Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    .... SUMMARY: This rule modifies the aflatoxin sampling and testing regulations currently prescribed under the... Administrative Committee for Pistachios (Committee). This rule streamlines the aflatoxin sampling and testing... by providing a uniform and consistent aflatoxin sampling and testing procedure for pistachios shipped...

  14. 77 FR 59667 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Respirable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-28

    ... for OMB Review; Comment Request; Respirable Coal Mine Dust Sampling ACTION: Notice. SUMMARY: The... information collection request (ICR) titled, ``Respirable Coal Mine Dust Sampling,'' to the Office of... operator to protect miners from exposure to excessive dust levels. The respirable coal mine dust sampling...

  15. 32 CFR 865.126 - Sample report format.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Sample report format. 865.126 Section 865.126 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ORGANIZATION AND MISSION-GENERAL PERSONNEL REVIEW BOARDS Air Force Discharge Review Board § 865.126 Sample report format. Summary...

  16. 32 CFR 865.126 - Sample report format.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Sample report format. 865.126 Section 865.126 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ORGANIZATION AND MISSION-GENERAL PERSONNEL REVIEW BOARDS Air Force Discharge Review Board § 865.126 Sample report format. Summary...

  17. 78 FR 49296 - Centennial Challenges 2014 Sample Return Robot Challenge

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... Return Robot Challenge AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice of Centennial Challenges 2014 Sample Return Robot Challenge. SUMMARY: This notice is issued in accordance with 51 U.S.C. 20144(c). The 2014 Sample Return Robot Challenge is scheduled and teams that wish to...

  18. 76 FR 56819 - Centennial Challenges 2012 Sample Return Robot Challenge

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-14

    ... Return Robot Challenge AGENCY: National Aeronautics and Space Administration (NASA). ACTION: Notice. SUMMARY: This notice is issued in accordance with 42 U.S.C. 2451(314)(d). The 2012 Sample Return Robot.... The 2012 Sample Return Robot Challenge is a prize competition designed to encourage development of new...

  19. Outcome-Dependent Sampling with Interval-Censored Failure Time Data

    PubMed Central

    Zhou, Qingning; Cai, Jianwen; Zhou, Haibo

    2017-01-01

    Summary Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computation difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method work well in practical situations and are more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664

  20. Annual Surveillance Summary: Pseudomonas aeruginosa Infections in the Military Health System (MHS), 2016

    DTIC Science & Technology

    2017-06-01

    ...prescription practices, the Standard Inpatient Data Record (SIDR) to determine healthcare-associated exposures, and the Defense Manpower Data Center (DMDC)... No new methods or limitations were applied to this annual summary. As such, this report presents analytical results and discussion of CY 2016 data.

  1. Development of a Blended Instructional Model via Weblog to Enhance English Summary Writing Ability of Thai Undergraduate Students

    ERIC Educational Resources Information Center

    Termsinsuk, Saisunee

    2015-01-01

    The objective of this research and development was to develop an effective blended instructional model via weblog to enhance English summary writing ability of Thai undergraduate students. A sample group in the English program of Nakhon Ratchasima Rajabhat University was studied in academic year 2010-2013. The research instruments were an…

  2. Improved Use of Small Reference Panels for Conditional and Joint Analysis with GWAS Summary Statistics.

    PubMed

    Deng, Yangqing; Pan, Wei

    2018-06-01

    Due to issues of practicality and confidentiality of genomic data sharing on a large scale, typically only meta- or mega-analyzed genome-wide association study (GWAS) summary data, not individual-level data, are publicly available. Reanalyses of such GWAS summary data for a wide range of applications have become more and more common and useful, and often require the use of an external reference panel with individual-level genotypic data to infer linkage disequilibrium (LD) among genetic variants. However, with a sample size of only a few hundred, as in the most popular 1000 Genomes Project European sample, estimation errors for LD are not negligible, leading to often dramatically increased numbers of false positives in subsequent analyses of GWAS summary data. To alleviate the problem in the context of association testing for a group of SNPs, we propose an alternative estimator of the covariance matrix with an idea similar to multiple imputation. We use numerical examples based on both simulated and real data to demonstrate the severe problem with the use of the 1000 Genomes Project reference panels, and the improved performance of our new approach. Copyright © 2018 by the Genetics Society of America.
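
    A sketch of why small panels are problematic: naive LD estimation from a reference genotype matrix, followed by a simple shrinkage toward the identity to stabilize it. This illustrates the noise problem only; it is not the paper's multiple-imputation-style estimator, and the panel size, dosages, and shrinkage intensity below are all invented:

    ```python
    # Naive LD estimation from a small reference panel, plus simple shrinkage
    # toward the identity; shown only to illustrate the noise problem, NOT the
    # paper's multiple-imputation-style covariance estimator.
    import numpy as np

    rng = np.random.default_rng(0)
    n_ref, n_snps = 500, 50                   # roughly a 1000 Genomes-sized panel
    genotypes = rng.integers(0, 3, size=(n_ref, n_snps)).astype(float)  # 0/1/2 dosages

    ld_hat = np.corrcoef(genotypes, rowvar=False)      # noisy sample LD matrix

    lam = 0.1                                          # shrinkage intensity (assumed)
    ld_shrunk = (1 - lam) * ld_hat + lam * np.eye(n_snps)
    print(np.linalg.cond(ld_hat), np.linalg.cond(ld_shrunk))  # conditioning improves
    ```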

  3. Assessing oral health-related quality of life in general dental practice in Scotland: validation of the OHIP-14.

    PubMed

    Fernandes, Marcelo José; Ruta, Danny Adolph; Ogden, Graham Richard; Pitts, Nigel Berry; Ogston, Simon Alexander

    2006-02-01

    To validate the Oral Health Impact Profile (OHIP)-14 in a sample of patients attending general dental practice. Patients with pathology-free impacted wisdom teeth were recruited from six general dental practices in Tayside, Scotland, and followed for a year to assess the development of problems related to impaction. The OHIP-14 was completed at baseline and at 1-year follow-up, and analysed using three different scoring methods: a summary score, a weighted and standardized score, and the total number of problems reported. Instrument reliability was measured by assessing internal consistency and test-retest reliability. Construct validity was assessed using a number of variables. Linear regression was then used to model the relationship between OHIP-14 and all significantly correlated variables. Responsiveness was measured using the standardized response mean (SRM). Adjusted R²s and SRMs were calculated for each of the three scoring methods. Estimates for the differences between adjusted R²s and the differences between SRMs were obtained with 95% confidence intervals. A total of 278 and 169 patients completed the questionnaire at baseline and follow-up, respectively. Reliability - Cronbach's alpha coefficients ranged from 0.30 to 0.75. Alpha coefficients for all 14 items were 0.88 and 0.87 for baseline and follow-up, respectively. Test-retest coefficients ranged from 0.72 to 0.78. Validity - OHIP-14 scores were significantly correlated with number of teeth, education, main activity, the use of mouthwash, frequency of seeing a dentist, the reason for the last dental appointment, smoking, alcohol intake, pain and symptoms. Adjusted R²s ranged from 0.123 to 0.202 and there were no statistically significant differences between those for the three different scoring methods. Responsiveness - The SRMs ranged from 0.37 to 0.56 and there was a statistically significant difference between the summary scores method and the total number of problems method for symptomatic patients. The OHIP-14 is a valid and reliable measure of oral health-related quality of life in general dental practice and is responsive to third molar clinical change. The summary score method demonstrated performance as good as, or better than, the other methods studied.
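
    The three scoring methods compared above can be sketched for a single respondent; responses are coded 0 ("never") to 4 ("very often"), and the item weights and the problem threshold used below are illustrative assumptions rather than the published OHIP-14 weights:

    ```python
    # The three OHIP-14 scoring methods compared above, for one respondent.
    # Responses coded 0-4; the weights and the "problem" threshold (>= 2) are
    # illustrative assumptions, not the published OHIP-14 weights.
    responses = [0, 1, 3, 2, 0, 4, 1, 0, 2, 1, 0, 3, 0, 1]      # 14 items
    weights = [1.0] * 14                                        # hypothetical weights

    summary_score = sum(responses)                              # method 1: range 0-56
    n_problems = sum(r >= 2 for r in responses)                 # method 3: count of problems
    weighted = sum(w * r for w, r in zip(weights, responses))
    standardized = weighted / (4 * sum(weights)) * 100          # method 2: range 0-100
    print(summary_score, n_problems, round(standardized, 1))
    ```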

  4. Summary of inorganic compositional data for groundwater, soil-water, and surface-water samples collected at the Headgate Draw subsurface drip irrigation site, Johnson County, Wyoming

    USGS Publications Warehouse

    Geboy, Nicholas J.; Engle, Mark A.; Schroeder, Karl T.; Zupancic, John W.

    2011-01-01

    As part of a 5-year project on the impact of subsurface drip irrigation (SDI) application of coalbed-methane (CBM) produced waters, water samples were collected from the Headgate Draw SDI site in the Powder River Basin, Wyoming, USA. This research is part of a larger study to understand short- and long-term impacts on both soil and water quality from the beneficial use of CBM waters to grow forage crops through use of SDI. This document provides a summary of the context, sampling methodology, and quality assurance and quality control documentation of samples collected prior to and over the first year of SDI operation at the site (May 2008-October 2009). This report contains an associated database containing inorganic compositional data, water-quality criteria parameters, and calculated geochemical parameters for samples of groundwater, soil water, surface water, treated CBM waters, and as-received CBM waters collected at the Headgate Draw SDI site.

  5. Optimization of preservation and storage time of sponge tissues to obtain quality mRNA for next-generation sequencing.

    PubMed

    Riesgo, Ana; Pérez-Porro, Alicia R; Carmona, Susana; Leys, Sally P; Giribet, Gonzalo

    2012-03-01

    Transcriptome sequencing with next-generation sequencing technologies has the potential for addressing many long-standing questions about the biology of sponges. Transcriptome sequence quality depends on good cDNA libraries, which require high-quality mRNA. Standard protocols for preserving and isolating mRNA often require optimization for unusual tissue types. Our aim was to assess the efficiency of two preservation modes, (i) flash freezing with liquid nitrogen (LN₂) and (ii) immersion in RNAlater, for the recovery of high-quality mRNA from sponge tissues. We also tested whether long-term storage of samples at -80 °C affects the quantity and quality of mRNA. We extracted mRNA from nine sponge species and analysed the quantity and quality (A260/230 and A260/280 ratios) of mRNA according to preservation method, storage time, and taxonomy. The quantity and quality of mRNA depended significantly on the preservation method used (LN₂ outperforming RNAlater), the sponge species, and the interaction between them. When preservation was analysed in combination with either storage time or species, the quantity and A260/230 ratio were both significantly higher for LN₂-preserved samples. Interestingly, individual comparisons for each preservation method over time indicated that both methods performed equally efficiently during the first month, but RNAlater lost efficiency at storage times longer than 2 months compared with flash-frozen samples. In summary, we find that for long-term preservation of samples, flash freezing is the preferred method. If LN₂ is not available, RNAlater can be used, but mRNA extraction during the first month of storage is advised. © 2011 Blackwell Publishing Ltd.

  6. Resolving Identification Issues of Saraca asoca from Its Adulterant and Commercial Samples Using Phytochemical Markers

    PubMed Central

    Hegde, Satisha; Hegde, Harsha Vasudev; Jalalpure, Sunil Satyappa; Peram, Malleswara Rao; Pai, Sandeep Ramachandra; Roy, Subarna

    2017-01-01

    Saraca asoca (Roxb.) De Wilde (Ashoka) is a highly valued endangered medicinal tree species from the Western Ghats of India. Besides treating cardiac and circulatory problems, S. asoca provides immense relief in gynecological disorders. High price and demand, in contrast to the small population size of the plant, have motivated adulteration with other plants such as Polyalthia longifolia (Sonnerat) Thwaites. The fundamental concerns in quality control of S. asoca arise from the plant part of medicinal value (bark) and its chemical composition. Phytochemical fingerprinting with proper selection of analytical markers is a promising method for addressing quality control issues. In the present study, high-performance liquid chromatography of phenolic compounds (gallic acid, catechin, and epicatechin) coupled to multivariate analysis was used. Five samples each of S. asoca and P. longifolia from two localities, alongside five commercial market samples, showed evidence of adulteration. Subsequently, multivariate hierarchical cluster analysis and principal component analysis were applied to discriminate the adulterants of S. asoca. The proposed method enables identification of S. asoca from its putative adulterant P. longifolia and from commercial market samples. The data generated may also serve as baseline data for a quality standard for pharmacopoeias. SUMMARY: Simultaneous quantification of gallic acid, catechin, and epicatechin from Saraca asoca by high-performance liquid chromatography; detection of S. asoca from adulterant and commercial samples; use of an analytical method along with a statistical tool for addressing quality issues. Abbreviations used: HPLC: High Performance Liquid Chromatography; RP-HPLC: Reverse Phase High Performance Liquid Chromatography; CAT: Catechin; EPI: Epicatechin; GA: Gallic acid; PCA: Principal Component Analysis. PMID:28808391

  7. 76 FR 37014 - Expedited Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-24

    ...; Analysis and Sampling Procedures AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY... Contaminants Under the Safe Drinking Water Act; Analysis and Sampling Procedures. 75 FR 32295. June 8, 2010...

  8. A report card from Missourians : 2013 [executive summary].

    DOT National Transportation Integrated Search

    2013-10-01

    Overall statewide satisfaction with MoDOT and additional feedback about MoDOT's operations was obtained from a representative sample of the general adult public in Missouri. A professional calling center was engaged to obtain a diverse sample a...

  9. A report card from Missourians - 2012 : [executive summary].

    DOT National Transportation Integrated Search

    2012-07-01

    Overall statewide satisfaction with MoDOT and additional feedback about MoDOT's operations was obtained from a representative sample of the general adult public in Missouri. A professional calling center was engaged to obtain a diverse sample a...

  10. A Report Card from Missourians - 2015 : [executive summary].

    DOT National Transportation Integrated Search

    2015-07-01

    Overall statewide satisfaction with MoDOT and additional feedback about MoDOT's operations was obtained from a representative sample of the general adult public in Missouri. A professional calling center was engaged to obtain a diverse sample a...

  11. Wastewater-Based Epidemiology of Stimulant Drugs: Functional Data Analysis Compared to Traditional Statistical Methods.

    PubMed

    Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo

    2015-01-01

    Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, though it showed concordant results. Geographical location was the main predictor for the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week that is not identified by more traditional statistical methods. The results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
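
    A minimal sketch of FPCA as applied here: treat the data as a cities-by-days matrix, center it, and take the singular value decomposition, reading off explained variance and per-city scores. The loads are simulated, and this discrete PCA is only a standard approximation to full FPCA:

    ```python
    # FPCA on a cities-by-days matrix of daily drug loads, via SVD of the
    # centered data; a standard discrete approximation. Loads are simulated.
    import numpy as np

    rng = np.random.default_rng(1)
    loads = rng.gamma(shape=2.0, scale=50.0, size=(42, 7))   # 42 cities x 7 days

    centered = loads - loads.mean(axis=0)                    # remove the mean curve
    u, s, vt = np.linalg.svd(centered, full_matrices=False)

    explained = s ** 2 / np.sum(s ** 2)
    print("variance explained by first 3 FPCs:", explained[:3].round(3))

    scores = centered @ vt.T                                 # city scores on each FPC
    # scores[:, 0] could then be regressed on city characteristics, as in the study.
    ```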

  12. Characterizing Sleep Structure Using the Hypnogram

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian; Bandeen-Roche, Karen; Punjabi, Naresh M.

    2008-01-01

    Objectives: Research on the effects of sleep-disordered breathing (SDB) on sleep structure has traditionally been based on composite sleep-stage summaries. The primary objective of this investigation was to demonstrate the utility of log-linear and multistate analysis of the sleep hypnogram in evaluating differences in nocturnal sleep structure in subjects with and without SDB. Methods: A community-based sample of middle-aged and older adults with and without SDB matched on age, sex, race, and body mass index was identified from the Sleep Heart Health Study. Sleep was assessed with home polysomnography and categorized into rapid eye movement (REM) and non-REM (NREM) sleep. Log-linear and multistate survival analysis models were used to quantify the frequency and hazard rates of transitioning, respectively, between wakefulness, NREM sleep, and REM sleep. Results: Whereas composite sleep-stage summaries were similar between the two groups, subjects with SDB had higher frequencies and hazard rates for transitioning between the three states. Specifically, log-linear models showed that subjects with SDB had more wake-to-NREM sleep and NREM sleep-to-wake transitions, compared with subjects without SDB. Multistate survival models revealed that subjects with SDB transitioned more quickly from wake-to-NREM sleep and NREM sleep-to-wake than did subjects without SDB. Conclusions: The description of sleep continuity with log-linear and multistate analysis of the sleep hypnogram suggests that such methods can identify differences in sleep structure that are not evident with conventional sleep-stage summaries. Detailed characterization of nocturnal sleep evolution with event history methods provides additional means for testing hypotheses on how specific conditions impact sleep continuity and whether sleep disruption is associated with adverse health outcomes. Citation: Swihart BJ; Caffo B; Bandeen-Roche K; Punjabi NM. Characterizing sleep structure using the hypnogram. J Clin Sleep Med 2008;4(4):349–355. PMID:18763427
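
    The raw input to such log-linear models is simply the count of stage-to-stage transitions in the epoch-by-epoch hypnogram, which can be sketched directly; the hypnogram below is invented, and the multistate hazard models are not reproduced:

    ```python
    # Counting stage transitions from an epoch-by-epoch hypnogram, the raw
    # input to the log-linear models described above. The hypnogram is invented.
    from collections import Counter

    hypnogram = ["W", "W", "N", "N", "N", "R", "R", "N", "W", "N", "R", "W"]

    transitions = Counter(zip(hypnogram, hypnogram[1:]))
    for (src, dst), count in sorted(transitions.items()):
        if src != dst:                       # keep only changes of state
            print(f"{src} -> {dst}: {count}")
    ```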

  13. The quantification of short-chain chlorinated paraffins in sediment samples using comprehensive two-dimensional gas chromatography with μECD detection.

    PubMed

    Muscalu, Alina M; Morse, Dave; Reiner, Eric J; Górecki, Tadeusz

    2017-03-01

    The analysis of persistent organic pollutants in environmental samples is a challenge due to the very large number of compounds with varying chemical and physical properties. Chlorinated paraffins (CPs) are complex mixtures of chlorinated n-alkanes with varying chain lengths (C₁₀ to C₃₀) and degrees of chlorination (30 to 70% by weight). Their physical-chemical properties make these compounds persistent in the environment and able to bioaccumulate in living organisms. Comprehensive two-dimensional gas chromatography (GC × GC) coupled with micro-electron capture detection (μECD) was used to separate and quantify short-chain chlorinated paraffins (SCCPs) in sediment samples. Distinct ordered bands were observed in the GC × GC chromatograms, pointing to group separation. Using the Classification function of the ChromaTOF software, summary tables were generated to determine total area counts and to set up multilevel calibration curves for different technical mixes. Fortified sediment samples were analyzed by GC × GC-μECD with minimal extraction and cleanup. Recoveries ranged from 120 to 130%. To further validate the proposed method for the analysis of SCCPs, the laboratory participated in interlaboratory studies for the analysis of standards and sediment samples. The results showed recoveries between 75 and 95% and z-score values <2, demonstrating that the method is suitable for the analysis of SCCPs in soil/sediment samples. Graphical abstract: Quantification of SCCPs by GC × GC-μECD.
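
    The two validation summaries reported above, percent recovery for fortified samples and interlaboratory z-scores, reduce to simple arithmetic; the values below are hypothetical:

    ```python
    # Percent recovery for fortified samples and interlaboratory z-scores, the
    # two validation summaries reported above; all values are hypothetical.
    def percent_recovery(measured, spiked):
        return measured / spiked * 100

    def z_score(reported, assigned, sigma_target):
        return (reported - assigned) / sigma_target

    print(f"recovery = {percent_recovery(measured=6.2, spiked=5.0):.0f}%")
    print(f"z = {z_score(reported=95.0, assigned=100.0, sigma_target=10.0):+.1f}")  # |z| < 2 passes
    ```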

  14. A Method for Gene-Based Pathway Analysis Using Genomewide Association Study Summary Statistics Reveals Nine New Type 1 Diabetes Associations

    PubMed Central

    Evangelou, Marina; Smyth, Deborah J; Fortune, Mary D; Burren, Oliver S; Walker, Neil M; Guo, Hui; Onengut-Gumuscu, Suna; Chen, Wei-Min; Concannon, Patrick; Rich, Stephen S; Todd, John A; Wallace, Chris

    2014-01-01

    Pathway analysis can complement point-wise single nucleotide polymorphism (SNP) analysis in exploring genomewide association study (GWAS) data to identify specific disease-associated genes that can be candidate causal genes. We propose a straightforward methodology that can be used for conducting a gene-based pathway analysis using summary GWAS statistics in combination with widely available reference genotype data. We used this method to perform a gene-based pathway analysis of a type 1 diabetes (T1D) meta-analysis GWAS (of 7,514 cases and 9,045 controls). An important feature of the conducted analysis is the removal of the major histocompatibility complex gene region, the major genetic risk factor for T1D. Thirty-one of the 1,583 (2%) tested pathways were identified to be enriched for association with T1D at a 5% false discovery rate. We analyzed these 31 pathways and their genes to identify SNPs in or near these pathway genes that showed potentially novel association with T1D, and attempted to replicate the association of 22 SNPs in additional samples. Replication P-values were skewed, with 12 of the 22 SNPs showing evidence of replication. Support, including replication evidence, was obtained for nine T1D-associated variants in the genes ITGB7 (rs11170466), NRP1 (rs722988), BAD (rs694739), CTSB (rs1296023), FYN (rs11964650), UBE2G1 (rs9906760), MAP3K14 (rs17759555), ITGB1 (rs1557150), and IL7R (rs1445898). The proposed methodology can be applied to other GWAS datasets for which only summary level data are available. PMID:25371288

  15. Rapid assessment of Schistosoma mansoni: the validity, applicability and cost-effectiveness of the Lot Quality Assurance Sampling method in Uganda

    PubMed Central

    Brooker, Simon; Kabatereine, Narcis B.; Myatt, Mark; Stothard, J. Russell; Fenwick, Alan

    2007-01-01

    Summary Rapid and accurate identification of communities at highest risk of morbidity from schistosomiasis is key for sustainable control. Although school questionnaires can effectively and inexpensively identify communities with a high prevalence of Schistosoma haematobium, parasitological screening remains the preferred option for S. mansoni. To help reduce screening costs, we investigated the validity of Lot Quality Assurance Sampling (LQAS) in classifying schools according to categories of S. mansoni prevalence in Uganda, and explored its applicability and cost-effectiveness. First, we evaluated several sampling plans using computer simulation and then field tested one sampling plan in 34 schools in Uganda. Finally, the cost-effectiveness of different screening and control strategies (including mass treatment without prior screening) was determined, and sensitivity analysis undertaken to assess the effect of infection levels and treatment costs. In identifying schools with prevalence ≥50%, computer simulations showed that LQAS had high levels of sensitivity and specificity (>90%) at sample sizes <20. The method also provides an ability to classify communities into three prevalence categories. Field testing showed that LQAS with 15 children sampled had excellent diagnostic performance (sensitivity: 100%, specificity: 96.4%, positive predictive value: 85.7% and negative predictive value: 92.3%). Screening using LQAS was more cost-effective than mass treating all schools (US$218 vs. US$482 per high-prevalence school treated). Threshold analysis indicated that parasitological screening and mass treatment would become equivalent in settings where prevalence exceeds 50% in 75% of schools and at treatment costs of US$0.19 per schoolchild. We conclude that, in Uganda, LQAS provides a rapid, valid, and cost-effective method for guiding decision makers in allocating finite resources for the control of schistosomiasis. PMID:15960703
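
    The reported diagnostic performance of the LQAS decision rule reduces to the standard 2 × 2 quantities, sketched below with invented counts (not the study's data):

    ```python
    # Classification performance of an LQAS decision rule against full
    # parasitological screening, from a 2x2 table; the counts are invented.
    def diagnostics(tp, fp, fn, tn):
        return {"sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn)}

    for name, value in diagnostics(tp=6, fp=1, fn=0, tn=27).items():
        print(f"{name}: {value:.3f}")
    ```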

  16. Dissecting the genetics of complex traits using summary association statistics.

    PubMed

    Pasaniuc, Bogdan; Price, Alkes L

    2017-02-01

    During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.

  17. Packet flow monitoring tool and method

    DOEpatents

    Thiede, David R [Richland, WA

    2009-07-14

    A system and method for converting packet streams into session summaries. A session summary is a group of packets sharing a common source and destination internet protocol (IP) address and, if present in the packets, common ports. The system first captures packets from a transport layer of a network of computer systems, then decodes the captured packets to determine the destination and source IP addresses. The system then identifies packets having common destination and source IP addresses, and writes the decoded packets to an allocated memory structure as session summaries in a queue.
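
    A minimal sketch of the session-summary grouping described above, written in Python rather than mirroring the patented implementation; the packet-record layout is hypothetical:

        from collections import defaultdict

        def build_session_summaries(packets):
            # Each packet is a (src_ip, dst_ip, src_port, dst_port, length)
            # tuple after decoding; packets sharing all four address/port
            # fields accumulate into one session summary.
            sessions = defaultdict(lambda: {"packets": 0, "bytes": 0})
            for src_ip, dst_ip, src_port, dst_port, length in packets:
                key = (src_ip, dst_ip, src_port, dst_port)
                sessions[key]["packets"] += 1
                sessions[key]["bytes"] += length
            return dict(sessions)

        summaries = build_session_summaries([
            ("10.0.0.1", "10.0.0.2", 51514, 80, 512),
            ("10.0.0.1", "10.0.0.2", 51514, 80, 1460),
        ])  # one session summary: 2 packets, 1972 bytes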

  18. Dissecting the genetics of complex traits using summary association statistics

    PubMed Central

    Pasaniuc, Bogdan; Price, Alkes L.

    2017-01-01

    During the past decade, genome-wide association studies (GWAS) have successfully identified tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyze summary association statistics. Here we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases. PMID:27840428

  19. Manufacturing Methods and Technology (MMT) project execution report

    NASA Astrophysics Data System (ADS)

    Swim, P. A.

    1982-10-01

    This document is a summary compilation of the manufacturing methods and technology program project status reports (RCS DRCMT-301) submitted to IBEA from DARCOM major Army subcommands and project managers. Each page of the computerized section lists project number, title, status, funding, and projected completion date. Summary pages give information relating to the overall DARCOM program.

  20. A novel approach to quantifying the spatiotemporal behavior of instrumented grey seals used to sample the environment.

    PubMed

    Baker, Laurie L; Mills Flemming, Joanna E; Jonsen, Ian D; Lidgard, Damian C; Iverson, Sara J; Bowen, W Don

    2015-01-01

    Paired with satellite location telemetry, animal-borne instruments can collect spatiotemporal data describing the animal's movement and environment at a scale relevant to its behavior. Ecologists have developed methods for identifying the area(s) used by an animal (e.g., home range) and those used most intensely (utilization distribution) based on location data. However, few have extended these models beyond their traditional roles as descriptive 2D summaries of point data. Here we demonstrate how the home range method, T-LoCoH, can be expanded to quantify collective sampling coverage by multiple instrumented animals, using grey seals (Halichoerus grypus) equipped with GPS tags and acoustic transceivers on the Scotian Shelf (Atlantic Canada) as a case study. At the individual level, we illustrate how time and space-use metrics quantifying individual sampling coverage may be used to determine the rate of acoustic transmissions received. Grey seals collectively sampled an area of 11,308 km² and intensely sampled an area of 31 km² from June to December. The largest area sampled was in July (2094.56 km²) and the smallest area sampled occurred in August (1259.80 km²), with changes in sampling coverage observed through time. T-LoCoH provides an effective means to quantify changes in collective sampling effort by multiple instrumented animals and to compare these changes across time. We also illustrate how time and space-use metrics of individual instrumented seal movement calculated using T-LoCoH can be used to account for differences in the amount of time a bioprobe (biological sampling platform) spends in an area.

  1. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits.

    PubMed

    Asimit, Jennifer L; Panoutsopoulou, Kalliope; Wheeler, Eleanor; Berndt, Sonja I; Cordell, Heather J; Morris, Andrew P; Zeggini, Eleftheria; Barroso, Inês

    2015-12-01

    Diseases co-occur in individuals more often than expected by chance, which may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes' factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
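
    The abstract notes that BFs can be approximated from summary statistics. One widely used form is Wakefield's approximate Bayes factor, sketched below; the paper's own approximation and prior may differ, and the prior variance W here is an illustrative choice:

        import numpy as np

        def approx_bayes_factor(beta_hat, se, W=0.04):
            # Wakefield-style approximate Bayes factor for association vs. null.
            # beta_hat: estimated effect (e.g., log odds ratio); se: its standard
            # error; W: prior variance of the true effect under association
            # (0.04 corresponds to a prior SD of 0.2 on the log-odds scale, an
            # assumption made for this sketch).
            V = se ** 2
            z2 = (beta_hat / se) ** 2
            # Ratio of marginal likelihoods N(beta_hat; 0, V + W) / N(beta_hat; 0, V)
            return np.sqrt(V / (V + W)) * np.exp(0.5 * z2 * W / (V + W))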

  2. Lifestyle and health-related quality of life: A cross-sectional study among civil servants in China

    PubMed Central

    2012-01-01

    Background Health-related quality of life (HRQoL) has been increasingly acknowledged as a valid and appropriate indicator of public health and chronic morbidity. However, limited research has been conducted among Chinese civil servants, owing to their distinctive lifestyle. The aim of the study was to evaluate the HRQoL of Chinese civil servants and to identify factors that might be associated with their HRQoL. Methods A cross-sectional study was conducted to investigate the HRQoL of 15,000 civil servants in China using stratified random sampling. Independent-samples t-tests, one-way ANOVA, and multiple stepwise regression were used to analyse the influencing factors and the HRQoL of the civil servants. Results Univariate analysis showed significant differences in the physical component summary (PCS), mental component summary (MCS), and total score (TS) across lifestyle factors, such as smoking, drinking alcohol, having breakfast, sleep time, physical exercise, work time, operating computers, and sedentariness (P < 0.05). Multiple stepwise regression showed significant differences in TS across lifestyle factors, such as breakfast, sleep time, physical exercise, operating computers, sedentariness, work time, and drinking (P < 0.05). Conclusion In this study, using the Short Form 36 items (SF-36), we assessed the association of HRQoL with lifestyle factors, including smoking, drinking alcohol, having breakfast, sleep time, physical exercise, work time, operating computers, and sedentariness, in China. The performance of the questionnaire in this large-scale survey is satisfactory and provides a broad picture of HRQoL status among Chinese civil servants. Our results indicate that lifestyle factors such as smoking, drinking alcohol, having breakfast, sleep time, physical exercise, work time, operating computers, and sedentariness affect the HRQoL of civil servants in China. PMID:22559315

  3. Development of a patient reported outcome scale for fatigue in multiple sclerosis: The Neurological Fatigue Index (NFI-MS)

    PubMed Central

    2010-01-01

    Background Fatigue is a common and debilitating symptom in multiple sclerosis (MS). Best-practice guidelines suggest that health services should repeatedly assess fatigue in persons with MS. Several fatigue scales are available, but concern has been expressed about their validity. The objective of this study was to examine the reliability and validity of a new scale for MS fatigue, the Neurological Fatigue Index (NFI-MS). Methods Qualitative analysis of 40 MS patient interviews had previously contributed to a coherent definition of fatigue and a potential 52-item set representing the salient themes. A draft questionnaire was mailed out to 1223 people with MS, and the resulting data were subjected to both factor and Rasch analysis. Results Data from 635 respondents (51.9% response rate) were split randomly into 'evaluation' and 'validation' samples. Exploratory factor analysis identified four potential subscales: 'physical', 'cognitive', 'relief by diurnal sleep or rest' and 'abnormal nocturnal sleep and sleepiness'. Rasch analysis led to further item reduction and the generation of a Summary scale comprising items from the Physical and Cognitive subscales. The scales were shown to fit Rasch model expectations across both the evaluation and validation samples. Conclusion A simple 10-item Summary scale, together with scales measuring the physical and cognitive components of fatigue, was validated for MS fatigue. PMID:20152031

  4. How Well Do Customers of Direct-to-Consumer Personal Genomic Testing Services Comprehend Genetic Test Results? Findings from the Impact of Personal Genomics Study

    PubMed Central

    Ostergren, Jenny E.; Gornick, Michele C.; Carere, Deanna Alexis; Kalia, Sarah S.; Uhlmann, Wendy R.; Ruffin, Mack T.; Mountain, Joanna L.; Green, Robert C.; Roberts, J. Scott

    2016-01-01

    Aim To assess customer comprehension of health-related personal genomic testing (PGT) results. Methods We presented sample reports of genetic results and examined responses to comprehension questions in 1,030 PGT customers (mean age: 46.7 years; 59.9% female; 79.0% college graduates; 14.9% non-White; 4.7% of Hispanic/Latino ethnicity). Sample reports presented a genetic risk for Alzheimer’s disease and type 2 diabetes, carrier screening summary results for >30 conditions, results for phenylketonuria and cystic fibrosis, and drug response results for a statin drug. Logistic regression was used to identify correlates of participant comprehension. Results Participants exhibited high overall comprehension (mean score: 79.1% correct). The highest comprehension (range: 81.1–97.4% correct) was observed in the statin drug response and carrier screening summary results, and lower comprehension (range: 63.6–74.8% correct) on specific carrier screening results. Higher levels of numeracy, genetic knowledge, and education were significantly associated with greater comprehension. Older age (≥ 60 years) was associated with lower comprehension scores. Conclusions Most customers accurately interpreted the health implications of PGT results; however, comprehension varied by demographic characteristics, numeracy and genetic knowledge, and types and format of the genetic information presented. Results suggest a need to tailor the presentation of PGT results by test type and customer characteristics. PMID:26087778

  5. 77 FR 19230 - Western Pacific Fishery Management Council; Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    .... Precious corals fishery and coral reef habitat status. iv. Update on Bio-Sampling Program data summary. v... precious coral fisheries. iv. Coral reef habitat status. v. Update on Bio-Sampling Program and Spearfishing... fisheries. iv. Precious corals fishery and coral reef habitat status. v. Update on Bio-Sampling Program Data...

  6. 75 FR 43989 - Agency Information Collection Activities; Proposed Collection; Comment Request; Sample Collection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... for Dogs Treated With SLENTROL AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The... on the sample collection plan for dogs treated with the drug SLENTROL. DATES: Submit either... technology. Sample Collection Plan for Dogs Treated With SLENTROL--21 CFR 514.80 (OMB Control Number 0910-NEW...

  7. Nonparametric estimation of benchmark doses in environmental risk assessment

    PubMed Central

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
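
    A minimal sketch of the nonparametric BMD calculation under the stated assumptions: an isotonic (monotone) fit to quantal dose-response proportions, inverted on a fine grid to find the smallest dose whose extra risk reaches the benchmark response. The data and the 10% benchmark response are illustrative, and the paper's bootstrap confidence limits are not shown:

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # exposure levels
        affected = np.array([2, 3, 6, 12, 18])         # responders per group
        n = np.full(5, 50)                             # animals per dose group

        # Monotone nondecreasing fit to the observed response proportions
        iso = IsotonicRegression(increasing=True)
        iso.fit(doses, affected / n)

        # Extra risk at dose d: (R(d) - R(0)) / (1 - R(0)); scan a fine grid
        # for the smallest dose whose extra risk reaches the benchmark response
        bmr = 0.10
        grid = np.linspace(doses.min(), doses.max(), 1001)
        risk = iso.predict(grid)
        extra = (risk - risk[0]) / (1.0 - risk[0])
        bmd = grid[np.argmax(extra >= bmr)]   # assumes the BMR is reached in range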

  8. Yellowstone grizzly bear investigations: Annual report of the Interagency Grizzly Bear Study Team, 2006

    USGS Publications Warehouse

    Schwartz, Charles C.; Haroldson, Mark A.; West, Karrie K.

    2007-01-01

    The annual reports of the IGBST summarize annual data collection. Because additional information can be obtained after publication, data summaries are subject to change. For that reason, data analyses and summaries presented in this report supersede all previously published data. The study area and sampling techniques are reported by Blanchard (1985), Mattson et al. (1991a), and Haroldson et al. (1998).

  9. Temporary traffic control handbook for local agencies : tech transfer summary.

    DOT National Transportation Integrated Search

    2016-03-01

    The updated handbook provides local agencies with uniform standards for temporary traffic control. The handbook includes sample layouts that can be used on various projects. Having sample layouts will provide a cost savings to agencies because the de...

  10. Sampling and monitoring for the mine life cycle

    USGS Publications Warehouse

    McLemore, Virginia T.; Smith, Kathleen S.; Russell, Carol C.

    2014-01-01

    Sampling and Monitoring for the Mine Life Cycle provides an overview of sampling for environmental purposes and monitoring of environmentally relevant variables at mining sites. It focuses on environmental sampling and monitoring of surface water, and also considers groundwater, process water streams, rock, soil, and other media including air and biological organisms. The handbook includes an appendix of technical summaries written by subject-matter experts that describe field measurements, collection methods, and analytical techniques and procedures relevant to environmental sampling and monitoring. The sixth of a series of handbooks on technologies for management of metal mine and metallurgical process drainage, this handbook supplements and enhances current literature and provides an awareness of the critical components and complexities involved in environmental sampling and monitoring at the mine site. It differs from most information sources by providing an approach to address all types of mining-influenced water and other sampling media throughout the mine life cycle. Sampling and Monitoring for the Mine Life Cycle is organized into a main text and six appendices that are an integral part of the handbook. Sidebars and illustrations are included to provide additional detail about important concepts, to present examples and brief case studies, and to suggest resources for further information. Extensive references are included.

  11. Self-Consistency of Rain Event Definitions

    NASA Astrophysics Data System (ADS)

    Teves, J. B.; Larsen, M.

    2014-12-01

    A dense optical rain disdrometer array was constructed to study rain variability on spatial scales of less than 100 meters with a temporal resolution of 1 minute. Approximately two months of data were classified into rain events using methods common in the literature. These methods were unable to identify an array-wide consensus as to the total number of rain events; instruments as little as 2 meters apart with similar data records sometimes identified different rain event totals. Physical considerations suggest that these differing event totals are likely due to instrument sampling fluctuations that are typically not accounted for in rain event studies. Detection of varying numbers of rain events impacts many commonly used storm statistics, including storm duration distributions and mean rain rate. A summary of these results and their implications is presented.
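
    For concreteness, one common event definition in the literature uses a minimum inter-event dry period: a new event begins at the first wet minute after at least a fixed number of consecutive dry minutes. A minimal sketch of such a classifier (the 15-minute gap is an illustrative choice, not necessarily the study's):

        def count_rain_events(rain_rate, min_dry_gap=15):
            # rain_rate: per-minute rain rates from one disdrometer; an event
            # ends once min_dry_gap consecutive dry minutes have elapsed.
            events, dry, in_event = 0, 0, False
            for r in rain_rate:
                if r > 0:
                    if not in_event:
                        events += 1
                        in_event = True
                    dry = 0
                else:
                    dry += 1
                    if dry >= min_dry_gap:
                        in_event = False
            return events

    Running such a classifier over each instrument's record and comparing per-instrument event totals reproduces the consensus check described above.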

  12. Northern Marshall Islands radiological survey: sampling and analysis summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robison, W.L.; Conrado, C.L.; Eagle, R.J.

    1981-07-23

    A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The total numbers of analyses by radionuclide are 8840 for ²⁴¹Am, 6569 for ¹³⁷Cs, 4535 for ²³⁹⁺²⁴⁰Pu, 4431 for ⁹⁰Sr, 1146 for ²³⁸Pu, 269 for ²⁴¹Pu, and 114 each for ²³⁹Pu and ²⁴⁰Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.

  13. Preparation of electrochemically active silicon nanotubes in highly ordered arrays

    PubMed Central

    Grünzel, Tobias; Lee, Young Joo; Kuepper, Karsten

    2013-01-01

    Summary Silicon as the negative electrode material of lithium ion batteries has a very large capacity, the exploitation of which is impeded by the volume changes taking place upon electrochemical cycling. A Si electrode displaying a controlled porosity could circumvent this difficulty. With this in mind, we present a preparative method that yields ordered arrays of electrochemically competent silicon nanotubes. The method is based on the atomic layer deposition of silicon dioxide onto the pore walls of an anodic alumina template, followed by a thermal reduction with lithium vapor. This thermal reduction is quantitative, homogeneous over macroscopic samples, and yields amorphous silicon and lithium oxide, to the exclusion of any lithium silicides. The reaction is characterized by spectroscopic ellipsometry for thin silica films, and by nuclear magnetic resonance and X-ray photoelectron spectroscopy for nanoporous samples. After removal of the lithium oxide byproduct, the silicon nanotubes can be contacted electrically. In a lithium ion electrolyte, they then display the electrochemical waves also observed for other bulk or nanostructured silicon systems. The method established here paves the way for systematic investigations of how the electrochemical properties (capacity, charge/discharge rates, cyclability) of nanoporous silicon negative lithium ion battery electrode materials depend on the geometry. PMID:24205460

  14. Incorporating current research into formal higher education settings using Astrobites

    NASA Astrophysics Data System (ADS)

    Sanders, Nathan E.; Kohler, Susanna; Faesi, Chris; Villar, Ashley; Zevin, Michael

    2017-10-01

    A primary goal of many undergraduate- and graduate-level courses in the physical sciences is to prepare students to engage in scientific research or to prepare students for careers that leverage skillsets similar to those used by research scientists. Even for students who may not intend to pursue a career with these characteristics, exposure to the context of applications in modern research can be a valuable tool for teaching and learning. However, a persistent barrier to student participation in research is familiarity with the technical language, format, and context that academic researchers use to communicate research methods and findings with each other: the literature of the field. Astrobites, an online web resource authored by graduate students, has published brief and accessible summaries of more than 1300 articles from the astrophysical literature since its founding in 2010. This article presents three methods for introducing students at all levels within the formal higher education setting to approaches and results from modern research. For each method, we provide a sample lesson plan that integrates content and principles from Astrobites, including step-by-step instructions for instructors, suggestions for adapting the lesson to different class levels across the undergraduate and graduate spectrum, sample student handouts, and a grading rubric.

  15. Integrative genetic risk prediction using non-parametric empirical Bayes classification.

    PubMed

    Zhao, Sihai Dave

    2017-06-01

    Genetic risk prediction is an important component of individualized medicine, but prediction accuracies remain low for many complex diseases. A fundamental limitation is the sample sizes of the studies on which the prediction algorithms are trained. One way to increase the effective sample size is to integrate information from previously existing studies. However, it can be difficult to find existing data that examine the target disease of interest, especially if that disease is rare or poorly studied. Furthermore, individual-level genotype data from these auxiliary studies are typically difficult to obtain. This article proposes a new approach to integrative genetic risk prediction of complex diseases with binary phenotypes. It accommodates possible heterogeneity in the genetic etiologies of the target and auxiliary diseases using a tuning parameter-free non-parametric empirical Bayes procedure, and can be trained using only auxiliary summary statistics. Simulation studies show that the proposed method can provide superior predictive accuracy relative to non-integrative as well as integrative classifiers. The method is applied to a recent study of pediatric autoimmune diseases, where it substantially reduces prediction error for certain target/auxiliary disease combinations. The proposed method is implemented in the R package ssa. © 2016, The International Biometric Society.

  16. Water quality and habitat conditions in upper Midwest streams relative to riparian vegetation and soil characteristics, August 1997 : study design, methods, and data

    USGS Publications Warehouse

    Sorenson, S.K.; Porter, S.D.; Akers, K.B.; Harris, M.A.; Kalkhoff, S.J.; Lee, K.E.; Roberts, L.; Terrio, P.J.

    1999-01-01

    Water-chemistry, biological, and habitat data were collected from 70 sites on Midwestern streams during August 1997 as part of an integrated, regional water-quality assessment by the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The study area includes the Corn Belt region of southern Minnesota, eastern Iowa, and west-central Illinois, one of the most intensive and productive agricultural regions of the world. The focus of the study was to evaluate the condition of wooded riparian zones and the influence of basin soil-drainage characteristics on water quality and biological-community responses. This report includes a description of the study design and site-characterization process, sample-collection and processing methods, laboratory methods, quality-assurance procedures, and summaries of data on nutrients, herbicides and metabolites, stream productivity and respiration, biological communities, habitat conditions, and agricultural-chemical and land-use information.

  17. Effectiveness of real-time polymerase chain reaction assay for the detection of Mycobacterium tuberculosis in pathological samples: a systematic review and meta-analysis.

    PubMed

    Babafemi, Emmanuel O; Cherian, Benny P; Banting, Lee; Mills, Graham A; Ngianga, Kandala

    2017-10-25

    Rapid and accurate diagnosis of tuberculosis (TB) is key to managing the disease and to controlling and preventing its transmission. Many established diagnostic methods suffer from low sensitivity or delayed results and are inadequate for rapid detection of Mycobacterium tuberculosis (MTB) in pulmonary and extra-pulmonary clinical samples. This study examined whether a real-time polymerase chain reaction (RT-PCR) assay, with a turnaround time of 2 h, would prove effective for routine detection of MTB by clinical microbiology laboratories. A systematic literature search was performed for publications in any language on the detection of MTB in pathological samples by RT-PCR assay. The following sources were used: MEDLINE via PubMed, EMBASE, BIOSIS Citation Index, Web of Science, SCOPUS, ISI Web of Knowledge and Cochrane Infectious Diseases Group Specialised Register, grey literature, and World Health Organization and Centers for Disease Control and Prevention websites. Forty-six studies met the inclusion criteria. Pooled summary estimates (95% CIs) were calculated for overall accuracy, and a bivariate meta-regression model was used for meta-analysis. Summary estimates for pulmonary TB (31 studies) were as follows: sensitivity 0.82 (95% CI 0.81-0.83), specificity 0.99 (95% CI 0.99-0.99), positive likelihood ratio 43.00 (28.23-64.81), negative likelihood ratio 0.16 (0.12-0.20), diagnostic odds ratio 324.26 (95% CI 189.08-556.09) and area under the curve 0.99. Summary estimates for extra-pulmonary TB (25 studies) were as follows: sensitivity 0.70 (95% CI 0.67-0.72), specificity 0.99 (95% CI 0.99-0.99), positive likelihood ratio 29.82 (17.86-49.78), negative likelihood ratio 0.33 (0.26-0.42), diagnostic odds ratio 125.20 (95% CI 65.75-238.36) and area under the curve 0.96. The RT-PCR assay demonstrated a high degree of sensitivity for pulmonary TB and good sensitivity for extra-pulmonary TB. It indicated a high degree of specificity for ruling in TB infection, although it may serve better as a rule-out add-on diagnostic test. RT-PCR assays demonstrate both a high degree of sensitivity in pulmonary samples and rapidity of detection of TB, which is an important factor in achieving effective global control and in patient management in terms of initiating early and appropriate anti-tubercular therapy. PROSPERO CRD42015027534.

  18. Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,

    DTIC Science & Technology

    1975-12-01

    This report includes a discussion of the research methodology used in the study, an analysis of results, and a detailed summary. Topics covered include adaptive single exponential smoothing and choosing the smoothing constant. Chapter I, Methodology, contains a description of the data; Chapter IV, Detailed Summary, presents a detailed summary of the findings and lists the limitations inherent in the research methodology.

  19. Learning to rank-based gene summary extraction.

    PubMed

    Shang, Yue; Hao, Huihui; Wu, Jiajin; Lin, Hongfei

    2014-01-01

    In recent years, the biomedical literature has been growing rapidly. These articles provide a large amount of information about proteins, genes and their interactions. Reading such a huge amount of literature is a tedious task for researchers seeking knowledge about a gene. As a result, it is valuable for biomedical researchers to gain a quick understanding of a query concept by integrating its relevant resources. In the task of gene summary generation, we regard automatic summarization as a ranking problem and apply a learning-to-rank method to solve it. This paper uses three features as a basis for sentence selection: gene ontology relevance, topic relevance and TextRank. From there, we obtain the feature weight vector using the learning-to-rank algorithm, predict the scores of candidate summary sentences, and select the top sentences to generate the summary. ROUGE (a toolkit for the automatic evaluation of summaries) was used to evaluate the summarization results, and the experiments showed that our method outperforms the baseline techniques. According to the experimental results, the combination of the three features can improve the performance of the summary.
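
    A minimal sketch of the scoring step: once a learning-to-rank algorithm has produced a feature weight vector, candidate sentences are scored by a weighted combination of the three features and the top sentences form the summary. The feature values and weights below are placeholders, not trained values:

        def score_sentences(sentences, weights):
            # Rank candidate sentences by a learned linear combination of the
            # three features named in the abstract; weights would come from the
            # learning-to-rank training step in the paper.
            scored = []
            for s in sentences:
                features = (s["go_relevance"], s["topic_relevance"], s["textrank"])
                score = sum(w * f for w, f in zip(weights, features))
                scored.append((score, s["text"]))
            # Highest-scoring sentences form the extractive gene summary
            return [text for _, text in sorted(scored, reverse=True)]

        summary = score_sentences(
            [{"go_relevance": 0.8, "topic_relevance": 0.6, "textrank": 0.4,
              "text": "GeneX regulates apoptosis."},
             {"go_relevance": 0.2, "topic_relevance": 0.3, "textrank": 0.9,
              "text": "GeneX was first cloned in 1998."}],
            weights=(0.5, 0.3, 0.2),
        )[:1]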

  20. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    Summary We study jointly testing the associations of a genetic variant with multiple correlated phenotypes using the summary statistics of individual-phenotype analyses from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual-phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes that account for between-phenotype correlation without the need to access individual-level data. Because genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple-phenotype testing procedures that jointly test a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their power in various settings with the existing methods. We applied the proposed tests to a Global Lipids Genetics Consortium GWAS summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
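
    The paper's tests combine a common-mean and a variance component in a mixed model. As a simpler illustration of the same ingredients (per-phenotype z-scores plus a between-phenotype correlation matrix estimated from summary data), the sketch below computes the standard omnibus chi-square statistic T = z' R^-1 z, which has K degrees of freedom under the null; this is a named stand-in, not the paper's mixed-model test:

        import numpy as np
        from scipy.stats import chi2

        def omnibus_pvalue(z, null_z_matrix):
            # z: (K,) z-scores of one variant across K phenotype GWASs.
            # null_z_matrix: (n_snps, K) z-scores of many presumably null SNPs,
            # used to estimate the between-phenotype correlation matrix R.
            R = np.corrcoef(null_z_matrix, rowvar=False)
            stat = z @ np.linalg.solve(R, z)   # T = z' R^{-1} z ~ chi2_K under H0
            return chi2.sf(stat, df=len(z))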

  1. Water-quality data (July 1986 through September 1987) and statistical summaries (March 1985 through September 1987) for the Clark Fork and selected tributaries from Deer Lodge to Missoula, Montana

    USGS Publications Warehouse

    Lambing, J.H.

    1988-01-01

    Water-quality sampling was conducted at seven sites on the Clark Fork and selected tributaries from Deer Lodge to Missoula, Montana, from July 1986 through September 1987. This report presents tabulations and statistical summaries of the water-quality data. The data presented in this report supplement previous data collected from March 1985 through June 1986 for six of the seven sites. Included in this report are tabulations of instantaneous values of streamflow, onsite water quality, hardness, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water-quality data collected from March 1985 through September 1987. Selected data are illustrated by graphs relating median concentrations of trace elements to suspended-sediment concentrations, and by graphs of median concentrations of trace elements in suspended sediment. (USGS)

  2. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    PubMed

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
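
    A minimal sketch of the image-formation step: each detected photon contributes a Gaussian probability density centred on its maximum-likelihood position of origin, with a width reflecting the position uncertainty, and the densities are summed on a grid. The symmetric Gaussian kernel and the grid parameters are simplifying assumptions:

        import numpy as np

        def peds_image(photon_xy, sigma, grid_size=256, extent=1.0):
            # photon_xy: (x, y) maximum-likelihood positions of detected photons;
            # sigma: position uncertainty used as the Gaussian kernel width.
            axis = np.linspace(0.0, extent, grid_size)
            xx, yy = np.meshgrid(axis, axis)
            image = np.zeros((grid_size, grid_size))
            for x0, y0 in photon_xy:
                image += np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
            return image / (2 * np.pi * sigma ** 2)   # normalize each photon's PDF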

  3. The Effect of Racial Socialization on Urban African American Use of Child Mental Health Services

    PubMed Central

    Cavaleri, Mary A.; Rodriguez, James; McKay, Mary M.

    2009-01-01

    SUMMARY Objective To examine how parental endorsement of racial socialization parenting practices relates to child mental health service use among an urban sample of African American families. Methods A cross-sectional sample of urban African American parents (n = 96) provided ratings of their beliefs concerning various dimensions of racial socialization constructs, i.e., spiritual or religious coping (SRC), extended family caring (EFC), and cultural pride reinforcement (CPR), and were assessed regarding their use of child mental health services. Results At the multivariate level, the use of child mental health services was significantly positively associated with moderate levels of endorsement of SRC and EFC. Inversely, scores in the moderate range of CPR were associated with a reduced likelihood of child mental health service use. Conclusion Parental endorsement of racial socialization parenting practices appears to play a salient role in child mental health service use among urban African American families. Further research with larger and more representative samples should be pursued. PMID:20228964

  4. Evaluation of methods for measuring relative permeability of anhydride from the Salado Formation: Sensitivity analysis and data reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christiansen, R.L.; Kalbus, J.S.; Howarth, S.M.

    This report documents, demonstrates, evaluates, and provides theoretical justification for methods used to convert experimental data into relative permeability relationships. The report facilitates accurate determination of relative permeabilities of anhydrite rock samples from the Salado Formation at the Waste Isolation Pilot Plant (WIPP). Relative permeability characteristic curves are necessary for WIPP Performance Assessment (PA) predictions of the potential for flow of waste-generated gas from the repository and brine flow into the repository. This report follows Christiansen and Howarth (1995), a comprehensive literature review of methods for measuring relative permeability. It focuses on unsteady-state experiments and describes five methods for obtaining relative permeability relationships from such experiments. Unsteady-state experimental methods were recommended for relative permeability measurements of low-permeability anhydrite rock samples from the Salado Formation because these tests produce accurate relative permeability information and take significantly less time to complete than steady-state tests. The five methods described are the Welge method, the Johnson-Bossler-Naumann method, the Jones-Roszelle method, the Ramakrishnan-Cappiello method, and the Hagoort method. A summary, an example of the calculations, and a theoretical justification are provided for each of the five methods. Displacements in porous media were numerically simulated for the calculation examples. The simulated production data were processed using the methods, and the relative permeabilities obtained were compared with those input to the numerical model. A variety of operating conditions were simulated to show the sensitivity of production behavior to rock-fluid properties.

  5. Signal, Uncertainty, and Conflict in Phylogenomic Data for a Diverse Lineage of Microbial Eukaryotes (Diatoms, Bacillariophyta)

    PubMed Central

    Parks, Matthew B; Wickett, Norman J; Alverson, Andrew J

    2018-01-01

    Abstract Diatoms (Bacillariophyta) are a species-rich group of eukaryotic microbes diverse in morphology, ecology, and metabolism. Previous reconstructions of the diatom phylogeny based on one or a few genes have resulted in inconsistent resolution or low support for critical nodes. We applied phylogenetic paralog pruning techniques to a data set of 94 diatom genomes and transcriptomes to infer perennially difficult species relationships, using concatenation and summary-coalescent methods to reconstruct species trees from data sets spanning a wide range of thresholds for taxon and column occupancy in gene alignments. Conflicts between gene and species trees decreased with both increasing taxon occupancy and bootstrap cutoffs applied to gene trees. Concordance between gene and species trees was lowest for short internodes and increased logarithmically with increasing edge length, suggesting that incomplete lineage sorting disproportionately affects species tree inference at short internodes, which are a common feature of the diatom phylogeny. Although species tree topologies were largely consistent across many data treatments, concatenation methods appeared to outperform summary-coalescent methods for sparse alignments. Our results underscore that approaches to species-tree inference based on few loci are likely to be misled by unrepresentative sampling of gene histories, particularly in lineages that may have diversified rapidly. In addition, phylogenomic studies of diatoms, and potentially other hyperdiverse groups, should maximize the number of gene trees with high taxon occupancy, though there is clearly a limit to how many of these genes will be available. PMID:29040712

  6. Rocket exhaust plume computer program improvement. Volume 1: Summary: Method of characteristics nozzle and plume programs

    NASA Technical Reports Server (NTRS)

    Ratliff, A. W.; Smith, S. D.; Penny, N. M.

    1972-01-01

    A summary is presented of the various documents that discuss and describe the computer programs and analysis techniques which are available for rocket nozzle and exhaust plume calculations. The basic method of characteristics program is discussed, along with such auxiliary programs as the plume impingement program, the plot program and the thermochemical properties program.

  7. Pre-School Education--Aims, Methods and Problems. Report of a Symposium (Venice, Italy, October 11-16, 1971).

    ERIC Educational Resources Information Center

    Council of Europe, Strasbourg (France). Committee for General and Technical Education.

    This report provides a summary of the proceedings and recommendations of the Council of Europe symposium on preschool education held in Venice, Italy in 1971. The report is divided into three major areas: (1) historical background information; (2) summaries of general lectures, especially dealing with the functions, aims, methods, and problems of…

  8. Training in metabolomics research. I. Designing the experiment, collecting and extracting samples and generating metabolomics data

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara J.; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2016-01-01

    The study of metabolism has had a long history. Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. The National Institutes of Health Common Fund Metabolomics Program was established in 2012 to stimulate interest in the approaches and technologies of metabolomics. To deliver one of the program’s goals, the University of Alabama at Birmingham has hosted an annual 4-day short course in metabolomics for faculty, postdoctoral fellows and graduate students from national and international institutions. This paper is the first part of a summary of the training materials presented in the course to be used as a resource for all those embarking on metabolomics research. PMID:27434804

  9. Development of NASA's Sample Cartridge Assembly: Summary of GEDS Design, Development Testing, and Thermal Analyses

    NASA Technical Reports Server (NTRS)

    O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn

    2017-01-01

    NASA's Sample Cartridge Assembly (SCA) project is responsible for designing and validating a payload that contains materials research samples in a sealed environment. The SCA will be heated in the European Space Agency's (ESA) Low Gradient Furnace (LGF), which is housed inside the Materials Science Research Rack (MSRR) located on the International Space Station (ISS). The first Principal Investigator (PI) to utilize the SCA will focus on Gravitational Effects on Distortion in Sintering (GEDS) research. This paper gives a summary of the design and development test effort for the GEDS SCA and discusses the role of thermal analysis in developing test profiles to meet the science and engineering requirements. Lessons learned are reviewed, and salient design features that may differ for each PI are discussed.

  10. Effect of IFN-gamma on the killing of S. aureus in human whole blood. Assessment of bacterial viability by CFU determination and by a new method using alamarBlue.

    PubMed

    DeForge, L E; Billeci, K L; Kramer, S M

    2000-11-01

    Given the increasing incidence of methicillin resistant Staphylococcus aureus (MRSA) and the recent emergence of MRSA with a reduced susceptibility to vancomycin, alternative approaches to the treatment of infection are of increasing relevance. The purpose of these studies was to evaluate the effect of IFN-gamma on the ability of white blood cells to kill S. aureus and to develop a simpler, higher throughput bacterial killing assay. Using a methicillin sensitive clinical isolate of S. aureus, a clinical isolate of MRSA, and a commercially available strain of MRSA, studies were conducted using a killing assay in which the bacteria were added directly into whole blood. The viability of the bacteria in samples harvested at various time points was then evaluated both by the classic CFU assay and by a new assay using alamarBlue. In the latter method, serially diluted samples and a standard curve containing known concentrations of bacteria were placed on 96-well plates, and alamarBlue was added. Fluorescence readings were taken, and the viability of the bacteria in the samples was calculated using the standard curve. The results of these studies demonstrated that the CFU and alamarBlue methods yielded equivalent detection of bacteria diluted in buffer. For samples incubated in whole blood, however, the alamarBlue method tended to yield lower viabilities than the CFU method due to the emergence of a slower growing subpopulation of S. aureus upon incubation in the blood matrix. A significant increase in bacterial killing was observed upon pretreatment of whole blood for 24 h with 5 or 25 ng/ml IFN-gamma. This increase in killing was detected equivalently by the CFU and alamarBlue methods. In summary, these studies describe a method that allows for the higher throughput analysis of the effects of immunomodulators on bacterial killing.
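
    The alamarBlue readout is converted to viable-bacteria estimates through a standard curve of known bacterial concentrations versus fluorescence. A minimal sketch assuming a log-log linear standard curve; the numbers are illustrative, not the study's data:

        import numpy as np

        # Hypothetical standard curve: known CFU/ml and measured fluorescence
        std_cfu = np.array([1e4, 1e5, 1e6, 1e7])
        std_fluor = np.array([120.0, 480.0, 1900.0, 7600.0])

        # Fit log10(fluorescence) as linear in log10(CFU), then invert
        slope, intercept = np.polyfit(np.log10(std_cfu), np.log10(std_fluor), 1)

        def estimate_cfu(fluorescence):
            # Back-calculate viable bacteria from a sample's fluorescence reading
            return 10 ** ((np.log10(fluorescence) - intercept) / slope)

        viable = estimate_cfu(950.0)   # approximate CFU/ml for this reading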

  11. Effectiveness of phylogenomic data and coalescent species-tree methods for resolving difficult nodes in the phylogeny of advanced snakes (Serpentes: Caenophidia).

    PubMed

    Pyron, R Alexander; Hendry, Catriona R; Chou, Vincent M; Lemmon, Emily M; Lemmon, Alan R; Burbrink, Frank T

    2014-12-01

    Next-generation genomic sequencing promises to quickly and cheaply resolve remaining contentious nodes in the Tree of Life, and facilitates species-tree estimation while taking into account stochastic genealogical discordance among loci. Recent methods for estimating species trees bypass full likelihood-based estimates of the multi-species coalescent, and approximate the true species-tree using simpler summary metrics. These methods converge on the true species-tree with sufficient genomic sampling, even in the anomaly zone. However, no studies have yet evaluated their efficacy on a large-scale phylogenomic dataset or compared them to previous concatenation strategies. Here, we generate such a dataset for Caenophidian snakes, a group with >2500 species that contains several rapid radiations that were poorly resolved with fewer loci. We generate sequence data for 333 single-copy nuclear loci with ∼100% coverage (∼0% missing data) for 31 major lineages. We estimate phylogenies using neighbor joining, maximum parsimony, maximum likelihood, and three summary species-tree approaches (NJst, STAR, and MP-EST). All methods yield similar resolution and support for most nodes. However, not all methods support monophyly of Caenophidia, with Acrochordidae placed as the sister taxon to Pythonidae in some analyses. Thus, phylogenomic species-tree estimation may occasionally disagree with well-supported relationships from concatenated analyses of small numbers of nuclear or mitochondrial genes, a consideration for future studies. In contrast, for at least two diverse, rapid radiations (Lamprophiidae and Colubridae), phylogenomic data and species-tree inference do little to improve resolution and support. Thus, certain nodes may lack strong signal, and larger datasets and more sophisticated analyses may still fail to resolve them. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Modelling the light-scattering properties of a planetary-regolith analog sample

    NASA Astrophysics Data System (ADS)

    Vaisanen, T.; Markkanen, J.; Hadamcik, E.; Levasseur-Regourd, A. C.; Lasue, J.; Blum, J.; Penttila, A.; Muinonen, K.

    2017-12-01

    Solving the scattering properties of asteroid surfaces can be made cheaper, faster, and more accurate with reliable physics-based electromagnetic scattering programs for large and dense random media. Existing exact methods fail to produce solutions for such large systems, so it is essential to develop approximate methods. Radiative transfer (RT) is an approximate method that works for sparse random media such as atmospheres but fails when applied to dense media. To make the method applicable to dense media, we have developed a radiative-transfer coherent-backscattering method (RT-CB) with incoherent interactions. To show the current progress with the RT-CB, we have modeled a planetary-regolith analog sample. The analog sample is a low-density agglomerate produced by random ballistic deposition of almost equisized silicate spheres, studied using the PROGRA2-surf experiment. The scattering properties were then computed with the RT-CB, treating the silicate spheres as equisized and, alternatively, as following a Gaussian particle size distribution. The results were then compared to the measured data, with the phase functions normalized to unity at the 40-degree phase angle. The tentative intensity modeling shows a good match with the measured data, whereas the polarization modeling shows discrepancies. In summary, the current RT-CB modeling is promising, but more work needs to be carried out, in particular for modeling the polarization. Acknowledgments. Research supported by the European Research Council with Advanced Grant No. 320773 SAEMPL, Scattering and Absorption of ElectroMagnetic waves in ParticuLate media. Computational resources provided by CSC - IT Centre for Science Ltd, Finland.

  13. 78 FR 27442 - Coal Mine Dust Sampling Devices; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-10

    ... DEPARTMENT OF LABOR Mine Safety and Health Administration Coal Mine Dust Sampling Devices; Correction AGENCY: Mine Safety and Health Administration, Labor. ACTION: Notice; correction. SUMMARY: On April 30, 2013, Mine Safety and Health Administration (MSHA) published a notice in the Federal Register...

  14. Guidelines for Sampling, Assessing, and Restoring Defective Grout in Prestressed Concrete Bridge Post-Tensioning Ducts

    DOT National Transportation Integrated Search

    2013-05-01

    This document is a technical summary of the Federal Highway Administration report, "Guidelines for Sampling, Assessing, and Restoring Defective Grout in Prestressed Concrete Bridge Post-Tensioning Ducts" (FHWA-HRT-13-028). The objectives of this stud...

  15. Detection of a Serum Siderophore by LC-MS/MS as a Potential Biomarker of Invasive Aspergillosis

    PubMed Central

    Carroll, Cassandra S.; Amankwa, Lawrence N.; Pinto, Linda J.; Fuller, Jeffrey D.; Moore, Margo M.

    2016-01-01

    Invasive aspergillosis (IA) is a life-threatening systemic mycosis caused primarily by Aspergillus fumigatus. Early diagnosis of IA is based, in part, on an immunoassay for circulating fungal cell wall carbohydrate, galactomannan (GM). However, a wide range of sensitivity and specificity rates have been reported for the GM test across various patient populations. To obtain iron in vivo, A. fumigatus secretes the siderophore, N,N',N"-triacetylfusarinine C (TAFC) and we hypothesize that TAFC may represent a possible biomarker for early detection of IA. We developed an ultra performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) method for TAFC analysis from serum, and measured TAFC in serum samples collected from patients at risk for IA. The method showed lower and upper limits of quantitation (LOQ) of 5 ng/ml and 750 ng/ml, respectively, and complete TAFC recovery from spiked serum. As proof of concept, we evaluated 76 serum samples from 58 patients with suspected IA that were investigated for the presence of GM. Fourteen serum samples obtained from 11 patients diagnosed with probable or proven IA were also analyzed for the presence of TAFC. Control sera (n = 16) were analyzed to establish a TAFC cut-off value (≥6 ng/ml). Of the 36 GM-positive samples (≥0.5 GM index) from suspected IA patients, TAFC was considered positive in 25 (69%). TAFC was also found in 28 additional GM-negative samples. TAFC was detected in 4 of the 14 samples (28%) from patients with proven/probable aspergillosis. Log-transformed TAFC and GM values from patients with proven/probable IA, healthy individuals and SLE patients showed a significant correlation with a Pearson r value of 0.77. In summary, we have developed a method for the detection of TAFC in serum that revealed this fungal product in the sera of patients at risk for invasive aspergillosis. A prospective study is warranted to determine whether this method provides improved early detection of IA. PMID:26974544

  16. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    PubMed Central

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.

    2016-01-01

    ABSTRACT Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample. PMID:27822525

  17. KRAS mutation testing in colorectal cancer: comparison of the results obtained using 3 different methods for the analysis of codons G12 and G13.

    PubMed

    Bihl, Michel P; Hoeller, Sylvia; Andreozzi, Maria Carla; Foerster, Anja; Rufle, Alexander; Tornillo, Luigi; Terracciano, Luigi

    2012-03-01

    Targeting the epidermal growth factor receptor (EGFR) is a new therapeutic option for patients with metastatic colorectal or lung carcinoma. However, therapeutic efficacy depends strongly on the KRAS mutation status of the tumour, so reliable KRAS mutation testing is crucial. Here we investigated 100 colorectal carcinoma samples with known KRAS mutation status (62 mutated cases and 38 wild-type cases) in a comparative manner with three different KRAS mutation testing techniques (Pyrosequencing, Dideoxysequencing, and INFINITI) in order to test their reliability and sensitivity. For the large majority of samples (96/100, 96%), the KRAS mutation status obtained by all three methods was the same. Only two cases with clear discrepancies were observed. One case was reported as wild type by the INFINITI method, while the two other methods detected a G13C mutation. In the second case, the mutation was detected by the Pyrosequencing and INFINITI methods (15% and 15%), while no mutation signal was observed with the Dideoxysequencing method. Two additional unclear results arose from a G12V call by the INFINITI method that fell below the cut-off when repeated and was not detectable by the other two methods, and from very weak signals in a G12V-mutated case with the Dideoxysequencing and Pyrosequencing methods compared with the INFINITI method. In summary, all three methods are reliable and robust for detecting KRAS mutations; INFINITI, however, seems to be slightly more sensitive than Dideoxysequencing and Pyrosequencing.

  18. Disease Specific Productivity of American Cancer Hospitals

    PubMed Central

    Goldstein, Jeffery A.; Prasad, Vinay

    2015-01-01

    Context Research-oriented cancer hospitals in the United States treat and study patients with a range of diseases. Measures of disease specific research productivity, and comparison to overall productivity, are currently lacking. Hypothesis Different institutions are specialized in research of particular diseases. Objective To report disease specific productivity of American cancer hospitals, and propose a summary measure. Method We conducted a retrospective observational survey of the 50 highest ranked cancer hospitals in the 2013 US News and World Report rankings. We performed an automated search of PubMed and Clinicaltrials.gov for published reports and registrations of clinical trials (respectively) addressing specific cancers between 2008 and 2013. We calculated the summed impact factor for the publications. We generated a summary measure of productivity based on the number of Phase II clinical trials registered and the impact factor of Phase II clinical trials published for each institution and disease pair. We generated rankings based on this summary measure. Results We identified 6076 registered trials and 6516 published trials with a combined impact factor of 44280.4, involving 32 different diseases over the 50 institutions. Using a summary measure based on registered and published clinical trials, we ranked institutions in specific diseases. As expected, different institutions were highly ranked in disease-specific productivity for different diseases. 43 institutions appeared in the top 10 ranks for at least 1 disease (vs 10 in the overall list), while 6 different institutions were ranked number 1 in at least 1 disease (vs 1 in the overall list). Conclusion Research productivity varies considerably among the sample. Overall cancer productivity conceals great variation between diseases. Disease specific rankings identify sites of high academic productivity, which may be of interest to physicians, patients and researchers. PMID:25781329
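    The abstract does not state the exact rule for combining registered-trial counts with the summed impact factor of published trials, so the following Python sketch assumes a simple equal-weighted rank combination within each disease; the institutions and counts are invented for illustration.

        import pandas as pd

        # Hypothetical (institution, disease) inputs; the paper's actual weighting
        # of the two components is not given in the abstract and is assumed here.
        df = pd.DataFrame({
            "institution": ["A", "B", "C", "A", "B", "C"],
            "disease": ["breast", "breast", "breast", "lung", "lung", "lung"],
            "phase2_registered": [40, 25, 31, 12, 50, 8],
            "phase2_summed_if": [310.5, 420.0, 150.2, 88.0, 505.7, 60.1],
        })

        # Rank each input within disease, average the ranks, then rank the average.
        for col in ["phase2_registered", "phase2_summed_if"]:
            df[col + "_rank"] = df.groupby("disease")[col].rank(ascending=False)
        df["summary_score"] = df[["phase2_registered_rank",
                                  "phase2_summed_if_rank"]].mean(axis=1)
        df["disease_rank"] = df.groupby("disease")["summary_score"].rank(method="min")
        print(df.sort_values(["disease", "disease_rank"]))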

  19. Breast-feeding and Helicobacter pylori infection: systematic review and meta-analysis.

    PubMed

    Carreira, Helena; Bastos, Ana; Peleteiro, Bárbara; Lunet, Nuno

    2015-02-01

    To quantify the association between breast-feeding and Helicobacter pylori infection, among children and adolescents. We searched MEDLINE™ and Scopus™ up to January 2013. Summary relative risk estimates (RR) and 95 % confidence intervals were computed through the DerSimonian and Laird method. Heterogeneity was quantified using the I² statistic. Twenty-seven countries/regions; four low-income, thirteen middle-income and ten high-income countries/regions. Studies involving samples of children and adolescents, aged 0 to 19 years. We identified thirty-eight eligible studies, which is nearly twice the number included in a previous meta-analysis on this topic. Fifteen studies compared ever v. never breast-fed subjects; the summary RR was 0·87 (95% CI 0·57, 1·32; I²=34·4%) in middle-income and 0·85 (95% CI 0·54, 1·34; I²=79·1%) in high-income settings. The effect of breast-feeding for ≥4-6 months was assessed in ten studies from middle-income (summary RR=0·66; 95% CI 0·44, 0·98; I²=65·7%) and two from high-income countries (summary RR=1·56; 95% CI 0·57, 4·26; I²=68·3%). Two studies assessed the effect of exclusive breast-feeding until 6 months (OR=0·91; 95% CI 0·61, 1·34 and OR=1·71; 95% CI 0·66, 4·47, respectively). Our results suggest a protective effect of breast-feeding in economically less developed settings. However, further research is needed, with a finer assessment of the exposure to breast-feeding and careful control for confounding, before definite conclusions can be reached.
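    A minimal Python sketch of the DerSimonian and Laird random-effects pooling named above; the study log relative risks and standard errors are invented for illustration.

        import numpy as np

        def dersimonian_laird(log_rr, se):
            """Pool study log relative risks with DerSimonian-Laird random effects."""
            log_rr = np.asarray(log_rr, dtype=float)
            v = np.asarray(se, dtype=float) ** 2
            w = 1.0 / v                                 # inverse-variance weights
            mu_fixed = np.sum(w * log_rr) / np.sum(w)
            q = np.sum(w * (log_rr - mu_fixed) ** 2)    # Cochran's Q
            k = len(log_rr)
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
            w_star = 1.0 / (v + tau2)                   # random-effects weights
            mu = np.sum(w_star * log_rr) / np.sum(w_star)
            se_mu = np.sqrt(1.0 / np.sum(w_star))
            i2 = 100 * max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
            return np.exp(mu), (np.exp(mu - 1.96 * se_mu),
                                np.exp(mu + 1.96 * se_mu)), i2

        # Hypothetical study estimates (log RR, SE), for illustration only.
        rr, ci, i2 = dersimonian_laird([-0.14, -0.42, 0.05, -0.29],
                                       [0.20, 0.25, 0.18, 0.30])
        print(f"summary RR = {rr:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), I2 = {i2:.0f}%")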

  20. Recent research about mild cognitive impairment in China

    PubMed Central

    CHENG, Yan; XIAO, Shifu

    2014-01-01

    Summary: The rapid aging of the Chinese population has spurred interest in research about the cause and prevention of dementia and its precursor, mild cognitive impairment (MCI). This review summarizes the last decade of research in China about MCI. Extensive research about the epidemiology, neuropsychological characteristics, diagnosis, genetic etiology, neuroimaging and electrophysiological changes, and treatment of MCI has provided some new insights but few breakthroughs. Further advances in the prevention and treatment of MCI will require a greater emphasis on multi-disciplinary prospective studies with large, representative samples that use standardized methods to assess and monitor changes in cognitive functioning over time. PMID:25114476

  1. A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence J.; Nadell, Shari-Beth

    1999-01-01

    A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.

  2. Nano/Micro and Spectroscopic Approaches to Food Pathogen Detection

    NASA Astrophysics Data System (ADS)

    Cho, Il-Hoon; Radadia, Adarsh D.; Farrokhzad, Khashayar; Ximenes, Eduardo; Bae, Euiwon; Singh, Atul K.; Oliver, Haley; Ladisch, Michael; Bhunia, Arun; Applegate, Bruce; Mauer, Lisa; Bashir, Rashid; Irudayaraj, Joseph

    2014-06-01

    Despite continuing research efforts, timely and simple pathogen detection with a high degree of sensitivity and specificity remains an elusive goal. Given the recent explosion of sensor technologies, significant strides have been made in addressing the various nuances of this important global challenge that affects not only the food industry but also human health. In this review, we provide a summary of the various ongoing efforts in pathogen detection and sample preparation in areas related to Fourier transform infrared and Raman spectroscopy, light scattering, phage display, micro/nanodevices, and nanoparticle biosensors. We also discuss the advantages and potential limitations of the detection methods and suggest next steps for further consideration.

  3. Numerical Hydrodynamics in General Relativity.

    PubMed

    Font, José A

    2000-01-01

    The current status of numerical solutions for the equations of ideal general relativistic hydrodynamics is reviewed. Different formulations of the equations are presented, with special mention of conservative and hyperbolic formulations well-adapted to advanced numerical methods. A representative sample of available numerical schemes is discussed and particular emphasis is paid to solution procedures based on schemes exploiting the characteristic structure of the equations through linearized Riemann solvers. A comprehensive summary of relevant astrophysical simulations in strong gravitational fields, including gravitational collapse, accretion onto black holes and evolution of neutron stars, is also presented. Supplementary material is available for this article at 10.12942/lrr-2000-2.

  4. [THE TECHNOLOGY "CELL BLOCK" IN CYTOLOGICAL PRACTICE].

    PubMed

    Volchenko, N N; Borisova, O V; Baranova, I B

    2015-08-01

    The article presents summary information concerning the application of "cell block" technology in cytological practice. The possibilities of implementing various modern techniques (immunocytochemical analysis, FISH, CISH, polymerase chain reaction) on "cell block" material are demonstrated. Original results are presented for "cell block" preparations made with gelatin, AgarCyto, and the Shandon Cytoblock set. The diagnostic effectiveness of the "cell block" technique was compared with that of the common cytological smear, and immunocytochemical analysis on "cell block" samples was compared with fluid cytology. In practice, the "cell block" technique is needed to preserve cell material for subsequent immunocytochemical and molecular genetic analysis.

  5. Automatic and user-centric approaches to video summary evaluation

    NASA Astrophysics Data System (ADS)

    Taskiran, Cuneyt M.; Bentley, Frank

    2007-01-01

    Automatic video summarization has become an active research topic in content-based video processing. However, not much emphasis has been placed on developing rigorous summary evaluation methods, or on designing summarization systems based on a clear understanding of user needs obtained through user-centered design. In this paper we address these two topics and propose an automatic video summary evaluation algorithm adapted from the text summarization domain.

  6. Use of antimicrobial drugs in general hospitals. I. Description of population and definition of methods.

    PubMed

    Townsend, T R; Shapiro, M; Rosner, B; Kass, E H

    1979-06-01

    The patterns of use of antimicrobial drugs in a random sample of general hospitals in Pennsylvania were studied. The sample was tested for validity, and all deaths and discharges were analyzed for 10 random days drawn across the year spanning July 1973 to June 1974. Methods were developed for abstracting the hospital records and for determining the reproducibility of the findings of the physician and nonphysician chart reviewers. More than 99% of the requested charts were available. In the 5,288 charts reviewed, most of the required data were readily available. The study population was 84% white and 58% female; most patients were in hospitals that had more than 300 beds and that were located in towns with populations of greater than 10,000. In 41% of the 2,070 antimicrobial courses administered to almost 30% of the patients, an explicit clinical statement of why the drug was being given could be found in the chart. The information for review was found in clinical charts, but in half of the charts, the information required was not on face sheets and discharge summaries.

  7. Third NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler)

    1995-01-01

    This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.

  8. How much do you need: a randomised experiment of whether readers can understand the key messages from summaries of Cochrane Reviews without reading the full review.

    PubMed

    Maguire, Lisa K; Clarke, Mike

    2014-11-01

    We explored whether readers can understand key messages without having to read the full review, and if there were differences in understanding between various types of summary. A randomised experiment of review summaries which compared understanding of a key outcome. Members of university staff (n = 36). Universities on the island of Ireland. The Cochrane Review chosen examines the health impacts of the use of electric fans during heat waves. Participants were asked their expectation of the effect these would have on mortality. They were then randomly assigned a summary of the review (i.e. abstract, plain language summary, podcast or podcast transcription) and asked to spend a short time reading/listening to the summary. After this they were again asked about the effects of electric fans on mortality and to indicate if they would want to read the full Review. Correct identification of a key review outcome. Just over half (53%) of the participants identified its key message on mortality after engaging with their summary. The figures were 33% for the abstract group, 50% for both the plain language and transcript groups and 78% for the podcast group. The differences between the groups were not statistically significant but suggest that the audio summary might improve knowledge transfer compared to written summaries. These findings should be explored further using a larger sample size and with other reviews. © The Royal Society of Medicine.

  9. Extraction of toxic compounds from saliva by magnetic-stirring-assisted micro-solid-phase extraction step followed by headspace-gas chromatography-ion mobility spectrometry.

    PubMed

    Criado-García, Laura; Arce, Lourdes

    2016-09-01

    A new sample extraction procedure based on micro-solid-phase extraction (μSPE) using a mixture of sorbents of different polarities (polymeric reversed-phase sorbent HLB, silica-based sorbent C18, and multiwalled carbon nanotubes) was applied to extract benzene, toluene, butyraldehyde, benzaldehyde, and tolualdehyde present in saliva, in order to avoid interference from moisture and matrix components and to enhance the sensitivity and selectivity of the proposed ion mobility spectrometry (IMS) methodology. Extraction of the target analytes from saliva samples by μSPE was followed by a desorption step carried out in headspace vials placed in the autosampler of the IMS device. Then, 200 μL of headspace was injected into the GC column coupled to the IMS for analysis. The method was fully validated in terms of sensitivity, precision, and recovery. The LODs and LOQs obtained, when analytes were dissolved in saliva samples to account for the matrix effect, were within the ranges of 0.38-0.49 and 1.26-1.66 μg mL(-1), respectively. The relative standard deviations were <3.5 % for retention time and drift time values, which indicates that the proposed method can be applied to determine toxic compounds in saliva samples. Graphical abstract: summary of the steps followed in the experimental setup of this work.

  10. Robust Covariate-Adjusted Log-Rank Statistics and Corresponding Sample Size Formula for Recurrent Events Data

    PubMed Central

    Song, Rui; Kosorok, Michael R.; Cai, Jianwen

    2009-01-01

    Summary Recurrent events data are frequently encountered in clinical trials. This article develops robust covariate-adjusted log-rank statistics applied to recurrent events data with arbitrary numbers of events under independent censoring and the corresponding sample size formula. The proposed log-rank tests are robust with respect to different data-generating processes and are adjusted for predictive covariates. It reduces to the Kong and Slud (1997, Biometrika 84, 847–862) setting in the case of a single event. The sample size formula is derived based on the asymptotic normality of the covariate-adjusted log-rank statistics under certain local alternatives and a working model for baseline covariates in the recurrent event data context. When the effect size is small and the baseline covariates do not contain significant information about event times, it reduces to the same form as that of Schoenfeld (1983, Biometrics 39, 499–503) for cases of a single event or independent event times within a subject. We carry out simulations to study the control of type I error and the comparison of powers between several methods in finite samples. The proposed sample size formula is illustrated using data from an rhDNase study. PMID:18162107
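    For the single-event case the abstract notes that the formula reduces to Schoenfeld's (1983) form; below is a small Python sketch of that classical required-events calculation (not the article's covariate-adjusted, recurrent-events version), with the hazard ratio and event rate invented.

        import math
        from scipy.stats import norm

        def schoenfeld_events(hr, alpha=0.05, power=0.80, alloc=0.5):
            """Required events for a two-arm log-rank test (Schoenfeld, 1983)."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return z ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

        d = schoenfeld_events(hr=0.75)
        print(f"events needed: {math.ceil(d)}")          # about 380 for HR = 0.75
        # Total sample size follows from the anticipated event probability:
        print(f"subjects at a 40% event rate: {math.ceil(d / 0.40)}")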

  11. Bias correction in species distribution models: pooling survey and collection data for multiple species

    PubMed Central

    Fithian, William; Elith, Jane; Hastie, Trevor; Keith, David A.

    2016-01-01

    Summary Presence-only records may provide data on the distributions of rare species, but commonly suffer from large, unknown biases due to their typically haphazard collection schemes. Presence–absence or count data collected in systematic, planned surveys are more reliable but typically less abundant. We proposed a probabilistic model to allow for joint analysis of presence-only and survey data to exploit their complementary strengths. Our method pools presence-only and presence–absence data for many species and maximizes a joint likelihood, simultaneously estimating and adjusting for the sampling bias affecting the presence-only data. By assuming that the sampling bias is the same for all species, we can borrow strength across species to efficiently estimate the bias and improve our inference from presence-only data. We evaluate our model’s performance on data for 36 eucalypt species in south-eastern Australia. We find that presence-only records exhibit a strong sampling bias towards the coast and towards Sydney, the largest city. Our data-pooling technique substantially improves the out-of-sample predictive performance of our model when the amount of available presence–absence data for a given species is scarce. If we have only presence-only data and no presence–absence data for a given species, but both types of data for several other species that suffer from the same spatial sampling bias, then our method can obtain an unbiased estimate of the first species’ geographic range. PMID:27840673

  12. Dynamic thresholds and a summary ROC curve: Assessing prognostic accuracy of longitudinal markers.

    PubMed

    Saha-Chaudhuri, P; Heagerty, P J

    2018-04-19

    Cancer patients, chronic kidney disease patients, and subjects infected with HIV are routinely monitored over time using biomarkers that represent key health status indicators. Furthermore, biomarkers are frequently used to guide initiation of new treatments or to inform changes in intervention strategies. Since key medical decisions can be made on the basis of a longitudinal biomarker, it is important to evaluate the potential accuracy associated with longitudinal monitoring. To characterize the overall accuracy of a time-dependent marker, we introduce a summary ROC curve that displays the overall sensitivity associated with a time-dependent threshold that controls time-varying specificity. The proposed statistical methods are similar to concepts considered in disease screening, yet our methods are novel in choosing a potentially time-dependent threshold to define a positive test, and our methods allow time-specific control of the false-positive rate. The proposed summary ROC curve is a natural averaging of time-dependent incident/dynamic ROC curves and therefore provides a single summary of net error rates that can be achieved in the longitudinal setting. Copyright © 2018 John Wiley & Sons, Ltd.
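    A rough empirical illustration of the idea in Python: at each visit time, pick the marker threshold that fixes the false-positive rate among controls, then average sensitivity over visits. This toy version is not the authors' estimator, and the longitudinal data are simulated.

        import numpy as np

        def summary_roc(times, marker, case, fpr_grid=np.linspace(0.01, 0.99, 99)):
            """Time-specific thresholds control FPR; sensitivity is averaged over times."""
            times, marker, case = map(np.asarray, (times, marker, case))
            tprs = []
            for t in fpr_grid:
                sens = []
                for tt in np.unique(times):
                    at_t = times == tt
                    controls = marker[at_t & (case == 0)]
                    cases = marker[at_t & (case == 1)]
                    thresh = np.quantile(controls, 1 - t)   # FPR = t at this visit
                    sens.append(np.mean(cases > thresh))
                tprs.append(np.mean(sens))
            return fpr_grid, np.array(tprs)

        # Toy data: the marker drifts upward over visits in eventual cases.
        rng = np.random.default_rng(1)
        n, visits = 200, 3
        times = np.repeat(np.arange(visits), n)
        case = np.tile(rng.random(n) < 0.3, visits).astype(int)
        marker = rng.normal(0, 1, n * visits) + case * (0.5 + 0.4 * times)
        fpr, tpr = summary_roc(times, marker, case)
        auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)   # trapezoid rule
        print(f"summary AUC ~ {auc:.2f}")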

  13. Veterinary Research Manpower Development for Defense

    DTIC Science & Technology

    2007-09-01

    Participatory Disease Surveillance Method for Detection of Highly Pathogenic Avian Influenza in Java, Indonesia (Rebecca Steers; Dr. Lindenmeyer). Detection of... Transmission of Nipah Virus in Bangladesh. Summary: My project aims to investigate the risk of zoonotic transmission of Nipah virus as a food-borne... Participatory Disease Surveillance Method for Detection of Highly Pathogenic Avian Influenza in Java, Indonesia. Summary: Two epidemics of H5N1 Highly...

  14. Silver Nanoparticle-Based Fluorescence-Quenching Lateral Flow Immunoassay for Sensitive Detection of Ochratoxin A in Grape Juice and Wine.

    PubMed

    Jiang, Hu; Li, Xiangmin; Xiong, Ying; Pei, Ke; Nie, Lijuan; Xiong, Yonghua

    2017-02-28

    A silver nanoparticle (AgNP)-based fluorescence-quenching lateral flow immunoassay with competitive format (cLFIA) was developed for sensitive detection of ochratoxin A (OTA) in grape juice and wine samples in the present study. The Ru(phen)₃²⁺-doped silica nanoparticles (RuNPs) were sprayed on the test and control line zones as background fluorescence signals. The AgNPs were designed as the fluorescence quenchers of RuNPs because they can block the transfer of excitation light to the RuNP molecules. The proposed method exhibited high sensitivity for OTA detection, with a detection limit of 0.06 µg/L under optimized conditions. The method also exhibited a good linear range for OTA quantitative analysis, from 0.08 µg/L to 5.0 µg/L. The reliability of the fluorescence-quenching cLFIA method was evaluated through analysis of OTA-spiked red grape wine and juice samples. The average recoveries ranged from 88.0% to 110.0% in red grape wine and from 92.0% to 110.0% in grape juice. Meanwhile, a coefficient of variation of less than 10% indicated acceptable precision of the cLFIA method. In summary, the new AgNP-based fluorescence-quenching cLFIA is a simple, rapid, sensitive, and accurate method for quantitative detection of OTA in grape juice and wine or other foodstuffs.

  15. Silver Nanoparticle-Based Fluorescence-Quenching Lateral Flow Immunoassay for Sensitive Detection of Ochratoxin A in Grape Juice and Wine

    PubMed Central

    Jiang, Hu; Li, Xiangmin; Xiong, Ying; Pei, Ke; Nie, Lijuan; Xiong, Yonghua

    2017-01-01

    A silver nanoparticle (AgNP)-based fluorescence-quenching lateral flow immunoassay with competitive format (cLFIA) was developed for sensitive detection of ochratoxin A (OTA) in grape juice and wine samples in the present study. The Ru(phen)₃²⁺-doped silica nanoparticles (RuNPs) were sprayed on the test and control line zones as background fluorescence signals. The AgNPs were designed as the fluorescence quenchers of RuNPs because they can block the transfer of excitation light to the RuNP molecules. The proposed method exhibited high sensitivity for OTA detection, with a detection limit of 0.06 µg/L under optimized conditions. The method also exhibited a good linear range for OTA quantitative analysis, from 0.08 µg/L to 5.0 µg/L. The reliability of the fluorescence-quenching cLFIA method was evaluated through analysis of OTA-spiked red grape wine and juice samples. The average recoveries ranged from 88.0% to 110.0% in red grape wine and from 92.0% to 110.0% in grape juice. Meanwhile, a coefficient of variation of less than 10% indicated acceptable precision of the cLFIA method. In summary, the new AgNP-based fluorescence-quenching cLFIA is a simple, rapid, sensitive, and accurate method for quantitative detection of OTA in grape juice and wine or other foodstuffs. PMID:28264472

  16. A robust clustering algorithm for identifying problematic samples in genome-wide association studies.

    PubMed

    Bellenguez, Céline; Strange, Amy; Freeman, Colin; Donnelly, Peter; Spencer, Chris C A

    2012-01-01

    High-throughput genotyping arrays provide an efficient way to survey single nucleotide polymorphisms (SNPs) across the genome in large numbers of individuals. Downstream analysis of the data, for example in genome-wide association studies (GWAS), often involves statistical models of genotype frequencies across individuals. The complexities of the sample collection process and the potential for errors in the experimental assay can lead to biases and artefacts in an individual's inferred genotypes. Rather than attempting to model these complications, it has become standard practice to remove individuals whose genome-wide data differ from the sample at large. Here we describe a simple, but robust, statistical algorithm to identify samples with atypical summaries of genome-wide variation. Its use as a semi-automated quality control tool is demonstrated using several summary statistics, selected to identify different potential problems, and it is applied to two different genotyping platforms and sample collections. The algorithm is written in R and is freely available at www.well.ox.ac.uk/chris-spencer. Contact: chris.spencer@well.ox.ac.uk. Supplementary data are available at Bioinformatics online.
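    The paper's algorithm is a robust clustering approach; as a simplified stand-in, the Python sketch below flags samples whose per-sample summary statistics (e.g., heterozygosity, missingness) are extreme under median/MAD robust z-scores. The statistics, sample sizes, and threshold are invented.

        import numpy as np

        def flag_outliers(stats, z_cut=5.0):
            """Flag samples with atypical summary statistics via robust z-scores."""
            x = np.asarray(stats, dtype=float)          # (n_samples, n_statistics)
            med = np.median(x, axis=0)
            mad = 1.4826 * np.median(np.abs(x - med), axis=0)  # ~SD under normality
            z = (x - med) / np.where(mad > 0, mad, np.inf)
            return np.any(np.abs(z) > z_cut, axis=1)

        rng = np.random.default_rng(0)
        het = rng.normal(0.32, 0.01, 500)
        het[:3] += 0.1                                   # three contaminated samples
        miss = rng.normal(0.02, 0.005, 500)
        bad = flag_outliers(np.column_stack([het, miss]))
        print(f"{bad.sum()} samples flagged for exclusion")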

  17. The Eastern Gas Shales Project (EGSP) Data System: A case study in data base design, development, and application

    USGS Publications Warehouse

    Dyman, T.S.; Wilcox, L.A.

    1983-01-01

    The U.S. Geological Survey and Petroleum Information Corporation in Denver, Colorado, developed the Eastern Gas Shales Project (EGSP) Data System for the U.S. Department of Energy, Morgantown, West Virginia. Geological, geochemical, geophysical, and engineering data from Devonian shale samples from more than 5,800 wells and outcrops in the Appalachian basin were edited and converted to a Petroleum Information Corporation data base. Well and sample data may be retrieved from this data system to produce (1) production-test summaries by formation and well location; (2) contoured isopach, structure, and trend-surface maps of Devonian shale units; (3) sample summary reports for samples by location, well, contractor, and sample number; (4) cross sections displaying digitized log traces, geochemical, and lithologic data by depth for wells; and (5) frequency distributions and bivariate plots. Although part of the EGSP Data System is proprietary, and distribution of complete well histories is prohibited by contract, maps and aggregated well-data listings are being made available to the public through published reports. © 1983 Plenum Publishing Corporation.

  18. FT-IR imaging for quantitative determination of liver fat content in non-alcoholic fatty liver.

    PubMed

    Kochan, K; Maslak, E; Chlopicki, S; Baranska, M

    2015-08-07

    In this work we apply FT-IR imaging of large areas of liver tissue cross-section samples (∼5 cm × 5 cm) for quantitative assessment of steatosis in a murine model of Non-Alcoholic Fatty Liver Disease (NAFLD). We quantified the area of liver tissue occupied by lipid droplets (LDs) by FT-IR imaging, with Oil Red O (ORO) staining for comparison. Two alternative FT-IR-based approaches are presented. The first, straightforward method was based on average spectra from tissues and provided values of the fat content by using a PLS regression model and the reference method. The second, chemometric method enabled us to determine the fat content independently of the reference method, by means of k-means cluster (KMC) analysis. In summary, FT-IR images of large liver sections may prove useful for quantifying liver steatosis without the need for tissue staining.
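    A minimal Python sketch of the second, k-means route: cluster pixel spectra, call the cluster with the stronger signal in a lipid-associated band the lipid-droplet class, and report its area fraction. The band indices, two-cluster setup, and simulated image are assumptions for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        # Toy stand-in for an FT-IR image: one spectrum per pixel. Real data would
        # be a (rows, cols, wavenumbers) cube reshaped to (pixels, wavenumbers).
        rng = np.random.default_rng(2)
        pixels = rng.normal(size=(10000, 120))
        pixels[:1500, 40:60] += 3.0          # "lipid" pixels: strong band signal

        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
        # The cluster with the higher mean lipid-band intensity is the LD class.
        lipid = np.argmax([pixels[km.labels_ == k][:, 40:60].mean() for k in (0, 1)])
        print(f"estimated fat area fraction: {np.mean(km.labels_ == lipid):.1%}")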

  19. Inferring causal relationships between phenotypes using summary statistics from genome-wide association studies.

    PubMed

    Meng, Xiang-He; Shen, Hui; Chen, Xiang-Ding; Xiao, Hong-Mei; Deng, Hong-Wen

    2018-03-01

    Genome-wide association studies (GWAS) have successfully identified numerous genetic variants associated with diverse complex phenotypes and diseases, and provided tremendous opportunities for further analyses using summary association statistics. Recently, Pickrell et al. developed a robust method for causal inference using independent putative causal SNPs. However, this method may fail to infer the causal relationship between two phenotypes when only a limited number of independent putative causal SNPs are identified. Here, we extended Pickrell's method to make it more applicable to general situations. We extended the causal inference method by replacing the putative causal SNPs with the lead SNPs (the set of the most significant SNPs in each independent locus) and tested the performance of our extended method using both simulation and empirical data. Simulations suggested that when the same number of genetic variants is used, our extended method had a similar distribution of the test statistic under the null model and comparable power under the causal model compared with the original method by Pickrell et al. In practice, however, our extended method would generally be more powerful because the number of independent lead SNPs is often larger than the number of independent putative causal SNPs; including more SNPs, on the other hand, did not cause more false positives. By applying our extended method to summary statistics from GWAS for blood metabolites and femoral neck bone mineral density (FN-BMD), we successfully identified ten blood metabolites that may causally influence FN-BMD. We extended a causal inference method for inferring putative causal relationships between two phenotypes using summary statistics from GWAS, and identified a number of potential causal metabolites for FN-BMD, which may provide novel insights into the pathophysiological mechanisms underlying osteoporosis.

  20. Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses

    PubMed Central

    Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy

    2015-01-01

    Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579
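    The weighting step itself is simple; the Python sketch below (tree strings and bin sizes invented) shows how each bin's re-estimated gene tree is replicated in proportion to its bin size before being handed to a summary method such as MP-EST.

        from collections import Counter

        # (re-estimated tree topology, bin size) pairs; invented for illustration.
        bins = [("((A,B),C);", 7), ("((A,C),B);", 2), ("((A,B),C);", 5)]

        # Weighted binning: each bin contributes its tree once per gene in the bin,
        # so unequal bin sizes are neither over- nor under-represented.
        weighted_trees = [tree for tree, size in bins for _ in range(size)]
        print(Counter(weighted_trees))   # input list for the summary method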

  1. National Institutes of Health Toolbox Emotion Battery for English- and Spanish-speaking adults: normative data and factor-based summary scores.

    PubMed

    Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K

    2018-01-01

    The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency" computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. The NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor-analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both the English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for the English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared with other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.

  2. Effects of ensemble and summary displays on interpretations of geospatial uncertainty data.

    PubMed

    Padilla, Lace M; Ruginski, Ian T; Creem-Regehr, Sarah H

    2017-01-01

    Ensemble and summary displays are two widely used methods to represent visual-spatial uncertainty; however, there is disagreement about which is the most effective technique to communicate uncertainty to the general public. Visualization scientists create ensemble displays by plotting multiple data points on the same Cartesian coordinate plane. Despite their use in scientific practice, it is more common in public presentations to use visualizations of summary displays, which scientists create by plotting statistical parameters of the ensemble members. While prior work has demonstrated that viewers make different decisions when viewing summary and ensemble displays, it is unclear what components of the displays lead to diverging judgments. This study aims to compare the salience of visual features - or visual elements that attract bottom-up attention - as one possible source of diverging judgments made with ensemble and summary displays in the context of hurricane track forecasts. We report that salient visual features of both ensemble and summary displays influence participant judgment. Specifically, we find that salient features of summary displays of geospatial uncertainty can be misunderstood as displaying size information. Further, salient features of ensemble displays evoke judgments that are indicative of accurate interpretations of the underlying probability distribution of the ensemble data. However, when participants use ensemble displays to make point-based judgments, they may overweight individual ensemble members in their decision-making process. We propose that ensemble displays are a promising alternative to summary displays in a geospatial context but that decisions about visualization methods should be informed by the viewer's task.

  3. Foodomics and Food Safety: Where We Are

    PubMed Central

    Andjelković, Uroš

    2017-01-01

    Summary The power of foodomics as a discipline that is now broadly used for quality assurance of food products and adulteration identification, as well as for determining the safety of food, is presented. Special attention has to be paid to the development of skilled analysts for sample preparation, the maintenance of highly sophisticated instruments for both high-performance and high-throughput techniques, and analysis and data interpretation. The data obtained should be integrated within a strong bioinformatics environment. Modern mass spectrometry is an extremely powerful analytical tool, since it can provide direct qualitative and quantitative information about a molecule of interest from only a minute amount of sample. The quality of this information is influenced by the sample preparation procedure, the type of mass spectrometer used, and the analyst's skills. Technical advances are bringing new instruments of increased sensitivity, resolution, and speed to the market. Other methods presented here give additional information and can be used as complementary tools to mass spectrometry or for validation of obtained results. Genomics and transcriptomics, as well as affinity-based methods, still have broad use in food analysis. Serious drawbacks of some of them, especially the affinity-based methods, are cross-reactivity between similar molecules and the influence of complex food matrices. However, these techniques can be used for pre-screening in order to reduce the large number of samples. Great progress has been made in the application of bioinformatics in foodomics. These developments have enabled the processing of large amounts of generated data for both identification and quantification, and for corresponding modeling. PMID:29089845

  4. Space Station CMIF extended duration metabolic control test

    NASA Technical Reports Server (NTRS)

    Schunk, Richard G.; Bagdigian, Robert M.; Carrasquillo, Robyn L.; Ogle, Kathryn Y.; Wieland, Paul O.

    1989-01-01

    The Space Station Extended Duration Metabolic Control Test (EMCT) was conducted at the MSFC Core Module Integration Facility. The primary objective of the EMCT was to gather performance data from a partially-closed regenerative Environmental Control and Life Support (ECLS) system functioning under steady-state conditions. Included is a description of the EMCT configuration, a summary of events, a discussion of anomalies that occurred during the test, and detailed results and analysis from individual measurements of water and gas samples taken during the test. A comparison of the physical, chemical, and microbiological methods used in the post-test laboratory analyses of the water samples is included. The preprototype ECLS hardware used in the test is also described, with an overall process description and theory of operation for each hardware item. Analytical results pertaining to a system-level mass balance and selected system power estimates are also included.

  5. Water-quality, bed-sediment, and biological data (October 2013 through September 2014) and statistical summaries of data for streams in the Clark Fork Basin, Montana

    USGS Publications Warehouse

    Dodge, Kent A.; Hornberger, Michelle I.

    2015-12-24

    This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2013 through September 2014. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. At 12 sites, dissolved organic carbon and turbidity samples were collected. In addition, nitrogen (nitrate plus nitrite) samples were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Seasonal daily values of turbidity were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.

  6. Payers' Perspectives: Health Economics Outcomes in Managed Care

    PubMed Central

    Bankhead, Charles

    2015-01-01

    The following summaries represent a sample of the many studies presented at the 27th Annual Meeting of the Academy of Managed Care Pharmacy (AMCP), April 7–10, 2015, in San Diego, CA. These summaries highlight some of the main trends in current US healthcare, reflecting real-world, evidence-based issues of high interest to payers, employers, drug manufacturers, providers, patients, and other healthcare stakeholders. PMID:26085902

  7. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
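    A minimal Python sketch of the two components the author recommends reporting, in the category-free form where any upward or downward movement in predicted risk counts; the risks and outcomes below are simulated for illustration.

        import numpy as np

        def nri_components(risk_old, risk_new, event):
            """Event and non-event components of the Net Reclassification Index."""
            up = np.asarray(risk_new) > np.asarray(risk_old)
            down = np.asarray(risk_new) < np.asarray(risk_old)
            event = np.asarray(event, dtype=bool)
            nri_events = up[event].mean() - down[event].mean()
            nri_nonevents = down[~event].mean() - up[~event].mean()
            return nri_events, nri_nonevents

        # Hypothetical risks from a baseline model and an expanded model.
        rng = np.random.default_rng(3)
        event = rng.random(1000) < 0.2
        r_old = np.clip(rng.normal(0.2, 0.1, 1000), 0.01, 0.99)
        shift = rng.normal(0.02, 0.05, 1000) * np.where(event, 1, -1)
        r_new = np.clip(r_old + shift, 0.01, 0.99)
        ev, ne = nri_components(r_old, r_new, event)
        print(f"NRI(events) = {ev:+.3f}, NRI(non-events) = {ne:+.3f}")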

  8. Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation

    PubMed Central

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597

  9. Sexual Harassment Retaliation Climate DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    1) ...exploratory factor analysis, and bivariate correlations (sample 1); 2) To determine the factor structure of the remaining (final) questions via... statistics, reliability analysis, exploratory factor analysis, and bivariate correlations of the prospective Sexual Harassment Retaliation Climate... reported by the survey requester). For information regarding the composition of the sample, refer to Table 1 (Sample 1 Demographics).

  10. 75 FR 16874 - Market Test of “Samples Co-Op Box” Experimental Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... POSTAL SERVICE Market Test of ``Samples Co-Op Box'' Experimental Product AGENCY: Postal Service™. ACTION: Notice. SUMMARY: The Postal Service gives notice of a market test of an experimental product in... pursuant to 39 U.S.C. 3641(c)(1) that it will begin a market test of its ``Samples Co-Op Box'' experimental...

  11. Summary of chemical data from onsite and laboratory analyses of groundwater samples from the surficial aquifer, Las Vegas, Nevada, April and August 1993 and September 1994

    USGS Publications Warehouse

    Reddy, Michael M.; Gunther, Charmaine D.

    2012-01-01

    Samples were collected from groundwater wells in and about the city of Las Vegas, Nevada, and were analyzed for selected major, minor and trace constituents. Analyses of blank and reference samples are summarized as mean and standard deviation values for all positive results.

  12. REECo activities and sample logistics in support of the Nevada Applied Ecology Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wireman, D.L.; Rosenberry, C.E. Jr.; White, M.G.

    Activities and sample logistics of Reynolds Electrical and Engineering Co., Inc. (REECo), in support of the Nevada Applied Ecology Group (NAEG), are discussed in this summary report. Activities include the collection, preparation, and shipment of samples of soils, vegetation, and small animals collected at Pu-contaminated areas of the Nevada Test Site and Tonopah Test Range. (CH)

  13. Mechanics of Ballast Compaction. Volume 5 : Summary Report.

    DOT National Transportation Integrated Search

    1982-03-01

    This report summarizes the results of research on the mechanics of ballast compaction. Details are provided in four preceding reports. The scope of this summary includes: (1) a description of ballast physical state, (2) methods developed for measuri...

  14. Bilingual term alignment from comparable corpora in English discharge summary and Chinese discharge summary.

    PubMed

    Xu, Yan; Chen, Luoxin; Wei, Junsheng; Ananiadou, Sophia; Fan, Yubo; Qian, Yi; Chang, Eric I-Chao; Tsujii, Junichi

    2015-05-09

    Electronic medical record (EMR) systems have become widely used throughout the world to improve the quality of healthcare and the efficiency of hospital services. A bilingual medical lexicon of Chinese and English is needed to meet the demand for multi-lingual and multi-national treatment. We make efforts to extract a bilingual lexicon from English and Chinese discharge summaries with a small seed lexicon. The lexical terms can be classified into two categories: single-word terms (SWTs) and multi-word terms (MWTs). For SWTs, we use a label propagation (LP; context-based) method to extract candidate translation pairs. For MWTs, which are pervasive in the medical domain, we propose a term alignment method, which first obtains translation candidates for each component word of a Chinese MWT and then generates their combinations, from which the system selects a set of plausible translation candidates. We compare our LP method with a baseline method based on simple context similarity. The LP-based method outperforms the baseline with accuracies of 4.44% Acc1, 24.44% Acc10, and 62.22% Acc100, where AccN denotes the top-N accuracy. The accuracy of the LP method drops to 5.41% Acc10 and 8.11% Acc20 for MWTs. Our experiments show that the method based on term alignment improves the performance for MWTs to 16.22% Acc10 and 27.03% Acc20. We constructed a framework for building an English-Chinese term dictionary from discharge summaries in the two languages. Our experiments have shown that the LP-based method, augmented with the term alignment method, will contribute to reducing the manual work required to compile a bilingual dictionary of clinical terms.
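    A minimal Python sketch of the context-similarity baseline for SWTs: each term is represented by co-occurrence counts over seed-lexicon dimensions, so English and Chinese vectors share one space and candidates can be ranked by cosine similarity. All terms and counts are invented.

        import numpy as np

        def cosine(a, b):
            na, nb = np.linalg.norm(a), np.linalg.norm(b)
            return 0.0 if na == 0 or nb == 0 else float(a @ b / (na * nb))

        # Dimensions are seed translation pairs, so the two languages align.
        seed_dims = ["fever", "blood", "chronic", "dose"]
        en_vec = np.array([2.0, 9.0, 14.0, 5.0])        # context counts: "hypertension"
        zh_vecs = {                                      # candidate Chinese terms
            "高血压": np.array([1.0, 8.0, 12.0, 4.0]),
            "糖尿病": np.array([3.0, 2.0, 10.0, 9.0]),
            "咳嗽": np.array([9.0, 1.0, 1.0, 2.0]),
        }
        ranked = sorted(zh_vecs, key=lambda t: cosine(en_vec, zh_vecs[t]), reverse=True)
        print("top candidates:", ranked)                 # 高血压 should rank first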

  15. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct method development/optimization for quantitative proteomics, which nonetheless remains challenging, largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures, and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect the quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and a rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling with technical and biological replicates, respectively, where the true positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
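    A simplified Python reading of the EN idea: estimate the null distribution of log2 ratios from a same-sample (replicate vs. replicate) comparison, set the ratio cutoff at its (1 - FADR) quantile, and verify the fraction of null features exceeding it. All values are simulated.

        import numpy as np

        def null_ratio_cutoff(null_log2, fadr=0.05):
            """Cutoff = (1 - fadr) quantile of |log2 ratios| in the null comparison."""
            return np.quantile(np.abs(null_log2), 1 - fadr)

        rng = np.random.default_rng(4)
        null_log2 = rng.normal(0, 0.25, 4000)       # replicate-vs-replicate ratios
        case_log2 = rng.normal(0, 0.25, 4000)
        case_log2[:200] += 1.0                      # 200 truly altered proteins

        cut = null_ratio_cutoff(null_log2)
        called = np.abs(case_log2) > cut
        print(f"cutoff = {cut:.2f} log2 (fold change {2 ** cut:.2f}); "
              f"{called.sum()} proteins called altered; "
              f"null exceedance {np.mean(np.abs(null_log2) > cut):.1%}")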

  16. Summary geochemical maps for samples of rock, stream sediment, and nonmagnetic heavy-mineral concentrate, Pyramid Roadless Area, El Dorado County, California

    USGS Publications Warehouse

    Chaffee, M.A.

    1986-01-01

    Geochemical sampling was conducted during 1982. This report summarizes the results of that investigation and provides details of the geochemical evaluation used in producing the final mineral resource assessment of the study area (Armstrong and others, 1983).

  17. The Cash Flow Budget. Part II--Implementation

    ERIC Educational Resources Information Center

    Gehm, Rudy

    1978-01-01

    An "aged accounts payable" (A/P) summary and a cash disbursements journal are advocated as management measures useful in monitoring the cash flow in a college store. Methods for maintaining the A/P summary and for updating the journal are illustrated. (LBH)

  18. Potential and Challenges in Collecting Social and Behavioral Data on Adolescent Alcohol Norms: Comparing Respondent-Driven Sampling and Web-Based Respondent-Driven Sampling

    PubMed Central

    Hildebrand, Janina; Burns, Sharyn; Zhao, Yun; Lobo, Roanna; Howat, Peter; Allsop, Steve

    2015-01-01

    Background Respondent-driven sampling (RDS) is a method successfully used to research hard-to-access populations. Few studies have explored the use of the Internet and social media with RDS, known as Web-based RDS (WebRDS). This study explored the use of combining both “traditional” RDS and WebRDS to examine the influences on adolescent alcohol use. Objective This paper reports on the recruitment processes and the challenges and enablers of both RDS and WebRDS. It details comparative recruitment data and provides a summary of the utility of both methods for recruiting adolescents to participate in an online survey investigating youth alcohol norms. Methods Process evaluation data collected from research staff throughout the study were used to assess the challenges and solutions of RDS and WebRDS. The Pearson chi-square test (or Fisher’s exact test where applicable) was used to compare differences in sociodemographics and drinking behavior between data collected by RDS and WebRDS. Results Of the total sample (N=1012), 232 adolescents were recruited by RDS and 780 by WebRDS. The RDS sample contained significantly larger proportions of Aboriginal or Torres Strait Islander participants (P<.001), participants who spoke English as their main language at home (P=.03), and participants of middle and lower socioeconomic status (P<.001). The RDS sample was also found to have a higher occurrence of past 7-day drinking (P<.001) and past 7-day risky drinking (P=.004). No significant differences in gender, age, past month alcohol use, and lifetime alcohol use were observed between the RDS and WebRDS samples. This study revealed that RDS and WebRDS used similar lengths of chains for recruiting participants; however, WebRDS achieved a faster rate of recruitment at a lower average cost per participant compared to RDS. Conclusions Using WebRDS resulted in significant improvements in the recruitment rate and was a more effective and efficient use of resources than the traditional RDS method. However, WebRDS resulted in partially different sample characteristics from traditional RDS. This potential effect should be considered when selecting the most appropriate data collection method. PMID:26704736
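    A minimal Python sketch of the comparison machinery named in the Methods, Pearson chi-square with Fisher's exact test as the small-count fallback; the 2x2 counts are invented, not the study's data.

        from scipy.stats import chi2_contingency, fisher_exact

        # Hypothetical 2x2 table: past-7-day drinkers by recruitment method.
        table = [[120, 112],    # RDS: drinkers, non-drinkers
                 [280, 500]]    # WebRDS: drinkers, non-drinkers
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
        # With small expected counts, switch to Fisher's exact test:
        odds, p_exact = fisher_exact(table)
        print(f"Fisher exact p = {p_exact:.4f}")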

  19. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    PubMed

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd--the mixture of emotions--conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as an upright set of faces. The third experiment replicated and extended the first two experiments using method-of-constant-stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.

  20. New version of PLNoise: a package for exact numerical simulation of power-law noises

    NASA Astrophysics Data System (ADS)

    Milotti, Edoardo

    2007-08-01

    In a recent paper I have introduced a package for the exact simulation of power-law noises and other colored noises [E. Milotti, Comput. Phys. Comm. 175 (2006) 212]: in particular, the algorithm generates 1/f^α noises with 0<α⩽2. Here I extend the algorithm to generate 1/f^α noises with 2<α⩽4 (black noises). The method is exact in the sense that it produces a sampled process with a theoretically guaranteed range-limited power-law spectrum for any arbitrary sequence of sampling intervals, i.e. the sampling times may be unevenly spaced.

    Program summary
    Title of program: PLNoise
    Catalogue identifier: ADXV_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXV_v2_0.html
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Programming language used: ANSI C
    Computer: Any computer with an ANSI C compiler; the package has been tested with gcc version 3.2.3 on Red Hat Linux 3.2.3-52 and gcc versions 4.0.0 and 4.0.1 on Apple Mac OS X 10.4
    Operating system: All operating systems capable of running an ANSI C compiler
    RAM: The code of the test program is very compact (about 60 Kbytes), but the program works with list management and allocates memory dynamically; in a typical run with average list length 2·10, the RAM taken by the list is 200 Kbytes
    External routines: The package needs external routines to generate uniform and exponential deviates. The implementation described here uses the random number generation library ranlib, freely available from Netlib [B.W. Brown, J. Lovato, K. Russell: ranlib, available from Netlib, http://www.netlib.org/random/index.html; select the C version ranlib.c], but it has also been successfully tested with the random number routines in Numerical Recipes [W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, second ed., Cambridge Univ. Press, Cambridge, 1992, pp. 274-290]. Notice that ranlib requires a pair of routines from the linear algebra package LINPACK, and the distribution of ranlib includes the C source of these routines in case LINPACK is not installed on the target machine.
    No. of lines in distributed program, including test data, etc.: 2975
    No. of bytes in distributed program, including test data, etc.: 194 588
    Distribution format: tar.gz
    Catalogue identifier of previous version: ADXV_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 212
    Does the new version supersede the previous version?: Yes
    Nature of problem: Exact generation of different types of colored noise.
    Solution method: Random superposition of relaxation processes [E. Milotti, Phys. Rev. E 72 (2005) 056701], possibly followed by an integration step to produce noise with spectral index >2.
    Reasons for the new version: Extension to 1/f^α noises with spectral index 2<α⩽4: the new version generates noises with spectral index 0<α⩽2 as well as 2<α⩽4.
    Summary of revisions: Although the overall structure remains the same, one routine has been added and several changes have been made throughout the code to include the new integration step.
    Unusual features: The algorithm is theoretically guaranteed to be exact, and unlike all other existing generators it can generate samples with uneven spacing.
    Additional comments: The program requires an initialization step; for some parameter sets this may become rather heavy.
    Running time: Running time varies widely with different input parameters; in a test run like the one in Section 3 of the long write-up, the generation routine took on average about 75 μs per sample.
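
    The solution method named above, random superposition of relaxation processes, can be illustrated with a simplified discrete-time sketch: relaxation (Ornstein-Uhlenbeck) processes whose rates follow g(λ) ∝ λ^(1-α) superpose to a spectrum S(f) ∝ 1/f^α for 0<α<2. This is not the package's exact event-driven algorithm (which also supports uneven sampling and, with an integration step, 2<α⩽4); it is only a minimal illustration of the principle.

        import numpy as np

        def power_law_noise(alpha, n_steps, dt, n_relax=200, lam_min=1e-3, lam_max=1e3, seed=0):
            """1/f^alpha noise (0 < alpha < 2) as a sum of relaxation (Ornstein-
            Uhlenbeck) processes whose rates follow g(lam) ~ lam**(1 - alpha)."""
            rng = np.random.default_rng(seed)
            # Inverse-CDF sampling of relaxation rates from g(lam) ~ lam**(1 - alpha).
            u = rng.random(n_relax)
            a, b = lam_min ** (2 - alpha), lam_max ** (2 - alpha)
            lam = (a + u * (b - a)) ** (1 / (2 - alpha))
            # Exact update of each stationary OU process, summed at every step.
            x = rng.normal(0.0, np.sqrt(1.0 / (2.0 * lam)))     # stationary start
            decay = np.exp(-lam * dt)
            kick = np.sqrt((1.0 - decay**2) / (2.0 * lam))
            out = np.empty(n_steps)
            for i in range(n_steps):
                x = x * decay + kick * rng.normal(size=n_relax)
                out[i] = x.sum()
            return out

        y = power_law_noise(alpha=1.0, n_steps=4096, dt=0.01)   # pink-noise sample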

  1. Nuclear Ensemble Approach with Importance Sampling.

    PubMed

    Kossoski, Fábris; Barbatti, Mario

    2018-06-12

    We show that the importance sampling technique can effectively augment the range of problems where the nuclear ensemble approach can be applied. A sampling probability distribution function initially determines the collection of initial conditions for which calculations are performed, as usual. Then, results for a distinct target distribution are computed by introducing compensating importance sampling weights for each sampled point. This mapping between the two probability distributions can be performed whenever they are both explicitly constructed. Perhaps most notably, this procedure allows for the computation of temperature dependent observables. As a test case, we investigated the UV absorption spectra of phenol, which has been shown to have a marked temperature dependence. Application of the proposed technique to a range that covers 500 K provides results that converge to those obtained with conventional sampling. We further show that an overall improved rate of convergence is obtained when sampling is performed at intermediate temperatures. The comparison between calculated and the available measured cross sections is very satisfactory, as the main features of the spectra are correctly reproduced. As a second test case, one of Tully's classical models was revisited, and we show that the computation of dynamical observables also profits from the importance sampling technique. In summary, the strategy developed here can be employed to assess the role of temperature for any property calculated within the nuclear ensemble method, with the same computational cost as doing so for a single temperature.
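
    The compensating-weight step described above is straightforward to sketch: samples drawn once from a sampling distribution p_s are reused for any explicitly known target distribution p_t by weighting each point with w_i = p_t(x_i)/p_s(x_i). A minimal illustration, with one-dimensional Gaussians standing in for the actual nuclear distributions:

        import numpy as np

        def importance_weights(x, p_sample, p_target):
            """Compensating weights w_i = p_target(x_i) / p_sample(x_i)."""
            w = p_target(x) / p_sample(x)
            return w / w.sum()   # normalize so weighted averages are dot products

        # Toy example: both distributions are Gaussians standing in for
        # harmonic-oscillator distributions at two temperatures.
        rng = np.random.default_rng(0)
        sigma_sample, sigma_target = 1.0, 0.8
        x = rng.normal(0.0, sigma_sample, size=50_000)   # sampled once, reused

        p_s = lambda t: np.exp(-t**2 / (2 * sigma_sample**2)) / (sigma_sample * np.sqrt(2 * np.pi))
        p_t = lambda t: np.exp(-t**2 / (2 * sigma_target**2)) / (sigma_target * np.sqrt(2 * np.pi))

        w = importance_weights(x, p_s, p_t)
        observable = x**2                      # any per-sample property
        print("reweighted <x^2>:", np.dot(w, observable))   # ~ sigma_target**2
        print("exact      <x^2>:", sigma_target**2)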

  2. GHSI EMERGENCY RADIONUCLIDE BIOASSAY LABORATORY NETWORK - SUMMARY OF THE SECOND EXERCISE.

    PubMed

    Li, Chunsheng; Bartizel, Christine; Battisti, Paolo; Böttger, Axel; Bouvier, Céline; Capote-Cuellar, Antonio; Carr, Zhanat; Hammond, Derek; Hartmann, Martina; Heikkinen, Tarja; Jones, Robert L; Kim, Eunjoo; Ko, Raymond; Koga, Roberto; Kukhta, Boris; Mitchell, Lorna; Morhard, Ryan; Paquet, Francois; Quayle, Debora; Rulik, Petr; Sadi, Baki; Sergei, Aleksanin; Sierra, Inmaculada; de Oliveira Sousa, Wanderson; Szab, Gyula

    2017-05-01

    The Global Health Security Initiative (GHSI) established a laboratory network within the GHSI community to develop collective surge capacity for radionuclide bioassay in response to a radiological or nuclear emergency as a means of enhancing response capability, health outcomes and community resilience. GHSI partners conducted an exercise in collaboration with the WHO Radiation Emergency Medical Preparedness and Assistance Network and the IAEA Response and Assistance Network, to test the participating laboratories (18) for their capabilities in in vitro assay of biological samples, using a urine sample spiked with multiple high-risk radionuclides (90Sr, 106Ru, 137Cs, and 239Pu). Laboratories were required to submit their reports within 72 h following receipt of the sample, using a pre-formatted template, on the procedures, methods and techniques used to identify and quantify the radionuclides in the sample, as well as the bioassay results with a 95% confidence interval. All of the participating laboratories identified and measured all or some of the radionuclides in the sample. However, gaps were identified in both the procedures used to assay multiple radionuclides in one sample, as well as in the methods or techniques used to assay specific radionuclides in urine. Two-thirds of the participating laboratories had difficulties in determining all the radionuclides in the sample. Results from this exercise indicate that challenges remain with respect to ensuring that results are delivered in a timely, consistent and reliable manner to support medical interventions. Laboratories within the networks are encouraged to work together to develop and maintain collective capabilities and capacity for emergency bioassay, which is an important component of radiation emergency response. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. GHSI Emergency Radionuclide Bioassay Laboratory Network - Summary of the Second Exercise

    PubMed Central

    Li, Chunsheng; Bartizel, Christine; Battisti, Paolo; Böttger, Axel; Bouvier, Céline; Capote-Cuellar, Antonio; Carr, Zhanat; Hammond, Derek; Hartmann, Martina; Heikkinen, Tarja; Jones, Robert L.; Kim, Eunjoo; Ko, Raymond; Koga, Roberto; Kukhta, Boris; Mitchell, Lorna; Morhard, Ryan; Paquet, Francois; Quayle, Debora; Rulik, Petr; Sadi, Baki; Sergei, Aleksanin; Sierra, Inmaculada; de Oliveira Sousa, Wanderson; Szabó, Gyula

    2017-01-01

    The Global Health Security Initiative (GHSI) established a laboratory network within the GHSI community to develop collective surge capacity for radionuclide bioassay in response to a radiological or nuclear emergency as a means of enhancing response capability, health outcomes and community resilience. GHSI partners conducted an exercise in collaboration with the WHO REMPAN (Radiation Emergency Medical Preparedness and Assistance Network) and the IAEA RANET (Response and Assistance Network), to test the participating laboratories (18) for their capabilities in in vitro assay of biological samples, using a urine sample spiked with multiple high-risk radionuclides (90Sr, 106Ru, 137Cs, and 239Pu). Laboratories were required to submit their reports within 72 hours following receipt of the sample, using a pre-formatted template, on the procedures, methods and techniques used to identify and quantify the radionuclides in the sample, as well as the bioassay results with a 95% confidence interval. All of the participating laboratories identified and measured all or some of the radionuclides in the sample. However, gaps were identified in both the procedures used to assay multiple radionuclides in one sample, as well as in the methods or techniques used to assay specific radionuclides in urine. Two-thirds of the participating laboratories had difficulties in determining all the radionuclides in the sample. Results from this exercise indicate that challenges remain with respect to ensuring that results are delivered in a timely, consistent and reliable manner to support medical interventions. Laboratories within the networks are encouraged to work together to develop and maintain collective capabilities and capacity for emergency bioassay, which is an important component of radiation emergency response. PMID:27574317

  4. Accelerated benzene polycarboxylic acid analysis by liquid chromatography-time-of-flight-mass spectrometry for the determination of petrogenic and pyrogenic carbon.

    PubMed

    Hindersmann, Benjamin; Achten, Christine

    2017-08-11

    Pyrogenic carbon species are of particular interest due to their ubiquitous occurrence in the environment and their high sorption capacities for nonpolar organic compounds. It has recently been shown that the analysis of the molecular markers for complex aromatic carbon structures, benzene polycarboxylic acids (BPCA), has a high potential for aid in the identification of different carbon sources. In this study, the first LC method using mass spectrometry (MS) for reliable and accelerated (<24 h) quantification of pyrogenic and petrogenic carbon by BPCA analysis has been developed. The main advantage of LC-MS compared to previous methods is the higher sensitivity, which is important if only small sample amounts are available. Sample pre-treatment could be reduced to a minimum. Deuterated phthalic acid was introduced as internal standard due to its structural similarity to BPCA and its lack of occurrence in the environment. Linear quantification with r^2 ≥ 0.997 was accomplished for all BPCA. Method validation showed an excellent quantification reproducibility (mean CV<5%) which is comparable to LC-DAD methods and more reliable than GC-FID measurements (CV 16-23%). In summary, the presented BPCA method is more economical, efficient, and presumably attractive to use. Besides reference materials, various pyrogenic and petrogenic samples were analyzed to test if the sources were indicated by BPCA analysis. In addition to pyrogenic carbon, large amounts of petrogenic carbon species can also be present in urban soils and river sediments, especially in mining regions. They, too, consist to a large degree of aromatic carbon structures and therefore have an impact on source identification by BPCA analysis. Comparison of petrogenic and pyrogenic carbon samples shows similarities in the BPCA concentrations and patterns, in their aromaticity and degree of aromatic condensation. Thus, a differentiation between petrogenic and pyrogenic carbon only by BPCA analysis of samples with unknown carbon sources is not possible. For reliable source identification of the carbon species, the combination with other methods, such as the analysis of polycyclic aromatic hydrocarbons, may be successful. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. What constitutes a high quality discharge summary? A comparison between the views of secondary and primary care doctors.

    PubMed

    Yemm, Rowan; Bhattacharya, Debi; Wright, David; Poland, Fiona

    2014-07-05

    This study aimed to identify any differences in opinion between UK hospital junior doctors and community General Practitioners (GPs) with respect to the ideal content and characteristics of discharge summaries, and to explore junior doctors' training for and awareness of post-discharge requirements of GPs. A piloted anonymous survey was posted to 74 junior doctors at a UK general hospital and 153 local GPs. Doctors were asked to rank discharge summary key content and characteristics in order of importance. GP discharge summary preferences and junior doctor training were also investigated. Non-respondents, identified by non-receipt of a separate participation card, were followed up once. Thirty-six (49%) junior doctors and 42 (28%) GPs returned completed questionnaires. Accuracy was a priority with 24 (72%) GPs and 28 (88%) junior doctors ranking it most important. Details of medication changes were considered most important by 13 (39%) GPs and 4 (12%) junior doctors. Inadequate training in discharge summary writing was reported by 13 (36%) junior doctors. Although based on small sample sizes from one location, the level and range of differences in perceived importance of reporting medication changes suggests that many discharge summaries may not currently fulfil GP requirements for managing continuity of care. Results indicate that over a third of junior doctors felt inadequately prepared for writing discharge summaries. There may therefore be both a need and professional support for further training in discharge summary writing, requiring confirmatory research.

  6. Aircraft data summaries for the SURE intensives. Final report. [Sampling done August 1977 near Rockport, Indiana and Duncan Falls, Ohio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, D.L.; Tommerdahl, J.B.; McDonald, J.A.

    1981-09-01

    As part of the EPRI sulfate regional experiment (SURE), Meteorology Research, Inc., (MRI) and Research Triangle Institute (RTI) conducted six air quality sampling programs in the eastern United States using instrumented aircraft. This volume includes the air quality and meteorological data obtained during the August 1977 Intensive when MRI sampled near the Rockport, Indiana, SURE Station and RTI sampled near the Duncan Falls, Ohio, SURE Station. Sampling data are presented for all measured parameters.

  7. Electrical properties of Apollo 17 rock and soil samples and a summary of the electrical properties of lunar material at 450 MHz frequency

    NASA Technical Reports Server (NTRS)

    Gold, T.; Bilson, E.; Baron, R. L.

    1976-01-01

    The dielectric constant and the voltage absorption length were measured for four Apollo 17 soil samples (73241, 74220, 75061, 76501) and for two Apollo 17 rock samples (76315 and 79135) at 450 MHz frequency. The dielectric constant and absorption length measurements made on the lunar samples are reviewed and related to the transition element concentration in these samples. The significance of the laboratory measurements for radar observations is discussed.

  8. Summary of Proton Test on the Actel A1280A at Indiana University

    NASA Technical Reports Server (NTRS)

    Katz, Richard; LaBel, K.

    1998-01-01

    A summary of tests performed at the Indiana University Cyclotron Facility on the Actel A1280A circuit device is described. The intent of the study was to investigate the proton response of the hard-wired S-Module flip-flops with a large sample size. The S-Module flip-flops of this device are sensitive to protons. The device's performance in the test is shown in graphs, and was typical for devices of this class.

  9. Comparison between the standard and a new alternative format of the Summary-of-Findings tables in Cochrane review users: study protocol for a randomized controlled trial.

    PubMed

    Carrasco-Labra, Alonso; Brignardello-Petersen, Romina; Santesso, Nancy; Neumann, Ignacio; Mustafa, Reem A; Mbuagbaw, Lawrence; Ikobaltzeta, Itziar Etxeandia; De Stio, Catherine; McCullagh, Lauren J; Alonso-Coello, Pablo; Meerpohl, Joerg J; Vandvik, Per Olav; Brozek, Jan L; Akl, Elie A; Bossuyt, Patrick; Churchill, Rachel; Glenton, Claire; Rosenbaum, Sarah; Tugwell, Peter; Welch, Vivian; Guyatt, Gordon; Schünemann, Holger

    2015-04-16

    Systematic reviews represent one of the most important tools for knowledge translation but users often struggle with understanding and interpreting their results. GRADE Summary-of-Findings tables have been developed to display results of systematic reviews in a concise and transparent manner. The current format of the Summary-of-Findings tables for presenting risks and quality of evidence improves understanding and assists users with finding key information from the systematic review. However, it has been suggested that additional methods to present risks and display results in the Summary-of-Findings tables are needed. We will conduct a non-inferiority parallel-armed randomized controlled trial to determine whether an alternative format to present risks and display Summary-of-Findings tables is not inferior compared to the current standard format. We will measure participant understanding, accessibility of the information, satisfaction, and preference for both formats. We will invite systematic review users to participate (that is, clinicians, guideline developers, and researchers). The data collection process will be undertaken using the online 'Survey Monkey' system. For the primary outcome, understanding, non-inferiority of the alternative format (Table A) to the current standard format (Table C) of Summary-of-Findings tables will be claimed if the upper limit of a 1-sided 95% confidence interval (for the difference in the proportion of participants answering a given question correctly) excludes a difference in favor of the current format of more than 10%. This study represents an effort to provide systematic reviewers with additional options to display review results using Summary-of-Findings tables. In this way, review authors will have a variety of methods to present risks and more flexibility to choose the most appropriate table features to display (that is, optional columns, risk expressions, complementary methods to display continuous outcomes, and so on). NCT02022631 (21 December 2013).
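
    The protocol's decision rule can be made concrete with a small sketch: compute a one-sided 95% Wald confidence interval for the difference in proportions answering correctly, and claim non-inferiority if its upper limit excludes a difference of more than 10% in favor of the standard format. All counts below are hypothetical.

        from math import sqrt
        from scipy.stats import norm

        def noninferior(x_std, n_std, x_alt, n_alt, margin=0.10, alpha=0.05):
            """One-sided Wald CI for p_std - p_alt; non-inferiority of the
            alternative format is claimed if the upper limit stays below margin."""
            p_std, p_alt = x_std / n_std, x_alt / n_alt
            diff = p_std - p_alt                      # difference favoring standard format
            se = sqrt(p_std * (1 - p_std) / n_std + p_alt * (1 - p_alt) / n_alt)
            upper = diff + norm.ppf(1 - alpha) * se   # upper limit of 1-sided 95% CI
            return diff, upper, upper < margin

        # Hypothetical data: 82% answered correctly with the standard table,
        # 78% with the alternative, 200 participants per arm.
        print(noninferior(164, 200, 156, 200))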

  10. Installation summary report : GRS instrumentation I-70 over Smith Road.

    DOT National Transportation Integrated Search

    2016-07-04

    This report presents a summary of the I-70 over Smith Road GRS Instrumentation Project (the project) in Aurora, Colorado. The report summarizes the instruments used, installation means and methods, and a discussion on the web-based data interface. CD...

  11. A Summary of Publications on Methods and Tools for Assessing Cumulative Risk, Project Summary

    EPA Science Inventory

    This collection of eight publications on cumulative risk assessment was developed collaboratively among scientists within EPA’s Office of Research and Development and three other organizations. These include scientific collaborations through an Interagency Agreement with Argonne...

  12. Equivalent statistics and data interpretation.

    PubMed

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
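
    One direction of that equivalence is easy to make concrete: from a t statistic and the two sample sizes alone, one can recover the two-sided p value and Cohen's d (a JZS Bayes factor is likewise a deterministic function of t, n1 and n2, though it requires numerical integration and is omitted here). A minimal sketch; the standard error for d is a common large-sample approximation, not part of the paper:

        import numpy as np
        from scipy import stats

        def equivalent_statistics(t, n1, n2):
            """Map a two-sample t statistic (with known n1, n2) to 'equivalent'
            summaries: two-sided p value, Cohen's d, approximate 95% CI for d."""
            df = n1 + n2 - 2
            p = 2 * stats.t.sf(abs(t), df)            # two-sided p value
            d = t * np.sqrt(1 / n1 + 1 / n2)          # Cohen's d recovered from t
            se_d = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
            return {"p": p, "d": d, "d_ci": (d - 1.96 * se_d, d + 1.96 * se_d)}

        print(equivalent_statistics(t=2.5, n1=30, n2=30))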

  13. A Summary of the Foundation Research Program.

    DTIC Science & Technology

    1980-03-01

    Electrical Engineering. Sponsor: NPS Foundation Research Program. Objective: To develop a new method for solving transient electromagnetic problems. Summary: This is a new project that is still in the start-up phase. During the next year, our goal is to develop a new iterative inverse scattering method for… Prepared for: Chief of Naval Research, Arlington, Virginia 22217, and Chief of Naval Development, Washington, D.C. 20360.

  14. Control of Aerodynamic Flows. Delivery Order 0051: Transition Prediction Method Review Summary for the Rapid Assessment Tool for Transition Prediction (RATTraP)

    DTIC Science & Technology

    2005-06-15

    …appropriate for control, and is therefore very useful for airfoil and wing design. However, Arnal (1994) and Schrauf (1994) review the different approaches… evaluation of new airfoil shapes for wings, even in 3-D, in a comparative sense. In summary, carefully used LST is the method of choice for…

  15. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
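
    Of the methods surveyed, Latin hypercube sampling with partial rank correlation coefficients (LHS-PRCC) is the most compact to sketch. The toy model below is only a stand-in for a transmission model (parameter names and ranges are hypothetical), and the PRCC follows the usual rank-transform-and-partial-out construction:

        import numpy as np

        def latin_hypercube(n_samples, bounds, rng):
            """Latin hypercube sample: one point per equal-probability stratum per dimension."""
            d = len(bounds)
            u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
                 + rng.random((n_samples, d))) / n_samples
            lo, hi = np.array(bounds).T
            return lo + u * (hi - lo)

        def prcc(X, y):
            """Partial rank correlation of each column of X with output y."""
            R = np.column_stack([np.argsort(np.argsort(c)) for c in X.T]).astype(float)
            ry = np.argsort(np.argsort(y)).astype(float)
            out = []
            for j in range(R.shape[1]):
                # Partial out the ranks of all other parameters by linear regression.
                others = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
                res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
                res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
                out.append(np.corrcoef(res_x, res_y)[0, 1])
            return np.array(out)

        # Toy 'epidemic' response: grows with transmission rate beta and contact
        # rate c, shrinks with recovery rate gamma (names purely illustrative).
        rng = np.random.default_rng(1)
        X = latin_hypercube(500, bounds=[(0.1, 1.0), (1.0, 20.0), (0.05, 0.5)], rng=rng)
        beta, c, gamma = X.T
        y = beta * c / gamma + rng.normal(0, 0.5, len(X))   # stand-in model output
        print(dict(zip(["beta", "c", "gamma"], prcc(X, y).round(2))))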

  16. TIMESERIESSTREAMING.VI: LabVIEW program for reliable data streaming of large analog time series

    NASA Astrophysics Data System (ADS)

    Czerwinski, Fabian; Oddershede, Lene B.

    2011-02-01

    With modern data acquisition devices that work fast and very precisely, scientists often face the task of dealing with huge amounts of data. These need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series at MHz sampling rates. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: for a photodiode detection system that tracks the position of an optically trapped particle and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile, as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification.

    Program summary
    Program title: TimeSeriesStreaming.VI
    Catalogue identifier: AEHT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 250
    No. of bytes in distributed program, including test data, etc.: 63 259
    Distribution format: tar.gz
    Programming language: LabVIEW (http://www.ni.com/labview/)
    Computer: Any machine running LabVIEW 8.6 or higher
    Operating system: Windows XP and Windows 7
    RAM: 60-360 Mbyte
    Classification: 3
    Nature of problem: For numerous scientific and engineering applications, it is highly desirable to have an efficient, reliable, and flexible program to perform data streaming of time series sampled with high frequencies and possibly for long time intervals. This type of data acquisition often produces very large amounts of data not easily streamed onto a computer hard disk using standard methods.
    Solution method: This LabVIEW program is developed to directly stream any kind of time series onto a hard disk. Due to optimized timing and usage of computational resources, such as multicores and protocols for memory usage, this program provides extremely reliable data acquisition. In particular, the program is optimized to deal with large amounts of data, e.g., taken with high sampling frequencies and over long time intervals. The program can be easily customized for time series analyses.
    Restrictions: Only tested in Windows-operating LabVIEW environments; must use TDMS format; acquisition cards must be LabVIEW compatible; driver DAQmx installed.
    Running time: As desirable: microseconds to hours

  17. Assessing prescription stimulant use, misuse and diversion among youth 10 to 18 years of age

    PubMed Central

    Cottler, Linda B.; Striley, Catherine Woodstock; Lasopa, Sonam O.

    2018-01-01

    Purpose Assessing medical and non-medical use of stimulants and their diversion is a challenge, especially among youth, given the differing recruitment methods and definitions of use and non-medical use across studies. The field needs inexpensive, yet effective and reliable, methods of data collection to understand the prescription drug use problem. Most studies of youth are school- or web-based, and conducted with teens. Recent Findings The National Monitoring of Adolescent Prescription Stimulants Study (N-MAPSS) recruited 11,048 youth 10 to 18 years of age from urban, rural and suburban areas in ten US cities using an entertainment venue intercept study. This review discusses the effectiveness of the method and results from four cross-sections, as well as the representativeness of the sample. Lifetime prevalence of any stimulant use was 14.8%, with rates highest among rural 16 to 18 year olds. The rate of past 30-day use was 7.3%, with over half (3.9%) being non-medical use. Nearly 12% of all youth (whether users or not) reported lifetime incoming/outgoing diversion of prescription stimulants. Summary Because no study has focused on stimulant use among youth as young as 10 and 11, this study is a landmark for future comparisons and offers a unique strategy for sampling and data collection. PMID:23896947

  18. New methods allowing the detection of protein aggregates

    PubMed Central

    Demeule, Barthélemy; Palais, Caroline; Machaidze, Gia; Gurny, Robert

    2009-01-01

    Aggregation compromises the safety and efficacy of therapeutic proteins. According to the manufacturer, the therapeutic immunoglobulin trastuzumab (Herceptin®) should be diluted in 0.9% sodium chloride before administration. Dilution in 5% dextrose solutions is prohibited. The reason for the interdiction is not mentioned in the Food and Drug Administration (FDA) documentation, but the European Medicines Agency (EMEA) Summary of Product Characteristics states that dilution of trastuzumab in dextrose solutions results in protein aggregation. In this paper, asymmetrical flow field-flow fractionation (FFF), fluorescence spectroscopy, fluorescence microscopy and transmission electron microscopy (TEM) have been used to characterize trastuzumab samples diluted in 0.9% sodium chloride, a stable infusion solution, as well as in 5% dextrose (a solution prone to aggregation). When trastuzumab samples were injected in the FFF channel using a standard separation method, no difference could be seen between trastuzumab diluted in sodium chloride and trastuzumab diluted in dextrose. However, during FFF measurements made with appropriate protocols, aggregates were detected in 5% dextrose. The parameters enabling the detection of reversible trastuzumab aggregates are described. Aggregates could also be documented by fluorescence microscopy and TEM. Fluorescence spectroscopy data were indicative of conformational changes consistent with increased aggregation and adsorption to surfaces. The analytical methods presented in this study were able to detect and characterize trastuzumab aggregates. PMID:20061815

  19. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
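
    The paper's tests build on linear mixed models for summary statistics; a simpler member of the same family, and a useful baseline, is the omnibus statistic T = z' R^{-1} z, which is chi-square with K degrees of freedom under the null once the between-phenotype correlation R has been estimated from genome-wide null z-scores. The sketch below implements only this baseline, not the authors' common-mean-plus-variance-component tests.

        import numpy as np
        from scipy import stats

        def omnibus_multiphenotype_test(z, R):
            """Joint chi-square test of one variant's z-scores across K phenotypes,
            accounting for the between-phenotype correlation R (K x K)."""
            z = np.asarray(z, dtype=float)
            T = z @ np.linalg.solve(R, z)        # T = z' R^{-1} z ~ chi2_K under H0
            return T, stats.chi2.sf(T, df=len(z))

        # R would normally be the empirical correlation of z-scores over a large
        # set of putatively null variants genome-wide; here it is given directly.
        R = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.4],
                      [0.3, 0.4, 1.0]])
        z = np.array([2.1, 2.4, -0.5])           # one variant, three phenotypes
        print(omnibus_multiphenotype_test(z, R))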

  20. Challenges and complexities of multifrequency atomic force microscopy in liquid environments

    PubMed Central

    2014-01-01

    Summary This paper illustrates through numerical simulation the complexities encountered in high-damping AFM imaging, as in liquid environments, within the specific context of multifrequency atomic force microscopy (AFM). The focus is primarily on (i) the amplitude and phase relaxation of driven higher eigenmodes between successive tip–sample impacts, (ii) the momentary excitation of non-driven higher eigenmodes and (iii) base excitation artifacts. The results and discussion are mostly applicable to the cases where higher eigenmodes are driven in open loop and frequency modulation within bimodal schemes, but some concepts are also applicable to other types of multifrequency operations and to single-eigenmode amplitude and frequency modulation methods. PMID:24778952

  1. Advanced ETC/LSS computerized analytical models, CO2 concentration. Volume 1: Summary document

    NASA Technical Reports Server (NTRS)

    Taylor, B. N.; Loscutoff, A. V.

    1972-01-01

    Computer simulations have been prepared for the concepts of CO2 concentration which have the potential for maintaining a CO2 partial pressure of 3.0 mmHg, or less, in a spacecraft environment. The simulations were performed using the G-189A Generalized Environmental Control computer program. In preparing the simulations, new subroutines to model the principal functional components for each concept were prepared and integrated into the existing program. Sample problems were run to demonstrate the methods of simulation and performance characteristics of the individual concepts. Comparison runs for each concept can be made for parametric values of cabin pressure, crew size, cabin air dry and wet bulb temperatures, and mission duration.

  2. Ecological Momentary Assessment is a Neglected Methodology in Suicidology.

    PubMed

    Davidson, Collin L; Anestis, Michael D; Gutierrez, Peter M

    2017-01-02

    Ecological momentary assessment (EMA) is a group of research methods that collect data frequently, in many contexts, and in real-world settings. EMA has been fairly neglected in suicidology. The current article provides an overview of EMA for suicidologists including definitions, data collection considerations, and different sampling strategies. Next, the benefits of EMA in suicidology (i.e., reduced recall bias, accurate tracking of fluctuating variables, testing assumptions of theories, use in interventions), participant safety considerations, and examples of published research that investigate self-directed violence variables using EMA are discussed. The article concludes with a summary and suggested directions for EMA research in suicidology with the particular aim to spur the increased use of this methodology among suicidologists.

  3. Solid-phase data from cores at the proposed Dewey Burdock uranium in-situ recovery mine, near Edgemont, South Dakota

    USGS Publications Warehouse

    Johnson, Raymond H.; Diehl, Sharon F.; Benzel, William M.

    2013-01-01

    This report releases solid-phase data from cores at the proposed Dewey Burdock uranium in-situ recovery site near Edgemont, South Dakota. These cores were collected by Powertech Uranium Corporation, and material not used for their analyses was given to the U.S. Geological Survey for additional sampling and analyses. These additional analyses included total carbon and sulfur, whole rock acid digestion for major and trace elements, 234U/238U activity ratios, X-ray diffraction, thin sections, scanning electron microscopy analyses, and cathodoluminescence. This report provides the methods and data results from these analyses along with a short summary of observations.

  4. Enhancing forensic science with spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Ricci, Camilla; Kazarian, Sergei G.

    2006-09-01

    This presentation outlines the research we are developing in the area of Fourier Transform Infrared (FTIR) spectroscopic imaging with the focus on materials of forensic interest. FTIR spectroscopic imaging has recently emerged as a powerful tool for characterisation of heterogeneous materials. FTIR imaging relies on the ability of the military-developed infrared array detector to simultaneously measure spectra from thousands of different locations in a sample. Recently developed application of FTIR imaging using an ATR (Attenuated Total Reflection) mode has demonstrated the ability of this method to achieve spatial resolution beyond the diffraction limit of infrared light in air. Chemical visualisation with enhanced spatial resolution in micro-ATR mode broadens the range of materials studied with FTIR imaging with applications to pharmaceutical formulations or biological samples. Macro-ATR imaging has also been developed for chemical imaging analysis of large surface area samples and was applied to analyse the surface of human skin (e.g. finger), counterfeit tablets, textile materials (clothing), etc. This approach demonstrated the ability of this imaging method to detect trace materials attached to the surface of the skin. This may also prove a valuable tool in detection of traces of explosives left or trapped on the surfaces of different materials. This FTIR imaging method is substantially superior to many of the other imaging methods due to the inherent chemical specificity of infrared spectroscopy and the fast acquisition times of this technique. Our preliminary data demonstrated that this methodology will provide the means for a non-destructive detection method that could relate evidence to its source. This will be important in a wider crime prevention programme. In summary, the intrinsic chemical specificity and enhanced visualising capability of FTIR spectroscopic imaging open a window of opportunities for counter-terrorism and crime-fighting, with applications ranging from analysis of trace evidence (e.g. in soil), tablets, drugs, fibres, tapes, explosives and biological samples to detection of gunshot residues and imaging of fingerprints.

  5. A Modest Proposal: Permit Interlocutory Appeals of Summary Judgment Denials

    DTIC Science & Technology

    1994-04-01

    A Modest Proposal: Permit Interlocutory Appeals of Summary Judgment Denials. Michael J. Davidson, Major, U.S. Army Judge Advocate General's Corps. Abstract: In 1986, the… interpreting Rule 56 of the Federal Rules of Civil Procedure, discusses the shortcomings of potential avenues of appeal, and suggests two methods by which…

  6. A brief update on physical and optical disector applications and sectioning-staining methods in neuroscience.

    PubMed

    Yurt, Kıymet Kübra; Kivrak, Elfide Gizem; Altun, Gamze; Mohamed, Hamza; Ali, Fathelrahman; Gasmalla, Hosam Eldeen; Kaplan, Suleyman

    2018-02-26

    A quantitative description of a three-dimensional (3D) object based on two-dimensional images can be made using stereological methods. These methods involve unbiased approaches and provide reliable results with quantitative data. The quantitative morphology of the nervous system has been thoroughly researched in this context. In particular, various novel methods such as design-based stereological approaches have been applied in neuromorphological studies. The main foundations of these methods are systematic random sampling and a 3D approach to structures such as tissues and organs. One key point in these methods is that selected samples should represent the entire structure. Quantification of neurons, i.e. particles, is important for revealing degrees of neurodegeneration and regeneration in an organ or system. One of the most crucial morphometric parameters in biological studies is thus the "number". The disector counting method introduced by Sterio in 1984 is an efficient and reliable solution for particle number estimation. In order to obtain precise results by means of stereological analysis, counting items should be seen clearly in the tissue. If an item in the tissue cannot be seen, it cannot be analyzed even using unbiased stereological techniques. Staining and sectioning processes therefore play a critical role in stereological analysis. The purpose of this review is to evaluate current neuroscientific studies using optical and physical disector counting methods and to discuss their definitions and methodological characteristics. Although the efficiency of the optical disector method in light microscopic studies has been demonstrated in recent years, the physical disector method is more easily performed in electron microscopic studies. In this review we also offer readers summaries of some common basic staining and sectioning methods that can be used with stereological techniques. Copyright © 2018 Elsevier B.V. All rights reserved.
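
    As one concrete instance of disector-based counting (not a method named in this review, but the common combination in practice), the optical fractionator scales the disector counts by the inverse sampling fractions. A minimal sketch; all counts and fractions below are hypothetical:

        def fractionator_estimate(q_minus, ssf, asf, tsf):
            """Optical fractionator estimate of total particle number:
            N = counted particles / (section * area * thickness sampling fractions)."""
            return q_minus / (ssf * asf * tsf)

        # Hypothetical counts: 150 neurons counted in every 10th section, with
        # counting frames covering 5% of each section area and the disector
        # height covering 60% of the section thickness.
        print(fractionator_estimate(q_minus=150, ssf=1 / 10, asf=0.05, tsf=0.60))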

  7. CTEPP STANDARD OPERATING PROCEDURE FOR ENTERING OR IMPORTING ELECTRONIC DATA INTO THE CTEPP DATABASE (SOP-4.12)

    EPA Science Inventory

    This SOP described the method used to automatically parse analytical data generated from gas chromatography/mass spectrometry (GC/MS) analyses into CTEPP summary spreadsheets and electronically import the summary spreadsheets into the CTEPP study database.

  8. Activities and summary statistics of radon-222 in stream- and ground-water samples, Owl Creek basin, north-central Wyoming, September 1991 through March 1992

    USGS Publications Warehouse

    Ogle, K.M.; Lee, R.W.

    1994-01-01

    Radon-222 activity was measured for 27 water samples from streams, an alluvial aquifer, bedrock aquifers, and a geothermal system, in and near the 510-square-mile area of Owl Creek Basin, north-central Wyoming. Summary statistics of the radon-222 activities are compiled. For 16 stream-water samples, the arithmetic mean radon-222 activity was 20 pCi/L (picocuries per liter), geometric mean activity was 7 pCi/L, harmonic mean activity was 2 pCi/L and median activity was 8 pCi/L. The standard deviation of the arithmetic mean is 29 pCi/L. The activities in the stream-water samples ranged from 0.4 to 97 pCi/L. The histogram of stream-water samples is right-skewed (long upper tail) when compared to a normal distribution. For 11 ground-water samples, the arithmetic mean radon-222 activity was 486 pCi/L, geometric mean activity was 280 pCi/L, harmonic mean activity was 130 pCi/L and median activity was 373 pCi/L. The standard deviation of the arithmetic mean is 500 pCi/L. The activity in the ground-water samples ranged from 25 to 1,704 pCi/L. The histogram of ground-water samples is right-skewed when compared to a normal distribution. (USGS)
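
    The ordering arithmetic > geometric > harmonic mean reported above is characteristic of skewed, strictly positive data like these activities. A minimal sketch of the summary statistics; the activity values below are hypothetical, chosen only to show the same pattern:

        import numpy as np
        from scipy import stats

        def summary_means(activity):
            """Arithmetic, geometric, and harmonic means (pCi/L), as tabulated above."""
            a = np.asarray(activity, dtype=float)
            return {
                "arithmetic": a.mean(),
                "geometric": stats.gmean(a),
                "harmonic": stats.hmean(a),
                "median": np.median(a),
                "std": a.std(ddof=1),
            }

        # Hypothetical stream-water activities (pCi/L); the skew makes
        # arithmetic > geometric > harmonic, as in the report.
        print(summary_means([0.4, 2.0, 5.0, 8.0, 12.0, 30.0, 97.0]))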

  9. 78 FR 45566 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Coal Mine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... for OMB Review; Comment Request; Coal Mine Dust Sampling Devices ACTION: Notice. SUMMARY: The... information collection request (ICR) titled, ``Coal Mine Dust Sampling Devices,'' to the Office of Management...) determine the concentration of respirable dust in coal mines. CPDMs must be designed and constructed for...

  10. Improving Statistics Education through Simulations: The Case of the Sampling Distribution.

    ERIC Educational Resources Information Center

    Earley, Mark A.

    This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…

  11. PubMed Central

    Genovese, C.; Facciolà, A.; Palamara, M.A.R.; Squeri, R.

    2017-01-01

    Summary Introduction. Nosocomial infections are one of the greatest problems in public health. Several studies have highlighted the role played by the hospital environment as a possible source of transmission of nosocomial pathogens. Methods. A five-year monitoring of bacterial contamination on healthcare workers' hands, the surfaces most closely in contact with inpatients in wards, operating theatres, and "at rest" and "in use" operating theatre air samples. For the samples, we used sterile swabs, contact slides, manual API, and automated VITEK systems for identification. Results. In the five-year period, a total of 9396 samples were collected and analysed. In ward patients, 4398 samplings were carried out with 4.7%, 9.4%, 7%, 10.8% and 7.9% positive results respectively from 2010 to 2014. For hands, 648 samplings were carried out, with a positivity of 40.74%. In operating theatres, 4188 samples were taken, with a positivity of 11.9%. Regarding air in empty and full theatres, 1962 samplings were carried out with a positivity rate equal to 31.9%. The monitoring showed a low rate of contamination with a progressive decrease over the five-year period on operating theatre surfaces and hands, while there was an increase in the surgical site wards and in the air of operating rooms. Conclusions. Our investigation has revealed the presence of pathogens on the assessed surfaces and the need for environmental monitoring, which can be a valuable tool for reducing contamination. PMID:28900357

  12. Will improving access to dental care improve oral health-related quality of life?

    PubMed

    Crocombe, L A; Mahoney, G D; Spencer, A J; Waller, M

    2013-06-01

    The aim of this study was to determine if Australian Defence Force (ADF) members had better oral health-related quality of life (OHRQoL) than the general Australian population and whether the difference was due to better access to dental care. The OHRQoL, as measured by OHIP-14 summary indicators, of participants from the Defence Deployed Solomon Islands (SI) Health Study and the National Survey of Adult Oral Health 2004-06 (NSAOH) were compared. The SI sample was age/gender status-adjusted to match that of the NSAOH sample which was age/gender/regional location weighted to that of the Australian population. NSAOH respondents with good access to dental care had lower OHIP-14 summary measures [frequency of impacts 8.5% (95% CI = 5.4, 11.6), extent mean = 0.16 (0.11, 0.22), severity mean = 5.0 (4.4, 5.6)] than the total NSAOH sample [frequency 18.6 (16.6, 20.7); extent 0.52 (0.44, 0.59); severity 7.6 (7.1, 8.1)]. The NSAOH respondents with both good access to dental care and self-reported good general health did not have as low OHIP-14 summary scores as in the SI sample [frequency 2.6 (1.2, 5.4), extent 0.05 (0.01, 0.10); severity 2.6 (1.9, 3.4)]. ADF members had better OHRQoL than the general Australian population, even those with good access to dental care and self-reported good general health. © 2013 Australian Dental Association.

  13. Application of denaturing high-performance liquid chromatography for monitoring sulfate-reducing bacteria in oil fields.

    PubMed

    Priha, Outi; Nyyssönen, Mari; Bomberg, Malin; Laitila, Arja; Simell, Jaakko; Kapanen, Anu; Juvonen, Riikka

    2013-09-01

    Sulfate-reducing bacteria (SRB) participate in microbially induced corrosion (MIC) of equipment and H2S-driven reservoir souring in oil field sites. Successful management of industrial processes requires methods that allow robust monitoring of microbial communities. This study investigated the applicability of denaturing high-performance liquid chromatography (DHPLC) targeting the dissimilatory sulfite reductase β-subunit (dsrB) gene for monitoring SRB communities in oil field samples from the North Sea, the United States, and Brazil. Fifteen of the 28 screened samples gave a positive result in real-time PCR assays, containing 9 × 10^1 to 6 × 10^5 dsrB gene copies ml^-1. DHPLC and denaturing gradient gel electrophoresis (DGGE) community profiles of the PCR-positive samples shared an overall similarity; both methods revealed the same samples to have the lowest and highest diversity. The SRB communities were diverse, and different dsrB compositions were detected at different geographical locations. The identified dsrB gene sequences belonged to several phylogenetic groups, such as Desulfovibrio, Desulfococcus, Desulfomicrobium, Desulfobulbus, Desulfotignum, Desulfonatronovibrio, and Desulfonauticus. DHPLC showed an advantage over DGGE in that the community profiles were very reproducible from run to run, and the resolved gene fragments could be collected using an automated fraction collector and sequenced without a further purification step. DGGE, on the other hand, included casting of gradient gels, and several rounds of rerunning, excising, and reamplification of bands were needed for successful sequencing. In summary, DHPLC proved to be a suitable tool for routine monitoring of the diversity of SRB communities in oil field samples.

  14. Determination of vitamins D2 and D3 in selected food matrices by online high-performance liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS).

    PubMed

    Nestola, Marco; Thellmann, Andrea

    2015-01-01

    An online normal-phase liquid chromatography-gas chromatography-mass spectrometry (HPLC-GC-MS) method was developed for the determination of vitamins D2 and D3 in selected food matrices. Transfer of the sample from HPLC to GC was realized by large volume on-column injection; detection was performed with a time-of-flight mass spectrometer (TOF-MS). Typical GC problems in the determination of vitamin D such as sample degradation or sensitivity issues, previously reported in the literature, were not observed. Determination of total vitamin D content was done by quantitation of its pyro isomer based on an isotopically labelled internal standard (ISTD). Extracted ion traces of analyte and ISTD showed cross-contribution, but non-linearity of the calibration curve was not determined inside the chosen calibration range by selection of appropriate quantifier ions. Absolute limits of detection (LOD) and quantitation (LOQ) for vitamins D2 and D3 were calculated as approximately 50 and 150 pg, respectively. Repeatability with internal standard correction was below 2 %. Good agreement between quantitative results of an established high-performance liquid chromatography with UV detection (HPLC-UV) method and HPLC-GC-MS was found. Sterol-enriched margarine was subjected to HPLC-GC-MS and HPLC-MS/MS for comparison, because HPLC-UV showed strong matrix interferences. HPLC-GC-MS produced comparable results with less manual sample cleanup. In summary, online hyphenation of HPLC and GC allowed a minimization in manual sample preparation with an increase of sample throughput.

  15. Comparison of enzyme-linked immunosorbent assay and rapid chemiluminescent analyser in the detection of myeloperoxidase and proteinase 3 autoantibodies.

    PubMed

    Pucar, Phillippa A; Hawkins, Carolyn A; Randall, Katrina L; Li, Candice; McNaughton, Euan; Cook, Matthew C

    2017-06-01

    Antibodies to myeloperoxidase (MPO) and proteinase 3 (PR3) are vital in the diagnosis and management of ANCA-associated vasculitis. A chemiluminescent immunoassay (CLIA; Quanta Flash) provides MPO and PR3 antibody results in 30 minutes, which is much faster than enzyme-linked immunosorbent assay (ELISA). We compared the performance of ELISA (Orgentec) and CLIA (Quanta Flash) for MPO and PR3 antibody quantitation on 303 samples, comprising 196 consecutive samples received in a single diagnostic laboratory over a 3-month period, and 107 samples collected from 42 known vasculitis patients over a 40-month period. We observed a correlation between both methods using Spearman correlation coefficients (MPO, r_s = 0.63, p < 0.01; PR3, r_s = 0.69, p < 0.01). There was agreement between both methods in determining a positive or negative result. In the vasculitis cohort, CLIA performed well at clinically important stages of disease: diagnosis (eight samples, all positive by both assays) and disease relapse (correlation for both MPO and PR3 antibody quantitation, r_s = 0.84, p = 0.03 and r_s = 0.78, p < 0.01, respectively). Three samples were discordant at clinical relapse, testing positive by CLIA, including one high positive associated with relapse requiring a change in treatment. In summary, CLIA appears to be at least as accurate as ELISA for measurement of MPO and PR3 antibodies. Copyright © 2017. Published by Elsevier B.V.
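
    The method comparison rests on Spearman rank correlation of paired measurements, which is robust to the differing scales of the two assays. A minimal sketch with hypothetical paired values:

        import numpy as np
        from scipy import stats

        # Hypothetical paired MPO antibody levels from the same sera measured by
        # ELISA and CLIA (arbitrary units); rank correlation is used because the
        # assay scales differ and the relationship need not be linear.
        elisa = np.array([5, 12, 30, 44, 70, 110, 200, 350])
        clia = np.array([8, 10, 41, 50, 66, 180, 210, 500])

        r_s, p = stats.spearmanr(elisa, clia)
        print(f"r_s = {r_s:.2f}, p = {p:.3g}")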

  16. Pharmacokinetics Evaluation of Carbon Nanotubes Using FTIR Analysis and Histological Analysis.

    PubMed

    Gherman, Claudia; Tudor, Matea Cristian; Constantin, Bele; Flaviu, Tabaran; Stefan, Razvan; Maria, Bindea; Chira, Sergiu; Braicu, Cornelia; Pop, Laura; Petric, Roxana Cojocneanu; Berindan-Neagoe, Ioana

    2015-04-01

    Carbon nanotubes (CNTs) are biologically non-toxic and long-circulating nanostructures that have special physical properties. This study focused on developing alternative methods for tracking carbon nanotubes, such as FT-IR, alongside the classical tissue histological procedure. FT-IR absorption spectra were used to confirm the carboxylation of carbon nanotubes and to evaluate the presence of carbon nanotubes in bulk spleen samples and histologically prepared samples (control spleen and spleen with SWCNT-COOH). The FT-IR spectrum of a spleen sample from animals injected with CNTs shows major spectral differences consisting in infrared bands located at ~1173 cm^-1, ~1410 cm^-1, ~1658 cm^-1, ~1737 cm^-1 and around 1720 cm^-1, respectively. In terms of localization of carbon nanotubes, selective accumulation in marginal zone macrophages and splenic red pulp is observed for all treated groups, indicating the presence of carbon nanotubes even at 3, 4 and 7 days after treatment. In summary, we believe that histological evaluation and FT-IR can provide more characteristic information about the pharmacokinetics and the clearance of carbon nanotubes.

  17. Sequim Marine Research Laboratory routine environmental measurements during CY-1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, J.J.; Blumer, P.J.

    1978-06-01

    Beginning in 1976, a routine environmental program was established at the Marine Research Laboratory (MRL) at Sequim, Washington. The program is intended to demonstrate the negligible impact of current MRL operations on the surrounding environs and to provide baseline data through which any cumulative impact could be detected. The sampling frequency is greater during the first 2 years of the program to provide sufficient initial information to allow reliable estimates of observed radionuclide concentrations and to construct a long-term sampling program. The program is designed, primarily, to determine levels of radioactivity present in selected biota in Sequim Bay. The biota were selected because of their presence near the laboratory and their capacity to concentrate trace elements. Other samples were obtained to determine the radionuclides in Sequim Bay and laboratory drinking water, as well as the ambient radiation exposure levels and surface deposition of fallout radionuclides for the laboratory area. Appendix A provides a summary of the analytical methods used. The present document includes data obtained during CY 1977 in addition to CY-1976 data published previously.

  18. Walter Reed Army Medical Center, Washington, DC Annual Progress Report FY-89. Volume 2. Part 1

    DTIC Science & Technology

    1990-01-02

    …recurrent head and neck cancer who meet the eligibility requirements and consent to the protocol will have a central venous catheter placed (if one is… Report date: 06/29/89. Work Unit # 6220, Detail Summary Sheet. Title: Epidemiology of HIV in Pediatric and Perinatal Patients - A Natural History Study… of HIV in clinical samples. Report date: 04/04/89. Work Unit # 8804, Detail Summary Sheet. Title: The Natural History of HIV Infection and Disease

  19. A Workshop on the Integration of Numerical and Symbolic Computing Methods Held in Saratoga Springs, New York on July 9-11, 1990

    DTIC Science & Technology

    1991-04-01

    The available excerpt is from the project summary (for public use), which notes that the summary (about 200 words) must be self-contained and intelligible to a scientifically literate reader, and that the workshop fostered dialogue among researchers in symbolic methods and numerical computation and their applications in certain disciplines of artificial intelligence. The remainder of the excerpt is a fragment of a participant list, with affiliations including Purdue University (West Lafayette, IN) and the Massachusetts Institute of Technology Artificial Intelligence Laboratory.

  20. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
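
    As a rough illustration of the procedure described above, the following Python sketch tests the difference between two summary histograms with the Euclidean distance as the test statistic, resampling pooled individual histograms with replacement to build the null distribution. The pooling scheme, group sizes and bin counts are assumptions for illustration, not the paper's exact implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        def summary_hist(h):                   # h: (n_objects, n_bins) counts
            s = h.sum(axis=0).astype(float)
            return s / s.sum()                 # normalised summary histogram

        def euclid(p, q):
            return np.sqrt(((p - q) ** 2).sum())

        def bootstrap_pvalue(h1, h2, n_boot=2000):
            d_obs = euclid(summary_hist(h1), summary_hist(h2))
            pooled = np.vstack([h1, h2])       # pool individual histograms (H0)
            n, n1, n2 = len(pooled), len(h1), len(h2)
            d_null = np.empty(n_boot)
            for b in range(n_boot):
                g1 = pooled[rng.integers(0, n, n1)]   # resample with replacement
                g2 = pooled[rng.integers(0, n, n2)]
                d_null[b] = euclid(summary_hist(g1), summary_hist(g2))
            return (d_null >= d_obs).mean()    # bootstrap significance level

        # Toy demo: two size categories of 20-bin cloud-object histograms.
        h1 = rng.poisson(5.0, size=(40, 20))
        h2 = rng.poisson(5.5, size=(50, 20))
        print(bootstrap_pvalue(h1, h2))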

  1. Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Justin Matthew

    These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, a mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute-force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
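
    The brute-force Monte Carlo step described above can be illustrated with a short Python sketch: sample the uncertain inputs, run the model, and report the 95% data range about the median of the output. The toy response function and parameter bounds below are assumptions; the actual study ran PAGOSA hydrocode simulations.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        # Illustrative uncertain inputs (bounds are assumptions, not study values).
        det_velocity = rng.uniform(7.8, 8.2, n)    # detonation velocity, km/s
        init_density = rng.uniform(1.68, 1.74, n)  # initial density, g/cc

        def toy_jet_tip_velocity(d, rho):
            # Stand-in response; the study evaluated this with PAGOSA runs.
            return 1.05 * d + 0.3 * rho

        out = toy_jet_tip_velocity(det_velocity, init_density)
        lo, med, hi = np.percentile(out, [2.5, 50.0, 97.5])
        print(f"median {med:.3f}, 95% data range [{lo:.3f}, {hi:.3f}]")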

  2. Evaluating paper degradation progress. Cross-linking between chromatographic, spectroscopic and chemical results

    NASA Astrophysics Data System (ADS)

    Łojewski, Tomasz; Zięba, Katarzyna; Knapik, Arkadiusz; Bagniuk, Jacek; Lubańska, Anna; Łojewska, Joanna

    2010-09-01

    The study presents an overview of the chromatographic (SEC), spectroscopic (FTIR, UV/VIS), viscometric (DP) and chemical methods (titration, pH) used for the evaluation of the degradation progress of various kinds of paper under various conditions. The methods were chosen to follow different routes of paper degradation. Model paper samples represented boundary paper types, from pure cellulose cotton paper, through softwood paper, to low-quality acidic, sized groundwood paper. The accelerated ageing conditions were adjusted to achieve maximum effect (climatic chamber, RH 59%, 90 °C) and also to mimic the environment inside books (closed vials). The results were interpreted against literature data on the degradation mechanisms and compared in terms of paper types and ageing conditions. Estimators of coupled depolymerisation and oxidation have been proposed based on the correlation between SEC, UV/VIS and titrimetric copper number determination. The overall oxidation index derived from FTIR results was shown to correlate with the combined -CHO and -COOH concentration determined by titrimetric methods.

  3. Regression analysis of sparse asynchronous longitudinal data

    PubMed Central

    Cao, Hongyuan; Zeng, Donglin; Fine, Jason P.

    2015-01-01

    Summary We consider estimation of regression models for sparse asynchronous longitudinal observations, where time-dependent responses and covariates are observed intermittently within subjects. Unlike with synchronous data, where the response and covariates are observed at the same time point, with asynchronous data the observation times are mismatched. Simple kernel-weighted estimating equations are proposed for generalized linear models with either time-invariant or time-dependent coefficients, under smoothness assumptions for the covariate processes similar to those for synchronous data. For models with either time-invariant or time-dependent coefficients, the estimators are consistent and asymptotically normal but converge at slower rates than those achieved with synchronous data. Simulation studies provide evidence that the methods perform well with realistic sample sizes and may be superior to a naive application of methods for synchronous data based on an ad hoc last-value-carried-forward approach. The practical utility of the methods is illustrated on data from a study on human immunodeficiency virus. PMID:26568699
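
    A minimal sketch of the kernel-weighting idea, assuming a linear model with a time-invariant coefficient, no intercept and a single subject: each (response time, covariate time) pair is weighted by a kernel of the time mismatch, and the estimating equation reduces to a weighted least-squares slope. In practice the sums run over all subjects and the bandwidth h is chosen data-adaptively; everything below is illustrative.

        import numpy as np

        def epanechnikov(u):
            return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)

        def fit_slope(t_y, y, t_x, x, h):
            # Pair weights: how close is each response time to each covariate time?
            W = epanechnikov((t_y[:, None] - t_x[None, :]) / h)   # (n_y, n_x)
            # Solve sum_jk W_jk * x_k * (y_j - beta * x_k) = 0 for beta.
            return (W * np.outer(y, x)).sum() / (W * (x ** 2)[None, :]).sum()

        t_y = np.array([0.2, 0.5, 0.9]); y = np.array([1.1, 2.0, 2.9])
        t_x = np.array([0.1, 0.6, 1.0]); x = np.array([0.5, 1.0, 1.5])
        print(fit_slope(t_y, y, t_x, x, h=0.3))   # toy asynchronous data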

  4. Development of Loop-Mediated Isothermal Amplification (LAMP) Assay for Rapid and Sensitive Identification of Ostrich Meat

    PubMed Central

    Abdulmawjood, Amir; Grabowski, Nils; Fohler, Svenja; Kittler, Sophie; Nagengast, Helga; Klein, Guenter

    2014-01-01

    Animal species identification is one of the primary duties of official food control. Since ostrich meat is difficult to differentiate macroscopically from beef, new analytical methods are needed. To enforce labeling regulations for the authentication of ostrich meat, it is of importance to develop and evaluate a rapid and reliable assay. In the present study, a loop-mediated isothermal amplification (LAMP) assay based on the cytochrome b gene of the mitochondrial DNA of the species Struthio camelus was developed. The LAMP assay was used in combination with a real-time fluorometer. The developed system allowed the detection of ostrich meat at a level of 0.01% in meat products. In parallel, a direct swab method without nucleic acid extraction, using the HYPLEX LPTV buffer, was also evaluated. This rapid processing method allowed detection of ostrich meat without major incubation steps. In summary, the LAMP assay had excellent sensitivity and specificity for detecting ostrich meat and could provide a sampling-to-result identification time of 15 to 20 minutes. PMID:24963709

  5. Numerical methods for stochastic differential equations

    NASA Astrophysics Data System (ADS)

    Kloeden, Peter; Platen, Eckhard

    1991-06-01

    The numerical analysis of stochastic differential equations differs significantly from that of ordinary differential equations due to the peculiarities of stochastic calculus. This book provides an introduction to stochastic calculus and stochastic differential equations, in both theory and applications. The main emphasis is placed on the numerical methods needed to solve such equations. It assumes an undergraduate background in mathematical methods typical of engineers and physicists, though many chapters begin with a descriptive summary which may be accessible to others who only require numerical recipes. To help the reader develop an intuitive understanding of the underlying mathematics and hands-on numerical skills, exercises and over 100 PC exercises (PC: personal computer) are included. The stochastic Taylor expansion provides the key tool for the systematic derivation and investigation of discrete-time numerical methods for stochastic differential equations. The book presents many new results on higher-order methods for strong sample path approximations and for weak functional approximations, including implicit, predictor-corrector, extrapolation and variance-reduction methods. Besides serving as a basic text on such methods, the book offers the reader ready access to a large number of potential research problems in a field that is just beginning to expand rapidly and is widely applicable.
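
    The simplest strong scheme of the kind treated in such texts is the Euler-Maruyama method; a minimal Python sketch for geometric Brownian motion dX = mu*X dt + sigma*X dW follows, with illustrative parameters. The Milstein scheme would add the correction term 0.5 * sigma^2 * X * (dW^2 - dt) to each step, raising the strong order from 0.5 to 1.0.

        import numpy as np

        rng = np.random.default_rng(42)
        mu, sigma, x0, T, n = 0.05, 0.2, 1.0, 1.0, 1000
        dt = T / n

        x = np.empty(n + 1)
        x[0] = x0
        dW = rng.normal(0.0, np.sqrt(dt), n)   # Brownian increments
        for k in range(n):
            # Euler-Maruyama step: deterministic drift plus stochastic diffusion.
            x[k + 1] = x[k] + mu * x[k] * dt + sigma * x[k] * dW[k]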

  6. 75 FR 7368 - Closed Captioning of Video Programming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... Captioning of Video Programming AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: In this document, the Commission amends the closed captioning rules to add another method by which video... . SUPPLEMENTARY INFORMATION: This is a summary of the Commission's document FCC 09-109, Closed Captioning of Video...

  7. Improvement of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry identification of difficult-to-identify bacteria and its impact in the workflow of a clinical microbiology laboratory.

    PubMed

    Rodríguez-Sánchez, Belén; Marín, Mercedes; Sánchez-Carrillo, Carlos; Cercenado, Emilia; Ruiz, Adrián; Rodríguez-Créixems, Marta; Bouza, Emilio

    2014-05-01

    This study evaluates the capability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) for the identification of difficult-to-identify microorganisms. A total of 150 bacterial isolates inconclusively identified with conventional phenotypic tests were further assessed by 16S rRNA sequencing and by MALDI-TOF MS following two methods: (a) a simplified formic acid-based on-plate extraction and (b) a tube-based extraction step. Using the simplified method, 29 isolates could not be identified. For the remaining 121 isolates (80.7%), we obtained a reliable identification by MALDI-TOF: in 103 isolates, the identification by 16S rRNA sequencing and MALDI-TOF coincided at the species level (68.7% of the total 150 analyzed isolates and 85.1% of the samples with a MALDI-TOF result), and in 18 isolates, the identification by both methods coincided at the genus level (12% of the total and 14.9% of the samples with MALDI-TOF results). No discordant results were observed. The performance of the tube-based extraction step allowed the identification at the species level of 6 of the 29 isolates unidentified by the simplified method. In summary, MALDI-TOF can be used for the rapid identification of many bacterial isolates inconclusively identified by conventional methods. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Challenges in Species Tree Estimation Under the Multispecies Coalescent Model

    PubMed Central

    Xu, Bo; Yang, Ziheng

    2016-01-01

    The multispecies coalescent (MSC) model has emerged as a powerful framework for inferring species phylogenies while accounting for ancestral polymorphism and gene tree-species tree conflict. A number of methods have been developed in the past few years to estimate the species tree under the MSC. The full likelihood methods (including maximum likelihood and Bayesian inference) average over the unknown gene trees and accommodate their uncertainties properly but involve intensive computation. The approximate or summary coalescent methods are computationally fast and are applicable to genomic datasets with thousands of loci, but do not make efficient use of the information in the multilocus data. Most of them take the two-step approach of reconstructing the gene trees for multiple loci by phylogenetic methods and then treating the estimated gene trees as observed data, without accounting for their uncertainties appropriately. In this article we review the statistical nature of the species tree estimation problem under the MSC, and explore the conceptual issues and challenges of species tree estimation by focusing mainly on simple cases of three or four closely related species. We use mathematical analysis and computer simulation to demonstrate that large differences in statistical performance may exist between the two classes of methods. We illustrate that several counterintuitive behaviors may occur with the summary methods, but these are due to their inefficient use of information in the data and vanish when the data are analyzed using full-likelihood methods. These include (i) unidentifiability of parameters in the model, (ii) inconsistency in the so-called anomaly zone, (iii) singularity on the likelihood surface, and (iv) deterioration of performance upon addition of more data. We discuss the challenges and strategies of species tree inference for distantly related species when the molecular clock is violated, and highlight the need for improving the computational efficiency and model realism of the likelihood methods as well as the statistical efficiency of the summary methods. PMID:27927902
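
    One concrete MSC quantity behind the three-species discussion above is the classic result that, for a species tree ((A,B),C) with internal branch length T in coalescent units, the probability that a gene tree topologically matches the species tree is 1 - (2/3)exp(-T). The short calculation below is illustrative only.

        import numpy as np

        for T in (0.1, 0.5, 1.0, 2.0):        # internal branch, coalescent units
            p_match = 1.0 - (2.0 / 3.0) * np.exp(-T)
            print(f"T = {T:.1f} -> P(gene tree matches species tree) = {p_match:.3f}")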

  9. 43 CFR Appendix A to Part 10 - Sample Summary

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... specific circumstances of each case. Before November 17, 1993 Chairman or Other Authorized Official Indian..., tools, household equipment, clothing, travel and transportation, personal adornment, smoking, toys, and...

  10. Simultaneous Determination of Residue from 58 Pesticides in the Wheat Flour Consumed in Tehran, Iran by GC/MS

    PubMed Central

    Rezaei, Mohammad; Shariatifar, Nabi; Shoeibi, Shahram; Amir Ahmadi, Maryam; Jahed Khaniki, Gholamreza

    2017-01-01

    Food safety has a direct impact on human health and as such is a growing concern worldwide. The presence of harmful pesticide residues in food is a serious cause for concern among consumers, so it is important to monitor levels of pesticides in foods. The aim of this study was the simultaneous determination of concentrations of 58 pesticides in 40 wheat flour samples collected from the Tehran market in January 2014. The city under study (Tehran) was divided into five districts, and samples were collected independently from each district and sourced from different bakeries (n=40). A gas chromatography-mass spectrometry single quadrupole selected ion monitoring (GC/MS-SQ-SIM) method was used to quantify residues of 58 pesticides in the wheat flour samples. Four of the 40 samples showed contamination, with Malathion (2 samples: 50.96 ± 11.38 and 62.088 ± 11.38 ppb) and 2,4-DDE (2 samples: 19.88 ± 15.24 and 13.7 ± 15.24 ppb), at levels below the Iranian MRLs for these pesticides. Average recoveries of pesticides at 6 concentration levels were in the range of 81.61-118.41%. The method proved repeatable, with RSDr in the range of 6.5-29.45% for all concentration levels. The limit of quantification was calculated as 15 ppb for 37 of the tested pesticides and 25 ppb for the other 21. In summary, the results of these tests suggested that the wheat flour consumed in Tehran was within safety limits in terms of levels of pesticide residues. PMID:29201093

  11. High Resolution 4-D Spectroscopy with Sparse Concentric Shell Sampling and FFT-CLEAN

    PubMed Central

    Coggins, Brian E.; Zhou, Pei

    2009-01-01

    SUMMARY Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise. PMID:18853260
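
    A hedged sketch of the shell-sampling idea for the three indirect dimensions of a 4-D experiment: place quasi-uniform points on each concentric shell and apply an independent random rotation per shell so that points on different shells do not align coherently. The Fibonacci-sphere construction and linear shell spacing are assumptions for illustration, not the authors' exact scheme.

        import numpy as np
        from scipy.spatial.transform import Rotation

        def fibonacci_sphere(m):
            # Quasi-uniform points on the unit sphere via a golden-angle spiral.
            i = np.arange(m)
            phi = np.pi * (3.0 - np.sqrt(5.0)) * i
            z = 1.0 - 2.0 * (i + 0.5) / m
            r = np.sqrt(1.0 - z ** 2)
            return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

        def rcss(n_shells=16, pts_per_shell=24, rmax=1.0):
            shells = []
            for k in range(1, n_shells + 1):
                pts = fibonacci_sphere(pts_per_shell) * (rmax * k / n_shells)
                shells.append(Rotation.random().apply(pts))  # random shell rotation
            return np.vstack(shells)

        samples = rcss()   # (16 * 24, 3) sampling coordinates in the indirect dims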

  12. Analysis of environmental microplastics by vibrational microspectroscopy: FTIR, Raman or both?

    PubMed

    Käppler, Andrea; Fischer, Dieter; Oberbeckmann, Sonja; Schernewski, Gerald; Labrenz, Matthias; Eichhorn, Klaus-Jochen; Voit, Brigitte

    2016-11-01

    The contamination of aquatic ecosystems with microplastics has recently been reported through many studies, and negative impacts on the aquatic biota have been described. For the chemical identification of microplastics, mainly Fourier transform infrared (FTIR) and Raman spectroscopy are used. But up to now, a critical comparison and validation of both spectroscopic methods with respect to microplastics analysis is missing. To close this knowledge gap, we investigated environmental samples by both Raman and FTIR spectroscopy. Firstly, particles and fibres >500 μm extracted from beach sediment samples were analysed by Raman and FTIR microspectroscopic single measurements. Our results illustrate that both methods are in principle suitable to identify microplastics from the environment. However, in some cases, especially for coloured particles, a combination of both spectroscopic methods is necessary for a complete and reliable characterisation of the chemical composition. Secondly, a marine sample containing particles <400 μm was investigated by Raman imaging and FTIR transmission imaging. The results were compared regarding number, size and type of detectable microplastics as well as spectra quality, measurement time and handling. We show that FTIR imaging leads to significant underestimation (about 35%) of microplastics compared to Raman imaging, especially in the size range <20 μm. However, the measurement time of Raman imaging is considerably higher compared to FTIR imaging. In summary, we propose a further size division within the smaller microplastics fraction into 500-50 μm (rapid and reliable analysis by FTIR imaging) and into 50-1 μm (detailed and more time-consuming analysis by Raman imaging). Graphical abstract: marine microplastic sample (fraction <400 μm) on a silicon filter (middle) with the corresponding Raman and IR images.

  13. A simple and highly sensitive on-line column extraction liquid chromatography-tandem mass spectrometry method for the determination of protein-unbound tacrolimus in human plasma samples.

    PubMed

    Bittersohl, Heike; Schniedewind, Björn; Christians, Uwe; Luppa, Peter B

    2018-04-27

    Therapeutic drug monitoring (TDM) of the immunosuppressive drug tacrolimus is essential to avoid side effects and rejection of the allograft after transplantation. In the blood circulation, tacrolimus is largely located inside erythrocytes or bound to plasma proteins and less than 0.1% is protein-unbound (free). One basic principle of clinical pharmacology is that only free drug is pharmacologically active and monitoring this portion has the potential to better reflect the drug effect than conventional measurements of total tacrolimus in whole blood. To address this, a highly sensitive and straightforward on-line liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed, validated and applied to patient plasma samples. The sample preparation included ultracentrifugation and addition of the stable isotope labeled drug analogue D2,13C-tacrolimus, followed by on-line sample extraction and measurement using a Sciex QTRAP® 6500 in the multiple reaction monitoring mode. Due to very low concentrations of protein-unbound tacrolimus, it was important to develop a highly sensitive, precise and accurate assay. Here, we first report the efficient formation of tacrolimus lithium adduct ions, which greatly increased assay sensitivity. A lower limit of quantification (LLOQ) of 1 pg/mL (10 fg on column) was achieved and the assay was linear between 1 and 200 pg/mL. There was no carry-over detected. The inaccuracy ranged from -9.8 to 7.4% and the greatest imprecision was 7.5%. The matrix factor was found to be smaller than 1.1%. In summary, this method represents a suitable tool to investigate the potential clinical value of free tacrolimus monitoring in organ transplant recipients. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Uncertainty in predicting soil hydraulic properties at the hillslope scale with indirect methods

    NASA Astrophysics Data System (ADS)

    Chirico, G. B.; Medina, H.; Romano, N.

    2007-02-01

    Summary Several hydrological applications require the characterisation of soil hydraulic properties at large spatial scales. Pedotransfer functions (PTFs) are being developed as simplified methods to estimate soil hydraulic properties as an alternative to direct measurements, which are unfeasible in most practical circumstances. The objective of this study is to quantify the uncertainty in PTF spatial predictions at the hillslope scale as related to the sampling density, due to: (i) the error in estimated soil physico-chemical properties and (ii) PTF model error. The analysis is carried out on a 2-km-long experimental hillslope in South Italy. The method adopted is based on a stochastic generation of patterns of soil variables using sequential Gaussian simulation, conditioned on the observed sample data. The following PTFs are applied: Vereecken's PTF [Vereecken, H., Diels, J., van Orshoven, J., Feyen, J., Bouma, J., 1992. Functional evaluation of pedotransfer functions for the estimation of soil hydraulic properties. Soil Sci. Soc. Am. J. 56, 1371-1378] and the HYPRES PTF [Wösten, J.H.M., Lilly, A., Nemes, A., Le Bas, C., 1999. Development and use of a database of hydraulic properties of European soils. Geoderma 90, 169-185]. The two PTFs reliably estimate the soil water retention characteristic even for a relatively coarse sampling resolution, with prediction uncertainties comparable to the uncertainties in direct laboratory or field measurements. The uncertainty in soil water retention prediction due to model error is at least as significant as the uncertainty associated with the estimated input, even for a relatively coarse sampling resolution. Prediction uncertainties are much more important when PTFs are applied to estimate the saturated hydraulic conductivity. In this case model error dominates the overall prediction uncertainty, rendering the effect of the input error negligible.

  15. Standardized methods for Grand Canyon fisheries research 2015

    USGS Publications Warehouse

    Persons, William R.; Ward, David L.; Avery, Luke A.

    2013-01-01

    This document presents protocols and guidelines to persons sampling fishes in the Grand Canyon, to help ensure consistency in fish handling, fish tagging, and data collection among different projects and organizations. Most such research and monitoring projects are conducted under the general umbrella of the Glen Canyon Dam Adaptive Management Program and include studies by the U.S. Geological Survey (USGS), U.S. Fish and Wildlife Service (FWS), National Park Service (NPS), the Arizona Game and Fish Department (AGFD), various universities, and private contractors. This document is intended to provide guidance to fieldworkers regarding protocols that may vary from year to year depending on specific projects and objectives. We also provide herein documentation of standard methods used in the Grand Canyon that can be cited in scientific publications, as well as a summary of changes in protocols since the document was first created in 2002.

  16. Measurement of food flavonoids by high-performance liquid chromatography: A review.

    PubMed

    Merken, H M; Beecher, G R

    2000-03-01

    The flavonoids are plant polyphenols found frequently in fruits, vegetables, and grains. Divided into several subclasses, they include the anthocyanidins, pigments chiefly responsible for the red and blue colors in fruits, fruit juices, wines, and flowers; the catechins, concentrated in tea; the flavanones and flavanone glycosides, found in citrus and honey; and the flavones, flavonols, and flavonol glycosides, found in tea, fruits, vegetables, and honey. Known for their hydrogen-donating antioxidant activity as well as their ability to complex divalent transition metal cations, flavonoids are beneficial to human health. Computer-controlled high-performance liquid chromatography (HPLC) has become the analytical method of choice. Many systems have been developed for the detection and quantification of flavonoids across one, two, or three subclasses. A summary of the various HPLC and sample preparation methods that have been employed to quantify individual flavonoids, within a subclass or across several subclasses, is tabulated in this review.

  17. Fractals, malware, and data models

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Potter, Andrew N.; Williams, Deborah; Handley, James W.

    2012-06-01

    We examine the hypothesis that the decision boundary between malware and non-malware is fractal. We introduce a novel encoding method, derived from text mining, for converting disassembled programs first into opstrings, which are then filtered into a reduced opcode alphabet. These opcodes are enumerated and encoded as real floating-point numbers, and used for characterizing the frequency of occurrence and distribution properties of malware functions for comparison with non-malware functions. We use the concept of invariant moments to characterize the highly non-Gaussian structure of the opcode distributions. We then derive Data Model based classifiers from the identified features, and interpolate and extrapolate the parameter sample space for the derived Data Models. This is done to examine the nature of the parameter-space classification boundary between families of malware and the general non-malware category. Preliminary results strongly support the fractal-boundary hypothesis, and a summary of our methods and results is presented here.
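
    The moment-based featurisation can be illustrated with a small Python sketch: encode an opcode sequence as floats and compute standardized central moments, which deviate from their Gaussian reference values (0 for odd orders, 3 for the fourth) on highly non-Gaussian data. The toy alphabet and encoding below are assumptions, not the paper's reduced alphabet.

        import numpy as np

        # Toy encoding of a reduced opcode alphabet as floats (an assumption).
        alphabet = {"mov": 0.1, "push": 0.2, "call": 0.3, "jmp": 0.4, "xor": 0.5}

        def moment_features(opcodes, max_order=6):
            x = np.array([alphabet[o] for o in opcodes])
            mu, sd = x.mean(), x.std()
            # Standardized central moments of order 3..max_order.
            return [((x - mu) ** k).mean() / sd ** k for k in range(3, max_order + 1)]

        print(moment_features(["mov", "push", "call", "mov", "xor", "jmp", "mov"]))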

  18. Oligonucleotide primers, probes and molecular methods for the environmental monitoring of methanogenic archaea

    PubMed Central

    Narihiro, Takashi; Sekiguchi, Yuji

    2011-01-01

    Summary For the identification and quantification of methanogenic archaea (methanogens) in environmental samples, various oligonucleotide probes/primers targeting phylogenetic markers of methanogens, such as 16S rRNA, 16S rRNA gene and the gene for the α‐subunit of methyl coenzyme M reductase (mcrA), have been extensively developed and characterized experimentally. These oligonucleotides were designed to resolve different groups of methanogens at different taxonomic levels, and have been widely used as hybridization probes or polymerase chain reaction primers for membrane hybridization, fluorescence in situ hybridization, rRNA cleavage method, gene cloning, DNA microarray and quantitative polymerase chain reaction for studies in environmental and determinative microbiology. In this review, we present a comprehensive list of such oligonucleotide probes/primers, which enable us to determine methanogen populations in an environment quantitatively and hierarchically, with examples of the practical applications of the probes and primers. PMID:21375721

  19. PubMed Central

    BAHRAINIAN, SA.; RAEISOON, MR.; HASHEMI GORJI, O.; KHAZAEE, A.

    2014-01-01

    Summary Background. The aim of the study was to investigate the relationship of self-esteem and depression with Internet addiction in university students. Methods. The present descriptive-analytic correlation study involved 408 students (150 female and 258 male) who had been selected by means of a cluster sampling method from among all the students studying in Birjand Islamic Azad University. Students were evaluated through the Beck Depression Inventory (BDI), Cooper Smith Self-Esteem Inventory (CSEI) and Internet Addiction Test (IAT). Results. The results indicated that 40.7% of the students had Internet addiction. A significant correlation emerged between depression, self-esteem and internet addiction. Regression analysis indicated that depression and self-esteem were able to predict the variance of Internet addiction to some extent. Conclusions. It may be important to evaluate self-esteem and depression in people with Internet addiction. These variables should be targeted for effective cognitive behavioral therapy in people with Internet addiction. PMID:25902574

  20. Nonparametric entropy estimation using kernel densities.

    PubMed

    Lake, Douglas E

    2009-01-01

    The entropy of experimental data from the biological and medical sciences provides additional information over summary statistics. Calculating entropy involves estimates of probability density functions, which can be effectively accomplished using kernel density methods. Kernel density estimation has been widely studied and a univariate implementation is readily available in MATLAB. The traditional definition of Shannon entropy is part of a larger family of statistics, called Renyi entropy, which are useful in applications that require a measure of the Gaussianity of data. Of particular note is the quadratic entropy which is related to the Friedman-Tukey (FT) index, a widely used measure in the statistical community. One application where quadratic entropy is very useful is the detection of abnormal cardiac rhythms, such as atrial fibrillation (AF). Asymptotic and exact small-sample results for optimal bandwidth and kernel selection to estimate the FT index are presented and lead to improved methods for entropy estimation.
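
    A minimal sketch of the resubstitution approach, assuming univariate data and scipy's Gaussian KDE: Shannon entropy is estimated as the negative mean log of the fitted density at the sample points, and the quadratic (Renyi order-2) entropy as the negative log of the mean fitted density, a plug-in for -log of the integral of f squared. For N(0,1) the theoretical values are about 1.42 and 1.27 nats.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(7)
        x = rng.normal(0.0, 1.0, 500)         # stand-in for, e.g., RR-interval data

        kde = gaussian_kde(x)                 # bandwidth chosen by Scott's rule
        f_hat = kde(x)                        # fitted density at the sample points

        h_shannon = -np.log(f_hat).mean()     # resubstitution Shannon estimate
        h_quadratic = -np.log(f_hat.mean())   # plug-in Renyi order-2 estimate
        print(h_shannon, h_quadratic)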

  1. Thermo-voltage measurements of atomic contacts at low temperature

    PubMed Central

    Ofarim, Ayelet; Kopp, Bastian; Möller, Thomas; Martin, León; Boneberg, Johannes; Leiderer, Paul

    2016-01-01

    Summary We report the development of a novel method to determine the thermopower of atomic-sized gold contacts at low temperature. For these measurements a mechanically controllable break junction (MCBJ) system is used and a laser source generates a temperature difference of a few kelvins across the junction to create a thermo-voltage. Since the temperature difference enters directly into the Seebeck coefficient S = −ΔV/ΔT, the determination of the temperature plays an important role. We present a method for the determination of the temperature difference using a combination of a finite element simulation, which reveals the temperature distribution of the sample, and the measurement of the resistance change due to laser heating of sensor leads on both sides next to the junction. Our results for the measured thermopower are in agreement with recent reports in the literature. PMID:27335765

  2. The skill of summary in clinician-patient communication: a case study.

    PubMed

    Quilligan, Sally; Silverman, Jonathan

    2012-03-01

    To investigate the use and impact of the micro-skill of summary in clinical encounters, a core skill whose use and outcomes have had little empirical investigation. This exploratory study used a mixed-methods design. Video recordings of ten consultations between simulated patients and medical students were analysed to identify the types of summary used. Two contrasting cases were then micro-analysed, and follow-up interviews were held with the two students and simulated patients involved in the consultations, using the video recording as a trigger. Ninety-nine summaries were identified and grouped into six types: reflective, screening, clarifying, paraphrasing, interim and full. Summary appeared to aid accuracy. However, the patient's perspective was summarised less frequently than the biomedical perspective. When summaries were repeatedly incorrect, they made the simulated patient feel they were not being listened to. The use and effect of summary appear more complex than the medical literature suggests and may have both positive and negative attributes. Further research is needed to investigate whether these preliminary findings are replicated within doctor-patient consultations. When teaching the use of summary we need to address: type, purpose, accuracy, effect on the patient, and flexible use to suit the patient. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  3. Summary of the research work of the Trace Elements Section, Geochemistry and Petrology Branch, for the period January 1-March 31, 1951

    USGS Publications Warehouse

    Rabbitt, John C.

    1951-01-01

    This report summarizes the research work of the Trace Elements Section, Geochemistry and Petrology Branch, for the period January 1 - March 31, 1951. Work before that is summarized in an earlier report, "Summary of the research work of the Trace Elements Section, Geochemistry and Petrology Branch, for the period April 1, 1948 - December 31, 1950," by John C. Rabbitt (U.S. Geol. Survey Trace Elements Investigations Rept. 148, January 1951), referred to here as TEIR 148. In TEIR 148 the purpose of each project was described, and it is not thought necessary to repeat that material. The research work of the section consists of laboratory and related field studies in the following fields: 1. Mineralogic and petrologic investigations of radioactive rocks, minerals, and ores. 2. Investigations of chemical methods of analysis for uranium, thorium, and other elements and compounds in radioactive materials, and related chemical problems. 3. Investigations of spectrographic methods of analysis for a wide variety of elements in radioactive materials. 4. Investigations of radiometric methods of analysis as applied to radioactive materials. It should be emphasized that the work undertaken so far is almost entirely in the nature of investigations supporting the field appraisal of known uraniferous deposits. A program of more fundamental research, particularly in the mineralogy and geochemistry of uranium, is now being drawn up and will be submitted for approval soon. This report does not deal with the routine analytical work of the Section nor with the public-sample program. The analytical work will be summarized in a report to be issued after the end of fiscal year 1951, and a report on the public-sample program is in process. Special thanks are due to the members of the Section who are engaged in the research work and who have supplied material for this report; to Earl Ingerson, Chief of the Geochemistry and Petrology Branch, for his critical review; to Jane Titcomb of the editorial staff of the Section for editing the report; and to Virginia Layne of the same staff for typing the manuscript and the multilith mats.

  4. Replication and validation of higher order models demonstrated that a summary score for the EORTC QLQ-C30 is robust.

    PubMed

    Giesinger, Johannes M; Kieffer, Jacobien M; Fayers, Peter M; Groenvold, Mogens; Petersen, Morten Aa; Scott, Neil W; Sprangers, Mirjam A G; Velikova, Galina; Aaronson, Neil K

    2016-01-01

    To further evaluate the higher order measurement structure of the European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire Core 30 (QLQ-C30), with the aim of generating a summary score. Using pretreatment QLQ-C30 data (N = 3,282), we conducted confirmatory factor analyses to test seven previously evaluated higher order models. We compared the summary score(s) derived from the best performing higher order model with the original QLQ-C30 scale scores, using tumor stage, performance status, and change over time (N = 244) as grouping variables. Although all models showed acceptable fit, we continued in the interest of parsimony with known-groups validity and responsiveness analyses using a summary score derived from the single higher order factor model. The validity and responsiveness of this QLQ-C30 summary score were equal to, and in many cases superior to, those of the original underlying QLQ-C30 scale scores. Our results provide empirical support for a measurement model for the QLQ-C30 yielding a single summary score. The availability of this summary score can avoid problems with potential type I errors that arise because of multiple testing when making comparisons based on the 15 outcomes generated by this questionnaire, and may reduce sample size requirements for health-related quality of life studies using the QLQ-C30 questionnaire when an overall summary score is a relevant primary outcome. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. The Evaluation of a Risk Degree for the Process of a Brown Coal Spontaneous Ignition on Dumps with Using of Modern Numeric Methods

    NASA Astrophysics Data System (ADS)

    Klouda, Petr; Moni, Vlastimil; Řehoř, Michal; Blata, Jan; Helebrant, František

    2018-06-01

    The article summarizes the evaluation of the degree of risk of spontaneous ignition of brown coal, carried out through analysis of a database of information on the development of state quantities and desorbed gases in stored bodies of brown coal. The data were gained from long-term complex measurements carried out at selected companies during coal mining in the previous parts of the project. In the last part of the project, results of temperature models from thermographs were compared with results from gas and coal samples taken in the mines. The influence of atmospheric conditions (insolation, precipitation, changes of barometric pressure, etc.), of coal mass degradation, of physical and chemical factors, and of other adverse factors on the process of spontaneous ignition was then assessed. The gasometry was evaluated using in-situ gas samples and laboratory models of gases indicative of spontaneous ignition, obtained by the method of thermal oxidation, with the aim of finding a correlation with the temperature epicentre of spontaneous ignition.

  6. Advancing Methods for U.S. Transgender Health Research

    PubMed Central

    Reisner, Sari L.; Deutsch, Madeline B.; Bhasin, Shalender; Bockting, Walter; Brown, George R.; Feldman, Jamie; Garofalo, Rob; Kreukels, Baudewijntje; Radix, Asa; Safer, Joshua D.; Tangpricha, Vin; T’Sjoen, Guy; Goodman, Michael

    2016-01-01

    Purpose of Review To describe methodological challenges, gaps, and opportunities in U.S. transgender health research. Recent Findings Lack of large prospective observational studies and intervention trials, limited data on risks and benefits of gender affirmation (e.g., hormones and surgical interventions), and inconsistent use of definitions across studies hinder evidence-based care for transgender people. Systematic high-quality observational and intervention-testing studies may be carried out using several approaches, including general population-based, health systems-based, clinic-based, venue-based, and hybrid designs. Each of these approaches has its strengths and limitations; however, harmonization of research efforts is needed. Ongoing development of evidence-based clinical recommendations will benefit from a series of observational and intervention studies aimed at identification, recruitment, and follow-up of transgender people of different ages, from different racial, ethnic, and socioeconomic backgrounds and with diverse gender identities. Summary Transgender health research faces challenges that include standardization of lexicon, agreed-upon population definitions, study design, sampling, measurement, outcome ascertainment, and sample size. Application of existing and new methods is needed to fill existing gaps, increase the scientific rigor and reach of transgender health research, and inform evidence-based prevention and care for this underserved population. PMID:26845331

  7. A review of polychlorinated biphenyls (PCBs) pollution in indoor air environment.

    PubMed

    Dai, Qizhou; Min, Xia; Weng, Mili

    2016-10-01

    Polychlorinated biphenyls (PCBs) were widely used in industrial production due to their unique physical and chemical properties. As persistent organic pollutants, PCBs lead to environmental pollution and cause serious problems for human health; for this reason they have been banned since the 1980s. Indoor air is the most direct and important environmental medium for human beings, so research on PCB pollution in indoor air is important for the protection of human health. This paper introduces the industrial application and potential harm of PCBs, summarizes the sampling, extraction, and analytical methods of environmental monitoring, and compares the indoor air levels of urban areas with those of industrial areas in different countries according to various reports, providing a basic summary for PCB pollution control in the indoor air environment. Reviews of PCB pollution in indoor air in China are still limited.

  8. Consumption of psychoactive substances in educational institutions: an inquiry into the state of affairs in the schools of Córdoba.

    PubMed

    Lucchese, M S M; Burrone, M S; Enders, J E; Fernández, A R

    2014-01-01

    This study describes and analyses the consumption of psychoactive substances in educational institutions, the school environment conditions, and their relation to the school standing of the students. In the first stage, a quantitative evaluation was performed, based on the records of the Second National Survey of Secondary School Students carried out in Córdoba in 2005; the second stage used a qualitative approach. A multistage probabilistic sample of 4593 students was used for the quantitative assessment. The analysis comprised summary measurements and multivariate and factorial correspondence analysis, in all cases with a significance level of p < 0.05. For the qualitative stage, an ethnographic approach was applied. The state schools were chosen using an intentional, cumulative and sequential sampling method. Ten in-depth interviews were carried out to gather qualitative data, which were analyzed using the constant comparative method. The results show that consumption is lower among morning-shift students and that grade repetition and behavior problems are associated with consumption of illegal drugs. Furthermore, it was detected that students in night-shift schools with low academic and disciplinary demand standards have a higher probability of consumption. It is clear that as academic standards decrease, consumption increases.

  9. Maximum Likelihood Estimations and EM Algorithms with Length-biased Data

    PubMed Central

    Qin, Jing; Ning, Jing; Liu, Hao; Shen, Yu

    2012-01-01

    SUMMARY Length-biased sampling has been well recognized in economics, industrial reliability, etiology applications, and epidemiological, genetic and cancer screening studies. Length-biased right-censored data have a unique data structure different from traditional survival data. The nonparametric and semiparametric estimation and inference methods for traditional survival data are not directly applicable to length-biased right-censored data. We propose new expectation-maximization algorithms for estimation based on full likelihoods involving infinite-dimensional parameters under three settings for length-biased data: estimating the nonparametric distribution function, estimating the nonparametric hazard function under an increasing-failure-rate constraint, and jointly estimating the baseline hazard function and the covariate coefficients under the Cox proportional hazards model. Extensive empirical simulation studies show that the maximum likelihood estimators perform well with moderate sample sizes and lead to more efficient estimators compared to the estimating equation approaches. The proposed estimates are also more robust to various right-censoring mechanisms. We prove the strong consistency properties of the estimators, and establish the asymptotic normality of the semiparametric maximum likelihood estimators under the Cox model using modern empirical process theory. We apply the proposed methods to a prevalent cohort medical study. Supplemental materials are available online. PMID:22323840

  10. Literature review of levels and determinants of exposure to potential carcinogens and other agents in the road construction industry.

    PubMed

    Burstyn, I; Kromhout, H; Boffetta, P

    2000-01-01

    Workers in the road construction industry include asphalt plant, ground construction, and road paving workers. These individuals can be exposed to a wide range of potentially hazardous substances. A summary of levels of exposure to different substances measured during road construction is presented. In modern road paving, workers typically are exposed to 0.1 to 2 mg/m3 of bitumen fume, which includes 10 to 200 ng/m3 of benzo(a)pyrene. Sampling strategies and analytical methods employed in each reviewed survey are described briefly. The published reports provide some insight into the identity of factors that influence exposure to bitumen among road construction workers: type of work performed, meteorological conditions, temperature of paved asphalt. However, there is a lack of (a) comprehensive and well-designed studies that evaluate determinants of exposure to bitumen in road construction, and (b) standard methods for bitumen sampling and analysis. Information on determinants of other exposures in road construction is either absent or limited. It is concluded that data available through published reports have limited value in assessing historical exposure levels in the road construction industry.

  11. Summary and Synthesis: How to Present a Research Proposal

    PubMed Central

    Setia, Maninder Singh; Panda, Saumya

    2017-01-01

    This concluding module attempts to synthesize the key learning points discussed during the course of the previous ten sets of modules on methodology and biostatistics. The objective of this module is to discuss how to present a model research proposal, based on whatever was discussed in the preceding modules. The lynchpin of a research proposal is the protocol, and the key component of a protocol is the study design. However, one must not neglect the other areas, be it the project summary through which one catches the eyes of the reviewer of the proposal, or the background and the literature review, or the aims and objectives of the study. Two critical areas in the “methods” section that cannot be emphasized more are the sampling strategy and a formal estimation of sample size. Without a legitimate sample size, none of the conclusions based on the statistical analysis would be valid. Finally, the ethical parameters of the study should be well understood by the researchers, and that should get reflected in the proposal. PMID:28979004

  12. Aircraft data summaries for the SURE intensives. Final report. [Sampling done October, 1978 near Duncan Falls, Ohio and Giles County, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keifer, W.S.; Blumenthal, D.L.; Tommerdahl, J.B.

    1981-09-01

    As part of the EPRI sulfate regional experiment (SURE), Meteorology Research, Inc., (MRI) and Research Triangle Institute (RTI) conducted six air quality sampling programs in the eastern United States using instrumented aircraft. This volume includes the air quality and meteorological data obtained during the October 1978 intensive when MRI sampled near the Giles County, Tennessee, SURE Station and RTI sampled near the Duncan Falls, Ohio, SURE Station. Sampling data are presented for all measured parameters.

  13. Report of Operation FITZWILLIAM. Volume 1, Design of Operation and Summary of Results (REDACTED)

    DTIC Science & Technology

    1948-01-01

    storage tanks (400 lbs/sq in) to permit the collection of samples of gas in the vicinity of the radioactive cloud. Radioactive analysis of the gas... Corps: furnish ground dust sampling units and wrap-around counters. 4. Navy, Naval Research Lab.: (a) furnish ground dust sampling units... direct as necessary the collection of aircraft filters and gaseous samples from aircraft based at Kwajalein. (6) Vector destroyer-minesweepers...

  14. Semi-Supervised Data Summarization: Using Spectral Libraries to Improve Hyperspectral Clustering

    NASA Technical Reports Server (NTRS)

    Wagstaff, K. L.; Shu, H. P.; Mazzoni, D.; Castano, R.

    2005-01-01

    Hyperspectral imagers produce very large images, with each pixel recorded at hundreds or thousands of different wavelengths. The ability to automatically generate summaries of these data sets enables several important applications, such as quickly browsing through a large image repository or determining the best use of a limited bandwidth link (e.g., determining which images are most critical for full transmission). Clustering algorithms can be used to generate these summaries, but traditional clustering methods make decisions based only on the information contained in the data set. In contrast, we present a new method that additionally leverages existing spectral libraries to identify materials that are likely to be present in the image target area. We find that this approach simultaneously reduces runtime and produces summaries that are more relevant to science goals.

  15. Electromagnetic Imaging Methods for Nondestructive Evaluation Applications

    PubMed Central

    Deng, Yiming; Liu, Xin

    2011-01-01

    Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). The recent advances in sensing technology, hardware and software development dedicated to imaging and image processing, and material sciences have greatly expanded the application fields, sophisticated the systems design and made the potential of electromagnetic NDE imaging seemingly unlimited. This review provides a comprehensive summary of research works on electromagnetic imaging methods for NDE applications, followed by the summary and discussions on future directions. PMID:22247693

  16. Compulsory Education: Schools, Pupils, Teachers, Programs and Methods. Conference Papers for the 8th Session of the International Standing Conference for the History of Education (Parma, Italy, September 3-6, 1986). Volume II.

    ERIC Educational Resources Information Center

    Genovesi, Giovanni, Ed.

    This second of four volumes on the history of compulsory education among the nations of Europe and the western hemisphere covers schools, pupils, teachers, programs, and methods. Of the volume's 16 selections, 13 are written in English and 3 are written in Italian. Most selections contain summaries; summaries of the Italian articles are written in…

  17. Using clustering and a modified classification algorithm for automatic text summarization

    NASA Astrophysics Data System (ADS)

    Aries, Abdelkrime; Oufaida, Houda; Nouali, Omar

    2013-01-01

    In this paper we describe a modified classification method intended for extractive summarization. The classification in this method does not need a learning corpus; it uses the input text itself. First, we cluster the document sentences to exploit the diversity of topics; then we use a learning algorithm (here, Naive Bayes) on each cluster, treating it as a class. After obtaining the classification model, we calculate the score of a sentence in each class, using a scoring model derived from the classification algorithm. These scores are then used to reorder the sentences and extract the top-ranked ones as the output summary. We conducted experiments using a corpus of scientific papers and compared our results to another summarization system called UNIS. We also examined the impact of tuning the clustering threshold on the resulting summary, as well as the impact of adding more features to the classifier. We found that this method is interesting and gives good performance, and that the addition of new features (which is simple with this method) can improve the summary's accuracy.
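
    The pipeline can be sketched in a few lines of Python with scikit-learn: cluster the document's own sentences, treat the cluster labels as classes for a Naive Bayes learner, and rank each sentence by the score of its own class. This is a re-creation under assumptions (naive sentence splitting, TF-IDF features, a fixed cluster count), not the authors' system.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans
        from sklearn.naive_bayes import MultinomialNB

        def summarize(text, n_clusters=3, n_keep=3):
            sents = [s.strip() for s in text.split(".") if s.strip()]  # naive split
            X = TfidfVectorizer().fit_transform(sents)
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
            nb = MultinomialNB().fit(X, labels)        # clusters act as classes
            # Score each sentence by the probability of its own cluster.
            scores = nb.predict_proba(X)[np.arange(len(sents)), labels]
            keep = sorted(np.argsort(scores)[::-1][:n_keep])  # restore doc order
            return ". ".join(sents[i] for i in keep) + "."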

  18. User-oriented summary extraction for soccer video based on multimodal analysis

    NASA Astrophysics Data System (ADS)

    Liu, Huayong; Jiang, Shanshan; He, Tingting

    2011-11-01

    An advanced user-oriented summary extraction method for soccer video is proposed in this work. First, an algorithm for user-oriented summary extraction from soccer video is introduced: a novel approach that integrates multimodal analysis, including extraction and analysis of stadium features, moving-object features, audio features and text features. From these features, the semantics of the soccer video and the highlight mode are obtained. The highlight positions can then be found and assembled according to their highlight degrees to obtain the video summary. Experimental results on sports video from World Cup soccer games indicate that multimodal analysis is effective for soccer video browsing and retrieval.

  19. A novel key-frame extraction approach for both video summary and video index.

    PubMed

    Lei, Shaoshuai; Xie, Gang; Yan, Gaowei

    2014-01-01

    Existing key-frame extraction methods are oriented mainly toward video summary, while the task of indexing key-frames is ignored. This paper presents a novel key-frame extraction approach suitable for both video summary and video index. First, a dynamic distance separability algorithm is proposed to divide a shot into subshots based on semantic structure; appropriate key-frames are then extracted in each subshot by singular value decomposition (SVD). Finally, three evaluation indicators are proposed to evaluate the performance of the new approach. Experimental results show that the proposed approach achieves good semantic structure for semantics-based video indexing and meanwhile produces video summaries consistent with human perception.

  20. Evaluation and Comparison of Methods for Measuring Ozone ...

    EPA Pesticide Factsheets

    Ambient evaluations of the various ozone and NO2 methods were conducted during field intensive studies as part of the NASA DISCOVER-AQ project, conducted during July 2011 near Baltimore, MD; January-February 2013 in the San Joaquin Valley, CA; September 2013 in Houston, TX; and July-August 2014 near Denver, CO. During the field intensive studies, instruments were calibrated according to manufacturers' operation manuals and in accordance with the FRM requirements listed in 40 CFR 50. During the ambient evaluation campaigns, nightly automated zero and span checks were performed to monitor the validity of the calibration and to control for drifts or variations in the span and/or zero response. Both the calibration gas concentrations and the nightly zero and span gas concentrations were delivered using a dynamic dilution calibration system (T700U/T701H, Teledyne API). The analyzers were housed within a temperature-controlled shelter during the sampling campaigns. A glass inlet, with a sampling height approximately 5 m above ground level, and a subsequent sampling manifold were shared by all instruments. Data generated by all analyzers were collected and logged using a field-deployable data acquisition system (Envidas Ultimate). A summary of the instruments used during the DISCOVER-AQ deployments is listed in Table 1. Figure 1 shows a typical DISCOVER-AQ site (Houston, 2013) where EPA (and other) instrumentation was deployed. Under the Clean Air Act, the U.S. EPA has estab

  1. Application of stable‐isotope labelling techniques for the detection of active diazotrophs

    PubMed Central

    Angel, Roey; Panhölzl, Christopher; Gabriel, Raphael; Herbold, Craig; Wanek, Wolfgang; Richter, Andreas; Eichorst, Stephanie A.

    2017-01-01

    Summary Investigating active participants in the fixation of dinitrogen gas is vital, as N is often a limiting factor for primary production. Biological nitrogen fixation is performed by a diverse guild of bacteria and archaea (diazotrophs), which can be free-living or symbionts. Free-living diazotrophs are widely distributed in the environment, yet our knowledge of their identity and ecophysiology remains limited. A major challenge in investigating this guild is inferring activity from genetic data, as this process is highly regulated. To address this challenge, we evaluated and improved several 15N-based methods for detecting N2 fixation activity (with a focus on soil samples) and for studying active diazotrophs. We compared the acetylene reduction assay and the 15N2 tracer method and demonstrated that the latter is more sensitive in samples with low activity. Additionally, tracing 15N into microbial RNA provides much higher sensitivity than bulk soil analysis. Active soil diazotrophs were identified with a 15N-RNA-SIP approach optimized for environmental samples and benchmarked against 15N-DNA-SIP. Lastly, we investigated the feasibility of using SIP-Raman microspectroscopy for detecting 15N-labelled cells. Taken together, these tools allow active free-living diazotrophs to be identified and investigated with high sensitivity in diverse environments, from the bulk to the single-cell level. PMID:29027346

  2. Computer code for analysing three-dimensional viscous flows in impeller passages and other duct geometries

    NASA Technical Reports Server (NTRS)

    Tatchell, D. G.

    1979-01-01

    A code, CATHY3/M, was prepared and demonstrated by application to a sample case. The preparation is reviewed, a summary of the capabilities and main features of the code is given, and the sample case results are discussed. Recommendations for future use and development of the code are provided.

  3. 77 FR 52319 - Notice of Proposed Information Collection Requests; Institute of Education Sciences; Needs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... research and evaluation needs in the future. The results of the survey will be used to prioritize the... Sciences; Needs Sensing Survey Under the REL Program: Sample Survey Instrument for School Board Members and District Administrators SUMMARY: The needs assessment consists of an online survey of a sample of school...

  4. The Importance of Motivational Appeals to Cooperative Extension Agricultural Clientele. Summary of Research.

    ERIC Educational Resources Information Center

    Wilson, Gary; Newcomb, L. H.

    A study was conducted to determine the relationship of certain motivational appeals to the extent of participation of extension clientele, as perceived by these clientele. A stratified random sample of thirty counties from the ten extension supervisory areas of Ohio was used for the study. This sample provided for 395 adult agricultural clientele…

  5. A Simulation Study of Methods for Selecting Subgroup-Specific Doses in Phase I Trials

    PubMed Central

    Morita, Satoshi; Thall, Peter F.; Takeda, Kentaro

    2016-01-01

    Summary Patient heterogeneity may complicate dose-finding in phase I clinical trials if the dose-toxicity curves differ between subgroups. Conducting separate trials within subgroups may lead to infeasibly small sample sizes in subgroups having low prevalence. Alternatively, it is not obvious how to conduct a single trial while accounting for heterogeneity. To address this problem, we consider a generalization of the continual reassessment method (O’Quigley, et al., 1990) based on a hierarchical Bayesian dose-toxicity model that borrows strength between subgroups under the assumption that the subgroups are exchangeable. We evaluate a design using this model that includes subgroup-specific dose selection and safety rules. A simulation study is presented that includes comparison of this method to three alternative approaches, based on non-hierarchical models, that make different types of assumptions about within-subgroup dose-toxicity curves. The simulations show that the hierarchical model-based method is recommended in settings where the dose-toxicity curves are exchangeable between subgroups. We present practical guidelines for application, and provide computer programs for trial simulation and conduct. PMID:28111916
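
    As a toy illustration of the borrowing mechanism only (not the authors' design, which also includes subgroup-specific safety rules), the sketch below computes grid-based posteriors for a one-parameter power model in which each subgroup's parameter is drawn around a shared mean; the skeleton, grids, and prior scales are all invented for the example:

```python
# Sketch: hierarchical borrowing across subgroups in a CRM-style power model.
# All numerical choices here are illustrative, not from the paper.
import numpy as np
from scipy.stats import norm

skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.45])  # prior toxicity guesses

def loglik(theta, doses, tox):
    # Power model: Pr(toxicity at dose d) = skeleton[d] ** exp(theta)
    p = skeleton[doses] ** np.exp(theta)
    return np.sum(tox * np.log(p) + (1 - tox) * np.log1p(-p))

def recommend(data, target=0.25, tau=0.5, sigma=1.0):
    """data: list of (dose_indices, tox_indicators) array pairs, one per
    subgroup. theta_g ~ N(mu, tau^2), mu ~ N(0, sigma^2); brute-force grids."""
    th = np.linspace(-3.0, 3.0, 61)            # grid for each theta_g
    mu = np.linspace(-3.0, 3.0, 61)            # grid for the shared mean mu
    lik = np.array([[np.exp(loglik(t, d, y)) for t in th] for d, y in data])
    kern = norm.pdf(th[None, :], loc=mu[:, None], scale=tau)  # p(theta | mu)
    prior_mu = norm.pdf(mu, 0.0, sigma)
    marg = lik @ kern.T          # each subgroup's evidence per mu value
    recs = []
    for g in range(len(data)):
        others = np.prod(np.delete(marg, g, axis=0), axis=0)
        w = (prior_mu * others) @ kern         # weight over theta_g
        post = lik[g] * w                      # posterior over theta_g (grid)
        post /= post.sum()
        # Posterior-mean toxicity curve; recommend the dose closest to target.
        p_hat = (skeleton[None, :] ** np.exp(th)[:, None] * post[:, None]).sum(axis=0)
        recs.append(int(np.argmin(np.abs(p_hat - target))))
    return recs
```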

  6. Experimental Design for Multi-drug Combination Studies Using Signaling Networks

    PubMed Central

    Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.

    2017-01-01

    Summary Combinations of multiple drugs are an important approach to maximizing the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. Preclinical experiments on multi-drug combinations play a key role in (especially cancer) drug development because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose levels, quickly precluding exhaustive laboratory testing. Utilizing experimental dose-response data on single drugs and a few combinations, along with pathway/network information, to obtain an in silico estimate of the functional structure of the dose-response relationship, we propose in this paper an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. Simulation studies show that the proposed methods perform well. PMID:28960231

  7. Simulated fissioning of uranium and testing of the fission-track dating method

    USGS Publications Warehouse

    McGee, V.E.; Johnson, N.M.; Naeser, C.W.

    1985-01-01

    A computer program (FTD-SIM) faithfully simulates the fissioning of 238U with time and of 235U with neutron dose. The simulation is based on first principles of physics, where the fissioning of 238U over time is described by Ns = λf·(238U)·t and the fissioning of 235U with the fluence of neutrons is described by Ni = σ·(235U)·φ. The Poisson law is used to set the stochastic variation of fissioning within the uranium population. The life history of a given crystal can thus be traced under an infinite variety of age and irradiation conditions. A single dating attempt, or up to 500 dating attempts on a given crystal population, can be simulated by specifying the age of the crystal population, the size and variation of the areas to be counted, the amount and distribution of uranium, the neutron dose to be used and its variation, and the desired ratio of 238U to 235U. A variety of probability distributions can be applied to uranium content and counting area. The Price and Walker age equation is used to estimate age. The output of FTD-SIM includes the tabulated results of each individual dating attempt (sample) on demand and/or the summary statistics and histograms for multiple dating attempts (samples), including the sampling age. An analysis of the results from FTD-SIM shows that: (1) The external detector method is intrinsically more precise than the population method. (2) For the external detector method, a correlation between spontaneous track count, Ns, and induced track count, Ni, results when the population of grains has a stochastic uranium content and/or when the counting areas between grains are stochastic; for the population method no correlation can exist. (3) In the external detector method the sampling distribution of age is independent of the number of grains counted, whereas in the population method it is highly dependent on the number of grains counted. (4) Grains with zero track counts, in either Ns or Ni, are an integral part of fissioning theory and under certain circumstances must be included in any estimate of age. (5) In estimating the standard error of age, the standard errors of Ns, Ni, and φ must be accurately estimated and propagated through the age equation; several statistical models are presently available to do so. © 1985.
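
    A hedged sketch of the simulation core, assuming the first-principles relations quoted above with Poisson scatter. The physical constants are standard literature values but should be verified before any real use, and zeta calibration and counting-area variation are omitted:

```python
# Sketch: FTD-SIM-like track-count simulation and age recovery.
# Constants are common literature values; check them before real use.
import numpy as np

rng = np.random.default_rng(7)
LAMBDA_D = 1.55125e-10   # total decay constant of 238U (1/yr)
LAMBDA_F = 8.46e-17      # spontaneous-fission decay constant of 238U (1/yr)
SIGMA = 580.2e-24        # thermal-neutron fission cross-section of 235U (cm^2)
I_RATIO = 7.2527e-3      # natural 235U/238U atomic ratio

def simulate_grain(n238, t_yr, fluence):
    """Poisson-scattered spontaneous and induced track counts for one grain
    with n238 counted 238U atoms, true age t_yr, and neutron fluence (n/cm^2)."""
    ns = rng.poisson(LAMBDA_F * n238 * t_yr)            # Ns = lf * 238U * t
    ni = rng.poisson(SIGMA * n238 * I_RATIO * fluence)  # Ni = s * 235U * phi
    return ns, ni

def price_walker_age(ns, ni, fluence):
    # Price and Walker age equation (zeta calibration omitted).
    x = (LAMBDA_D / LAMBDA_F) * (ns / ni) * I_RATIO * SIGMA * fluence
    return np.log1p(x) / LAMBDA_D

# Example: ns, ni = simulate_grain(5e11, 1e7, 1e15)
# price_walker_age(ns, ni, 1e15) recovers an age near 1e7 yr.
```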

  8. An Automatic Multidocument Text Summarization Approach Based on Naïve Bayesian Classifier Using Timestamp Strategy

    PubMed Central

    Ramanujam, Nedunchelian; Kaliappan, Manivannan

    2016-01-01

    Nowadays, automatic multidocument text summarization systems can successfully retrieve summary sentences from input documents, but they have many limitations, such as inaccurate extraction of essential sentences, low coverage, poor coherence among sentences, and redundancy. This paper introduces a timestamp approach combined with Naïve Bayesian classification for multidocument text summarization. The timestamp gives the summary an ordered presentation, yielding a more coherent summary, and allows more relevant information to be extracted from the multiple documents. A scoring strategy is also used to compute word-frequency scores. Linguistic quality is assessed in terms of readability and comprehensibility. To show the efficiency of the proposed method, this paper presents a comparison between the proposed method and the existing MEAD algorithm; the timestamp procedure is also applied to the MEAD algorithm and the results are compared with those of the proposed method. The results show that the proposed method requires less time than the existing MEAD algorithm to execute the summarization process. Moreover, the proposed method achieves better precision, recall, and F-score than the existing clustering-with-lexical-chaining approach. PMID:27034971
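
    A compressed sketch of the timestamp idea: score sentences (here by a simple word-frequency score standing in for the Naive Bayes relevance classifier), then emit the selected sentences in (document timestamp, position) order so the summary reads chronologically. The sentence-splitting regex and scoring are illustrative simplifications:

```python
# Sketch: timestamp-ordered multidocument summarization.
# Frequency scoring stands in for the paper's Naive Bayes classifier.
import re
from collections import Counter

def summarize(docs, k=5):
    """docs: list of (timestamp, text) pairs, one per input document."""
    sents = []
    for ts, text in sorted(docs):                       # chronological docs
        for pos, s in enumerate(re.split(r"(?<=[.!?])\s+", text)):
            if s:
                sents.append((ts, pos, s))
    # Corpus-wide word frequencies drive the sentence scores.
    freq = Counter(w for _, _, s in sents for w in re.findall(r"\w+", s.lower()))
    def score(item):
        words = re.findall(r"\w+", item[2].lower())
        return sum(freq[w] for w in words) / max(len(words), 1)
    top = sorted(sents, key=score, reverse=True)[:k]
    # Reorder picks by (timestamp, position) for a coherent summary.
    return [s for _, _, s in sorted(top)]
```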

  9. Summary of methods for calculating dynamic lateral stability and response and for estimating aerodynamic stability derivatives

    NASA Technical Reports Server (NTRS)

    Campbell, John P; Mckinney, Marion O

    1952-01-01

    A summary of methods for making dynamic lateral stability and response calculations and for estimating the aerodynamic stability derivatives required for use in these calculations is presented. The processes of performing calculations of the time histories of lateral motions, of the period and damping of these motions, and of the lateral stability boundaries are presented as a series of simple straightforward steps. Existing methods for estimating the stability derivatives are summarized and, in some cases, simple new empirical formulas are presented. Detailed estimation methods are presented for low-subsonic-speed conditions but only a brief discussion and a list of references are given for transonic and supersonic speed conditions.

  10. Polygenic scores via penalized regression on summary statistics.

    PubMed

    Mak, Timothy Shin Heng; Porsch, Robert Milan; Choi, Shing Wan; Zhou, Xueya; Sham, Pak Chung

    2017-09-01

    Polygenic scores (PGS) summarize the genetic contribution of a person's genotype to a disease or phenotype. They can be used to group participants into different risk categories for diseases, and they are also used as covariates in epidemiological analyses. A number of possible ways of calculating PGS have been proposed, and recently there has been much interest in methods that incorporate information available in published summary statistics. As there is no inherent information on linkage disequilibrium (LD) in summary statistics, a pertinent question is how LD information available elsewhere can be used to supplement such analyses. To answer this question, we propose a method for constructing PGS using summary statistics and a reference panel in a penalized regression framework, which we call lassosum. We also propose a general method for choosing the value of the tuning parameter in the absence of validation data. In our simulations, pseudovalidation often resulted in prediction accuracy comparable to using a dataset with a validation phenotype, and it was clearly superior to the conservative option of setting the tuning parameter of lassosum to its lowest value. We also showed that lassosum achieved better prediction accuracy than simple clumping and P-value thresholding in almost all scenarios, and that it was substantially faster and more accurate than the recently proposed LDpred. © 2017 WILEY PERIODICALS, INC.
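
    A sketch of the kind of coordinate-descent update lassosum performs, using marginal SNP-trait correlations r from summary statistics and an LD matrix R from a reference panel. The objective and update below follow the published formulation as best recalled and should be checked against the lassosum paper and software:

```python
# Sketch: lassosum-style penalized regression on summary statistics.
# Objective (as recalled): (1-s) b'Rb + s b'b - 2 b'r + 2*lam*||b||_1
import numpy as np

def soft(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lassosum_like(r, R, lam=0.01, s=0.5, n_iter=100):
    """r: marginal correlations from summary stats; R: reference-panel LD
    matrix with unit diagonal; s in (0,1] shrinks off-diagonal LD."""
    p = len(r)
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Residual correlation for SNP j, excluding its own effect.
            resid = r[j] - (1.0 - s) * (R[j] @ b - b[j])
            # Denominator (1-s)*R_jj + s = 1 when R_jj = 1.
            b[j] = soft(resid, lam) / ((1.0 - s) + s)
        # (Convergence check omitted for brevity.)
    return b
```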

  11. BAYESIAN LARGE-SCALE MULTIPLE REGRESSION WITH SUMMARY STATISTICS FROM GENOME-WIDE ASSOCIATION STUDIES

    PubMed Central

    Zhu, Xiang; Stephens, Matthew

    2017-01-01

    Bayesian methods for large-scale multiple regression provide attractive approaches to the analysis of genome-wide association studies (GWAS). For example, they can estimate heritability of complex traits, allowing for both polygenic and sparse models; and by incorporating external genomic data into the priors, they can increase power and yield new biological insights. However, these methods require access to individual genotypes and phenotypes, which are often not easily available. Here we provide a framework for performing these analyses without individual-level data. Specifically, we introduce a “Regression with Summary Statistics” (RSS) likelihood, which relates the multiple regression coefficients to univariate regression results that are often easily available. The RSS likelihood requires estimates of correlations among covariates (SNPs), which can also be obtained from public databases. We perform Bayesian multiple regression analysis by combining the RSS likelihood with previously proposed prior distributions, sampling posteriors by Markov chain Monte Carlo. In a wide range of simulations RSS performs similarly to analyses using the individual-level data, both for estimating heritability and for detecting associations. We apply RSS to a GWAS of human height involving 253,288 individuals typed at 1.06 million SNPs, for which analyses of individual-level data are practically impossible. Estimates of heritability (52%) are consistent with, but more precise than, previous results using subsets of these data. We also identify many previously unreported loci that show evidence for association with height in our analyses. Software is available at https://github.com/stephenslab/rss. PMID:29399241
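
    The RSS likelihood itself is compact enough to state in a few lines; a sketch with numpy/scipy, where betahat and se are the published univariate estimates and R is the reference-panel LD matrix (a dense-matrix toy version, not the authors' scalable implementation):

```python
# Sketch: the RSS likelihood, betahat ~ N(S R S^{-1} beta, S R S).
import numpy as np
from scipy.stats import multivariate_normal

def rss_loglik(beta, betahat, se, R):
    """beta: multiple-regression coefficients being evaluated.
    betahat, se: univariate GWAS effect estimates and standard errors.
    R: SNP correlation (LD) matrix from a reference panel."""
    S = np.diag(se)
    Sinv = np.diag(1.0 / se)
    mean = S @ R @ Sinv @ beta     # implied marginal effects given beta
    cov = S @ R @ S                # sampling covariance of betahat
    return multivariate_normal.logpdf(betahat, mean=mean, cov=cov)
```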

  12. Summary of 2012 reconnaissance field studies related to the petroleum geology of the Nenana Basin, interior Alaska

    USGS Publications Warehouse

    Wartes, Marwan A.; Gillis, Robert J.; Herriott, Trystan M.; Stanley, Richard G.; Helmold, Kenneth P.; Peterson, C. Shaun; Benowitz, Jeffrey A.

    2013-01-01

    The Alaska Division of Geological & Geophysical Surveys (DGGS) recently initiated a multi-year review of the hydrocarbon potential of frontier sedimentary basins in Alaska (Swenson and others, 2012). In collaboration with the Alaska Division of Oil & Gas and the U.S. Geological Survey we conducted reconnaissance field studies in two basins with recognized natural gas potential—the Susitna basin and the Nenana basin (LePain and others, 2012). This paper summarizes our initial work on the Nenana basin; a brief summary of our work in the Susitna basin can be found in Gillis and others (in press). During early May 2012, we conducted ten days of helicopter-supported fieldwork and reconnaissance sampling along the northern Alaska Range foothills and Yukon–Tanana upland near Fairbanks (fig. 1). The goal of this work was to improve our understanding of the geologic development of the Nenana basin and to collect a suite of samples to better evaluate hydrocarbon potential. Most laboratory analyses have not yet been completed, so this preliminary report serves as a summary of field data and sets the framework for future, more comprehensive analysis to be presented in later publications.

  13. Radiocarbon Dates from Volcanic Deposits of the Chaos Crags and Cinder Cone Eruptive Sequences and Other Deposits, Lassen Volcanic National Park and Vicinity, California

    USGS Publications Warehouse

    Clynne, Michael A.; Christiansen, Robert L.; Trimble, Deborah A.; McGeehin, John P.

    2008-01-01

    This contribution reports radiocarbon ages obtained from charcoal, wood and other samples collected between 1979 and 2001 in Lassen Volcanic National Park and vicinity and a few samples from other nearby localities. Most of the samples are from the Chaos Crags and Cinder Cone eruptive sequences. Brief summaries are given of the Chaos Crags and Cinder Cone eruptive sequences.

  14. Expressing urine from a gel disposable diaper for biomonitoring using phthalates as an example.

    PubMed

    Liu, Liangpo; Xia, Tongwei; Guo, Lihua; Cao, Lanyu; Zhao, Benhua; Zhang, Jie; Dong, Sijun; Shen, Heqing

    2012-11-01

    The urinary metabolites of phthalates are well-accepted exposure biomarkers for adults and children older than 6 years but are not commonly used for infants owing to the inconvenience of sampling. In light of this situation, a novel sampling method based on urine expressed from gel disposable diapers was developed. The urine was expressed from the gel absorbent after mixing the absorbent with CaCl(2) and was then collected with a laboratory-made device; the urinary phthalate metabolites were extracted and cleaned up on a solid-phase extraction (SPE) column and analyzed by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS). To evaluate the method's feasibility, the following factors were investigated: the proportion of CaCl(2) to gel absorbent, variation in urination volume, deposition bias of the target compounds in the diaper, the matrix blank of different diaper brands, storage stability, and the recoveries of creatinine and phthalate metabolites in the expressed urine. Mono-methyl phthalate, mono-ethyl phthalate, mono-butyl phthalate, mono-benzyl phthalate, mono-2-ethylhexyl phthalate and mono-2-ethyl-5-oxohexyl phthalate were included as target analytes. About 70-80% of the urine could be expressed from the diaper; spiking recoveries of the mono-phthalates ranged from 88.5 to 115%, and limits of detection from 0.21 to 0.50 ng/ml. The method was applied to measure phthalate metabolites in 65 gel diaper samples from 15 infants, and the pilot data suggest that the infants are commonly exposed to phthalates. In summary, the method for monitoring infant exposure to phthalates is sound and validated, and the potential health effects of exposure in this vulnerable population deserve attention.

  15. Gravimetric water distribution assessment from geoelectrical methods (ERT and EMI) in municipal solid waste landfill.

    PubMed

    Dumont, Gaël; Pilawski, Tamara; Dzaomuho-Lenieregue, Phidias; Hiligsmann, Serge; Delvigne, Frank; Thonart, Philippe; Robert, Tanguy; Nguyen, Frédéric; Hermans, Thomas

    2016-09-01

    The gravimetric water content of the waste material is a key parameter in waste biodegradation. Previous studies suggest a correlation between changes in water content and modification of electrical resistivity. This study, based on field work in Mont-Saint-Guibert landfill (Belgium), aimed, on one hand, at characterizing the relationship between gravimetric water content and electrical resistivity and on the other hand, at assessing geoelectrical methods as tools to characterize the gravimetric water distribution in a landfill. Using excavated waste samples obtained after drilling, we investigated the influences of the temperature, the liquid phase conductivity, the compaction and the water content on the electrical resistivity. Our results demonstrate that Archie's law and Campbell's law accurately describe these relationships in municipal solid waste (MSW). Next, we conducted a geophysical survey in situ using two techniques: borehole electromagnetics (EM) and electrical resistivity tomography (ERT). First, in order to validate the use of EM, EM values obtained in situ were compared to electrical resistivity of excavated waste samples from corresponding depths. The petrophysical laws were used to account for the change of environmental parameters (temperature and compaction). A rather good correlation was obtained between direct measurement on waste samples and borehole electromagnetic data. Second, ERT and EM were used to acquire a spatial distribution of the electrical resistivity. Then, using the petrophysical laws, this information was used to estimate the water content distribution. In summary, our results demonstrate that geoelectrical methods represent a pertinent approach to characterize spatial distribution of water content in municipal landfills when properly interpreted using ground truth data. These methods might therefore prove to be valuable tools in waste biodegradation optimization projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
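
    A hedged sketch of the inversion chain the paper describes: temperature-correct the field resistivity, invert Archie's law for saturation, and convert to gravimetric water content. All parameter values (Archie exponents, porosity, densities, temperature coefficient) are illustrative placeholders, not the fitted values from this study:

```python
# Sketch: from bulk resistivity (ERT/EM) to gravimetric water content.
# Parameter values are illustrative, not the paper's fitted petrophysics.
import numpy as np

def temp_correct(sigma_t, t_celsius, alpha=0.021):
    """Normalize bulk conductivity to 25 C (Campbell-type linear correction;
    alpha is an assumed coefficient)."""
    return sigma_t / (1.0 + alpha * (t_celsius - 25.0))

def gravimetric_water(rho_bulk_ohm_m, sigma_w, phi=0.5, m=1.8, n=2.0,
                      rho_dry=0.6, rho_water=1.0, t_celsius=35.0):
    """Invert Archie's law, sigma = sigma_w * phi**m * Sw**n, for the
    saturation Sw, then convert the volumetric content theta = Sw*phi to
    gravimetric content w = theta * rho_water / rho_dry (dry bulk density)."""
    sigma25 = temp_correct(1.0 / rho_bulk_ohm_m, t_celsius)
    sw = (sigma25 / (sigma_w * phi ** m)) ** (1.0 / n)
    return np.clip(sw, 0.0, 1.0) * phi * rho_water / rho_dry
```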

  16. Surveillance plan for the early detection of H5N1 highly pathogenic avian influenza virus in migratory birds in the United States: surveillance year 2009

    USGS Publications Warehouse

    Brand, Christopher J.

    2009-01-01

    Executive Summary: This Surveillance Plan (Plan) describes plans for conducting surveillance of wild birds in the United States and its Territories and Freely-Associated States to provide for early detection of the introduction of the H5N1 Highly Pathogenic Avian Influenza (HPAI) subtype of the influenza A virus by migratory birds during the 2009 surveillance year, spanning the period April 1, 2009 - March 31, 2010. The Plan represents a continuation of surveillance efforts begun in 2006 under the Interagency Strategic Plan for the Early Detection of H5N1 Highly Pathogenic Avian Influenza in Wild Migratory Birds (U.S. Department of Agriculture and U.S. Department of the Interior, 2006). The Plan sets forth sampling plans by region, target species or species groups to be sampled, locations of sampling, sample sizes, and sampling approaches and methods. This Plan will be reviewed annually and modified as appropriate for subsequent surveillance years based on evaluation of information from previous years of surveillance, changing patterns and threats of H5N1 HPAI, and changes in funding availability for avian influenza surveillance. Specific sampling strategies will be developed accordingly within each of six regions, defined here as Alaska, Hawaiian/Pacific Islands, Lower Pacific Flyway (Washington, Oregon, California, Idaho, Nevada, Arizona), Central Flyway, Mississippi Flyway, and Atlantic Flyway.

  17. δ13C and δ18O measurements of carbonate rocks using Cavity Ring-Down Spectroscopy

    NASA Astrophysics Data System (ADS)

    Lucic, G.; Kim-Hak, D.; Curtis, J. H.

    2017-12-01

    We present a novel, user-friendly, and cost-effective method for the analysis of δ13C and δ18O in CO2 gas obtained from acid digestion of carbonate rocks. Two to three milligrams of pure carbonate, ground to a powder, are digested in a pre-evacuated glass vial using 100% phosphoric acid at 70 °C. Vials with the reacted samples are then loaded onto an automated carousel sampler, where the CO2 gas produced in the headspace is extracted and sent to a Picarro CRDS isotopic C and O analyzer. Once loaded onto the carousel, 49 samples can be analyzed automatically at a rate of one sample every 15 minutes. δ13C and δ18O of the sample are reported in real time with precisions of 0.2 and 0.4 per mil, respectively. The portability and simplicity of the autosampler and CRDS setup open up the potential for permanent and mobile deployments, enabling near-real-time sampling feedback in the lab or in the field. Consumable and operating costs are small compared with other technology in use, making the CRDS-carbonate system suitable for both large and small research labs. Finally, we present a summary of results from a series of validation tests in which standards and natural carbonate rock samples were analyzed and compared with traditional Kiel-IRMS results.

  18. Quality of life in children with new-onset epilepsy

    PubMed Central

    Ferro, Mark A.; Camfield, Carol S.; Huang, Wenyi; Levin, Simon D.; Smith, Mary Lou; Wiebe, Samuel; Zou, Guangyong

    2012-01-01

    Objectives: To assess health-related quality of life (HRQL) over 2 years, and its risk factors, in children 4−12 years old with new-onset epilepsy. Methods: Data are from a multicenter prospective cohort study, the Health-Related Quality of Life Study in Children with Epilepsy Study (HERQULES). Parents reported on children's HRQL and family factors, and neurologists on clinical characteristics, at 4 time points. Mean subscale and summary scores were computed for HRQL. Individual growth curve models identified trajectories of change in HRQL scores. Multiple regression identified baseline risk factors for HRQL 2 years later. Results: A total of 374 (82%) questionnaires were returned postdiagnosis, and 283 (62%) of eligible parents completed all 4. Growth rates for HRQL summary scores were most rapid during the first 6 months and then stabilized. About one-half of children experienced clinically meaningful improvements in HRQL, one-third maintained the same level, and one-fifth declined. Compared with the general population, at 2 years our sample scored significantly lower on one-third of CHQ subscales and on the psychosocial summary. After controlling for baseline HRQL, cognitive problems, poor family functioning, and high family demands were risk factors for poor HRQL 2 years later. Conclusions: On average, HRQL was relatively good but with highly variable individual trajectories. At least one-half of children did not experience clinically meaningful improvements or declined over 2 years. Cognitive problems were the strongest risk factor for compromised HRQL 2 years after diagnosis and may be largely responsible for declines in the HRQL of children newly diagnosed with epilepsy. PMID:23019268

  19. Validation of the Neurological Fatigue Index for stroke (NFI-Stroke)

    PubMed Central

    2012-01-01

    Background Fatigue is a common symptom in stroke. Several self-report scales are available to measure this debilitating symptom, but concern has been expressed about their construct validity. Objective To examine the reliability and validity, in a sample of stroke patients, of a scale recently developed for multiple sclerosis (MS) fatigue, the Neurological Fatigue Index (NFI-MS). Method Six patients with stroke participated in qualitative interviews, which were analysed and the themes compared for equivalence with those derived from existing data on MS fatigue. 999 questionnaire packs were sent to people who had had a stroke within the past four years. Data from the four subscales and the Summary scale of the NFI-MS were fitted to the Rasch measurement model. Results Themes identified by stroke patients were consistent with those identified by people with MS. 282 questionnaires were returned; respondents had a mean age of 67.3 years, 62% were male, and they were on average 17.2 (SD 11.4, range 2–50) months post stroke. The Physical, Cognitive, and Summary scales all showed good fit to the model, were unidimensional, and were free of differential item functioning by age, sex, and time. The sleep scales failed to show adequate fit in their current format. Conclusion Post-stroke fatigue appears to be represented by a combination of physical and cognitive components, confirmed by both qualitative and quantitative processes. The NFI-Stroke, comprising a Physical and a Cognitive subscale and a 10-item Summary scale, meets the strictest measurement requirements; fit to the Rasch model allows conversion of ordinal raw scores to a linear metric. PMID:22587411

  20. A method to estimate the contribution of regional genetic associations to complex traits from summary association statistics.

    PubMed

    Pare, Guillaume; Mao, Shihong; Deng, Wei Q

    2016-06-08

    Despite considerable efforts, known genetic associations explain only a small fraction of predicted heritability. Regional associations combine information from multiple contiguous genetic variants and can improve the variance explained at established association loci. However, regional associations are not easily amenable to estimation from summary association statistics because of their sensitivity to linkage disequilibrium (LD). We propose a novel method, LD Adjusted Regional Genetic Variance (LARGV), to estimate the phenotypic variance explained by regional associations using summary statistics while accounting for LD. Our method is asymptotically equivalent to a multiple linear regression model when no interaction or haplotype effects are present. It has several applications, such as ranking genetic regions according to variance explained or comparing the variance explained by two or more regions. Using height and BMI data from the Health and Retirement Study (N = 7,776), we show that most genetic variance lies in a small proportion of the genome and that previously identified linkage peaks have higher than expected regional variance.
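
    The abstract does not give LARGV's exact estimator, so the sketch below uses a generic LD-adjusted quadratic-form estimate of regional variance explained from z-scores, in the spirit of such methods rather than the authors' formula:

```python
# Sketch: generic LD-adjusted regional variance explained (not exact LARGV).
import numpy as np

def regional_h2(z, R, n, ridge=0.1):
    """z: per-SNP z-scores for one region; R: LD matrix from a reference
    panel; n: GWAS sample size. Returns (z' R^{-1} z - m) / (n - m), a
    common quadratic-form estimator; under the null its expectation is ~0.
    A small ridge stabilizes the solve when R is near-singular."""
    m = len(z)
    Rreg = R + ridge * np.eye(m)
    quad = z @ np.linalg.solve(Rreg, z)
    return (quad - m) / (n - m)
```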
