Sample records for statistical reference levels

  1. 77 FR 62517 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ...-based vital statistics at the national level, referred to as the U.S. National Vital Statistics System... days of this notice. Proposed Project Vital Statistics Training Application, OMB No. 0920-0217--Revision exp. 5/31/2013--National Center for Health Statistics (NCHS), Centers for Disease Control and...

  2. 75 FR 15709 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... statistics at the national level, referred to as the U.S. National Vital Statistics System (NVSS), depends on.... Proposed Project Vital Statistics Training Application (OMB No. 0920-0217 exp. 7/31/ 2010)--Extension--National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC). Background...

  3. The Effect of Electroencephalogram (EEG) Reference Choice on Information-Theoretic Measures of the Complexity and Integration of EEG Signals

    PubMed Central

    Trujillo, Logan T.; Stanfield, Candice T.; Vela, Ruben D.

    2017-01-01

    Converging evidence suggests that human cognition and behavior emerge from functional brain networks interacting on local and global scales. We investigated two information-theoretic measures of functional brain segregation and integration—interaction complexity CI(X), and integration I(X)—as applied to electroencephalographic (EEG) signals and how these measures are affected by choice of EEG reference. CI(X) is a statistical measure of the system entropy accounted for by interactions among its elements, whereas I(X) indexes the overall deviation from statistical independence of the individual elements of a system. We recorded 72 channels of scalp EEG from human participants who sat in a wakeful resting state (interleaved counterbalanced eyes-open and eyes-closed blocks). CI(X) and I(X) of the EEG signals were computed using four different EEG references: linked-mastoids (LM) reference, average (AVG) reference, a Laplacian (LAP) “reference-free” transformation, and an infinity (INF) reference estimated via the Reference Electrode Standardization Technique (REST). Fourier-based power spectral density (PSD), a standard measure of resting state activity, was computed for comparison and as a check of data integrity and quality. We also performed dipole source modeling in order to assess the accuracy of neural source CI(X) and I(X) estimates obtained from scalp-level EEG signals. CI(X) was largest for the LAP transformation, smallest for the LM reference, and at intermediate values for the AVG and INF references. I(X) was smallest for the LAP transformation, largest for the LM reference, and at intermediate values for the AVG and INF references. Furthermore, across all references, CI(X) and I(X) reliably distinguished between resting-state conditions (larger values for eyes-open vs. eyes-closed). These findings occurred in the context of the overall expected pattern of resting state PSD. Dipole modeling showed that simulated scalp EEG-level CI(X) and I(X) reflected changes in underlying neural source dependencies, but only for higher levels of integration and with highest accuracy for the LAP transformation. Our observations suggest that the Laplacian-transformation should be preferred for the computation of scalp-level CI(X) and I(X) due to its positive impact on EEG signal quality and statistics, reduction of volume-conduction, and the higher accuracy this provides when estimating scalp-level EEG complexity and integration. PMID:28790884
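
    As a rough illustration of the two measures named above, the sketch below estimates I(X) and one common formulation of CI(X) from multichannel data under a Gaussian (covariance-based) entropy approximation. The channel count, the synthetic data, and the Gaussian assumption are illustrative only and are not the authors' pipeline.

    ```python
    import numpy as np

    def gaussian_entropy(cov):
        """Differential entropy (nats) of a multivariate Gaussian with covariance `cov`."""
        cov = np.atleast_2d(cov)
        n = cov.shape[0]
        sign, logdet = np.linalg.slogdet(cov)
        return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

    def integration_and_complexity(X):
        """X: channels x samples. Returns (I(X), CI(X)) under a Gaussian approximation.

        I(X)  = sum_i H(x_i) - H(X)                      (total deviation from independence)
        CI(X) = sum_i H(X without x_i) - (n - 1) * H(X)  (equivalent to H(X) - sum_i H(x_i | rest))
        """
        cov = np.cov(X)
        n = cov.shape[0]
        H_joint = gaussian_entropy(cov)
        H_single = sum(gaussian_entropy(cov[i, i]) for i in range(n))
        H_leave_one_out = sum(
            gaussian_entropy(np.delete(np.delete(cov, i, axis=0), i, axis=1)) for i in range(n)
        )
        return H_single - H_joint, H_leave_one_out - (n - 1) * H_joint

    # Toy example: 8 correlated "channels", 5000 samples
    rng = np.random.default_rng(0)
    mixing = rng.normal(size=(8, 8))
    X = mixing @ rng.normal(size=(8, 5000))
    print(integration_and_complexity(X))
    ```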

  4. Locating and parsing bibliographic references in HTML medical articles

    PubMed Central

    Zou, Jie; Le, Daniel; Thoma, George R.

    2010-01-01

    The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level. PMID:20640222
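
    The sketch below is a toy version of the word-level classification idea behind the second (SVM-based) parsing algorithm, using a linear SVM over a few lexical and positional features. The feature set, labels, and training snippet are hypothetical; the published system also uses geometric features and a rule-based correction search that are not reproduced here.

    ```python
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    def word_features(word, position, total):
        # Simple lexical/positional cues; the real system also uses layout (geometric) features.
        return {
            "lower": word.lower(),
            "is_capitalized": word[:1].isupper(),
            "has_digit": any(c.isdigit() for c in word),
            "ends_with_period": word.endswith("."),
            "relative_position": round(position / max(total - 1, 1), 2),
        }

    # Hypothetical training data: words of one reference string labeled by component.
    reference = "Zou J. Locating and parsing bibliographic references . J Med Syst 2010".split()
    labels = ["author", "author", "title", "title", "title", "title", "title", "title",
              "journal", "journal", "journal", "date"]

    X = [word_features(w, i, len(reference)) for i, w in enumerate(reference)]
    model = make_pipeline(DictVectorizer(), LinearSVC())
    model.fit(X, labels)

    test = "Smith A. Statistical parsing of citations . Bioinformatics 2012".split()
    pred = model.predict([word_features(w, i, len(test)) for i, w in enumerate(test)])
    print(list(zip(test, pred)))
    ```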

  5. Locating and parsing bibliographic references in HTML medical articles.

    PubMed

    Zou, Jie; Le, Daniel; Thoma, George R

    2010-06-01

    The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level.

  6. 78 FR 17406 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-21

    ... authority, the collection of registration-based vital statistics at the national level, referred to as the U... should be received within 30 days of this notice. Proposed Project Vital Statistics Training Application, OMB No. 0920-0217 (expires May 31, 2013)--Revision--National Center for Health Statistics (NCHS...

  7. Determination of reference ranges for elements in human scalp hair.

    PubMed

    Druyan, M E; Bass, D; Puchyr, R; Urek, K; Quig, D; Harmon, E; Marquardt, W

    1998-06-01

    Expected values, reference ranges, or reference limits are necessary to enable clinicians to apply analytical chemical data in the delivery of health care. Determination of reference ranges is not straightforward in terms of either selecting a reference population or performing statistical analysis. In light of logistical, scientific, and economic obstacles, it is understandable that clinical laboratories often combine approaches in developing health-associated reference values. A laboratory may choose to: 1. Validate either reference ranges of other laboratories or published data from clinical research, or both, through comparison with patients' test data. 2. Base the laboratory's reference values on statistical analysis of results from specimens assayed by the clinical reference laboratory itself. 3. Adopt standards or recommendations of regulatory agencies and governmental bodies. 4. Initiate population studies to validate transferred reference ranges or to determine them anew. Effects of external contamination and anecdotal information from clinicians may be considered. The clinical utility of hair analysis is well accepted for some elements. For others, it remains in the realm of clinical investigation. This article elucidates an approach for establishment of reference ranges for elements in human scalp hair. Observed levels of analytes from hair specimens from both our laboratory's total patient population and from a physician-defined healthy American population have been evaluated. Examination of levels of elements often associated with toxicity serves to exemplify the process of determining reference ranges in hair. In addition, the approach serves as a model for setting reference ranges for analytes in a variety of matrices.
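
    Option 2 above (basing reference values on the laboratory's own results) is often implemented nonparametrically as a central 95% interval. A minimal sketch, assuming right-skewed synthetic hair-element data and the conventional 2.5th/97.5th percentile bounds:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical hair-element results (e.g. ug/g) from a reference population; right-skewed.
    results = rng.lognormal(mean=-1.0, sigma=0.8, size=400)

    lower, upper = np.percentile(results, [2.5, 97.5])
    print(f"reference range: {lower:.3f} - {upper:.3f} ug/g (n={results.size})")
    ```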

  8. Empirical Reference Distributions for Networks of Different Size

    PubMed Central

    Smith, Anna; Calder, Catherine A.; Browning, Christopher R.

    2016-01-01

    Network analysis has become an increasingly prevalent research tool across a vast range of scientific fields. Here, we focus on the particular issue of comparing network statistics, i.e. graph-level measures of network structural features, across multiple networks that differ in size. Although “normalized” versions of some network statistics exist, we demonstrate via simulation why direct comparison is often inappropriate. We consider normalizing network statistics relative to a simple fully parameterized reference distribution and demonstrate via simulation how this is an improvement over direct comparison, but still sometimes problematic. We propose a new adjustment method based on a reference distribution constructed as a mixture model of random graphs which reflect the dependence structure exhibited in the observed networks. We show that using simple Bernoulli models as mixture components in this reference distribution can provide adjusted network statistics that are relatively comparable across different network sizes but still describe interesting features of networks, and that this can be accomplished at relatively low computational expense. Finally, we apply this methodology to a collection of ecological networks derived from the Los Angeles Family and Neighborhood Survey activity location data. PMID:27721556
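
    The sketch below illustrates the simple fully parameterized reference-distribution idea discussed above, z-scoring an observed graph statistic against Bernoulli (Erdős–Rényi) graphs of matching size and density; the paper's mixture-model refinement is not reproduced, and the graphs and the statistic used here are arbitrary examples.

    ```python
    import networkx as nx
    import numpy as np

    def adjusted_statistic(G, stat=nx.transitivity, n_ref=200, seed=0):
        """Z-score of stat(G) against Bernoulli random graphs with the same size and density."""
        rng = np.random.default_rng(seed)
        n, p = G.number_of_nodes(), nx.density(G)
        ref = [stat(nx.gnp_random_graph(n, p, seed=int(rng.integers(1 << 31))))
               for _ in range(n_ref)]
        return (stat(G) - np.mean(ref)) / np.std(ref)

    # Two networks of different size but similar generative structure.
    G_small = nx.watts_strogatz_graph(50, 6, 0.1, seed=1)
    G_large = nx.watts_strogatz_graph(200, 6, 0.1, seed=2)
    print(adjusted_statistic(G_small), adjusted_statistic(G_large))
    ```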

  9. Cape Canaveral, Florida range reference atmosphere 0-70 km altitude

    NASA Technical Reports Server (NTRS)

    Tingle, A. (Editor)

    1983-01-01

    The RRA contains tabulations for monthly and annual means, standard deviations, skewness coefficients for wind speed, pressure, temperature, density, water vapor pressure, virtual temperature, dew-point temperature, and the means and standard deviations for the zonal and meridional wind components and the linear (product moment) correlation coefficient between the wind components. These statistical parameters are tabulated at the station elevation and at 1 km intervals from sea level to 30 km and at 2 km intervals from 30 to 90 km altitude. The wind statistics are given at approximately 10 m above the station elevations and at altitudes with respect to mean sea level thereafter. For those range sites without rocketsonde measurements, the RRAs terminate at 30 km altitude or they are extended, if required, when rocketsonde data from a nearby launch site are available. There are four sets of tables for each of the 12 monthly reference periods and the annual reference period.

  10. World Population: Facts in Focus. World Population Data Sheet Workbook. Population Learning Series.

    ERIC Educational Resources Information Center

    Crews, Kimberly A.

    This workbook teaches population analysis using world population statistics. To complete the four student activity sheets, the students refer to the included "1988 World Population Data Sheet" which lists nations' statistical data that includes population totals, projected population, birth and death rates, fertility levels, and the…

  11. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
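
    Certified datasets of this kind are typically used by fitting the model with the software under test and reporting the log relative error (LRE, roughly the number of agreeing significant digits) against the certified values. A minimal sketch with placeholder data standing in for an actual StRD dataset and certificate:

    ```python
    import numpy as np

    def log_relative_error(computed, certified):
        """Approximate number of significant digits of agreement with the certified value."""
        if computed == certified:
            return np.inf
        return -np.log10(abs(computed - certified) / abs(certified))

    # Hypothetical linear-calibration data (placeholders, not an actual StRD dataset).
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # "Certified" values would normally come from the StRD certificate; here we stand in
    # with a closed-form least-squares solution on the same data.
    xm, ym = x.mean(), y.mean()
    certified_slope = ((x - xm) * (y - ym)).sum() / ((x - xm) ** 2).sum()
    certified_intercept = ym - certified_slope * xm

    slope, intercept = np.polyfit(x, y, 1)   # result from the "software under test"
    print("LRE slope:    ", log_relative_error(slope, certified_slope))
    print("LRE intercept:", log_relative_error(intercept, certified_intercept))
    ```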

  12. Primary, Secondary, and Meta-Analysis of Research

    ERIC Educational Resources Information Center

    Glass, Gene V.

    1976-01-01

    Examines data analysis at three levels: primary analysis, the original analysis of data; secondary analysis, the re-analysis of data for the purpose of answering the original research question with better statistical techniques or answering new questions with old data; and meta-analysis, which refers to the statistical analysis of many analysis results from individual studies for…

  13. Evaluation of the 3M™ Molecular Detection Assay (MDA) 2 - Salmonella for the Detection of Salmonella spp. in Select Foods and Environmental Surfaces: Collaborative Study, First Action 2016.01.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James R; Goins, David; Monteroso, Lisa

    2016-07-01

    The 3M™ Molecular Detection Assay (MDA) 2 - Salmonella uses real-time isothermal technology for the rapid and accurate detection of Salmonella spp. from enriched select food, feed, and food-process environmental samples. The 3M MDA 2 - Salmonella was evaluated in a multilaboratory collaborative study using an unpaired study design. The 3M MDA 2 - Salmonella was compared to the U.S. Food and Drug Administration Bacteriological Analytical Manual Chapter 5 reference method for the detection of Salmonella in creamy peanut butter, and to the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook Chapter 4.08 reference method "Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg and Catfish Products and Carcass and Environmental Samples" for the detection of Salmonella in raw ground beef (73% lean). Technicians from 16 laboratories located within the continental United States participated. Each matrix was evaluated at three levels of contamination: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). Statistical analysis was conducted according to the probability of detection (POD) statistical model. Results obtained for the low inoculum level test portions produced differences in collaborator POD values of 0.03 (95% confidence interval, -0.10 to 0.16) for raw ground beef and 0.06 (95% confidence interval, -0.06 to 0.18) for creamy peanut butter, indicating no statistically significant difference between the candidate and reference methods.
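
    A simplified sketch of a POD comparison of the kind reported above; the counts are invented, and a Wald-type interval is used in place of the exact interval construction prescribed by the POD model.

    ```python
    import math

    def dpod_wald(x_cand, n_cand, x_ref, n_ref, z=1.96):
        """Difference in probability of detection with an approximate 95% Wald interval."""
        p1, p2 = x_cand / n_cand, x_ref / n_ref
        d = p1 - p2
        se = math.sqrt(p1 * (1 - p1) / n_cand + p2 * (1 - p2) / n_ref)
        return d, (d - z * se, d + z * se)

    # Hypothetical low-inoculum counts pooled across collaborators (not the study's data).
    d, (lo, hi) = dpod_wald(x_cand=45, n_cand=96, x_ref=42, n_ref=96)
    print(f"dPOD = {d:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    # An interval covering 0 would indicate no statistically significant difference.
    ```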

  14. DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.

    PubMed

    Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin

    2015-10-01

    To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which unlike summary statistics provided by virtually all studies, is not publicly available. While there are much less demanding methods which avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genetic Consortium Schizophrenia Phase 2 suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. dlee4@vcu.edu Supplementary Data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
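
    Direct imputation of summary statistics of this kind rests on the conditional expectation of unmeasured-SNP z-scores given the measured ones under a multivariate normal model, with an LD (correlation) matrix taken from the reference panel. The sketch below shows that core step with invented LD values and a small ridge term; DISTMIX's ethnicity-weighted panel construction is not reproduced.

    ```python
    import numpy as np

    def impute_z(z_measured, R_mm, R_um, ridge=0.001):
        """Conditional-expectation imputation of unmeasured-SNP z-scores from LD.

        z_measured : z-scores at measured SNPs
        R_mm       : LD (correlation) among measured SNPs, from the reference panel
        R_um       : LD between unmeasured and measured SNPs
        """
        R_mm = R_mm + ridge * np.eye(R_mm.shape[0])   # regularize, as such methods typically do
        z_imputed = R_um @ np.linalg.solve(R_mm, z_measured)
        # "imputation quality": variance explained for each unmeasured SNP
        info = np.einsum("ij,ji->i", R_um, np.linalg.solve(R_mm, R_um.T))
        return z_imputed, info

    # Toy LD structure for 3 measured + 1 unmeasured SNP (hypothetical numbers).
    R = np.array([[1.0, 0.6, 0.3, 0.7],
                  [0.6, 1.0, 0.4, 0.5],
                  [0.3, 0.4, 1.0, 0.2],
                  [0.7, 0.5, 0.2, 1.0]])
    z_m = np.array([2.5, 1.8, 0.4])
    z_hat, info = impute_z(z_m, R[:3, :3], R[3:, :3])
    print(z_hat, info)
    ```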

  15. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
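
    A minimal sketch of the two proposed statistics computed from a sample of model errors; the synthetic, skewed, non-zero-centered errors stand in for a real benchmark set, and the threshold and confidence level are arbitrary choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Synthetic, non-normal, non-zero-centered model errors (e.g. kcal/mol); placeholders.
    errors = rng.gamma(shape=2.0, scale=1.5, size=500) - 1.0
    abs_err = np.abs(errors)

    eta = 2.0          # chosen accuracy threshold
    confidence = 0.95  # chosen confidence level

    p_below = np.mean(abs_err < eta)            # (1) P(|error| < eta) from the empirical CDF
    q_high = np.quantile(abs_err, confidence)   # (2) error amplitude not exceeded with 95% confidence
    print(f"P(|error| < {eta}) = {p_below:.2f};  Q{int(confidence * 100)} = {q_high:.2f}")
    ```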

  16. A Statistical Approach to Establishing Subsystem Environmental Test Specifications

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    Results are presented of a research task to evaluate structural responses at various subsystem mounting locations during spacecraft level test exposures to the environments of mechanical shock, acoustic noise, and random vibration. This statistical evaluation is presented in the form of recommended subsystem test specifications for these three environments as normalized to a reference set of spacecraft test levels and are thus suitable for extrapolation to a set of different spacecraft test levels. The recommendations are dependent upon a subsystem's mounting location in a spacecraft, and information is presented on how to determine this mounting zone for a given subsystem.

  17. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to the sensors. There are a few proposed methodologies for processing the information carried by reference pixels in the digital domain, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To take a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to the processing of low-voltage differential signals and the subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. Post-processing of NASA Hubble Space Telescope data, as well as on-board processing of data from all sensor channels in future deep-space cosmology projects, would benefit from this effort.
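
    The sketch below illustrates the heritage reference-pixel correction described above (subtracting per-row statistics of the non-illuminated boundary pixels from the active pixels); the frame geometry and noise model are invented, and the HHT-DPS approach itself is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    rows, cols, n_ref = 128, 128, 4               # 4 non-illuminated reference columns per edge

    bias_drift = rng.normal(0, 20, (rows, 1))     # slowly varying bias shared along each row
    frame = bias_drift + rng.normal(0, 5, (rows, cols))   # every pixel sees drift + white noise
    frame[:, n_ref:cols - n_ref] += 1000          # only active pixels see the (flat) signal

    # Heritage correction: average the left/right reference columns per row and subtract.
    ref_mean = np.hstack([frame[:, :n_ref], frame[:, -n_ref:]]).mean(axis=1, keepdims=True)
    corrected = frame - ref_mean

    active = slice(n_ref, cols - n_ref)
    print("row-wise scatter before:", frame[:, active].mean(axis=1).std().round(2))
    print("row-wise scatter after: ", corrected[:, active].mean(axis=1).std().round(2))
    ```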

  18. Accuracy of metric sex analysis of skeletal remains using Fordisc based on a recent skull collection.

    PubMed

    Ramsthaler, F; Kreutz, K; Verhoff, M A

    2007-11-01

    It has been generally accepted in skeletal sex determination that the use of metric methods is limited due to the population dependence of the multivariate algorithms. The aim of the study was to verify the applicability of software-based sex estimations outside the reference population group for which discriminant equations have been developed. We examined 98 skulls from recent forensic cases of known age, sex, and Caucasian ancestry from cranium collections in Frankfurt and Mainz (Germany) to determine the accuracy of sex determination using the statistical software solution Fordisc which derives its database and functions from the US American Forensic Database. In a comparison between metric analysis using Fordisc and morphological determination of sex, average accuracy for both sexes was 86 vs 94%, respectively, and males were identified more accurately than females. The ratio of the true test result rate to the false test result rate was not statistically different for the two methodological approaches at a significance level of 0.05 but was statistically different at a level of 0.10 (p=0.06). Possible explanations for this difference comprise different ancestry, age distribution, and socio-economic status compared to the Fordisc reference sample. It is likely that a discriminant function analysis on the basis of more similar European reference samples will lead to more valid and reliable sexing results. The use of Fordisc as a single method for the estimation of sex of recent skeletal remains in Europe cannot be recommended without additional morphological assessment and without a built-in software update based on modern European reference samples.
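
    Fordisc derives its discriminant functions from its own forensic reference database, as noted above; purely as a generic illustration of the underlying idea, the sketch below fits a linear discriminant analysis to made-up craniometric measurements. The measurement names, values, and group means are hypothetical.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(11)
    # Made-up craniometric measurements (mm): [max cranial length, bizygomatic breadth, mastoid height]
    males   = rng.normal([185, 132, 32], [6, 5, 3], size=(60, 3))
    females = rng.normal([176, 124, 28], [6, 5, 3], size=(60, 3))

    X = np.vstack([males, females])
    y = np.array(["M"] * 60 + ["F"] * 60)

    lda = LinearDiscriminantAnalysis().fit(X, y)
    new_skull = [[181, 129, 30]]
    print(lda.predict(new_skull), lda.predict_proba(new_skull).round(2))
    # Accuracy of such a function depends on how well the reference sample matches the case population.
    ```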

  19. Image classification at low light levels

    NASA Astrophysics Data System (ADS)

    Wernick, Miles N.; Morris, G. Michael

    1986-12-01

    An imaging photon-counting detector is used to achieve automatic sorting of two image classes. The classification decision is formed on the basis of the cross correlation between a photon-limited input image and a reference function stored in computer memory. Expressions for the statistical parameters of the low-light-level correlation signal are given and are verified experimentally. To obtain a correlation-based system for two-class sorting, it is necessary to construct a reference function that produces useful information for class discrimination. An expression for such a reference function is derived using maximum-likelihood decision theory. Theoretically predicted results are used to compare on the basis of performance the maximum-likelihood reference function with Fukunaga-Koontz basis vectors and average filters. For each method, good class discrimination is found to result in milliseconds from a sparse sampling of the input image.

  20. Comparative evaluation of the accuracy of linear measurements between cone beam computed tomography and 3D microtomography.

    PubMed

    Mangione, Francesca; Meleo, Deborah; Talocco, Marco; Pecci, Raffaella; Pacifici, Luciano; Bedini, Rossella

    2013-01-01

    The aim of this study was to evaluate the influence of artifacts on the accuracy of linear measurements estimated with a common cone beam computed tomography (CBCT) system used in dental clinical practice, by comparing it with a microCT system as the standard reference. Ten bovine bone cylindrical samples, each containing one implant able to provide both points of reference and image-quality degradation, were scanned with the CBCT and microCT systems. Using the software of the two systems, two diameters were measured for each cylindrical sample at different levels, taking different points of the implant as references. Results were analyzed by ANOVA and a statistically significant difference was found. Based on the results obtained, the measurements made with the two instruments cannot yet be considered statistically comparable, although some samples yielded similar performances with differences that were not statistically significant. With improvements in the hardware and software of CBCT systems, the two instruments may provide similar performances in the near future.
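
    A minimal sketch of the statistical comparison described above, on invented diameter measurements; the study reports ANOVA, and a paired test is shown alongside it since both systems measure the same samples.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical diameter measurements (mm) of the same 10 samples by each system.
    microct = np.array([6.02, 6.10, 5.98, 6.05, 6.12, 6.00, 6.08, 5.95, 6.03, 6.07])
    cbct    = np.array([6.15, 6.22, 6.05, 6.20, 6.25, 6.11, 6.19, 6.02, 6.14, 6.21])

    f, p = stats.f_oneway(microct, cbct)           # one-way ANOVA across the two systems
    t, p_paired = stats.ttest_rel(microct, cbct)   # paired alternative, since samples are shared
    print(f"ANOVA: F={f:.2f}, p={p:.4f};  paired t-test: t={t:.2f}, p={p_paired:.4f}")
    ```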

  1. [Anthropometric study and evaluation of the nutritional status of a population school children in Granada; comparison of national and international reference standards].

    PubMed

    González Jiménez, E; Aguilar Cordero, M J; Álvarez Ferre, J; Padilla López, C; Valenza, M C

    2012-01-01

    Recent studies show an alarming increase in levels of overweight and obesity among children and adolescents. The main objectives of this research were the following: (i) to carry out an anthropometric evaluation of the nutritional status and body composition of school children in the city and province of Granada; (ii) to compare the nutritional status of this population sample with national and international reference standards. The results obtained in this study showed that the general prevalence of overweight in both sexes was 22.03% and that 9.12% of the children were obese. Statistically significant differences were found between the variable, weight for age and sex (p < 0.05) and the variable, height for age and sex (p < 0.05). Regarding the body mass index, no statistically significant differences were found for the variable, sex (p = 0.182). This contrasted with the variable, age, which did show statistically significant differences (p < 0.05). As a conclusion, the results of our study highlighted the fact that these anthropometric values were much higher than national and international reference standards.

  2. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM)-the first SM axiomatic theory ever that successfully explained equilibrium thermodynamics-we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  3. Statistical analysis of electromagnetic radiation measurements in the vicinity of GSM/UMTS base station installed on buildings in Serbia.

    PubMed

    Koprivica, Mladen; Slavkovic, Vladimir; Neskovic, Natasa; Neskovic, Aleksandar

    2016-03-01

    As a result of the dense deployment of public mobile base stations, additional electromagnetic (EM) radiation occurs in the modern human environment. At the same time, public concern about exposure to EM radiation emitted by such sources has increased. In order to determine the level of radio frequency radiation generated by base stations, extensive EM field strength measurements were carried out at 664 base station locations, of which 276 involve base stations with antenna systems installed on buildings. Given the large percentage (42%) of locations with installations on buildings, as well as the inevitable presence of people in their vicinity, a detailed analysis of this location category was performed. Measurement results showed that the maximum recorded value of total electric field strength exceeded the International Commission on Non-Ionizing Radiation Protection general public exposure reference levels at 2.5% of locations and the Serbian national reference levels at 15.6% of locations. It should be emphasised that the values exceeding the reference levels were observed only outdoors, while indoors the total electric field strength never exceeded the defined reference levels. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. 34 CFR 647.3 - Who is eligible to participate in a McNair project?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... statistical references or other national survey data submitted to and accepted by the Secretary on a case-by-case basis. (d) Has not enrolled in doctoral level study at an institution of higher education...

  5. Prospects of Fine-Mapping Trait-Associated Genomic Regions by Using Summary Statistics from Genome-wide Association Studies.

    PubMed

    Benner, Christian; Havulinna, Aki S; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ripatti, Samuli; Pirinen, Matti

    2017-10-05

    During the past few years, various novel statistical methods have been developed for fine-mapping with the use of summary statistics from genome-wide association studies (GWASs). Although these approaches require information about the linkage disequilibrium (LD) between variants, there has not been a comprehensive evaluation of how estimation of the LD structure from reference genotype panels performs in comparison with that from the original individual-level GWAS data. Using population genotype data from Finland and the UK Biobank, we show here that a reference panel of 1,000 individuals from the target population is adequate for a GWAS cohort of up to 10,000 individuals, whereas smaller panels, such as those from the 1000 Genomes Project, should be avoided. We also show, both theoretically and empirically, that the size of the reference panel needs to scale with the GWAS sample size; this has important consequences for the application of these methods in ongoing GWAS meta-analyses and large biobank studies. We conclude by providing software tools and by recommending practices for sharing LD information to more efficiently exploit summary statistics in genetics research. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  6. Comparative assessment of a real-time particle monitor against the reference gravimetric method for PM10 and PM2.5 in indoor air

    NASA Astrophysics Data System (ADS)

    Tasić, Viša; Jovašević-Stojanović, Milena; Vardoulakis, Sotiris; Milošević, Novica; Kovačević, Renata; Petrović, Jelena

    2012-07-01

    Accurate monitoring of indoor mass concentrations of particulate matter is very important for health risk assessment, as people in developed countries spend approximately 90% of their time indoors. The direct-reading aerosol monitoring device Turnkey OSIRIS Particle Monitor (Model 2315) and the European reference low-volume sampler LVS3 (Sven/Leckel LVS3) with size-selective inlets for PM10 and PM2.5 fractions were used to assess the comparability of available optical and gravimetric methods for particulate matter characterization in indoor air. Simultaneous 24-hour samples were collected in an indoor environment for 60 sampling periods in the town of Bor, Serbia. The 24-hour mean PM10 levels from the OSIRIS monitor were well correlated with the LVS3 levels (R2 = 0.87) and did not show statistically significant bias. The 24-hour mean PM2.5 levels from the OSIRIS monitor were moderately correlated with the LVS3 levels (R2 = 0.71), but showed statistically significant bias. The results suggest that the OSIRIS monitor provides sufficiently accurate measurements for PM10. The OSIRIS monitor underestimated the indoor PM10 concentrations by approximately 12%, relative to the reference LVS3 sampler. The accuracy of PM10 measurements could be further improved through empirical adjustment. For the fine fraction of particulate matter, PM2.5, it was found that the OSIRIS monitor underestimated indoor concentrations by approximately 63%, relative to the reference LVS3 sampler. This could lead to exposure misclassification in health effects studies relying on PM2.5 measurements collected with this instrument in indoor environments.

  7. Evaluating physical habitat and water chemistry data from statewide stream monitoring programs to establish least-impacted conditions in Washington State

    USGS Publications Warehouse

    Wilmoth, Siri K.; Irvine, Kathryn M.; Larson, Chad

    2015-01-01

    Various GIS-generated land-use predictor variables, physical habitat metrics, and water chemistry variables from 75 reference streams and 351 randomly sampled sites throughout Washington State were evaluated for effectiveness at discriminating reference from random sites within level III ecoregions. A combination of multivariate clustering and ordination techniques was used. We describe average observed conditions for a subset of predictor variables and propose statistical criteria for establishing reference conditions for stream habitat in Washington. Using these criteria, we determined whether any of the random sites met expectations for reference condition and whether any of the established reference sites failed to meet expectations for reference condition. Establishing these criteria will set a benchmark against which future data will be compared.

  8. An Integrative Account of Constraints on Cross-Situational Learning

    PubMed Central

    Yurovsky, Daniel; Frank, Michael C.

    2015-01-01

    Word-object co-occurrence statistics are a powerful information source for vocabulary learning, but there is considerable debate about how learners actually use them. While some theories hold that learners accumulate graded, statistical evidence about multiple referents for each word, others suggest that they track only a single candidate referent. In two large-scale experiments, we show that neither account is sufficient: Cross-situational learning involves elements of both. Further, the empirical data are captured by a computational model that formalizes how memory and attention interact with co-occurrence tracking. Together, the data and model unify opposing positions in a complex debate and underscore the value of understanding the interaction between computational and algorithmic levels of explanation. PMID:26302052

  9. Comparison of ambulatory blood pressure reference standards in children evaluated for hypertension.

    PubMed

    Jones, Deborah P; Richey, Phyllis A; Alpert, Bruce S

    2009-06-01

    The purpose of this study was to systematically compare methods for standardization of blood pressure levels obtained by ambulatory blood pressure monitoring (ABPM) in a group of 111 children studied at our institution. Blood pressure indices, blood pressure loads and standard deviation scores were calculated using the original ABPM and the modified reference standards. Bland-Altman plots and kappa statistics for the level of agreement were generated. Overall, the agreement between the two methods was excellent; however, approximately 5% of children were classified differently by one as compared with the other method. Depending on which version of the German Working Group's reference standards is used for interpretation of ABPM data, the classification of the individual as having hypertension or normal blood pressure may vary.
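
    A compact sketch of the two agreement analyses named above (Bland-Altman limits of agreement and the kappa statistic); the SD scores and the hypertension cutoff are invented for illustration.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(7)
    # Hypothetical blood-pressure SD scores from the original vs modified reference standards.
    sds_original = rng.normal(0.8, 1.0, size=111)
    sds_modified = sds_original + rng.normal(0.05, 0.15, size=111)

    # Bland-Altman: mean bias and 95% limits of agreement
    diff = sds_modified - sds_original
    bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)
    print(f"bias {bias:.2f}, limits of agreement ({bias - loa:.2f}, {bias + loa:.2f})")

    # Agreement of the hypertension classification (SDS >= 1.645, i.e. above the 95th percentile)
    kappa = cohen_kappa_score(sds_original >= 1.645, sds_modified >= 1.645)
    print(f"kappa = {kappa:.2f}")
    ```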

  10. Comparison of ambulatory blood pressure reference standards in children evaluated for hypertension

    PubMed Central

    Jones, Deborah P.; Richey, Phyllis A.; Alpert, Bruce S.

    2009-01-01

    Objective The purpose of this study was to systematically compare methods for standardization of blood pressure levels obtained by ambulatory blood pressure monitoring (ABPM) in a group of 111 children studied at our institution. Methods Blood pressure indices, blood pressure loads and standard deviation scores were calculated using the original ABPM and the modified reference standards. Bland-Altman plots and kappa statistics for the level of agreement were generated. Results Overall, the agreement between the two methods was excellent; however, approximately 5% of children were classified differently by one as compared with the other method. Conclusion Depending on which version of the German Working Group’s reference standards is used for interpretation of ABPM data, the classification of the individual as having hypertension or normal blood pressure may vary. PMID:19433980

  11. Low-level lasers alter mRNA levels from traditional reference genes used in breast cancer cells

    NASA Astrophysics Data System (ADS)

    Teixeira, A. F.; Canuto, K. S.; Rodrigues, J. A.; Fonseca, A. S.; Mencalha, A. L.

    2017-07-01

    Cancer is among the leading causes of mortality worldwide, increasing the importance of treatment development. Low-level lasers are used in several diseases, but some concerns remain regarding cancer. Reverse transcriptase quantitative polymerase chain reaction (RT-qPCR) is a technique used to understand cellular behavior through quantification of mRNA levels. Output data from target genes are commonly relative to a reference that cannot vary according to treatment. This study evaluated reference gene levels from MDA-MB-231 cells exposed to red or infrared lasers at different fluences. Cultures were exposed to red and infrared lasers, incubated (4 h, 37 °C), total RNA was extracted and cDNA synthesis was performed to evaluate mRNA levels from ACTB, GUSB and TRFC genes by RT-qPCR. Specific amplification was verified by melting curves and agarose gel electrophoresis. RefFinder enabled data analysis by geNorm, NormFinder and BestKeeper. Specific amplifications were obtained and, although mRNA levels from ACTB, GUSB or TRFC genes presented no significant variation through traditional statistical analysis, Excel-based tools revealed that the use of these reference genes is dependent on laser characteristics. Our data showed that exposure to low-level red and infrared lasers at different fluences alters the mRNA levels from ACTB, GUSB and TRFC in MDA-MB-231 cells.
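
    Of the three algorithms named above, geNorm is the simplest to sketch: its stability value M for a gene is the mean, over the other candidates, of the standard deviation across samples of the pairwise log-ratio. The sketch below uses invented relative quantities; NormFinder and BestKeeper use different models and are not shown.

    ```python
    import numpy as np

    def genorm_m(expression):
        """geNorm-style stability M for each gene.

        expression : genes x samples array of relative quantities (e.g. 2**-deltaCq).
        M_j = mean over other genes k of the SD across samples of log2(expr_j / expr_k);
        lower M means a more stable reference gene.
        """
        log_expr = np.log2(expression)
        n_genes = expression.shape[0]
        M = np.empty(n_genes)
        for j in range(n_genes):
            others = [k for k in range(n_genes) if k != j]
            M[j] = np.mean([np.std(log_expr[j] - log_expr[k], ddof=1) for k in others])
        return M

    # Hypothetical relative quantities for 3 candidate reference genes in 8 samples.
    rng = np.random.default_rng(3)
    expr = np.vstack([
        2 ** rng.normal(0.0, 0.2, 8),   # "ACTB"
        2 ** rng.normal(0.0, 0.5, 8),   # "GUSB"
        2 ** rng.normal(0.0, 0.3, 8),   # "TRFC"
    ])
    print(dict(zip(["ACTB", "GUSB", "TRFC"], genorm_m(expr).round(2))))
    ```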

  12. [Determination of normal reference value of pyrrole adducts in urine in young people in a university in Shandong, China].

    PubMed

    Wang, Hui; Wang, Yiping; Zhou, Zhenwei; Wang, Shuo; Yin, Hongyin; Xie, Keqin

    2015-06-01

    To determine the normal reference value of pyrrole adducts in urine in young people in a university in Shandong, China, and to provide a reliable basis for the clinical diagnosis of n-hexane poisoning. A total of 240 college students were randomly selected. After excluding 32 ineligible students, 208 subjects were included in this study, consisting of 104 males and 104 females, with a mean age of 21 ± 3 years (range: 18 to 24 years). Morning urine was collected from each subject. The content of pyrrole adducts was determined by chromatometry. The content of pyrrole adducts in both males and females followed a positively skewed distribution. The median level of pyrrole adducts in male subjects was 0.88 nmol/ml, and the reference value was 0.14-3.92 nmol/ml. The median level of pyrrole adducts in female subjects was 0.93 nmol/ml, and the reference value was 0.09-3.27 nmol/ml. Student's t test identified no statistically significant difference in pyrrole adduct level between male and female subjects (t=0.15, P>0.05). The median level of pyrrole adducts in normal young people is 0.91 nmol/ml, and the reference value is 0.11-3.95 nmol/ml.

  13. Differential item functioning magnitude and impact measures from item response theory models.

    PubMed

    Kleinman, Marjorie; Teresi, Jeanne A

    2016-01-01

    Measures of magnitude and impact of differential item functioning (DIF) at the item and scale level, respectively, are presented and reviewed in this paper. Most measures are based on item response theory models. Magnitude refers to item-level effect sizes, whereas impact refers to differences between groups at the scale score level. Reviewed are magnitude measures based on group differences in the expected item scores and impact measures based on differences in the expected scale scores. The similarities among these indices are demonstrated. Various software packages that provide magnitude and impact measures are described, and new software is presented that computes all of the available statistics conveniently in one program with explanations of their relationships to one another.

  14. Hearing Tests on Mobile Devices: Evaluation of the Reference Sound Level by Means of Biological Calibration.

    PubMed

    Masalski, Marcin; Kipiński, Lech; Grysiński, Tomasz; Kręcicki, Tomasz

    2016-05-30

    Hearing tests carried out in home setting by means of mobile devices require previous calibration of the reference sound level. Mobile devices with bundled headphones create a possibility of applying the predefined level for a particular model as an alternative to calibrating each device separately. The objective of this study was to determine the reference sound level for sets composed of a mobile device and bundled headphones. Reference sound levels for Android-based mobile devices were determined using an open access mobile phone app by means of biological calibration, that is, in relation to the normal-hearing threshold. The examinations were conducted in 2 groups: an uncontrolled and a controlled one. In the uncontrolled group, the fully automated self-measurements were carried out in home conditions by 18- to 35-year-old subjects, without prior hearing problems, recruited online. Calibration was conducted as a preliminary step in preparation for further examination. In the controlled group, audiologist-assisted examinations were performed in a sound booth, on normal-hearing subjects verified through pure-tone audiometry, recruited offline from among the workers and patients of the clinic. In both the groups, the reference sound levels were determined on a subject's mobile device using the Bekesy audiometry. The reference sound levels were compared between the groups. Intramodel and intermodel analyses were carried out as well. In the uncontrolled group, 8988 calibrations were conducted on 8620 different devices representing 2040 models. In the controlled group, 158 calibrations (test and retest) were conducted on 79 devices representing 50 models. Result analysis was performed for 10 most frequently used models in both the groups. The difference in reference sound levels between uncontrolled and controlled groups was 1.50 dB (SD 4.42). The mean SD of the reference sound level determined for devices within the same model was 4.03 dB (95% CI 3.93-4.11). Statistically significant differences were found across models. Reference sound levels determined in the uncontrolled group are comparable to the values obtained in the controlled group. This validates the use of biological calibration in the uncontrolled group for determining the predefined reference sound level for new devices. Moreover, due to a relatively small deviation of the reference sound level for devices of the same model, it is feasible to conduct hearing screening on devices calibrated with the predefined reference sound level.

  15. Statistical results from the Virginia Tech propagation experiment using the Olympus 12, 20, and 30 GHz satellite beacons

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.; Safaai-Jazi, A.; Pratt, Timothy; Nelson, B.; Laster, J.; Ajaz, H.

    1993-01-01

    Virginia Tech has performed a comprehensive propagation experiment using the Olympus satellite beacons at 12.5, 19.77, and 29.66 GHz (which we refer to as 12, 20, and 30 GHz). Four receive terminals were designed and constructed, one terminal at each frequency plus a portable one with 20 and 30 GHz receivers for microscale and scintillation studies. Total power radiometers were included in each terminal in order to set the clear air reference level for each beacon and also to predict path attenuation. More details on the equipment and the experiment design are found elsewhere. Statistical results for one year of data collection were analyzed. In addition, the following studies were performed: a microdiversity experiment in which two closely spaced 20 GHz receivers were used; a comparison of total power and Dicke switched radiometer measurements, frequency scaling of scintillations, and adaptive power control algorithm development. Statistical results are reported.

  16. Low-level lasers and mRNA levels of reference genes used in Escherichia coli

    NASA Astrophysics Data System (ADS)

    Teixeira, A. F.; Machado, Y. L. R. C.; Fonseca, A. S.; Mencalha, A. L.

    2016-11-01

    Low-level lasers are widely used for the treatment of diseases and antimicrobial photodynamic therapy. Reverse transcriptase quantitative polymerase chain reaction (RT-qPCR) is widely used to evaluate mRNA levels, and output data from a target gene are commonly relative to a reference mRNA that cannot vary according to treatment. In this study, the mRNA levels of reference genes from Escherichia coli exposed to red or infrared lasers at different fluences were evaluated. E. coli AB1157 cultures were exposed to red (660 nm) and infrared (808 nm) lasers, incubated (20 min, 37 °C), the total RNA was extracted, and cDNA synthesis was performed to evaluate mRNA levels from arcA, gyrA and rpoA genes by RT-qPCR. Melting curves and agarose gel electrophoresis were carried out to evaluate specific amplification. Data were analyzed by geNorm, NormFinder and BestKeeper. The melting curve and agarose gel electrophoresis showed specific amplification. Although mRNA levels from arcA, gyrA or rpoA genes presented no significant variations through a traditional statistical analysis, Excel-based tools revealed that these reference genes are not suitable for E. coli cultures exposed to lasers. Our data showed that exposure to low-level red and infrared lasers at different fluences alters the mRNA levels from arcA, gyrA and rpoA in E. coli cells.

  17. Statistical learning and language acquisition

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2011-01-01

    Human learners, including infants, are highly sensitive to structure in their environment. Statistical learning refers to the process of extracting this structure. A major question in language acquisition in the past few decades has been the extent to which infants use statistical learning mechanisms to acquire their native language. There have been many demonstrations showing infants’ ability to extract structures in linguistic input, such as the transitional probability between adjacent elements. This paper reviews current research on how statistical learning contributes to language acquisition. Current research is extending the initial findings of infants’ sensitivity to basic statistical information in many different directions, including investigating how infants represent regularities, learn about different levels of language, and integrate information across situations. These current directions emphasize studying statistical language learning in context: within language, within the infant learner, and within the environment as a whole. PMID:21666883
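
    The transitional probability between adjacent elements mentioned above is simply count(A followed by B) / count(A). A tiny illustration on a made-up syllable stream:

    ```python
    from collections import Counter

    # A made-up syllable stream from an artificial "language" with words like pa-bi-ku, ti-bu-do.
    stream = "pa bi ku ti bu do pa bi ku go la tu ti bu do pa bi ku".split()

    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])

    def transitional_probability(a, b):
        """TP(b | a) = count(a followed by b) / count(a)."""
        return pair_counts[(a, b)] / first_counts[a]

    print(transitional_probability("pa", "bi"))  # within-word transition -> high (1.0 here)
    print(transitional_probability("ku", "ti"))  # across a word boundary -> lower
    ```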

  18. Validation of Reference Genes for Real-Time Quantitative PCR (qPCR) Analysis of Avibacterium paragallinarum.

    PubMed

    Wen, Shuxiang; Chen, Xiaoling; Xu, Fuzhou; Sun, Huiling

    2016-01-01

    Real-time quantitative reverse transcription PCR (qRT-PCR) offers a robust method for measurement of gene expression levels. Selection of reliable reference gene(s) for gene expression studies helps reduce variation derived from different amounts of RNA and cDNA and from the efficiency of the reverse transcriptase or polymerase enzymes. Until now, reference genes identified for other members of the family Pasteurellaceae have not been validated for Avibacterium paragallinarum. The aim of this study was to validate nine reference genes of serovars A, B, and C strains of A. paragallinarum in different growth phases by qRT-PCR. Three of the most widely used statistical algorithms, geNorm, NormFinder and the ΔCT method, were used to evaluate the expression stability of reference genes. Data analyzed by overall rankings showed that in the exponential and stationary phases of serovar A, the most stable reference genes were gyrA and atpD, respectively; in the exponential and stationary phases of serovar B, the most stable reference genes were atpD and recN, respectively; in the exponential and stationary phases of serovar C, the most stable reference genes were rpoB and recN, respectively. This study provides recommendations for stable endogenous control genes for use in further studies involving measurement of gene expression levels.

  19. Using Electronic Data Interchange to Report Product Quality

    DTIC Science & Technology

    1993-03-01

    [The search excerpt for this record is a garbled fragment of an EDI transaction-set segment table; the recoverable entries list segments such as MEA (Measurements), DTM (Date/Time Reference), REF (Reference Numbers), SPS (Sampling Parameters for Summary Statistics), and STA (Statistics), each with optional use and maximum-use counts.]

  20. Reliability, reference values and predictor variables of the ulnar sensory nerve in disease free adults.

    PubMed

    Ruediger, T M; Allison, S C; Moore, J M; Wainner, R S

    2014-09-01

    The purposes of this descriptive and exploratory study were to examine electrophysiological measures of ulnar sensory nerve function in disease free adults to determine reliability, determine reference values computed with appropriate statistical methods, and examine predictive ability of anthropometric variables. Antidromic sensory nerve conduction studies of the ulnar nerve using surface electrodes were performed on 100 volunteers. Reference values were computed from optimally transformed data. Reliability was computed from 30 subjects. Multiple linear regression models were constructed from four predictor variables. Reliability was greater than 0.85 for all paired measures. Responses were elicited in all subjects; reference values for sensory nerve action potential (SNAP) amplitude from above elbow stimulation are 3.3 μV and decrement across-elbow less than 46%. No single predictor variable accounted for more than 15% of the variance in the response. Electrophysiologic measures of the ulnar sensory nerve are reliable. Absent SNAP responses are inconsistent with disease free individuals. Reference values recommended in this report are based on appropriate transformations of non-normally distributed data. No strong statistical model of prediction could be derived from the limited set of predictor variables. Reliability analyses combined with relatively low level of measurement error suggest that ulnar sensory reference values may be used with confidence. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  1. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.

  2. Serum reference interval of ARCHITECT alpha-fetoprotein in healthy Chinese Han adults: Sub-analysis of a prospective multi-center study.

    PubMed

    Yan, Cunling; Yang, Jia; Wei, Lianhua; Hu, Jian; Song, Jiaqi; Wang, Xiaoqin; Han, Ruilin; Huang, Ying; Zhang, Wei; Soh, Andrew; Beshiri, Agim; Fan, Zhuping; Zheng, Yijie; Chen, Wei

    2018-02-01

    Alpha-fetoprotein (AFP) has been widely used in clinical practice for decades. However, a large-scale survey of the serum reference interval for ARCHITECT AFP is still absent in the Chinese population. This study aimed to measure serum AFP levels in healthy Chinese Han subjects, and is a sub-analysis of an ongoing prospective, cross-sectional, multi-center study (ClinicalTrials.gov Identifier: NCT03047603). This analysis included a total of 530 participants (41.43 ± 12.14 years of age on average, 48.49% males), enrolled from 5 regional centers. Serum AFP level was measured by ARCHITECT immunoassay. Statistical analysis was performed using SAS 9.4 and R software. AFP distribution did not show significant correlation with age or sex. The overall median and interquartile range of AFP was 2.87 (2.09, 3.83) ng/mL. AFP level did not show a trend of increasing with age. The new reference interval was 2.0-7.07 ng/mL (LOQ to 97.5th percentile). The reference interval for ARCHITECT AFP is updated with data from an adequate number of healthy Han adults. This new reference interval is more practical and applicable in Chinese adults. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  3. New statistical potential for quality assessment of protein models and a survey of energy functions

    PubMed Central

    2010-01-01

    Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and of other features of scoring functions, such as using information on solvent accessibility or torsion angles and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein, as proposed in this work, can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms, we observed a critical role for a proper reference state definition and benefits from including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
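
    The record describes a residue-level knowledge-based potential built against a reference state. The sketch below shows only the generic inverse-Boltzmann form E = -ln(f_obs/f_ref) with a simple composition-based reference state; it is not the authors' shuffled-reference-state potential, and the contact counts are toy data.

    ```python
    # Generic inverse-Boltzmann statistical potential from pair-contact counts.
    # Not the authors' shuffled-reference-state potential; ordered pairs and a
    # mole-fraction reference state are used here purely for brevity.
    import math
    from collections import Counter

    observed = Counter({("ALA", "LEU"): 120, ("ALA", "ASP"): 40, ("LEU", "LEU"): 200})  # toy counts
    total_obs = sum(observed.values())
    composition = {"ALA": 0.35, "LEU": 0.45, "ASP": 0.20}  # toy residue composition

    def pair_energy(a, b, pseudocount=1e-3):
        f_obs = observed.get((a, b), 0) / total_obs + pseudocount
        f_ref = composition[a] * composition[b] + pseudocount   # reference-state frequency
        return -math.log(f_obs / f_ref)   # favorable contacts get negative energies

    for pair in observed:
        print(pair, round(pair_energy(*pair), 3))
    ```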

  4. Research study on neutral thermodynamic atmospheric model. [for space shuttle mission and abort trajectory

    NASA Technical Reports Server (NTRS)

    Hargraves, W. R.; Delulio, E. B.; Justus, C. G.

    1977-01-01

    The Global Reference Atmospheric Model is used along with the revised perturbation statistics to evaluate and generate computer graphs of various atmospheric statistics along a space shuttle reference mission and abort trajectory. The trajectory plots are height vs. ground range, with height from ground level to 155 km and ground range along the reentry trajectory. Cross-sectional plots, height vs. latitude or longitude, are also generated for 80 deg longitude, with heights from 30 km to 90 km and latitudes from -90 deg to +90 deg, and for 45 deg latitude, with heights from 30 km to 90 km and longitudes from 180 deg E to 180 deg W. The variables plotted are monthly average pressure, density, temperature, wind components, and wind speed, together with standard deviations and the 99th inter-percentile range for each of these variables.

  5. 40 CFR 430.03 - Best management practices (BMPs) for spent pulping liquor, soap, and turpentine management, spill...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... triggers investigative or corrective action. Mills determine action levels by a statistical analysis of six... exchanger, recovery furnace or boiler, pipeline, valve, fitting, or other device that contains, processes... gases from the cooking of softwoods by the kraft pulping process. Sometimes referred to as sulfate...

  6. Assessment of tolerant sunfish populations (Lepomis sp.) inhabiting selenium-laden coal ash effluents. 1. Hematological and population level assessment.

    PubMed

    Lohner, T W; Reash, R J; Willet, V E; Rose, L A

    2001-11-01

    Sunfish were collected from coal ash effluent-receiving streams and Ohio River watershed reference sites to assess the effects of exposure to low-level selenium concentrations. Selenium, copper, and arsenic concentrations were statistically higher in tissue samples from exposed fish than in reference fish. Leukopenia, lymphocytosis, and neutropenia were evident in exposed fish and were indicative of metal exposure and effect. White blood cell counts and percent lymphocyte values were significantly correlated with liver selenium concentrations. Plasma protein levels were significantly lower in exposed fish than in fish from the Ohio River, indicating that exposed fish may have been nutritionally stressed. Condition factors for fish from the ash pond-receiving streams were the same as, or lower than, those of fish from the reference sites. There was no evidence that the growth rate of fish in the receiving streams differed from that of fish in the reference streams. Despite liver selenium concentrations which exceeded reported toxicity thresholds and evidence of significant hematological changes, there were no significant differences in fish condition factors, liver-somatic indices, or length-weight regressions related to selenium.

  7. Thematic accuracy of the 1992 National Land-Cover Data for the eastern United States: Statistical methodology and regional results

    USGS Publications Warehouse

    Stehman, S.V.; Wickham, J.D.; Smith, J.H.; Yang, L.

    2003-01-01

    The accuracy of the 1992 National Land-Cover Data (NLCD) map is assessed via a probability sampling design incorporating three levels of stratification and two stages of selection. Agreement between the map and reference land-cover labels is defined as a match between the primary or alternate reference label determined for a sample pixel and a mode class of the mapped 3×3 block of pixels centered on the sample pixel. Results are reported for each of the four regions comprising the eastern United States for both Anderson Level I and II classifications. Overall accuracies for Levels I and II are 80% and 46% for New England, 82% and 62% for New York/New Jersey (NY/NJ), 70% and 43% for the Mid-Atlantic, and 83% and 66% for the Southeast.
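
    A small sketch of the agreement rule stated above (a sample pixel counts as correct if its primary or alternate reference label matches a mode class of the mapped 3×3 block centered on it); the function name and land-cover codes are illustrative.

    ```python
    # Agreement rule described above: match between the primary or alternate
    # reference label and a mode class of the mapped 3x3 block. Toy class codes.
    from collections import Counter
    import numpy as np

    def agrees(map_block_3x3, primary, alternate):
        counts = Counter(np.asarray(map_block_3x3).ravel().tolist())
        top = max(counts.values())
        modes = {cls for cls, n in counts.items() if n == top}
        return primary in modes or alternate in modes

    block = [[41, 41, 42],
             [41, 43, 41],
             [42, 41, 41]]                               # hypothetical Anderson Level II codes
    print(agrees(block, primary=41, alternate=43))       # True: 41 is the mode class
    ```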

  8. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment

    PubMed Central

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.

    2014-01-01

    Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:24990607
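
    A conceptual sketch of Gaussian imputation of association z-scores from reference-panel linkage disequilibrium, with a small ridge term standing in for the panel's finite sample size. The ridge value, toy LD matrices and function name are illustrative assumptions; this is not the authors' released software.

    ```python
    # Conceptual Gaussian imputation of z-scores at untyped SNPs from typed SNPs,
    # using a reference-panel LD (correlation) matrix. Illustration only.
    import numpy as np

    def impute_z(z_typed, ld_typed_typed, ld_untyped_typed, ridge=0.1):
        """z_typed: (t,) z-scores at typed SNPs;
        ld_typed_typed: (t, t) LD among typed SNPs;
        ld_untyped_typed: (u, t) LD between untyped and typed SNPs."""
        sigma_tt = ld_typed_typed + ridge * np.eye(len(z_typed))   # ridge ~ finite panel size
        weights = ld_untyped_typed @ np.linalg.inv(sigma_tt)
        z_imputed = weights @ z_typed
        r2 = np.einsum("ij,ij->i", weights, ld_untyped_typed)      # rough per-SNP imputation quality
        return z_imputed, r2

    # Toy example: two typed SNPs in moderate LD, one untyped SNP correlated with both.
    z_t = np.array([3.0, 2.5])
    ld_tt = np.array([[1.0, 0.6], [0.6, 1.0]])
    ld_ut = np.array([[0.8, 0.7]])
    print(impute_z(z_t, ld_tt, ld_ut))
    ```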

  9. Serum level of vitamin D3 in cutaneous melanoma

    PubMed Central

    de Oliveira, Renato Santos; de Oliveira, Daniel Arcuschin; Martinho, Vitor Augusto Melão; Antoneli, Célia Beatriz Gianotti; Marcussi, Ludmilla Altino de Lima; Ferreira, Carlos Eduardo dos Santos

    2014-01-01

    Objective To compare the level of vitamin D3 in cutaneous melanoma patients, with or without disease activity, with reference values and with patients from a general hospital. Methods The serum levels of vitamin D3 were measured in cutaneous melanoma patients of both genders, aged 20 to 88 years, from January 2010 to December 2013. The samples from the general group were processed at Hospital Israelita Albert Einstein (control group). Data analysis was performed using the Statistics software. Results A total of 100 patients were studied, 54 of them men, with a mean age of 54.67 years; 95 were Caucasian. Of these 100 patients, 17 had active disease. The average levels of vitamin D3 in the melanoma patients were lower than the level considered sufficient, but above the average of the control group. Both groups of patients (with or without active disease) showed a similar distribution of vitamin D3 deficiency. Conclusion Vitamin D3 levels in melanoma patients were higher than those of general patients and lower than the reference level. If the reference values are appropriate, a large part of the population, including those with melanoma, had insufficient levels of vitamin D; otherwise, this standard needs to be reevaluated. No difference in vitamin D3 levels was found between melanoma patients with or without active disease. More comprehensive research is needed to assess the relation between vitamin D and melanoma. PMID:25628199

  10. Lower incisor inclination regarding different reference planes.

    PubMed

    Zataráin, Brenda; Avila, Josué; Moyaho, Angeles; Carrasco, Rosendo; Velasco, Carmen

    2016-09-01

    The purpose of this study was to assess the degree of lower incisor inclination with respect to different reference planes. It was an observational, analytical, longitudinal, prospective study conducted on 100 lateral cephalograms, which were corrected according to the photograph in natural head position in order to draw the true vertical plane (TVP). The incisor mandibular plane angle (IMPA) was compensated to eliminate the variation of the mandibular plane growth type with the formula "FMA px. - 25 (FMA) + IMPA px. = compensated IMPA (IMPACOM)". As the data followed a normal distribution as determined by the Kolmogorov-Smirnov test, parametric tests were used for the statistical analysis: t-test, ANOVA and the Pearson coefficient correlation test. Statistical analysis was performed using a statistical significance level of p < 0.05. There is correlation between TVP and the NB line (NB) (0.8614), the Frankfort mandibular incisor angle (FMIA) (0.8894), IMPA (0.6351), the APo line (APo) (0.609), IMPACOM (0.8895) and the McHorris angle (MH) (0.7769). ANOVA showed statistically significant differences between the means for the 7 variables at the 95% confidence level, P=0.0001. The multiple range test showed no significant difference between the means of the pairs APo-NB (0.88), IMPA-MH (0.36), IMPA-NB (0.65), FMIA-IMPACOM (0.01), FMIA-TVP (0.18), and TVP-IMPACOM (0.17). There was correlation among all reference planes. There were statistically significant differences among the means of the planes measured, except for IMPACOM, FMIA and TVP. The IMPA differed significantly from the IMPACOM. The compensated IMPA and the FMIA did not differ significantly from the TVP. The true horizontal plane was mismatched with the Frankfort plane in 84% of the sample, with a range of 19°. The true vertical plane is adequate for measuring lower incisor inclination. Sociedad Argentina de Investigación Odontológica.

  11. The unrealized promise of infant statistical word-referent learning

    PubMed Central

    Smith, Linda B.; Suanda, Sumarga H.; Yu, Chen

    2014-01-01

    Recent theory and experiments offer a new solution as to how infant learners may break into word learning, by using cross-situational statistics to find the underlying word-referent mappings. Computational models demonstrate the in-principle plausibility of this statistical learning solution and experimental evidence shows that infants can aggregate and make statistically appropriate decisions from word-referent co-occurrence data. We review these contributions and then identify the gaps in current knowledge that prevent a confident conclusion about whether cross-situational learning is the mechanism through which infants break into word learning. We propose an agenda to address that gap that focuses on detailing the statistics in the learning environment and the cognitive processes that make use of those statistics. PMID:24637154

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fresquez, Philip R.

    Field mice are effective indicators of contaminant presence. This paper reports the concentrations of various radionuclides, heavy metals, polychlorinated biphenyls, high explosives, perchlorate, and dioxins/furans in field mice (mostly deer mice) collected from regional background areas in northern New Mexico. These data, represented as the regional statistical reference level (the mean plus three standard deviations = 99% confidence level), are used for comparison with data from field mice collected in areas potentially impacted by Laboratory operations, as part of the Environmental Surveillance Program at Los Alamos National Laboratory.
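
    The regional statistical reference level defined above is simple to reproduce; the sketch below applies the stated mean-plus-three-standard-deviations rule to hypothetical background concentrations.

    ```python
    # Regional statistical reference level = mean + 3 standard deviations of the
    # regional background data, as stated above. Concentrations are hypothetical.
    import numpy as np

    background_ng_g = np.array([0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2])   # toy background analyte data
    rsrl = background_ng_g.mean() + 3 * background_ng_g.std(ddof=1)
    print(f"Regional statistical reference level = {rsrl:.2f} ng/g")
    ```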

  13. [Reference values for erythrocyte cholinesterase activity in the working population of Antioquia, Colombia, according to the Michel and EQM techniques].

    PubMed

    Carmona-Fonseca, Jaime

    2003-11-01

    To establish reference values for erythrocyte cholinesterase (EC 3.1.1.7) activity for the active working population of two regions of the department of Antioquia, Colombia, that are located at different altitudes above sea level. We took representative samples from populations of active working persons 18 to 59 years old from two regions in the department of Antioquia: (1) the Aburrá Valley (1 540 m above sea level) and (2) the near east of the department (2 150 m above sea level). We excluded workers who were using cholinesterase-inhibiting substances in their work or at home, those who had a disease that altered their cholinesterase levels, and those who said they were not in good health. We measured the erythrocyte cholinesterase activity using two methods: (1) the Michel method and (2) the EQM method (EQM Research, Cincinnati, Ohio, United States of America). We carried out the measurements with 827 people, 415 from the Aburrá Valley and 412 from the near east region. We compared proportions using the chi-square test and Fisher's exact test. We utilized the Student's t test for independent samples to compare two averages. To simultaneously compare three or more averages, analysis of variance was used, followed by the Newman-Keuls multiple-range test. When the variables were not normally distributed or when the variances were not homogeneous, Kruskal-Wallis nonparametric analysis of variance was used to compare the medians. Three computer software programs were used in the statistical analysis: SPSS 9.0, SGPlus 7.1, and Epi Info 6.04. In all the statistical tests the level of significance was set at P < 0.05. The average erythrocyte cholinesterase activity value that we found for the studied population by using the Michel method was 0.857 delta pH/hour (95% confidence interval (CI): 0.849 to 0.866), and the average value found through the EQM method was 35.21 U/g hemoglobin (95% CI: 34.82 to 35.60). With the Michel method: (1) the enzymatic activity differed significantly between the two regions, according to the Newman-Keuls test; (2) within each region, the enzymatic activity was significantly higher among males than among females, according to the Newman-Keuls test; and (3) in none of the region-sex strata was there a statistically significant influence of age on the enzymatic activity. Using the EQM method, there were no statistically significant differences by region, sex, or age group. The erythrocyte cholinesterase activity values found by the two analytical techniques were significantly higher than the values from outside Colombia that are now being used as reference values in the country, which poses both clinical and epidemiological problems. We recommend that the data from this study be adopted as the reference values in Colombia.

  14. High levels of anxiety and depression in diabetic patients with Charcot foot.

    PubMed

    Chapman, Zahra; Shuttleworth, Charles Matthew James; Huber, Jörg Wolfgang

    2014-01-01

    Charcot foot is a rare but devastating complication of diabetes. Little research is available on the mental health impact of Charcot foot. The aim of the study is to assess mental health in diabetes patients with Charcot foot and to investigate the moderating effects of socio-demographic factors; the severity of the problem is evaluated statistically with the help of a reference data set. Cross-sectional questionnaire data using the Hospital Anxiety and Depression Scale (HADS) and demographic background were collected from 50 patients with diabetes and Charcot complications (males 62%; mean age 62.2 ± 8.5 years). Statistical comparisons with a large data set of general diabetes patients acting as a point of reference were carried out. Anxiety and depression levels were high (anxiety and depression scores 6.4 ± 4 and 6.3 ± 3.6, respectively). Females reported more severe anxiety and depression. Ethnic minorities and patients out of work reported more severe anxiety. Comparisons with published HADS data indicate that diabetes patients with Charcot foot experience more serious levels of anxiety and depression. The high levels of mental health problems found in this study in diabetes patients with Charcot foot require recognition by researchers and clinicians. The findings imply the need to screen for mental health problems in diabetes patients with Charcot foot.

  15. High levels of anxiety and depression in diabetic patients with Charcot foot

    PubMed Central

    2014-01-01

    Background/aims Charcot foot is a rare but devastating complication of diabetes. Little research is available on the mental health impact of Charcot foot. The aim of the study is to assess mental health in diabetes patients with Charcot foot and to investigate the moderating effects of socio-demographic factors; the severity of the problem is evaluated statistically with the help of a reference data set. Methods Cross-sectional questionnaire data using the Hospital Anxiety and Depression Scale (HADS) and demographic background were collected from 50 patients with diabetes and Charcot complications (males 62%; mean age 62.2 ± 8.5 years). Statistical comparisons with a large data set of general diabetes patients acting as a point of reference were carried out. Results Anxiety and depression levels were high (anxiety and depression scores 6.4 ± 4 and 6.3 ± 3.6, respectively). Females reported more severe anxiety and depression. Ethnic minorities and patients out of work reported more severe anxiety. Comparisons with published HADS data indicate that diabetes patients with Charcot foot experience more serious levels of anxiety and depression. Conclusions The high levels of mental health problems found in this study in diabetes patients with Charcot foot require recognition by researchers and clinicians. The findings imply the need to screen for mental health problems in diabetes patients with Charcot foot. PMID:24650435

  16. Level statistics of words: Finding keywords in literary texts and symbolic sequences

    NASA Astrophysics Data System (ADS)

    Carpena, P.; Bernaola-Galván, P.; Hackenberg, M.; Coronado, A. V.; Oliver, J. L.

    2009-03-01

    Using a generalization of the level statistics analysis of quantum disordered systems, we present an approach able to extract automatically keywords in literary texts. Our approach takes into account not only the frequencies of the words present in the text but also their spatial distribution along the text, and is based on the fact that relevant words are significantly clustered (i.e., they self-attract each other), while irrelevant words are distributed randomly in the text. Since a reference corpus is not needed, our approach is especially suitable for single documents for which no a priori information is available. In addition, we show that our method works also in generic symbolic sequences (continuous texts without spaces), thus suggesting its general applicability.
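
    A simplified stand-in for the level-statistics idea described above: for each sufficiently frequent word, the fluctuation of the gaps between its successive occurrences is compared with the value near 1 expected for a randomly placed word, so strongly clustered (self-attracting) words score high. The scoring function and toy text are illustrative, not the authors' exact statistic.

    ```python
    # Spacing-based keyword scoring: the coefficient of variation of the gaps
    # between successive occurrences of a word is ~1 for randomly placed words
    # and noticeably larger for clustered words. Simplified illustration only.
    import re
    from collections import defaultdict
    import numpy as np

    def keyword_scores(text, min_count=3):
        tokens = re.findall(r"[a-z]+", text.lower())
        positions = defaultdict(list)
        for i, tok in enumerate(tokens):
            positions[tok].append(i)
        scores = {}
        for word, pos in positions.items():
            if len(pos) < min_count:
                continue
            gaps = np.diff(pos)
            scores[word] = gaps.std(ddof=1) / gaps.mean()   # normalized spacing fluctuation
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    filler = "alpha beta gamma delta epsilon zeta "
    sample = filler * 20 + "laser laser laser " + filler * 20 + "laser laser laser " + filler * 20
    print(keyword_scores(sample)[:3])   # "laser" (clustered in two bursts) ranks first
    ```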

  17. Combined dietary and exercise intervention for control of serum cholesterol in the workplace

    NASA Technical Reports Server (NTRS)

    Angotti, C. M.; Chan, W. T.; Sample, C. J.; Levine, M. S.

    2000-01-01

    PURPOSE: To elucidate the potential effect of a combined dietary and exercise intervention on cardiovascular risk reduction among National Aeronautics and Space Administration Headquarters employees. DESIGN: A nonexperimental, longitudinal, clinical-chart review study (1987 to 1996) of an identified intervention group and a reference (not a control) group. SETTING: The study group worked in an office environment and participated in the annual medical examinations. SUBJECTS: An intervention group of 858 people with initially elevated serum cholesterol, and a reference group of 963 people randomly sampled from 10% of the study group. MEASURES: Serum cholesterol data were obtained for both groups from pre- and postintervention and annual examinations. The reference group was adjusted by statistical exclusion of potential intervention participants. Regression equations (cholesterol vs. study years) for the unadjusted and adjusted reference groups were tested for statistical significance. INTERVENTION: An 8-week individualized, combined dietary and exercise program was instituted with annual follow-ups and was repeated where warranted. RESULTS: Only the unadjusted (but not the adjusted) reference group with initial mean total serum cholesterol levels above 200 mg/dL showed a significant 9-year declining trend and significant beta coefficient tests, suggesting an intervention effect. Mean high-density lipoprotein cholesterol rose slightly in the intervention group but was maintained in the reference group. CONCLUSION: Despite potential design limitations, the NASA intervention program focusing on a high-risk group may be associated, to some degree if not fully, with an overall improvement in the cardiovascular risk profile.

  18. THEMATIC ACCURACY OF THE 1992 NATIONAL LAND-COVER DATA (NLCD) FOR THE EASTERN UNITED STATES: STATISTICAL METHODOLOGY AND REGIONAL RESULTS

    EPA Science Inventory

    The accuracy of the National Land Cover Data (NLCD) map is assessed via a probability sampling design incorporating three levels of stratification and two stages of selection. Agreement between the map and reference land-cover labels is defined as a match between the primary or a...

  19. Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course

    ERIC Educational Resources Information Center

    Asamoah, Daniel Adomako; Sharda, Ramesh; Hassan Zadeh, Amir; Kalgotra, Pankush

    2017-01-01

    In this article, we present an experiential perspective on how a big data analytics course was designed and delivered to students at a major Midwestern university. In reference to the "MSIS 2006 Model Curriculum," we designed this course as a level 2 course, with prerequisites in databases, computer programming, statistics, and data…

  20. A new item response theory model to adjust data allowing examinee choice

    PubMed Central

    Costa, Marcelo Azevedo; Braga Oliveira, Rivert Paulo

    2018-01-01

    In a typical questionnaire testing situation, examinees are not allowed to choose which items they answer because of a technical issue in obtaining satisfactory statistical estimates of examinee ability and item difficulty. This paper introduces a new item response theory (IRT) model that incorporates information from a novel representation of questionnaire data using network analysis. Three scenarios in which examinees select a subset of items were simulated. In the first scenario, the assumptions required to apply the standard Rasch model are met, thus establishing a reference for parameter accuracy. The second and third scenarios include five increasing levels of violating those assumptions. The results show substantial improvements over the standard model in item parameter recovery. Furthermore, the accuracy was closer to the reference in almost every evaluated scenario. To the best of our knowledge, this is the first proposal to obtain satisfactory IRT statistical estimates in the last two scenarios. PMID:29389996

  1. Assessment of tolerant sunfish populations (Lepomis sp.) inhabiting selenium-laden coal ash effluents. 3. Serum chemistry and fish health indicators.

    PubMed

    Lohner, T W; Reash, R J; Willet, V E; Fletcher, J

    2001-11-01

    Sunfish were collected from fly ash discharge-receiving streams to assess the possible effects of exposure to elevated selenium. Concentrations of selenium, copper, and arsenic were statistically higher in fish tissue (liver) samples from effluent-exposed fish than in reference fish. Several biomarkers were indicative of metal exposure and effect. Plasma protein levels and cholesterol levels were significantly lower in exposed fish, indicating nutritional stress. Ion levels (i.e., K) increased with exposure to ash pond metals, indicating possible gill damage. Fish from the receiving streams also had increased serum glucose and osmolality indicating possible acute stress due to sampling. Fish health assessments revealed a lower incidence of fin erosion, kidney discoloration, urolithiasis or nephrocalcinosis, liver discoloration, and parasites in exposed fish and a higher incidence of skin, eye, and gill aberrations. Condition factors of exposed fish were correlated with biomarker response and were the same as or lower than those of reference fish, but not related to selenium levels. Although several serum biochemical indicators differed between the ash pond-receiving stream and reference sites, pollutant exposure was apparently not sufficient to cause functional damage to critical organ systems.

  2. Injury profiles related to mortality in patients with a low Injury Severity Score: a case-mix issue?

    PubMed

    Joosse, Pieter; Schep, Niels W L; Goslings, J Carel

    2012-07-01

    Outcome prediction models are widely used to evaluate trauma care. External benchmarking provides individual institutions with a tool to compare survival with a reference dataset. However, these models do have limitations. In this study, the hypothesis was tested whether specific injuries are associated with increased mortality and whether differences in the case-mix of these injuries influence outcome comparison. A retrospective study was conducted in a Dutch trauma region. Injury profiles, based on the injuries most frequently sustained by unexpected deaths, were determined. The association between these injury profiles and mortality was studied in patients with a low Injury Severity Score by logistic regression. The standardized survival of our population (Ws statistic) was compared with North-American and British reference databases, with and without patients suffering from the previously defined injury profiles. In total, 14,811 patients were included. Hip fractures, minor pelvic fractures, femur fractures, and minor thoracic injuries were significantly associated with mortality, corrected for age, sex, and physiologic derangement, in patients with a low injury severity. Odds ratios ranged from 2.42 to 2.92. The Ws statistic for comparison with North-American databases significantly improved after exclusion of patients with these injuries. The Ws statistic for comparison with a British reference database remained unchanged. Hip fractures, minor pelvic fractures, femur fractures, and minor thoracic wall injuries are associated with increased mortality. Comparative outcome analysis of a population against a reference database that differs in case-mix with respect to these injuries should be interpreted cautiously. Prognostic study, level II.
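
    For orientation, the sketch below uses one standard definition of the Ws statistic in trauma benchmarking (excess survivors per 100 patients relative to a reference model); it is assumed, not confirmed by the abstract, that this is the variant used in the study, and the survival probabilities would normally come from a TRISS-type model fitted to the reference database.

    ```python
    # Ws statistic: excess survivors per 100 patients relative to the number
    # expected from a reference model's survival probabilities (Ps). Toy cohort.
    import numpy as np

    def ws_statistic(survived, predicted_ps):
        """survived: 0/1 outcomes; predicted_ps: reference-model survival probabilities."""
        survived = np.asarray(survived, dtype=float)
        predicted_ps = np.asarray(predicted_ps, dtype=float)
        return 100.0 * (survived.sum() - predicted_ps.sum()) / len(survived)

    # Six hypothetical patients: the reference model expects ~4.9 survivors, 5 observed.
    print(ws_statistic([1, 1, 1, 0, 1, 1], [0.95, 0.90, 0.85, 0.60, 0.80, 0.80]))
    ```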

  3. The quality of veterinary in-clinic and reference laboratory biochemical testing.

    PubMed

    Rishniw, Mark; Pion, Paul D; Maher, Tammy

    2012-03-01

    Although evaluation of biochemical analytes in blood is common in veterinary practice, studies assessing the global quality of veterinary in-clinic and reference laboratory testing have not been reported. The aim of this study was to assess the quality of biochemical testing in veterinary laboratories using results obtained from analyses of 3 levels of assayed quality control materials over 5 days. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index to determine factors contributing to poor performance, and agreement between in-clinic and reference laboratory mean results. The suitability of in-clinic and reference laboratory instruments for statistical quality control was determined using adaptations from the computerized program, EZRules3. Reference laboratories were able to achieve desirable quality requirements more frequently than in-clinic laboratories. Across all 3 materials, > 50% of in-clinic analyzers achieved a sigma metric ≥ 6.0 for measurement of 2 analytes, whereas > 50% of reference laboratory analyzers achieved a sigma metric ≥ 6.0 for measurement of 6 analytes. Expanded uncertainty of measurement and ± total allowable error resulted in the highest mean percentages of analytes demonstrating agreement between in-clinic and reference laboratories. Owing to marked variation in bias and coefficient of variation between analyzers of the same and different types, the percentages of analytes suitable for statistical quality control varied widely. These findings reflect the current state-of-the-art with regard to in-clinic and reference laboratory analyzer performance and provide a baseline for future evaluations of the quality of veterinary laboratory testing. © 2012 American Society for Veterinary Clinical Pathology.
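
    The record relies on total error, sigma metrics and a quality goal index; the sketch below uses the commonly cited formulas sigma = (TEa - |bias|)/CV and QGI = |bias|/(1.5 x CV), which are assumed (not confirmed by the abstract) to match the study's definitions, with illustrative performance figures.

    ```python
    # Commonly used laboratory quality metrics (assumed to match the study's
    # definitions): sigma metric and quality goal index (QGI), inputs in percent.
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (allowable total error - |bias|) / CV."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def quality_goal_index(bias_pct, cv_pct):
        """QGI = |bias| / (1.5 * CV); >1.2 suggests inaccuracy dominates,
        <0.8 suggests imprecision dominates, in between suggests both."""
        return abs(bias_pct) / (1.5 * cv_pct)

    tea, bias, cv = 10.0, 2.5, 1.2   # hypothetical performance figures for one analyte
    print(f"sigma = {sigma_metric(tea, bias, cv):.1f}, QGI = {quality_goal_index(bias, cv):.2f}")
    ```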

  4. Modeling Cross-Situational Word-Referent Learning: Prior Questions

    ERIC Educational Resources Information Center

    Yu, Chen; Smith, Linda B.

    2012-01-01

    Both adults and young children possess powerful statistical computation capabilities--they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of…

  5. Establishment of new complete blood count reference values for healthy Thai adults.

    PubMed

    Wongkrajang, P; Chinswangwatanakul, W; Mokkhamakkun, C; Chuangsuwanich, N; Wesarachkitti, B; Thaowto, B; Laiwejpithaya, S; Komkhum, O

    2018-04-28

    Laboratory reference ranges are essential for diagnostic orientation and treatment decisions. As complete blood count parameters are influenced by various factors, including gender, geographic origin, and ethnic origin, it is important to establish specific hematologic reference values for specific populations. This study was conducted at the Department of Clinical Pathology, Faculty of Medicine Siriraj Hospital, Mahidol University, Bangkok, Thailand. Blood samples were taken from healthy adults aged 18-60 years who attended a health check-up program at our hospital from February 2015 to July 2015. Hematologic and routine chemistry analyses were performed. Participants were determined to be healthy based on medical history and routine medical examinations. Serum vitamin B12, folate, ferritin, and hemoglobin typing were also analyzed to exclude the possible presence of anemia. A statistically significant difference was observed between males and females for Hb level, hematocrit level, red blood cell count, mean corpuscular hemoglobin concentration, percentage neutrophils, monocytes and eosinophils, and absolute neutrophil, lymphocyte, basophil, and platelet counts. Accordingly, gender-specific reference intervals were established for all complete blood count parameters in a healthy Thai adult population. The reference value ranges established in this study reflect significant differences between genders. It is possible that these reference ranges may be generalizable to adults living in Thailand. The findings of this study emphasize the importance of establishing specific hematologic reference values for specific populations. © 2018 John Wiley & Sons Ltd.

  6. Breast Reference Set Application: Chris Li-FHCRC (2014) — EDRN Public Portal

    Cancer.gov

    This application proposes to use Reference Set #1. We request access to serum samples collected at the time of breast biopsy from subjects with IC (n=30) or benign disease without atypia (n=30). Statistical power: With 30 BC cases and 30 normal controls, a 25% difference in mean metabolite levels can be detected between groups with 80% power and α=0.05, assuming coefficients of variation of 30%, consistent with our past studies. These sample sizes appear sufficient to enable detection of changes similar in magnitude to those previously reported in pre-clinical (BC recurrence) specimens (20).
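
    A rough check of the stated power calculation (30 cases vs. 30 controls, a 25% difference in mean metabolite levels, CV of 30%, alpha = 0.05). Treating the 25% difference relative to a 30% CV as a standardized effect size of about 0.83 for a two-sample t-test is an assumption; the original calculation may have used a different (for example, log-scale) model.

    ```python
    # Approximate two-sample t-test power for the design described above.
    # The effect-size construction (0.25 / 0.30) is an assumption.
    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.25 / 0.30    # ~0.83 standard deviations
    power = TTestIndPower().power(effect_size=effect_size, nobs1=30, alpha=0.05, ratio=1.0)
    print(f"Approximate power: {power:.2f}")   # in the 0.8-0.9 range, consistent with the abstract
    ```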

  7. The role of reference in cross-situational word learning.

    PubMed

    Wang, Felix Hao; Mintz, Toben H

    2018-01-01

    Word learning involves massive ambiguity, since in a particular encounter with a novel word, there are an unlimited number of potential referents. One proposal for how learners surmount the problem of ambiguity is that learners use cross-situational statistics to constrain the ambiguity: When a word and its referent co-occur across multiple situations, learners will associate the word with the correct referent. Yu and Smith (2007) propose that these co-occurrence statistics are sufficient for word-to-referent mapping. Alternative accounts hold that co-occurrence statistics alone are insufficient to support learning, and that learners are further guided by knowledge that words are referential (e.g., Waxman & Gelman, 2009). However, no behavioral word learning studies we are aware of explicitly manipulate subjects' prior assumptions about the role of the words in the experiments in order to test the influence of these assumptions. In this study, we directly test whether, when faced with referential ambiguity, co-occurrence statistics are sufficient for word-to-referent mappings in adult word-learners. Across a series of cross-situational learning experiments, we varied the degree to which there was support for the notion that the words were referential. At the same time, the statistical information about the words' meanings was held constant. When we overrode support for the notion that words were referential, subjects failed to learn the word-to-referent mappings, but otherwise they succeeded. Thus, cross-situational statistics were useful only when learners had the goal of discovering mappings between words and referents. We discuss the implications of these results for theories of word learning in children's language acquisition. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Trace element reference values in tissues from inhabitants of the EU. XII. Development of BioReVa program for statistical treatment.

    PubMed

    Iversen, B S; Sabbioni, E; Fortaner, S; Pietra, R; Nicolotti, A

    2003-01-20

    Statistical data treatment is a key point in the assessment of trace element reference values, being the conclusive stage of a comprehensive and organized process for evaluating metal concentrations in human body fluids. The EURO TERVIHT project (Trace Elements Reference Values in Human Tissues) was started to evaluate, check and suggest harmonized procedures for the establishment of trace element reference intervals in body fluids and tissues. Unfortunately, different statistical approaches are being used in this research field, making data comparison difficult and in some cases impossible. Although international organizations such as the International Federation of Clinical Chemistry (IFCC) and the International Union of Pure and Applied Chemistry (IUPAC) have issued recommended guidelines for reference value assessment, including statistical data treatment, a unique format and a standardized data layout are still missing. The aim of the present study is to present software (BioReVa), running under the Microsoft Windows platform, suitable for calculating the reference intervals of trace elements in body matrices. The main aims in creating this easy-to-use application were to check the data distribution, to establish the reference intervals according to the accepted recommendations using simple statistics, to obtain a standard presentation of experimental data, and to provide an application into which further needs could be integrated in the future. BioReVa calculates the IFCC reference intervals as well as the coverage intervals recommended by IUPAC as a supplement to the IFCC intervals. Examples of reference values and reference intervals calculated with the BioReVa software concern Pb and Se in blood; Cd, In and Cr in urine; and Hg and Mo in hair, from different general European populations.

  9. Androgen profiling by liquid chromatography-tandem mass spectrometry (LC-MS/MS) in healthy normal-weight ovulatory and anovulatory late adolescent and young women.

    PubMed

    Fanelli, Flaminia; Gambineri, Alessandra; Belluomo, Ilaria; Repaci, Andrea; Di Lallo, Valentina Diana; Di Dalmazi, Guido; Mezzullo, Marco; Prontera, Olga; Cuomo, Gaia; Zanotti, Laura; Paccapelo, Alexandro; Morselli-Labate, Antonio Maria; Pagotto, Uberto; Pasquali, Renato

    2013-07-01

    Physiological transient imbalance typical of adolescence needs to be distinguished from hyperandrogenism-related dysfunction. The accurate determination of circulating androgens is the best indicator of hyperandrogenism. However, reliable reference intervals for adolescent and young women are not available. The aim of the study was to define androgen reference intervals in young women and to analyze the impact of the menstrual phase and ovulation efficiency over the androgen profile as assessed by reliable liquid chromatography-tandem mass spectrometry (LC-MS/MS) technique. Female high school students aged 16-19 years were included in the study. The study was performed on reference subjects properly selected among an unbiased population. Normal-weight, drug and disease free, eumenorrheic females with no signs of hyperandrogenism were included. The steroid hormone profile was determined by a validated in-house LC-MS/MS method. A statistical estimation of overall and menstrual phase-specific reference intervals was performed. A subgroup of anovulatory females was identified based on progesterone circulating levels. The impact of ovulation efficiency over hormonal profile was analyzed. A total of 159 females satisfied healthy criteria. Androgen levels did not vary according to menstrual phase, but a significantly higher upper reference limit was found for T in the luteal phase compared to the follicular phase. Higher T and androstenedione levels were observed in anovulatory compared to ovulatory females, paralleled by higher LH and FSH and lower 17-hydroxyprogesterone and 17β-estradiol levels. This is the first study providing LC-MS/MS-based, menstrual phase-specific reference intervals for the circulating androgen profile in young females. We identified a subgroup of anovulatory healthy females characterized by androgen imbalance.

  10. The Biomarker-Surrogacy Evaluation Schema: a review of the biomarker-surrogate literature and a proposal for a criterion-based, quantitative, multidimensional hierarchical levels of evidence schema for evaluating the status of biomarkers as surrogate endpoints.

    PubMed

    Lassere, Marissa N

    2008-06-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Section 2 is a systematic, historical review of the biomarker-surrogate endpoint literature with special reference to the nomenclature, the systems of classification and statistical methods developed for their evaluation. In Section 3 an explicit, criterion-based, quantitative, multidimensional hierarchical levels of evidence schema - Biomarker-Surrogacy Evaluation Schema - is proposed to evaluate and co-ordinate the multiple dimensions (biological, epidemiological, statistical, clinical trial and risk-benefit evidence) of the biomarker clinical endpoint relationships. The schema systematically evaluates and ranks the surrogacy status of biomarkers and surrogate endpoints using defined levels of evidence. The schema incorporates the three independent domains: Study Design, Target Outcome and Statistical Evaluation. Each domain has items ranked from zero to five. An additional category called Penalties incorporates additional considerations of biological plausibility, risk-benefit and generalizability. The total score (0-15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. The term 'surrogate' is restricted to markers attaining Levels 1 or 2 only. Surrogacy status of markers can then be directly compared within and across different areas of medicine to guide individual, trial-based or drug-development decisions. This schema would facilitate communication between clinical, researcher, regulatory, industry and consumer participants necessary for evaluation of the biomarker-surrogate-clinical endpoint relationship in their different settings.
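
    The schema's arithmetic, as described, sums three domains each scored 0-5, applies penalties, and maps the 0-15 total to Levels 1 (strongest) through 5 (weakest), reserving the term 'surrogate' for Levels 1-2. The score-to-level cut-points in the sketch below are illustrative assumptions, since the abstract does not give them.

    ```python
    # Sketch of the Biomarker-Surrogacy Evaluation Schema arithmetic. The
    # threshold values mapping total score to level are assumed, not from the source.
    def surrogacy_level(study_design, target_outcome, statistical_eval, penalties=0):
        for score in (study_design, target_outcome, statistical_eval):
            if not 0 <= score <= 5:
                raise ValueError("each domain is scored 0-5")
        total = max(0, study_design + target_outcome + statistical_eval - penalties)
        cutoffs = [(13, 1), (10, 2), (7, 3), (4, 4)]          # assumed thresholds
        level = next((lvl for lo, lvl in cutoffs if total >= lo), 5)
        return total, level, "surrogate" if level <= 2 else "biomarker"

    print(surrogacy_level(5, 4, 4, penalties=1))   # (12, 2, 'surrogate') under the assumed cut-points
    ```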

  11. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. In conclusion, the algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.

  12. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE PAGES

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    2014-09-29

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. In conclusion, the algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.

  13. Spatio-temporal hierarchical modeling of rates and variability of Holocene sea-level changes in the western North Atlantic and the Caribbean

    NASA Astrophysics Data System (ADS)

    Ashe, E.; Kopp, R. E.; Khan, N.; Horton, B.; Engelhart, S. E.

    2016-12-01

    Sea level varies over both space and time. Prior to the instrumental period, the sea-level record depends upon geological reconstructions that contain vertical and temporal uncertainty. Spatio-temporal statistical models enable the interpretation of relative sea level (RSL) and rates of change, as well as the reconstruction of the entire sea-level field, from such noisy data. Hierarchical models explicitly distinguish between a process level, which characterizes the spatio-temporal field, and a data level, by which sparse, noisy proxy data are recorded. A hyperparameter level encodes prior expectations about the structure of variability in the spatio-temporal field. Spatio-temporal hierarchical models are amenable to several analysis approaches, with tradeoffs regarding computational efficiency and comprehensiveness of uncertainty characterization. A fully Bayesian hierarchical model (BHM), which places prior probability distributions upon the hyperparameters, is more computationally intensive than an empirical hierarchical model (EHM), which uses point estimates of hyperparameters derived from the data [1]. Here, we assess the sensitivity of posterior estimates of RSL and rates to different statistical approaches by varying prior assumptions about the spatial and temporal structure of sea-level variability and applying multiple analytical approaches to Holocene sea-level proxies along the Atlantic coast of North America and the Caribbean [2]. References: [1] Cressie N, Wikle CK (2011) Statistics for Spatio-Temporal Data (John Wiley & Sons). [2] Khan N et al. (2016) Quaternary Science Reviews (in revision).
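
    A minimal empirical-hierarchical sketch for a single site: a Gaussian-process process level for RSL through time with plug-in (point-estimate) hyperparameters, and a data level with proxy-specific vertical noise. The actual models are spatio-temporal and fully or empirically Bayesian; the squared-exponential kernel, hyperparameter values and toy proxy data below are illustrative assumptions.

    ```python
    # EHM-style Gaussian-process reconstruction of relative sea level (RSL) at one
    # site: plug-in hyperparameters (process level) and per-datum noise (data level).
    import numpy as np

    def sq_exp_kernel(t1, t2, amplitude, length_scale):
        d = t1[:, None] - t2[None, :]
        return amplitude**2 * np.exp(-0.5 * (d / length_scale) ** 2)

    # Toy proxy data: ages (ka BP), RSL (m), and 1-sigma vertical errors.
    age = np.array([8.0, 6.5, 5.0, 3.5, 2.0, 0.5])
    rsl = np.array([-9.0, -6.2, -4.1, -2.4, -1.1, -0.2])
    noise_sd = np.array([0.6, 0.5, 0.4, 0.4, 0.3, 0.2])

    amp, ell = 5.0, 3.0                                        # plug-in hyperparameters
    K = sq_exp_kernel(age, age, amp, ell) + np.diag(noise_sd**2)
    t_grid = np.linspace(9.0, 0.0, 19)
    K_star = sq_exp_kernel(t_grid, age, amp, ell)

    posterior_mean = K_star @ np.linalg.solve(K, rsl)
    posterior_var = amp**2 - np.einsum("ij,ij->i", K_star, np.linalg.solve(K, K_star.T).T)

    for t, m, v in zip(t_grid[::6], posterior_mean[::6], posterior_var[::6]):
        print(f"t = {t:4.1f} ka   RSL = {m:6.2f} +/- {np.sqrt(max(v, 0.0)):.2f} m")
    ```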

  14. Relationships between pathologic subjective halitosis, olfactory reference syndrome, and social anxiety in young Japanese women.

    PubMed

    Tsuruta, Miho; Takahashi, Toru; Tokunaga, Miki; Iwasaki, Masanori; Kataoka, Shota; Kakuta, Satoko; Soh, Inho; Awano, Shuji; Hirata, Hiromi; Kagawa, Masaharu; Ansai, Toshihiro

    2017-03-14

    Pathologic subjective halitosis is known as a halitosis complaint without objective confirmation of halitosis by others or by halitometer measurements; it has been reported to be associated with social anxiety disorder. Olfactory reference syndrome is a preoccupation with the false belief that one emits a foul and offensive body odor. Generally, patients with olfactory reference syndrome are concerned with multiple body parts. However, the mouth is known to be the most common source of body odor for those with olfactory reference syndrome, which could imply that the two conditions share similar features. Therefore, we investigated potential causal relationships among pathologic subjective halitosis, olfactory reference syndrome, social anxiety, and preoccupations with body part odors. A total of 1360 female students (mean age 19.6 ± 1.1 years) answered a self-administered questionnaire regarding pathologic subjective halitosis, olfactory reference syndrome, social anxiety, and preoccupation with odors of body parts such as mouth, body, armpits, and feet. The scale for pathologic subjective halitosis followed that developed by Tsunoda et al.; participants were divided into three groups based on their scores (i.e., levels of pathologic subjective halitosis). A Bayesian network was used to analyze causal relationships between pathologic subjective halitosis, olfactory reference syndrome, social anxiety, and preoccupations with body part odors. We found statistically significant differences in the results for olfactory reference syndrome and social anxiety among the various levels of pathologic subjective halitosis (P < 0.001). Residual analyses indicated that students with severe levels of pathologic subjective halitosis showed greater preoccupations with mouth and body odors (P < 0.05). Bayesian network analysis showed that social anxiety directly influenced pathologic subjective halitosis and olfactory reference syndrome. Preoccupations with mouth and body odors also influenced pathologic subjective halitosis. Social anxiety may be a causal factor of pathologic subjective halitosis and olfactory reference syndrome.

  15. [Analysis of master degree thesis of otolaryngology head and neck surgery in Xinjiang].

    PubMed

    Ayiheng, Qukuerhan; Niliapaer, Alimu; Yalikun, Yasheng

    2010-12-01

    To understand the basic situation and development of the knowledge structure and abilities of master's degree candidates in Otolaryngology Head and Neck Surgery in the Xinjiang region, in order to provide a reference for further improving the quality of postgraduate training. Forty-six Otolaryngology master's degree theses from the Xinjiang region, written between 1998 and 2009, were reviewed at random with respect to thesis type, range of subject selection, and statistical methods, in order to analyze their strengths and characteristics and to suggest solutions for their weaknesses. Of the 46 theses, nine (19.57%) were scientific (academic) dissertations and 37 (80.43%) were clinical professional degree theses. Five were experimental research papers, 30 were clinical research papers, 10 combined clinical and experimental research, and one was an experimental epidemiology paper. The theses covered diseases from every subspecialty of ENT and used a variety of statistical methods. The average number of references per thesis was 37.46, of which 19.55 on average were foreign-language references, and an average of 13.57 references were from the preceding 5 years. The postgraduate students came from four ethnic groups, and their tutors had a high level of teaching and professional expertise. Clinical research should focus on further study of common ENT diseases, the application of advanced research methods, full use of the latest literature, high-level tutors, and the training of students of various nationalities; basic research needs to be innovative, should reflect the subject's regional characteristics, and should avoid excessive duplication of research.

  16. Industry guidelines, laws and regulations ignored: quality of drug advertising in medical journals.

    PubMed

    Lankinen, Kari S; Levola, Tero; Marttinen, Kati; Puumalainen, Inka; Helin-Salmivaara, Arja

    2004-11-01

    To document the quality of the evidence base for marketing claims in prescription drug advertisements, in order to facilitate identification of potential targets for quality improvement. A sample of 1036 advertisements from four major Finnish medical journals published in 2002. Marketing claims were classified into four groups: unambiguous clinical outcome, vague clinical outcome, emotive or immeasurable outcome, and non-clinical outcome. Medline references were traced and classified according to the level of evidence available. The statistical variables used in the advertisements were also documented. The sample included 245 distinct advertisements with 883 marketing claims, 1-10 claims per advertisement. Three hundred and thirty-seven (38%) of the claims were referenced. Each claim could be supported by one reference or more, so the number of references analysed totalled 381, with 1-9 references per advertisement. Nine percent of the claims implied unambiguous clinical outcomes; 68% included vague or emotive statements. Twenty-one percent of the references were irrelevant to the claim. There was a fair amount of non-scientific and scientific support for the 73 unambiguous claims, but not a single claim was supported by strong scientific evidence. Vague, emotive and non-clinical claims were significantly more often supported by non-Medline or irrelevant references than unambiguous claims. Statistical parameters were stated only 34 times. Referenced marketing claims may appear more scientific, but the use of references does not guarantee the quality of the claims. For the benefit of all stakeholders, both regulatory control and the industry's self-control of drug marketing should adopt more active monitoring roles, and apply sanctions when appropriate. Concerted efforts by several stakeholders might be more effective. Copyright 2004 John Wiley & Sons, Ltd.

  17. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  18. Changes in Library Technology and Reference Desk Statistics: Is There a Relationship?

    ERIC Educational Resources Information Center

    Thomsett-Scott, Beth; Reese, Patricia E.

    2006-01-01

    The incorporation of technology into library processes has tremendously impacted staff and users alike. The University of North Texas (UNT) Libraries is no exception. Sixteen years of reference statistics are analyzed to examine the relationships between the implementation of CD-ROMs and web-based resources and the number of reference questions.…

  19. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment.

    PubMed

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P; Patterson, Nick; Price, Alkes L

    2014-10-15

    Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1-5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case-control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary materials are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Using Cochran's Z Statistic to Test the Kernel-Smoothed Item Response Function Differences between Focal and Reference Groups

    ERIC Educational Resources Information Center

    Zheng, Yinggan; Gierl, Mark J.; Cui, Ying

    2010-01-01

    This study combined the kernel smoothing procedure and a nonparametric differential item functioning statistic--Cochran's Z--to statistically test the difference between the kernel-smoothed item response functions for reference and focal groups. Simulation studies were conducted to investigate the Type I error and power of the proposed…

  1. Genetic diversity in Monoporeia affinis at polluted and reference sites of the Baltic Bothnian Bay.

    PubMed

    Guban, Peter; Wennerström, Lovisa; Elfwing, Tina; Sundelin, Brita; Laikre, Linda

    2015-04-15

    The amphipod Monoporeia affinis plays an important role in the Baltic Sea ecosystem as prey and as detritivore. The species is monitored for contaminant effects, but almost nothing is known about its genetics in this region. A pilot screening for genetic variation at the mitochondrial COI gene was performed in 113 individuals collected at six sites in the northern Baltic. Three coastal sites were polluted by pulp mill effluents, PAHs, and trace metals, and two coastal reference sites were without obvious connection to pollution sources. An off-coastal reference site was also included. Contaminated sites showed lower levels of genetic diversity than the coastal reference ones although the difference was not statistically significant. Divergence patterns measured as ΦST showed no significant differentiation within reference and polluted groups, but there was significant genetic divergence between them. The off-coastal sample differed significantly from all coastal sites and also showed lower genetic variation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Mapping of radio frequency electromagnetic field exposure levels in outdoor environment and comparing with reference levels for general public health.

    PubMed

    Cansiz, Mustafa; Abbasov, Teymuraz; Kurt, M Bahattin; Celik, A Recai

    2018-03-01

    In this study, radio frequency electromagnetic field exposure levels were measured on the main streets in the city center of Diyarbakır, Turkey. Measured electric field levels were plotted on satellite imagery of Diyarbakır and were compared with exposure guidelines published by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). Exposure measurements were performed in dense urban, urban and suburban areas each day for 7 consecutive days. The measurement system consisted of high precision and portable spectrum analyzer, three-axis electric field antenna, connection cable and a laptop which was used to record the measurement samples as a data logger. The highest exposure levels were detected for two places, which are called Diclekent and Batıkent. It was observed that the highest instantaneous electric field strength value for Batıkent was 7.18 V/m and for Diclekent was 5.81 V/m. It was statistically determined that the main contributor band to the total exposure levels was Universal Mobile Telecommunications System band. Finally, it was concluded that all measured exposure levels were lower than the reference levels recommended by ICNIRP for general public health.

  3. Selection of Valid Reference Genes for Reverse Transcription Quantitative PCR Analysis in Heliconius numata (Lepidoptera: Nymphalidae)

    PubMed Central

    Chouteau, Mathieu; Whibley, Annabel; Joron, Mathieu; Llaurens, Violaine

    2016-01-01

    Identifying the genetic basis of adaptive variation is challenging in non-model organisms and quantitative real time PCR. is a useful tool for validating predictions regarding the expression of candidate genes. However, comparing expression levels in different conditions requires rigorous experimental design and statistical analyses. Here, we focused on the neotropical passion-vine butterflies Heliconius, non-model species studied in evolutionary biology for their adaptive variation in wing color patterns involved in mimicry and in the signaling of their toxicity to predators. We aimed at selecting stable reference genes to be used for normalization of gene expression data in RT-qPCR analyses from developing wing discs according to the minimal guidelines described in Minimum Information for publication of Quantitative Real-Time PCR Experiments (MIQE). To design internal RT-qPCR controls, we studied the stability of expression of nine candidate reference genes (actin, annexin, eF1α, FK506BP, PolyABP, PolyUBQ, RpL3, RPS3A, and tubulin) at two developmental stages (prepupal and pupal) using three widely used programs (GeNorm, NormFinder and BestKeeper). Results showed that, despite differences in statistical methods, genes RpL3, eF1α, polyABP, and annexin were stably expressed in wing discs in late larval and pupal stages of Heliconius numata. This combination of genes may be used as a reference for a reliable study of differential expression in wings for instance for genes involved in important phenotypic variation, such as wing color pattern variation. Through this example, we provide general useful technical recommendations as well as relevant statistical strategies for evolutionary biologists aiming to identify candidate-genes involved adaptive variation in non-model organisms. PMID:27271971

  4. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting

    PubMed Central

    Husen, Mohd Nizam; Lee, Sukhan

    2016-01-01

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics represent here the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown finer by 40% with the execution time more than an order of magnitude faster than the conventional methods. These results are also backed up by theoretical analysis. PMID:27845711

  5. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting.

    PubMed

    Husen, Mohd Nizam; Lee, Sukhan

    2016-11-11

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics represent here the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown finer by 40% with the execution time more than an order of magnitude faster than the conventional methods. These results are also backed up by theoretical analysis.

  6. Sources of international migration statistics in Africa.

    PubMed

    1984-01-01

    The sources of international migration data for Africa may be classified into 2 main categories: administrative records and 2) censuses and survey data. Both categories are sources for the direct measurement of migration, but the 2nd category can be used for the indirect estimation of net international migration. The administrative records from which data on international migration may be derived include 1) entry/departure cards or forms completed at international borders, 2) residence/work permits issued to aliens, and 3) general population registers and registers of aliens. The statistics derived from the entry/departure cards may be described as 1) land frontier control statistics and 2) port control statistics. The former refer to data derived from movements across land borders and the latter refer to information collected at international airports and seaports. Other administrative records which are potential sources of statistics on international migration in some African countries include some limited population registers, records of the registration of aliens, and particulars of residence/work permits issued to aliens. Although frontier control data are considered the most important source of international migration statistics, in many African countries these data are too deficient to provide a satisfactory indication of the level of international migration. Thus decennial population censuses and/or sample surveys are the major sources of the available statistics on the stock and characteristics of international migration. Indirect methods can be used to supplement census data with intercensal estimates of net migration using census data on the total population. This indirect method of obtaining information on migration can be used to evaluate estimates derived from frontier control records, and it also offers the means of obtaining alternative information on international migration in African countries which have not directly investigated migration topics in their censuses or surveys.

  7. 1998 Idaho Public Library Statistics and Library Directory. A Compilation of Input and Output Measures and Other Statistics in Reference to Idaho's Public Libraries, Covering the Period October 1, 1997 to September 30, 1998.

    ERIC Educational Resources Information Center

    Nelson, Frank, Comp.

    This report is a compilation of input and output measures and other statistics in reference to Idaho's public libraries, covering the period from October 1997 through September 1998. The introductory sections include notes on the statistics, definitions of performance measures, Idaho public library rankings for fiscal year 1996, and a state map…

  8. Enhancing the Equating of Item Difficulty Metrics: Estimation of Reference Distribution. Research Report. ETS RR-14-07

    ERIC Educational Resources Information Center

    Ali, Usama S.; Walker, Michael E.

    2014-01-01

    Two methods are currently in use at Educational Testing Service (ETS) for equating observed item difficulty statistics. The first method involves the linear equating of item statistics in an observed sample to reference statistics on the same items. The second method, or the item response curve (IRC) method, involves the summation of conditional…

  9. [Prevalence of obesity in children: study in the primary public Parisian schools].

    PubMed

    Barthel, B; Cariou, C; Lebas-Saison, E; Momas, I

    2001-03-01

    Obesity is an important risk factor in public health. In Paris, few statistical data are available in this area. The purpose of the present study is to evaluate the prevalence of overweight and obesity in 10 years-old children attending Paris elementary schools (cours moyen deuxième année--CM2--last level of the elementary school). 148 classes were randomly selected, gathering 3,621 schoolchildren 10 years 6 months old. 66 doctors in charge of health at school participated in the study, doing the measurements of weight, size and collecting also the weight and size at birth and at the "grande section-GS-level" (last level of the infant school, 5 years-old children) from the individual health file of the schoolchildren. The statistical analysis was based on the study of distributions of the observed Quetelet index (Q0) at the different ages, compared to French reference curves. A logistic regression analysis was performed to determine whether birth weight and GS weight predict obesity in CM2. In GS and in CM2, observed Quetelet indices are over expected values: in CM2, 22.8% of boys and 25.6% of girls exceed the reference value Q90; the prevalence of obesity (Q0 > or = Q97) is 13.4% in boys and 13.5% in girls. Among the variables "term", "weight at birth", weight in GS level and "gender", the weight in GS level is the only predictive factor of obesity in CM2 level. The situation in Paris appears to be serious. Preventive actions are needed at early stages to try to stop and, if possible, to reverse the present increase of overweight. In this context, school doctors have to play a prominent role.

  10. Tenascin-C levels in synovial fluid are elevated after injury to the human and canine joint and correlate with markers of inflammation and matrix degradation.

    PubMed

    Chockalingam, P S; Glasson, S S; Lohmander, L S

    2013-02-01

    We have previously shown the capacity of tenascin-C (TN-C) to induce inflammatory mediators and matrix degradation in vitro in human articular cartilage. The objective of the present study was to follow TN-C release into knee synovial fluid after acute joint injury or in joint disease, and to correlate TN-C levels with markers of cartilage matrix degradation and inflammation. Human knee synovial fluid samples (n = 164) were from a cross-sectional convenience cohort. Diagnostic groups were knee healthy reference, knee anterior cruciate ligament rupture, with or without concomitant meniscus lesions, isolated knee meniscus injury, acute inflammatory arthritis (AIA) and knee osteoarthritis (OA). TN-C was measured in synovial fluid samples using an enzyme-linked immunosorbent assay (ELISA) and results correlated to other cartilage markers. TN-C release was also monitored in joints of dogs that underwent knee instability surgery. Statistically significantly higher levels of TN-C compared to reference subjects were observed in the joint fluid of all human disease groups and in the dogs that underwent knee instability surgery. Statistically significant correlations were observed between the TN-C levels in the synovial fluid of the human patients and the levels of aggrecanase-dependent Ala-Arg-Gly-aggrecan (ARG-aggrecan) fragments and matrix metalloproteinases 1 and 3. We find highly elevated levels of TN-C in human knee joints after injury, AIA or OA that correlated with markers of cartilage degradation and inflammation. TN-C in synovial fluid may serve dual roles as a marker of joint damage and a stimulant of further joint degradation. Copyright © 2012 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  11. [How to start a neuroimaging study].

    PubMed

    Narumoto, Jin

    2012-06-01

    In order to help researchers understand how to start a neuroimaging study, several tips are described in this paper. These include 1) Choice of an imaging modality, 2) Statistical method, and 3) Interpretation of the results. 1) There are several imaging modalities available in clinical research. Advantages and disadvantages of each modality are described. 2) Statistical Parametric Mapping, which is the most common statistical software for neuroimaging analysis, is described in terms of parameter setting in normalization and level of significance. 3) In the discussion section, the region which shows a significant difference between patients and normal controls should be discussed in relation to the neurophysiology of the disease, making reference to previous reports from neuroimaging studies in normal controls, lesion studies and animal studies. A typical pattern of discussion is described.

  12. Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Klein, B.

    2017-11-01

    Inland waterway transport benefits from probabilistic forecasts of water levels as they allow to optimize the ship load and, hence, to minimize the transport costs. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows for a reduction of these systematic errors by fitting a statistical model based on training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine with training periods selected based on SD, HMA, and DTW compare favorably with reference EMOS forecasts, which are based on either seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.

  13. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    PubMed

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework of establishing standard reference scale (texture) is proposed by multivariate statistical analysis according to instrumental measurement and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework method is verified by establishing standard reference scale of texture attribute (hardness) with Chinese well-known food. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the result was analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were determined to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides reliable a theoretical basis and practical guide for quantitative standard reference scale establishment on food texture characteristics.

  14. Ambiguity of Quality in Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Leptoukh, Greg

    2010-01-01

    This slide presentation reviews some of the issues in quality of remote sensing data. Data "quality" is used in several different contexts in remote sensing data, with quite different meanings. At the pixel level, quality typically refers to a quality control process exercised by the processing algorithm, not an explicit declaration of accuracy or precision. File level quality is usually a statistical summary of the pixel-level quality but is of doubtful use for scenes covering large areal extents. Quality at the dataset or product level, on the other hand, usually refers to how accurately the dataset is believed to represent the physical quantities it purports to measure. This assessment often bears but an indirect relationship at best to pixel level quality. In addition to ambiguity at different levels of granularity, ambiguity is endemic within levels. Pixel-level quality terms vary widely, as do recommendations for use of these flags. At the dataset/product level, quality for low-resolution gridded products is often extrapolated from validation campaigns using high spatial resolution swath data, a suspect practice at best. Making use of quality at all levels is complicated by the dependence on application needs. We will present examples of the various meanings of quality in remote sensing data and possible ways forward toward a more unified and usable quality framework.

  15. Identification of Reference Genes for Quantitative Gene Expression Studies in a Non-Model Tree Pistachio (Pistacia vera L.)

    PubMed Central

    Moazzam Jazi, Maryam; Ghadirzadeh Khorzoghi, Effat; Botanga, Christopher; Seyedi, Seyed Mahdi

    2016-01-01

    The tree species, Pistacia vera (P. vera) is an important commercial product that is salt-tolerant and long-lived, with a possible lifespan of over one thousand years. Gene expression analysis is an efficient method to explore the possible regulatory mechanisms underlying these characteristics. Therefore, having the most suitable set of reference genes is required for transcript level normalization under different conditions in P. vera. In the present study, we selected eight widely used reference genes, ACT, EF1α, α-TUB, β-TUB, GAPDH, CYP2, UBQ10, and 18S rRNA. Using qRT-PCR their expression was assessed in 54 different samples of three cultivars of P. vera. The samples were collected from different organs under various abiotic treatments (cold, drought, and salt) across three time points. Several statistical programs (geNorm, NormFinder, and BestKeeper) were applied to estimate the expression stability of candidate reference genes. Results obtained from the statistical analysis were then exposed to Rank aggregation package to generate a consensus gene rank. Based on our results, EF1α was found to be the superior reference gene in all samples under all abiotic treatments. In addition to EF1α, ACT and β-TUB were the second best reference genes for gene expression analysis in leaf and root. We recommended β-TUB as the second most stable gene for samples under the cold and drought treatments, while ACT holds the same position in samples analyzed under salt treatment. This report will benefit future research on the expression profiling of P. vera and other members of the Anacardiaceae family. PMID:27308855

  16. Identification of Reference Genes for Quantitative Gene Expression Studies in a Non-Model Tree Pistachio (Pistacia vera L.).

    PubMed

    Moazzam Jazi, Maryam; Ghadirzadeh Khorzoghi, Effat; Botanga, Christopher; Seyedi, Seyed Mahdi

    2016-01-01

    The tree species, Pistacia vera (P. vera) is an important commercial product that is salt-tolerant and long-lived, with a possible lifespan of over one thousand years. Gene expression analysis is an efficient method to explore the possible regulatory mechanisms underlying these characteristics. Therefore, having the most suitable set of reference genes is required for transcript level normalization under different conditions in P. vera. In the present study, we selected eight widely used reference genes, ACT, EF1α, α-TUB, β-TUB, GAPDH, CYP2, UBQ10, and 18S rRNA. Using qRT-PCR their expression was assessed in 54 different samples of three cultivars of P. vera. The samples were collected from different organs under various abiotic treatments (cold, drought, and salt) across three time points. Several statistical programs (geNorm, NormFinder, and BestKeeper) were applied to estimate the expression stability of candidate reference genes. Results obtained from the statistical analysis were then exposed to Rank aggregation package to generate a consensus gene rank. Based on our results, EF1α was found to be the superior reference gene in all samples under all abiotic treatments. In addition to EF1α, ACT and β-TUB were the second best reference genes for gene expression analysis in leaf and root. We recommended β-TUB as the second most stable gene for samples under the cold and drought treatments, while ACT holds the same position in samples analyzed under salt treatment. This report will benefit future research on the expression profiling of P. vera and other members of the Anacardiaceae family.

  17. Evaluation of person-level heterogeneity of treatment effects in published multiperson N-of-1 studies: systematic review and reanalysis.

    PubMed

    Raman, Gowri; Balk, Ethan M; Lai, Lana; Shi, Jennifer; Chan, Jeffrey; Lutz, Jennifer S; Dubois, Robert W; Kravitz, Richard L; Kent, David M

    2018-05-26

    Individual patients with the same condition may respond differently to similar treatments. Our aim is to summarise the reporting of person-level heterogeneity of treatment effects (HTE) in multiperson N-of-1 studies and to examine the evidence for person-level HTE through reanalysis. Systematic review and reanalysis of multiperson N-of-1 studies. Medline, Cochrane Controlled Trials, EMBASE, Web of Science and review of references through August 2017 for N-of-1 studies published in English. N-of-1 studies of pharmacological interventions with at least two subjects. Citation screening and data extractions were performed in duplicate. We performed statistical reanalysis testing for person-level HTE on all studies presenting person-level data. We identified 62 multiperson N-of-1 studies with at least two subjects. Statistical tests examining HTE were described in only 13 (21%), of which only two (3%) tested person-level HTE. Only 25 studies (40%) provided person-level data sufficient to reanalyse person-level HTE. Reanalysis using a fixed effect linear model identified statistically significant person-level HTE in 8 of the 13 studies (62%) reporting person-level treatment effects and in 8 of the 14 studies (57%) reporting person-level outcomes. Our analysis suggests that person-level HTE is common and often substantial. Reviewed studies had incomplete information on person-level treatment effects and their variation. Improved assessment and reporting of person-level treatment effects in multiperson N-of-1 studies are needed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Exploratory Multivariate Analysis. A Graphical Approach.

    DTIC Science & Technology

    1981-01-01

    Gnanadesikan , 1977) but we feel that these should be used with great caution unless one really has good reason to believe that the data came from such a...are referred to Gnanadesikan (1977). The present author hopes that the convenience of a single summary or significance level will not deter his readers...fit of a harmonic model to meteorological data. (In preparation). Gnanadesikan , R. (1977). Methods for Statistical Data Analysis of Multivariate

  19. A Survey of Insider Attack Detection Research

    DTIC Science & Technology

    2008-08-25

    modeling of statistical features , such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through...forms of attack that have been reported [Error! Reference source not found.]. For example: • Unauthorized extraction , duplication, or exfiltration...network level. Schultz pointed out that not one approach will work but solutions need to be based on multiple sensors to be able to find any combination

  20. Choline: Clinical Nutrigenetic/Nutrigenomic Approaches for Identification of Functions and Dietary Requirements

    PubMed Central

    Zeisel, Steven H.

    2013-01-01

    Nutrigenetics/nutrigenomics (the study of the bidirectional interactions between genes and diet) is a rapidly developing field that is changing research and practice in human nutrition. Though eventually nutrition clinicians may be able to provide personalized nutrition recommendations, in the immediate future they are most likely to use this knowledge to improve dietary recommendations for populations. Currently, estimated average requirements are used to set dietary reference intakes because scientists cannot adequately identify subsets of the population that differ in requirement for a nutrient. Recommended intake levels must exceed the actual required intake for most of the population in order to assure that individuals with the highest requirement ingest adequate amounts of the nutrient. As a result, dietary reference intake levels often are set so high that diet guidelines suggest almost unattainable intakes of some foods. Once it is possible to identify common subgroups that differ in nutrient requirements using nutrigenetic/nutrigenomic profiling, targeted interventions and recommendations can be refined. In addition, when a large variance exists in response to a nutrient, statistical analyses often argue for a null effect. If responders could be differentiated from nonre-sponders based on nutrigenetic/nutrigenomic profiling, this statistical noise could be eliminated and the sensitivity of nutrition research greatly increased. PMID:20436254

  1. Insight into others' minds: spatio-temporal representations by intrinsic frame of reference.

    PubMed

    Sun, Yanlong; Wang, Hongbin

    2014-01-01

    Recent research has seen a growing interest in connections between domains of spatial and social cognition. Much evidence indicates that processes of representing space in distinct frames of reference (FOR) contribute to basic spatial abilities as well as sophisticated social abilities such as tracking other's intention and belief. Argument remains, however, that belief reasoning in social domain requires an innately dedicated system and cannot be reduced to low-level encoding of spatial relationships. Here we offer an integrated account advocating the critical roles of spatial representations in intrinsic frame of reference. By re-examining the results from a spatial task (Tamborello etal., 2012) and a false-belief task (Onishi and Baillargeon, 2005), we argue that spatial and social abilities share a common origin at the level of spatio-temporal association and predictive learning, where multiple FOR-based representations provide the basic building blocks for efficient and flexible partitioning of the environmental statistics. We also discuss neuroscience evidence supporting these mechanisms. We conclude that FOR-based representations may bridge the conceptual as well as the implementation gaps between the burgeoning fields of social and spatial cognition.

  2. No-reference multiscale blur detection tool for content based image retrieval

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Stocker, Russell; Harrity, Kyle; Alford, Mark; Ferris, David; Blasch, Erik; Gorniak, Mark

    2014-06-01

    In recent years, digital cameras have been widely used for image capturing. These devices are equipped in cell phones, laptops, tablets, webcams, etc. Image quality is an important component of digital image analysis. To assess image quality for these mobile products, a standard image is required as a reference image. In this case, Root Mean Square Error and Peak Signal to Noise Ratio can be used to measure the quality of the images. However, these methods are not possible if there is no reference image. In our approach, a discrete-wavelet transformation is applied to the blurred image, which decomposes into the approximate image and three detail sub-images, namely horizontal, vertical, and diagonal images. We then focus on noise-measuring the detail images and blur-measuring the approximate image to assess the image quality. We then compute noise mean and noise ratio from the detail images, and blur mean and blur ratio from the approximate image. The Multi-scale Blur Detection (MBD) metric provides both an assessment of the noise and blur content. These values are weighted based on a linear regression against full-reference y values. From these statistics, we can compare to normal useful image statistics for image quality without needing a reference image. We then test the validity of our obtained weights by R2 analysis as well as using them to estimate image quality of an image with a known quality measure. The result shows that our method provides acceptable results for images containing low to mid noise levels and blur content.

  3. A Study on the Correlation between Cord Blood Glucose Level and the Apgar Score.

    PubMed

    Khan, Kalyan; Saha, Ashis Ranjan

    2013-02-01

    The study of the biochemical parameters of cord blood acts as a mirror, which usually reflects the neonatal status. The widely used system for the evaluation of a neonate is the Apgar score. There is no comprehensive published data which has established the association between the cord blood glucose level and the Apgar score. Similarly, there is also no well accepted reference range of the cord blood glucose level. The main objectives of the present study was to ascertain any adverse effects of an abnormal cord blood glucose level on the neonatal status and to find out a standard reference level of glucose in cord blood. The cord blood glucose estimation was done by using the glucose oxidase peroxidase method and the statistical analysis was performed by using the SPSS, version 16 software. In the present study, the cord blood glucose level was found to have no correlation with the Apgar scores which were calculated at both one minute and five minutes after birth. It was also found that for the foetus to be free from any obvious complication, the cord blood glucose level had to be around 87 mg/dl. The fluctuations in the maternal glucose levels are weakly associated with the glucose level in the cord blood.

  4. Muscle-strengthening and aerobic activities and mortality among 3+ year cancer survivors in the U.S.

    PubMed

    Tarasenko, Yelena N; Linder, Daniel F; Miller, Eric A

    2018-05-01

    This study examined the association between adherence to American College of Sports Medicine and American Cancer Society guidelines on aerobic and muscle-strengthening activities and mortality risks among 3+ year cancer survivors in the U.S. The observational study was based on 1999-2009 National Health Interview Survey Linked Mortality Files with follow-up through 2011. After applying exclusion criteria, there were 13,997 observations. The hazard ratios (HRs) for meeting recommendations on muscle-strengthening activities only, on aerobic activities only, and on both types of physical activity (i.e., adhering to complete guidelines) were calculated using a reference group of cancer survivors engaging in neither. Unadjusted and adjusted HRs of all-cause, cancer-specific, and cardiovascular disease-specific mortalities were estimated using Cox proportional hazards models. In all models, compared to the reference group, cancer survivors adhering to complete guidelines had significantly decreased all-cause, cancer-specific, and cardiovascular disease-specific mortalities (HRs ranged from 0.37 to 0.64, p's < 0.05). There were no statistically significant differences between hazard rates of cancer survivors engaging in recommended levels of muscle-strengthening activities only and the reference group (HRs ranged from 0.76 to 0.94, p's > 0.05). Wald test statistics suggested a significant dose-response relationship between levels of adherence to complete guidelines and cancer-specific mortality. While muscle-strengthening activities by themselves do not appear to reduce mortality risks, such activities may provide added cancer-specific survival benefits to 3+ year cancer survivors who are already aerobically active.

  5. Learning discriminative functional network features of schizophrenia

    NASA Astrophysics Data System (ADS)

    Gheiratmand, Mina; Rish, Irina; Cecchi, Guillermo; Brown, Matthew; Greiner, Russell; Bashivan, Pouya; Polosecki, Pablo; Dursun, Serdar

    2017-03-01

    Associating schizophrenia with disrupted functional connectivity is a central idea in schizophrenia research. However, identifying neuroimaging-based features that can serve as reliable "statistical biomarkers" of the disease remains a challenging open problem. We argue that generalization accuracy and stability of candidate features ("biomarkers") must be used as additional criteria on top of standard significance tests in order to discover more robust biomarkers. Generalization accuracy refers to the utility of biomarkers for making predictions about individuals, for example discriminating between patients and controls, in novel datasets. Feature stability refers to the reproducibility of the candidate features across different datasets. Here, we extracted functional connectivity network features from fMRI data at both high-resolution (voxel-level) and a spatially down-sampled lower-resolution ("supervoxel" level). At the supervoxel level, we used whole-brain network links, while at the voxel level, due to the intractably large number of features, we sampled a subset of them. We compared statistical significance, stability and discriminative utility of both feature types in a multi-site fMRI dataset, composed of schizophrenia patients and healthy controls. For both feature types, a considerable fraction of features showed significant differences between the two groups. Also, both feature types were similarly stable across multiple data subsets. However, the whole-brain supervoxel functional connectivity features showed a higher cross-validation classification accuracy of 78.7% vs. 72.4% for the voxel-level features. Cross-site variability and heterogeneity in the patient samples in the multi-site FBIRN dataset made the task more challenging compared to single-site studies. The use of the above methodology in combination with the fully data-driven approach using the whole brain information have the potential to shed light on "biomarker discovery" in schizophrenia.

  6. Evaluation of the 3M™ Petrifilm™ Salmonella express system for the detection of Salmonella species in selected foods: collaborative study.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Jechorek, Robert

    2014-01-01

    The 3M™ Petriflm™ Salmonella Express (SALX) System is a simple, ready-to-use chromogenic culture medium system for the rapid qualitative detection and biochemical confirmation of Salmonella spp. in food and food process environmental samples. The 3M Petrifilm SALX System was compared using an unpaired study design in a multilaboratory collaborative study to the U.S. Department of Agriculture/Food Safety and Inspection Service (USDA/FSIS) Microbiology Laboratory Guidebook (MLG) 4.07 (2013) Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg and Catfish Products and Carcass and Environmental Sponges for raw ground beef and the U.S. Food and Drug Administration Bacteriological Analytical Manual (FDA/BAM) Chapter 5, Salmonella (2011) reference method for dry dog food following the current AOAC validation guidelines. For this study, a total of 17 laboratories located throughout the continental United States evaluated 1872 test portions. For the 3M Petrifilm SALX System, raw ground beef was analyzed using 25 g test portions, and dry dog food was analyzed using 375 g test portions. For the reference methods, 25 g test portions of each inatrix were analyzed. The two matrices were artificially contaminated with Salmonella at three inoculation levels: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). Each inoculation level was statistically analyzed using the probability of detection statistical model. For the raw ground beef and dry dog food test portions, no significant differences at the 95% confidence interval were observed in the number of positive samples detected by the 3M Petrifilm SALX System versus either the USDA/FSIS-MLG or FDA/BAM methods.

  7. First proficiency testing to evaluate the ability of European Union National Reference Laboratories to detect staphylococcal enterotoxins in milk products.

    PubMed

    Hennekinne, Jacques-Antoine; Gohier, Martine; Maire, Tiphaine; Lapeyre, Christiane; Lombard, Bertrand; Dragacci, Sylviane

    2003-01-01

    The European Commission has designed a network of European Union-National Reference Laboratories (EU-NRLs), coordinated by a Community Reference Laboratory (CRL), for control of hygiene of milk and milk products (Council Directive 92/46/ECC). As a common contaminant of milk and milk products such as cheese, staphylococcal enterotoxins are often involved in human outbreaks and should be monitored regularly. The main tasks of the EU-CRLs were to select and transfer to the EU-NRLs a reference method for detection of enterotoxins, and to set up proficiency testing to evaluate the competency of the European laboratory network. The first interlaboratory exercise was performed on samples of freeze-dried cheese inoculated with 2 levels of staphylococcal enterotoxins (0.1 and 0.25 ng/g) and on an uninoculated control. These levels were chosen considering the EU regulation for staphylococcal enterotoxins in milk and milk products and the limit of detection of the enzyme-linked immunosorbent assay test recommended in the reference method. The trial was conducted according to the recommendations of ISO Guide 43. Results produced by laboratories were compiled and compared through statistical analysis. Except for data from 2 laboratories for the uninoculated control and cheese inoculated at 0.1 ng/g, all laboratories produced satisfactory results, showing the ability of the EU-NRL network to monitor the enterotoxin contaminant.

  8. Evaluation of salivary fluoride retention from a new high fluoride mouthrinse.

    PubMed

    Mason, Stephen C; Shirodaria, Soha; Sufi, Farzana; Rees, Gareth D; Birkhed, Dowen

    2010-11-01

    To evaluate salivary fluoride retention from a new high fluoride daily use mouthrinse over a 120 min period. Sixteen subjects completed a randomised single-blind, four-treatment cross-over trial. Sensodyne® Pronamel® mouthrinse (A) contained 450 ppm fluoride; reference products were Colgate® Fluorigard® (B), Listerine® Total Care (C) and Listerine Softmint Sensation (D) containing 225, 100 and 0 ppm fluoride respectively. Salivary fluoride retention was monitored ex vivo after a single supervised use of test product (10 mL, 60 s). Samples were collected at 0, 1, 3, 5, 15, 30, 60 and 120 min post-rinse, generating fluoride clearance curves from which the area under the curve (AUC) was calculated. Differences in salivary fluoride concentrations for each product were analysed using ANCOVA at each time point using a 5% significance level, as well as lnAUC for the periods 0-120, 0-1, 1-15, 15-60 and 60-120 min. Pairwise comparisons between all treatment groups were performed. Salivary fluoride levels for A-C peaked immediately following use. Fluoride levels were statistically significantly higher for A versus B-D (p≤ 0.004), linear dose responses were apparent. AUC(0-120) was statistically significantly greater for A than for B (p = 0.035), C (p< 0.0001) and D (p< 0.0001). Post-hoc comparisons of lnAUC for the remaining time domains showed fluoride retention from A was statistically significantly greater versus B-D (p< 0.0001). Single-use treatment with the new mouthrinse containing 450 ppm fluoride resulted in statistically significantly higher salivary fluoride levels throughout the 120 min test period. Total fluoride retention (AUC(0-120)) was also statistically significantly greater versus comparator rinse treatments. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Elemental composition of normal primary tooth enamel analyzed with XRMA and SIMS.

    PubMed

    Sabel, Nina; Dietz, Wolfram; Lundgren, Ted; Nietzsche, Sandor; Odelius, Hans; Rythén, Marianne; Rizell, Sara; Robertson, Agneta; Norén, Jörgen G; Klingberg, Gunilla

    2009-01-01

    There is an interest to analyze the chemical composition of enamel in teeth from patients with different developmental disorders or syndromes and evaluate possible differences compared to normal composition. For this purpose, it is essential to have reference material. The aim of this study was to, by means of X-ray micro analyses (XRMA) and secondary ion mass spectrometry (SIMS), present concentration gradients for C, O, P and Ca and F, Na, Mg, Cl, K and Sr in normal enamel of primary teeth from healthy individuals. 36 exfoliated primary teeth from 36 healthy children were collected, sectioned, and analyzed in the enamel and dentin with X-ray micro analyses for the content of C, O, P and Ca and F, Na MgCl, K and Sr. This study has supplied reference data for C, O, P and Ca in enamel in primary teeth from healthy subjects. No statistically significant differences in the elemental composition were found between incisors and molars.The ratio Ca/P is in concordance with other studies. Some elements have shown statistically significant differences between different levels of measurement. These results may be used as reference values for research on the chemical composition of enamel and dentin in primary teeth from patients with different conditions and/or syndromes.

  10. An evaluation of Wikipedia as a resource for patient education in nephrology.

    PubMed

    Thomas, Garry R; Eng, Lawson; de Wolff, Jacob F; Grover, Samir C

    2013-01-01

    Wikipedia, a multilingual online encyclopedia, is a common starting point for patient medical searches. As its articles can be authored and edited by anyone worldwide, the credibility of the medical content of Wikipedia has been openly questioned. Wikipedia medical articles have also been criticized as too advanced for the general public. This study assesses the comprehensiveness, reliability, and readability of nephrology articles on Wikipedia. The International Statistical Classification of Diseases and Related problems, 10th Edition (ICD-10) diagnostic codes for nephrology (N00-N29.8) were used as a topic list to investigate the English Wikipedia database. Comprehensiveness was assessed by the proportion of ICD-10 codes that had corresponding articles. Reliability was measured by both the number of references per article and proportion of references from substantiated sources. Finally, readability was assessed using three validated indices (Flesch-Kincaid grade level, Automated readability index, and Flesch reading ease). Nephrology articles on Wikipedia were relatively comprehensive, with 70.5% of ICD-10 codes being represented. The articles were fairly reliable, with 7.1 ± 9.8 (mean ± SD) references per article, of which 59.7 ± 35.0% were substantiated references. Finally, all three readability indices determined that nephrology articles are written at a college level. Wikipedia is a comprehensive and fairly reliable medical resource for nephrology patients that is written at a college reading level. Accessibility of this information for the general public may be improved by hosting it at alternative Wikipedias targeted at a lower reading level, such as the Simple English Wikipedia. © 2013 Wiley Periodicals, Inc.

  11. Two-year follow-up biomonitoring pilot study of residents' and controls' PFC plasma levels after PFOA reduction in public water system in Arnsberg, Germany.

    PubMed

    Brede, Edna; Wilhelm, Michael; Göen, Thomas; Müller, Johannes; Rauchfuss, Knut; Kraft, Martin; Hölzer, Jürgen

    2010-06-01

    Residents in Arnsberg, Germany, had been supplied by drinking water contaminated with perfluorooctanoate (PFOA). Biomonitoring data from 2006 evidenced that plasma PFOA concentrations of residents from Arnsberg were 4.5-8.3 times higher than those in reference groups. The introduction of charcoal filtration in July 2006 distinctly reduced PFOA concentrations in drinking water. Our one-year follow-up study showed a 10-20% reduction of PFOA plasma levels in residents from Arnsberg. Here we report the first results of the two-year follow-up study Arnsberg 2008. Additionally, the results of the two-year follow-up examination of the reference group are included. Paired plasma samples of 138 study participants (45 children, 46 mothers and 47 men) collected in 2006 and 2008 were considered in the statistical analyses. Within the two years plasma concentrations of PFOA, perfluorooctanesulfonate (PFOS) and perfluorohexanesulfonate (PFHxS) decreased in residents from Arnsberg and in control groups. The geometric means of PFOA plasma levels declined by 39% (children and mothers) and 26% (men) in Arnsberg and by 13-15% in the corresponding subgroups from the reference areas. For the population from Arnsberg a geometric mean plasma PFOA half-life of 3.26 years (range 1.03-14.67 years) was calculated. Our results confirm an ongoing reduction of the PFOA load in residents from Arnsberg. The decline of PFC levels in plasma of participants from the reference areas reflects the general decrease of human PFC exposure during the very recent years. Copyright 2010 Elsevier GmbH. All rights reserved.

  12. The effects of an energy efficiency retrofit on indoor air quality.

    PubMed

    Frey, S E; Destaillats, H; Cohn, S; Ahrentzen, S; Fraser, M P

    2015-04-01

    To investigate the impacts of an energy efficiency retrofit, indoor air quality and resident health were evaluated at a low-income senior housing apartment complex in Phoenix, Arizona, before and after a green energy building renovation. Indoor and outdoor air quality sampling was carried out simultaneously with a questionnaire to characterize personal habits and general health of residents. Measured indoor formaldehyde levels before the building retrofit routinely exceeded reference exposure limits, but in the long-term follow-up sampling, indoor formaldehyde decreased for the entire study population by a statistically significant margin. Indoor PM levels were dominated by fine particles and showed a statistically significant decrease in the long-term follow-up sampling within certain resident subpopulations (i.e. residents who report smoking and residents who had lived longer at the apartment complex). © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Reliability of reference distances used in photogrammetry.

    PubMed

    Aksu, Muge; Kaya, Demet; Kocadereli, Ilken

    2010-07-01

    To determine the reliability of the reference distances used for photogrammetric assessment. The sample consisted of 100 subjects with mean ages of 22.97 +/- 2.98 years. Five lateral and four frontal parameters were measured directly on the subjects' faces. For photogrammetric assessment, two reference distances for the profile view and three reference distances for the frontal view were established. Standardized photographs were taken and all the parameters that had been measured directly on the face were measured on the photographs. The reliability of the reference distances was checked by comparing direct and indirect values of the parameters obtained from the subjects' faces and photographs. Repeated measure analysis of variance (ANOVA) and Bland-Altman analyses were used for statistical assessment. For profile measurements, the indirect values measured were statistically different from the direct values except for Sn-Sto in male subjects and Prn-Sn and Sn-Sto in female subjects. The indirect values of Prn-Sn and Sn-Sto were reliable in both sexes. The poorest results were obtained in the indirect values of the N-Sn parameter for female subjects and the Sn-Me parameter for male subjects according to the Sa-Sba reference distance. For frontal measurements, the indirect values were statistically different from the direct values in both sexes except for one in male subjects. The indirect values measured were not statistically different from the direct values for Go-Go. The indirect values of Ch-Ch were reliable in male subjects. The poorest results were obtained according to the P-P reference distance. For profile assessment, the T-Ex reference distance was reliable for Prn-Sn and Sn-Sto in both sexes. For frontal assessment, Ex-Ex and En-En reference distances were reliable for Ch-Ch in male subjects.

  14. SIRU utilization. Volume 1: Theory, development and test evaluation

    NASA Technical Reports Server (NTRS)

    Musoff, H.

    1974-01-01

    The theory, development, and test evaluations of the Strapdown Inertial Reference Unit (SIRU) are discussed. The statistical failure detection and isolation, single position calibration, and self alignment techniques are emphasized. Circuit diagrams of the system components are provided. Mathematical models are developed to show the performance characteristics of the subsystems. Specific areas of the utilization program are identified as: (1) error source propagation characteristics and (2) local level navigation performance demonstrations.

  15. Lead exposure in passerines inhabiting lead-contaminated floodplains in the Coeur d'Alene River Basin, Idaho, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.D.; Kern, J.W.; Strickland, M.D.

    1999-06-01

    Blood collected from song sparrows (Melospiza melodia) and American robins (Turdus migratorius) captured with mist nets in a lead-contaminated (assessment) area and nearby uncontaminated (reference) areas within the Coeur d'Alene Basin in northern Idaho was analyzed for [delta]-aminolevulinic acid dehydratase activity (ALAD) and hematocrit levels, and livers were analyzed for lead. Mean ALAD inhibition in the assessment area was 51% in song sparrows and 75% in American robins. The proportion of the sampled population with ALAD inhibition > 50% was calculated to be 43% for song sparrows and 83% for American robins. Assessment area hematocrit values for song sparrows andmore » American robins were lower than in reference areas; however, differences were not statistically significant. Significantly higher levels of lead (wet weight) were found in livers from song sparrows captured on the assessment area ([bar x] = 1.93 ppm) than on reference areas. Study results indicate that 43% of the song sparrows and 83% of the American robins inhabiting the floodplain along the Coeur d'Alene River in the assessment area are being exposed to lead at levels sufficient to inhibit ALAD by > 50%. Variability in lead exposure indicators was attributed to high variability in environmental lead concentrations in the Coeur d'Alene River Basin.« less

  16. Aquarius Instrument Science Calibration During the Risk Reduction Phase

    NASA Technical Reports Server (NTRS)

    Ruf, Christopher S.

    2004-01-01

    This final report presents the results of work performed under NASA Grant NAG512726 during the period 15 January 2003 through 30 June 2004. An analysis was performed of a possible vicarious calibration method for use by Aquarius to monitor and stabilize the absolute and relative calibration of its microwave radiometer. Stationary statistical properties of the brightness temperature (T(sub B)) measured by a low Earth orbiting radiometer operating at 1.4135 GHz are considered as a means of validating its absolute calibration. The global minimum, maximum, and average T(sub B) are considered, together with a vicarious cold reference method that detects the presence of a sharp lower bound on naturally occurring values for T(sub B). Of particular interest is the reliability with which these statistics can be extracted from a realistic distribution of T(sub B) measurements that would be observed by a typical sensor. Simulations of measurements are performed that include the effects of instrument noise and variable environmental factors such as the global water vapor and ocean surface temperature, salinity and wind distributions. Global minima can vary widely due to instrument noise and are not a reliable calibration reference. Global maxima are strongly influenced by several environmental factors as well as instrument noise and are even less stationary. Global averages are largely insensitive to instrument noise and, in most cases, to environmental conditions as well. The global average T(sub B) varies at only the 0.1 K RMS level except in cases of anomalously high winds, when it can increase considerably more. The vicarious cold reference is similarly insensitive to instrument effects and most environmental factors. It is not significantly affected by high wind conditions. The stability of the vicarious reference is, however, found to be somewhat sensitive (at the several tenths of Kelvins level) to variations in the background cold space brightness, T(sub c). The global average is much less sensitive to this parameter and so using two approaches together can be mutually beneficial.

  17. Comparative bioavailability study of clonazepam after oral administration of two tablet formulations.

    PubMed

    Chauhan, B L; Sane, S P; Revankar, S N; Rammamurthy, L; Doshi, B; Bhatt, A D; Bhate, V R; Kulkarni, R D

    2000-10-01

    To assess the bioavailability of clonazepam from two brands of 2 mg tablet formulations (Epitril and a reference brand), a two-way randomised cross-over bioavailability study was carried out in 12 healthy male volunteers. Coded plasma samples were analysed for levels of clonazepam by a high-performance liquid chromatography (HPLC) method. The mean Cmax, Tmax, t1/2 beta and AUC(0-48) for Epitril were 16.31 +/- 3.07 ng/mL, 1.63 +/- 0.48 h, 46.97 +/- 12.26 h and 207.70 +/- 57.07 ng/mL.h; those for the reference brand were 19.75 +/- 5.95 ng/mL, 1.42 +/- 0.29 h, 46.88 +/- 11.29 h and 215.70 +/- 50.89 ng/mL.h, respectively. These values were comparable and the differences were not statistically significant. Based on the above pharmacokinetic parameters, Epitril was bioequivalent to the reference brand.

  18. Flame detector operable in presence of proton radiation

    NASA Technical Reports Server (NTRS)

    Walker, D. J.; Turnage, J. E.; Linford, R. M. F.; Cornish, S. D. (Inventor)

    1974-01-01

    A detector of ultraviolet radiation for operation in a space vehicle which orbits through high intensity radiation areas is described. Two identical ultraviolet sensor tubes are mounted within a shield which limits to acceptable levels the amount of proton radiation reaching the sensor tubes. The shield has an opening which permits ultraviolet radiation to reach one of the sensing tubes. The shield keeps ultraviolet radiation from reaching the other sensor tube, designated the reference tube. The circuitry of the detector subtracts the output of the reference tube from the output of the sensing tube, and any portion of the output of the sensing tube which is due to proton radiation is offset by the output of the reference tube. A delay circuit in the detector prevents false alarms by keeping statistical variations in the proton radiation sensed by the two sensor tubes from developing an output signal.

  19. An open-source textbook for teaching climate-related risk analysis using the R computing environment

    NASA Astrophysics Data System (ADS)

    Applegate, P. J.; Keller, K.

    2015-12-01

    Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.

  20. A Frequency Domain Approach to Pretest Analysis Model Correlation and Model Updating for the Mid-Frequency Range

    DTIC Science & Technology

    2009-02-01

    ... range of modal analysis and the high-frequency region of statistical energy analysis is referred to as the mid-frequency range. ... The averaging process is consistent with the averaging done in statistical energy analysis for stochastic systems. The FEM will always ...

  1. Agreement between diagnoses reached by clinical examination and available reference standards: a prospective study of 216 patients with lumbopelvic pain

    PubMed Central

    Laslett, Mark; McDonald, Barry; Tropp, Hans; Aprill, Charles N; Öberg, Birgitta

    2005-01-01

    Background The tissue origin of low back pain (LBP) or referred lower extremity symptoms (LES) may be identified in about 70% of cases using advanced imaging, discography and facet or sacroiliac joint blocks. These techniques are invasive and availability varies. A clinical examination is non-invasive and widely available but its validity is questioned. Diagnostic studies usually examine single tests in relation to single reference standards, yet in clinical practice, clinicians use multiple tests and select from a range of possible diagnoses. There is a need for studies that evaluate the diagnostic performance of clinical diagnoses against available reference standards. Methods We compared blinded clinical diagnoses with diagnoses based on available reference standards for known causes of LBP or LES such as discography, facet, sacroiliac or hip joint blocks, epidural injections, advanced imaging studies or any combination of these tests. A prospective, blinded validity design was employed. Physiotherapists examined consecutive patients with chronic lumbopelvic pain and/or referred LES scheduled to receive the reference standard examinations. When diagnoses were in complete agreement regardless of complexity, "exact" agreement was recorded. When the clinical diagnosis was included within the reference standard diagnoses, "clinical agreement" was recorded. The proportional chance criterion (PCC) statistic was used to estimate agreement on multiple diagnostic possibilities because it accounts for the prevalence of individual categories in the sample. The kappa statistic was used to estimate agreement on six pathoanatomic diagnoses. Results In a sample of chronic LBP patients (n = 216) with high levels of disability and distress, 67% received a pathoanatomic diagnosis based on available reference standards, and 10% had more than one tissue origin of pain identified. For 27 diagnostic categories and combinations, chance clinical agreement (PCC) was estimated at 13%. "Exact" agreement between clinical and reference standard diagnoses was 32% and "clinical agreement" 51%. For six pathoanatomic categories (disc, facet joint, sacroiliac joint, hip joint, nerve root and spinal stenosis), PCC was 33% with actual agreement 56%. There was no overlap of 95% confidence intervals on any comparison. Diagnostic agreement on the six most common pathoanatomic categories produced a kappa of 0.31. Conclusion Clinical diagnoses agree with reference standard diagnoses more often than chance. Using available reference standards, most patients can have a tissue source of pain identified. PMID:15943873
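
    To make the two agreement statistics above concrete, here is a minimal Python sketch (not the study's own code) of one common formulation of the proportional chance criterion and of Cohen's kappa; the diagnosis labels in the example are hypothetical.

```python
import numpy as np

def proportional_chance_criterion(labels):
    """PCC = sum of squared category proportions: the agreement expected if
    diagnoses were assigned at random with the observed prevalences."""
    _, counts = np.unique(np.asarray(labels), return_counts=True)
    p = counts / counts.sum()
    return float(np.sum(p ** 2))

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two sets of categorical diagnoses."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    cats = np.union1d(a, b)
    observed = np.mean(a == b)
    pa = np.array([np.mean(a == c) for c in cats])
    pb = np.array([np.mean(b == c) for c in cats])
    expected = float(np.sum(pa * pb))
    return (observed - expected) / (1.0 - expected)

# Hypothetical clinical vs. reference-standard diagnoses
clinical  = ["disc", "facet", "disc", "SIJ", "disc",  "stenosis"]
reference = ["disc", "SIJ",   "disc", "SIJ", "facet", "stenosis"]
print(proportional_chance_criterion(reference))
print(cohens_kappa(clinical, reference))
```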

  2. Electric power annual 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Electric Power Annual presents a summary of electric utility statistics at national, regional and State levels. The objective of the publication is to provide industry decisionmakers, government policymakers, analysts and the general public with historical data that may be used in understanding US electricity markets. The Electric Power Annual is prepared by the Survey Management Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. The "US Electric Power Industry at a Glance" section presents a profile of the electric power industry ownership and performance, and a review of key statistics for the year. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; retail sales; revenue; financial statistics; environmental statistics; electric power transactions; demand-side management; and nonutility power producers. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences in US electricity power systems. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. Monetary values in this publication are expressed in nominal terms.

  3. Status and Future Developments of SIRGAS

    NASA Astrophysics Data System (ADS)

    Fortes, L.; Lauría, E.; Brunini, C.; Amaya, W.; Sanchez, L.; Drewes, H.

    2007-05-01

    This paper presents the status and future developments of the SIRGAS (Geocentric Reference System for the Americas) project. Since its creation in 1993, SIRGAS has coordinated two continental GPS campaigns, in 1995 and 2000, responsible for the establishment of a very accurate 3D reference frame in the region. First focusing on South America, the project has expanded its scope to Latin America since 2001. Currently the maintenance of the SIRGAS reference frame is carried out through more than 80 continuously operating GNSS (Global Navigation Satellite System) stations available in the region, whose data are officially processed by the International GNSS Service (IGS) Regional Network Associate Analysis Centre for SIRGAS (IGS RNAAC-SIR), functioning at DGFI (Deutsches Geodätisches Forschungsinstitut) in Munich, to generate weekly coordinates and velocity information for each continuous GNSS station. Since October 2006, five additional experimental processing centers - located at the Brazilian Institute of Geography and Statistics (IBGE), the National Institute of Statistics, Geography and Informatics of Mexico (INEGI), the Military Geographic Institute of Argentina (IGM), the University of La Plata (UNLP), Argentina, and the Geographic Institute Agustín Codazzi, Colombia (IGAC) - have also been processing data from those stations in order to assume official processing responsibility in the near future. Many Latin American countries have already adopted SIRGAS as their new official reference system. In addition, efforts have been carried out to connect the national geodetic networks of Central American countries to the SIRGAS reference frame, which will be accomplished by a GNSS campaign scheduled for the first half of 2007. In terms of vertical datum, SIRGAS continues to coordinate with each member country all the necessary efforts towards making the geodetic leveling data available together with gravity information in order to support the computation of geopotential numbers, to be unified in a continental adjustment.

  4. The bioavailability of manganese in welders in relation to its solubility in welding fumes.

    PubMed

    Ellingsen, Dag G; Zibarev, Evgenij; Kusraeva, Zarina; Berlinger, Balazs; Chashchin, Maxim; Bast-Pettersen, Rita; Chashchin, Valery; Thomassen, Yngvar

    2013-02-01

    Blood and urine samples for determination of manganese (Mn) and iron (Fe) concentrations were collected in a cross-sectional study of 137 currently exposed welders, 137 referents and 34 former welders. Aerosol samples for measurements of personal air exposure to Mn and Fe were also collected. The aerosol samples were assessed for their solubility using a simulated lung lining fluid (Hatch solution). On average 13.8% of the total Mn mass (range 1-49%; N = 237) was soluble (Hatch sol), while only 1.4% (<0.1-10.0%; N = 237) of the total Fe mass was Hatch sol. The welders had statistically significantly higher geometric mean concentrations of Mn in whole blood (B-Mn 12.8 vs. 8.0 μg L(-1)), serum (S-Mn 1.04 vs. 0.77 μg L(-1)) and urine (U-Mn 0.36 vs. 0.07 μg g(-1) cr.) than the referents. Statistically significant univariate correlations were observed between exposure to Hatch sol Mn in the welding aerosol and B-Mn, S-Mn and U-Mn respectively. Pearson's correlation coefficient between the mean Hatch sol Mn of the two days preceding the collection of biological samples and U-Mn was 0.46 (p < 0.001). The duration of employment as a welder in years was also associated with B-Mn and S-Mn, but not with U-Mn. Statistically significantly higher U-Mn and B-Mn were observed in welders currently exposed to even less than 12 and 6 μg m(-3) Hatch sol Mn, respectively. When using the 95th percentile concentration among the referents as a cut-point, 70.0 and 64.5% of the most highly exposed welders exceeded this level with respect to B-Mn and U-Mn. The concentrations of B-Mn, S-Mn and U-Mn were all highly correlated in the welders, but not in the referents.

  5. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
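
    A minimal sketch, with simulated ratio-level data, of the choice discussed above: the nonparametric Kruskal-Wallis H test alongside the parametric one-way ANOVA F test for three independent groups. The group means and sample sizes are made up for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical ratio-level outcome (e.g., range of motion in degrees) for
# three independent treatment groups
group_a = rng.normal(55, 8, size=12)
group_b = rng.normal(60, 8, size=12)
group_c = rng.normal(68, 8, size=12)

h_stat, p_h = stats.kruskal(group_a, group_b, group_c)   # H test on ranks
f_stat, p_f = stats.f_oneway(group_a, group_b, group_c)  # parametric F test

print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_h:.4f}")
print(f"One-way ANOVA  F = {f_stat:.2f}, p = {p_f:.4f}")
```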

  6. DISSCO: direct imputation of summary statistics allowing covariates

    PubMed Central

    Xu, Zheng; Duan, Qing; Yan, Song; Chen, Wei; Li, Mingyao; Lange, Ethan; Li, Yun

    2015-01-01

    Background: Imputation of individual level genotypes at untyped markers using an external reference panel of genotyped or sequenced individuals has become standard practice in genetic association studies. Direct imputation of summary statistics can also be valuable, for example in meta-analyses where individual level genotype data are not available. Two methods (DIST and ImpG-Summary/LD), that assume a multivariate Gaussian distribution for the association summary statistics, have been proposed for imputing association summary statistics. However, both methods assume that the correlations between association summary statistics are the same as the correlations between the corresponding genotypes. This assumption can be violated in the presence of confounding covariates. Methods: We analytically show that in the absence of covariates, correlation among association summary statistics is indeed the same as that among the corresponding genotypes, thus serving as a theoretical justification for the recently proposed methods. We continue to prove that in the presence of covariates, correlation among association summary statistics becomes the partial correlation of the corresponding genotypes controlling for covariates. We therefore develop direct imputation of summary statistics allowing covariates (DISSCO). Results: We consider two real-life scenarios where the correlation and partial correlation likely make practical difference: (i) association studies in admixed populations; (ii) association studies in presence of other confounding covariate(s). Application of DISSCO to real datasets under both scenarios shows at least comparable, if not better, performance compared with existing correlation-based methods, particularly for lower frequency variants. For example, DISSCO can reduce the absolute deviation from the truth by 3.9–15.2% for variants with minor allele frequency <5%. Availability and implementation: http://www.unc.edu/∼yunmli/DISSCO. Contact: yunli@med.unc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25810429

  7. DISSCO: direct imputation of summary statistics allowing covariates.

    PubMed

    Xu, Zheng; Duan, Qing; Yan, Song; Chen, Wei; Li, Mingyao; Lange, Ethan; Li, Yun

    2015-08-01

    Imputation of individual level genotypes at untyped markers using an external reference panel of genotyped or sequenced individuals has become standard practice in genetic association studies. Direct imputation of summary statistics can also be valuable, for example in meta-analyses where individual level genotype data are not available. Two methods (DIST and ImpG-Summary/LD), that assume a multivariate Gaussian distribution for the association summary statistics, have been proposed for imputing association summary statistics. However, both methods assume that the correlations between association summary statistics are the same as the correlations between the corresponding genotypes. This assumption can be violated in the presence of confounding covariates. We analytically show that in the absence of covariates, correlation among association summary statistics is indeed the same as that among the corresponding genotypes, thus serving as a theoretical justification for the recently proposed methods. We continue to prove that in the presence of covariates, correlation among association summary statistics becomes the partial correlation of the corresponding genotypes controlling for covariates. We therefore develop direct imputation of summary statistics allowing covariates (DISSCO). We consider two real-life scenarios where the correlation and partial correlation likely make practical difference: (i) association studies in admixed populations; (ii) association studies in presence of other confounding covariate(s). Application of DISSCO to real datasets under both scenarios shows at least comparable, if not better, performance compared with existing correlation-based methods, particularly for lower frequency variants. For example, DISSCO can reduce the absolute deviation from the truth by 3.9-15.2% for variants with minor allele frequency <5%. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
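
    A minimal numpy sketch (not the DISSCO implementation) of the quantity the paper shows to be the relevant one in the presence of covariates: the partial correlation between two genotype vectors after controlling for a confounder. The SNP and ancestry variables below are simulated.

```python
import numpy as np

def partial_correlation(g1, g2, covariates):
    """Correlation between the residuals of g1 and g2 after regressing each
    on the covariates (an intercept is added automatically)."""
    X = np.column_stack([np.ones(len(g1)), covariates])
    beta1, *_ = np.linalg.lstsq(X, g1, rcond=None)
    beta2, *_ = np.linalg.lstsq(X, g2, rcond=None)
    r1, r2 = g1 - X @ beta1, g2 - X @ beta2
    return np.corrcoef(r1, r2)[0, 1]

# Hypothetical example: two SNP genotype vectors whose frequencies both
# depend on a single ancestry covariate
rng = np.random.default_rng(1)
ancestry = rng.normal(size=500)
snp1 = rng.binomial(2, 0.3 + 0.1 * (ancestry > 0))
snp2 = rng.binomial(2, 0.3 + 0.1 * (ancestry > 0))
print(np.corrcoef(snp1, snp2)[0, 1])              # marginal correlation
print(partial_correlation(snp1, snp2, ancestry))  # correlation given covariate
```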

  8. Immunoglobulin E-mediated sensitization to pine and beech dust in relation to wood dust exposure levels and respiratory symptoms in the furniture industry.

    PubMed

    Schlünssen, Vivi; Kespohl, Sabine; Jacobsen, Gitte; Raulf-Heimsoth, Monika; Schaumburg, Inger; Sigsgaard, Torben

    2011-03-01

    Wood dust exposure may cause Immunoglobulin E (IgE)-mediated allergic diseases. Our objectives were to estimate pine and beech dust sensitization rates among woodworkers and a reference group, explore the association between exposure and sensitization and between sensitization and respiratory symptoms, and finally investigate the impact of proteinogenic specific IgE (sIgE) epitopes on respiratory symptoms. In a Danish study among 52 furniture factories and 2 reference factories, we evaluated the workers' asthma and rhinitis status using questionnaires and blood samples collected from 1506 woodworkers and 195 references. Workers with asthma symptoms (N=298), a random study sample (N=399) and a random rhinitis sample (N=100) were evaluated for IgE-mediated sensitization to pine and beech dust. The prevalence of pine and beech sensitization among current woodworkers was 1.7 and 3.1%, respectively. No differences in sensitization rates were found between woodworkers and references, but the prevalence of wood dust sensitization was dose-dependently associated with the current level of wood dust exposure. No relation was observed between wood dust sensitization per se and respiratory symptoms. Only symptomatic subjects had proteinogenic IgE epitopes to pine. Increased odds ratios for sIgE based on proteinogenic epitopes to beech and respiratory symptoms were found, although they were not statistically significant. Sensitization rates to pine and beech were the same for woodworkers and references but dependent on the current wood dust exposure level. The importance of beech and pine wood sensitization is limited, but may be of clinical significance for a few workers if the IgE epitopes are proteinogenic.

  9. The application of the statistical classifying models for signal evaluation of the gas sensors analyzing mold contamination of the building materials

    NASA Astrophysics Data System (ADS)

    Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk

    2017-07-01

    Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods, which are time-consuming and require a lot of manual labor. Fungi emit Volatile Organic Compounds (VOCs) that can be detected in indoor air using several detection techniques, e.g. chromatography. VOCs can also be detected using gas sensor arrays. Each sensor in the array generates a voltage signal that must be interpreted using properly selected statistical methods. This work focuses on applying statistical classification models to evaluate signals from a gas sensor array used to analyze air sampled from the headspace of various types of building materials at different levels of contamination, as well as from clean reference materials.

  10. A high-fidelity weather time series generator using the Markov Chain process on a piecewise level

    NASA Astrophysics Data System (ADS)

    Hersvik, K.; Endrerud, O.-E. V.

    2017-12-01

    A method is developed for generating a set of unique weather time-series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov Process is used, specifically by joining small pieces of random length time series together rather than joining individual weather states, each from a single time step, which is a common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
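
    A minimal sketch of the general idea described above, not the authors' exact algorithm: a new series is synthesized by joining random-length blocks of the observed series, with each next block chosen so that its starting state approximately matches the state at which the previous block ended. The block-length limits, matching tolerance and the synthetic wave-height record are assumptions.

```python
import numpy as np

def piecewise_markov_series(observed, length, min_block=6, max_block=48,
                            tolerance=0.25, seed=0):
    """Join random-length pieces of the observed series so that consecutive
    pieces connect at (approximately) matching states."""
    rng = np.random.default_rng(seed)
    obs = np.asarray(observed, dtype=float)
    out = [obs[rng.integers(len(obs) - max_block)]]   # seed with one value
    while len(out) < length:
        block_len = rng.integers(min_block, max_block + 1)
        # candidate start indices whose value is close to the current state
        starts = np.flatnonzero(
            np.abs(obs[: len(obs) - block_len] - out[-1]) <= tolerance)
        if starts.size == 0:                          # relax if nothing matches
            starts = np.arange(len(obs) - block_len)
        s = rng.choice(starts)
        out.extend(obs[s + 1 : s + 1 + block_len])
    return np.array(out[:length])

# Hypothetical hourly significant-wave-height record (metres)
rng = np.random.default_rng(3)
hs_observed = (1.5 + 0.8 * np.abs(np.sin(np.arange(2000) / 40))
               + 0.2 * rng.standard_normal(2000))
hs_synthetic = piecewise_markov_series(hs_observed, length=2000)
print(hs_synthetic.mean(), hs_observed.mean())
```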

  11. Comparing age-wise reference intervals for serum creatinine concentration in a "Reality check" of the recommended cut-off.

    PubMed

    Verma, Mascha; Khadapkar, Rashmi; Sahu, Priyadarshi Soumyaranjan; Das, Bibhu Ranjan

    2006-09-01

    An increase in communication within the healthcare services, both nationally and internationally, has strengthened the need for harmonization of measurements and reference intervals in laboratory medicine. In the present report, the calculated reference intervals for serum creatinine (sCr) levels of healthy normal individuals (n=1121) in different sex and age groups are compared with the established interval. The calculated reference interval for sCr level was 0.4-1.3 mg/dL and 0.6-1.3 mg/dL in the age groups of 21-40 and 41-60 years, respectively. The difference between the mean sCr values in total males and total females (age range 21-60 years) was statistically significant (p<0.0001). When male and female subjects were analyzed age-group wise, the data showed a significant difference in mean sCr values (p<0.0001) in three age groups (21-30, 31-40 and 41-50 years); however, in the older age group (51-60 years), the difference was non-significant (p=0.07). The reference ranges were 0.7-1.3 and 0.4-1.0 mg/dL for males and females respectively, where the lower limit was 0.1-0.2 units less than that of the standard limits. An increase in the mean value of sCr was observed, particularly in females, with an increase in age. Hence it is of interest to validate age-specific reference ranges for sCr in our population.
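
    A minimal sketch, using simulated rather than the study's values, of how a nonparametric reference interval (the central 95%, i.e. the 2.5th-97.5th percentiles) and the male/female mean comparison can be computed; the sample sizes and distributions below are assumptions.

```python
import numpy as np
from scipy.stats import ttest_ind

def reference_interval(values, lower=2.5, upper=97.5):
    """Nonparametric reference interval: central 95% of the observed values."""
    return tuple(np.round(np.percentile(values, [lower, upper]), 2))

rng = np.random.default_rng(9)
# Hypothetical serum creatinine values (mg/dL), age group 21-40
scr_male_21_40   = rng.normal(0.95, 0.15, size=300).clip(0.3)
scr_female_21_40 = rng.normal(0.75, 0.15, size=300).clip(0.3)

print("Males 21-40:  ", reference_interval(scr_male_21_40), "mg/dL")
print("Females 21-40:", reference_interval(scr_female_21_40), "mg/dL")
t, p = ttest_ind(scr_male_21_40, scr_female_21_40)
print(f"t = {t:.2f}, p = {p:.2e}")
```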

  12. Selection of Reliable Reference Genes for Gene Expression Studies of a Promising Oilseed Crop, Plukenetia volubilis, by Real-Time Quantitative PCR.

    PubMed

    Niu, Longjian; Tao, Yan-Bin; Chen, Mao-Sheng; Fu, Qiantang; Li, Chaoqiong; Dong, Yuling; Wang, Xiulan; He, Huiying; Xu, Zeng-Fu

    2015-06-03

    Real-time quantitative PCR (RT-qPCR) is a reliable and widely used method for gene expression analysis. The accuracy of the determination of a target gene expression level by RT-qPCR demands the use of appropriate reference genes to normalize the mRNA levels among different samples. However, suitable reference genes for RT-qPCR have not been identified in Sacha inchi (Plukenetia volubilis), a promising oilseed crop known for its polyunsaturated fatty acid (PUFA)-rich seeds. In this study, using RT-qPCR, twelve candidate reference genes were examined in seedlings and adult plants, during flower and seed development and for the entire growth cycle of Sacha inchi. Four statistical algorithms (delta cycle threshold (ΔCt), BestKeeper, geNorm, and NormFinder) were used to assess the expression stabilities of the candidate genes. The results showed that ubiquitin-conjugating enzyme (UCE), actin (ACT) and phospholipase A22 (PLA) were the most stable genes in Sacha inchi seedlings. For roots, stems, leaves, flowers, and seeds from adult plants, 30S ribosomal protein S13 (RPS13), cyclophilin (CYC) and elongation factor-1alpha (EF1α) were recommended as reference genes for RT-qPCR. During the development of reproductive organs, PLA, ACT and UCE were the optimal reference genes for flower development, whereas UCE, RPS13 and RNA polymerase II subunit (RPII) were optimal for seed development. Considering the entire growth cycle of Sacha inchi, UCE, ACT and EF1α were sufficient for the purpose of normalization. Our results provide useful guidelines for the selection of reliable reference genes for the normalization of RT-qPCR data for seedlings and adult plants, for reproductive organs, and for the entire growth cycle of Sacha inchi.
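
    As a simplified illustration of the kind of stability ranking these tools produce (a geNorm-style pairwise-variation measure, not the published geNorm, NormFinder, BestKeeper or RefFinder code), the sketch below scores each candidate gene by the mean standard deviation of its log2 expression ratio against every other candidate; lower M indicates more stable expression. The expression matrix and gene names are hypothetical.

```python
import numpy as np

def genorm_like_m(expr, gene_names):
    """expr: samples x genes matrix of relative expression (linear scale).
    Returns genes sorted from most to least stable by the M value."""
    log_expr = np.log2(expr)
    m_values = {}
    for j in range(expr.shape[1]):
        # log ratios of gene j against every other candidate, per sample
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        m_values[gene_names[j]] = float(np.mean(np.std(ratios, axis=0, ddof=1)))
    return dict(sorted(m_values.items(), key=lambda kv: kv[1]))

# Hypothetical relative-expression matrix for four candidate reference genes
rng = np.random.default_rng(7)
expr = np.exp(rng.normal(0.0, [0.10, 0.15, 0.40, 0.20], size=(12, 4)))
print(genorm_like_m(expr, ["UCE", "ACT", "RPII", "EF1a"]))
```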

  13. What Are the Key Statistics about Brain and Spinal Cord Cancers?

    MedlinePlus

    The American Cancer Society publishes key statistics for brain and spinal cord tumors in adults; see the American Cancer Society's Cancer Statistics Center for more key statistics.

  14. Oxidative toxic stress in workers occupationally exposed to ceramic dust: A study in a ceramic manufacturing industry.

    PubMed

    Shad, Mehri Keshvari; Barkhordari, Abolfaz; Mehrparvar, Amir Houshang; Dehghani, Ali; Ranjbar, Akram; Moghadam, Rashid Heidari

    2016-09-27

    Exposure to compounds used in ceramic industries appears to be associated with induction of oxidative toxic stress. This cross-sectional study was undertaken to assess the oxidative toxic stress parameters associated with occupational exposure to ceramic dust. Forty ceramic-exposed workers from a ceramic manufacturing industry and 40 unexposed referent subjects were studied. A questionnaire containing information regarding demographic variables, occupational history, history of any chronic disease, antioxidant consumption, and use of therapeutic drugs was administered to them. Oxidative toxic stress biomarkers including lipid peroxidation (LPO), total antioxidant power (TAP), levels of total thiol groups (TTG) and catalase (CAT) activity were measured. Significant increments in blood LPO levels and CAT activity, and concomitantly lower TAP, were observed in ceramic-exposed workers in comparison to the referent group. No statistically significant difference was noted in mean TTG levels between the groups. Findings of the study indicate that occupational exposure to ceramic dust induces oxidative toxic stress. Supplementation of workers with antioxidants may have beneficial effects on oxidative damage in ceramic industries.

  15. Improved Use of Small Reference Panels for Conditional and Joint Analysis with GWAS Summary Statistics.

    PubMed

    Deng, Yangqing; Pan, Wei

    2018-06-01

    Due to issues of practicality and confidentiality of genomic data sharing on a large scale, typically only meta- or mega-analyzed genome-wide association study (GWAS) summary data, not individual-level data, are publicly available. Reanalyses of such GWAS summary data for a wide range of applications have become more and more common and useful; these reanalyses often require the use of an external reference panel with individual-level genotypic data to infer linkage disequilibrium (LD) among genetic variants. However, with a sample size of only a few hundred, as in the most popular 1000 Genomes Project European sample, estimation errors for LD are not negligible, leading to often dramatically increased numbers of false positives in subsequent analyses of GWAS summary data. To alleviate the problem in the context of association testing for a group of SNPs, we propose an alternative estimator of the covariance matrix with an idea similar to multiple imputation. We use numerical examples based on both simulated and real data to demonstrate the severe problem with the use of the 1000 Genomes Project reference panels, and the improved performance of our new approach. Copyright © 2018 by the Genetics Society of America.

  16. Lead accumulation in woodchucks (Marmota monax) at small arms and skeet ranges.

    PubMed

    Johnson, Mark S; Major, Michael A; Casteel, Stan W

    2004-10-01

    Increasing concern regarding the stewardship of US Army lands requires a proactive program to evaluate sites of potential risk. Small arms and upland skeet ranges are a potentially significant source of lead exposure for burrowing mammals. Woodchucks (Marmota monax) were evaluated for lead exposure in a previously used upland skeet range and a small arms range, relative to animals collected at two nearby reference locations. Soil lead concentrations collected at burrow entrances on the firing ranges were compared with blood, bone, kidney, liver, and fecal concentrations of woodchucks collected from the reference areas. No statistical differences were found in the lead concentrations in tissue between woodchucks in reference and firing ranges; concentrations of lead in liver and kidney were below detection limits. Levels in bone, blood, and feces suggest the bioavailability of lead at these various sites, although other factors (e.g., differences in foraging areas, age structure, habitat preferences, and environmental conditions) were also likely to influence exposure. Blood lead levels were below those associated with toxicity. Further analysis of other ranges with higher lead concentrations and of small mammal species with smaller home ranges is recommended to further elucidate trends that could be extrapolated to other sites.

  17. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    PubMed

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
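
    A minimal sketch of the general recipe described above (directional filtering, generalized Gaussian fitting, and comparison of shape parameters against statistics learned from undistorted images); the difference kernels, the distance-based score and the synthetic images are assumptions, not the authors' filters or training procedure.

```python
import numpy as np
from scipy.ndimage import convolve
from scipy.stats import gennorm

# Simple horizontal, vertical and two diagonal difference kernels, used here
# as stand-ins for the paper's multidirectional filters
KERNELS = [np.array([[1, -1]]), np.array([[1], [-1]]),
           np.array([[1, 0], [0, -1]]), np.array([[0, 1], [-1, 0]])]

def shape_features(image):
    """Fit a generalized Gaussian to each directional filter response and
    return the fitted shape parameters (one per direction)."""
    feats = []
    for k in KERNELS:
        resp = convolve(image.astype(float), k, mode="reflect").ravel()
        beta, _, _ = gennorm.fit(resp)   # shape, loc, scale
        feats.append(beta)
    return np.array(feats)

def quality_score(test_image, reference_shapes):
    """Distance of the test image's shape parameters from the statistics
    learned on high-quality reference images (larger = more distorted)."""
    return float(np.sum(np.abs(shape_features(test_image) - reference_shapes)))

# Hypothetical usage with a synthetic "clean" image and a noisy copy
rng = np.random.default_rng(5)
clean = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)  # smooth-ish
standard = shape_features(clean)
noisy = clean + 5.0 * rng.normal(size=clean.shape)
print(quality_score(clean, standard), quality_score(noisy, standard))
```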

  18. Evaluation of Skylab IB sensitivity to on-pad winds with turbulence

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1972-01-01

    Computer simulation was performed to estimate displacements and bending moments experienced by the SKYLAB 1B vehicle on the launch pad due to atmospheric winds. The vehicle was assumed to be a beam-like structure represented by a finite number of generalized coordinates. Wind flow across the vehicle was treated as a nonhomogeneous, stationary random process. Response computations were performed by the assumption of simple strip theory and application of generalized harmonic analysis. Displacement and bending moment statistics were obtained for six vehicle propellant loading conditions and four representative reference wind profile and turbulence levels. Means, variances and probability distributions are presented graphically for each case. A separate analysis was performed to indicate the influence of wind gradient variations on vehicle response statistics.

  19. A Handbook of Sound and Vibration Parameters

    DTIC Science & Technology

    1978-09-18

    Static Divergence: (See Divergence.) Statistical Energy Analysis (SEA): Statistical energy analysis is ... the parameters of the circuits come from statistics of the vibrational characteristics of the structure. Statistical energy analysis is uniquely successful ...

  20. Direct access compared with referred physical therapy episodes of care: a systematic review.

    PubMed

    Ojha, Heidi A; Snyder, Rachel S; Davenport, Todd E

    2014-01-01

    Evidence suggests that physical therapy through direct access may help decrease costs and improve patient outcomes compared with physical therapy by physician referral. The purpose of this study was to conduct a systematic review of the literature on patients with musculoskeletal injuries and compare health care costs and patient outcomes in episodes of physical therapy by direct access compared with referred physical therapy. Ovid MEDLINE, CINAHL (EBSCO), Web of Science, and PEDro were searched using terms related to physical therapy and direct access. Included articles were hand-searched for additional references. Included studies compared data from physical therapy by direct access with physical therapy by physician referral, studying cost, outcomes, or harm. The studies were appraised using the Centre for Evidence-Based Medicine (CEBM) levels of evidence criteria and assigned a methodological score. Of the 1,501 articles that were screened, 8 articles at levels 3 to 4 on the CEBM scale were included. There were statistically significant and clinically meaningful findings across studies that satisfaction and outcomes were superior, and that numbers of physical therapy visits, imaging ordered, medications prescribed, and additional non-physical therapy appointments were lower, in cohorts receiving physical therapy by direct access compared with referred episodes of care. There was no evidence for harm. There is evidence across level 3 and 4 studies (grade B to C CEBM level of recommendation) that physical therapy by direct access compared with referred episodes of care is associated with improved patient outcomes and decreased costs. Primary limitations were lack of group randomization, potential for selection bias, and limited generalizability. Physical therapy by way of direct access may contain health care costs and promote high-quality health care. Third-party payers should consider paying for physical therapy by direct access to decrease health care costs and incentivize optimal patient outcomes.

  1. [Efficacy and tolerability of cisapride in a new formula of 10 mg effervescent capsules for the treatment of functional dyspepsia].

    PubMed

    Grossi, L; Di Felice, F; Marzio, L

    1993-06-01

    The efficacy and tolerability of Cisapride effervescent granules and a metoclopramide-dimethicone combination were compared double-blind in two comparable groups of 15 patients each with dyspepsia. All patients received three sachets daily of either drug for 6 consecutive weeks. In terms of efficacy, Cisapride effervescent granules produced a statistically significant reduction in 85% (11/13) of symptoms, as against 42% (5/12) in the reference group. Statistical analysis showed Cisapride effervescent granules to be more effective than the reference drug for 6 out of 11 evaluable symptoms. Mean global improvement was 86% for Cisapride effervescent granules vs 41% for the reference combination. Final judgment by the physician was more favorable for Cisapride effervescent granules than for the reference drug (p < 0.0001). Treatment withdrawal was never necessary and no significant changes in laboratory values were observed. No statistically significant difference in tolerability was observed between the two treatments. In conclusion, Cisapride effervescent granules were found to have a better risk/benefit ratio than the reference combination.

  2. Conducted-Susceptibility Testing as an Alternative Approach to Unit-Level Radiated-Susceptibility Verifications

    NASA Astrophysics Data System (ADS)

    Badini, L.; Grassi, F.; Pignari, S. A.; Spadacini, G.; Bisognin, P.; Pelissou, P.; Marra, S.

    2016-05-01

    This work presents a theoretical rationale for the substitution of radiated-susceptibility (RS) verifications defined in current aerospace standards with an equivalent conducted-susceptibility (CS) test procedure based on bulk current injection (BCI) up to 500 MHz. Statistics is used to overcome the lack of knowledge about uncontrolled or uncertain setup parameters, with particular reference to the common-mode impedance of equipment. The BCI test level is properly investigated so as to ensure correlation of the currents injected in the equipment under test via CS and RS. In particular, an over-testing probability quantifies the severity of the BCI test with respect to the RS test.

  3. Identification and evaluation of reference genes for qRT-PCR normalization in Ganoderma lucidum.

    PubMed

    Xu, Jiang; Xu, ZhiChao; Zhu, YingJie; Luo, HongMei; Qian, Jun; Ji, AiJia; Hu, YuanLei; Sun, Wei; Wang, Bo; Song, JingYuan; Sun, Chao; Chen, ShiLin

    2014-01-01

    Quantitative real-time reverse transcription PCR (qRT-PCR) is a rapid, sensitive, and reliable technique for gene expression studies. The accuracy and reliability of qRT-PCR results depend on the stability of the reference genes used for gene normalization. Therefore, a systematic process of reference gene evaluation is needed. Ganoderma lucidum is a famous medicinal mushroom in East Asia. In the current study, 10 potential reference genes were selected from the G. lucidum genomic data. The sequences of these genes were manually curated, and primers were designed following strict criteria. The experiment was conducted using qRT-PCR, and the stability of each candidate gene was assessed using four commonly used statistical programs: geNorm, NormFinder, BestKeeper, and RefFinder. According to our results, PP2A was expressed at the most stable levels under different fermentation conditions, and RPL4 was the most stably expressed gene in different tissues. RPL4, PP2A, and β-tubulin are the most commonly recommended reference genes for normalizing gene expression in the entire sample set. The current study provides a foundation for the further use of qRT-PCR in G. lucidum gene analysis.

  4. Statistical Literacy as a Function of Online versus Hybrid Course Delivery Format for an Introductory Graduate Statistics Course

    ERIC Educational Resources Information Center

    Hahs-Vaughn, Debbie L.; Acquaye, Hannah; Griffith, Matthew D.; Jo, Hang; Matthews, Ken; Acharya, Parul

    2017-01-01

    Statistical literacy refers to understanding fundamental statistical concepts. Assessment of statistical literacy can take the forms of tasks that require students to identify, translate, compute, read, and interpret data. In addition, statistical instruction can take many forms encompassing course delivery format such as face-to-face, hybrid,…

  5. Non-linearity of geocentre motion and its impact on the origin of the terrestrial reference frame

    NASA Astrophysics Data System (ADS)

    Dong, Danan; Qu, Weijing; Fang, Peng; Peng, Dongju

    2014-08-01

    The terrestrial reference frame is a cornerstone for modern geodesy and its applications for a wide range of Earth sciences. The underlying assumption for establishing a terrestrial reference frame is that the motion of the solid Earth's figure centre relative to the mass centre of the Earth system on a multidecadal timescale is linear. However, past international terrestrial reference frames (ITRFs) showed unexpected accelerated motion in their translation parameters. Based on this underlying assumption, the inconsistency of relative origin motions of the ITRFs has been attributed to data reduction imperfection. We investigated the impact of surface mass loading from atmosphere, ocean, snow, soil moisture, ice sheet, glacier and sea level from 1983 to 2008 on the geocentre variations. The resultant geocentre time-series display notable trend acceleration from 1998 onward, in particular in the z-component. This effect is primarily driven by the hydrological mass redistribution in the continents (soil moisture, snow, ice sheet and glacier). The acceleration is statistically significant at the 99 per cent confidence level as determined using the Mann-Kendall test, and it is highly correlated with the satellite laser ranging determined translation series. Our study, based on independent geophysical and hydrological models, demonstrates that, in addition to systematic errors from analysis procedures, the observed non-linearity of the Earth-system behaviour at interannual timescales is physically driven and is able to explain 42 per cent of the disparity between the origins of ITRF2000 and ITRF2005, as well as the high level of consistency between the ITRF2005 and ITRF2008 origins.
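
    Since the abstract leans on the Mann-Kendall test to establish the significance of the trend change, here is a minimal no-ties implementation of that test applied to a synthetic monthly series; the series itself is illustrative only.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, Z score and two-sided p-value
    (no-ties form, normal approximation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # assumes no tied values
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

# Hypothetical monthly geocentre z-component series (1983-2008) with an
# additional trend switched on after month 180
rng = np.random.default_rng(2)
t = np.arange(312)
series = 0.5 * rng.standard_normal(312) + np.where(t > 180, 0.01 * (t - 180), 0.0)
print(mann_kendall(series))
```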

  6. Directional genetic selection by pulp mill effluent on multiple natural populations of three-spined stickleback (Gasterosteus aculeatus).

    PubMed

    Lind, Emma E; Grahn, Mats

    2011-05-01

    Contamination can cause a rapid environmental change which may require populations to respond with evolutionary changes. To evaluate the effects of pulp mill effluents on population genetics, we sampled three-spined sticklebacks (Gasterosteus aculeatus) near four pulp mills and four adjacent reference sites and analyzed Amplified Fragment Length Polymorphism (AFLP) to compare genetic variability. A fine-scale genetic structure was detected and samples from polluted sites separated from reference sites in multidimensional scaling plots (P<0.005, 1000 permutations), and locus-by-locus Analysis of Molecular Variance (AMOVA) further confirmed that habitats are significantly separated (F(ST)=0.021, P<0.01, 1023 permutations). The amount of genetic variation between populations did not differ between habitats, and populations from both habitats had similar levels of heterozygosity (polluted sites Nei's Hs=0.11, reference sites Nei's Hs=0.11). Still, pairwise F(ST) values between three out of four pairs of polluted-reference sites were significant. An F(ST)-outlier analysis showed that 21 (8.4%) loci were statistically different from a neutral distribution at the P<0.05 level and were therefore indicated to be under divergent selection. When 13 F(ST)-outlier loci significant at the P<0.01 level were removed, the differentiation between habitats disappeared in a multidimensional scaling plot. In conclusion, pulp mill effluent has acted as a selective agent on natural populations of G. aculeatus, causing convergent changes in genotype composition at multiple sites in an open environment. © The Author(s) 2011. This article is published with open access at Springerlink.com

  7. Using the bootstrap to establish statistical significance for relative validity comparisons among patient-reported outcome measures

    PubMed Central

    2013-01-01

    Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
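
    A minimal sketch of the procedure described above: the relative validity is the ratio of one-way ANOVA F statistics for a comparator and a reference measure across the same clinically defined groups, and its 95% confidence interval is obtained by resampling patients with replacement. The group structure and measures below are simulated, not the CKD data.

```python
import numpy as np
from scipy.stats import f_oneway

def rv(groups_comparator, groups_reference):
    """Relative validity: F(comparator) / F(reference) over the same groups."""
    f_comp, _ = f_oneway(*groups_comparator)
    f_ref, _ = f_oneway(*groups_reference)
    return f_comp / f_ref

def bootstrap_rv_ci(comparator, reference, group_labels, n_boot=1000, seed=0):
    """95% bootstrap confidence interval for the RV, resampling patients."""
    rng = np.random.default_rng(seed)
    comparator, reference = np.asarray(comparator), np.asarray(reference)
    labels = np.asarray(group_labels)
    n, estimates = len(labels), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)              # resample patients
        c, r, g = comparator[idx], reference[idx], labels[idx]
        groups_c = [c[g == lvl] for lvl in np.unique(g)]
        groups_r = [r[g == lvl] for lvl in np.unique(g)]
        estimates.append(rv(groups_c, groups_r))
    return np.percentile(estimates, [2.5, 97.5])

# Hypothetical data: 3 severity groups; the reference measure discriminates
# more strongly than the comparator
rng = np.random.default_rng(4)
g = np.repeat([0, 1, 2], 150)
reference  = rng.normal(g * 1.0, 1.0)
comparator = rng.normal(g * 0.7, 1.0)
groups_ref  = [reference[g == lvl] for lvl in (0, 1, 2)]
groups_comp = [comparator[g == lvl] for lvl in (0, 1, 2)]
print(rv(groups_comp, groups_ref))
print(bootstrap_rv_ci(comparator, reference, g))
```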

  8. Turbulent/non-turbulent interfaces detected in DNS of incompressible turbulent boundary layers

    NASA Astrophysics Data System (ADS)

    Watanabe, T.; Zhang, X.; Nagata, K.

    2018-03-01

    The turbulent/non-turbulent interface (TNTI) detected in direct numerical simulations is studied for incompressible, temporally developing turbulent boundary layers at momentum thickness Reynolds number Reθ ≈ 2000. The outer edge of the TNTI layer is detected as an isosurface of the vorticity magnitude, with the threshold determined from the dependence of the turbulent volume on the threshold level. The spanwise vorticity magnitude and a passive scalar are shown to be good markers of turbulent fluids, for which the conditional statistics as a function of distance from the outer edge of the TNTI layer are almost identical to the ones obtained with the vorticity magnitude. Significant differences are observed in the conditional statistics between the TNTI detected by the kinetic energy and by the vorticity magnitude. A widely used grid setting determined solely from the wall unit results in insufficient resolution in the streamwise direction in the outer region, and this influence is evident in the geometry of the TNTI and the vorticity jump across the TNTI layer. The present results suggest that the grid spacing should be similar for the streamwise and spanwise directions. Comparison of the TNTI layer among different flows requires appropriate normalization of the conditional statistics. Reference quantities of the turbulence near the TNTI layer are obtained with the average over turbulent fluids in the intermittent region. The conditional statistics normalized by the reference turbulence characteristics show good quantitative agreement for the turbulent boundary layer and a planar jet when they are plotted against the distance from the outer edge of the TNTI layer divided by the Kolmogorov scale defined for turbulent fluids in the intermittent region.

  9. High Accuracy Verification of a Correlated-Photon-Based Method for Determining Photon-Counting Detection Efficiency

    DTIC Science & Technology

    2007-01-01

    OCIS codes: Metrology; (270.5290) Photon statistics.

  10. S.P.S.S. User's Manual #1-#4. Basic Program Construction in S.P.S.S.; S.P.S.S. Non-Procedural Statements and Procedural Commands; System Control Language and S.P.S.S.; Quick File Equate Statement Reference.

    ERIC Educational Resources Information Center

    Earl, Lorna L.

    This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…

  11. Cancer Localization in the Prostate with F-18 Fluorocholine Positron Emission Tomography

    DTIC Science & Technology

    2009-01-15

    2008 Addendum to Final Report (PI: Kwee, Sandi A.). Publications in addition to those listed in the 2006 and 2007 reports: Kwee SA, Thibault G, Stack R, Coel M, Furusato B, Sesterhenn I. Use of ... ; Kwee SA, Degrado T. Prostate biopsy guided by 18F-fluorocholine PET in men with persistently elevated PSA levels. Eur J Nucl Med Mol Imaging. 2008.

  12. Corps Level Operational Art in Vietnam: A Study of II Field Force Commanders during Major Named Operations

    DTIC Science & Technology

    2013-05-23

    ... South Vietnam and the bombing campaign falling short of its desired effects, options generated by the administration's Whiz Kids or civilian ... The term Whiz Kid refers to young ... the U.S. Department of Defense by implementing a statistics-based management approach. Whiz Kid originated from a group of former World War II U.S. Air ... (The Washington Post, 7 July 2009, http://articles.washingtonpost.com, accessed 5 October 2012.)

  13. Rich in resources/deficient in dollars! Which titles do reference departments really need?

    PubMed

    Fishman, D L; DelBaglivo, M

    1998-10-01

    Budget pressures, combined with the growing availability of resources, dictate careful examination of reference use. Two studies were conducted at the University of Maryland Health Sciences Library to examine this issue. A twelve-month reshelving study determined use by title and discipline; a simultaneous study analyzed print abstract and index use in an electronic environment. Staff electronically recorded statistics for unshelved reference books, coded the collection by discipline, and tracked use by school. Oral surveys administered to reference room abstract and index users focused on title usage, user demographics, and stated reason for use. Sixty-five and a half percent of reference collection titles were used. Medical titles received the most use, but, in the context of collection size, dentistry and nursing titles used the greatest percentage of their collections. At an individual title level, medical textbooks and drug handbooks were most used. Users of abstracts and indexes were primarily campus nursing and medical students who preferred print resources. The monograph data will guide reference expenditures in canceling little-used standing orders, expanding most-used portions of the collection, and analyzing underused sections. The abstract and index survey identified the following needs: targeting instruction, contacting faculty who assign print resources, increasing the number of computer workstations, and installing signs linking databases to print equivalents.

  14. From the connectome to the synaptome: an epic love story.

    PubMed

    DeFelipe, Javier

    2010-11-26

    A major challenge in neuroscience is to decipher the structural layout of the brain. The term "connectome" has recently been proposed to refer to the highly organized connection matrix of the human brain. However, defining how information flows through such a complex system represents so difficult a task that it seems unlikely it could be achieved in the near future or, for the most pessimistic, perhaps ever. Circuit diagrams of the nervous system can be considered at different levels, although they are surely impossible to complete at the synaptic level. Nevertheless, advances in our capacity to marry macro- and microscopic data may help establish a realistic statistical model that could describe connectivity at the ultrastructural level, the "synaptome," giving us cause for optimism.

  15. Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses

    PubMed Central

    Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah

    2015-01-01

    Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481

  16. 2013 Annual Disability Statistics Compendium

    ERIC Educational Resources Information Center

    Houtenville, Andrew J.

    2013-01-01

    The "Annual Disability Statistics Compendium" is a publication of statistics about people with disabilities and the government programs which serve them. It is modeled after the U.S. Department of Commerce's annual "Statistical Abstracts of the United States." The "Compendium" is designed to serve as a reference guide…

  17. Statistical Abstract of the United States: 2012. 131st Edition

    ERIC Educational Resources Information Center

    US Census Bureau, 2011

    2011-01-01

    "The Statistical Abstract of the United States," published from 1878 to 2012, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States. It is designed to serve as a convenient volume for statistical reference, and as a guide to other statistical publications and…

  18. Statistics about Hearing, Balance, Ear Infections and Deafness

    MedlinePlus

    Quick statistics and charts on hearing, balance, ear infections, and deafness, including an epidemiological perspective on hearing ("What the Numbers Mean") and references on hearing epidemiology.

  19. A UNIFIED FRAMEWORK FOR VARIANCE COMPONENT ESTIMATION WITH SUMMARY STATISTICS IN GENOME-WIDE ASSOCIATION STUDIES.

    PubMed

    Zhou, Xiang

    2017-12-01

    Linear mixed models (LMMs) are among the most commonly used tools for genetic association studies. However, the standard method for estimating variance components in LMMs-the restricted maximum likelihood estimation method (REML)-suffers from several important drawbacks: REML requires individual-level genotypes and phenotypes from all samples in the study, is computationally slow, and produces downward-biased estimates in case control studies. To remedy these drawbacks, we present an alternative framework for variance component estimation, which we refer to as MQS. MQS is based on the method of moments (MoM) and the minimum norm quadratic unbiased estimation (MINQUE) criterion, and brings two seemingly unrelated methods-the renowned Haseman-Elston (HE) regression and the recent LD score regression (LDSC)-into the same unified statistical framework. With this new framework, we provide an alternative but mathematically equivalent form of HE that allows for the use of summary statistics. We provide an exact estimation form of LDSC to yield unbiased and statistically more efficient estimates. A key feature of our method is its ability to pair marginal z-scores computed using all samples with SNP correlation information computed using a small random subset of individuals (or individuals from a proper reference panel), while capable of producing estimates that can be almost as accurate as if both quantities were computed using the full data. As a result, our method produces unbiased and statistically efficient estimates, and makes use of summary statistics, while it is computationally efficient for large data sets. Using simulations and applications to 37 phenotypes from 8 real data sets, we illustrate the benefits of our method for estimating and partitioning SNP heritability in population studies as well as for heritability estimation in family studies. Our method is implemented in the GEMMA software package, freely available at www.xzlab.org/software.html.
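
    As a rough illustration of the summary-statistics idea behind one of the two methods the paper unifies, the toy Python sketch below fits the LD score regression relationship E[chi²_j] ≈ 1 + (N·h²/M)·l_j to simulated GWAS chi-square statistics; it is not the MQS estimator or the GEMMA implementation, and all quantities are simulated:

        import numpy as np

        def ldsc_h2(chi2, ld_scores, n_samples, n_snps):
            # Regress chi-square statistics on LD scores; the slope estimates N*h2/M.
            slope, intercept = np.polyfit(ld_scores, chi2, 1)
            return slope * n_snps / n_samples, intercept

        rng = np.random.default_rng(1)
        M, N, h2 = 10_000, 50_000, 0.4
        ld = rng.uniform(1.0, 200.0, size=M)                       # toy LD scores
        chi2 = rng.chisquare(1, size=M) * (1.0 + N * h2 / M * ld)  # matches the LDSC expectation
        h2_hat, icept = ldsc_h2(chi2, ld, N, M)
        print(f"true h2 = {h2}, estimated h2 = {h2_hat:.3f}")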

  20. Bioconductor Workflow for Microbiome Data Analysis: from raw reads to community analyses

    PubMed Central

    Callahan, Ben J.; Sankaran, Kris; Fukuyama, Julia A.; McMurdie, Paul J.; Holmes, Susan P.

    2016-01-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or OTU composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, whether parametric or nonparametric. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests and nonparametric testing using community networks and the ggnetwork package. PMID:27508062

  1. A supervised learning approach for Crohn's disease detection using higher-order image statistics and a novel shape asymmetry measure.

    PubMed

    Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M

    2013-10-01

    Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive while magnetic resonance imaging (MRI) has emerged as the preferred noninvasive procedure over colonoscopy. Current MRI approaches assess rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.
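
    A minimal Python sketch of the kind of low-level and higher-order statistical features the abstract describes (intensity mean and spread plus skewness and kurtosis per image patch); the patch size and the random stand-in image are illustrative only, not the authors' pipeline:

        import numpy as np
        from scipy.stats import skew, kurtosis

        def patch_features(patch):
            # Intensity and higher-order statistics for one image patch.
            v = patch.ravel().astype(float)
            return np.array([v.mean(), v.std(), skew(v), kurtosis(v)])

        def feature_map(image, size=16):
            # Non-overlapping sliding window; one 4-feature row per patch.
            rows = []
            for i in range(0, image.shape[0] - size + 1, size):
                for j in range(0, image.shape[1] - size + 1, size):
                    rows.append(patch_features(image[i:i + size, j:j + size]))
            return np.vstack(rows)

        rng = np.random.default_rng(2)
        mri_slice = rng.normal(100.0, 15.0, size=(128, 128))  # stand-in for an MR slice
        print(feature_map(mri_slice).shape)  # (64, 4)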

  2. Computer Aided Reference Services in the Academic Library: Experiences in Organizing and Operating an Online Reference Service.

    ERIC Educational Resources Information Center

    Hoover, Ryan E.

    1979-01-01

    Summarizes the development of the Computer-Aided Reference Services (CARS) division of the University of Utah Libraries' reference department. Development, organizational structure, site selection, equipment, management, staffing and training considerations, promotion and marketing, budget and pricing, record keeping, statistics, and evaluation…

  3. Implementation and Use of the Reference Analytics Module of LibAnswers

    ERIC Educational Resources Information Center

    Flatley, Robert; Jensen, Robert Bruce

    2012-01-01

    Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…

  4. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
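
    As a small illustration of the "designs with the true value" category mentioned above, the Python sketch below computes bias and a repeatability coefficient for repeated algorithm measurements of a phantom with a known value; the numbers and the 1.96·√2·SD convention are illustrative assumptions, not the paper's prescribed estimators:

        import numpy as np

        def bias_and_repeatability(measurements, true_value):
            # Bias (mean error vs. known truth) and repeatability coefficient
            # (1.96 * sqrt(2) * SD of repeated measurements).
            m = np.asarray(measurements, dtype=float)
            return m.mean() - true_value, 1.96 * np.sqrt(2.0) * m.std(ddof=1)

        phantom_volume_ml = 10.0                        # known phantom value
        algorithm_runs = [10.4, 9.8, 10.6, 10.1, 10.3]  # repeated QIB measurements
        bias, rc = bias_and_repeatability(algorithm_runs, phantom_volume_ml)
        print(f"bias = {bias:+.2f} mL, repeatability coefficient = {rc:.2f} mL")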

  5. Comparative Evaluation of C-reactive Proteins in Pregnant Women with and without Periodontal Pathologies: A Prospective Cohort Analysis.

    PubMed

    Mannava, Padmakanth; Gokhale, Sunil; Pujari, Sudarshan; Biswas, Krishna P; Kaliappan, Satish; Vijapure, Shashank

    2016-06-01

    Inflammation of the tooth-supporting structures is referred to as periodontitis. C-reactive protein (CRP) levels are usually increased in chronic inflammatory processes such as periodontitis. Associations of CRP with adverse pregnancy outcomes, most commonly preterm delivery and preeclampsia, have been observed in the past. Therefore, it can be hypothesized that CRP may act as a link between periodontitis and adverse pregnancy outcomes. Hence, we aim to evaluate the plasma CRP levels in pregnant women with and without periodontal pathologies. The study included 210 pregnant women who reported to the hospital with periodontal problems and for routine checkups. All the patients were divided into three groups based on the presence and absence of periodontal pathologies. Russell's Periodontal Index Score was used for the evaluation of the periodontal status of the subjects. While comparing the mean CRP levels in all three study groups, statistically significant results were obtained. Statistically significant results were also obtained while comparing the mean CRP levels in group C patients before and after treatment. The CRP levels were estimated from blood samples. Paired t-test and one-way analysis of variance were used to assess the correlation between the two parameters. A causal association might exist between CRP levels and periodontal diseases in pregnant women, and CRP levels may also be elevated in pregnant women.

  6. Radio-Optical Reference Frame Link Using the U.S. Naval Observatory Astrograph and Deep CCD Imaging

    NASA Astrophysics Data System (ADS)

    Zacharias, N.; Zacharias, M. I.

    2014-05-01

    Between 1997 and 2004 several observing runs were conducted, mainly with the CTIO 0.9 m, to image International Celestial Reference Frame (ICRF) counterparts (mostly QSOs) in order to determine accurate optical positions. Contemporary to these deep CCD images, the same fields were observed with the U.S. Naval Observatory astrograph in the same bandpass. They provide accurate positions on the Hipparcos/Tycho-2 system for stars in the 10-16 mag range used as reference stars for the deep CCD imaging data. Here we present final optical position results of 413 sources based on reference stars obtained by dedicated astrograph observations that were reduced following two different procedures. These optical positions are compared to radio very long baseline interferometry positions. The current optical system is not perfectly aligned to the ICRF radio system with rigid body rotation angles of 3-5 mas (= 3σ level) found between them for all three axes. Furthermore, statistically, the optical-radio position differences are found to exceed the total, combined, known errors in the observations. Systematic errors in the optical reference star positions and physical offsets between the centers of optical and radio emissions are both identified as likely causes. A detrimental, astrophysical, random noise component is postulated to be on about the 10 mas level. If confirmed by future observations, this could severely limit the Gaia to ICRF reference frame alignment accuracy to an error of about 0.5 mas per coordinate axis with the current number of sources envisioned to provide the link. A list of 36 ICRF sources without the detection of an optical counterpart to a limiting magnitude of about R = 22 is provided as well.

  7. Radio-optical reference frame link using the U.S. Naval observatory astrograph and deep CCD imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharias, N.; Zacharias, M. I., E-mail: nz@usno.navy.mil

    2014-05-01

    Between 1997 and 2004 several observing runs were conducted, mainly with the CTIO 0.9 m, to image International Celestial Reference Frame (ICRF) counterparts (mostly QSOs) in order to determine accurate optical positions. Contemporary to these deep CCD images, the same fields were observed with the U.S. Naval Observatory astrograph in the same bandpass. They provide accurate positions on the Hipparcos/Tycho-2 system for stars in the 10-16 mag range used as reference stars for the deep CCD imaging data. Here we present final optical position results of 413 sources based on reference stars obtained by dedicated astrograph observations that were reduced following two different procedures. These optical positions are compared to radio very long baseline interferometry positions. The current optical system is not perfectly aligned to the ICRF radio system with rigid body rotation angles of 3-5 mas (= 3σ level) found between them for all three axes. Furthermore, statistically, the optical-radio position differences are found to exceed the total, combined, known errors in the observations. Systematic errors in the optical reference star positions and physical offsets between the centers of optical and radio emissions are both identified as likely causes. A detrimental, astrophysical, random noise component is postulated to be on about the 10 mas level. If confirmed by future observations, this could severely limit the Gaia to ICRF reference frame alignment accuracy to an error of about 0.5 mas per coordinate axis with the current number of sources envisioned to provide the link. A list of 36 ICRF sources without the detection of an optical counterpart to a limiting magnitude of about R = 22 is provided as well.

  8. Mutation Screening of the Krüppel-like Factor 1 Gene in Individuals With Increased Fetal Hemoglobin Referred for Hemoglobinopathy Investigation in South of Iran.

    PubMed

    Hamid, Mohammad; Ershadi Oskouei, Sanaz; Shariati, Gholamreza; Babaei, Esmaeil; Galehdari, Hamid; Saberi, Alihossein; Sedaghat, Alireza

    2018-04-01

    A mutation in the Krüppel-like factor 1 (KLF1) gene may interfere with its function in erythropoiesis and alter the proper activation of its downstream protein in globin switching, resulting in an increase in fetal hemoglobin (HbF). This study investigated whether KLF1 mutations are associated with high HbF levels in individuals with increased fetal hemoglobin referred for hemoglobinopathy screening in the south of Iran. The human KLF1 gene was amplified by polymerase chain reaction (PCR), and sequencing was used to identify mutations in these patients. In addition, the XmnI polymorphism at position -158 of the γ-globin gene promoter was analyzed in all patients by PCR restriction fragment length polymorphism. Sequencing revealed a missense mutation in the KLF1 gene, p.Ser102Pro (c.304T>C), detectable in 10 of 23 cases with an elevated HbF level. This mutation was detected only in individuals with an HbF level between 3.1% and 25.6%. Statistical analysis showed that the frequency of the C allele was significantly correlated with a high HbF level (P<0.05). The frequency of the positive XmnI allele in individuals with an increased HbF level was also significant, showing an association with increased HbF (P<0.05). To the best of our knowledge, this is the first report of p.Ser102Pro (c.304T>C) in the KLF1 gene in β-thalassemia patients with an increased level of fetal hemoglobin. The statistical results for the p.Ser102Pro mutation and the XmnI polymorphism strongly suggest that both are associated with increased HbF. These nucleotide changes alone may not be the only elements raising the HbF level; other regulatory and modifying factors also play a role in HbF production.

  9. Effects of pre and postnatal exposure to low levels of polybromodiphenyl ethers on neurodevelopment and thyroid hormone levels at 4 years of age.

    PubMed

    Gascon, Mireia; Vrijheid, Martine; Martínez, David; Forns, Joan; Grimalt, Joan O; Torrent, Maties; Sunyer, Jordi

    2011-04-01

    There are at present very few studies of the effects of polybromodiphenyl ethers (PBDEs), used as flame retardants in consumer products, on neurodevelopment or thyroid hormone levels in humans. The present study aims to examine the association between pre and postnatal PBDE concentrations and neurodevelopment and thyroid hormone levels in children at age 4 years and isolate the effects of PBDEs from those of PCBs, DDT, DDE and HCB. A prospective birth cohort in Menorca (Spain) enrolled 482 pregnant mothers between 1997 and 1998. At 4 years, children were assessed for motor and cognitive function (McCarthy Scales of Children's Abilities), attention-deficit, hyperactivity and impulsivity (ADHD-DSM-IV) and social competence (California Preschool Social Competence Scale). PBDE concentrations were measured in cord blood (N=88) and in serum of 4-year-olds (N=244). Among all congeners analyzed only PBDE 47 was quantified in a reasonable number of samples (LOQ=0.002 ng/ml). Exposure to PBDE 47 was analyzed as a dichotomous variable: concentrations above the LOQ (exposed) and concentrations below (referents). Scores for cognitive and motor functions were always lower in children pre and postnatally exposed to PBDE 47 than in referents, but none of these associations was statistically significant (β coefficient (95%CI) of the total cognition score: -2.7 (-7.0, 1.6) for postnatal exposure, and -1.4 (-9.2, 6.5) for prenatal exposure). Postnatal exposure to PBDE 47 was statistically significantly related to an increased risk of symptoms on the attention deficit subscale of ADHD symptoms (RR (95%CI)=1.8 (1.0, 3.2)) but not to hyperactivity symptoms. A statistically significant higher risk of poor social competence symptoms was observed as a consequence of postnatal PBDE 47 exposure (RR (95%CI)=2.6 (1.2, 5.9)). Adjustment for other organochlorine compounds did not influence the results. Levels of thyroid hormones were not associated with PBDE exposure. This study highlights the importance of assessing the effects of PBDE exposure not just prenatally but also during the early years of life. In the light of current evidence a precautionary approach towards PBDE exposure of both mothers and children seems warranted. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, currently the U.S. Food and Drug Administration (FDA) recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk and approaches of varying statistical rigor are subsequently used for the three-tier quality attributes. Key to the analyses of Tiers 1 and 2 quality attributes is the establishment of the equivalence acceptance criterion and quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σR, where σR is the reference product variability estimated by the sample standard deviation SR from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄R ± K × σR, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of the test product must fall in the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation SR underestimates the true reference product variability σR. As a result, substituting SR for σR in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed. A biosimilar is a generic version of the original biological drug product. A key component of biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on the application of statistical methods to establish a similarity margin and an appropriate test for equivalence between the two products. This paper discusses statistical issues with the demonstration of analytical similarity and provides alternative approaches to potentially mitigate these problems. © PDA, Inc. 2016.
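
    A minimal Python sketch of the two assessments described above: a Tier 1 two one-sided equivalence test against a margin of 1.5·SR, and a Tier 2 check that a large fraction of test lots falls inside the quality range X̄R ± K·SR. The pooled degrees of freedom, K = 3, the 90% criterion and the lot values are illustrative assumptions, and the sketch deliberately inherits the issue the paper analyzes, namely that SR stands in for σR:

        import numpy as np
        from scipy import stats

        def tier1_equivalence(test_lots, ref_lots, alpha=0.05):
            # TOST of the mean difference against +/- 1.5 * S_R (sample SD of reference lots).
            t = np.asarray(test_lots, float)
            r = np.asarray(ref_lots, float)
            margin = 1.5 * r.std(ddof=1)
            diff = t.mean() - r.mean()
            se = np.sqrt(t.var(ddof=1) / len(t) + r.var(ddof=1) / len(r))
            df = len(t) + len(r) - 2
            p_lo = 1.0 - stats.t.cdf((diff + margin) / se, df)  # H0: diff <= -margin
            p_hi = stats.t.cdf((diff - margin) / se, df)        # H0: diff >= +margin
            return max(p_lo, p_hi) < alpha

        def tier2_quality_range(test_lots, ref_lots, k=3.0, required_fraction=0.90):
            # Pass if at least 90% of test lots fall inside mean_R +/- K * S_R.
            t = np.asarray(test_lots, float)
            r = np.asarray(ref_lots, float)
            lo, hi = r.mean() - k * r.std(ddof=1), r.mean() + k * r.std(ddof=1)
            return np.mean((t >= lo) & (t <= hi)) >= required_fraction

        ref = [100.2, 101.5, 99.8, 100.9, 101.1, 100.4, 99.5, 100.7]  # reference lot values
        test = [100.6, 101.0, 100.1, 100.8, 99.9, 100.3]              # proposed-product lots
        print(tier1_equivalence(test, ref), tier2_quality_range(test, ref))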

  11. Assessment of impact of urbanisation on background radiation exposure and human health risk estimation in Kuala Lumpur, Malaysia.

    PubMed

    Sanusi, M S M; Ramli, A T; Hassan, W M S W; Lee, M H; Izham, A; Said, M N; Wagiran, H; Heryanshah, A

    2017-07-01

    Kuala Lumpur has been undergoing a rapid urbanisation process, mainly in infrastructure development. The opening of new townships and residential areas in former tin mining areas, particularly on the heavy mineral- or tin-bearing alluvial soil in Kuala Lumpur, is a contentious subject in land-use regulation. Construction practices, i.e. reclamation and dredging, in these areas have the potential to enhance the radioactivity levels of soil and, subsequently, increase the existing background gamma radiation levels. This situation is worsened by the utilisation of tin tailings as construction materials, in addition to unavoidable soil pollution due to naturally occurring radioactive materials in construction materials, e.g. granitic aggregate, cement and red clay brick. This study was conducted to assess the impacts of urbanisation on background gamma radiation in Kuala Lumpur. The study found that the mean measured dose rate was 251 ± 6 nGy/h (156-392 nGy/h), about 4 times higher than the world average value. High radioactivity levels of 238U (95 ± 12 Bq/kg), 232Th (191 ± 23 Bq/kg) and 40K (727 ± 130 Bq/kg) in soil were identified as the major source of the high radiation exposure. Based on statistical ANOVA, t-tests, and analyses of the cumulative probability distribution, this study has statistically verified the dose enhancements in the background radiation. The effective dose was estimated to be 0.31 ± 0.01 mSv/y per person. The recommended ICRP reference level (1-20 mSv/y) is applicable to the existing exposure situation involved in this study. The estimated effective dose in this study is lower than the ICRP reference level and too low to cause deterministic radiation effects. Nevertheless, based on estimations of lifetime radiation exposure risks, this study found that there was a small probability of an individual in Kuala Lumpur being diagnosed with cancer and dying of cancer. Copyright © 2017 Elsevier Ltd. All rights reserved.
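
    The reported annual effective dose of about 0.31 mSv is consistent with a standard UNSCEAR-style conversion from the mean absorbed dose rate in air; in the short Python check below, the occupancy factor (0.2) and the dose conversion coefficient (0.7 Sv/Gy) are commonly used assumed values for outdoor exposure, not figures stated in the abstract:

        # Assumed UNSCEAR-style conversion; the two factors below are not given in the abstract.
        dose_rate_ngy_per_h = 251      # mean measured absorbed dose rate in air
        hours_per_year = 8760
        outdoor_occupancy = 0.2        # assumed fraction of time spent outdoors
        sv_per_gy = 0.7                # assumed effective-dose conversion coefficient

        effective_dose_msv = dose_rate_ngy_per_h * hours_per_year * outdoor_occupancy * sv_per_gy * 1e-6
        print(f"annual effective dose ≈ {effective_dose_msv:.2f} mSv")  # ≈ 0.31 mSv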

  12. Constructing a Criterion Reference Test to Measure the Research and Statistical Competencies of Graduate Students at the Jordanian Governmental Universities

    ERIC Educational Resources Information Center

    Al-Habashneh, Maher Hussein; Najjar, Nabil Juma

    2017-01-01

    This study aimed at constructing a criterion-referenced test to measure the research and statistical competencies of graduate students at the Jordanian governmental universities. In its first form the test consisted of (50) multiple-choice items; the test was then introduced to (5) arbitrators with competence in measurement and evaluation to…

  13. Statistical Analyses of Brain Surfaces Using Gaussian Random Fields on 2-D Manifolds

    PubMed Central

    Staib, Lawrence H.; Xu, Dongrong; Zhu, Hongtu; Peterson, Bradley S.

    2008-01-01

    Interest in the morphometric analysis of the brain and its subregions has recently intensified because growth or degeneration of the brain in health or illness affects not only the volume but also the shape of cortical and subcortical brain regions, and new image processing techniques permit detection of small and highly localized perturbations in shape or localized volume, with remarkable precision. An appropriate statistical representation of the shape of a brain region is essential, however, for detecting, localizing, and interpreting variability in its surface contour and for identifying differences in volume of the underlying tissue that produce that variability across individuals and groups of individuals. Our statistical representation of the shape of a brain region is defined by a reference region for that region and by a Gaussian random field (GRF) that is defined across the entire surface of the region. We first select a reference region from a set of segmented brain images of healthy individuals. The GRF is then estimated as the signed Euclidean distances between points on the surface of the reference region and the corresponding points on the corresponding region in images of brains that have been coregistered to the reference. Correspondences between points on these surfaces are defined through deformations of each region of a brain into the coordinate space of the reference region using the principles of fluid dynamics. The warped, coregistered region of each subject is then unwarped into its native space, simultaneously bringing into that space the map of corresponding points that was established when the surfaces of the subject and reference regions were tightly coregistered. The proposed statistical description of the shape of surface contours makes no assumptions, other than smoothness, about the shape of the region or its GRF. The description also allows for the detection and localization of statistically significant differences in the shapes of the surfaces across groups of subjects at both a fine and coarse scale. We demonstrate the effectiveness of these statistical methods by applying them to study differences in shape of the amygdala and hippocampus in a large sample of normal subjects and in subjects with attention deficit/hyperactivity disorder (ADHD). PMID:17243583

  14. Effect of non-surgical periodontal treatment on transferrin serum levels in patients with chronic periodontitis

    PubMed Central

    Shirmohamadi, Adileh; Chitsazi, Mohamad Taghi; Faramarzi, Masoumeh; Salari, Ashkan; Naser Alavi, Fereshteh; Pashazadeh, Nazila

    2016-01-01

    Background. Transferrin is a negative acute phase protein, which decreases during inflammation and infection. The aim of the present investigation was to evaluate changes in the transferrin serum levels subsequent to non-surgical treatment of chronic periodontal disease. Methods. Twenty patients with chronic periodontitis and 20 systemically healthy subjects without periodontal disease, who had been referred to Tabriz Faculty of Dentistry, were selected. Transferrin serum levels and clinical periodontal parameters (pocket depth, clinical attachment level, gingival index, bleeding index and plaque index) were measured at baseline and 3 months after non-surgical periodontal treatment. Data were analyzed with descriptive statistical methods (means ± standard deviations). Independent samples t-test was used to compare transferrin serum levels and clinical variables between the test and control groups. Paired samples t-test was used in the test group for comparisons before and after treatment. Statistical significance was set at P < 0.05. Results. The mean transferrin serum level in patients with chronic periodontitis (213.1 ± 9.2 mg/dL) was significantly less than that in periodontally healthy subjects (307.8 ± 11.7 mg/dL). Three months after periodontal treatment, the transferrin serum level increased significantly (298.3 ± 7.6 mg/dL) and approached the levels in periodontally healthy subjects (P < 0.05). Conclusion. The decrease and increase in transferrin serum levels with periodontal disease and periodontal treatment, respectively, indicated an inverse relationship between transferrin serum levels and chronic periodontitis. PMID:27651883

  15. Designing image segmentation studies: Statistical power, sample size and reference standard quality.

    PubMed

    Gibson, Eli; Hu, Yipeng; Huisman, Henkjan J; Barratt, Dean C

    2017-12-01

    Segmentation algorithms are typically evaluated by comparison to an accepted reference standard. The cost of generating accurate reference standards for medical image segmentation can be substantial. Since the study cost and the likelihood of detecting a clinically meaningful difference in accuracy both depend on the size and on the quality of the study reference standard, balancing these trade-offs supports the efficient use of research resources. In this work, we derive a statistical power calculation that enables researchers to estimate the appropriate sample size to detect clinically meaningful differences in segmentation accuracy (i.e. the proportion of voxels matching the reference standard) between two algorithms. Furthermore, we derive a formula to relate reference standard errors to their effect on the sample sizes of studies using lower-quality (but potentially more affordable and practically available) reference standards. The accuracy of the derived sample size formula was estimated through Monte Carlo simulation, demonstrating, with 95% confidence, a predicted statistical power within 4% of simulated values across a range of model parameters. This corresponds to sample size errors of less than 4 subjects and errors in the detectable accuracy difference less than 0.6%. The applicability of the formula to real-world data was assessed using bootstrap resampling simulations for pairs of algorithms from the PROMISE12 prostate MR segmentation challenge data set. The model predicted the simulated power for the majority of algorithm pairs within 4% for simulated experiments using a high-quality reference standard and within 6% for simulated experiments using a low-quality reference standard. A case study, also based on the PROMISE12 data, illustrates using the formulae to evaluate whether to use a lower-quality reference standard in a prostate segmentation study. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
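
    The derived formulae are not reproduced in the abstract, but the Python sketch below shows the generic form such a calculation takes for a paired comparison of two algorithms' per-subject accuracies; the effect size δ and the SD of paired differences are illustrative inputs, not the paper's model parameters:

        import numpy as np
        from scipy.stats import norm

        def paired_sample_size(delta, sd_diff, alpha=0.05, power=0.80):
            # Subjects needed to detect a mean accuracy difference `delta` between two
            # algorithms evaluated on the same cases (paired design, normal approximation).
            z_a = norm.ppf(1.0 - alpha / 2.0)
            z_b = norm.ppf(power)
            return int(np.ceil(((z_a + z_b) * sd_diff / delta) ** 2))

        # e.g. detect a 1% difference in voxel-overlap accuracy, SD of differences 3%
        print(paired_sample_size(delta=0.01, sd_diff=0.03))  # about 71 subjects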

  16. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods have been applied to prepare climate model data for various applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables which are main input data to regional modelling systems. However, inconsistencies in these climate products, for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation, minimum and maximum temperature over the province of Alberta. Based on the performance of climate products at AHCCD stations, we ranked the reliability of these publicly available climate products corresponding to the elevations of stations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of the grid cell. By constructing the best combination of reference data for the study domain, the accuracy and reliability of statistically downscaled climate projections could be significantly improved.
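
    A simplified Python sketch of the ranking step described above: each gridded product is scored against station observations within elevation classes. The column names, elevation bins and the RMSE criterion are illustrative assumptions, not the study's exact procedure:

        import numpy as np
        import pandas as pd

        def rank_products_by_elevation(df, elev_bins=(0, 500, 1000, 1500, 3000)):
            # df columns: station, elevation, obs, plus one column per gridded product.
            products = [c for c in df.columns if c not in ("station", "elevation", "obs")]
            df = df.assign(elev_class=pd.cut(df["elevation"], elev_bins))
            ranking = {}
            for cls, grp in df.groupby("elev_class", observed=True):
                rmse = {p: float(np.sqrt(np.mean((grp[p] - grp["obs"]) ** 2))) for p in products}
                ranking[str(cls)] = sorted(rmse, key=rmse.get)  # best (lowest RMSE) first
            return ranking

        demo = pd.DataFrame({"station": ["A", "B", "C"], "elevation": [300, 800, 1200],
                             "obs": [2.0, 5.0, 7.0], "ANUSPLIN": [2.2, 4.5, 8.0],
                             "NARR": [1.5, 5.5, 6.8]})
        print(rank_products_by_elevation(demo))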

  17. HAPRAP: a haplotype-based iterative method for statistical fine mapping using GWAS summary statistics.

    PubMed

    Zheng, Jie; Rodriguez, Santiago; Laurin, Charles; Baird, Denis; Trela-Larsen, Lea; Erzurumluoglu, Mesut A; Zheng, Yi; White, Jon; Giambartolomei, Claudia; Zabaneh, Delilah; Morris, Richard; Kumari, Meena; Casas, Juan P; Hingorani, Aroon D; Evans, David M; Gaunt, Tom R; Day, Ian N M

    2017-01-01

    Fine mapping is a widely used approach for identifying the causal variant(s) at disease-associated loci. Standard methods (e.g. multiple regression) require individual-level genotypes. Recent fine mapping methods using summary-level data require the pairwise correlation coefficients of the variants. However, haplotypes, rather than pairwise correlations, are the true biological representation of linkage disequilibrium (LD) among multiple loci. In this article, we present an empirical iterative method, HAPlotype Regional Association analysis Program (HAPRAP), that enables fine mapping using summary statistics and haplotype information from an individual-level reference panel. Simulations with individual-level genotypes show that the results of HAPRAP and multiple regression are highly consistent. In simulations with summary-level data, we demonstrate that HAPRAP is less sensitive to poor LD estimates. In a parametric simulation using Genetic Investigation of ANthropometric Traits height data, HAPRAP performs well with a small training sample size (N < 2000) while other methods become suboptimal. Moreover, HAPRAP's performance is not affected substantially by single nucleotide polymorphisms (SNPs) with low minor allele frequencies. We applied the method to existing quantitative trait and binary outcome meta-analyses (human height, QTc interval and gallbladder disease); all previously reported association signals were replicated and two additional variants were independently associated with human height. Due to the growing availability of summary level data, the value of HAPRAP is likely to increase markedly for future analyses (e.g. functional prediction and identification of instruments for Mendelian randomization). The HAPRAP package and documentation are available at http://apps.biocompute.org.uk/haprap/. Contact: jie.zheng@bristol.ac.uk or tom.gaunt@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  18. Serum and synovial fluid C-reactive protein level variations in dogs with degenerative joint disease and their relationships with physiological parameters.

    PubMed

    Boal, S; Miguel Carreira, L

    2015-09-01

    Degenerative joint disease (DJD) is a progressive, chronic joint disease with an inflammatory component promoting an acute phase protein (APP) response. C-reactive protein (CRP) is one of the most important APPs, used as an inflammation marker in human, but not veterinary, medicine. The study was developed in a sample of 48 dogs (n = 48) with DJD and aimed to: 1) identify and quantify synovial fluid CRP (SFCRP) in these specimens using a validated ELISA test for serum CRP (SCRP) detection and quantification; and 2) study the possible relationship between SCRP and SFCRP level variations in DJD patients, evaluating the influence of physical parameters such as gender, body weight, pain level, DJD grade, and the physical activity (PA) of the patients. Statistical analysis considered the results significant for p values <0.05. Our study showed that it is possible to detect and quantify SFCRP levels in DJD patients using a previously validated canine SCRP ELISA test, allowing us to point out a preliminary reference value for SFCRP in patients with DJD. However, individuals with DJD presented SCRP values within the normal reference range, and SFCRP levels were always lower. Obesity, pain, and the DJD grade presented by the patients are conditions which seem to influence the SCRP levels but not the SFCRP.

  19. Factors influencing students' perceptions of their quantitative skills

    NASA Astrophysics Data System (ADS)

    Matthews, Kelly E.; Hodgson, Yvonne; Varsavsky, Cristina

    2013-09-01

    There is international agreement that quantitative skills (QS) are an essential graduate competence in science. QS refer to the application of mathematical and statistical thinking and reasoning in science. This study reports on the use of the Science Students Skills Inventory to capture final year science students' perceptions of their QS across multiple indicators, at two Australian research-intensive universities. Statistical analysis reveals several variables predicting higher levels of self-rated competence in QS: students' grade point average, students' perceptions of inclusion of QS in the science degree programme, their confidence in QS, and their belief that QS will be useful in the future. The findings are discussed in terms of implications for designing science curricula more effectively to build students' QS throughout science degree programmes. Suggestions for further research are offered.

  20. Moving beyond Assumptions: The Use of Virtual Reference Data in an Academic Library

    ERIC Educational Resources Information Center

    Nolen, David S.; Powers, Amanda Clay; Zhang, Li; Xu, Yue; Cannady, Rachel E.; Li, Judy

    2012-01-01

    The Mississippi State University Libraries' Virtual Reference Service collected statistics about virtual reference usage. Analysis of the data collected by an entry survey from chat and e-mail transactions provided librarians with concrete information about what patron groups were the highest and lowest users of virtual reference services. These…

  1. Blind prediction of natural video quality.

    PubMed

    Saad, Michele A; Bovik, Alan C; Charrier, Christophe

    2014-03-01

    We propose a blind (no reference or NR) video quality evaluation model that is nondistortion specific. The approach relies on a spatio-temporal model of video scenes in the discrete cosine transform domain, and on a model that characterizes the type of motion occurring in the scenes, to predict video quality. We use the models to define video statistics and perceptual features that are the basis of a video quality assessment (VQA) algorithm that does not require the presence of a pristine video to compare against in order to predict a perceptual quality score. The contributions of this paper are threefold. 1) We propose a spatio-temporal natural scene statistics (NSS) model for videos. 2) We propose a motion model that quantifies motion coherency in video scenes. 3) We show that the proposed NSS and motion coherency models are appropriate for quality assessment of videos, and we utilize them to design a blind VQA algorithm that correlates highly with human judgments of quality. The proposed algorithm, called video BLIINDS, is tested on the LIVE VQA database and on the EPFL-PoliMi video database and shown to perform close to the level of top performing reduced and full reference VQA algorithms.

  2. Current Practices of Measuring and Reference Range Reporting of Free and Total Testosterone in the United States.

    PubMed

    Le, Margaret; Flores, David; May, Danica; Gourley, Eric; Nangia, Ajay K

    2016-05-01

    The evaluation and management of male hypogonadism should be based on symptoms and on serum testosterone levels. Diagnostically this relies on accurate testing and reference values. Our objective was to define the distribution of reference values and assays for free and total testosterone by clinical laboratories in the United States. Upper and lower reference values, assay methodology and source of published reference ranges were obtained from laboratories across the country. A standardized survey was reviewed with laboratory staff via telephone. Descriptive statistics were used to tabulate results. We surveyed a total of 120 laboratories in 47 states. Total testosterone was measured in house at 73% of laboratories. At the remaining laboratories studies were sent to larger centralized reference facilities. The mean ± SD lower reference value of total testosterone was 231 ± 46 ng/dl (range 160 to 300) and the mean upper limit was 850 ± 141 ng/dl (range 726 to 1,130). Only 9% of laboratories where in-house total testosterone testing was performed created a reference range unique to their region. Others validated the instrument recommended reference values in a small number of internal test samples. For free testosterone 82% of laboratories sent testing to larger centralized reference laboratories where equilibrium dialysis and/or liquid chromatography with mass spectrometry was done. The remaining laboratories used published algorithms to calculate serum free testosterone. Reference ranges for testosterone assays vary significantly among laboratories. The ranges are predominantly defined by limited population studies of men with unknown medical and reproductive histories. These poorly defined and variable reference values, especially the lower limit, affect how clinicians determine treatment. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  3. First echelon hospital care before trauma center transfer in a rural trauma system: does it affect outcome?

    PubMed

    Helling, Thomas S; Davit, Flavia; Edwards, Kim

    2010-12-01

    Rural trauma has been associated with higher mortality because of a number of geographic and demographic factors. Many victims, of necessity, are first cared for in nearby hospitals, many of which are not designated trauma centers (TCs), and then transferred to identified TCs. This first echelon care might adversely affect eventual outcome. We have sought to examine the fate of trauma patients transferred after first echelon hospital evaluation and treatment. All trauma patients transferred (referred group) to a Pennsylvania Level I TC located in a geographically isolated and rural setting during a 68-month period were retrospectively compared with patients transported directly to the TC (direct group). Outcome measures included mortality, complications, physiologic parameters on arrival at the TC, operations within 6 hours of arrival at the TC, discharge disposition from the TC, and functional outcome. Patients with an injury severity score <9 and those discharged from the TC within 24 hours were excluded. During the study period, 2,388 patients were transported directly and 529 were transferred. Mortality between groups was not different: 6% (referred) versus 9% (direct), p = 0.074. Occurrence of complications was not different between the two groups. Physiologic parameters (systolic blood pressure, heart rate, and Glasgow Coma Scale score) at admission to the Level I TC differed statistically between the two groups but seemed near equivalent clinically. Sixteen percent of patients required an operative procedure within 6 hours in the direct group compared with 10% in the referral group (p = 0.001). Hospital and intensive care unit length of stay were less in the referred group, although this was not statistically significant. Performance scores on discharge were equivalent in all categories except transfer ability. Time from injury to definitive care (TC) was 1.6 hours ± 3.0 hours in the direct group and 5.3 hours ± 3.8 hours in the referred group (p < 0.0001). The most common procedure performed at first echelon hospitals was airway control (55% of referred patients). In this rural setting, care at first echelon hospitals, most (95%) of which were not designated TCs, seemed to augment, rather than detract from, favorable outcomes realized after definitive care at the TC.

  4. Improved method for selection of the NOAEL.

    PubMed

    Calabrese, E J; Baldwin, L A

    1994-02-01

    The paper proposes that the NOAEL be defined as the highest dosage tested that is not statistically significantly different from the control group while also being statistically significantly different from the LOAEL. This new definition requires that the NOAEL be defined from two points of reference rather than the current approach (i.e., single point of reference) in which the NOAEL represents only the highest dosage not statistically significantly different from the control group. This proposal is necessary in order to differentiate NOAELs which are statistically distinguishable from the LOAEL. Under the new regime only those satisfying both criteria would be designated a true NOAEL while those satisfying only one criterion (i.e., not statistically significantly different from the control group) would be designated a "quasi" NOAEL and handled differently (i.e., via an uncertainty factor) for risk assessment purposes.
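
    A toy Python sketch of the proposed two-criterion rule, with the LOAEL taken, for illustration, as the lowest dose that differs significantly from control, and plain t-tests standing in for whatever test is appropriate to the data:

        import numpy as np
        from scipy import stats

        def select_noael(groups, alpha=0.05):
            # groups: dict mapping dose -> array of responses; dose 0 is the control.
            # Returns (true_noael, quasi_noael) under the two-point-of-reference definition.
            control = np.asarray(groups[0], float)
            doses = sorted(d for d in groups if d > 0)
            differs_from_control = {d: stats.ttest_ind(groups[d], control).pvalue < alpha
                                    for d in doses}
            loael = next((d for d in doses if differs_from_control[d]), None)
            true_noael = quasi_noael = None
            for d in doses:
                if not differs_from_control[d]:
                    quasi_noael = d  # highest dose not different from control
                    if loael is not None and \
                            stats.ttest_ind(groups[d], groups[loael]).pvalue < alpha:
                        true_noael = d  # also differs from the LOAEL
            return true_noael, quasi_noael

        rng = np.random.default_rng(5)
        data = {0: rng.normal(10, 1, 20), 1: rng.normal(10.1, 1, 20),
                5: rng.normal(10.4, 1, 20), 25: rng.normal(12.0, 1, 20)}
        print(select_noael(data))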

  5. Thematic Accuracy Assessment of the 2011 National Land ...

    EPA Pesticide Factsheets

    Accuracy assessment is a standard protocol of National Land Cover Database (NLCD) mapping. Here we report agreement statistics between map and reference labels for NLCD 2011, which includes land cover for ca. 2001, ca. 2006, and ca. 2011. The two main objectives were assessment of agreement between map and reference labels for the three, single-date NLCD land cover products at Level II and Level I of the classification hierarchy, and agreement for 17 land cover change reporting themes based on Level I classes (e.g., forest loss; forest gain; forest, no change) for three change periods (2001–2006, 2006–2011, and 2001–2011). The single-date overall accuracies were 82%, 83%, and 83% at Level II and 88%, 89%, and 89% at Level I for 2011, 2006, and 2001, respectively. Many class-specific user's accuracies met or exceeded a previously established nominal accuracy benchmark of 85%. Overall accuracies for 2006 and 2001 land cover components of NLCD 2011 were approximately 4% higher (at Level II and Level I) than the overall accuracies for the same components of NLCD 2006. The high Level I overall, user's, and producer's accuracies for the single-date eras in NLCD 2011 did not translate into high class-specific user's and producer's accuracies for many of the 17 change reporting themes. User's accuracies were high for the no change reporting themes, commonly exceeding 85%, but were typically much lower for the reporting themes that represented change. Only forest l
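
    For readers unfamiliar with the agreement statistics being reported, the Python sketch below computes overall, user's, and producer's accuracies from a simple map-versus-reference confusion matrix; the counts are made up, and the NLCD assessment itself uses design-based (survey-weighted) estimators rather than these raw proportions:

        import numpy as np

        def accuracy_report(confusion):
            # confusion[i, j] = number of samples mapped as class i with reference class j.
            c = np.asarray(confusion, dtype=float)
            overall = np.trace(c) / c.sum()
            users = np.diag(c) / c.sum(axis=1)      # correct / all samples mapped to the class
            producers = np.diag(c) / c.sum(axis=0)  # correct / all reference samples in the class
            return overall, users, producers

        cm = [[80, 5, 2],   # toy 3-class example (rows = map label, columns = reference label)
              [6, 70, 4],
              [1, 3, 60]]
        overall, users, producers = accuracy_report(cm)
        print(f"overall = {overall:.1%}, user's = {np.round(users, 2)}, producer's = {np.round(producers, 2)}")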

  6. Geographic Proximity and Racial Disparities in Cancer Clinical Trial Participation

    PubMed Central

    Kanarek, Norma F.; Tsai, Hua-Ling; Metzger-Gaud, Sharon; Damron, Dorothy; Guseynova, Alla; Klamerus, Justin F.; Rudin, Charles M.

    2011-01-01

    This study assessed the effects of race and place of residence on clinical trial participation by patients seen at a designated NCI comprehensive cancer center. Clinical trial accrual to cancer case ratios were evaluated using a database of residents of the continental United States seen at The Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins from 2005 to 2007. Place of residence was categorized into 3 nonoverlapping geographic areas: Baltimore City, non–Baltimore City catchment area, and non–catchment area. Controlling for age, sex, county poverty level, and cancer site, significant race and place of residence differences were seen in therapeutic or nontherapeutic clinical trials participation. White non–Baltimore City catchment area residents, the designated reference group, achieved the highest participation rate. Although the test of interaction (control group compared with all others) was not significant, some race–geographic area group differences were detected. In therapeutic trials, most race–place of residence group levels were statistically lower and different from reference; in nontherapeutic trials, race-specific Baltimore City groups participated at levels similar to reference. Baltimore City residents had lower participation rates only in therapeutic trials, irrespective of race. County poverty level was not significant but was retained as a confounder. Place of residence and race were found to be significant predictors of participation in therapeutic and nontherapeutic clinical trials, although patterns differed somewhat between therapeutic and nontherapeutic trials. Clinical trial accruals are not uniform across age, sex, race, place of residence, cancer site, or trial type, underscoring that cancer centers must better understand their source patients to enhance clinical trial participation. PMID:21147901

  7. [The alpha-fetoprotein in prognosis of survival of and functional rehabilitation of patients with ischemic stroke].

    PubMed

    Arkhipkin, A A; Liang, O V; Kochetov, A G

    2014-10-01

    The study was carried out to determine the prognostic value of alpha-fetoprotein for lethal outcome and the degree of functional rehabilitation in patients with ischemic stroke. The sample included 216 patients in the acute period of ischemic stroke. On the first day of the disease, the level of human alpha-fetoprotein was measured. On the second day of the disease, the degree of functional rehabilitation was evaluated and the rate of lethal outcomes was calculated. The reference interval for alpha-fetoprotein had previously been calculated according to the guidelines of the International Federation of Clinical Chemistry and the national standard, and amounted to 0.59-3.78 mE/l. The results demonstrated that a low level of alpha-fetoprotein is related to a higher risk of lethal outcome (SE=1.7, p=0.012). An increase in the alpha-fetoprotein level above this threshold value statistically significantly increases the probability of survival. A further increase above 2.28 mE/l is related to subsequent good functional rehabilitation according to the modified Rankin scale (SE=1.4, p=0.001) and the Barthel index (SE=1.49, p<0.001).
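
    The IFCC/CLSI-style calculation referred to above is commonly a nonparametric central 95% interval; the Python sketch below shows that generic form with simulated data and is not the authors' exact procedure:

        import numpy as np

        def nonparametric_reference_interval(values, lower_pct=2.5, upper_pct=97.5):
            # Central 95% nonparametric reference interval from a healthy reference sample.
            v = np.asarray(values, dtype=float)
            return float(np.percentile(v, lower_pct)), float(np.percentile(v, upper_pct))

        rng = np.random.default_rng(3)
        reference_sample = rng.lognormal(mean=0.3, sigma=0.5, size=120)  # toy marker values
        lo, hi = nonparametric_reference_interval(reference_sample)
        print(f"reference interval: {lo:.2f}-{hi:.2f}")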

  8. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    To improve the skill of Global Climate Models (GCMs) and Regional Climate Models (RCMs) in reproducing the statistics of rainfall at a basin level and at hydrologically relevant temporal scales (e.g. daily), two types of statistical approaches have been suggested. One is the statistical correction of climate model rainfall outputs using historical series of precipitation. The other is the use of stochastic models of rainfall to conditionally simulate precipitation series, based on large-scale atmospheric predictors produced by climate models (e.g. geopotential height, relative vorticity, divergence, mean sea level pressure). The latter approach, usually referred to as statistical rainfall downscaling, aims at reproducing the statistical character of rainfall, while accounting for the effects of large-scale atmospheric circulation (and, therefore, climate forcing) on rainfall statistics. While promising, statistical rainfall downscaling has not attracted much attention in recent years, since the suggested approaches involved complex (i.e. subjective or computationally intense) identification procedures of the local weather, in addition to demonstrating limited success in reproducing several statistical features of rainfall, such as seasonal variations, the distributions of dry and wet spell lengths, the distribution of the mean rainfall intensity inside wet periods, and the distribution of rainfall extremes. In an effort to remedy those shortcomings, Langousis and Kaleris (2014) developed a statistical framework for simulation of daily rainfall intensities conditional on upper air variables, which accurately reproduces the statistical character of rainfall at multiple time-scales. Here, we study the relative performance of: a) quantile-quantile (Q-Q) correction of climate model rainfall products, and b) the statistical downscaling scheme of Langousis and Kaleris (2014), in reproducing the statistical structure of rainfall, as well as rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) into 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation are compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions.
Acknowledgments The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 highly acknowledges the contribution of the Sardinian regional authorities.
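
    In its generic form, the Q-Q correction compared above is empirical quantile mapping of model rainfall onto the observed distribution over a calibration period; the Python sketch below illustrates that generic idea with synthetic gamma-distributed rainfall and is not the authors' implementation:

        import numpy as np

        def quantile_map(model_calib, obs_calib, model_new):
            # Replace each model value by the observed value at the same empirical quantile,
            # with quantiles defined over the calibration period.
            model_calib = np.sort(np.asarray(model_calib, float))
            obs_calib = np.asarray(obs_calib, float)
            q = np.searchsorted(model_calib, model_new) / len(model_calib)
            return np.quantile(obs_calib, np.clip(q, 0.0, 1.0))

        rng = np.random.default_rng(4)
        obs = rng.gamma(0.6, 12.0, size=5000)   # "observed" daily MAP, calibration period
        mod = rng.gamma(0.9, 6.0, size=5000)    # biased climate-model rainfall, same period
        corrected = quantile_map(mod, obs, rng.gamma(0.9, 6.0, size=1000))
        print(round(obs.mean(), 2), round(mod.mean(), 2), round(corrected.mean(), 2))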

  9. Reference values for serum levels of insulin-like growth factor 1 (IGF-1) and IGF-binding protein 3 (IGFBP-3) in the West Black Sea region of Turkey.

    PubMed

    Guven, Berrak; Can, Murat; Mungan, Gorkem; Acіkgoz, Serefden

    2013-03-01

    The aim of this study was to determine the normal values of serum IGF-1 and IGFBP-3 in Turkish children and adults (1-79 years). The study included 571 healthy children and 625 healthy adults from the West Black Sea region of Turkey. Serum IGF-1 and IGFBP-3 concentrations were determined using a chemiluminescent immunometric assay on an Immulite 1000 analyzer. IGF-1 and IGFBP-3 levels tended to be higher in girls compared to boys among the children. The differences were statistically significant in puberty from age 12-14 years for IGF-1 and prepubertally from age 9-10 years for IGFBP-3. Peaks of serum IGF-1 levels were observed 2 years earlier in girls (14 years) than boys (16 years). The general pattern of IGFBP-3 was similar to IGF-1 during puberty. In adults, IGF-1 and IGFBP-3 levels decreased by age. There was no significant difference in IGF-1 and IGFBP3 values between men and women in any age group. This study established age- and sex-specific reference values for serum IGF-1 and IGFBP-3 in healthy Turkish children and adults.

  10. A Comparative Study of the Applied Methods for Estimating Deflection of the Vertical in Terrestrial Geodetic Measurements

    PubMed Central

    Vittuari, Luca; Tini, Maria Alessandra; Sarti, Pierguido; Serantoni, Eugenio; Borghi, Alessandra; Negusini, Monia; Guillaume, Sébastien

    2016-01-01

    This paper compares three different methods capable of estimating the deflection of the vertical (DoV): the first is based on the joint use of high-precision spirit leveling and Global Navigation Satellite Systems (GNSS), the second uses astro-geodetic measurements, and the third uses gravimetric geoid models. The working data sets refer to the geodetic International Terrestrial Reference Frame (ITRF) co-location sites of Medicina (Northern Italy) and Noto (Sicily), both excellent test beds for our investigations. The measurements were planned and realized to estimate the DoV with a level of precision comparable to the angular accuracy achievable in high-precision networks measured by modern high-end total stations. The three methods are in excellent agreement, with an operational advantage for the astro-geodetic method, which is faster and more precise than the others. The method that combines leveling and GNSS has slightly larger standard deviations, although well within the 1 arcsec level that was assumed as the threshold. Finally, the geoid-model-based method, whose 2.5 arcsec standard deviations exceed this threshold, is nevertheless statistically consistent with the others and should be used to determine the DoV components where local ad hoc measurements are lacking. PMID:27104544

  11. Development and field application of a nonlinear ultrasonic modulation technique for fatigue crack detection without reference data from an intact condition

    NASA Astrophysics Data System (ADS)

    Lim, Hyung Jin; Kim, Yongtak; Koo, Gunhee; Yang, Suyoung; Sohn, Hoon; Bae, In-hwan; Jang, Jeong-Hwan

    2016-09-01

    In this study, a fatigue crack detection technique, which detects a fatigue crack without relying on any reference data obtained from the intact condition of a target structure, is developed using nonlinear ultrasonic modulation and applied to a real bridge structure. Using two wafer-type lead zirconate titanate (PZT) transducers, ultrasonic excitations at two distinct frequencies are applied to a target inspection spot, and the corresponding ultrasonic response is measured by another PZT transducer. Then, the nonlinear modulation components produced by a breathing crack are extracted from the measured ultrasonic response, and a statistical classifier is proposed that determines whether the nonlinear modulation components are statistically significant in comparison with the background noise level. The effectiveness of the proposed fatigue crack detection technique is experimentally validated using data obtained from aluminum plates and aircraft fitting-lug specimens under varying temperature and loading conditions, and through field testing of Yeongjong Grand Bridge in South Korea. The uniqueness of this study lies in (1) the detection of micro fatigue cracks less than 1 μm wide, as well as fatigue cracks in the range of 10-20 μm in width, using nonlinear ultrasonic modulation, (2) the automated detection of fatigue crack formation without using reference data obtained from an intact condition, (3) reliable and robust diagnosis under varying temperature and loading conditions, and (4) the application of a local fatigue crack detection technique to online monitoring of a real bridge.
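
    The statistical classifier described above tests whether the nonlinear modulation components stand out from the background noise. A rough, generic illustration of that idea is sketched below: the spectral amplitude at the expected sideband frequencies (f_high ± f_low) is compared with neighbouring noise bins via a simple z-score. The actual classifier in the paper is more elaborate, and all names and parameters here are hypothetical.

```python
import numpy as np

def sideband_zscores(signal, fs, f_low, f_high, guard=5, noise_bins=50):
    """Z-scores of the spectral amplitudes at the modulation sidebands
    (f_high - f_low and f_high + f_low) relative to neighbouring noise bins."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    scores = []
    for f in (f_high - f_low, f_high + f_low):
        k = np.argmin(np.abs(freqs - f))            # sideband bin
        lo = max(k - guard - noise_bins, 0)
        noise = np.concatenate([spec[lo:k - guard],
                                spec[k + guard:k + guard + noise_bins]])
        scores.append((spec[k] - noise.mean()) / noise.std())
    return scores  # large positive values suggest crack-induced modulation
```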

  12. Thematic accuracy assessment of the 2011 National Land Cover Database (NLCD)

    USGS Publications Warehouse

    Wickham, James; Stehman, Stephen V.; Gass, Leila; Dewitz, Jon; Sorenson, Daniel G.; Granneman, Brian J.; Poss, Richard V.; Baer, Lori Anne

    2017-01-01

    Accuracy assessment is a standard protocol of National Land Cover Database (NLCD) mapping. Here we report agreement statistics between map and reference labels for NLCD 2011, which includes land cover for ca. 2001, ca. 2006, and ca. 2011. The two main objectives were assessment of agreement between map and reference labels for the three single-date NLCD land cover products at Level II and Level I of the classification hierarchy, and agreement for 17 land cover change reporting themes based on Level I classes (e.g., forest loss; forest gain; forest, no change) for three change periods (2001–2006, 2006–2011, and 2001–2011). The single-date overall accuracies were 82%, 83%, and 83% at Level II and 88%, 89%, and 89% at Level I for 2011, 2006, and 2001, respectively. Many class-specific user's accuracies met or exceeded a previously established nominal accuracy benchmark of 85%. Overall accuracies for the 2006 and 2001 land cover components of NLCD 2011 were approximately 4% higher (at Level II and Level I) than the overall accuracies for the same components of NLCD 2006. The high Level I overall, user's, and producer's accuracies for the single-date eras in NLCD 2011 did not translate into high class-specific user's and producer's accuracies for many of the 17 change reporting themes. User's accuracies were high for the no-change reporting themes, commonly exceeding 85%, but were typically much lower for the reporting themes that represented change. Only forest loss, forest gain, and urban gain had user's accuracies that exceeded 70%. Lower user's accuracies for the other change reporting themes may be attributable to the difficulty in determining the context of grass (e.g., open urban, grassland, agriculture) and in distinguishing between the components of the forest-shrubland-grassland gradient at either the mapping phase, the reference label assignment phase, or both. NLCD 2011 user's accuracies for forest loss, forest gain, and urban gain compare favorably with results from other land cover change accuracy assessments.
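
    The agreement statistics reported above are standard functions of the map-versus-reference confusion (error) matrix. The short sketch below shows how overall, user's and producer's accuracies are computed for a simple unweighted matrix; the published NLCD assessment additionally applies area-based weighting, which is omitted here, and the example counts are made up.

```python
import numpy as np

def accuracy_metrics(cm):
    """cm[i, j] = number of samples mapped as class i with reference class j."""
    cm = np.asarray(cm, dtype=float)
    overall = np.trace(cm) / cm.sum()
    users = np.diag(cm) / cm.sum(axis=1)      # complement of commission error
    producers = np.diag(cm) / cm.sum(axis=0)  # complement of omission error
    return overall, users, producers

cm = [[85, 10,  5],
      [ 7, 78, 15],
      [ 3,  9, 88]]
overall, users, producers = accuracy_metrics(cm)
```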

  13. The Reference Statistic Shuffle: Finding a Tool That Not Only Adds Stuff up but Improves Results

    ERIC Educational Resources Information Center

    Northam, Sarah H.

    2012-01-01

    Reference librarians are constantly balancing their reference duties against other duties, for example, collection development and departmental liaison. The author works as a reference librarian at Texas A&M University-Commerce Libraries in Commerce, Texas. The university has approximately 11,000 students, a large number of whom are…

  14. Location Is Everything: The Use and Marketing of Reference E-Mail

    ERIC Educational Resources Information Center

    Collins, Susan L.

    2006-01-01

    Reference e-mail continues to be a vital reference service. This article studies the trends in reference e-mail use over an eight year period. Usage statistics are analyzed particularly in light of the marketing of the service via changes in the location of the service on the official university libraries' Web pages. Included are recommendations…

  15. Collaborative derivation of reference intervals for major clinical laboratory tests in Japan.

    PubMed

    Ichihara, Kiyoshi; Yamamoto, Yoshikazu; Hotta, Taeko; Hosogaya, Shigemi; Miyachi, Hayato; Itoh, Yoshihisa; Ishibashi, Midori; Kang, Dongchon

    2016-05-01

    Three multicentre studies of reference intervals were conducted recently in Japan. The Committee on Common Reference Intervals of the Japan Society of Clinical Chemistry sought to establish common reference intervals for 40 laboratory tests which were measured in common in the three studies and regarded as well harmonized in Japan. The study protocols were comparable, with recruitment mostly from hospital workers with body mass index ≤28 and no medications. Age and sex distributions were made equal to obtain a final data size of 6345 individuals. Between-subgroup differences were expressed as the SD ratio (between-subgroup SD divided by the SD representing the reference interval). Between-study differences were all within acceptable levels, and thus the three datasets were merged. Adopting an SD ratio ≥0.50 as a guide, sex-specific reference intervals were necessary for 12 assays, and age-specific reference intervals for females, partitioned at age 45, were required for five analytes. For 10 items closely associated with disorders prevalent among healthy individuals, applying the latent abnormal values exclusion method appreciably narrowed the reference intervals derived by the parametric method. Sex- and age-related profiles of reference values, derived from individuals with no abnormal results in major tests, showed distinctive patterns specific to each analyte. Common reference intervals for nationwide use were thus developed for 40 major tests, based on three multicentre studies and advanced statistical methods. Sex- and age-related profiles of reference values are of great relevance not only for interpreting test results, but also for applying the clinical decision limits specified in various clinical guidelines. © The Author(s) 2015.
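
    As a simplified sketch of the partitioning criterion described above, the code below computes an SD ratio by dividing the SD of the subgroup means by the SD of the pooled reference distribution, with values around 0.50 or above suggesting partitioning. The study's actual computation uses a nested-ANOVA formulation and a reference-interval-based SD, so this is only an approximation of the idea; the data and names are hypothetical.

```python
import numpy as np

def sd_ratio(groups):
    """Between-subgroup SD (SD of subgroup means) divided by the SD of the
    pooled values; a simplified stand-in for the study's SD ratio."""
    pooled = np.concatenate(groups)
    group_means = np.array([np.mean(g) for g in groups])
    between_sd = np.std(group_means, ddof=1)
    return between_sd / np.std(pooled, ddof=1)

# Hypothetical sex subgroups for one analyte
males = np.random.default_rng(1).normal(14.5, 1.2, 300)
females = np.random.default_rng(2).normal(13.0, 1.1, 300)
ratio = sd_ratio([males, females])   # >= ~0.5 would suggest sex partitioning
```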

  16. Reference aquaplanet climate in the Community Atmosphere Model, Version 5

    DOE PAGES

    Medeiros, Brian; Williamson, David L.; Olson, Jerry G.

    2016-03-18

    In this study, fundamental characteristics of the aquaplanet climate simulated by the Community Atmosphere Model, Version 5.3 (CAM5.3) are presented. The assumptions and simplifications of the configuration are described. A 16-year-long, perpetual-equinox integration with prescribed SST using the model's standard grid spacing is presented as a reference simulation. Statistical analysis shows that similar aquaplanet configurations can be run for about 2 years to obtain robust climatological structures, including global and zonal means, eddy statistics, and precipitation distributions. Such a simulation can be compared to the reference simulation to discern differences in the climate, including an assessment of confidence in the differences. To aid such comparisons, the reference simulation has been made available via earthsystemgrid.org. Examples are shown comparing the reference simulation with simulations from the CAM5 series that make different microphysical assumptions and use a different dynamical core.

  17. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping

    PubMed Central

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-01-01

    Background: Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. Methods: The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). Results: A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. Conclusion: The suitability of SPM for application to the experimental model and its ability to provide insight into CBF changes in response to traumatic injury were validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques. PMID:18312639
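
    As a generic illustration of voxel-level statistical mapping of hypoperfusion, the sketch below runs an uncorrected two-sample t-test at every voxel and flags voxels where the test group shows significantly lower perfusion. SPM itself fits a general linear model and corrects for multiple comparisons, so this is only a schematic approximation; the array shapes and names are assumptions.

```python
import numpy as np
from scipy import stats

def voxelwise_hypoperfusion_map(test_group, control_group, alpha=0.01):
    """Uncorrected two-sample t-test at every voxel.
    Inputs have shape (n_subjects, x, y, z); returns a boolean map of voxels
    where the test group shows significantly lower perfusion."""
    test = np.asarray(test_group, dtype=float)
    control = np.asarray(control_group, dtype=float)
    t, p = stats.ttest_ind(test, control, axis=0)   # t-test across subjects
    lower = test.mean(axis=0) < control.mean(axis=0)
    return (p < alpha) & lower
```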

  18. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping.

    PubMed

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-02-29

    Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques.

  19. Statistical Methods in Psychology Journals.

    ERIC Educational Resources Information Center

    Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references. (SLD)

  20. 77 FR 74103 - Alternatives to the Use of Credit Ratings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ... NPRM identified references made to nationally recognized statistical rating organization (NRSRO) \\3... or external credit risk assessments (including credit ratings), and default statistics. The preamble...); Default statistics (i.e., whether providers of credit information relating to securities express a view...

  1. Reference Ranges for Serum Uric Acid among Healthy Assamese People

    PubMed Central

    Das, Madhumita; Borah, N. C.; Ghose, M.; Choudhury, N.

    2014-01-01

    This study was designed to establish reference ranges for serum uric acid in a healthy adult Assamese population. Samples from 1470 individuals aged 35–86 years were used to establish age- and sex-related reference ranges for serum uric acid by the centile method (central 95% interval). There were 51% (n = 754) males and 49% (n = 716) females; 75.9% (n = 1115) of them were from urban areas and the remaining 24.1% (n = 355) were from rural areas. The majority of the population were nonvegetarian (98.6%, n = 1450) and only 1.4% (n = 20) were vegetarian. The mean age, weight, height, and uric acid of the studied group were 53.6 ± 11.3 years, 62.6 ± 10.5 kg, 160 ± 9.4 cm, and 5.5 ± 1.4 mg/dL, respectively. There was a statistically significant difference in the mean values of the abovementioned parameters between males and females. The observed reference range of uric acid in the population is 2.6–8.2 mg/dL, which is wider than the current reference range used in the laboratory. Except for gender (P < 0.0001), we did not find any significant relation of uric acid with the other selected factors. PMID:24672726

  2. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  3. Fate of Yersinia enterocolitica during manufacture, ripening and storage of Lighvan cheese.

    PubMed

    Hanifian, Shahram; Khani, Sajjad

    2012-05-15

    This study aimed to evaluate the behavior of virulent Yersinia enterocolitica (YE) during the manufacture, ripening and storage of Lighvan cheese, with particular reference to the YE strain, the initial inoculation level, and the storage time. Three strains of YE at low (1 log cfu/ml) and high (3 log cfu/ml) inoculation levels were inoculated into raw whole ewe's milk, which was then used for the manufacture of Lighvan cheese. Throughout the manufacturing, ripening and storage periods, the number of YE was counted on selective media. Enumerated colonies were then confirmed by duplex PCR using the ail and virF genes. Moreover, some microbial and physiochemical characteristics of the cheese samples were examined. According to the results, the initial inoculation level and storage time had statistically significant (P<0.01) effects on the persistence of YE, while strain type exhibited no statistically significant (P>0.01) impact on survival of the pathogen. The results showed a rapid increase in the number of YE during manufacturing; however, during the ripening and storage periods the number of YE decreased, and the pathogen was eventually eliminated in all cheese batches after 4 months of storage. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. [Satisfaction with hospital care among diabetic outpatients and its associated factors. Secondary use of official statistics].

    PubMed

    Tsuboi, Satoshi; Uehara, Ritei; Oguma, Taeko; Kojo, Takao; Enkh-Oyun, Tsogzolbaatar; Kotani, Kazuhiko; Aoyama, Yasuko; Okayama, Akira; Hashimoto, Shuji; Yamagata, Zentaro; Ohashi, Yasuo; Katanoda, Kota; Nakamura, Yosikazu; Sobue, Tomotaka

    2014-01-01

    Generalizable data on current satisfaction levels are required to establish a scientific basis for the political advancement of measures to improve satisfaction with hospital care among patients with diabetes. The present study made secondary use of existing official statistics in order to demonstrate the range of satisfaction levels with hospital care among diabetic outpatients and to closely examine related factors. Data sets that consolidated the Patient Survey, the Survey of Medical Care Institutions, and the Patient Behavior Survey (all from 2008) were created. Shared medical institution survey reference numbers were used to consolidate the data from the Patient Survey and the Survey of Medical Care Institutions, and in addition, sex and date of birth were used to consolidate the Patient Behavior Survey data. The range of satisfaction levels with hospital care among diabetic outpatients was investigated along with any relationship with the following potentially related factors: visitation status (first or repeat examination); waiting time until examination; examination duration; care-seeking status (any use of other medical facilities, etc.); diabetic complications; other complications; coverage under the Public Assistance Act; smoking cessation outpatient services; hospitals that specialized in treating diabetes (metabolic medicine); medical care on Saturday, Sunday, and public holidays; and provision of health checkups. Overall, 62.3% of diabetic outpatients were either fairly or extremely satisfied with their hospital care, whereas 5.6% expressed dissatisfaction. Satisfaction levels with hospital care were found to be significantly related to visitation status, waiting time until examination, examination duration, care-seeking status, and Saturday medical care. Multivariate analysis with the factors demonstrated to be significantly related to satisfaction revealed significant relationships between high satisfaction levels and repeat examinations, short waiting times, no use of any other medical facilities, and long examinations. Consolidating official statistics from multiple sources indicated the range of satisfaction levels with hospital care among diabetic outpatients and facilitated the clarification of factors affecting satisfaction. Reducing waiting times and ensuring sufficient time spent on examinations are important for increasing satisfaction levels with hospital care among patients with diabetes. It is hoped that official statistics can be further applied to many future public health policy studies.

  5. Determination of Age-Dependent Reference Ranges for Coagulation Tests Performed Using Destiny Plus.

    PubMed

    Arslan, Fatma Demet; Serdar, Muhittin; Merve Ari, Elif; Onur Oztan, Mustafa; Hikmet Kozcu, Sureyya; Tarhan, Huseyin; Cakmak, Ozgur; Zeytinli, Merve; Yasar Ellidag, Hamit

    2016-06-01

    In order to apply the right treatment for hemostatic disorders in pediatric patients, laboratory data should be interpreted with age-appropriate reference ranges. The purpose of this study was to determine age-dependent reference ranges for prothrombin time (PT), activated partial thromboplastin time (aPTT), fibrinogen, and D-dimer tests. A total of 320 volunteers were included in the study with the following ages: 1 month - 1 year (n = 52), 2 - 5 years (n = 50), 6 - 10 years (n = 48), 11 - 17 years (n = 38), and 18 - 65 years (n = 132). Each volunteer completed a survey to exclude hemostatic system disorders. Using a nonparametric method, the lower and upper limits, covering the central 95% of the distribution with 90% confidence intervals, were calculated. No statistically significant differences were found between PT and aPTT values in the groups consisting of children; thus, the reference ranges were separated into child and adult age groups. PT and aPTT values were significantly higher in the children than in the adults. Fibrinogen values in the 6 - 10 age group and the adult age group were significantly higher than in the other groups. D-dimer levels were significantly lower in those aged 2 - 17; thus, a separate reference range was established. These results support other findings related to developmental hemostasis, confirming that adult and pediatric age groups should be evaluated using different reference ranges.
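
    The nonparametric calculation referred to above can be sketched as follows: the reference limits are the 2.5th and 97.5th percentiles of the healthy-subject values, and bootstrap resampling gives 90% confidence intervals around each limit. This is a minimal illustration under simple assumptions, not the laboratory's exact procedure.

```python
import numpy as np

def nonparametric_reference_interval(values, n_boot=2000, seed=0):
    """Central 95% reference interval (2.5th and 97.5th percentiles) with
    bootstrap 90% confidence intervals around each limit."""
    values = np.asarray(values, dtype=float)
    lower, upper = np.percentile(values, [2.5, 97.5])
    rng = np.random.default_rng(seed)
    resamples = rng.choice(values, size=(n_boot, values.size), replace=True)
    boots = np.percentile(resamples, [2.5, 97.5], axis=1)
    lower_ci = np.percentile(boots[0], [5, 95])   # 90% CI of the lower limit
    upper_ci = np.percentile(boots[1], [5, 95])   # 90% CI of the upper limit
    return (lower, lower_ci), (upper, upper_ci)
```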

  6. A versatile entropic measure of grey level inhomogeneity

    NASA Astrophysics Data System (ADS)

    Piasecki, Ryszard

    2009-06-01

    An entropic measure for the analysis of grey level inhomogeneity (GLI) is proposed as a function of length scale. It allows us to quantify the statistical dissimilarity between the actual macrostate and the entropy-maximizing reference one. The maxima (minima) of the measure indicate the scales at which higher (lower) average grey level inhomogeneity appears compared to neighbouring scales. Even a deeply hidden statistical grey level periodicity can be detected by the equally spaced minima of the measure. The striking effect of multiple intersecting curves (MICs) of the measure has been revealed for pairs of simulated patterns that differ only in shades of grey or in symmetry properties. In turn, for evolving photosphere granulation patterns, the position of the first peak has been found to be stable in time. Interestingly, the third peak is dominant at the initial steps of the evolution. This indicates a temporary grouping of granules at a length scale that may belong to the mesogranulation phenomenon. This behaviour has similarities with that reported for binarized granulation images of a different data set by Consolini, Berrilli et al. [G. Consolini, F. Berrilli, A. Florio, E. Pietropaolo, L.A. Smaldone, Astron. Astrophys. 402 (2003) 1115; F. Berrilli, D. Del Moro, S. Russo, G. Consolini, Th. Straus, Astrophys. J. 632 (2005) 677].
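
    To give a concrete feel for an entropy-versus-length-scale analysis of grey level inhomogeneity, the sketch below computes the mean Shannon entropy of grey-level histograms in k-by-k windows for a range of scales k. It is a generic illustration only; the measure proposed in the paper compares the actual macrostate with an entropy-maximizing reference macrostate and differs in detail.

```python
import numpy as np

def grey_level_entropy_profile(image, scales, n_levels=16):
    """Mean Shannon entropy of grey-level histograms in k-by-k windows,
    for a range of length scales k (a generic entropy-vs-scale profile)."""
    img = np.asarray(image, dtype=float)
    bins = np.linspace(img.min(), img.max(), n_levels)
    quantized = np.digitize(img, bins)            # grey values -> integer bins
    profile = []
    for k in scales:
        entropies = []
        for i in range(0, quantized.shape[0] - k + 1, k):
            for j in range(0, quantized.shape[1] - k + 1, k):
                counts = np.bincount(quantized[i:i + k, j:j + k].ravel(),
                                     minlength=n_levels + 2)
                p = counts[counts > 0] / counts.sum()
                entropies.append(-(p * np.log(p)).sum())
        profile.append(np.mean(entropies))        # mean window entropy at scale k
    return np.array(profile)
```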

  7. Comparison of serum 25-hydroxy vitamin D levels between mothers with small for gestational age and appropriate for gestational age newborns in Kerman.

    PubMed

    Mirzaei, Fatemeh; Amiri Moghadam, Tayebeh; Arasteh, Peyman

    2015-04-01

    Vitamin D deficiency during pregnancy is associated with some adverse pregnancy outcomes, but its relationship with fetal growth is unknown. We compared 25-hydroxy vitamin D levels between mothers of small for gestational age (SGA) newborns and mothers of appropriate for gestational age (AGA) newborns, and between the newborns themselves. The study population included pregnant women who were referred to Afzalipour Hospital in Kerman from 2012 to 2013. The case and control groups consisted of 40 pregnant mothers with SGA newborns and 40 pregnant mothers with AGA newborns, respectively. Maternal and infant 25-hydroxy vitamin D levels were measured in the two groups. 25-hydroxy vitamin D deficiency (<20 ng/ml) was significantly more frequent in women with SGA newborns than in women with AGA newborns (p=0.003). Vitamin D deficiency was also more frequent among the SGA newborns than among the AGA newborns (25% vs. 17.5%), although this difference was not statistically significant (p=0.379). The correlation between maternal and infant vitamin D levels was significant in both the SGA and the AGA group. Our study reveals a high prevalence of vitamin D deficiency in women with SGA infants in comparison to women with AGA infants. In addition, maternal vitamin D deficiency is associated with vitamin D deficiency in newborns.

  8. Selection of Reliable Reference Genes for Gene Expression Studies of a Promising Oilseed Crop, Plukenetia volubilis, by Real-Time Quantitative PCR

    PubMed Central

    Niu, Longjian; Tao, Yan-Bin; Chen, Mao-Sheng; Fu, Qiantang; Li, Chaoqiong; Dong, Yuling; Wang, Xiulan; He, Huiying; Xu, Zeng-Fu

    2015-01-01

    Real-time quantitative PCR (RT-qPCR) is a reliable and widely used method for gene expression analysis. The accuracy of the determination of a target gene expression level by RT-qPCR demands the use of appropriate reference genes to normalize the mRNA levels among different samples. However, suitable reference genes for RT-qPCR have not been identified in Sacha inchi (Plukenetia volubilis), a promising oilseed crop known for its polyunsaturated fatty acid (PUFA)-rich seeds. In this study, using RT-qPCR, twelve candidate reference genes were examined in seedlings and adult plants, during flower and seed development and for the entire growth cycle of Sacha inchi. Four statistical algorithms (delta cycle threshold (ΔCt), BestKeeper, geNorm, and NormFinder) were used to assess the expression stabilities of the candidate genes. The results showed that ubiquitin-conjugating enzyme (UCE), actin (ACT) and phospholipase A22 (PLA) were the most stable genes in Sacha inchi seedlings. For roots, stems, leaves, flowers, and seeds from adult plants, 30S ribosomal protein S13 (RPS13), cyclophilin (CYC) and elongation factor-1alpha (EF1α) were recommended as reference genes for RT-qPCR. During the development of reproductive organs, PLA, ACT and UCE were the optimal reference genes for flower development, whereas UCE, RPS13 and RNA polymerase II subunit (RPII) were optimal for seed development. Considering the entire growth cycle of Sacha inchi, UCE, ACT and EF1α were sufficient for the purpose of normalization. Our results provide useful guidelines for the selection of reliable reference genes for the normalization of RT-qPCR data for seedlings and adult plants, for reproductive organs, and for the entire growth cycle of Sacha inchi. PMID:26047338
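
    Of the statistical approaches mentioned above, the comparative ΔCt approach is the simplest to sketch: for each candidate gene, the standard deviations of its pairwise Ct differences against every other candidate are averaged across samples, and lower values indicate greater expression stability. The code below is a minimal illustration of that idea, with hypothetical input shapes; the other algorithms (geNorm, NormFinder, BestKeeper, RefFinder) use different criteria.

```python
import numpy as np

def delta_ct_stability(ct, gene_names):
    """ct: array of Ct values with shape (n_samples, n_genes).
    For each gene, average the standard deviations of its pairwise delta-Ct
    values against all other genes; lower scores mean more stable expression."""
    ct = np.asarray(ct, dtype=float)
    n_genes = ct.shape[1]
    scores = []
    for g in range(n_genes):
        sds = [np.std(ct[:, g] - ct[:, h], ddof=1)
               for h in range(n_genes) if h != g]
        scores.append(np.mean(sds))
    return sorted(zip(gene_names, scores), key=lambda pair: pair[1])
```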

  9. Establishment of a reference value for chromium in the blood for biological monitoring among occupational chromium workers.

    PubMed

    Li, Ping; Li, Yang; Zhang, Ji; Yu, Shan-Fa; Wang, Zhi-Liang; Jia, Guang

    2016-10-01

    The concentration of chromium in blood (CrB) has been confirmed as a biomarker of occupational chromium exposure, but its biological exposure indices (BEIs) are still unclear. We therefore collected data from 2006 and 2008 (Shandong Province, China) to analyze the relationship between the concentration of chromium in the air (CrA) of the workplaces and CrB, in order to establish a reference value of CrB for biological monitoring of occupational workers. The levels of indicators of nasal injury, kidney damage (β2 microglobulin (β2-MG)), and genetic damage (8-hydroxy-deoxyguanosine (8-OHdG) and micronucleus (MN) frequency) were measured in all subjects in 2011 (Henan Province, China) to verify the protective effect of this reference value of CrB. Compared with the control groups, the concentrations of CrA and CrB in the chromium-exposed groups were significantly higher (P < 0.05). Positive correlations were found between CrA and CrB in the chromium-exposed groups (r = 0.60 in 2006 and r = 0.35 in 2008). Based on the occupational exposure limit for CrA (50 μg/m³, China), a reference value of CrB of 20 μg/L was recommended. The levels of nasal injury, β2-MG, 8-OHdG, and MN were not significantly different between the low chromium exposed group (CrB ≤ 20 μg/L) and the control group, while the levels of β2-MG, 8-OHdG, and MN differed significantly between the high chromium exposed group and the control group. This research showed that, for occupational workers only, CrB can be used as a biomarker of environmental chromium exposure. The recommended reference value of CrB was 20 μg/L. © The Author(s) 2015.

  10. Accuracy and coverage of the modernized Polish Maritime differential GPS system

    NASA Astrophysics Data System (ADS)

    Specht, Cezary

    2011-01-01

    The DGPS navigation service augments the NAVSTAR Global Positioning System by providing localized pseudorange correction factors and ancillary information which are broadcast from selected marine reference stations. The position and integrity information of the DGPS service satisfy the requirements of coastal navigation and hydrographic surveys. The Polish Maritime DGPS system was established in 1994 and modernized in 2009 to meet the requirements set out in the IMO resolution for a future GNSS, while preserving backward signal compatibility of user equipment. After installation of the new L1/L2 reference equipment was finalized, performance tests were carried out. The paper presents the results of coverage modeling and of an accuracy measurement campaign based on long-term signal analyses of the DGPS reference station Rozewie, performed over 26 days in July 2009. The final results allowed verification of the coverage area of the differential signal from the reference station and calculation of the repeatable and absolute accuracy of the system after the technical modernization. The obtained field-strength coverage area and position statistics (215,000 fixes) were compared to past measurements performed in 2002 (coverage) and 2005 (accuracy), when the previous system infrastructure was in operation. So far, no comparable campaigns have been performed on differential Galileo. However, the signals, signal processing and receiver techniques are comparable to those known from DGPS, all satellite differential GNSS systems use the same transmission standard (RTCM), and maritime DGPS radiobeacons are standardized in all radio-communication aspects (frequency, binary rate, modulation); therefore, the accuracy of differential Galileo can be expected to be similar to that of DGPS. The coverage of the reference station was calculated with dedicated software, which computes the signal strength level from transmitter parameters or from a field signal-strength measurement campaign carried out at representative points. The software is based on a Baltic Sea vector map and ground electrical parameters, and models the atmospheric noise level in the transmission band.

  11. Health facilities humanisation: design guidelines supported by statistical evidence.

    PubMed

    Bosia, Daniela; Marino, Donatella; Peretti, Gabriella

    2016-01-01

    Healthcare building humanisation is currently a widely debated issue and the development of patient centered and evidence based design is growing worldwide. Many international health organizations and researchers understand the importance of Patient Centred Design and leading architects incorporate it into the design process. In Italy this design approach is still at an early stage. The article refers to research commissioned by the Italian Health Ministry and carried out by R. Del Nord (Università degli Studi di Firenze) and G. Peretti (Politecnico di Torino) with their collaborators. The scope of the research was the definition of design guidelines for healthcare facilities humanisation. The methodology framework adopted is the well established need and performance approach in architectural design. The article deals with the results of statistical investigations for the definition and ranking of users' needs and the consistent expression of their requirements. The investigations were carried out with the cooperation of psychologists of the Università degli Studi di Torino and researchers of the Università degli Studi di Cagliari. The proposed evaluation system allows ranking of health facilities according to the level of humanisation achieved. The statistical investigation evidence collected allowed the definition of humanisation design guidelines for health-care facilities and for the assessment of their specific level of humanisation.

  12. Prevalence Odds Ratio versus Prevalence Ratio: Choice Comes with Consequences

    PubMed Central

    Tamhane, Ashutosh R; Westfall, Andrew O; Burkholder, Greer A; Cutter, Gary R

    2016-01-01

    Odds ratio (OR), risk ratio (RR), and prevalence ratio (PR) are some of the measures of association often reported in research studies quantifying the relationship between an independent variable and the outcome of interest. There has been much debate about which measure is appropriate to report depending on the study design. However, the effect on statistical significance of selecting a particular category of the outcome to be modeled, and/or of changing the reference group for categorical independent variables, although known, is scarcely discussed or published with examples. In this article, we provide an example of a cross-sectional study wherein PR was chosen over the (prevalence) OR and demonstrate the analytic implications of the choice of category to be modeled and the choice of reference level for independent variables. PMID:27460748
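
    The distinction discussed above is easy to see numerically: when the outcome is common, the prevalence odds ratio lies further from 1 than the prevalence ratio. A minimal sketch with made-up counts follows.

```python
def prevalence_measures(exposed_cases, exposed_total,
                        unexposed_cases, unexposed_total):
    """Prevalence ratio (PR) and prevalence odds ratio (POR) from a 2x2 table."""
    p1 = exposed_cases / exposed_total      # prevalence among the exposed
    p0 = unexposed_cases / unexposed_total  # prevalence among the unexposed
    pr = p1 / p0
    por = (p1 / (1 - p1)) / (p0 / (1 - p0))
    return pr, por

# With a common outcome the two measures diverge noticeably:
pr, por = prevalence_measures(60, 100, 40, 100)   # PR = 1.5, POR = 2.25
```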

  13. Acidity in DMSO from the embedded cluster integral equation quantum solvation model.

    PubMed

    Heil, Jochen; Tomazic, Daniel; Egbers, Simon; Kast, Stefan M

    2014-04-01

    The embedded cluster reference interaction site model (EC-RISM) is applied to the prediction of acidity constants of organic molecules in dimethyl sulfoxide (DMSO) solution. EC-RISM is based on a self-consistent treatment of the solute's electronic structure and the solvent's structure by coupling quantum-chemical calculations with three-dimensional (3D) RISM integral equation theory. We compare available DMSO force fields with reference calculations obtained using the polarizable continuum model (PCM). The results are evaluated statistically using two different approaches to eliminating the proton contribution: a linear regression model and an analysis of pKa shifts for compound pairs. Suitable levels of theory for the integral equation methodology are benchmarked. The results are further analyzed and illustrated by visualizing solvent site distribution functions and comparing them with an aqueous environment.

  14. Adaptive interference cancel filter for evoked potential using high-order cumulants.

    PubMed

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2004-01-01

    This paper presents evoked potential (EP) processing using an adaptive interference cancel (AIC) filter with second- and higher-order cumulants. In the conventional ensemble-averaging method, experiments must be repeated many times to record the required data. Recently, the use of the AIC structure with second-order statistics for EP processing has proved more efficient than the traditional averaging method, but it is sensitive to both the reference-signal statistics and the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The method was tested on somatosensory EPs corrupted with EEG. A gradient-type algorithm is used in the AIC method. Comparisons of AIC filters based on second-, third-, and fourth-order statistics are also presented in this paper. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size and reference input.
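
    For context, a classic second-order-statistics adaptive interference canceller with a gradient (LMS) update is sketched below; the paper's contribution is to replace this second-order criterion with third- and fourth-order cumulant-based updates, which are not shown here. All signal names are placeholders.

```python
import numpy as np

def lms_canceller(primary, reference, n_taps=16, mu=0.01):
    """Classic LMS adaptive interference canceller: the reference (noise-only)
    channel is filtered to predict the interference in the primary channel,
    and the error output approximates the underlying evoked potential."""
    primary = np.asarray(primary, dtype=float)
    reference = np.asarray(reference, dtype=float)
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # most recent reference samples first
        y = w @ x                           # estimated interference
        e = primary[n] - y                  # cleaned output sample
        w += 2.0 * mu * e * x               # second-order (LMS) weight update
        cleaned[n] = e
    return cleaned
```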

  15. Wilcoxon's signed-rank statistic: what null hypothesis and why it matters.

    PubMed

    Li, Heng; Johnson, Terri

    2014-01-01

    In statistical literature, the term 'signed-rank test' (or 'Wilcoxon signed-rank test') has been used to refer to two distinct tests: a test for symmetry of distribution and a test for the median of a symmetric distribution, sharing a common test statistic. To avoid potential ambiguity, we propose to refer to those two tests by different names, as 'test for symmetry based on signed-rank statistic' and 'test for median based on signed-rank statistic', respectively. The utility of such terminological differentiation should become evident through our discussion of how those tests connect and contrast with sign test and one-sample t-test. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
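
    In practice, the shared signed-rank statistic discussed above is computed directly from the paired differences; whether the result is read as a test of symmetry about zero or as a test of the median depends on the assumptions one is willing to make. A minimal sketch using SciPy follows, with synthetic data.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
diffs = rng.normal(loc=0.3, scale=1.0, size=40)   # paired differences

# One statistic, two possible readings: symmetry about zero, or (assuming
# symmetry) a zero median of the differences.
stat, p = wilcoxon(diffs)
```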

  16. [Comparison of the effect of different diagnostic criteria of subclinical hypothyroidism and positive TPO-Ab on pregnancy outcomes].

    PubMed

    He, Yiping; He, Tongqiang; Wang, Yanxia; Xu, Zhao; Xu, Yehong; Wu, Yiqing; Ji, Jing; Mi, Yang

    2014-11-01

    To explore the effect of different diagnostic criteria for subclinical hypothyroidism, based on thyroid stimulating hormone (TSH) and positive thyroid peroxidase antibodies (TPO-Ab), on pregnancy outcomes, 3 244 pregnant women who had their antenatal care and delivered in the Child and Maternity Health Hospital of Shaanxi Province from August 2011 to February 2013 were recruited prospectively. According to the American Thyroid Association (ATA) standard, pregnant women with normal serum free thyroxine (FT4) and serum TSH > 2.50 mU/L were diagnosed with subclinical hypothyroidism in pregnancy (foreign standard group). According to the Guideline of Diagnosis and Therapy of Prenatal and Postpartum Thyroid Disease issued by the Chinese Society of Endocrinology and the Chinese Society of Perinatal Medicine in 2012, pregnant women with serum TSH > 5.76 mU/L and normal FT4 were diagnosed with subclinical hypothyroidism in pregnancy (national standard group). Pregnant women with subclinical hypothyroidism whose serum TSH levels were between 2.50 and 5.76 mU/L were referred to as the study observed group, and pregnant women with serum TSH < 2.50 mU/L and negative TPO-Ab were referred to as the control group. Positive TPO-Ab results and pregnancy outcomes were analyzed. (1) There were 635 cases in the foreign standard group (incidence 19.57%, 635/3 244) and 70 cases in the national standard group (incidence 2.16%, 70/3 244), a statistically significant difference between the two groups (P < 0.01). There were 565 cases in the study observed group (incidence 17.42%, 565/3 244), which differed significantly from the national standard group (P < 0.01) but not from the foreign standard group (P > 0.05). (2) Among the 3 244 cases, 402 had positive TPO-Ab. Of the TPO-Ab-positive women, 318 met the foreign standard for subclinical hypothyroidism (79.10%, 318/402), compared with 317 of the TPO-Ab-negative women (11.15%, 317/2 842); the difference was statistically significant (P < 0.01). For the national standard, 46 TPO-Ab-positive women were affected (11.44%, 46/402) versus 24 TPO-Ab-negative women (0.84%, 24/2 842); the difference was statistically significant (P < 0.01). For the study observed group, 272 TPO-Ab-positive women were affected (67.66%, 272/402) versus 293 TPO-Ab-negative women (10.31%, 293/2 842); the difference was statistically significant (P < 0.01). (3) The incidences of miscarriage, premature delivery, gestational hypertension disease, and gestational diabetes mellitus (GDM) in the foreign standard group each differed significantly from the control group (P < 0.05), while there was no statistically significant difference (P > 0.05) in the incidence of placental abruption or fetal distress. Likewise, the incidences of miscarriage, premature delivery, gestational hypertension disease, and GDM in the national standard group each differed significantly from the control group (P < 0.05), while there was no statistically significant difference (P > 0.05) in the incidence of placental abruption or fetal distress. In the study observed group, the incidences of miscarriage, gestational hypertension disease, and GDM each differed significantly from the control group (P < 0.05), but there was no statistically significant difference in the incidences of preterm labor, placental abruption, or fetal distress (P > 0.05). (4) In the national standard group, the incidences of miscarriage, premature delivery, gestational hypertension disease, GDM, placental abruption, and fetal distress in TPO-Ab-positive cases showed an increasing trend compared with TPO-Ab-negative cases, without reaching statistical significance (P > 0.05). In the study observed group, the incidences of gestational hypertension disease and GDM in TPO-Ab-positive cases differed significantly from TPO-Ab-negative cases (P < 0.05), while the incidences of miscarriage, premature birth, placental abruption, and fetal distress did not (P > 0.05). In the foreign standard group, the incidences of gestational hypertension disease and GDM in TPO-Ab-positive cases also differed significantly from TPO-Ab-negative cases (P < 0.05). In conclusion: (1) the incidence of subclinical hypothyroidism is rather high during early pregnancy and can lead to adverse pregnancy outcomes; (2) a positive TPO-Ab result has important predictive value for thyroid dysfunction and GDM; (3) comparatively, the ATA diagnostic standard (serum TSH > 2.50 mU/L) is safer for antenatal care, whereas the national standard (serum TSH > 5.76 mU/L) is not conducive to pregnancy management.

  17. Colombian reference growth curves for height, weight, body mass index and head circumference.

    PubMed

    Durán, Paola; Merker, Andrea; Briceño, Germán; Colón, Eugenia; Line, Dionne; Abad, Verónica; Del Toro, Kenny; Chahín, Silvia; Matallana, Audrey Mary; Lema, Adriana; Llano, Mauricio; Céspedes, Jaime; Hagenäs, Lars

    2016-03-01

    Published growth studies from Latin America are limited to growth references from Argentina and Venezuela. The aim of this study was to construct reference growth curves for height, weight, body mass index (BMI) and head circumference of Colombian children in a format that is useful both for following the growth of the individual child and as a tool for public health. Prospective measurements from 27 209 Colombian children from middle and upper socio-economic level families were processed using generalised additive models for location, scale and shape (GAMLSS). Descriptive statistics for length and height, weight, BMI and head circumference for age are given as raw and smoothed values. Final height was 172.3 cm for boys and 159.4 cm for girls. Weight at 18 years of age was 64.0 kg for boys and 54 kg for girls. Growth curves are presented in a ±3 SD format using logarithmic axes. The constructed reference growth curves are a start for following secular trends in Colombia and, in the presented layout, are also an optimal clinical tool for health care. ©2015 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  18. The geography of references in elite articles: Which countries contribute to the archives of knowledge?

    PubMed Central

    Wagner, Caroline

    2018-01-01

    This study asks the question on which national “shoulders” the world’s top-level research stands. Traditionally, the number of citations to national papers has been the evaluative measure of national scientific standing. We raise a different question: instead of analyzing the citations to a country’s articles (the forward view), we examine references to prior publications from specific countries cited in the most elite publications (the backward, citing, view). “Elite publications” are operationalized as the top-1% most-highly cited articles. Using the articles published from 2004 to 2013, we examine the research referenced in these works. Our results confirm the well-known fact that China has emerged to become a major player in science. However, China still ranks among the low contributors when countries are ranked by their contribution to the cited references in top-1% articles. Using this perspective, the results do not support a decreasing trend for the USA; in fact, the USA exceeds expectations (compared to its publication share) in terms of references in the top-1% articles. Switzerland, Sweden, and the Netherlands also appear at the top of the list. The results for Germany, however, are lower than statistically expected. PMID:29579088

  19. Selection and Validation of Reference Genes for Quantitative Real-Time Polymerase Chain Reaction Studies in Mossy Maze Polypore, Cerrena unicolor (Higher Basidiomycetes).

    PubMed

    Yang, Jie; Lin, Qi; Lin, Juan; Ye, Xiuyun

    2016-01-01

    With its ability to produce ligninolytic enzymes such as laccases, the white-rot basidiomycete Cerrena unicolor, a medicinal mushroom, has great potential in biotechnology. Elucidation of the expression profiles of genes encoding ligninolytic enzymes is important for increasing their production. Quantitative real-time polymerase chain reaction (qPCR) is a powerful tool for studying the transcriptional regulation of genes of interest. To ensure the accuracy and reliability of qPCR analysis of C. unicolor, the expression levels of seven candidate reference genes were studied at different growth phases, under various induction conditions, and with a range of carbon/nitrogen ratios and carbon and nitrogen sources. The stability of the genes was analyzed with five statistical approaches, namely geNorm, NormFinder, BestKeeper, the ΔCt method, and RefFinder. Our results indicated that the selection of reference genes varied with the sample sets. A combination of four reference genes (Cyt-c, ATP6, TEF1, and β-tubulin) was recommended for normalizing gene expression at different growth phases. GAPDH and Cyt-c were the appropriate reference genes under different induction conditions. ATP6 and TEF1 were most stable in fermentation media with various carbon/nitrogen ratios. In fermentation media with various carbon or nitrogen sources, 18S rRNA and GAPDH were the references of choice. The present study represents the first validation analysis of reference genes in C. unicolor and serves as a foundation for its qPCR analysis.

  20. [Features of the lipid exchange in workers employed in aluminium productions].

    PubMed

    Kudaeva, I V; Dyakovich, O A; Masnavieva, L B; Popkova, O V; Abramatets, E A

    Aluminum production belongs to the category of industries posing an increased health hazard to workers. During the technological process of aluminum production, the air of the working zone is polluted by a large number of harmful substances, and workers are exposed to a complex of toxicants with a polytropic impact on the body. The most significant consequences are disturbances of different types of metabolism, including lipid metabolism. The purpose of the study was to investigate the state of lipid metabolism in persons working in aluminum production. The study group comprised 108 male aluminum-production workers suffering from occupational pathology of the airways; the comparison group consisted of 103 apparently healthy men not exposed to the toxicants. The content of total cholesterol (TC), high- and low-density lipoprotein cholesterol (HDLC and LDLC), triglycerides (TG) and phospholipids (PL) was determined, and the atherogenic index (AI) was calculated. Statistical processing was performed with the «Statistica 6.0» software. Statistically significant differences in the indices of lipid exchange were established between the persons employed in aluminum production and the comparison group. AI values in the study group were higher than in the comparison group, owing to elevated levels of TC and LDLC; the TG and PL levels were also higher. In more than 50% of cases, the values of AI, TC and TG in aluminum-production workers exceeded the reference values. The average concentration of HDL cholesterol did not differ between the groups and was above the lower reference boundary. The established features of lipid metabolism in aluminum-production workers suggest that the mechanisms of developing proatherogenic disorders differ from those previously established for workers exposed to other chemicals. One of the causes of these disorders may be oxidative stress, which in turn is a response to the exposure of workers to the complex of toxic substances.

  1. A comparison of body image concern in candidates for rhinoplasty and therapeutic surgery.

    PubMed

    Hashemi, Seyed Amirhosein Ghazizadeh; Edalatnoor, Behnoosh; Edalatnoor, Behnaz; Niksun, Omid

    2017-09-01

    Body dysmorphic disorder among patients seeking cosmetic surgery is a disorder that, if not diagnosed by a physician, can cause irreparable harm to both the doctor and the patient. The aim of this study was to compare body image concern in candidates for rhinoplasty and therapeutic surgery. This was a cross-sectional study conducted on 212 patients referred to Loghman Hospital of Tehran for rhinoplasty or therapeutic surgery during the period from 2014 through 2016. For each person in the cosmetic surgery group, a person of the same sex and age in the therapeutic surgery group was matched, and the study was conducted on 60 subjects in the rhinoplasty group and 62 patients in the therapeutic surgery group. The Body Image Concern Inventory and demographic data forms were then completed by all patients, and the level of body image concern in the two groups was compared. Statistical analysis was conducted using SPSS 16, the Chi-square test and the paired-samples t-test. A p-value of less than 0.05 was considered statistically significant. In this study, 122 patients (49 males and 73 females) with a mean age of 27.1±7.3 years (range 18 to 55 years) were investigated. Sixty subjects were candidates for rhinoplasty and 62 subjects for therapeutic surgery. Candidates for rhinoplasty were mostly male (60%) and single (63.3%). Results of the t-test demonstrated that body image concern and body dysmorphic disorder were higher in the rhinoplasty group than in the therapeutic group (p<0.05). The results of this study showed that rhinoplasty candidates are more frequently single and male. In addition, body image concern was higher in rhinoplasty candidates compared to candidates for other surgeries. Careful visits and correct interviewing of people referred for rhinoplasty are therefore very important in order to measure their level of body image concern, diagnose any existing disorder and consider the required treatment.

  2. Body mass index in relation to serum prostate-specific antigen levels and prostate cancer risk.

    PubMed

    Bonn, Stephanie E; Sjölander, Arvid; Tillander, Annika; Wiklund, Fredrik; Grönberg, Henrik; Bälter, Katarina

    2016-07-01

    A high body mass index (BMI) has been directly associated with risk of aggressive or fatal prostate cancer. One possible explanation may be an effect of BMI on serum levels of prostate-specific antigen (PSA). To study the association between BMI and serum PSA as well as prostate cancer risk, a large cohort of men without prostate cancer at baseline was followed prospectively for prostate cancer diagnoses until 2015. Serum PSA and BMI were assessed among 15,827 men at baseline in 2010-2012. During follow-up, 735 men were diagnosed with prostate cancer, with 282 (38.4%) classified as high-grade cancers. Multivariable linear regression models and natural cubic linear regression splines were fitted for analyses of BMI and log-PSA. For risk analysis, Cox proportional hazards regression models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI), and natural cubic Cox regression splines producing standardized cancer-free probabilities were fitted. Results showed that baseline serum PSA decreased by 1.6% (95% CI: -2.1 to -1.1) with every one-unit increase in BMI. Statistically significant decreases of 3.7, 11.7 and 32.3% were seen for the increasing BMI categories of 25 < 30, 30 < 35 and ≥35 kg/m², respectively, compared to the reference (18.5 < 25 kg/m²). No statistically significant associations were seen between BMI and prostate cancer risk, although the results were indicative of a positive association with incidence rates of high-grade disease and an inverse association with incidence of low-grade disease. However, findings regarding risk are limited by the short follow-up time. In conclusion, BMI was inversely associated with PSA levels. BMI should be taken into consideration when referring men to a prostate biopsy based on serum PSA levels. © 2016 UICC.

  3. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
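
    To illustrate the kind of summary-statistic-based estimators reviewed above, the sketch below shows three commonly cited approximations: a quartile-based mean estimate, a crude range-based SD estimate, and an IQR-based SD estimate under normality. They are given for illustration only; the exact formulas evaluated in the review, and their sample-size-dependent refinements, may differ.

```python
def mean_from_quartiles(q1, median, q3):
    """Approximate the mean from the median and quartiles,
    as in the commonly cited (q1 + median + q3) / 3 formula."""
    return (q1 + median + q3) / 3.0

def sd_from_range(minimum, maximum):
    """Crude range-based SD approximation (range / 4), often used when only
    the minimum and maximum are reported; accuracy depends on sample size."""
    return (maximum - minimum) / 4.0

def sd_from_iqr(q1, q3):
    """Approximate the SD from the interquartile range, assuming normality
    (the IQR of a normal distribution is about 1.35 standard deviations)."""
    return (q3 - q1) / 1.35
```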

  4. The relationship between particulate pollution levels in Australian cities, meteorology, and landscape fire activity detected from MODIS hotspots.

    PubMed

    Price, Owen F; Williamson, Grant J; Henderson, Sarah B; Johnston, Fay; Bowman, David M J S

    2012-01-01

    Smoke from bushfires is an emerging issue for fire managers because of increasing evidence for its public health effects. Development of forecasting models to predict future pollution levels based on the relationship between bushfire activity and current pollution levels would be a useful management tool. As a first step, we use daily thermal anomalies detected by the MODIS Active Fire Product (referred to as "hotspots"), pollution concentrations, and meteorological data for the years 2002 to 2008, to examine the statistical relationship between fire activity in the landscapes and pollution levels around Perth and Sydney, two large Australian cities. Resultant models were statistically significant, but differed in their goodness of fit and the distance at which the strength of the relationship was strongest. For Sydney, a univariate model for hotspot activity within 100 km explained 24% of variation in pollution levels, and the best model including atmospheric variables explained 56% of variation. For Perth, the best radius was 400 km, explaining only 7% of variation, while the model including atmospheric variables explained 31% of the variation. Pollution was higher when the atmosphere was more stable and in the presence of on-shore winds, whereas there was no effect of wind blowing from the fires toward the pollution monitors. Our analysis shows there is a good prospect for developing region-specific forecasting tools combining hotspot fire activity with meteorological data.
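    As a rough illustration of the modelling strategy described above (a univariate hotspot model versus a model that adds atmospheric covariates, compared by variance explained), the following Python sketch fits ordinary least squares to simulated daily data; all variable names and coefficients are hypothetical, not the study's data.

```python
import numpy as np

def r_squared(y, X):
    """Fit ordinary least squares (with intercept) and return R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

# Hypothetical daily data: pollution level, hotspot count within a radius,
# an atmospheric stability index, and an on-shore wind indicator.
rng = np.random.default_rng(0)
n = 365
hotspots = rng.poisson(5, n).astype(float)
stability = rng.normal(0, 1, n)
onshore = rng.integers(0, 2, n).astype(float)
pollution = 2.0 + 0.3 * hotspots + 1.5 * stability + 0.8 * onshore + rng.normal(0, 2, n)

print("univariate R^2 (hotspots only):",
      round(r_squared(pollution, hotspots[:, None]), 2))
print("full model R^2 (+ meteorology):",
      round(r_squared(pollution, np.column_stack([hotspots, stability, onshore])), 2))
```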

  5. Generation of dense statistical connectomes from sparse morphological data

    PubMed Central

    Egger, Robert; Dercksen, Vincent J.; Udvary, Daniel; Hege, Hans-Christian; Oberlaender, Marcel

    2014-01-01

    Sensory-evoked signal flow, at cellular and network levels, is primarily determined by the synaptic wiring of the underlying neuronal circuitry. Measurements of synaptic innervation, connection probabilities and subcellular organization of synaptic inputs are thus among the most active fields of research in contemporary neuroscience. Methods to measure these quantities range from electrophysiological recordings, through reconstructions of dendrite-axon overlap at light-microscopic levels, to dense circuit reconstructions of small volumes at electron-microscopic resolution. However, quantitative and complete measurements at subcellular resolution and mesoscopic scales to obtain all local and long-range synaptic inputs and outputs for any neuron within an entire brain region are beyond present methodological limits. Here, we present a novel concept, implemented within an interactive software environment called NeuroNet, which allows (i) integration of sparsely sampled (sub)cellular morphological data into an accurate anatomical reference frame of the brain region(s) of interest, (ii) up-scaling to generate an average dense model of the neuronal circuitry within the respective brain region(s) and (iii) statistical measurements of synaptic innervation between all neurons within the model. We illustrate our approach by generating a dense average model of the entire rat vibrissal cortex, providing the required anatomical data, and illustrate how to measure synaptic innervation statistically. Comparing our results with data from paired recordings in vitro and in vivo, as well as with reconstructions of synaptic contact sites at light- and electron-microscopic levels, we find that our in silico measurements are in line with previous results. PMID:25426033

  6. Patron Preference in Reference Service Points.

    ERIC Educational Resources Information Center

    Morgan, Linda

    1980-01-01

    Behavior of patrons choosing between a person sitting at a counter and one sitting at a desk at each of two reference points was observed at the reference department during remodeling at the M. D. Anderson Library of the University of Houston. Results showed a statistically relevant preference for the counter. (Author/JD)

  7. Economics: A Guide to Reference Sources.

    ERIC Educational Resources Information Center

    Mason, Mary, Comp.

    Approximately 84 reference materials on economics located in the McLennan Library, McGill University (Montreal), are cited in this annotated bibliography. The bibliography serves to provide an overview of the printed bibliographic and reference sources useful for the study of economics. Financial and business sources and statistical compendia and…

  8. Reference Values for the Pediatric Quality of Life Inventory and the Multidimensional Fatigue Scale in Adolescent Athletes by Sport and Sex.

    PubMed

    Snyder Valier, Alison R; Welch Bacon, Cailee E; Bay, R Curtis; Molzen, Eileen; Lam, Kenneth C; Valovich McLeod, Tamara C

    2017-10-01

    Effective use of patient-rated outcome measures to facilitate optimal patient care requires an understanding of the reference values of these measures within the population of interest. Little is known about reference values for commonly used patient-rated outcome measures in adolescent athletes. To determine reference values for the Pediatric Quality of Life Inventory (PedsQL) and the Multidimensional Fatigue Scale (MFS) in adolescent athletes by sport and sex. Cross-sectional study; Level of evidence, 3. A convenience sample of interscholastic adolescent athletes from 9 sports was used. Participants completed the PedsQL and MFS during one testing session at the start of their sport season. Data were stratified by sport and sex. Dependent variables included the total PedsQL score and the 5 PedsQL subscale scores: physical functioning, psychosocial functioning, emotional functioning, social functioning, and school functioning. Dependent variables for the MFS included 3 subscale scores: general functioning, sleep functioning, and cognitive functioning. Summary statistics were reported for total and subscale scores by sport and sex. Among 3574 male and 1329 female adolescent athletes, the PedsQL scores (100 possible points) generally indicated high levels of health regardless of sport played. Mean PedsQL total and subscale scores ranged from 82.6 to 95.7 for males and 83.9 to 95.2 for females. Mean MFS subscale scores (100 possible points) ranged from 74.2 to 90.9 for males and 72.8 to 87.4 for females. Healthy male and female adolescent athletes reported relatively high levels of health on the PedsQL subscales and total scores regardless of sport; no mean scores were lower than 82.6 points for males or 83.9 points for females. On the MFS, males and females tended to report low effect of general and cognitive fatigue regardless of sport; mean scores were higher than 83.5 points for males and 83.8 points for females. Clinically, athletes who score below the reference values for their sport have poorer health status than average adolescent athletes participating in that sport. Scores below reference values may warrant consideration of early intervention or treatment.

  9. Atlas(®) Listeria monocytogenes LmG2 Detection Assay Using Transcription Mediated Amplification to Detect Listeria monocytogenes in Selected Foods and Stainless Steel Surface.

    PubMed

    Bres, Vanessa; Yang, Hua; Hsu, Ernie; Ren, Yan; Cheng, Ying; Wisniewski, Michele; Hanhan, Maesa; Zaslavsky, Polina; Noll, Nathan; Weaver, Brett; Campbell, Paul; Reshatoff, Michael; Becker, Michael

    2014-01-01

    The Atlas Listeria monocytogenes LmG2 Detection Assay, developed by Roka Bioscience Inc., was compared to a reference culture method for seven food types (hot dogs, cured ham, deli turkey, chicken salad, vanilla ice cream, frozen chocolate cream pie, and frozen cheese pizza) and one surface (stainless steel, grade 316). A 125 g portion of deli turkey was tested using a 1:4 food:media dilution ratio, and a 25 g portion for all other foods was tested using a 1:9 food:media dilution ratio. The enrichment time and media for Roka's method were 24 to 28 h for 25 g food samples and environmental surfaces, and 44 to 48 h for 125 g samples, at 35 ± 2°C in PALCAM broth containing 0.02 g/L nalidixic acid. Comparison of the Atlas Listeria monocytogenes LmG2 Detection Assay to the reference method required an unpaired approach. For each matrix, 20 samples inoculated at a fractional level and five samples inoculated at a high level with a different strain of Listeria monocytogenes were tested by each method. The Atlas Listeria monocytogenes LmG2 Detection Assay was compared to the Official Methods of Analysis of AOAC INTERNATIONAL 993.12 method for dairy products, the U.S. Department of Agriculture, Food Safety and Inspection Service, Microbiology Laboratory Guidebook 8.08 method for ready-to-eat meat and environmental samples, and the U.S. Food and Drug Administration Bacteriological Analytical Manual, Chapter 10 method for frozen foods. In the method developer studies, Roka's method, at 24 h (or 44 h for 125 g food samples), had 126 positives out of 200 total inoculated samples, compared to 102 positives for the reference methods at 48 h. In the independent laboratory studies, vanilla ice cream, deli turkey and stainless steel grade 316 were evaluated. Roka's method, at 24 h (or 44 h for 125 g food samples), had 64 positives out of 75 total inoculated samples compared to 54 positives for the reference methods at 48 h. The Atlas Listeria monocytogenes LmG2 Detection Assay detected all 50 L. monocytogenes strains that encompassed 13 serotypes across the various lineages and none of the 30 exclusivity organisms, including seven other Listeria species. The product consistency and kit stability studies revealed no statistical differences between the three lots tested or over the term of the shelf life. Finally, the robustness study demonstrated no statistical differences when samples were incubated at 33 ± 2°C or 37 ± 2°C, when enrichment aliquots were 1.3 mL or 1.7 mL, or when the samples were analyzed the same day or five days later. Overall, the Atlas Listeria monocytogenes LmG2 Detection Assay is statistically equivalent to or better than the reference methods and is robust to the tested variations.

  10. Educational interactive multimedia software: The impact of interactivity on learning

    NASA Astrophysics Data System (ADS)

    Reamon, Derek Trent

    This dissertation discusses the design, development, deployment and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates about the fundamentals of direct-current (DC) motor physics and selection. The two versions of Motor Workshop software cover the same basic materials on motors, but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text, and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish---they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software, which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer is actively participating in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group,' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs and videotapes, were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity affects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective in promoting learning about the subject of motors. The focus of learning varied between users of the two versions, however. The low-level version was more effective for teaching concepts and terminology, while the high-level version seemed to be more effective for teaching engineering applications.

  11. Trends in reference usage statistics in an academic health sciences library.

    PubMed

    De Groote, Sandra L; Hitchcock, Kristin; McGowan, Richard

    2007-01-01

    To examine reference questions asked through traditional means at an academic health sciences library and place this data within the context of larger trends in reference services. Detailed data on the types of reference questions asked were collected during two one-month periods in 2003 and 2004. General statistics documenting broad categories of questions were compiled over a fifteen-year period. Administrative data show a steady increase in questions from 1990 to 1997/98 (23,848 to 48,037), followed by a decline through 2004/05 (to 10,031). The distribution of reference questions asked over the years has changed, including a reduction in mediated searches (2,157 in 1990/91 to 18 in 2004/05), an increase in instruction (1,284 in 1993/94 to 1,897 in 2004/05) and an increase in digital reference interactions (0 in 1999/2000 to 581 in 2004/05). The most commonly asked questions at the current reference desk are about journal holdings (19%), book holdings (12%), and directional issues (12%). This study provides a unique snapshot of reference services in the contemporary library, where both online and offline services are commonplace. Changes in questions have impacted the way the library provides services, but traditional reference remains the core of information services in this health sciences library.

  12. Ultralow-dose CT of the craniofacial bone for navigated surgery using adaptive statistical iterative reconstruction and model-based iterative reconstruction: 2D and 3D image quality.

    PubMed

    Widmann, Gerlig; Schullian, Peter; Gassner, Eva-Maria; Hoermann, Romed; Bale, Reto; Puelacher, Wolfgang

    2015-03-01

    OBJECTIVE. The purpose of this article is to evaluate 2D and 3D image quality of high-resolution ultralow-dose CT images of the craniofacial bone for navigated surgery using adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR) in comparison with standard filtered backprojection (FBP). MATERIALS AND METHODS. A formalin-fixed human cadaver head was scanned using a clinical reference protocol at a CT dose index volume of 30.48 mGy and a series of five ultralow-dose protocols at 3.48, 2.19, 0.82, 0.44, and 0.22 mGy using FBP and ASIR at 50% (ASIR-50), ASIR at 100% (ASIR-100), and MBIR. Blinded 2D axial and 3D volume-rendered images were compared with each other by three readers using top-down scoring. Scores were analyzed per protocol or dose and reconstruction. All images were compared with the FBP reference at 30.48 mGy. A nonparametric Mann-Whitney U test was used. Statistical significance was set at p < 0.05. RESULTS. For 2D images, the FBP reference at 30.48 mGy did not statistically significantly differ from ASIR-100 at 3.48 mGy, ASIR-100 at 2.19 mGy, and MBIR at 0.82 mGy. MBIR at 2.19 and 3.48 mGy scored statistically significantly better than the FBP reference (p = 0.032 and 0.001, respectively). For 3D images, the FBP reference at 30.48 mGy did not statistically significantly differ from all reconstructions at 3.48 mGy; FBP and ASIR-100 at 2.19 mGy; FBP, ASIR-100, and MBIR at 0.82 mGy; MBIR at 0.44 mGy; and MBIR at 0.22 mGy. CONCLUSION. MBIR (2D and 3D) and ASIR-100 (2D) may significantly improve subjective image quality of ultralow-dose images and may allow more than 90% dose reductions.
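    The comparison of blinded reader scores described above rests on the nonparametric Mann-Whitney U test. A minimal Python sketch with hypothetical score vectors (not the study's data):

```python
from scipy.stats import mannwhitneyu

# Hypothetical blinded reader scores (higher = better) for the FBP reference
# protocol and for an ultralow-dose MBIR reconstruction.
fbp_reference = [4, 4, 5, 3, 4, 4, 5, 4, 3, 4]
mbir_low_dose = [5, 4, 5, 5, 4, 5, 5, 4, 5, 5]

stat, p = mannwhitneyu(fbp_reference, mbir_low_dose, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")  # significance judged at p < 0.05, as in the study
```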

  13. The effect of iconicity of visual displays on statistical reasoning: evidence in favor of the null hypothesis.

    PubMed

    Sirota, Miroslav; Kostovičová, Lenka; Juanchich, Marie

    2014-08-01

    Knowing which properties of visual displays facilitate statistical reasoning bears practical and theoretical implications. Therefore, we studied the effect of one property of visual displays - iconicity (i.e., the resemblance of a visual sign to its referent) - on Bayesian reasoning. Two main accounts of statistical reasoning predict different effects of iconicity on Bayesian reasoning. The ecological-rationality account predicts a positive iconicity effect, because more highly iconic signs resemble more individuated objects, which tap better into an evolutionarily designed frequency-coding mechanism that, in turn, facilitates Bayesian reasoning. The nested-sets account predicts a null iconicity effect, because iconicity does not affect the salience of a nested-sets structure - the factor facilitating Bayesian reasoning, which is processed by a general reasoning mechanism. In two well-powered experiments (N = 577), we found no support for a positive iconicity effect across different iconicity levels that were manipulated in different visual displays (meta-analytical overall effect: log OR = -0.13, 95% CI [-0.53, 0.28]). A Bayes factor analysis provided strong evidence in favor of the null hypothesis - the null iconicity effect. Thus, these findings corroborate the nested-sets rather than the ecological-rationality account of statistical reasoning.
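    The reported overall effect is a pooled log odds ratio with a 95% CI. The Python sketch below shows one standard way such a pooled estimate can be formed (fixed-effect, inverse-variance weighting) from per-experiment 2 × 2 counts; the counts are hypothetical and the authors' exact meta-analytic model may differ.

```python
import numpy as np

def log_or_with_se(a, b, c, d):
    """Log odds ratio and its standard error from a 2x2 table
    (a = correct/high iconicity, b = incorrect/high, c = correct/low, d = incorrect/low)."""
    log_or = np.log((a * d) / (b * c))
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)
    return log_or, se

# Hypothetical per-experiment counts of correct vs. incorrect Bayesian answers.
tables = [(40, 60, 42, 58), (35, 65, 38, 62)]
logs, ses = zip(*(log_or_with_se(*t) for t in tables))

# Fixed-effect inverse-variance pooling.
w = 1 / np.array(ses) ** 2
pooled = np.sum(w * np.array(logs)) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled log OR = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```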

  14. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  15. IFLA General Conference, 1986. Management and Technology Division. Section: Statistics. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations and Institutions, The Hague (Netherlands).

    Papers on statistics which were presented at the 1986 International Federation of Library Associations (IFLA) conference include: (1) "Library Data Collection in Brazil" (Nice Menezes de Figueiredo, Brazil); (2) "Fact-Finding on Statistics and Reference Tools in Japan" (Yuriko Sugimoto, Chihomi Oka, Ikuko Mayumi, and Keiko Kurata,…

  16. 76 FR 26549 - Removal of Certain References to Credit Ratings Under the Securities Exchange Act of 1934

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... recognized statistical rating organization'' (``NRSRO'') as part of the Commission's amendments to its broker... rating agency'' and ``nationally recognized statistical rating organization'' in Exchange Act Sections 3... ``nationally recognized statistical rating organization'' means a credit rating agency that: (A) issues credit...

  17. Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.

    ERIC Educational Resources Information Center

    Fryman, James F.; Wilkinson, Patrick J.

    Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States," and three complementary publications: "County and City Data Book,""State and Metropolitan Area Data Book," and…

  18. Variance approximations for assessments of classification accuracy

    Treesearch

    R. L. Czaplewski

    1994-01-01

    Variance approximations are derived for the weighted and unweighted kappa statistics, the conditional kappa statistic, and conditional probabilities. These statistics are useful to assess classification accuracy, such as accuracy of remotely sensed classifications in thematic maps when compared to a sample of reference classifications made in the field. Published...

  19. Mobile Digest of Education Statistics, 2013. NCES 2014-086

    ERIC Educational Resources Information Center

    Snyder, Thomas D.

    2014-01-01

    This is the first edition of the "Mobile Digest of Education Statistics." This compact compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mobile Digest" is designed as an easy mobile reference for materials found in detail in the…

  20. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  1. Can intermuscular cleavage planes provide proper transverse screw angle? Comparison of two paraspinal approaches.

    PubMed

    Cheng, Xiaofei; Ni, Bin; Liu, Qi; Chen, Jinshui; Guan, Huapeng

    2013-01-01

    The goal of this study was to determine which paraspinal approach provided a better transverse screw angle (TSA) for each vertebral level in lower lumbar surgery. Axial computed tomography (CT) images of 100 patients, from L3 to S1, were used to measure the angulation parameters, including transverse pedicle angle (TPA) and transverse cleavage plane angle (TCPA) of entry from the two approaches. The difference between TCPA and TPA, defined as the difference angle (DA), was calculated. Statistical differences in DA between the two approaches, differences in the angulation parameters between sexes, and the correlations between each angulation parameter and age or body mass index (BMI) were analyzed. TPA ranged from about 16° at L3 to 30° at S1. TCPA through the Wiltse's and Weaver's approaches ranged from about -10° and 25° at L3 to 12° and 32° at S1, respectively. The absolute values of DA through the Weaver's approach were significantly lower than those through the Wiltse's approach at each level. The angulation parameters showed no significant difference with sex and no significant correlation with age or BMI. In the lower lumbar vertebrae (L3-L5) and S1, pedicle screw placement through the Weaver's approach may more easily yield the preferred TSA consistent with TPA than that through the Wiltse's approach. The reference values obtained in this paper may be applied regardless of sex, age, or BMI, and the descriptive statistical results may be used as references for applying the two paraspinal approaches.

  2. Evaluation of mericon E. coli O157 Screen Plus and mericon E. coli STEC O-Type Pathogen Detection Assays in Select Foods: Collaborative Study, First Action 2017.05.

    PubMed

    Bird, Patrick; Benzinger, M Joseph; Bastin, Benjamin; Crowley, Erin; Agin, James; Goins, David; Armstrong, Marcia

    2018-05-01

    QIAGEN mericon Escherichia coli O157 Screen Plus and mericon E. coli Shiga toxin-producing E. coli (STEC) O-Type Pathogen Detection Assays use Real-Time PCR technology for the rapid, accurate detection of E. coli O157 and the "big six" (O26, O45, O103, O111, O121, O145) (non-O157 STEC) in select food types. Using a paired study design, the assays were compared with the U.S. Department of Agriculture, Food Safety Inspection Service Microbiology Laboratory Guidebook Chapter 5.09 reference method for the detection of E. coli O157:H7 in raw ground beef. Both mericon assays were evaluated using the manual and an automated DNA extraction method. Thirteen technicians from five laboratories located within the continental United States participated in the collaborative study. Three levels of contamination were evaluated. Statistical analysis was conducted according to the probability of detection (POD) statistical model. Results obtained for the low-inoculum level test portions produced a difference between laboratories POD (dLPOD) value with a 95% confidence interval of 0.00 (-0.12, 0.12) for the mericon E. coli O157 Screen Plus with manual and automated extraction and mericon E. coli STEC O-Type with manual extraction and -0.01 (-0.13, 0.10) for the mericon E. coli STEC O-Type with automated extraction. The dLPOD results indicate equivalence between the candidate methods and the reference method.
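    The probability of detection (POD) analysis referred to above compares, for each method, the fraction of inoculated test portions detected, and judges equivalence by whether the confidence interval on the difference in POD covers zero. The Python sketch below illustrates the idea with a simple Wald-style interval and hypothetical counts; the official AOAC POD procedure uses its own interval construction.

```python
import math

def pod(detections, total):
    """Probability of detection: fraction of inoculated test portions found positive."""
    return detections / total

def dpod_wald_ci(x1, n1, x2, n2, z=1.96):
    """Difference in POD between two methods with a simple Wald-style 95% CI.

    Simplified illustration only; the AOAC POD model defines its own interval.
    """
    p1, p2 = pod(x1, n1), pod(x2, n2)
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Hypothetical fractional-level results: candidate method vs. reference method.
d, ci = dpod_wald_ci(x1=34, n1=60, x2=36, n2=60)
print(f"dPOD = {d:.2f}, approx. 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# An interval that includes 0 is read as no detectable difference between methods.
```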

  3. Classifying northern forests using Thematic Mapper Simulator data

    NASA Technical Reports Server (NTRS)

    Nelson, R. F.; Latty, R. S.; Mott, G.

    1984-01-01

    Thematic Mapper Simulator data were collected over a 23,200 hectare forested area near Baxter State Park in north-central Maine. Photointerpreted ground reference information was used to drive a stratified random sampling procedure for waveband discriminant analyses and to generate training statistics and test pixel accuracies. Stepwise discriminant analyses indicated that the following bands best differentiated the thirteen level II-III cover types (in order of entry): near infrared (0.77 to 0.90 micron), blue (0.46 to 0.52 micron), first middle infrared (1.53 to 1.73 microns), second middle infrared (2.06 to 2.33 microns), red (0.63 to 0.69 micron), thermal (10.32 to 12.33 microns). Classification accuracies peaked at 58 percent for thirteen level II-III land-cover classes and at 65 percent for ten level II classes.

  4. Lead exposure and eclampsia in Britain, 1883-1934.

    PubMed

    Troesken, Werner

    2006-07-01

    Eclampsia refers to a coma or seizure activity in a pregnant woman with no prior history of such activity. This paper presents a mix of historical and epidemiological evidence consistent with the hypothesis that chronic lead exposure is a predisposing factor for eclampsia. The historical evidence is based on research conducted by British physicians around 1900 showing that the geographic variation in eclampsia across England and Wales was correlated with lead levels in local drinking water supplies. A formal epidemiological analysis based on a data set of English and Welsh counties observed in 1883 corroborates the evidence presented by historical observers. In particular, the statistical results show that the death rate from eclampsia in counties with high-water-lead levels exceeded the death rate in counties with low-water-lead levels by a factor of 2.34 (95% CI: 1.54-3.14).

  5. Development of a Multidisciplinary Program to Expedite Care of Esophageal Emergencies.

    PubMed

    Ceppa, DuyKhanh P; Rosati, Carlo Maria; Chabtini, Lola; Stokes, Samantha M; Cook, Holly C; Rieger, Karen M; Birdas, Thomas J; Lappas, John C; Kessler, William R; DeWitt, John M; Maglinte, Dean D; Kesler, Kenneth A

    2017-09-01

    Level 1 programs have improved outcomes by expediting the multidisciplinary care of critically ill patients. We established a novel level 1 program for the management of esophageal emergencies. After institutional review board approval, we performed a retrospective analysis of patients referred to our level 1 esophageal emergency program from April 2013 through November 2015. A historical comparison group of patients treated for the same diagnosis in the previous 2 years was used. Eighty patients were referred and transported an average distance of 56 miles (range, 1-163 miles). Median time from referral to arrival was 2.4 hours (range, 0.4-12.9 hours). Referrals included 6 (7%) patients with esophageal obstruction and 71 (89%) patients with suspected esophageal perforation. Of the patients with suspected esophageal perforation, causes included iatrogenic (n = 26), Boerhaave's syndrome (n = 32), and other (n = 13). Forty-six percent (n = 33) of patients were referred because of pneumomediastinum, but perforation could not be subsequently demonstrated. Initial management of patients with documented esophageal perforation included operative treatment (n = 25), endoscopic intervention (n = 8), and supportive care (n = 5). Retrospective analysis demonstrated a statistically significant difference in mean Pittsburgh severity index score (PSS) between esophageal perforation treatment groups (p < 0.01). In patients with confirmed perforations, there were 3 (8%) mortalities within 30 days. More patients in the esophageal level 1 program were transferred to our institution in less than 24 hours after diagnosis than in the historical comparison group (p < 0.01). Development of an esophageal emergency referral program has facilitated multidisciplinary care at a high-volume institution, and early outcomes appear favorable. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Models of Reference Services in Australian Academic Libraries

    ERIC Educational Resources Information Center

    Burke, Liz

    2008-01-01

    This article reports on a project which was undertaken in 2006 to investigate the current modes and methods for delivering reference services in Australian academic libraries. The project included a literature review to assist in providing a definition of reference services as well as a snapshot of statistics showing staff and patron numbers from…

  7. Inventory of DOT Statistical Information Systems

    DOT National Transportation Integrated Search

    1983-01-01

    The inventory represents an update of relevant systems described in the Transportation Statistical Reference File (TSRF), coordinated with the GAO update of Congressional Sources and Systems, and the Information Collection Budget. The inventory compi...

  8. Improved Reference Sampling and Subtraction: A Technique for Reducing the Read Noise of Near-infrared Detector Systems

    NASA Astrophysics Data System (ADS)

    Rauscher, Bernard J.; Arendt, Richard G.; Fixsen, D. J.; Greenhouse, Matthew A.; Lander, Matthew; Lindler, Don; Loose, Markus; Moseley, S. H.; Mott, D. Brent; Wen, Yiting; Wilson, Donna V.; Xenophontos, Christos

    2017-10-01

    Near-infrared array detectors, like the James Webb Space Telescope (JWST) NIRSpec’s Teledyne H2RGs, often provide reference pixels and a reference output. These are used to remove correlated noise. Improved reference sampling and subtraction (IRS2) is a statistical technique for using this reference information optimally in a least-squares sense. Compared with the traditional H2RG readout, IRS2 uses a different clocking pattern to interleave many more reference pixels into the data than is otherwise possible. Compared with standard reference correction techniques, IRS2 subtracts the reference pixels and reference output using a statistically optimized set of frequency-dependent weights. The benefits include somewhat lower noise variance and much less obvious correlated noise. NIRSpec’s IRS2 images are cosmetically clean, with less 1/f banding than in traditional data from the same system. This article describes the IRS2 clocking pattern and presents the equations needed to use IRS2 in systems other than NIRSpec. For NIRSpec, applying these equations is already an option in the calibration pipeline. As an aid to instrument builders, we provide our prototype IRS2 calibration software and sample JWST NIRSpec data. The same techniques are applicable to other detector systems, including those based on Teledyne’s H4RG arrays. The H4RG’s interleaved reference pixel readout mode is effectively one IRS2 pattern.

  9. Serum IGF-I and IGFBP-3 levels of Turkish children during childhood and adolescence: establishment of reference ranges with emphasis on puberty.

    PubMed

    Bereket, Abdullah; Turan, Serap; Omar, Anjumanara; Berber, Mustafa; Ozen, Ahmet; Akbenlioglu, Cengiz; Haklar, Goncagul

    2006-01-01

    We established age- and sex-related reference ranges for serum insulin-like growth factor-I (IGF-I) and insulin-like growth factor binding protein-3 (IGFBP-3) levels in 807 healthy Turkish children (428 boys, 379 girls), and constructed a model for calculation of standard deviation scores of IGF-I and IGFBP-3 according to age, sex and pubertal stage. Serum IGF-I and IGFBP-3 concentrations tended to be higher in girls compared to boys of the same ages, but the differences were statistically significant only in pubertal ages (9-14 years) for IGF-I and only in prepubertal ages for IGFBP-3 (6-8 years) (p < 0.05). Peak IGF-I concentrations were observed earlier in girls than boys (14 vs. 15 years, Tanner stage IV vs. V) starting to decline thereafter. IGFBP-3 levels peaked at age 13 and at Tanner stage IV in both sexes with a subsequent fall. Serum levels of IGF-I and IGFBP-3 increased steadily with age in the prepubertal stage followed by a rapid increase in IGF-I in the early pubertal stages. A relatively steeper increase in IGF-I but not in IGFBP-3 levels was observed at age 10-11 years in girls and at 12-13 years in boys which preceded the reported age of pubertal growth spurt. At late pubertal stages, both IGF-I and IGFBP-3 either did not change or decreased by increasing age. Interrelationships between growth factors and anthropometric measurements have been described, and the physiologic consequences of these have been discussed in detail. Differences in the pattern of IGF-I and IGFBP-3 in the present paper and those reported in other studies emphasize the importance of locally established reference ranges. Establishment of this reference data and a standard deviation score prediction model based on age, sex and puberty will enhance the diagnostic power and utility of IGF-I and IGFBP-3 in evaluating growth disorders in our population. Copyright 2006 S. Karger AG, Basel
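    A standard deviation score of the kind described above is simply the measured value expressed relative to the age- and sex-specific reference mean and SD. A minimal Python sketch with hypothetical reference values (skewed analytes would instead call for an LMS-type transformation):

```python
def sds(value, ref_mean, ref_sd):
    """Standard deviation score (z-score) relative to an age/sex reference group."""
    return (value - ref_mean) / ref_sd

# Hypothetical reference values for IGF-I (ng/mL) in 10-year-old girls.
ref_mean, ref_sd = 250.0, 80.0
measured = 140.0
print(f"IGF-I SDS = {sds(measured, ref_mean, ref_sd):.2f}")  # about -1.4
```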

  10. Assessment of wadeable stream resources in the driftless area ecoregion in Western Wisconsin using a probabilistic sampling design.

    PubMed

    Miller, Michael A; Colby, Alison C C; Kanehl, Paul D; Blocksom, Karen

    2009-03-01

    The Wisconsin Department of Natural Resources (WDNR), with support from the U.S. EPA, conducted an assessment of wadeable streams in the Driftless Area ecoregion in western Wisconsin using a probabilistic sampling design. This ecoregion encompasses 20% of Wisconsin's land area and contains 8,800 miles of perennial streams. Randomly-selected stream sites (n = 60) equally distributed among stream orders 1-4 were sampled. Watershed land use, riparian and in-stream habitat, water chemistry, macroinvertebrate, and fish assemblage data were collected at each true random site and an associated "modified-random" site on each stream that was accessed via a road crossing nearest to the true random site. Targeted least-disturbed reference sites (n = 22) were also sampled to develop reference conditions for various physical, chemical, and biological measures. Cumulative distribution function plots of various measures collected at the true random sites evaluated with reference condition thresholds, indicate that high proportions of the random sites (and by inference the entire Driftless Area wadeable stream population) show some level of degradation. Study results show no statistically significant differences between the true random and modified-random sample sites for any of the nine physical habitat, 11 water chemistry, seven macroinvertebrate, or eight fish metrics analyzed. In Wisconsin's Driftless Area, 79% of wadeable stream lengths were accessible via road crossings. While further evaluation of the statistical rigor of using a modified-random sampling design is warranted, sampling randomly-selected stream sites accessed via the nearest road crossing may provide a more economical way to apply probabilistic sampling in stream monitoring programs.

  11. Systematic versus random sampling in stereological studies.

    PubMed

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
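    The card-sampling analogy above maps directly onto two small routines: independent random sampling draws items without regard to position, whereas systematic sampling takes a random start and then every n-th item. A minimal Python sketch:

```python
import random

def independent_random_sample(items, k, rng=random):
    """Pick k items without regard to position (the 'any card from the deck' scheme)."""
    return rng.sample(items, k)

def systematic_sample(items, k, rng=random):
    """Pick every n-th item after a random start within the first interval."""
    interval = len(items) // k
    start = rng.randrange(interval)
    return items[start::interval][:k]

sections = list(range(1, 121))          # e.g., 120 serial sections through a structure
print(independent_random_sample(sections, 10))
print(systematic_sample(sections, 10))  # evenly spaced, random start: typically more efficient
```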

  12. Prevalence odds ratio versus prevalence ratio: choice comes with consequences.

    PubMed

    Tamhane, Ashutosh R; Westfall, Andrew O; Burkholder, Greer A; Cutter, Gary R

    2016-12-30

    Odds ratio, risk ratio, and prevalence ratio are some of the measures of association which are often reported in research studies quantifying the relationship between an independent variable and the outcome of interest. There has been much debate on the issue of which measure is appropriate to report depending on the study design. However, the effect of selecting a particular category of the outcome to be modeled, and of the choice of reference group for categorical independent variables, on statistical significance, although known, is rarely discussed or illustrated with examples in the literature. In this article, we provide an example of a cross-sectional study wherein the prevalence ratio was chosen over the (prevalence) odds ratio, and demonstrate the analytic implications of the choice of category to be modeled and the choice of reference level for independent variables. Copyright © 2016 John Wiley & Sons, Ltd.
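    For readers wanting to see the two measures side by side, the Python sketch below computes the prevalence ratio and the prevalence odds ratio from a single hypothetical 2 × 2 table and shows how swapping the reference level of the exposure simply inverts them; it illustrates the general point, not the article's data.

```python
def prevalence_measures(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Prevalence ratio (PR) and prevalence odds ratio (POR) from a 2x2 table."""
    p1 = exposed_cases / exposed_total
    p0 = unexposed_cases / unexposed_total
    pr = p1 / p0
    por = (p1 / (1 - p1)) / (p0 / (1 - p0))
    return pr, por

# Hypothetical cross-sectional data with a common outcome (~30-40% prevalence),
# where PR and POR diverge noticeably.
pr, por = prevalence_measures(120, 300, 90, 300)
print(f"PR  = {pr:.2f}")   # 1.33
print(f"POR = {por:.2f}")  # 1.56

# Swapping which exposure level is the reference simply inverts both measures:
pr_inv, por_inv = prevalence_measures(90, 300, 120, 300)
print(f"inverted PR = {pr_inv:.2f} (= 1/{pr:.2f})")
```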

  13. The SRS-Viewer: A Software Tool for Displaying and Evaluation of Pyroshock Data

    NASA Astrophysics Data System (ADS)

    Eberl, Stefan

    2014-06-01

    For the evaluation of the success of a pyroshock, the time domain and the corresponding Shock-Response-Spectra (SRS) have to be considered. The SRS-Viewer is an IABG-developed software tool [1] to read data in Universal File format (*.unv) and either display or plot, for each accelerometer, the time domain, the corresponding SRS and the specified Reference-SRS with tolerances in the background. The software calculates the "Average (AVG)", "Maximum (MAX)" and "Minimum (MIN)" SRS of any selection of accelerometers. A statistical analysis calculates the percentages of measured SRS above the specified Reference-SRS level and the percentage within the tolerance bands for comparison with the specified success criteria. Overlay plots of single accelerometers from different test runs make it possible to monitor the repeatability of the shock input and the integrity of the specimen. Furthermore, the difference between the shock on a mass-dummy and the real test unit can be examined.

  14. The interaction between manganese exposure and alcohol on neurobehavioral outcomes in welders.

    PubMed

    Ellingsen, Dag G; Kusraeva, Zarina; Bast-Pettersen, Rita; Zibarev, Evgenij; Chashchin, Maxim; Thomassen, Yngvar; Chashchin, Valery

    2014-01-01

    Neurobehavioral functions were studied in 137 welders exposed to the geometric mean (GM) air concentration of 214 μg/m³ (range 1-3230) of manganese (Mn) based on the individual mean from two days of air sampling. Only 22 μg/m³ (GM) was soluble in the artificial lung fluid Hatch solution. The welders were compared to 137 referents (turner/fitters) recruited from the same plants. The GM concentrations of Mn in whole blood (B-Mn) and urine (U-Mn) were 12.8 μg/L and 0.36 μg/g creatinine versus 8.0 μg/L and 0.07 μg/g creatinine in the referents. Alcohol consumption was assessed by measuring carbohydrate deficient transferrin in serum (sCDT). The welders had poorer performance than the referents on the Grooved Pegboard, Finger Tapping, Simple Reaction Time (SRT) and possibly the Maximum Frequency tests. They also reported more subjective symptoms. Welders with sCDT above the upper reference limit had substantially poorer performances on the Grooved Pegboard test, Finger Tapping test and SRT than welders with sCDT below this level. No effect of high sCDT was observed in the referents, indicating an interaction between high sCDT and exposure to Mn for these tests. Self-reported alcohol consumption had no impact on these neurobehavioral test results. A statistically significant difference in the SRT and Grooved Pegboard test results remained after excluding all subjects with sCDT above the normal level, but the difference in test scores between the groups was smaller. These welders also reported more subjective symptoms than the referents. The results suggest that sCDT should be measured in neurobehavioral studies of occupationally Mn-exposed populations for a more precise estimation of high alcohol consumption. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Evaluation of reference genes for quantitative real-time PCR in oil palm elite planting materials propagated by tissue culture.

    PubMed

    Chan, Pek-Lan; Rose, Ray J; Abdul Murad, Abdul Munir; Zainal, Zamri; Low, Eng-Ti Leslie; Ooi, Leslie Cheng-Li; Ooi, Siew-Eng; Yahya, Suzaini; Singh, Rajinder

    2014-01-01

    The somatic embryogenesis tissue culture process has been utilized to propagate high yielding oil palm. Due to the low callogenesis and embryogenesis rates, molecular studies were initiated to identify genes regulating the process, and their expression levels are usually quantified using reverse transcription quantitative real-time PCR (RT-qPCR). With the recent release of oil palm genome sequences, it is crucial to establish a proper strategy for gene analysis using RT-qPCR. Selection of the most suitable reference genes should be performed for accurate quantification of gene expression levels. In this study, eight candidate reference genes selected from cDNA microarray study and literature review were evaluated comprehensively across 26 tissue culture samples using RT-qPCR. These samples were collected from two tissue culture lines and media treatments, which consisted of leaf explants cultures, callus and embryoids from consecutive developmental stages. Three statistical algorithms (geNorm, NormFinder and BestKeeper) confirmed that the expression stability of novel reference genes (pOP-EA01332, PD00380 and PD00569) outperformed classical housekeeping genes (GAPDH, NAD5, TUBULIN, UBIQUITIN and ACTIN). PD00380 and PD00569 were identified as the most stably expressed genes in total samples, MA2 and MA8 tissue culture lines. Their applicability to validate the expression profiles of a putative ethylene-responsive transcription factor 3-like gene demonstrated the importance of using the geometric mean of two genes for normalization. Systematic selection of the most stably expressed reference genes for RT-qPCR was established in oil palm tissue culture samples. PD00380 and PD00569 were selected for accurate and reliable normalization of gene expression data from RT-qPCR. These data will be valuable to the research associated with the tissue culture process. Also, the method described here will facilitate the selection of appropriate reference genes in other oil palm tissues and in the expression profiling of genes relating to yield, biotic and abiotic stresses.

  16. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m³. The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m³. With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm, the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm, the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resource: additional text on OELs, methods, results, and additional figures and tables.]
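    The compliance statistics mentioned above typically assume lognormally distributed exposures and compare the estimated 95th percentile (and exceedance fraction) to the OEL. The Python sketch below illustrates that style of calculation with hypothetical elemental-carbon values; it is a simplified stand-in for the parametric and Bayesian tools actually used.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical elemental-carbon measurements (mg/m^3) from lead-locomotive cabs.
x = np.array([0.004, 0.007, 0.010, 0.003, 0.012, 0.006, 0.009, 0.015, 0.005, 0.008])
oel = 0.020  # California guideline used as the benchmark in the study

logs = np.log(x)
gm = np.exp(logs.mean())                    # geometric mean
gsd = np.exp(logs.std(ddof=1))              # geometric standard deviation
p95 = np.exp(logs.mean() + norm.ppf(0.95) * logs.std(ddof=1))  # lognormal 95th percentile
exceedance = 1 - norm.cdf((np.log(oel) - logs.mean()) / logs.std(ddof=1))

print(f"GM = {gm:.4f} mg/m^3, GSD = {gsd:.2f}")
print(f"estimated 95th percentile = {p95:.4f} mg/m^3 (guideline {oel} mg/m^3)")
print(f"estimated exceedance fraction = {exceedance:.1%}")
```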

  17. Pocket Guide to Transportation 2016

    DOT National Transportation Integrated Search

    2016-01-01

    The 2016 Pocket Guide to Transportation is a compilation of statistics that provides key information on the U.S. transportation system and highlights major trends. Intended as a compact reference, it supports the Bureau of Transportation Statistics m...

  18. Pocket Guide to Transportation 2015

    DOT National Transportation Integrated Search

    2015-01-01

    The 2015 Pocket Guide to Transportation is a compilation of statistics that provide key information and highlight major trends on the U.S. transportation system. Intended as a compact reference, it supports the Bureau of Transportation Statistics mis...

  19. Pocket Guide to Transportation 2014

    DOT National Transportation Integrated Search

    2014-01-01

    The 2014 Pocket Guide to Transportation is a compilation of statistics related to the performance and impact of the U.S. transportation system. Intended as a compact reference, it supports the Bureau of Transportation Statistics mission to create,...

  20. Determination of Age-Dependent Reference Ranges for Coagulation Tests Performed Using Destiny Plus

    PubMed Central

    Arslan, Fatma Demet; Serdar, Muhittin; Merve Ari, Elif; Onur Oztan, Mustafa; Hikmet Kozcu, Sureyya; Tarhan, Huseyin; Cakmak, Ozgur; Zeytinli, Merve; Yasar Ellidag, Hamit

    2016-01-01

    Background: In order to apply the right treatment for hemostatic disorders in pediatric patients, laboratory data should be interpreted with age-appropriate reference ranges. Objectives: The purpose of this study was to determine age-dependent reference range values for prothrombin time (PT), activated partial thromboplastin time (aPTT), fibrinogen tests, and D-dimer tests. Materials and Methods: A total of 320 volunteers were included in the study with the following ages: 1 month - 1 year (n = 52), 2 - 5 years (n = 50), 6 - 10 years (n = 48), 11 - 17 years (n = 38), and 18 - 65 years (n = 132). Each volunteer completed a survey to exclude hemostatic system disorder. Using a nonparametric method, the lower and upper limits, including 95% distribution and 90% confidence intervals, were calculated. Results: No statistically significant differences were found between PT and aPTT values in the groups consisting of children. Thus, the reference ranges were separated into child and adult age groups. PT and aPTT values were significantly higher in the children than in the adults. Fibrinogen values in the 6 - 10 age group and the adult age group were significantly higher than in the other groups. D-dimer levels were significantly lower in those aged 2 - 17; thus, a separate reference range was established. Conclusions: These results support other findings related to developmental hemostasis, confirming that adult and pediatric age groups should be evaluated using different reference ranges. PMID:27617078
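    A nonparametric reference interval of the kind described above is usually taken as the central 95% of the observed distribution (2.5th to 97.5th percentiles), with confidence intervals around each limit. The Python sketch below uses percentiles plus a bootstrap for the 90% CIs, on simulated aPTT values; the rank-based procedure the authors followed may differ in detail.

```python
import numpy as np

def reference_interval(values, lower_pct=2.5, upper_pct=97.5, n_boot=2000, seed=0):
    """Nonparametric reference limits with bootstrap 90% CIs around each limit.

    A simplified stand-in for the rank-based procedure referred to above.
    """
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    limits = np.percentile(values, [lower_pct, upper_pct])
    boots = np.array([
        np.percentile(rng.choice(values, size=len(values), replace=True),
                      [lower_pct, upper_pct])
        for _ in range(n_boot)
    ])
    ci_lower = np.percentile(boots[:, 0], [5, 95])
    ci_upper = np.percentile(boots[:, 1], [5, 95])
    return limits, ci_lower, ci_upper

# Hypothetical aPTT values (seconds) for one pediatric age group.
aptt = np.random.default_rng(1).normal(32, 4, 50)
limits, ci_lo, ci_hi = reference_interval(aptt)
print(f"reference interval: {limits[0]:.1f}-{limits[1]:.1f} s")
print(f"90% CI of lower limit: {ci_lo[0]:.1f}-{ci_lo[1]:.1f} s")
```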

  1. snoU6 and 5S RNAs are not reliable miRNA reference genes in neuronal differentiation.

    PubMed

    Lim, Q E; Zhou, L; Ho, Y K; Wan, G; Too, H P

    2011-12-29

    Accurate profiling of microRNAs (miRNAs) is an essential step for understanding the functional significance of these small RNAs in both physiological and pathological processes. Quantitative real-time PCR (qPCR) has gained acceptance as a robust and reliable transcriptomic method to profile subtle changes in miRNA levels and requires reference genes for accurate normalization of gene expression. 5S and snoU6 RNAs are commonly used as reference genes in microRNA quantification. It is currently unknown if these small RNAs are stably expressed during neuronal differentiation. Panels of miRNAs have been suggested as alternative reference genes to 5S and snoU6 in various physiological contexts. To test the hypothesis that miRNAs may serve as stable references during neuronal differentiation, the expressions of eight miRNAs, 5S and snoU6 RNAs in five differentiating neuronal cell types were analyzed using qPCR. The stabilities of the expressions were evaluated using two complementary statistical approaches (geNorm and Normfinder). Expressions of 5S and snoU6 RNAs were stable under some but not all conditions of neuronal differentiation and thus are not suitable reference genes. In contrast, a combination of three miRNAs (miR-103, miR-106b and miR-26b) allowed accurate expression normalization across different models of neuronal differentiation. Copyright © 2011 IBRO. Published by Elsevier Ltd. All rights reserved.
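    geNorm ranks candidates by an expression stability measure M, the average pairwise variation of a gene's log-ratios with every other candidate (lower M = more stable). A minimal Python sketch of that measure, using simulated expression values and candidate names taken from the abstract:

```python
import numpy as np

def genorm_m_values(expr, names):
    """geNorm-style stability measure M for each candidate reference gene.

    expr: 2-D array (samples x genes) of relative expression quantities.
    M_j is the mean, over all other genes k, of the standard deviation of
    log2(expression_j / expression_k) across samples; lower M = more stable.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = []
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m.append(np.mean(sds))
    return dict(zip(names, m))

# Hypothetical relative quantities for three candidates across six samples.
rng = np.random.default_rng(2)
expr = np.exp(rng.normal(0, [0.1, 0.15, 0.6], size=(6, 3)))  # third gene less stable
print(genorm_m_values(expr, ["miR-103", "miR-106b", "snoU6"]))
```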

  2. Children in the UK: Signposts to Statistics.

    ERIC Educational Resources Information Center

    Grey, Eleanor

    This guide indicates statistical sources in the United Kingdom dealing with children and young people. Regular and occasional sources are listed in a three-column format including the name of the source, a brief description, and the geographic area to which statistics refer. Information is classified under 25 topic headings: abortions; accidents;…

  3. Mini-Digest of Education Statistics, 2008. NCES 2009-021

    ERIC Educational Resources Information Center

    Snyder, Thomas D.

    2009-01-01

    This publication is the 14th edition of the "Mini-Digest of Education Statistics," a pocket-sized compilation of statistical information covering the broad field of American education from kindergarten through graduate school. The "Mini-Digest" is designed as an easy reference for materials found in much greater detail in the…

  4. Personal exposure to mobile phone frequencies and well-being in adults: a cross-sectional study based on dosimetry.

    PubMed

    Thomas, Silke; Kühnlein, Anja; Heinrich, Sabine; Praml, Georg; Nowak, Dennis; von Kries, Rüdiger; Radon, Katja

    2008-09-01

    The use of mobile phone telecommunication has increased in recent years. In parallel, there is growing concern about possible adverse health effects of cellular phone networks. We used personal dosimetry to investigate the association between exposure to mobile phone frequencies and well-being in adults. A random population-based sample of 329 adults living in four different Bavarian towns was assembled for the study. Using a dosimeter (ESM-140 Maschek Electronics), we obtained an exposure profile over 24 h for three mobile phone frequency ranges (measurement interval 1 s, limit of determination 0.05 V/m). Exposure levels over waking hours were totalled and expressed as mean percentage of the International Commission on Non-Ionizing Radiation Protection (ICNIRP) reference level. Each participant reported acute symptoms in a day-long diary. Data on five groups of chronic symptoms and potential confounders were assessed during an interview. The overall exposure to high-frequency electromagnetic fields was markedly below the ICNIRP reference level. We did not find any statistically significant association between the exposure and chronic symptoms or between the exposure and acute symptoms. Larger studies using mobile phone dosimetry are warranted to confirm these findings. Copyright 2008 Wiley-Liss, Inc.

  5. Viewing brain processes as Critical State Transitions across levels of organization: Neural events in Cognition and Consciousness, and general principles.

    PubMed

    Werner, Gerhard

    2009-04-01

    In this theoretical and speculative essay, I propose that insights into certain aspects of neural system functions can be gained from viewing brain function in terms of the branch of Statistical Mechanics currently referred to as "Modern Critical Theory" [Stanley, H.E., 1987. Introduction to Phase Transitions and Critical Phenomena. Oxford University Press; Marro, J., Dickman, R., 1999. Nonequilibrium Phase Transitions in Lattice Models. Cambridge University Press, Cambridge, UK]. The application of this framework is here explored in two stages: in the first place, its principles are applied to state transitions in global brain dynamics, with benchmarks of Cognitive Neuroscience providing the relevant empirical reference points. The second stage generalizes to suggest in more detail how the same principles could also apply to the relation between other levels of the structural-functional hierarchy of the nervous system and between neural assemblies. In this view, state transitions resulting from the processing at one level are the input to the next, in the image of a 'bucket brigade', with the content of each bucket being passed on along the chain, after having undergone a state transition. The unique features of a process of this kind will be discussed and illustrated.

  6. Hispanic ethnicity and Caucasian race: Relations with posttraumatic stress disorder's factor structure in clinic-referred youth.

    PubMed

    Contractor, Ateka A; Claycomb, Meredith A; Byllesby, Brianna M; Layne, Christopher M; Kaplow, Julie B; Steinberg, Alan M; Elhai, Jon D

    2015-09-01

    The severity of posttraumatic stress disorder (PTSD) symptoms is linked to race and ethnicity, albeit with contradictory findings (reviewed in Alcántara, Casement, & Lewis-Fernández, 2013; Pole, Gone, & Kulkarni, 2008). We systematically examined Caucasian (n = 3,767) versus non-Caucasian race (n = 2,824) and Hispanic (n = 2,395) versus non-Hispanic ethnicity (n = 3,853) as candidate moderators of PTSD's 5-factor model structural parameters (Elhai et al., 2013). The sample was drawn from the National Child Traumatic Stress Network's Core Data Set, currently the largest national data set of clinic-referred children and adolescents exposed to potentially traumatic events. Using confirmatory factor analysis, we tested the invariance of PTSD symptom structural parameters by race and ethnicity. Chi-square difference tests and goodness-of-fit values showed statistical equivalence across racial and ethnic groups in the factor structure of PTSD and in mean item-level indicators of PTSD symptom severity. Results support the structural invariance of PTSD's 5-factor model across the compared racial and ethnic groups. Furthermore, results indicated equivalent item-level severity across racial and ethnic groups; this supports the use of item-level comparisons across these groups. © 2015 APA, all rights reserved.

  7. Nocturnal oxygen saturation profiles of healthy term infants

    PubMed Central

    Terrill, Philip Ian; Dakin, Carolyn; Hughes, Ian; Yuill, Maggie; Parsley, Chloe

    2015-01-01

    Objective Pulse oximetry is used extensively in hospital and home settings to measure arterial oxygen saturation (SpO2). Interpretation of the trend and range of SpO2 values observed in infants is currently limited by a lack of reference ranges using current devices, and may be augmented by development of cumulative frequency (CF) reference-curves. This study aims to provide reference oxygen saturation values from a prospective longitudinal cohort of healthy infants. Design Prospective longitudinal cohort study. Setting Sleep-laboratory. Patients 34 healthy term infants were enrolled, and studied at 2 weeks, 3, 6, 12 and 24 months of age (N=30, 25, 27, 26, 20, respectively). Interventions Full overnight polysomnography, including 2 s averaging pulse oximetry (Masimo Radical). Main outcome measurements Summary SpO2 statistics (mean, median, 5th and 10th percentiles) and SpO2 CF plots were calculated for each recording. CF reference-curves were then generated for each study age. Analyses were repeated with sleep-state stratifications and inclusion of manual artefact removal. Results Median nocturnal SpO2 values ranged between 98% and 99% over the first 2 years of life and the CF reference-curves shift right by 1% between 2 weeks and 3 months. CF reference-curves did not change with manual artefact removal during sleep and did not vary between rapid eye movement (REM) and non-REM sleep. Manual artefact removal did significantly change summary statistics and CF reference-curves during wake. Conclusions SpO2 CF curves provide an intuitive visual tool for evaluating whether an individual's nocturnal SpO2 distribution falls within the range of healthy age-matched infants, thereby complementing summary statistics in the interpretation of extended oximetry recordings in infants. PMID:25063836
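
    As an illustration of the summary statistics and cumulative-frequency (CF) curves described above, the following minimal sketch computes them from raw SpO2 samples; the data are simulated 2-second samples and all names are illustrative, not taken from the study.

    ```python
    import numpy as np

    def spo2_summary(spo2_samples):
        """Summary statistics for one overnight SpO2 recording (e.g. 2-s samples)."""
        s = np.asarray(spo2_samples, dtype=float)
        s = s[~np.isnan(s)]                      # drop missing samples
        return {
            "mean": s.mean(),
            "median": np.median(s),
            "p5": np.percentile(s, 5),           # 5th percentile
            "p10": np.percentile(s, 10),         # 10th percentile
        }

    def cumulative_frequency(spo2_samples, grid=np.arange(80, 101)):
        """Fraction of samples at or below each SpO2 value (cumulative frequency curve)."""
        s = np.sort(np.asarray(spo2_samples, dtype=float))
        return grid, np.searchsorted(s, grid, side="right") / s.size

    # Hypothetical example: one night (6 h) of simulated 2-s samples.
    rng = np.random.default_rng(0)
    night = np.clip(rng.normal(98, 1.0, size=6 * 60 * 60 // 2), 85, 100)
    print(spo2_summary(night))
    grid, cf = cumulative_frequency(night)
    ```

    Plotting the grid against the returned cumulative frequencies for many recordings within an age group would yield CF reference curves of the kind described.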

  8. Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannesson, G

    2010-03-17

    Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer, and answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emissions of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate?; and (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in simulating future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). Statistical downscaling, on the other hand, aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors; the resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been studied extensively; the reader is referred to Wilby et al. (1998), Murphy (1999), Wood et al. (2004), Benestad et al. (2007), Fowler et al. (2007), and references therein. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessments. In particular, we explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out the SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
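
    A minimal, hypothetical sketch of regression-based statistical downscaling with GCM-ensemble uncertainty propagation is shown below; the data, variable names and the single-predictor linear model are illustrative assumptions, not the framework developed in this report.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical data: observed local summer temperature and a synoptic-scale
    # predictor (e.g. GCM-grid-cell temperature) over a historical period.
    n_years = 50
    predictor_hist = rng.normal(20.0, 1.5, n_years)
    local_obs = 0.8 * predictor_hist + rng.normal(0.0, 0.5, n_years) + 3.0

    # Fit the empirical (statistical downscaling) relationship on the historical period.
    X = np.column_stack([np.ones(n_years), predictor_hist])
    beta, *_ = np.linalg.lstsq(X, local_obs, rcond=None)

    # Apply the fitted relation to an ensemble of future GCM projections; the spread
    # across ensemble members carries the GCM uncertainty into the local projection.
    n_members = 10
    future_predictor = rng.normal(25.0, 1.5, size=(n_members, n_years))
    local_future = beta[0] + beta[1] * future_predictor

    warming = local_future.mean(axis=1) - local_obs.mean()
    print("ensemble-mean local warming: %.2f C" % warming.mean())
    print("fraction of members with warming >= 4 C: %.2f" % (warming >= 4.0).mean())
    ```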

  9. Shared decision-making, stigma, and child mental health functioning among families referred for primary care-located mental health services.

    PubMed

    Butler, Ashley M

    2014-03-01

    There is growing emphasis on shared decision making (SDM) to promote family participation in care and improve the quality of child mental health care. Yet, little is known about the relationship of SDM with parental perceptions of child mental health treatment or child mental health functioning. The objectives of this preliminary study were to examine (a) the frequency of perceived SDM with providers among minority parents of children referred to colocated mental health care in a primary care clinic, (b) associations between parent-reported SDM and mental health treatment stigma and child mental health impairment, and (c) differences in SDM among parents of children with various levels of mental health problem severity. Participants were 36 Latino and African American parents of children (ages 2-7 years) who were referred to colocated mental health care for externalizing mental health problems (disruptive, hyperactive, and aggressive behaviors). Parents completed questions assessing their perceptions of SDM with providers, child mental health treatment stigma, child mental health severity, and level of child mental health impairment. Descriptive statistics demonstrated the majority of the sample reported frequent SDM with providers. Correlation coefficients indicated higher SDM was associated with lower stigma regarding mental health treatment and lower parent-perceived child mental health impairment. Analysis of variance showed no significant difference in SDM among parents of children with different parent-reported levels of child mental health severity. Future research should examine the potential of SDM for addressing child mental health treatment stigma and impairment among minority families.

  10. Role of homocysteine for thromboembolic complications in patients with non-valvular atrial fibrillation.

    PubMed

    Cingozbay, B Y; Yiginer, O; Cebeci, B S; Kardesoglu, E; Demiralp, E; Dincturk, M

    2002-10-01

    Thromboembolism is the most important complication in patients with atrial fibrillation (AF). Homocysteine is a toxic amino acid that has recently been accepted as a risk factor for atherosclerosis and stroke. The aim of the present study is to show whether there is a relation between hyperhomocysteinemia and thromboembolic complications in patients with non-valvular AF. We admitted 38 patients with non-valvular AF. The patients were divided into two groups: group A (n = 20; mean age, 75.7 +/- 10.4 years; three males/17 females), and group B (n = 18; mean age, 68.0 +/- 10.6 years; 11 males/seven females). While group A consisted of the patients with AF and stroke, group B was composed of the patients with AF but without stroke. The patients having sinus rhythm (15 subjects) were used as the reference group to obtain the cut-off value. Homocysteine was measured by the immunoassay method. The mean homocysteine levels were 12.4 +/- 3.3 micromol/l in group A, 8.3 +/- 2.3 micromol/l in group B and 9.3 +/- 1.8 micromol/l in the reference group. The cut-off value was 10.6 micromol/l. Group A had a statistically higher homocysteine level than not only group B, but also the reference group (P < 0.05). While 60% of group A (n = 12) had elevated homocysteine levels, the rate was only 22% for group B (n = 4). In conclusion, hyperhomocysteinemia may be one of the explanations for the increased rate of thromboembolic complications in older patients with AF.

  11. Musculoskeletal Dysfunctions in Patients With Chronic Pelvic Pain: A Preliminary Descriptive Survey.

    PubMed

    Mieritz, Rune Mygind; Thorhauge, Kirsten; Forman, Axel; Mieritz, Hanne Beck; Hartvigsen, Jan; Christensen, Henrik Wulff

    The purpose of this study was to determine the prevalence of musculoskeletal dysfunctions based on a standardized clinical examination of patients with chronic pelvic pain (CPP) who were referred to a specialized tertiary care center for laparoscopic examination. In addition, we stratified levels of self-reported pelvic pain, self-rated health, education, and work status based on musculoskeletal dysfunction status. This study used a cross-sectional design to determine the prevalence of musculoskeletal dysfunctions in women with CPP who were referred to a tertiary care center specializing in care of women with CPP. The women completed a questionnaire and underwent a blinded systematic objective clinical examination of the musculoskeletal system by a doctor of chiropractic who then categorized the patients as having or not having musculoskeletal dysfunction. Ninety-four patients returned the questionnaire, completed the clinical examination, and fulfilled the inclusion criteria. More than half of the referred patients with CPP (48 out of 94) had musculoskeletal dysfunctions in the lumbar/pelvic region. No statistically significant differences were found between the groups with respect to self-rated health, education, work status, and pain level. Pain location was significantly different after Bonferroni correction in 1 out of the 36 aspects. In this sample of CPP patients, 51% were categorized as having a musculoskeletal dysfunction. Overall, CPP patients were similar with respect to certain characteristics, such as age, body mass index, and pain level, regardless of their classification; however, patients with musculoskeletal dysfunction tended to report more pain in the front and back of the lower limbs. Copyright © 2016. Published by Elsevier Inc.

  12. [Influence of diet and behavior related factors on the peripheral blood triglyceride levels in adults: a cross-sectional study].

    PubMed

    Liang, M B; Wang, H; Zhang, J; He, Q F; Fang, L; Wang, L X; Su, D T; Zhao, M; Zhang, X W; Hu, R Y; Cong, L M; Ding, G G; Ye, Z; Yu, M

    2017-12-10

    Objective: To study the influence of diet- and behavior-related factors on peripheral blood triglyceride (TG) levels in adults through a cross-sectional survey. Methods: The study included 13 434 subjects without histories of major chronic diseases, drawn from a population-based cross-sectional survey, the 2010 Metabolic Syndrome Survey in Zhejiang Province. A generalized linear model was used to investigate the influence of diet/behavior-related factors on peripheral blood TG levels. Results: The mean TG of the sample population was (1.36±1.18) mmol/L. The proportions of elevated and marginally elevated TG were 10.3% and 11.0%, respectively, with a statistically significant difference between males and females (χ2=44.135, P<0.001). In this population, daily cooking-oil intake exceeded the recommended level by over 50%, whereas fruit, milk and nut intake and physical exercise fell well below the recommendations. In males, smoking, alcohol intake and the intake of meat, fruit and water showed statistically significant differences; in females, aquatic-product intake and physical exercise did. After controlling for other variables, age, drinking, staple food and aquatic products showed a positive influence on TG, while milk showed a negative influence. In the interaction analysis, fruit and meat intake in males and staple food in females showed a positive influence on TG compared with the reference group. Conclusion: Hypertriglyceridemia was one of the major metabolic abnormalities in Zhejiang province. Community intervention programs should give priority to monitoring alcohol, staple food and meat intake.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medeiros, Brian; Williamson, David L.; Olson, Jerry G.

    In this study, fundamental characteristics of the aquaplanet climate simulated by the Community Atmosphere Model, Version 5.3 (CAM5.3), are presented. The assumptions and simplifications of the configuration are described. A 16-year-long, perpetual-equinox integration with prescribed SST using the model's standard 1° grid spacing is presented as a reference simulation. Statistical analysis is presented that shows similar aquaplanet configurations can be run for about 2 years to obtain robust climatological structures, including global and zonal means, eddy statistics, and precipitation distributions. Such a simulation can be compared to the reference simulation to discern differences in the climate, including an assessment of confidence in the differences. To aid such comparisons, the reference simulation has been made available via earthsystemgrid.org. Examples are shown comparing the reference simulation with simulations from the CAM5 series that make different microphysical assumptions and use a different dynamical core.

  14. The company objects keep: Linking referents together during cross-situational word learning.

    PubMed

    Zettersten, Martin; Wojcik, Erica; Benitez, Viridiana L; Saffran, Jenny

    2018-04-01

    Learning the meanings of words involves not only linking individual words to referents but also building a network of connections among entities in the world, concepts, and words. Previous studies reveal that infants and adults track the statistical co-occurrence of labels and objects across multiple ambiguous training instances to learn words. However, it is less clear whether, given distributional or attentional cues, learners also encode associations amongst the novel objects. We investigated the consequences of two types of cues that highlighted object-object links in a cross-situational word learning task: distributional structure - how frequently the referents of novel words occurred together - and visual context - whether the referents were seen on matching backgrounds. Across three experiments, we found that in addition to learning novel words, adults formed connections between frequently co-occurring objects. These findings indicate that learners exploit statistical regularities to form multiple types of associations during word learning.
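
    The cross-situational statistics described above can be illustrated with a simple co-occurrence tally; the trials, labels and object names below are hypothetical.

    ```python
    from collections import Counter
    from itertools import combinations, product

    # Hypothetical ambiguous training trials: each trial pairs two spoken labels
    # with two visible objects, without indicating which label goes with which object.
    trials = [
        (["bosa", "gasser"], ["obj_A", "obj_B"]),
        (["bosa", "tomur"],  ["obj_A", "obj_C"]),
        (["gasser", "tomur"], ["obj_B", "obj_C"]),
        (["bosa", "gasser"], ["obj_A", "obj_B"]),
    ]

    label_object = Counter()   # cross-situational label-object statistics
    object_object = Counter()  # object-object co-occurrence (the "company objects keep")

    for labels, objects in trials:
        label_object.update(product(labels, objects))
        object_object.update(combinations(sorted(objects), 2))

    # The most frequent label-object pairs are the best word-referent candidates,
    # and frequently co-presented objects form object-object associations.
    print(label_object.most_common(3))
    print(object_object.most_common(2))
    ```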

  15. Psychophysiological effects of a web-based stress management system: a prospective, randomized controlled intervention study of IT and media workers [ISRCTN54254861].

    PubMed

    Hasson, Dan; Anderberg, Ulla Maria; Theorell, Töres; Arnetz, Bengt B

    2005-07-25

    The aim of the present study was to assess possible effects of a web-based health promotion tool on mental and physical well-being and stress-related biological markers. A randomized, prospectively controlled study was conducted with before and after measurements, involving 303 employees (187 men and 116 women, age 23-64) from four information technology and two media companies. Half of the participants were offered web-based health promotion and stress management training (intervention) lasting for six months. All other participants constituted the reference group. Different biological markers were measured to detect possible physiological changes. After six months the intervention group had improved statistically significantly compared to the reference group on ratings of ability to manage stress, sleep quality, mental energy, concentration ability and social support. The anabolic hormone dehydroepiandrosterone sulphate (DHEA-S) decreased significantly in the reference group as compared to unchanged levels in the intervention group. Neuropeptide Y (NPY) increased significantly in the intervention group compared to the reference group. Chromogranin A (CgA) decreased significantly in the intervention group as compared to the reference group. Tumour necrosis factor alpha (TNFalpha) decreased significantly in the reference group compared to the intervention group. Logistic regression analysis revealed that group (intervention vs. reference) remained a significant factor in five out of nine predictive models. The results indicate that an automatic web-based system might have short-term beneficial physiological and psychological effects and thus might be an opportunity in counteracting some clinically relevant and common stress and health issues of today.

  16. 2011 statistical abstract of the United States

    USGS Publications Warehouse

    Krisanda, Joseph M.

    2011-01-01

    The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States. Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web. Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.

  17. Quantitative skills as a graduate learning outcome of university science degree programmes: student performance explored through the planned-enacted-experienced curriculum model

    NASA Astrophysics Data System (ADS)

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2016-07-01

    Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result, which examined 10 mathematical and statistical sub-topics. Second, the study established an evidential baseline of students' quantitative skills performance and confidence levels by piloting the QSASS with 187 final-year biosciences students at a research-intensive university. The study is framed within the planned-enacted-experienced curriculum model and contributes to science reform efforts focused on enhancing the quantitative skills of university graduates, particularly in the biosciences. The results found, on average, weak performance and low confidence on the QSASS, suggesting divergence between academics' intentions and students' experiences of learning quantitative skills. Implications for curriculum design and future studies are discussed.

  18. Global estimates of shark catches using trade records from commercial markets.

    PubMed

    Clarke, Shelley C; McAllister, Murdoch K; Milner-Gulland, E J; Kirkwood, G P; Michielsens, Catherine G J; Agnew, David J; Pikitch, Ellen K; Nakano, Hideki; Shivji, Mahmood S

    2006-10-01

    Despite growing concerns about overexploitation of sharks, lack of accurate, species-specific harvest data often hampers quantitative stock assessment. In such cases, trade studies can provide insights into exploitation unavailable from traditional monitoring. We applied Bayesian statistical methods to trade data in combination with genetic identification to estimate, by species, the annual number of globally traded shark fins, the most commercially valuable product from a group of species often unrecorded in harvest statistics. Our results provide the first fishery-independent estimate of the scale of shark catches worldwide and indicate that shark biomass in the fin trade is three to four times higher than shark catch figures reported in the only global database. Comparison of our estimates to approximated stock assessment reference points for one of the most commonly traded species, blue shark, suggests that current trade volumes in numbers of sharks are close to or possibly exceeding the maximum sustainable yield levels.

  19. [School performance of former premature infants in the first four years of school].

    PubMed

    Frenzel, J; Paalhorn, U

    1992-12-01

    School achievement during the first four grades was analysed by means of subject marks in 203 prematurely born and 140 maturely born children. In subjects relating to classroom behaviour, no statistically significant differences in average marks were found between prematurely born children and the control group. The average marks in performance subjects were slightly higher in the subgroup of very prematurely born children; however, statistically significantly lower marks were observed only in the subject of sports, and for formerly immature children also in the subject of manual training. The higher the educational level of the mothers, the better the average marks excluding sports. No relationships were seen between school performance and postnatal risk factors such as Apgar score, blood gas values and duration of oxygen dependency. These results demonstrate that the school performance of former premature infants lies within the normal variance of their grade.

  20. Aquatic effects assessment: needs and tools.

    PubMed

    Marchini, Silvia

    2002-01-01

    In the assessment of the adverse effects pollutants can produce on exposed ecosystems, different approaches can be followed depending on the quality and quantity of information available; their advantages and limits are discussed with reference to the aquatic compartment. When experimental data are lacking, a predictive approach can be pursued by making use of validated quantitative structure-activity relationships (QSARs), which provide reliable ecotoxicity estimates only if appropriate models are applied. The experimental approach is central to any environmental hazard assessment procedure, although many uncertainties underlying the extrapolation from a limited set of single-species laboratory data to the complexity of the ecosystem (e.g., the limitations of common summary statistics, the variability of species sensitivity, the need to consider alterations at higher levels of integration) make the task difficult. When adequate toxicity information is available, the statistical extrapolation approach can be used to predict environmentally compatible concentrations.
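
    As a sketch of the statistical extrapolation approach mentioned above, the following example fits a log-normal species sensitivity distribution to hypothetical single-species toxicity data and derives a hazardous concentration for 5% of species (HC5); the values and the log-normal assumption are illustrative only.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical single-species toxicity endpoints (e.g. NOECs, mg/L) for one substance.
    noec = np.array([0.8, 1.5, 2.2, 3.0, 4.8, 7.5, 12.0, 20.0])

    # Statistical extrapolation: fit a log-normal species sensitivity distribution
    # and take its 5th percentile (HC5) as an environmentally compatible concentration.
    log_noec = np.log10(noec)
    mu, sigma = log_noec.mean(), log_noec.std(ddof=1)
    hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)

    print("HC5 estimate: %.2f mg/L" % hc5)
    # Fraction of species expected to be affected at a given exposure concentration:
    print("potentially affected fraction at 1 mg/L: %.2f"
          % stats.norm.cdf((np.log10(1.0) - mu) / sigma))
    ```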

  1. Effect of lifestyle, education and socioeconomic status on periodontal health

    PubMed Central

    Gundala, Rupasree; Chava, Vijay K.

    2010-01-01

    Background: The underlying health model comprises knowledge, attitude, and temporary and permanent behaviors. Currently, more emphasis has been directed towards the combined influence of lifestyle, education level, and socioeconomic factors, rather than conventional risk factors, in dealing with chronic illnesses. The present study was conducted to correlate the periodontal health of people with their lifestyle, education level, and socioeconomic status. Materials and Methods: A cross-sectional study was conducted in the Department of Periodontics, Narayana Dental College and Hospital, Nellore. A total of 1350 subjects were examined, and 948 patients were randomly selected from the outpatient department. Information about their lifestyle, education level, and socioeconomic status was recorded using a questionnaire and correlated with periodontal status. Results: The statistical analysis showed a significant decrease in periodontitis as income and education levels increased. Also, the prevalence of periodontitis associated with a healthy lifestyle was significantly lower than that associated with an unhealthy lifestyle. Conclusions: There is a strong association of lifestyle, education level, and socioeconomic status with periodontal health. PMID:22114373

  2. Reduction of Cortisol Levels and Participants' Responses Following Art Making.

    PubMed

    Kaimal, Girija; Ray, Kendra; Muniz, Juan

    2016-04-02

    This quasi-experimental study investigated the impact of visual art making on the cortisol levels of 39 healthy adults. Participants provided saliva samples to assess cortisol levels before and after 45 minutes of art making. Participants also provided written responses about the experience at the end of the session. Results indicate that art making resulted in statistically significant lowering of cortisol levels. Participants' written responses indicated that they found the art-making session to be relaxing, enjoyable, helpful for learning about new aspects of self, freeing from constraints, an evolving process of initial struggle to later resolution, and about flow/losing themselves in the work. They also reflected that the session evoked a desire to make art in the future. There were weak associations between changes in cortisol level and age, time of day, and participant responses related to learning about one's self and references to an evolving process in art making. There were no significant differences in outcomes based on prior experiences with art making, media choice, or gender.

  3. Effectively identifying regulatory hotspots while capturing expression heterogeneity in gene expression studies

    PubMed Central

    2014-01-01

    Expression quantitative trait loci (eQTL) mapping is a tool that can systematically identify genetic variation affecting gene expression. eQTL mapping studies have shown that certain genomic locations, referred to as regulatory hotspots, may affect the expression levels of many genes. Recently, studies have shown that various confounding factors may induce spurious regulatory hotspots. Here, we introduce a novel statistical method that effectively eliminates spurious hotspots while retaining genuine hotspots. Applied to simulated and real datasets, we validate that our method achieves greater sensitivity while retaining low false discovery rates compared to previous methods. PMID:24708878

  4. Death of Reference or Birth of a New Marketing Age?

    ERIC Educational Resources Information Center

    Henry, Jo

    2011-01-01

    Reference transactions in academic libraries have been on the decline since mid-1990. The Academic Library Survey from the National Center for Education Statistics shows an average drop of 25% in reference use from 1996-2004 with higher numbers at some institutions such as the University of Maryland which plummeted 47% (Martell, 2008). The…

  5. Making Decisions: Using Electronic Data Collection to Re-Envision Reference Services at the USF Tampa Libraries

    ERIC Educational Resources Information Center

    Todorinova, Lily; Huse, Andy; Lewis, Barbara; Torrence, Matt

    2011-01-01

    Declining reference statistics, diminishing human resources, and the desire to be more proactive and embedded in academic departments, prompted the University of South Florida Library to create a taskforce for re-envisioning reference services. The taskforce was charged with examining the staffing patterns at the desk and developing…

  6. [Disability leave and sick leave in Spain. 2016 legislative update].

    PubMed

    Vicente-Herrero, María Teófila; Terradillos-García, María Jesús; Capdevila-García, Luisa M; Ramírez-Íñiguez de la Torre, María Victoria; Aguilar-Jiménez, Encarna; Aguado-Benedí, María José; López-González, Angel Arturo; Torres-Alberich, José Ignacio

    2018-01-01

    In Spanish, the concepts of discapacidad (disability leave) and incapacidad (sick leave) jointly refer to the impairment of a person due to injuries, diseases or deficiencies that limit their activity in the social, personal or occupational sphere. However, this common link does not imply that the two concepts are the same. Statistical data from the INE (Instituto Nacional de Estadística: National Statistics Institute) show that in 2015 Spain had 3.85 million persons with a disability (59.8% of them women). Statistical data from 2015 from the INSS (Instituto Nacional de Seguridad Social: National Social Security Institute) show high numbers of processes and of workers affected by temporary sick leave, with social costs to the social security system. Both concepts have been updated: regarding disability leave, Law 39/2006 adjusted terminology by avoiding concepts with discriminating or pejorative connotations; regarding sick leave, the Ley General de Seguridad Social (General Social Security Law) was amended and came into effect in January 2016. It is necessary to know and distinguish these aspects for better administrative management and better-targeted information for the affected patient.

  7. Improving single-molecule FRET measurements by confining molecules in nanopipettes

    NASA Astrophysics Data System (ADS)

    Vogelsang, J.; Doose, S.; Sauer, M.; Tinnefeld, P.

    2007-07-01

    In recent years Fluorescence Resonance Energy Transfer (FRET) has been widely used to determine distances, observe distance dynamics, and monitor molecular binding at the single-molecule level. A basic constraint of single-molecule FRET studies is the limited distance resolution owing to low photon statistics. We demonstrate that by confining molecules in nanopipettes (50-100 nm diameter) smFRET can be measured with improved photon statistics reducing the width of FRET proximity ratio distributions (PRD). This increase in distance resolution makes it possible to reveal subpopulations and dynamics in biomolecular complexes. Our data indicate that the width of PRD is not only determined by photon statistics (shot noise) and distance distributions between the chromophores but that photoinduced dark states of the acceptor also contribute to the PRD width. Furthermore, acceptor dark states such as triplet states influence the accuracy of determined mean FRET values. In this context, we present a strategy for the correction of the shift of the mean PR that is related to triplet induced blinking of the acceptor using reference FCS measurements.
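
    The shot-noise limit on proximity ratio (PR) distributions can be illustrated with binomial counting statistics; the photon counts below are simulated and the functions are illustrative, not the analysis pipeline of the study.

    ```python
    import numpy as np

    def proximity_ratios(donor_counts, acceptor_counts):
        """Per-molecule proximity ratio PR = n_A / (n_A + n_D)."""
        donor_counts = np.asarray(donor_counts, float)
        acceptor_counts = np.asarray(acceptor_counts, float)
        return acceptor_counts / (donor_counts + acceptor_counts)

    def shot_noise_width(mean_pr, total_photons):
        """Shot-noise-limited std of the PR distribution for a fixed photon budget
        (binomial counting statistics)."""
        return np.sqrt(mean_pr * (1.0 - mean_pr) / total_photons)

    # Hypothetical single-molecule events: more photons per event -> narrower PR distribution.
    rng = np.random.default_rng(2)
    true_pr, n_photons = 0.6, 200
    acceptor = rng.binomial(n_photons, true_pr, size=5000)
    pr = proximity_ratios(n_photons - acceptor, acceptor)
    print("observed width: %.3f  shot-noise limit: %.3f"
          % (pr.std(), shot_noise_width(true_pr, n_photons)))
    ```

    Any excess of the observed width over the shot-noise limit then points to real heterogeneity or to photophysics such as acceptor dark states.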

  8. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    PubMed Central

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006

  9. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    PubMed

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-08

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method.
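
    The general idea of a frequency-domain similarity test against a reference noise recording can be sketched as below; this is an illustrative stand-in (a band-wise F-test on power ratios with simulated data), not the authors' ASTF algorithm, PSO tuning or Ipq factor.

    ```python
    import numpy as np
    from scipy import stats

    def noise_similarity_filter(signal, noise_ref, fs, band_hz=50.0, alpha=0.05):
        """Keep only frequency bands whose power is significantly larger than that of
        a reference noise recording; bands similar to the noise are removed."""
        n = min(len(signal), len(noise_ref))
        S, N = np.fft.rfft(signal[:n]), np.fft.rfft(noise_ref[:n])
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        keep = np.zeros_like(S)
        for lo in np.arange(0.0, freqs[-1], band_hz):
            band = (freqs >= lo) & (freqs < lo + band_hz)
            k = int(band.sum())
            if k == 0:
                continue
            ratio = np.mean(np.abs(S[band]) ** 2) / np.mean(np.abs(N[band]) ** 2)
            # Treat the band-power ratio roughly as an F statistic with (2k, 2k) df.
            if ratio > stats.f.ppf(1.0 - alpha, 2 * k, 2 * k):
                keep[band] = S[band]          # dissimilar to noise -> keep
        return np.fft.irfft(keep, n)

    # Hypothetical bearing signal: weak 120 Hz fault tone buried in broadband noise.
    rng = np.random.default_rng(3)
    fs = 5000.0
    t = np.arange(0, 2.0, 1 / fs)
    noise_ref = rng.normal(0, 1, t.size)
    signal = 0.3 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 1, t.size)
    filtered = noise_similarity_filter(signal, noise_ref, fs)
    ```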

  10. The journals are full of great studies but can we believe the statistics? Revisiting the mass privatisation - mortality debate.

    PubMed

    Gerry, Christopher J

    2012-07-01

    Cross-national statistical analyses based on country-level panel data are increasingly popular in social epidemiology. To provide reliable results on the societal determinants of health, analysts must give very careful consideration to conceptual and methodological issues: aggregate (historical) data are typically compatible with multiple alternative stories of the data-generating process. Studies in this field which fail to relate their empirical approach to the true underlying data-generating process are likely to produce misleading results if, for example, they misspecify their models by failing to explore the statistical properties of the longitudinal aspect of their data or by ignoring endogeneity issues. We illustrate the importance of this need for extra care with reference to a recent debate on whether rapid mass privatisation can explain post-communist mortality fluctuations. We demonstrate that the finding that rapid mass privatisation was a "crucial determinant" of male mortality fluctuations in the post-communist world is rejected once better consideration is given to the way in which the data are generated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. An accurate test for homogeneity of odds ratios based on Cochran's Q-statistic.

    PubMed

    Kulinskaya, Elena; Dollinger, Michael B

    2015-06-10

    A frequently used statistic for testing homogeneity in a meta-analysis of K independent studies is Cochran's Q. For a standard test of homogeneity the Q statistic is referred to a chi-square distribution with K-1 degrees of freedom. For the situation in which the effects of the studies are logarithms of odds ratios, the chi-square distribution is much too conservative for moderate size studies, although it may be asymptotically correct as the individual studies become large. Using a mixture of theoretical results and simulations, we provide formulas to estimate the shape and scale parameters of a gamma distribution to fit the distribution of Q. Simulation studies show that the gamma distribution is a good approximation to the distribution for Q. Use of the gamma distribution instead of the chi-square distribution for Q should eliminate inaccurate inferences in assessing homogeneity in a meta-analysis. (A computer program for implementing this test is provided.) This hypothesis test is competitive with the Breslow-Day test both in accuracy of level and in power.
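
    The mechanics of the test can be sketched as follows: Cochran's Q is computed from inverse-variance weights and then referred either to the usual chi-square distribution or to a gamma distribution. The effect sizes below are hypothetical, and the gamma shape/scale values are placeholders (chosen here to reproduce the chi-square case); the paper provides the actual estimation formulas.

    ```python
    import numpy as np
    from scipy import stats

    def cochran_q(effects, variances):
        """Cochran's Q for K independent study effects (e.g. log odds ratios)."""
        effects = np.asarray(effects, float)
        w = 1.0 / np.asarray(variances, float)        # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)
        return np.sum(w * (effects - pooled) ** 2)

    # Hypothetical meta-analysis of K = 5 log odds ratios with their variances.
    log_or = np.array([0.10, 0.35, -0.05, 0.50, 0.20])
    var = np.array([0.08, 0.12, 0.10, 0.15, 0.09])
    Q, k = cochran_q(log_or, var), len(log_or)

    # Standard test: refer Q to chi-square with K-1 degrees of freedom.
    p_chi2 = stats.chi2.sf(Q, k - 1)

    # Alternative in the spirit of the paper: refer Q to a fitted gamma distribution.
    # Placeholder parameters; this particular choice reproduces the chi-square case.
    shape, scale = (k - 1) / 2.0, 2.0
    p_gamma = stats.gamma.sf(Q, shape, scale=scale)
    print("Q = %.2f, chi-square p = %.3f, gamma p = %.3f" % (Q, p_chi2, p_gamma))
    ```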

  12. Comparing Serum Follicle-Stimulating Hormone (FSH) Level with Vaginal pH in Women with Menopausal Symptoms.

    PubMed

    Vahidroodsari, Fatemeh; Ayati, Seddigheh; Yousefi, Zohreh; Saeed, Shohreh

    2010-01-01

    Despite the important implications for women's health and reproduction, very few studies have focused on vaginal pH for menopausal diagnosis. Recent studies have suggested vaginal pH as a simple, noninvasive and inexpensive method for this purpose. The aim of this study is to compare serum FSH level with vaginal pH in menopause. This is a cross-sectional, descriptive study, conducted on 103 women (aged 31-95 yrs) with menopausal symptoms who were referred to the Menopausal Clinic at Ghaem Hospital during 2006. Vaginal pH was measured using pH meter strips and serum FSH levels were measured using immunoassay methods. The data were analyzed using SPSS software (version 11.5) and results were evaluated statistically by the Chi-square and Kappa tests. p≤0.05 was considered statistically significant. According to this study, in the absence of vaginal infection, the average vaginal pH in these 103 menopausal women was 5.33±0.53. If the menopausal hallmark was considered as vaginal pH>4.5, and serum FSH as ≥20 mIU/ml, then the sensitivity of vaginal pH for menopausal diagnosis was 97%. The mean FSH level in this population was 80.79 mIU/ml. Vaginal pH is a simple, accurate, and cost-effective tool that can be suggested as a suitable alternative to serum FSH measurement for the diagnosis of menopause.
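
    The reported 97% sensitivity is simply the proportion of women meeting the FSH criterion (≥20 mIU/ml) who also have vaginal pH > 4.5; the sketch below computes such a sensitivity on simulated data with illustrative parameter choices.

    ```python
    import numpy as np

    def sensitivity(test_positive, condition_positive):
        """Sensitivity = TP / (TP + FN) of the index test against the reference standard."""
        test_positive = np.asarray(test_positive, bool)
        condition_positive = np.asarray(condition_positive, bool)
        tp = np.sum(test_positive & condition_positive)
        return tp / condition_positive.sum()

    # Hypothetical data: vaginal pH and serum FSH for a clinic sample of 103 women.
    rng = np.random.default_rng(7)
    fsh = rng.lognormal(np.log(60), 0.8, 103)
    ph = np.where(fsh >= 20, rng.normal(5.3, 0.5, 103), rng.normal(4.2, 0.3, 103))
    print("sensitivity of pH > 4.5 for FSH >= 20 mIU/ml: %.2f"
          % sensitivity(ph > 4.5, fsh >= 20))
    ```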

  13. Salutogenic factors for mental health promotion in work settings and organizations.

    PubMed

    Graeser, Silke

    2011-12-01

    With companies and organizations becoming increasingly aware of mental health conditions in work settings, the salutogenic perspective provides a promising approach to identifying supportive factors and resources of organizations that promote mental health. Based on the sense of coherence (SOC) - usually treated as an individual personality-trait concept - an organization-based SOC scale was developed to identify potential salutogenic factors of a university as an organization and workplace. Based on the results of two samples of employees (n = 362, n = 204), factors associated with the organization-based SOC were evaluated. Statistical analysis yielded significant correlations between mental health and the setting-based SOC as well as its three factors identified by factor analysis: comprehensibility, manageability and meaningfulness. Significant results of bivariate and multivariate analyses emphasize the importance, for an organization-based SOC, of aspects such as participation and comprehensibility at the organizational level, social cohesion and social climate at the social level, and recognition at the individual level. Potential approaches for the further development of workplace health promotion interventions based on salutogenic factors and resources at the individual, social and organizational levels are elaborated, and the transcultural dimensions of these factors are discussed.

  14. Length separation of single-walled carbon nanotubes and its impact on structural and electrical properties of wafer-level fabricated carbon nanotube-field-effect transistors

    NASA Astrophysics Data System (ADS)

    Böttger, Simon; Hermann, Sascha; Schulz, Stefan E.; Gessner, Thomas

    2016-10-01

    For an industrial realization of devices based on single-walled carbon nanotubes (SWCNTs), such as field-effect transistors (FETs), it becomes increasingly important to consider technological aspects such as intrinsic device structure, integration process controllability and yield. From the perspective of a wafer-level integration technology, the influence of SWCNT length on the performance of short-channel CNT-FETs is demonstrated by means of a statistical and comparative study. To this end, a length separation process based on size-exclusion chromatography was developed in order to extract well-separated SWCNT dispersions with a narrowed length distribution. Short SWCNTs were shown to adversely affect integrability and reproducibility, underlined by a 25% decline in integration yield with respect to long SWCNTs. Furthermore, the significant changes in electrical performance are directly linked to SWCNT chain formation in the transistor channel. In particular, CNT-FETs with long SWCNTs outperform reference and short-SWCNT devices with respect to hole mobility and subthreshold controllability by up to 300% and up to 140%, respectively. As a whole, this study provides a statistical and comparative analysis towards chain-less CNT-FETs fabricated with a wafer-level technology.

  15. Elasmobranch qPCR reference genes: a case study of hypoxia preconditioned epaulette sharks

    PubMed Central

    2010-01-01

    Background Elasmobranch fishes are an ancient group of vertebrates which have high potential as model species for research into evolutionary physiology and genomics. However, no comparative studies have established suitable reference genes for quantitative PCR (qPCR) in elasmobranchs for any physiological conditions. Oxygen availability has been a major force shaping the physiological evolution of vertebrates, especially fishes. Here we examined the suitability of 9 reference candidates from various functional categories after a single hypoxic insult or after hypoxia preconditioning in epaulette shark (Hemiscyllium ocellatum). Results Epaulette sharks were caught and exposed to hypoxia. Tissues were collected from 10 controls, 10 individuals with single hypoxic insult and 10 individuals with hypoxia preconditioning (8 hypoxic insults, 12 hours apart). We produced sequence information for reference gene candidates and monitored mRNA expression levels in four tissues: cerebellum, heart, gill and eye. The stability of the genes was examined with analysis of variance, geNorm and NormFinder. The best ranking genes in our study were eukaryotic translation elongation factor 1 beta (eef1b), ubiquitin (ubq) and polymerase (RNA) II (DNA directed) polypeptide F (polr2f). The performance of the ribosomal protein L6 (rpl6) was tissue-dependent. Notably, in one tissue the analysis of variance indicated statistically significant differences between treatments for genes that were ranked as the most stable candidates by reference gene software. Conclusions Our results indicate that eef1b and ubq are generally the most suitable reference genes for the conditions and tissues in the present epaulette shark studies. These genes could also be potential reference gene candidates for other physiological studies examining stress in elasmobranchs. The results emphasise the importance of inter-group variation in reference gene evaluation. PMID:20416043
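
    A sketch of a geNorm-style stability measure M (the average standard deviation of pairwise log-ratios; lower is more stable) is given below; the expression values are simulated, and the gene labels merely echo the candidates named above.

    ```python
    import numpy as np

    def genorm_m(expression):
        """geNorm-style stability measure M for each candidate reference gene.

        `expression` is a (samples x genes) array of relative expression values;
        M_j is the average, over all other genes k, of the standard deviation of
        log2(gene_j / gene_k) across samples (lower M = more stable)."""
        log_expr = np.log2(np.asarray(expression, float))
        n_genes = log_expr.shape[1]
        m_values = np.empty(n_genes)
        for j in range(n_genes):
            sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                   for k in range(n_genes) if k != j]
            m_values[j] = np.mean(sds)
        return m_values

    # Hypothetical relative expression of 4 candidate genes in 8 samples.
    rng = np.random.default_rng(4)
    base = rng.lognormal(0.0, 0.5, size=(8, 1))    # shared sample-to-sample loading
    expr = base * rng.lognormal(0.0, [0.05, 0.08, 0.10, 0.40], size=(8, 4))
    print(dict(zip(["eef1b", "ubq", "polr2f", "unstable"], genorm_m(expr).round(3))))
    ```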

  16. Evaluation and Validation of Reference Genes for qRT-PCR Normalization in Frankliniella occidentalis (Thysanoptera: Thripidae)

    PubMed Central

    Zheng, Yu-Tao; Li, Hong-Bo; Lu, Ming-Xing; Du, Yu-Zhou

    2014-01-01

    Quantitative real time PCR (qRT-PCR) has emerged as a reliable and reproducible technique for gene expression analysis. For accurate results, the normalization of data with reference genes is particularly essential. Once the transcriptome sequencing of Frankliniella occidentalis was completed, numerous unigenes were identified and annotated. Unfortunately, there are no studies on the stability of reference genes used in F. occidentalis. In this work, seven candidate reference genes, including actin, 18S rRNA, H3, tubulin, GAPDH, EF-1 and RPL32, were evaluated for their suitability as normalization genes under different experimental conditions using the statistical software programs BestKeeper, geNorm, Normfinder and the comparative ΔCt method. Because the rankings of the reference genes provided by each of the four programs were different, we chose the user-friendly web-based comprehensive tool RefFinder to get the final ranking. The result demonstrated that EF-1 and RPL32 displayed the most stable expression in different developmental stages; RPL32 and GAPDH showed the most stable expression at high temperatures, while 18S and EF-1 exhibited the most stable expression at low temperatures. In this study, we validated the suitable reference genes in F. occidentalis for gene expression profiling under different experimental conditions. The choice of internal standard is very important in the normalization of target gene expression levels; thus, validating and selecting the best genes will help improve the quality of gene expression data of F. occidentalis. Moreover, these validated reference genes could serve as the basis for the selection of candidate reference genes in other insects. PMID:25356721

  17. Evaluation and validation of reference genes for qRT-PCR normalization in Frankliniella occidentalis (Thysanoptera: Thripidae).

    PubMed

    Zheng, Yu-Tao; Li, Hong-Bo; Lu, Ming-Xing; Du, Yu-Zhou

    2014-01-01

    Quantitative real time PCR (qRT-PCR) has emerged as a reliable and reproducible technique for gene expression analysis. For accurate results, the normalization of data with reference genes is particularly essential. Once the transcriptome sequencing of Frankliniella occidentalis was completed, numerous unigenes were identified and annotated. Unfortunately, there are no studies on the stability of reference genes used in F. occidentalis. In this work, seven candidate reference genes, including actin, 18S rRNA, H3, tubulin, GAPDH, EF-1 and RPL32, were evaluated for their suitability as normalization genes under different experimental conditions using the statistical software programs BestKeeper, geNorm, Normfinder and the comparative ΔCt method. Because the rankings of the reference genes provided by each of the four programs were different, we chose the user-friendly web-based comprehensive tool RefFinder to get the final ranking. The result demonstrated that EF-1 and RPL32 displayed the most stable expression in different developmental stages; RPL32 and GAPDH showed the most stable expression at high temperatures, while 18S and EF-1 exhibited the most stable expression at low temperatures. In this study, we validated the suitable reference genes in F. occidentalis for gene expression profiling under different experimental conditions. The choice of internal standard is very important in the normalization of target gene expression levels; thus, validating and selecting the best genes will help improve the quality of gene expression data of F. occidentalis. Moreover, these validated reference genes could serve as the basis for the selection of candidate reference genes in other insects.

  18. Reference gene selection for quantitative real-time PCR in Solanum lycopersicum L. inoculated with the mycorrhizal fungus Rhizophagus irregularis.

    PubMed

    Fuentes, Alejandra; Ortiz, Javier; Saavedra, Nicolás; Salazar, Luis A; Meneses, Claudio; Arriagada, Cesar

    2016-04-01

    The gene expression stability of candidate reference genes in the roots and leaves of Solanum lycopersicum inoculated with arbuscular mycorrhizal fungi was investigated. Eight candidate reference genes including elongation factor 1 α (EF1), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), phosphoglycerate kinase (PGK), protein phosphatase 2A (PP2Acs), ribosomal protein L2 (RPL2), β-tubulin (TUB), ubiquitin (UBI) and actin (ACT) were selected, and their expression stability was assessed to determine the most stable internal reference for quantitative PCR normalization in S. lycopersicum inoculated with the arbuscular mycorrhizal fungus Rhizophagus irregularis. The stability of each gene was analysed in leaves and roots together and separately using the geNorm and NormFinder algorithms. Differences were detected between leaves and roots, varying among the best-ranked genes depending on the algorithm used and the tissue analysed. PGK, TUB and EF1 genes showed higher stability in roots, while EF1 and UBI had higher stability in leaves. Statistical algorithms indicated that the GAPDH gene was the least stable under the experimental conditions assayed. Then, we analysed the expression levels of the LePT4 gene, a phosphate transporter whose expression is induced by fungal colonization in host plant roots. No differences were observed when the most stable genes were used as reference genes. However, when GAPDH was used as the reference gene, we observed an overestimation of LePT4 expression. In summary, our results revealed that candidate reference genes present variable stability in S. lycopersicum arbuscular mycorrhizal symbiosis depending on the algorithm and tissue analysed. Thus, reference gene selection is an important issue for obtaining reliable results in gene expression quantification. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  19. Statistical tables and charts showing geochemical variation in the Mesoproterozoic Big Creek, Apple Creek, and Gunsight formations, Lemhi group, Salmon River Mountains and Lemhi Range, central Idaho

    USGS Publications Warehouse

    Lindsey, David A.; Tysdal, Russell G.; Taggart, Joseph E.

    2002-01-01

    The principal purpose of this report is to provide a reference archive for results of a statistical analysis of geochemical data for metasedimentary rocks of Mesoproterozoic age of the Salmon River Mountains and Lemhi Range, central Idaho. Descriptions of geochemical data sets, statistical methods, rationale for interpretations, and references to the literature are provided. Three methods of analysis are used: R-mode factor analysis of major oxide and trace element data for identifying petrochemical processes, analysis of variance for effects of rock type and stratigraphic position on chemical composition, and major-oxide ratio plots for comparison with the chemical composition of common clastic sedimentary rocks.

  20. Strip mine reclamation: criteria and methods for measurement of revegetation success. Progress report, April 1, 1980-March 31, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrel, J.E.; Kucera, C.L.; Johannsen, C.J.

    1980-12-01

    During this contract period, research continued on finding suitable methods and criteria for determining the success of revegetation on Midwestern prime agricultural lands strip mined for coal. Particularly important to the experimental design was the concept of reference areas, nearby fields from which the performance standards for reclaimed areas were derived. Direct and remote sensing techniques for measuring plant ground cover, production, and species composition were tested. Work was carried out at 15 mine sites permitted under interim permanent surface mine regulations and at 4 adjoining reference sites; studies at 9 prelaw sites were continued. All sites were in either Missouri or Illinois. Data gathered in the 1980 growing season showed that 13 unmanaged or young mineland pastures generally had lower average ground cover and production than 2 reference pastures. In contrast, yields at approximately 40% of 11 recently reclaimed mine sites planted with winter wheat, soybeans, or milo were statistically similar to the 3 reference values. Digital computer image analysis of color infrared aerial photographs, when compared to ground-level measurements, was a fast, accurate, and inexpensive way to determine plant ground cover and areas, but the remote sensing approach was inferior to standard surface methods for detailing plant species abundance and composition.

  1. A pilot study of clinical agreement in cardiovascular preparticipation examinations: how good is the standard of care?

    PubMed

    O'Connor, Francis G; Johnson, Jeremy D; Chapin, Mark; Oriscello, Ralph G; Taylor, Dean C

    2005-05-01

    To evaluate the interobserver agreement between physicians regarding an abnormal cardiovascular assessment on athletic preparticipation examinations. Cross-sectional clinical survey. Outpatient Clinic, United States Military Academy, West Point, NY. We randomly selected 101 out of 539 cadet-athletes presenting for a preparticipation examination. Two primary care sports medicine fellows and a cardiologist examined the cadets. After obtaining informed consent from all participants, all 3 physicians separately evaluated all 101 cadets. The physicians recorded their clinical findings and whether they thought further cardiovascular evaluation (echocardiography) was indicated. Rate of referral for further cardiovascular evaluation, clinical agreement between sports medicine fellows, and clinical agreement between sports medicine fellows and the cardiologist. Each fellow referred 6 of the 101 evaluated cadets (5.9%). The cardiologist referred none. Although each fellow referred 6 cadets, only 1 cadet was referred by both. The kappa statistic for clinical agreement between fellows was 0.114 (95% CI, -0.182 to 0.411). There was no clinical agreement between the fellows and the cardiologist. This pilot study reveals a low level of agreement between physicians regarding which athletes with an abnormal examination deserved further testing. It challenges the standard of care and questions whether there is a need for improved technologies or improved training in cardiovascular clinical assessment.
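
    The reported kappa of 0.114 follows directly from the counts in the abstract (each fellow referred 6 of 101 cadets, with only 1 cadet referred by both); the sketch below reproduces it, with the specific cadet indices chosen arbitrarily.

    ```python
    import numpy as np

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters making binary refer / don't-refer decisions."""
        a = np.asarray(rater_a, bool)
        b = np.asarray(rater_b, bool)
        p_obs = np.mean(a == b)                                        # observed agreement
        p_exp = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
        return (p_obs - p_exp) / (1.0 - p_exp)

    # 101 cadets, each fellow refers 6, only 1 cadet referred by both.
    n = 101
    fellow_1 = np.zeros(n, bool); fellow_1[:6] = True      # refers cadets 0-5
    fellow_2 = np.zeros(n, bool); fellow_2[5:11] = True    # refers cadets 5-10
    print("kappa = %.3f" % cohens_kappa(fellow_1, fellow_2))   # approximately 0.114
    ```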

  2. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis. These data include historical records, a highly correlated secondary contaminant, or expert judgment. In environmental remediation, geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus.
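
    A toy sketch of the probability-of-exceedance mapping described above is given below; it substitutes inverse-distance weighting and a local Gaussian model for proper kriging with a fitted variogram, and all locations, concentrations and thresholds are hypothetical.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.spatial.distance import cdist

    def exceedance_map(xy_obs, z_obs, xy_grid, threshold, power=2.0):
        """Toy spatial estimate and probability-of-exceedance map.

        Uses inverse-distance weighting for the estimate and the weighted local
        spread as a stand-in for estimation uncertainty; a real geostatistical
        analysis would use kriging variances from a fitted variogram instead."""
        d = cdist(xy_grid, xy_obs) + 1e-9
        w = 1.0 / d ** power
        w /= w.sum(axis=1, keepdims=True)
        estimate = w @ z_obs
        spread = np.sqrt(np.maximum(w @ (z_obs ** 2) - estimate ** 2, 1e-12))
        prob_exceed = stats.norm.sf(threshold, loc=estimate, scale=spread)
        return estimate, prob_exceed

    # Hypothetical contaminant samples (ppm) at scattered locations on a 100 m square.
    rng = np.random.default_rng(5)
    xy_obs = rng.uniform(0, 100, size=(30, 2))
    z_obs = 50 + 30 * np.exp(-cdist(xy_obs, [[70, 70]]).ravel() / 25) + rng.normal(0, 5, 30)
    gx, gy = np.meshgrid(np.linspace(0, 100, 20), np.linspace(0, 100, 20))
    xy_grid = np.column_stack([gx.ravel(), gy.ravel()])
    est, p_exceed = exceedance_map(xy_obs, z_obs, xy_grid, threshold=70.0)
    # Candidate clean-up zone: grid cells whose exceedance probability is above an
    # acceptable risk level, e.g. 0.2.
    print("cells flagged for remediation:", int((p_exceed > 0.2).sum()))
    ```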

  3. Zinc levels in foods from southeastern Spain: relationship to daily dietary intake.

    PubMed

    Terrés, C; Navarro, M; Martín-Lagos, F; Giménez, R; López, H; López, M C

    2001-08-01

    The zinc content of 300 food and 79 beverage samples was determined using flame atomic absorption spectrometry. Sample recoveries, repeatability, and analyses of NIST and CBR-CEC reference materials demonstrated the reliability and accuracy of this technique. Mean zinc concentrations varied from 0.02 microg/ml in fresh water to 71.0 microg/g (fresh weight) in pork liver. The daily dietary intake of zinc for inhabitants of southeastern Spain was estimated to be 10.1 mg (5.5, 4.0, 0.5, and 0.1 mg Zn/day per person from foods of animal and vegetable origin, drinks, and other foods, respectively). Zinc levels found in high protein foods (meat, fish, milk products, eggs, dry fruits, cereals and legumes) were significantly higher than those found in food with a low protein content (vegetables, fruits and drinks) (p < 0.001). A significant linear correlation between zinc levels and the corresponding protein content of cereals, legumes and dry fruits was found (r = 0.754, p < 0.005). Zinc concentrations in milk samples were significantly modified by the thermal treatment (p < 0.001), and the skimming (p < 0.05) and calcium enrichment processes (p < 0.001). Shellfish zinc levels were also significantly higher than those measured in fish (p < 0.05). Mean zinc concentrations found in cheese were statistically higher than those determined in the remaining milk products (p < 0.001). Zinc levels measured in distilled beverages were also statistically lower than those found in fermented ones (p < 0.001).

  4. The Relationship between Particulate Pollution Levels in Australian Cities, Meteorology, and Landscape Fire Activity Detected from MODIS Hotspots

    PubMed Central

    Price, Owen F.; Williamson, Grant J.; Henderson, Sarah B.; Johnston, Fay; Bowman, David M. J. S.

    2012-01-01

    Smoke from bushfires is an emerging issue for fire managers because of increasing evidence for its public health effects. Development of forecasting models to predict future pollution levels based on the relationship between bushfire activity and current pollution levels would be a useful management tool. As a first step, we use daily thermal anomalies detected by the MODIS Active Fire Product (referred to as “hotspots”), pollution concentrations, and meteorological data for the years 2002 to 2008, to examine the statistical relationship between fire activity in the landscapes and pollution levels around Perth and Sydney, two large Australian cities. Resultant models were statistically significant, but differed in their goodness of fit and the distance at which the strength of the relationship was strongest. For Sydney, a univariate model for hotspot activity within 100 km explained 24% of variation in pollution levels, and the best model including atmospheric variables explained 56% of variation. For Perth, the best radius was 400 km, explaining only 7% of variation, while the model including atmospheric variables explained 31% of the variation. Pollution was higher when the atmosphere was more stable and in the presence of on-shore winds, whereas there was no effect of wind blowing from the fires toward the pollution monitors. Our analysis shows there is a good prospect for developing region-specific forecasting tools combining hotspot fire activity with meteorological data. PMID:23071788

  5. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    PubMed

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

    To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children age 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels were statistically significant in the test groups [25(OH)D <10, <15 and <20 ng/mL] compared with the reference group [25(OH)D >25 ng/mL]. ANCOVA also showed that only children with severe vitamin D deficiency [25(OH)D <10 ng/mL] had significantly higher parathyroid hormone levels (Δ = 15; P = .0334). Hockey stick model regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
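
    The "hockey stick" model used above is a piecewise-linear regression with a single breakpoint. The following is a minimal sketch of fitting such a model with SciPy; the simulated 25(OH)D and blood-pressure arrays are hypothetical and are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def hockey_stick(x, bp, intercept, slope_left, slope_right):
    """Piecewise-linear model with a single breakpoint bp (continuous at bp)."""
    return np.where(x < bp,
                    intercept + slope_left * (x - bp),
                    intercept + slope_right * (x - bp))

# Hypothetical 25(OH)D levels (ng/mL) and systolic blood pressure (mm Hg).
rng = np.random.default_rng(0)
vit_d = rng.uniform(5, 40, size=140)
sbp = np.where(vit_d < 27, 130 - 0.8 * (vit_d - 27), 130) + rng.normal(0, 4, 140)

# Initial guesses: breakpoint mid-range, flat segment to the right of it.
p0 = [20.0, 120.0, -1.0, 0.0]
params, _ = curve_fit(hockey_stick, vit_d, sbp, p0=p0)
print("estimated breakpoint: %.1f ng/mL" % params[0])
```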

  6. 49 CFR Appendix B to Part 24 - Statistical Report Form

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false Statistical Report Form B Appendix B to Part 24... ACQUISITION FOR FEDERAL AND FEDERALLY-ASSISTED PROGRAMS Pt. 24, App. B Appendix B to Part 24—Statistical... entries in Parts of this section A, B and C to the nearest dollar. 6. Regulatory references. The...

  7. 2011 statistical abstract of the United States

    USGS Publications Warehouse

    Krisanda, Joseph M.

    2011-01-01

    The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States.


    Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web.


    Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.

  8. Validation of a modification to Performance-Tested Method 070601: Reveal Listeria Test for detection of Listeria spp. in selected foods and selected environmental samples.

    PubMed

    Alles, Susan; Peng, Linda X; Mozola, Mark A

    2009-01-01

    A modification to Performance-Tested Method (PTM) 070601, Reveal Listeria Test (Reveal), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there was a statistically significant difference in performance between the Reveal and reference culture [U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA/BAM) or U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS)] methods for only a single food in one trial (pasteurized crab meat) at the 27 h enrichment time point, with more positive results obtained with the FDA/BAM reference method. No foods showed statistically significant differences in method performance at the 30 h time point. Independent laboratory testing of 3 foods again produced a statistically significant difference in results for crab meat at the 27 h time point; otherwise results of the Reveal and reference methods were statistically equivalent. Overall, considering both internal and independent laboratory trials, sensitivity of the Reveal method relative to the reference culture procedures in testing of foods was 85.9% at 27 h and 97.1% at 30 h. Results from 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the Reveal method was more productive than the reference USDA-FSIS culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the Reveal method at the 24 h time point. Overall, sensitivity of the Reveal method at 24 h relative to that of the USDA-FSIS method was 153%. The Reveal method exhibited extremely high specificity, with only a single false-positive result in all trials combined for overall specificity of 99.5%.

  9. Statistical Analyses for Probabilistic Assessments of the Reactor Pressure Vessel Structural Integrity: Building a Master Curve on an Extract of the 'Euro' Fracture Toughness Dataset, Controlling Statistical Uncertainty for Both Mono-Temperature and multi-temperature tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick

    2006-07-01

    Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K_IC based on the indexing parameter RT_NDT. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K_JC data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic French reactor vessel metals of types A508 Class 3 and A533 Grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T_0. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)
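
    For orientation, the Master Curve expresses the median cleavage fracture toughness of ferritic steels as a function of temperature relative to the reference temperature T_0; in the ASTM E1921 formulation the median for 1T specimens is commonly written K_Jc,med(T) = 30 + 70*exp[0.019*(T - T_0)] in MPa*sqrt(m), with temperatures in degrees Celsius. The sketch below simply inverts that relation for a mono-temperature dataset using the sample median; it is not the censoring-aware maximum-likelihood procedure of the standard or of the paper, and the toughness values are hypothetical.

```python
import numpy as np

def t0_from_median(test_temp_c, kjc_values):
    """Rough T0 estimate from toughness data measured at a single temperature.

    Inverts the Master Curve median relation
        K_Jc,med(T) = 30 + 70 * exp(0.019 * (T - T0))   [MPa*sqrt(m), deg C]
    using the sample median as a stand-in for K_Jc,med. This ignores the size
    adjustment, censoring, and maximum-likelihood weighting required by
    ASTM E1921, so it is illustrative only.
    """
    kjc_med = np.median(kjc_values)
    return test_temp_c - np.log((kjc_med - 30.0) / 70.0) / 0.019

# Hypothetical 1T-equivalent K_Jc data (MPa*sqrt(m)) measured at -60 deg C.
kjc = np.array([78.0, 95.0, 110.0, 102.0, 88.0, 120.0])
print("approximate T0: %.1f deg C" % t0_from_median(-60.0, kjc))
```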

  10. A One-year Follow-up Study of a Tapered Hydrophilic Implant Design Using Various Placement Protocols in the Maxilla

    PubMed Central

    Zwaan, Jakob; Vanden Bogaerde, Leonardo; Sahlin, Herman; Sennerby, Lars

    2016-01-01

    Purpose: To study the clinical/radiographic outcomes and stability of a tapered implant design with a hydrophilic surface when placed in the maxilla using various protocols and followed for one year. Methods: Ninety-seven consecutive patients treated as part of daily routine in two clinics with 163 tapered implants in healed sites, in extraction sockets, and together with bone augmentation procedures in the maxilla were evaluated after one year in function. Individual healing periods varying from 0 to 6 months had been used. Insertion torque (IT) and resonance frequency analysis (RFA) measurements were made at baseline. Follow-up RFA registrations were made after 6 and 12 months of loading. The marginal bone levels were measured in intraoral radiographs from baseline and after 12 months. A reference group consisting of 163 consecutive straight maxillary implants was used for the comparison of baseline IT and RFA measurements. Results: Five implants failed before loading, giving an implant survival rate of 96.9% and a prosthesis survival rate of 99.4% after one year. The mean marginal bone loss after one year was 0.5 mm (SD 0.4). The mean IT was statistically significantly higher for tapered than for straight reference implants (41.3 ± 12.0 Ncm vs 33.6 ± 12.5 Ncm, p < 0.001). The tapered implants showed a higher mean ISQ value than the straight reference implants, but the difference was not statistically significant (73.7 ± 6.4 ISQ vs 72.2 ± 8.0 ISQ, p = 0.119). There was no correlation between IT and marginal bone loss. There was a correlation between IT and RFA measurements (p < 0.001). Conclusion: The tapered implant showed a high survival rate and minimal marginal bone loss after one year in function when using various protocols for placement. The tapered implant showed significantly higher insertion torque values than straight reference implants. PMID:28077972

  11. Influence of Ultra-Low-Dose and Iterative Reconstructions on the Visualization of Orbital Soft Tissues on Maxillofacial CT.

    PubMed

    Widmann, G; Juranek, D; Waldenberger, F; Schullian, P; Dennhardt, A; Hoermann, R; Steurer, M; Gassner, E-M; Puelacher, W

    2017-08-01

    Dose reduction on CT scans for surgical planning and postoperative evaluation of midface and orbital fractures is an important concern. The purpose of this study was to evaluate the variability of various low-dose and iterative reconstruction techniques on the visualization of orbital soft tissues. Contrast-to-noise ratios of the optic nerve and inferior rectus muscle and subjective scores of a human cadaver were calculated from CT with a reference dose protocol (CT dose index volume = 36.69 mGy) and a subsequent series of low-dose protocols (LDPs I-IV: CT dose index volume = 4.18, 2.64, 0.99, and 0.53 mGy) with filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR)-50, ASIR-100, and model-based iterative reconstruction. The Dunn Multiple Comparison Test was used to compare each combination of protocols (α = .05). Compared with the reference dose protocol with FBP, statistically significant differences in contrast-to-noise ratios were shown (all, P ≤ .012) for the following: 1) optic nerve: LDP-I with FBP; LDP-II with FBP and ASIR-50; LDP-III with FBP, ASIR-50, and ASIR-100; and LDP-IV with FBP, ASIR-50, and ASIR-100; and 2) inferior rectus muscle: LDP-II with FBP, LDP-III with FBP and ASIR-50, and LDP-IV with FBP, ASIR-50, and ASIR-100. Model-based iterative reconstruction showed the best contrast-to-noise ratio in all images and provided similar subjective scores for LDP-II. ASIR-50 had no remarkable effect, and ASIR-100, a small effect on subjective scores. Compared with a reference dose protocol with FBP, model-based iterative reconstruction may show similar diagnostic visibility of orbital soft tissues at a CT dose index volume of 2.64 mGy. Low-dose technology and iterative reconstruction technology may redefine current reference dose levels in maxillofacial CT. © 2017 by American Journal of Neuroradiology.
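
    Contrast-to-noise ratio conventions vary between studies; one common form divides the absolute difference between the mean attenuation in the structure of interest and in a background region by the standard deviation of the background. A minimal sketch with hypothetical region-of-interest values (not necessarily the study's definition, and not its data):

```python
import numpy as np

def contrast_to_noise(roi_structure, roi_background):
    """CNR as |mean(structure) - mean(background)| / std(background).

    One of several common conventions; the study's exact definition may differ.
    """
    return abs(np.mean(roi_structure) - np.mean(roi_background)) / np.std(roi_background, ddof=1)

# Hypothetical HU samples from an optic-nerve ROI and adjacent orbital fat.
optic_nerve = np.array([38.0, 41.0, 36.5, 39.2, 40.1])
orbital_fat = np.array([-82.0, -79.5, -85.0, -80.2, -83.3])
print("CNR = %.1f" % contrast_to_noise(optic_nerve, orbital_fat))
```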

  12. Sociology: A Student's Guide to Reference Sources.

    ERIC Educational Resources Information Center

    Waiser, Joni, Comp.

    This guide lists selective reference sources which are useful for research in sociology. The guide is arranged by document type: guides, dictionaries, encyclopedias, directories and biographical sources, statistics, book reviews, theses and dissertations, general social science bibliographies, sociology bibliographies, special subject…

  13. Can natural variability trigger effects on fish and fish habitat as defined in environment Canada's metal mining environmental effects monitoring program?

    PubMed

    Mackey, Robin; Rees, Cassandra; Wells, Kelly; Pham, Samantha; England, Kent

    2013-01-01

    The Metal Mining Effluent Regulations (MMER) took effect in 2002 and require most metal mining operations in Canada to complete environmental effects monitoring (EEM) programs. An "effect" under the MMER EEM program is considered any positive or negative statistically significant difference in fish population, fish usability, or benthic invertebrate community EEM-defined endpoints. Two consecutive studies with the same statistically significant differences trigger more intensive monitoring, including the characterization of extent and magnitude and investigation of cause. Standard EEM study designs do not require multiple reference areas or preexposure sampling, thus results and conclusions about mine effects are highly contingent on the selection of a near perfect reference area and are at risk of falsely labeling natural variation as mine related "effects." A case study was completed to characterize the natural variability in EEM-defined endpoints during preexposure or baseline conditions. This involved completing a typical EEM study in future reference and exposure lakes surrounding a proposed uranium (U) mine in northern Saskatchewan, Canada. Moon Lake was sampled as the future exposure area as it is currently proposed to receive effluent from the U mine. Two reference areas were used: Slush Lake for both the fish population and benthic invertebrate community surveys and Lake C as a second reference area for the benthic invertebrate community survey. Moon Lake, Slush Lake, and Lake C are located in the same drainage basin in close proximity to one another. All 3 lakes contained similar water quality, fish communities, aquatic habitat, and a sediment composition largely comprised of fine-textured particles. The fish population survey consisted of a nonlethal northern pike (Esox lucius) and a lethal yellow perch (Perca flavescens) survey. A comparison of the 5 benthic invertebrate community effect endpoints, 4 nonlethal northern pike population effect endpoints, and 10 lethal yellow perch effect endpoints resulted in the observation of several statistically significant differences at the future exposure area relative to the reference area and/or areas. When the data from 2 reference areas assessed for the benthic invertebrate community survey were pooled, no significant differences in effect endpoints were observed. These results demonstrate weaknesses in the definition of an "effect" used by the MMER EEM program and in the use of a single reference area. Determination of the ecological significance of statistical differences identified as part of EEM programs conducted during the operational period should consider preexisting (background) natural variability between reference and exposure areas. Copyright © 2012 SETAC.

  14. Systematic identification of human housekeeping genes possibly useful as references in gene expression studies.

    PubMed

    Caracausi, Maria; Piovesan, Allison; Antonaros, Francesca; Strippoli, Pierluigi; Vitale, Lorenza; Pelleri, Maria Chiara

    2017-09-01

    The ideal reference, or control, gene for the study of gene expression in a given organism should be expressed at a medium-high level for easy detection, should be expressed at a constant/stable level throughout different cell types and within the same cell type undergoing different treatments, and should maintain these features across as many different tissues of the organism as possible. From a biological point of view, these theoretical requirements of an ideal reference gene appear to be best suited to housekeeping (HK) genes. Recent advancements in the quality and completeness of human expression microarray data and in their statistical analysis may provide new clues toward the quantitative standardization of human gene expression studies in biology and medicine, both cross- and within-tissue. The systematic approach used by the present study is based on the Transcriptome Mapper tool and exploits the automated reassignment of probes to corresponding genes, intra- and inter-sample normalization, and the elaboration and representation of gene expression values in linear form within an indexed and searchable database with a graphical interface recording quantitative levels of expression, expression variability and cross-tissue width of expression for more than 31,000 transcripts. The present study conducted a meta-analysis of a pool of 646 expression profile data sets from 54 different human tissues and identified actin γ 1 as the HK gene that best fits the combination of all the traditional criteria to be used as a reference gene for general use; two ribosomal protein genes, RPS18 and RPS27, and one nucleoporin gene, POM121 transmembrane nucleoporin C, were also identified. The present study provided a list of tissue- and organ-specific genes that may be most suited for the following individual tissues/organs: adipose tissue, bone marrow, brain, heart, kidney, liver, lung, ovary, skeletal muscle and testis; it also provides in these cases a representative, quantitative portrait of the relative, typical gene-expression profile in the form of searchable database tables.

  15. C-statistic fitting routines: User's manual and reference guide

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Farwana, Vida

    1991-01-01

    The computer program discussed can read several input files and provide a best-fit set of values for the functions provided by the user, using either the C-statistic or the chi-square (chi^2) statistic method. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the reason for its application is also presented.
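
    The C-statistic in this context is the Cash statistic for Poisson-distributed counts; in its original form, C = 2*sum_i(m_i - d_i*ln m_i), where d_i are observed counts and m_i the model predictions (later variants add data-only terms so that C behaves like chi-square in the large-count limit). A minimal sketch with hypothetical counts and a one-parameter constant model, not the routines described in the manual:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cash_statistic(observed, model):
    """Original Cash (1979) statistic: C = 2 * sum(m - d * ln m)."""
    observed = np.asarray(observed, dtype=float)
    model = np.asarray(model, dtype=float)
    return 2.0 * np.sum(model - observed * np.log(model))

# Hypothetical Poisson counts in 10 spectral bins, fitted with a constant model.
counts = np.array([3, 5, 2, 4, 6, 1, 3, 2, 4, 5])

result = minimize_scalar(
    lambda rate: cash_statistic(counts, np.full_like(counts, rate, dtype=float)),
    bounds=(0.1, 20.0), method="bounded")

# For a constant model the C-statistic minimum lands at the sample mean.
print("best-fit rate per bin: %.2f" % result.x)
```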

  16. Army Air Forces Statistical Digest, 1946. First Annual Number

    DTIC Science & Technology

    1947-06-01

    accordance with AAF Letter 5-5 dated 28 April 1947, the Army Air Forces Statistical Digest has been designated as the official AAF statistical yearbook...more detailed exposition. Since the Digest is designed primarily as a reference manual, the reaction of users to its contents is important in the...distributed by this Headquarters (Statistical Control Division, Office of the Air Comptroller) is hereby designated as the official AAF statistical yearbook

  17. A statistically robust EEG re-referencing procedure to mitigate reference effect

    PubMed Central

    Lepage, Kyle Q.; Kramer, Mark A.; Chu, Catherine J.

    2014-01-01

    Background The electroencephalogram (EEG) remains the primary tool for diagnosis of abnormal brain activity in clinical neurology and for in vivo recordings of human neurophysiology in neuroscience research. In EEG data acquisition, voltage is measured at positions on the scalp with respect to a reference electrode. When this reference electrode responds to electrical activity or artifact all electrodes are affected. Successful analysis of EEG data often involves re-referencing procedures that modify the recorded traces and seek to minimize the impact of reference electrode activity upon functions of the original EEG recordings. New method We provide a novel, statistically robust procedure that adapts a robust maximum-likelihood type estimator to the problem of reference estimation, reduces the influence of neural activity from the re-referencing operation, and maintains good performance in a wide variety of empirical scenarios. Results The performance of the proposed and existing re-referencing procedures are validated in simulation and with examples of EEG recordings. To facilitate this comparison, channel-to-channel correlations are investigated theoretically and in simulation. Comparison with existing methods The proposed procedure avoids using data contaminated by neural signal and remains unbiased in recording scenarios where physical references, the common average reference (CAR) and the reference estimation standardization technique (REST) are not optimal. Conclusion The proposed procedure is simple, fast, and avoids the potential for substantial bias when analyzing low-density EEG data. PMID:24975291
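
    The general idea of a statistically robust re-reference can be sketched as follows: instead of subtracting the plain cross-channel mean at each time sample (as in the common average reference), subtract a robust location estimate that down-weights channels carrying large signal or artifact at that instant. The code below uses a simple iteratively reweighted Huber location estimate on simulated data; it illustrates the robust-estimation idea rather than reproducing the exact estimator proposed in the paper.

```python
import numpy as np

def huber_location(x, k=1.345, n_iter=20):
    """Iteratively reweighted Huber M-estimate of location for a 1-D array."""
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - mu)) + 1e-12  # MAD-based robust scale
    for _ in range(n_iter):
        r = np.abs(x - mu) / scale
        w = np.ones_like(r)
        big = r > k
        w[big] = k / r[big]                 # Huber weights: down-weight outliers
        mu = np.sum(w * x) / np.sum(w)
    return mu

def robust_rereference(eeg):
    """Subtract a robust cross-channel reference estimate at each time sample.

    eeg: array of shape (n_channels, n_samples).
    """
    ref = np.apply_along_axis(huber_location, 0, eeg)  # one estimate per sample
    return eeg - ref

# Simulated example: 32 channels, 1000 samples, one channel with a gross artifact
# that would bias a plain common average reference.
rng = np.random.default_rng(1)
eeg = rng.normal(0.0, 1.0, size=(32, 1000))
eeg[5] += 50.0
rereferenced = robust_rereference(eeg)
print(rereferenced.shape)
```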

  18. Prevalence of osteoporosis in Australian women: Geelong Osteoporosis Study.

    PubMed

    Henry, M J; Pasco, J A; Nicholson, G C; Seeman, E; Kotowicz, M A

    2000-01-01

    To evaluate the prevalence of osteoporosis at various sites among Australian women, cross-sectional bone mineral density (BMD) data for adult females were obtained from an age-stratified population-based sample (n = 1494; 20-94 yr) drawn at random from the Barwon Statistical Division, a population characteristic of Australia. Age- and weight- (and for three sites, height) matched reference ranges for BMD at the lumbar spine, proximal femur, forearm, and total body were developed using regression techniques. According to the World Health Organization (WHO) guidelines, the cutoff BMD level for osteoporosis was 0.917 g/cm(2) at the PA spine and 0.713 g/cm(2) at the femoral neck. The upper cutoff level for osteopenia was 1.128 g/cm(2) at the PA spine and 0.913 g/cm(2) for the femoral neck. The proportion of Australian women categorized as having osteoporosis at the PA spine, femoral neck, or midforearm ranged from 0.9% among those aged 40-44 yr to 87.0% for those older than 79 yr. This study provides reference data representative of the Australian female population. A large proportion of elderly Australian women has osteoporosis according to the WHO guidelines.

  19. Validation of endogenous internal real-time PCR controls in renal tissues.

    PubMed

    Cui, Xiangqin; Zhou, Juling; Qiu, Jing; Johnson, Martin R; Mrug, Michal

    2009-01-01

    Endogenous internal controls ('reference' or 'housekeeping' genes) are widely used in real-time PCR (RT-PCR) analyses. Their use relies on the premise of consistently stable expression across studied experimental conditions. Unfortunately, none of these controls fulfills this premise across a wide range of experimental conditions; consequently, none of them can be recommended for universal use. To determine which endogenous RT-PCR controls are suitable for analyses of renal tissues altered by kidney disease, we studied the expression of 16 commonly used 'reference genes' in 7 mildly and 7 severely affected whole kidney tissues from a well-characterized cystic kidney disease model. Expression levels of these 16 genes, determined by TaqMan RT-PCR analyses and Affymetrix GeneChip arrays, were normalized and tested for overall variance and equivalence of the means. Both statistical approaches and both TaqMan- and GeneChip-based methods converged on 3 out of the 4 top-ranked genes (Ppia, Gapdh and Pgk1) that had the most constant expression levels across the studied phenotypes. A combination of the top-ranked genes will provide a suitable endogenous internal control for similar studies of kidney tissues across a wide range of disease severity. Copyright 2009 S. Karger AG, Basel.

  20. Oxidative stress and lipid peroxidation in prolonged users of methamphetamine.

    PubMed

    Solhi, Hassan; Malekirad, Aliakbar; Kazemifar, Amir Mohammad; Sharifi, Farzaneh

    2014-07-01

    Methamphetamine abuse results in numerous adverse health effects. Formation of free radicals may be a contributing factor; methamphetamine has produced free radicals in animal studies. The present study was conducted to evaluate the status of oxidative stress and lipid peroxidation among chronic methamphetamine users. Ninety-six individuals were selected randomly from methamphetamine abusers who had been referred to a rehabilitation and treatment center for drug abuse, and from their close relatives, after providing informed consent. Blood samples were taken from each of the studied individuals. The ferric reducing ability of plasma (FRAP) assay and the serum level of MDA (malondialdehyde) were used to assess the total antioxidant power and the status of lipid peroxidation of the body, respectively. The results were analyzed with SPSS software version 16.0; differences among groups were determined by t-test. Total antioxidant power of plasma was 0.31±0.04 micromoles/liter in methamphetamine abusers and 0.46±0.05 micromoles/liter in the control group; the difference was statistically significant (p-value = 0.04). Levels of MDA were 4.38±5.05 micromoles/liter in methamphetamine abusers and 1.72±2.04 micromoles/liter in the control group; the difference was statistically significant (p-value = 0.01). The results of the present study suggest that prolonged use of methamphetamine exerts oxidative stress on the body and enhances lipid peroxidation. This may contribute to the emergence of adverse effects of acute and prolonged methamphetamine use, such as loss of attention, psychomotor dysfunction, and cognitive deficits. It is recommended that antioxidants be included in drug regimens prescribed for methamphetamine abusers who are referred to physicians for medical care for any reason.

  1. The Relationship between TOC and pH with Exchangeable Heavy Metal Levels in Lithuanian Podzols

    NASA Astrophysics Data System (ADS)

    Khaledian, Yones; Pereira, Paulo; Brevik, Eric C.; Pundyte, Neringa; Paliulis, Dainius

    2017-04-01

    Heavy metals can have a negative impact on public and environmental health. The objective of this study was to investigate the relationship between total organic carbon (TOC) and pH with exchangeable heavy metals (Pb, Cd, Cu and Zn) in order to predict exchangeable heavy metal content in soils sampled near Panevėžys and Kaunas, Lithuania. Principal component regression (PCR) and nonlinear regression methods were tested to find the statistical relationship between TOC and pH and the heavy metals. The results of PCR [R2 = 0.68, RMSE = 0.07] and non-linear regression [R2 = 0.74, RMSE = 0.065] (pH with TOC and exchangeable parameters) were statistically significant. However, this was not observed in the relationships of pH and TOC separately with exchangeable heavy metals. The results indicated that pH had a higher correlation with exchangeable heavy metals (non-linear regression [R2 = 0.72, RMSE = 0.066]) than TOC did [R2 = 0.30, RMSE = 0.004]. It can be concluded that even though there was a strong relationship between TOC and pH with exchangeable metals, metal mobility (exchangeable metals) can be explained better by pH than by TOC in this study. Finally, when financial and time limitations exist, manipulating soil pH is likely to be a productive way to assess and control heavy metals (Khaledian et al. 2016). Reference: Khaledian Y, Pereira P, Brevik E.C, Pundyte N, Paliulis D. 2016. The Influence of Organic Carbon and pH on Heavy Metals, Potassium, and Magnesium Levels in Lithuanian Podzols. Land Degradation and Development. DOI: 10.1002/ldr.2638
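
    Principal component regression, as used above, projects the standardized predictors onto their leading principal components and then regresses the response on those components. A minimal scikit-learn sketch with hypothetical soil measurements (not the study's data); with only two predictors all components are kept here, whereas in practice fewer components than predictors would normally be retained.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical predictors (pH, TOC %) and response (exchangeable Pb, mg/kg).
rng = np.random.default_rng(2)
ph = rng.uniform(4.0, 7.5, 60)
toc = rng.uniform(0.5, 6.0, 60)
pb = 12.0 - 1.5 * ph + 0.4 * toc + rng.normal(0, 0.8, 60)

X = np.column_stack([ph, toc])

# Standardize, project onto principal components, regress on the scores.
pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, pb)
print("R^2 = %.2f" % pcr.score(X, pb))
```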

  2. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  3. Spatial Statistical Network Models for Stream and River Temperature in the Chesapeake Bay Watershed, USA

    EPA Science Inventory

    Regional temperature models are needed for characterizing and mapping stream thermal regimes, establishing reference conditions, predicting future impacts and identifying critical thermal refugia. Spatial statistical models have been developed to improve regression modeling techn...

  4. Evaluation of 3M™ Molecular Detection Assay (MDA) Listeria for the Detection of Listeria species in Selected Foods and Environmental Surfaces: Collaborative Study, First Action 2014.06.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Monteroso, Lisa; Benesh, DeAnn

    2015-01-01

    The 3M™ Molecular Detection Assay (MDA) Listeria is used with the 3M Molecular Detection System for the detection of Listeria species in food, food-related, and environmental samples after enrichment. The assay utilizes loop-mediated isothermal amplification to rapidly amplify Listeria target DNA with high specificity and sensitivity, combined with bioluminescence to detect the amplification. The 3M MDA Listeria method was evaluated using an unpaired study design in a multilaboratory collaborative study and compared to the AOAC Official Methods of Analysis (OMA) 993.12 Listeria monocytogenes in Milk and Dairy Products reference method for the detection of Listeria species in full-fat (4% milk fat) cottage cheese (25 g test portions). A total of 15 laboratories located in the continental United States and Canada participated. Each matrix had three inoculation levels: an uninoculated control level (0 CFU/test portion), and two levels artificially contaminated with Listeria monocytogenes, a low inoculum level (0.2-2 CFU/test portion) and a high inoculum level (2-5 CFU/test portion) using non-heat-stressed cells. In total, 792 unpaired replicate portions were analyzed. Statistical analysis was conducted according to the probability of detection (POD) model. Results obtained for the low inoculum level test portions produced a difference in cross-laboratory POD value of -0.07 with a 95% confidence interval of (-0.19, 0.06). No statistically significant differences were observed in the number of positive samples detected by the 3M MDA Listeria method versus the AOAC OMA method.
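
    For a given inoculation level, the probability of detection (POD) is simply the fraction of replicate test portions found positive, and the headline comparison is the difference in cross-laboratory POD between candidate and reference methods with a 95% confidence interval. The sketch below computes the point estimate with a plain normal-approximation (Wald) interval on hypothetical counts; the AOAC POD model uses its own interval construction, so this is illustrative only.

```python
import math

def pod(positives, total):
    """Probability of detection: fraction of replicate test portions positive."""
    return positives / total

def dpod_wald_ci(pos_a, n_a, pos_b, n_b, z=1.96):
    """Difference in POD (method A minus method B) with a simple Wald 95% CI.

    Illustrative only; not the AOAC POD-model interval.
    """
    pa, pb = pod(pos_a, n_a), pod(pos_b, n_b)
    se = math.sqrt(pa * (1.0 - pa) / n_a + pb * (1.0 - pb) / n_b)
    d = pa - pb
    return d, d - z * se, d + z * se

# Hypothetical low-level results: candidate method 48/120 positive, reference 55/120.
d, lo, hi = dpod_wald_ci(48, 120, 55, 120)
print("dPOD = %.2f, 95%% CI (%.2f, %.2f)" % (d, lo, hi))
```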

  5. Assessing the effect of land use change on catchment runoff by combined use of statistical tests and hydrological modelling: Case studies from Zimbabwe

    NASA Astrophysics Data System (ADS)

    Lørup, Jens Kristian; Refsgaard, Jens Christian; Mazvimavi, Dominic

    1998-03-01

    The purpose of this study was to identify and assess long-term impacts of land use change on catchment runoff in semi-arid Zimbabwe, based on analyses of long hydrological time series (25-50 years) from six medium-sized (200-1000 km2) non-experimental rural catchments. A methodology combining common statistical methods with hydrological modelling was adopted in order to distinguish between the effects of climate variability and the effects of land use change. The hydrological model (NAM) was in general able to simulate the observed hydrographs very well during the reference period, thus providing a means to account for the effects of climate variability and hence strengthening the power of the subsequent statistical tests. In the test period the validated model was used to provide the runoff record which would have occurred in the absence of land use change. The analyses indicated a decrease in the annual runoff for most of the six catchments, with the largest changes occurring for catchments located within communal land, where large increases in population and agricultural intensity have taken place. However, the decrease was only statistically significant at the 5% level for one of the catchments.

  6. The Effects of Aerobic Exercises and 25(OH) D Supplementation on GLP1 and DPP4 Level in Type II Diabetic Patients.

    PubMed

    Rahimi, Naser; Samavati Sharif, Mohammad Ali; Goharian, Amir Reza; Pour, Ali Heidarian

    2017-01-01

    The purpose of this study was to investigate the effects of an 8-week aerobic exercise program and supplementation with 25(OH)D3 on GLP1 and DPP4 levels in men with type II diabetes. In this semi-experimental study, 48 patients volunteered from among 40-60-year-old men with type II diabetes who were referred to the diabetic center of Isabn-E Maryam hospital in Isfahan; they were then randomly divided into 4 groups: an aerobic exercise group, an aerobic exercise with 25(OH)D supplement group, a 25(OH)D supplement group, and a control group. The aerobic exercise program was conducted for 8 weeks (3 sessions/week, each session 60 to 75 min at 60-80% HRmax). The supplement user group received 50,000 units of oral vitamin D once weekly for 8 weeks. GLP1, DPP4, and 25(OH)D levels were measured before and after the intervention. The data were statistically analyzed using ANCOVA and the least significant difference post hoc test. The ANCOVA results showed a significant difference in GLP1 and DPP4 levels between the aerobic exercise group and the control group, whereas the changes were not statistically significant between the 25(OH)D supplement group and the control group (P < 0.05). Aerobic exercise resulted in an increase in GLP1 level and a decrease in DPP4 level. However, consumption of the vitamin D supplement alone did not cause any changes in GLP1 and DPP4 levels, although it led to an increase in the 25-hydroxyvitamin D level.

  7. A study of mortality patterns at a tyre factory 1951-1985: a reference statistic dilemma.

    PubMed

    Veys, C A

    2004-08-01

    The general and cancer mortalities of rubber workers at a large tyre factory were studied in an area of marked regional variation in death rates. Three quinquennial intakes of male rubber workers engaged between January 1946 and December 1960 formed a composite cohort of 6454 men to be followed up. Over 99% were successfully traced by December 1985. The cohort analysis used both national and local rates as reference statistics for several causes. Between 1951 and 1985, a national standardized mortality ratio (SMRN) of 101 for all causes (based on 2556 deaths) was noted, whereas the local standardized mortality ratio (SMRL) was only 79. For all cancers, the figures were 115 (SMRN) and 93 (SMRL), for stomach cancer they were 137 (SMRN) and 84 (SMRL), and for lung cancer they were 121 (SMRN) and 94 (SMRL). No outright excesses against the national norm were observed for other cancers except for larynx, brain and central nervous system and thyroid cancer and the leukaemias. Excesses were statistically significant for cancer of the gallbladder and the bile ducts, for silicotuberculosis (SMRN = 1000) and for the pneumoconioses (SMRN = 706). Deaths from cerebrovascular diseases, chronic bronchitis and emphysema showed statistically significant deficits using either norm. These results from a large factory cohort study of rubber workers, followed for over three decades, demonstrate the marked discrepancy that can result from using only one reference statistic in areas of significant variation in mortality patterns.
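
    The reference-statistic dilemma follows directly from the arithmetic of the standardized mortality ratio: SMR = 100 x observed deaths / expected deaths, where the expected deaths are the cohort's person-years multiplied by the reference population's age-specific rates; change the reference rates and the same observed count yields a different SMR. A minimal sketch with hypothetical person-years and rates:

```python
import numpy as np

def smr(observed_deaths, person_years, reference_rates):
    """SMR = 100 * observed / expected; expected = sum(person-years * age-specific rate)."""
    expected = np.sum(np.asarray(person_years) * np.asarray(reference_rates))
    return 100.0 * observed_deaths / expected

# Hypothetical cohort person-years in three age bands, and two sets of reference
# rates (deaths per person-year): national versus local.
person_years = [12000.0, 18000.0, 9000.0]
national_rates = [0.002, 0.008, 0.030]
local_rates = [0.003, 0.010, 0.038]
observed = 520

print("SMR (national reference): %.0f" % smr(observed, person_years, national_rates))
print("SMR (local reference):    %.0f" % smr(observed, person_years, local_rates))
```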

  8. Spacecraft software training needs assessment research, appendices

    NASA Technical Reports Server (NTRS)

    Ratcliff, Shirley; Golas, Katharine

    1990-01-01

    The appendices to the previously reported study are presented: statistical data from task rating worksheets; SSD references; survey forms; fourth generation language, a powerful, long-term solution to maintenance cost; task list; methodology; SwRI's instructional systems development model; relevant research; and references.

  9. Evaluating compression or expansion of morbidity in Canada: trends in life expectancy and health-adjusted life expectancy from 1994 to 2010

    PubMed Central

    Colin, Steensma; Lidia, Loukine; Bernard, C. K. Choi

    2017-01-01

    Introduction: The objective of this study was to investigate whether morbidity in Canada, at the national and provincial levels, is compressing or expanding by tracking trends in life expectancy (LE) and health-adjusted life expectancy (HALE) from 1994 to 2010. “Compression” refers to a decrease in the proportion of life spent in an unhealthy state over time. It happens when HALE increases faster than LE. “Expansion” refers to an increase in the proportion of life spent in an unhealthy state that happens when HALE is stable or increases more slowly than LE. Methods: We estimated LE using mortality and population data from Statistics Canada. We took health-related quality of life (i.e. morbidity) data used to calculate HALE from the National Population Health Survey (1994–1999) and the Canadian Community Health Survey (2000–2010). We built abridged life tables for seven time intervals, covering the period 1994 to 2010 and corresponding to the year of each available survey cycle, for females and males, and for each of the 10 Canadian provinces. National and provincial trends were assessed at birth, and at ages 20 years and 65 years. Results: We observed an overall average annual increase in HALE that was statistically significant in both Canadian females and males at each of the three ages assessed, with the exception of females at birth. At birth, HALE increased an average of 0.2% (p = .08) and 0.3% (p < .001) annually for females and males respectively over the 1994 to 2010 period. At the national level for all three age groups, we observed a statistically nonsignificant average annual increase in the proportion of life spent in an unhealthy state, with the exception of men at age 65, who experienced a non-significant decrease. At the provincial level at birth, we observed a significant increase in proportion of life spent in an unhealthy state for Newfoundland and Labrador (NL) and Prince Edward Island (PEI). Conclusion: Our study did not detect a clear overall trend in compression or expansion of morbidity from 1994 to 2010 at the national level in Canada. However, our results suggested an expansion of morbidity in NL and PEI. Our study indicates the importance of continued tracking of the secular trends of life expectancy and HALE in Canada in order to verify the presence of compression or expansion of morbidity. Further study should be undertaken to understand what is driving the observed expansion of morbidity in NL and in PEI. PMID:28273034
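
    The compression/expansion criterion used above reduces to a simple comparison: the proportion of life expectancy spent in an unhealthy state is (LE - HALE)/LE, and morbidity is compressing when that proportion falls over time (HALE rising faster than LE). A minimal sketch with hypothetical values, not the study's estimates:

```python
def unhealthy_fraction(le, hale):
    """Proportion of life expectancy expected to be spent in an unhealthy state."""
    return (le - hale) / le

# Hypothetical national estimates at birth for two time points.
le_1994, hale_1994 = 78.0, 68.5
le_2010, hale_2010 = 81.2, 71.0

f0 = unhealthy_fraction(le_1994, hale_1994)
f1 = unhealthy_fraction(le_2010, hale_2010)
trend = "compression" if f1 < f0 else "expansion"
print("unhealthy fraction: %.3f -> %.3f (%s of morbidity)" % (f0, f1, trend))
```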

  10. A Catalogue of Data in the Statistical Information Centre, March 1976. (Catalogue de donnees du Centre d'information statistique, Mars 1976.)

    ERIC Educational Resources Information Center

    Department of Indian Affairs and Northern Development, Ottawa (Ontario).

    Over 189 materials which cover aspects of the Administration, Parks Canada, Indian and Eskimo Affairs, and Northern Development Programs are cited in this bilingual catalogue (English and French). Information given for each entry is: reference number, statistics available, years covered, and whether the statistics are available by area, region,…

  11. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    NASA Astrophysics Data System (ADS)

    Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  12. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.; Oblozinsky, P.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  13. RIPL-Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.; Capote,R.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  14. Anomaly-specified virtual dimensionality

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Yu; Paylor, Drew; Chang, Chein-I.

    2013-09-01

    Virtual dimensionality (VD) has received considerable interest as a means of estimating the number of spectrally distinct signatures, denoted by p. Unfortunately, VD provides no specific definition of what a spectrally distinct signature is. As a result, different types of spectrally distinct signatures determine different values of VD; there is no one-size-fits-all value. In order to address this issue, this paper presents a new concept, referred to as anomaly-specified VD (AS-VD), which determines the number of anomalies of interest present in the data. Specifically, two types of anomaly detection algorithms are of particular interest: the sample covariance matrix K-based anomaly detector developed by Reed and Yu, referred to as K-RXD, and the sample correlation matrix R-based RXD, referred to as R-RXD. Since K-RXD is determined only by second-order statistics, whereas R-RXD is specified by statistics of the first two orders (including the sample mean as the first-order statistic), the values determined by K-RXD and R-RXD will be different. Experiments are conducted in comparison with widely used eigen-based approaches.
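
    In their standard forms, K-RXD scores a pixel x by the Mahalanobis distance (x - mu)^T K^{-1} (x - mu), where mu and K are the sample mean and sample covariance, while R-RXD scores it by x^T R^{-1} x with the uncentered sample correlation (second-moment) matrix R = (1/N) sum_i x_i x_i^T, which folds the mean into the second moment. A minimal sketch on simulated hyperspectral pixels (hypothetical data, not the paper's experiments):

```python
import numpy as np

def k_rxd(pixels):
    """Covariance-based RX scores; pixels has shape (n_pixels, n_bands)."""
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    k_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    # Row-wise quadratic form (x - mu)^T K^-1 (x - mu).
    return np.einsum('ij,jk,ik->i', centered, k_inv, centered)

def r_rxd(pixels):
    """Correlation-based RX scores using the uncentered second-moment matrix."""
    r = pixels.T @ pixels / pixels.shape[0]
    r_inv = np.linalg.inv(r)
    return np.einsum('ij,jk,ik->i', pixels, r_inv, pixels)

# Hypothetical data: 500 background pixels in 20 bands plus a few injected anomalies.
rng = np.random.default_rng(3)
background = rng.normal(0.2, 0.05, size=(500, 20))
anomalies = rng.normal(0.8, 0.05, size=(3, 20))
pixels = np.vstack([background, anomalies])

scores_k = k_rxd(pixels)
scores_r = r_rxd(pixels)
print("top-3 K-RXD indices:", np.argsort(scores_k)[-3:])
print("top-3 R-RXD indices:", np.argsort(scores_r)[-3:])
```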

  15. Normal probabilities for Vandenberg AFB wind components - monthly reference periods for all flight azimuths, 0- to 70-km altitudes

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1975-01-01

    Vandenberg Air Force Base (AFB), California, wind component statistics are presented to be used for aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as a statistical model to represent component winds at Vandenberg AFB. Head, tail, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. The results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Vandenberg AFB.
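
    Under the Gaussian model, the tabulated values follow directly: for a wind component with monthly mean mu and standard deviation sigma, the p-th percentile is mu + sigma * Phi^{-1}(p/100). A minimal sketch with hypothetical monthly statistics; only the 0.135 and 99.865 percent endpoints are taken from the abstract, and the intermediate percentiles are assumed for illustration.

```python
from scipy.stats import norm

# Hypothetical monthly statistics for a headwind component at one altitude (m/s).
mean_component = 4.0
std_component = 9.5

# Endpoints from the abstract; intermediate percentiles chosen for illustration.
percentiles = [0.135, 1, 2.5, 5, 16, 50, 84, 95, 97.5, 99, 99.865]

for p in percentiles:
    value = norm.ppf(p / 100.0, loc=mean_component, scale=std_component)
    print("%7.3f%%  %6.1f m/s" % (p, value))
```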

  16. Normal probabilities for Cape Kennedy wind components: Monthly reference periods for all flight azimuths. Altitudes 0 to 70 kilometers

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1973-01-01

    This document replaces Cape Kennedy empirical wind component statistics which are presently being used for aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as an adequate statistical model to represent component winds at Cape Kennedy. Head-, tail-, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. Results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Cape Kennedy, Florida.

  17. Hydrologic Conditions in Kansas, water year 2015

    USGS Publications Warehouse

    May, Madison R.

    2016-03-31

    The U.S. Geological Survey (USGS), in cooperation with Federal, State, and local agencies, maintains a long-term network of hydrologic monitoring sites in Kansas. In 2015, the network included about 200 real-time streamgages (hereafter referred to as “gages”), 12 real-time reservoir-level monitoring stations, and 30 groundwater-level monitoring wells. These data and associated analyses provide a unique overview of hydrologic conditions and help improve the understanding of Kansas’s water resources. Real-time data are verified by the USGS throughout the year with regular measurements of streamflow, lake levels, and groundwater levels. These data are used in protecting life and property and in managing water resources for agricultural, industrial, public supply, ecological, and recreational purposes. Yearly hydrologic conditions are characterized by comparing statistical analyses of current and historical water year (WY) data for the period of record. A WY is the 12-month period from October 1 through September 30 and is designated by the year in which it ends.

  18. Assessment of technological level of stem cell research using principal component analysis.

    PubMed

    Do Cho, Sung; Hwan Hyun, Byung; Kim, Jae Kyeom

    2016-01-01

    In general, technological levels have been assessed based on specialists' opinions through methods such as Delphi. In such cases, however, the results can be significantly biased by the study design and the individual experts. In this study, therefore, scientific publications and patents were selected by means of analytic indexes for a statistical approach to the technological assessment of stem cell fields. The analytic indexes (the numbers and impact indexes of scientific publications and patents) were weighted based on principal component analysis and then summed into a single value. Technological obsolescence was calculated through the cited half-life of patents issued by the United States Patent and Trademark Office and was reflected in the technological level assessment. As a result, each nation's rank with respect to technological level was rated by the proposed method, and strengths and weaknesses could be evaluated. Although this empirical research presents faithful results, further study is needed to compare the existing methods with the suggested method.
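
    The weighting step described above can be sketched as follows: standardize each analytic index, extract the first principal component, and use its loadings (normalized to sum to one) as weights for a single composite score per nation. The indicator matrix and column meanings below are hypothetical, and the obsolescence adjustment via patent cited half-life is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical indicator matrix: rows are nations, columns are analytic indexes
# (publication count, publication impact index, patent count, patent impact index).
indicators = np.array([
    [5200, 1.8, 950, 1.2],
    [3100, 2.1, 400, 1.5],
    [1800, 1.2, 700, 0.9],
    [ 900, 0.9, 150, 0.7],
])

z = StandardScaler().fit_transform(indicators)
pca = PCA(n_components=1).fit(z)

# Weights from the first principal component, normalized to sum to one.
loadings = np.abs(pca.components_[0])
weights = loadings / loadings.sum()

composite = z @ weights          # single technological-level score per nation
ranks = np.argsort(-composite)   # higher score = higher rank
print("weights:", np.round(weights, 2))
print("ranking (row indices, best first):", ranks)
```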

  19. Duration of surgical-orthodontic treatment.

    PubMed

    Häll, Birgitta; Jämsä, Tapio; Soukka, Tero; Peltomäki, Timo

    2008-10-01

    To study the duration of surgical-orthodontic treatment with special reference to patients' age and the type of tooth movements, i.e. extraction vs. non-extraction and intrusion before or extrusion after surgery to level the curve of Spee. The material consisted of the files of 37 consecutive surgical-orthodontic patients. The files were reviewed, and gender, diagnosis, type of malocclusion, age at the initiation of treatment, duration of treatment, type of tooth movements (extraction vs. non-extraction and levelling of the curve of Spee before or after the operation) and type of operation were retrieved. Two-sample t-tests, the Kruskal-Wallis test and Spearman rank correlation were used for the statistical analyses. The mean treatment duration of the sample was 26.8 months, of which pre-surgical orthodontics took on average 17.5 months. Patients with extractions as part of the treatment had a statistically and clinically significantly longer treatment duration, on average 8 months longer, than those without extractions. No other studied variable seemed to have an impact on the treatment time. The small sample size prevents reliable conclusions from being drawn. However, the findings suggest, and patients should be informed, that extractions included in the treatment plan increase the likelihood of a longer surgical-orthodontic treatment.

  20. Correlation between radio-induced lymphocyte apoptosis measurements obtained from two French centres.

    PubMed

    Mirjolet, C; Merlin, J L; Dalban, C; Maingon, P; Azria, D

    2016-07-01

    In the era of modern treatment delivery, increasing the dose delivered to the target to improve local control might be modulated by the patient's intrinsic radio-sensitivity. A predictive assay based on the quantification of radio-induced lymphocyte apoptosis has highlighted a significant correlation between CD4 and CD8 T-lymphocyte apoptosis and grade 2 or 3 radiation-induced late toxicities. Because this assay is conducted on several technical platforms, the aim of this study was to demonstrate that radio-induced lymphocyte apoptosis values obtained from two different platforms are comparable. For 25 patients included in the PARATOXOR trial running in Dijon, the radio-induced lymphocyte apoptosis results obtained from the laboratory of Montpellier (IRCM, Inserm U1194, France), considered as the reference (referred to as Lab 1), were compared with those from the laboratory located at the Institut de cancérologie de Lorraine (ICL, France), referred to as Lab 2. Different statistical methods were used to measure the agreement between the radio-induced lymphocyte apoptosis data from the two laboratories (quantitative data), and the Bland-Altman plot was used to identify potential bias. All statistical tests demonstrated good agreement between the radio-induced lymphocyte apoptosis values obtained from both sites, and no major bias was identified. Since radio-induced lymphocyte apoptosis values, which predict tolerance to radiotherapy, could be assessed by two laboratories with a high level of robustness and consistency, we suggest that this assay can be extended to any laboratory using the same technique. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  1. An explorative study of school performance and antipsychotic medication.

    PubMed

    van der Schans, J; Vardar, S; Çiçek, R; Bos, H J; Hoekstra, P J; de Vries, T W; Hak, E

    2016-09-21

    Antipsychotic therapy can reduce severe symptoms of psychiatric disorders; however, data on school performance among children on such treatment are lacking. The objective was to explore school performance among children using antipsychotic drugs at the end of primary education. A cross-sectional study was conducted using the University of Groningen pharmacy database linked to academic achievement scores at the end of primary school (Dutch Cito-test) obtained from Statistics Netherlands. Mean Cito-test scores and standard deviations were obtained for children on antipsychotic therapy and for reference children, and statistically compared using analyses of covariance. In addition, differences in subgroups such as boys versus girls, ethnicity, household income, and late starters (start date within 12 months of the Cito-test) versus early starters (start date > 12 months before the Cito-test) were tested. In all, data from 7994 children could be linked to Cito-test scores. At the time of the Cito-test, 45 (0.6 %) were on treatment with antipsychotics. Children using antipsychotics scored on average 3.6 points lower than the reference peer group (534.5 ± 9.5). Scores differed across gender and levels of household income (p < 0.05). Scores of early starters were significantly higher than those of late starters (533.7 ± 1.7 vs. 524.1 ± 2.6). This first exploration showed that children on antipsychotic treatment have lower school performance than the reference peer group at the end of primary school. This was most noticeable for girls, but early starters were less affected than late starters. Due to the observational cross-sectional nature of this study, no causality can be inferred, but the results indicate that school performance should be closely monitored and that the causes of underperformance despite treatment warrant more research.

  2. Implementation of Certified EHR, Patient Portal, and "Direct" Messaging Technology in a Radiology Environment Enhances Communication of Radiology Results to Both Referring Physicians and Patients.

    PubMed

    Reicher, Joshua Jay; Reicher, Murray Aaron

    2016-06-01

    Since 2009, the Federal government has distributed over $29 billion to providers adopting compliant electronic health record (EHR) technology. With a focus on radiology, we explore how EHR technology impacts interoperability with referring clinicians' EHRs and patient engagement. We also discuss the high-level details of the contributing supporting frameworks, specifically Direct messaging and health information service provider (HISP) technology. We characterized Direct messaging, a secure e-mail-like protocol built to allow exchange of encrypted health information online, and the new supporting HISP infrastructure. Statistics related to both the testing and active use of this framework were obtained from DirectTrust.org, an organization whose framework supports Direct messaging use by healthcare organizations. To evaluate patient engagement, we obtained usage data from a radiology-centric patient portal between 2014 and 2015, which in some cases included access to radiology reports. Statistics from 2013 to 2015 showed a rise in issued secure Direct addresses from 8724 to 752,496; a rise in the number of participating healthcare organizations from 667 to 39,751; and a rise in secure messages sent from 122,842 to 27,316,438. Regarding patient engagement, an average of 234,679 patients per month were provided portal access, with 86,400 patients per month given access to radiology reports. Availability of radiology reports online was strongly associated with increased system usage, with a likelihood ratio of 2.63. The use of certified EHR technology and Direct messaging in the practice of radiology allows for the communication of patient information and radiology results with referring clinicians and increases patient use of patient portal technology, supporting bidirectional radiologist-patient communication.

  3. Assessment of Dredged Material Toxicity in San Francisco Bay

    DTIC Science & Technology

    1990-11-01

    Excerpt (report documentation page fields removed): dredged material from Oakland Harbor was evaluated against a reference sediment; when compared to the fine-grain Sequim Bay reference material, no statistically significant mortalities were detected. Sequim Bay material was used as the reference, and the hierarchy of interspecific sensitivity was oyster larvae > juvenile sand dabs. Authors: Thomas M. Dillon and David W. Moore.

  4. Development of certified reference materials for electrolytes in human serum (GBW09124-09126).

    PubMed

    Feng, Liuxing; Wang, Jun; Cui, Yanjie; Shi, Naijie; Li, Haifeng; Li, Hongmei

    2017-05-01

    Three reference materials, at relatively low, middle, and high concentrations, were developed for analysis of the mass fractions of electrolytes (K, Ca, Na, Mg, Cl, and Li) in human serum. The reference materials were prepared by adding high purity chloride salts to normal human serum. The concentration range of the three levels is within ±20% of normal human serum. It was shown that 14 units analyzed in duplicate were sufficient to demonstrate the homogeneity of these candidate reference materials. The statistical results also showed no significant trends in either the short-term stability test (1 week at 40 °C) or the long-term stability test (14 months). The certification methods for the six elements include isotope dilution inductively coupled plasma mass spectrometry (ID-ICP-MS), inductively coupled plasma optical emission spectroscopy (ICP-OES), atomic absorption spectroscopy (AAS), ion chromatography (IC), and ion-selective electrode (ISE) measurements. The certification methods were validated by international comparisons among a number of national metrology institutes (NMIs). The combined relative standard uncertainties of the property values were estimated by considering the uncertainties of the analytical methods, homogeneity, and stability. The expanded uncertainties of all the elements range from 2.2% to 3.9%. The certified reference materials (CRMs) are primarily intended for use in the calibration and validation of procedures in clinical analysis for the determination of electrolytes in human serum or plasma. Graphical Abstract Certified reference materials for K, Ca, Mg, Na, Cl and Li in human serum (GBW09124-09126).
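
    The uncertainty budget described above follows the usual practice for certified reference materials of combining the characterization, homogeneity, and stability contributions in quadrature and multiplying by a coverage factor. The sketch below shows that arithmetic with hypothetical relative uncertainties; the values are not those of GBW09124-09126.

      # Sketch: combined and expanded relative uncertainty of a CRM property value.
      # The component uncertainties below are hypothetical placeholders.
      import math

      def expanded_uncertainty(u_char, u_hom, u_stab, k=2.0):
          """Combine relative standard uncertainties in quadrature and expand with factor k."""
          u_combined = math.sqrt(u_char**2 + u_hom**2 + u_stab**2)
          return k * u_combined

      # e.g. 1.0% characterization, 0.8% homogeneity, 1.0% stability -> U ~ 3.3% (k = 2)
      print(f"U = {expanded_uncertainty(0.010, 0.008, 0.010):.3%}")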

  5. Impact study of the Argo array definition in the Mediterranean Sea based on satellite altimetry gridded data

    NASA Astrophysics Data System (ADS)

    Sanchez-Roman, Antonio; Ruiz, Simón; Pascual, Ananda; Guinehut, Stéphanie; Mourre, Baptiste

    2016-04-01

    The existing Argo network provides essential data in near real time to constrain monitoring and forecasting centers and strongly complements the observations of the ocean surface from space. The comparison of Sea Level Anomalies (SLA) provided by satellite altimeters with in-situ Dynamic Height Anomalies (DHA) derived from the temperature and salinity profiles of Argo floats contributes to a better characterization of the error budget associated with the altimeter observations. In this work, performed in the frame of the E-AIMS FP7 European Project, we focus on the Argo observing system in the Mediterranean Sea and its impact on the SLA fields provided by satellite altimetry measurements in the basin. Namely, we focus on the sensitivity of specific SLA gridded merged products provided by AVISO in the Mediterranean to the reference depth (400 or 900 dbar) selected in the computation of the Argo Dynamic Height (DH) as an integration of the Argo T/S profiles through the water column. This reference depth affects the number of valid Argo profiles and therefore their temporal sampling and the coverage by the network used for comparison with altimeter data. To compare both datasets, altimeter grids and the synthetic climatologies used to compute DHA were spatially and temporally interpolated at the position and time of each in-situ Argo profile by a mapping method based on an optimal interpolation scheme. The analysis was conducted in the entire Mediterranean Sea and in different sub-regions of the basin. The second part of this work is devoted to investigating which configuration, in terms of spatial sampling of the Argo array in the Mediterranean, would properly reproduce the mesoscale dynamics in this basin, which is comprehensively captured by the new standards of specific altimeter products for this region. To do that, several Observing System Simulation Experiments (OSSEs) were conducted, taking the altimetry data computed from the AVISO specific reanalysis gridded merged product for the Mediterranean as the "true" field. The choice of the reference depth of the Argo profiles impacts the number of valid profiles used to compute DHA and therefore the spatial coverage by the network. Results show that the impact of the reference level in the computation of Argo DH is statistically significant, since the standard deviations of the differences between DH computed from altimetry and from Argo data referred to reference depths of 400 dbar and 900 dbar are quite different (4.85 and 5.11 cm, respectively). Therefore, 400 dbar should be taken as the reference depth to compute DHA from Argo data in the Mediterranean. On the contrary, similar scores are obtained when shallow floats are not included in the computation (4.85 cm against 4.87 cm). In any case, all the analyses show statistically significant (95%) correlations higher than 0.70 between altimetry and Argo data, with a standard deviation of the differences between the two datasets of around 4.90 cm. Furthermore, the sub-basin study shows improved statistics in the eastern sub-basin for DHA referred to 400 dbar, while minimum values are obtained in the western sub-basin when computing DHA referred to 900 dbar. On the other hand, the OSSE results suggest that, with an Argo array spacing of 100×100 km, the variance of the large-scale signal and most of the mesoscale features of the SLA fields are recovered. Therefore, the network coverage should be enlarged in the Mediterranean in order to achieve at least this spatial resolution.
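
    A minimal sketch of the comparison statistic used above: altimeter SLA values co-located with Argo profiles are differenced against the corresponding DHA, and the standard deviation of the differences and the correlation are computed for each choice of reference depth. The arrays below are synthetic placeholders, not AVISO or Argo data.

      # Sketch: comparing co-located altimeter SLA and Argo DHA (synthetic data).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      sla = rng.normal(0.0, 8.0, n)                 # altimeter SLA at profile positions, cm

      # synthetic DHA for two reference depths: same signal, different noise level
      dha_400 = sla + rng.normal(0.0, 4.85, n)
      dha_900 = sla + rng.normal(0.0, 5.11, n)

      for label, dha in (("400 dbar", dha_400), ("900 dbar", dha_900)):
          diff = sla - dha
          std = diff.std(ddof=1)                    # standard deviation of the differences
          corr = np.corrcoef(sla, dha)[0, 1]
          print(f"{label}: STD of differences = {std:.2f} cm, correlation = {corr:.2f}")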

  6. Selection of reference genes for quantitative real-time PCR normalization in Panax ginseng at different stages of growth and in different organs.

    PubMed

    Liu, Jing; Wang, Qun; Sun, Minying; Zhu, Linlin; Yang, Michael; Zhao, Yu

    2014-01-01

    Quantitative real-time reverse transcription PCR (qRT-PCR) has become a widely used method for gene expression analysis; however, the interpretation of its data largely depends on the stability of the reference genes used. The transcriptomics of Panax ginseng, one of the most popular and traditional ingredients used in Chinese medicines, is increasingly being studied, so it is vital to establish a set of reliable reference genes for qRT-PCR assessment of ginseng gene expression profiles. In this study, we screened candidate reference genes for ginseng using gene expression data generated by a high-throughput sequencing platform. Based on statistical tests, 20 reference genes (10 traditional housekeeping genes and 10 novel genes) were selected. These genes were tested by qPCR for the normalization of expression levels across five growth stages and three distinct plant organs of ginseng. The genes were subsequently ranked and compared according to the stability of their expression using the geNorm, NormFinder, and BestKeeper computational programs. Although the best reference genes varied across the different samples, CYP and EF-1α were the most stable genes across all samples. GAPDH/30S RPS20, CYP/60S RPL13 and CYP/QCR were the optimum pairs of reference genes in the roots, stems, and leaves, respectively. CYP/60S RPL13, CYP/eIF-5A, aTUB/V-ATP, eIF-5A/SAR1, and aTUB/pol IIa were the most stably expressed combinations in each of the five developmental stages. Our study serves as a foundation for accurate qRT-PCR in ginseng and will benefit future studies of the gene expression profiles of Panax ginseng.

  7. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.
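
    The benchmark idea is straightforward: give a statistics package a data set whose summary statistics are known exactly and compare the reported values with theory. The sketch below does this for a NASTY-style ill-conditioned variable (a small spread riding on a large offset), for which a naive one-pass variance formula loses precision; the column is illustrative and is not the actual ANASTY data.

      # Sketch: checking a variance computation against its known value for an
      # ill-conditioned (NASTY-style) column. Illustrative data, not ANASTY itself.
      import numpy as np

      x = np.array([1e8 + i for i in range(1, 10)], dtype=float)   # 1e8+1 ... 1e8+9
      true_variance = 7.5                                          # sample variance of 1..9

      # numerically unstable one-pass formula: sum(x^2) - (sum x)^2 / n, scaled
      n = x.size
      naive = (np.sum(x**2) - np.sum(x)**2 / n) / (n - 1)

      # numerically stable two-pass formula (what a reliable package should do)
      stable = np.sum((x - x.mean())**2) / (n - 1)

      print(f"theoretical:    {true_variance}")
      print(f"naive one-pass: {naive}")
      print(f"two-pass:       {stable}")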

  8. EXTRAPOLATION TECHNIQUES EVALUATING 24 HOURS OF AVERAGE ELECTROMAGNETIC FIELD EMITTED BY RADIO BASE STATION INSTALLATIONS: SPECTRUM ANALYZER MEASUREMENTS OF LTE AND UMTS SIGNALS.

    PubMed

    Mossetti, Stefano; de Bartolo, Daniela; Veronese, Ivan; Cantone, Marie Claire; Cosenza, Cristina; Nava, Elisa

    2017-04-01

    International and national organizations have formulated guidelines establishing limits for occupational and residential exposure to high-frequency electromagnetic fields (EMF). Italian legislation fixed 20 V/m as a limit for public protection from exposure to EMFs in the frequency range 0.1 MHz-3 GHz and 6 V/m as a reference level. Recently, the law was changed and the reference level must now be evaluated as the 24-hour average value, instead of the previous highest 6 minutes in a day. The law refers to a technical guide (CEI 211-7/E, published in 2013) for the extrapolation techniques that public authorities have to use when assessing exposure for compliance with the limits. In this work, we present measurements carried out with a vectorial spectrum analyzer to identify critical technical aspects of these extrapolation techniques when applied to UMTS and LTE signals. We also focused on finding a good balance between statistically significant values and the logistical management of control activities, as the signal trend in situ is not known. Measurements were repeated several times over several months and for different mobile companies. The outcome presented in this article allowed us to evaluate the reliability of the extrapolation results obtained and provides a starting point for defining operating procedures. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. [Full blood count reference values in children of 8 to 12 years old residing at 2,760 m above sea level].

    PubMed

    Armando García-Miranda, L; Contreras, I; Estrada, J A

    2014-04-01

    To determine reference values for full blood count parameters in a population of children 8 to 12 years old living at an altitude of 2760 m above sea level. Our sample consisted of 102 individuals on whom a full blood count was performed. The parameters included the total numbers of red blood cells, platelets, and white cells, and a differential count (millions/μl and %) of neutrophils, lymphocytes, monocytes, eosinophils, and basophils. Additionally, we obtained values for hemoglobin, hematocrit, mean corpuscular volume, mean corpuscular hemoglobin, mean corpuscular hemoglobin concentration, and red blood cell distribution width. The results were statistically analyzed with a non-parametric test, dividing the sample into quartiles to obtain the lower and upper limits of our intervals. Moreover, the interval limits obtained from this analysis were compared with intervals estimated as ±2 standard deviations above and below our mean values. Our results showed significant differences, in most of the parameters studied, compared with the normal interval values reported for the adult Mexican population. The full blood count is an important laboratory test used routinely for the initial assessment of a patient. Values of full blood counts in healthy individuals vary according to gender, age, and geographic location; therefore, each population should have its own reference values. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.
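
    For reference, the two interval constructions compared above can be written in a few lines: a nonparametric interval taken from percentiles of the observed distribution and a parametric interval of the mean ± 2 standard deviations. The sketch below uses simulated hemoglobin-like values, not the study data, and takes the central 95% (2.5th to 97.5th percentiles) for the nonparametric limits.

      # Sketch: percentile-based vs mean +/- 2 SD reference intervals (simulated values).
      import numpy as np

      rng = np.random.default_rng(1)
      hemoglobin = rng.normal(15.2, 1.1, 102)          # simulated g/dL values, n = 102

      # nonparametric limits: central 95% of the observed distribution
      lower_np, upper_np = np.percentile(hemoglobin, [2.5, 97.5])

      # parametric limits: mean +/- 2 standard deviations
      mean, sd = hemoglobin.mean(), hemoglobin.std(ddof=1)
      lower_p, upper_p = mean - 2 * sd, mean + 2 * sd

      print(f"percentile interval: {lower_np:.1f} - {upper_np:.1f} g/dL")
      print(f"mean +/- 2 SD:       {lower_p:.1f} - {upper_p:.1f} g/dL")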

  10. Reliable gene expression analysis by reverse transcription-quantitative PCR: reporting and minimizing the uncertainty in data accuracy.

    PubMed

    Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann

    2014-10-01

    Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
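
    One common way to reduce the normalization noise discussed above is to normalize the target against the geometric mean of several reference genes rather than a single one. The sketch below shows that calculation on hypothetical Cq values, assuming roughly 100% PCR efficiency; it illustrates general practice rather than a procedure prescribed by the MIQE guidelines.

      # Sketch: relative expression normalized to the geometric mean of several
      # reference genes (hypothetical Cq values, assuming ~100% PCR efficiency).
      import numpy as np

      def relative_expression(cq_target, cq_refs):
          """Target expression relative to the reference genes, 2**-(Cq_target - mean(Cq_refs)).
          Averaging Cq values is equivalent to taking the geometric mean of quantities."""
          cq_ref_mean = np.mean(cq_refs)
          return 2.0 ** -(cq_target - cq_ref_mean)

      # hypothetical sample: one target gene and three reference genes
      control = relative_expression(24.1, [18.2, 19.0, 17.8])
      treated = relative_expression(22.3, [18.4, 19.1, 17.9])
      print(f"fold change (treated / control): {treated / control:.2f}")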

  11. The Application of a Statistical Analysis Software Package to Explosive Testing

    DTIC Science & Technology

    1993-12-01

    Only fragments of this abstract are available in the record: definitions referring the reader to equation 2 (M), equation 3 (s, the standard deviation not corrected for the test interval) and section 2.1 (G); a table of contents listing Appendix I (Program Structured Diagrams), Appendix II (Bruceton Reference Graphs), Appendix III (Input and Output Data File Format) and Appendix IV; and a note that Graph II has been digitised and incorporated into the program, with the curve closest to the computed difference (eq. 3a) selected when M falls below 0.3.

  12. Global forest cover mapping for the United Nations Food and Agriculture Organization forest resources assessment 2000 program

    USGS Publications Warehouse

    Zhu, Z.; Waller, E.

    2003-01-01

    Many countries periodically produce national reports on the status and changes of forest resources, using statistical surveys and spatial mapping of remotely sensed data. At the global level, the Food and Agriculture Organization (FAO) of the United Nations has conducted a Forest Resources Assessment (FRA) program every 10 yr since 1980, producing statistics and analysis that give a global synopsis of forest resources in the world. For the year 2000 of the FRA program (FRA2000), a global forest cover map was produced to provide spatial context to the extensive survey. The forest cover map, produced at the U.S. Geological Survey (USGS) EROS Data Center (EDC), has five classes: closed forest, open or fragmented forest, other wooded land, other land cover, and water. The first two forested classes at the global scale were delineated using combinations of temporal compositing, modified mixture analysis, geographic stratification, and other classification techniques. The remaining three FAO classes were derived primarily from the USGS global land cover characteristics database (Loveland et al. 1999). Validated on the basis of existing reference data sets, the map is estimated to be 77% accurate for the first four classes (no reference data were available for water), and 86% accurate for the forest and nonforest classification. The final map will be published as an insert to the FAO FRA2000 report.

  13. In Vitro Comparative Evaluation of Different Types of Impression Trays and Impression Materials on the Accuracy of Open Tray Implant Impressions: A Pilot Study

    PubMed Central

    Gupta, Sonam; Balakrishnan, Dhanasekar

    2017-01-01

    Purpose. For a precise fit of multiple implant framework, having an accurate definitive cast is imperative. The present study evaluated dimensional accuracy of master casts obtained using different impression trays and materials with open tray impression technique. Materials and Methods. A machined aluminum reference model with four parallel implant analogues was fabricated. Forty implant level impressions were made. Eight groups (n = 5) were tested using impression materials (polyether and vinylsiloxanether) and four types of impression trays, two being custom (self-cure acrylic and light cure acrylic) and two being stock (plastic and metal). The interimplant distances were measured on master casts using a coordinate measuring machine. The collected data was compared with a standard reference model and was statistically analyzed using two-way ANOVA. Results. Statistically significant difference (p < 0.05) was found between the two impression materials. However, the difference seen was small (36 μm) irrespective of the tray type used. No significant difference (p > 0.05) was observed between varied stock and custom trays. Conclusions. The polyether impression material proved to be more accurate than vinylsiloxanether impression material. The rigid nonperforated stock trays, both plastic and metal, could be an alternative for custom trays for multi-implant impressions when used with medium viscosity impression materials. PMID:28348595

  14. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information.

    PubMed

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D; Cox, Kenneth R; Chapman, Walter G

    2017-04-28

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  15. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information

    NASA Astrophysics Data System (ADS)

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D.; Cox, Kenneth R.; Chapman, Walter G.

    2017-04-01

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  16. Medical students' child oral-health-related knowledge, practices and attitudes.

    PubMed

    AlYousef, Y; Damiano, P; Weber-Gasparoni, K; Qian, F; Murph, J; Nothwehr, F

    2013-11-01

    This study evaluated medical interns' oral health knowledge and other factors influencing their ability and willingness to perform oral-health-related practices for high-caries-risk children. A 15-item survey was emailed to all eligible graduating fifth-year medical students at King Khalid University Hospital to address these areas of interest. Chi-square statistics and logistic regression models were used to analyse the data. One hundred and twenty-one (49%) usable surveys were returned from two mailings. On questions regarding comfort levels when performing oral-health-related practices on children under age 3, respondents noted high levels of comfort with all specified oral health practices. Regarding satisfaction with medical training, the majority of respondents (87.5%) rated their medical training as fair or poor in preparing them for oral health assessments, compared with only 35%, 29% and 7% of respondents giving fair or poor ratings to child abuse identification, caring for special needs patients and primary care paediatric practice, respectively. Additionally, although 90% of respondents noted that the role of primary physicians in counselling/referring children with oral health problems was important, 60% did not agree with the AAPD and AAP guidelines that state that all children should be referred to a dentist by 12 months of age. Multivariate logistic regression analyses revealed several statistically significant variables that predict the likelihood of performing various oral-health-related practices. The choice of public-health-oriented future clinical goals, the level of oral health knowledge, how interns rated their oral health training in medical school and the average number of children seen per week all, to varying degrees, proved to be important predictor variables for the likelihood of performing these practices once in practice. More oral-health-related training of medical students seems warranted and could improve their interest in providing oral-health-related screening and referrals in practice. Increasing student exposure to child patients and to oral health knowledge and problems could be targeted towards students interested in primary care and public health, to use resources most efficiently in the effort to combat the growing caries levels amongst young children in Saudi Arabia. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Is ICRP guidance on the use of reference levels consistent?

    PubMed

    Hedemann-Jensen, Per; McEwan, Andrew C

    2011-12-01

    In ICRP 103, which has replaced ICRP 60, it is stated that no fundamental changes have been introduced compared with ICRP 60. This is largely true, except that reference levels for emergency and existing exposure situations appear to be applied inconsistently, both in ICRP 103 itself and in the related publications ICRP 109 and ICRP 111. ICRP 103 emphasises that the focus should be on the residual doses after the implementation of protection strategies in emergency and existing exposure situations. If possible, the result of an optimised protection strategy should bring the residual dose below the reference level. Thus the reference level represents the maximum acceptable residual dose after an optimised protection strategy has been implemented. It is not an 'off-the-shelf' item that can be set independently of the prevailing situation; it should be determined as part of the process of optimising the protection strategy, otherwise protection would be sub-optimised. However, ICRP 103 introduces some inconsistent concepts, e.g. in paragraph 279, which states: 'All exposures above or below the reference level should be subject to optimisation of protection, and particular attention should be given to exposures above the reference level'. If, in fact, all exposures above and below reference levels are subject to the process of optimisation, reference levels appear superfluous. It could also be considered that if optimisation of protection below a fixed reference level is necessary, then the reference level has been set too high at the outset. Up until the last phase of the preparation of ICRP 103, the concept of a dose constraint was recommended to constrain the optimisation of protection in all types of exposure situations. In the final phase, the term 'dose constraint' was changed to 'reference level' for emergency and existing exposure situations. However, it seems that in ICRP 103 it was not fully recognised that dose constraints and reference levels are conceptually different. The use of reference levels in radiological protection is reviewed, and it is concluded that the recommendations in ICRP 103 and related ICRP publications appear inconsistent regarding the use of reference levels in existing and emergency exposure situations.

  18. Clinical availability of a self-administered odor questionnaire for patients with olfactory disorders.

    PubMed

    Takebayashi, Hironori; Tsuzuki, Kenzo; Oka, Hideki; Fukazawa, Keijiro; Daimon, Takashi; Sakagami, Masafumi

    2011-02-01

    This study demonstrated statistical correlations between a novel self-administered odor questionnaire (SAOQ) and other olfaction tests in patients with olfactory disorders, and the usefulness of this questionnaire was discussed. Between December 2004 and November 2009 (5 years), the SAOQ was completed by 405 healthy people without any nasal diseases (Group A) and 539 patients with an olfactory disorder (Group B) at the Department of Otolaryngology, Hyogo College of Medicine. This was a prospective study. The SAOQ proposed by the Japan Rhinology Society is a self-administered survey consisting of 20 smell-related items: "steamed rice, miso, seaweed, soy sauce, baked bread, butter, curry, garlic, orange, strawberry, green tea, coffee, chocolate, household gas, garbage, timber, stercus, sweat, flower, and perfume". The normal reference range of scores (%) of the SAOQ was calculated in Group A. To determine whether the results of the SAOQ were correlated with those of visual analogue scale (VAS) and T&T olfactometer, pre- and post-treatment results of the SAOQ and olfaction tests were analyzed. The questionnaire response rates were 99.5% (403/405 people) in Group A and 95.9% (517/539 patients) in Group B. The statistically normal reference level of the SAOQ was determined as more than 70%. In Group B, the mean pre-treatment SAOQ score (20.4%), VAS score (16.5%), and T&T recognition threshold (5.0) significantly improved to values of 46.7%, 41.1%, and 4.1 after treatments, respectively (n=249). Both pre- and post-treatment SAOQ scores (ΔQ) had statistically significant relationships with those of VAS and T&T (n=249). The utility of the SAOQ as an easy method of estimating olfaction was suggested. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. [Effects of Buzhong Yiqi decoction on expression of Bad, NF-κB, caspase-9, Survivin, and mTOR in nude mice with A549/DDP transplantation tumors].

    PubMed

    Liu, Ya-Li; Yi, Jia-Li; Liu, Chun-Ying

    2017-02-01

    This study aimed to explore the effects of Buzhong Yiqi decoction on the expression levels of Bad, NF-κB, caspase-9, Survivin, and mTOR in nude mice with A549/DDP transplantation tumors. Sixty BALB/c mice were randomly divided into a blank control group, a tumor-bearing control group, a cisplatin group, and high-, medium- and low-dose Buzhong Yiqi decoction plus cisplatin groups (hereinafter the high, medium and low combined groups). A549/DDP cells (5×10⁶ cells/mL) were cultured and inoculated in the various groups, tumor formation was observed, and the corresponding treatment was given in each group. Fourteen days later, immunohistochemistry and real-time PCR were used to detect the protein and mRNA expression levels of Bad, NF-κB, caspase-9, Survivin and mTOR in the tumors. The results showed that Buzhong Yiqi decoction combined with cisplatin reduced the volume of the transplanted tumors, with a significant difference between the medium and high combined groups (P<0.05). Compared with the tumor-bearing control group, the expression levels of Bad, NF-κB, Survivin and mTOR were significantly reduced in the medium and high combined groups (P<0.05), whereas the protein and mRNA expression levels of caspase-9 were gradually increased in these groups, a statistically significant difference from the tumor-bearing control group (P<0.05). The mRNA expression of Bad, NF-κB and caspase-9 differed significantly between the medium and high combined groups on the one hand and the cisplatin, low combined and tumor-bearing control groups on the other (P<0.05), but there was no statistically significant difference among the cisplatin, low combined and tumor-bearing control groups. In addition, there was no statistically significant difference between the medium and high combined groups in the protein and mRNA expression levels of the various factors. These results show that Buzhong Yiqi decoction combined with cisplatin can inhibit the growth of A549/DDP transplanted tumors, and the mechanism may be associated with regulation of Bad, NF-κB, caspase-9, Survivin, and mTOR levels and promotion of apoptosis. Copyright© by the Chinese Pharmaceutical Association.

  20. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
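
    A rough sketch of the regression idea described above, under the simplifying assumption that each candidate gene's log-scale expression is regressed against a sample-wise index (the mean over all candidates): a small absolute slope and a small deviation from the regression line indicate more stable expression under this criterion. The gene names and expression profiles are hypothetical, and this is an adaptation for illustration, not the authors' exact implementation.

      # Sketch: ranking candidate reference genes by a regression-based stability
      # measure (hypothetical log2 expression values; illustrative adaptation only).
      import numpy as np

      rng = np.random.default_rng(2)
      conditions = 8
      genes = {                                   # hypothetical log2 expression profiles
          "UBQ":   10 + 0.2 * rng.standard_normal(conditions),
          "ACT":   12 + 0.6 * rng.standard_normal(conditions),
          "GAPDH": 11 + 1.5 * rng.standard_normal(conditions),
      }

      # index for each condition: mean expression over all candidate genes
      index = np.mean(np.array(list(genes.values())), axis=0)

      for name, expr in genes.items():
          slope, intercept = np.polyfit(index, expr, 1)
          residuals = expr - (slope * index + intercept)
          deviation = np.sqrt(np.mean(residuals**2))   # deviation from the regression line
          # smaller |slope| and smaller deviation suggest more stable expression here
          print(f"{name}: slope = {slope:+.2f}, deviation = {deviation:.2f}")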

  1. Estimation of diagnostic test accuracy without full verification: a review of latent class methods

    PubMed Central

    Collins, John; Huynh, Minh

    2014-01-01

    The performance of a diagnostic test is best evaluated against a reference test that is without error. For many diseases, this is not possible, and an imperfect reference test must be used. However, diagnostic accuracy estimates may be biased if inaccurately verified status is used as the truth. Statistical models have been developed to handle this situation by treating disease as a latent variable. In this paper, we conduct a systematized review of statistical methods using latent class models for estimating test accuracy and disease prevalence in the absence of complete verification. PMID:24910172
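
    A compact sketch of the latent class idea reviewed above: with three conditionally independent binary tests and no gold standard, an EM loop treats true disease status as a latent variable and estimates prevalence, sensitivities, and specificities. The data are simulated and the model is the simplest possible case, not any specific method from the review.

      # Sketch: EM for a latent class model with three conditionally independent
      # binary tests and no gold standard (simulated data, illustrative only).
      import numpy as np

      rng = np.random.default_rng(3)

      # simulate true parameters and test results
      n, prev_true = 2000, 0.3
      se_true = np.array([0.90, 0.80, 0.85])   # sensitivities of tests 1-3
      sp_true = np.array([0.95, 0.90, 0.85])   # specificities of tests 1-3
      disease = rng.random(n) < prev_true
      p_pos = np.where(disease[:, None], se_true, 1 - sp_true)
      results = (rng.random((n, 3)) < p_pos).astype(int)

      # EM estimation starting from neutral values
      prev, se, sp = 0.5, np.full(3, 0.7), np.full(3, 0.7)
      for _ in range(500):
          # E-step: posterior probability of disease for each subject
          like_d = prev * np.prod(np.where(results == 1, se, 1 - se), axis=1)
          like_nd = (1 - prev) * np.prod(np.where(results == 1, 1 - sp, sp), axis=1)
          w = like_d / (like_d + like_nd)
          # M-step: update prevalence, sensitivities, specificities
          prev = w.mean()
          se = (w[:, None] * results).sum(axis=0) / w.sum()
          sp = ((1 - w)[:, None] * (1 - results)).sum(axis=0) / (1 - w).sum()

      print(f"prevalence: {prev:.3f} (true {prev_true})")
      print("sensitivity:", np.round(se, 3), "specificity:", np.round(sp, 3))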

  2. Design Optimization and In Vitro-In Vivo Evaluation of Orally Dissolving Strips of Clobazam

    PubMed Central

    Bala, Rajni; Khanna, Sushil; Pawar, Pravin

    2014-01-01

    Clobazam orally dissolving strips were prepared by the solvent casting method. A full 3² factorial design was applied for optimization, using different concentrations of film-forming polymer and disintegrating agent as independent variables and disintegration time, % cumulative drug release, and tensile strength as dependent variables. In addition, the prepared films were evaluated for surface pH, folding endurance, and content uniformity. The optimized film formulation showing the maximum in vitro drug release, satisfactory in vitro disintegration time, and tensile strength was selected for a bioavailability study and compared with a reference marketed product (Frisium 5 tablets) in rabbits. Formulation F6, selected by the Design-Expert software, exhibited a disintegration time of 24 s, a tensile strength of 2.85 N/cm², and in vitro drug release of 96.6%. Statistical evaluation revealed no significant difference between the bioavailability parameters of the test film (F6) and the reference product. The mean test/reference ratios of Cmax (95.87%), tmax (71.42%), AUC0-t (98.125%), and AUC0-∞ (99.213%) indicated that the two formulations exhibited comparable plasma level-time profiles. PMID:25328709

  3. Comparison of pulsar positions from timing and very long baseline astrometry

    NASA Astrophysics Data System (ADS)

    Wang, J. B.; Coles, W. A.; Hobbs, G.; Shannon, R. M.; Manchester, R. N.; Kerr, M.; Yuan, J. P.; Wang, N.; Bailes, M.; Bhat, N. D. R.; Dai, S.; Dempsey, J.; Keith, M. J.; Lasky, P. D.; Levin, Y.; Osłowski, S.; Ravi, V.; Reardon, D. J.; Rosado, P. A.; Russell, C. J.; Spiewak, R.; van Straten, W.; Toomey, L.; Wen, L.; You, X.-P.; Zhu, X.-J.

    2017-07-01

    Pulsar positions can be measured with high precision using both pulsar timing methods and very long baseline interferometry (VLBI). Pulsar timing positions are referenced to a solar-system ephemeris, whereas VLBI positions are referenced to distant quasars. Here, we compare pulsar positions from published VLBI measurements with those obtained from pulsar timing data from the Nanshan and Parkes radio telescopes in order to relate the two reference frames. We find that the timing positions differ significantly from the VLBI positions (and also differ between different ephemerides). A statistically significant change in the obliquity of the ecliptic of 2.16 ± 0.33 mas is found for the JPL ephemeris DE405, but no significant rotation is found in subsequent JPL ephemerides. The accuracy with which we can relate the two frames is limited by the current uncertainties in the VLBI reference source positions and in matching the pulsars to their reference source. Not only do the timing positions depend on the ephemeris used in computing them, but also different segments of the timing data lead to varying position estimates. These variations are mostly common to all ephemerides, but slight changes are seen at the 10 μas level between ephemerides.

  4. Quantitative determination and validation of octreotide acetate using ¹H-NMR spectroscopy with internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated ¹H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards, and it has potential for the quantitative analysis of other peptide drugs and the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Correlation between the Severity and Type of Acne Lesions with Serum Zinc Levels in Patients with Acne Vulgaris

    PubMed Central

    Rostami Mogaddam, Majid; Safavi Ardabili, Nastaran; Soflaee, Maedeh

    2014-01-01

    Acne vulgaris is the most common cutaneous disorder affecting adolescents and young adults. Some studies have reported an association between serum zinc levels and acne vulgaris. We aimed to evaluate the serum zinc level in patients with acne vulgaris and compare it with healthy controls. One hundred patients with acne vulgaris and 100 healthy controls were referred to our clinic. Acne severity was classified according to Global Acne Grading System (GAGS). Atomic absorption spectrophotometry was used to measure serum zinc levels. Mean serum level of zinc in acne patients and controls was 81.31 ± 17.63 μg/dl and 82.63 ± 17.49 μg/dl, respectively. Although the mean serum zinc level was lower in acne group, it was not statistically significant (P = 0.598). There was a correlation between serum zinc levels with severity and type of acne lesions. The results of our study suggest that zinc levels may be related to the severity and type of acne lesions in patients with acne vulgaris. Relative decrease of serum zinc level in acne patients suggests a role for zinc in the pathogenesis of acne vulgaris. PMID:25157359

  6. Correlation between the severity and type of acne lesions with serum zinc levels in patients with acne vulgaris.

    PubMed

    Rostami Mogaddam, Majid; Safavi Ardabili, Nastaran; Maleki, Nasrollah; Soflaee, Maedeh

    2014-01-01

    Acne vulgaris is the most common cutaneous disorder affecting adolescents and young adults. Some studies have reported an association between serum zinc levels and acne vulgaris. We aimed to evaluate the serum zinc level in patients with acne vulgaris and compare it with healthy controls. One hundred patients with acne vulgaris and 100 healthy controls were referred to our clinic. Acne severity was classified according to Global Acne Grading System (GAGS). Atomic absorption spectrophotometry was used to measure serum zinc levels. Mean serum level of zinc in acne patients and controls was 81.31 ± 17.63 μg/dl and 82.63 ± 17.49 μg/dl, respectively. Although the mean serum zinc level was lower in acne group, it was not statistically significant (P = 0.598). There was a correlation between serum zinc levels with severity and type of acne lesions. The results of our study suggest that zinc levels may be related to the severity and type of acne lesions in patients with acne vulgaris. Relative decrease of serum zinc level in acne patients suggests a role for zinc in the pathogenesis of acne vulgaris.

  7. Accuracy evaluation of contour next compared with five blood glucose monitoring systems across a wide range of blood glucose concentrations occurring in a clinical research setting.

    PubMed

    Klaff, Leslie J; Brazg, Ronald; Hughes, Kristen; Tideman, Ann M; Schachner, Holly C; Stenger, Patricia; Pardo, Scott; Dunne, Nancy; Parkes, Joan Lee

    2015-01-01

    This study evaluated the accuracy of Contour(®) Next (CN; Bayer HealthCare LLC, Diabetes Care, Whippany, NJ) compared with five blood glucose monitoring systems (BGMSs) across a wide range of clinically occurring blood glucose levels. Subjects (n=146) were ≥ 18 years and had type 1 or type 2 diabetes. Subjects' glucose levels were safely lowered or raised to provide a wide range of glucose values. Capillary blood samples were tested on six BGMSs and a YSI glucose analyzer (YSI Life Sciences, Inc., Yellow Springs, OH) as the reference. Extreme glucose values were achieved by glucose modification of the blood sample. System accuracy was assessed by mean absolute difference (MAD) and mean absolute relative difference (MARD) across several glucose ranges, with <70 mg/dL evaluated by MAD as the primary end point. In the low glucose range (<70 mg/dL), MAD values were as follows: Accu-Chek(®) Aviva Nano (Roche Diagnostics, Indianapolis, IN), 3.34 mg/dL; CN, 2.03 mg/dL; FreeStyle Lite(®) (FSL; Abbott Diabetes Care, Inc., Alameda, CA), 2.77 mg/dL; OneTouch(®) Ultra(®) 2 (LifeScan, Inc., Milpitas, CA), 10.20 mg/dL; OneTouch(®) Verio(®) Pro (LifeScan, Inc.), 4.53 mg/dL; and Truetrack(®) (Nipro Diagnostics, Inc., Fort Lauderdale, FL), 11.08 mg/dL. The lowest MAD in the low glucose range, from CN, was statistically significantly lower than those of the other BGMSs with the exception of the FSL. CN also had a statistically significantly lower MARD than all other BGMSs in the low glucose range. In the overall glucose range (21-496 mg/dL), CN yielded the lowest MAD and MARD values, which were statistically significantly lower in comparison with the other BGMSs. When compared with other BGMSs, CN demonstrated the lowest mean deviation from the reference value (by MAD and MARD) across multiple glucose ranges.
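
    The two end-point statistics used above are simple to compute from paired meter and reference readings: MAD is the mean absolute difference in mg/dL and MARD is the mean absolute relative difference in percent. The values in the sketch below are hypothetical pairs, not study data.

      # Sketch: MAD and MARD from paired meter and reference readings (hypothetical values).
      import numpy as np

      meter = np.array([62.0, 55.0, 71.0, 110.0, 250.0, 330.0])      # BGMS readings, mg/dL
      reference = np.array([60.0, 58.0, 68.0, 105.0, 243.0, 341.0])  # reference readings, mg/dL

      mad = np.mean(np.abs(meter - reference))                       # mean absolute difference
      mard = np.mean(np.abs(meter - reference) / reference) * 100    # mean absolute relative difference, %

      print(f"MAD  = {mad:.2f} mg/dL")
      print(f"MARD = {mard:.2f} %")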

  8. Effect of the image resolution on the statistical descriptors of heterogeneous media.

    PubMed

    Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime

    2018-02-01

    The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction, due to the progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function diverts from the reference function. Moreover, the equally weighted sum of the average of the squared difference, between the discrete correlation functions of the decimated images and the reference functions, leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost, when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintaining the statistical quality of the digitized sample.
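
    A small sketch of the ingredients discussed above: a binary two-phase image, its two-point correlation estimated along one axis, and a simple 2×2 block decimation followed by re-binarization. The random image is a stand-in for a real micrograph, and the block-averaging step is only a crude analogue of the bilinear procedure.

      # Sketch: two-point correlation of a binary image and a simple 2x2 decimation
      # step (random image as a stand-in for a real micrograph).
      import numpy as np

      rng = np.random.default_rng(4)
      img = (rng.random((256, 256)) < 0.4).astype(float)   # phase-1 volume fraction ~ 0.4

      def two_point(img, rmax=32):
          """S2(r): probability that two points r pixels apart along x (periodic
          boundaries) both belong to phase 1."""
          return np.array([np.mean(img * np.roll(img, r, axis=1)) for r in range(rmax)])

      def decimate_2x2(img):
          """Average 2x2 blocks (crude bilinear-style reduction) and rebinarize at 0.5."""
          h, w = img.shape
          blocks = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
          return (blocks >= 0.5).astype(float)

      s2_full = two_point(img)
      s2_dec = two_point(decimate_2x2(img))     # note: lag r now spans twice the physical distance
      print("S2(0) full / decimated:", s2_full[0], s2_dec[0])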

  9. Effect of the image resolution on the statistical descriptors of heterogeneous media

    NASA Astrophysics Data System (ADS)

    Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime

    2018-02-01

    The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction, due to the progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function diverts from the reference function. Moreover, the equally weighted sum of the average of the squared difference, between the discrete correlation functions of the decimated images and the reference functions, leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost, when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintaining the statistical quality of the digitized sample.

  10. Computer-Assisted Search Of Large Textual Data Bases

    NASA Technical Reports Server (NTRS)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  11. Prediction of rain effects on earth-space communication links operating in the 10 to 35 GHz frequency range

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.

    1989-01-01

    This paper reviews the effects of precipitation on earth-space communication links operating in the 10 to 35 GHz frequency range. Emphasis is on the quantitative prediction of rain attenuation and depolarization. Discussions center on the models developed at Virginia Tech. Comments on other models are included, as well as literature references to key works. Also included is the system-level modeling of dual-polarized communication systems, with techniques for calculating antenna and propagation-medium effects. Simple models for the calculation of average annual attenuation and cross-polarization discrimination (XPD) are presented. The calculation of worst-month statistics is also presented.

  12. MGmapper: Reference based mapping and taxonomy annotation of metagenomics sequence reads

    PubMed Central

    Lukjancenko, Oksana; Thomsen, Martin Christen Frølund; Maddalena Sperotto, Maria; Lund, Ole; Møller Aarestrup, Frank; Sicheritz-Pontén, Thomas

    2017-01-01

    An increasing number of species and gene identification studies rely on next generation sequence analysis of either single-isolate or metagenomics samples. Several methods are available to perform taxonomic annotations, and a previous metagenomics benchmark study has shown that a vast number of false positive species annotations are a problem unless thresholds or post-processing are applied to differentiate between correct and false annotations. MGmapper is a package to process raw next generation sequence data and perform reference based sequence assignment, followed by a post-processing analysis to produce reliable taxonomy annotation at species and strain level resolution. An in-vitro bacterial mock community sample comprising 8 genera, 11 species and 12 strains was previously used to benchmark metagenomics classification methods. After applying a post-processing filter, we obtained 100% correct taxonomy assignments at species and genus level. A sensitivity and precision of 75% were obtained for strain level annotations. A comparison between MGmapper and Kraken at species level shows that MGmapper assigns taxonomy using 84.8% of the sequence reads, compared to 70.5% for Kraken, and both methods identified all species with no false positives. Extensive read count statistics are provided in plain text and Excel sheets for both rejected and accepted taxonomy annotations. The use of custom databases is possible for the command-line version of MGmapper, and the complete pipeline is freely available as a Bitbucket package (https://bitbucket.org/genomicepidemiology/mgmapper). A web version (https://cge.cbs.dtu.dk/services/MGmapper) provides the basic functionality for analysis of small fastq datasets. PMID:28467460

  13. MGmapper: Reference based mapping and taxonomy annotation of metagenomics sequence reads.

    PubMed

    Petersen, Thomas Nordahl; Lukjancenko, Oksana; Thomsen, Martin Christen Frølund; Maddalena Sperotto, Maria; Lund, Ole; Møller Aarestrup, Frank; Sicheritz-Pontén, Thomas

    2017-01-01

    An increasing number of species and gene identification studies rely on next generation sequence analysis of either single-isolate or metagenomics samples. Several methods are available to perform taxonomic annotations, and a previous metagenomics benchmark study has shown that a vast number of false positive species annotations are a problem unless thresholds or post-processing are applied to differentiate between correct and false annotations. MGmapper is a package to process raw next generation sequence data and perform reference based sequence assignment, followed by a post-processing analysis to produce reliable taxonomy annotation at species and strain level resolution. An in-vitro bacterial mock community sample comprising 8 genera, 11 species and 12 strains was previously used to benchmark metagenomics classification methods. After applying a post-processing filter, we obtained 100% correct taxonomy assignments at species and genus level. A sensitivity and precision of 75% were obtained for strain level annotations. A comparison between MGmapper and Kraken at species level shows that MGmapper assigns taxonomy using 84.8% of the sequence reads, compared to 70.5% for Kraken, and both methods identified all species with no false positives. Extensive read count statistics are provided in plain text and Excel sheets for both rejected and accepted taxonomy annotations. The use of custom databases is possible for the command-line version of MGmapper, and the complete pipeline is freely available as a Bitbucket package (https://bitbucket.org/genomicepidemiology/mgmapper). A web version (https://cge.cbs.dtu.dk/services/MGmapper) provides the basic functionality for analysis of small fastq datasets.

  14. Lack of association between depression and C-reactive protein level in the baseline of Longitudinal Study of Adult Health (ELSA-Brasil).

    PubMed

    de Menezes, Sara Teles; de Figueiredo, Roberta Carvalho; Goulart, Alessandra Carvalho; Nunes, Maria Angélica; M Benseñor, Isabela; Viana, Maria Carmen; Barreto, Sandhi Maria

    2017-01-15

    Depression has been linked to increased levels of inflammatory markers in clinical studies, but results from general population samples are inconsistent. We aimed to investigate whether depression was associated with serum CRP levels in a cross-sectional analysis of a large cohort from a middle-income country. We analyzed baseline data from 14,821 participants (35-74 years) of the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil). Current depression (last 7 days) was assessed by the Clinical Interview Schedule-Revised (CIS-R). Because individuals on antidepressants could be negative on CIS-R due to their therapeutic effect, the explanatory variable had three categories: (1) negative on CIS-R and not using antidepressant (reference); (2) negative on CIS-R but using antidepressant; (3) positive on CIS-R with/without antidepressant use. Associations with CRP were investigated by a general linear model (GLM). After adjustments for confounders, neither current depression nor antidepressant use was statistically associated with elevated CRP levels. Additionally, analyses stratified by gender, type and severity of depression did not change the results. The reference group in our analysis might include participants with a lifetime history of depression. Additionally, the exclusion of questions on weight fluctuation and appetite from the CIS-R applied in ELSA-Brasil may have slightly underestimated the prevalence of depression, as well as limited our ability to assess the presence of somatic symptoms. This study found no association between current depression, use of antidepressants, and serum CRP levels. Copyright © 2016 Elsevier B.V. All rights reserved.
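
    For readers unfamiliar with the modeling setup, the snippet below sketches a general linear model with a three-level exposure and a designated reference category, in the spirit of the analysis described above. The data are synthetic and the variable names are invented; the actual ELSA-Brasil covariate set and model specification are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "crp": rng.lognormal(mean=0.0, sigma=0.8, size=n),          # right-skewed marker
    "group": rng.choice(["neg_no_ad", "neg_ad", "cisr_pos"], size=n),
    "age": rng.integers(35, 75, size=n),
    "sex": rng.choice(["F", "M"], size=n),
    "bmi": rng.normal(27, 4, size=n),
})

# General linear model on log-CRP with the CIS-R-negative, antidepressant-free
# group as the reference category for the three-level exposure.
model = smf.ols(
    "np.log(crp) ~ C(group, Treatment(reference='neg_no_ad')) + age + C(sex) + bmi",
    data=df,
).fit()
print(model.params)
```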

  15. The impact of exposure to radio frequency electromagnetic fields on chronic well-being in young people--a cross-sectional study based on personal dosimetry.

    PubMed

    Heinrich, Sabine; Thomas, Silke; Heumann, Christian; von Kries, Rüdiger; Radon, Katja

    2011-01-01

    A possible influence of radio frequency electromagnetic field (RF EMF) exposure on health outcomes was investigated in various studies. The main problem of previous studies was exposure assessment. The aim of our study was the investigation of a possible association between RF EMF and chronic well-being in young persons using personal dosimetry. 3022 children and adolescents were randomly selected from the population registries of four Bavarian cities in Germany (participation 52%). Personal interview data on chronic symptoms, socio-demographic characteristics and potential confounders were collected. A 24-h radio frequency exposure profile was generated using a personal dosimeter. Exposure levels over waking hours were expressed as mean percentage of the International Commission on Non-Ionizing Radiation Protection (ICNIRP) reference level. Half of the children and nearly every adolescent owned a mobile phone which was used only for short durations per day. Measured exposure was far below the current ICNIRP reference levels. The most reported chronic symptom in children and adolescents was fatigue. No statistically significant association between measured exposure and chronic symptoms was observed. Our results do not indicate an association between measured exposure to RF EMF and chronic well-being in children and adolescents. Prospective studies investigating potential long-term effects of RF EMF are necessary to confirm our results. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Sociodemographic and smoking associated with obesity in adult women in Iran: results from the National Health Survey.

    PubMed

    Bakhshi, Enayatollah; Eshraghian, Mohammad Reza; Mohammad, Kazem; Foroushani, Abbas Rahimi; Zeraati, Hojat; Fotouhi, Akbar; Siassi, Fraidon; Seifi, Behjat

    2008-12-01

    No previous study in Iran has had a sample size sufficient to examine the association of sociodemographic factors and smoking with obesity. The goal was to investigate these associations in Iranian women. Multivariate statistical analyses included 14,176 women between 20 and 69 years of age. Height and weight were measured rather than self-reported. In Iranian adult women, obesity ORs for moderate and high education were 0.78 and 0.41, respectively, compared with the basic level. Using the low economy index as the reference, obesity ORs for urban women were 1.29, 1.25 and 1.28 for the lower-middle, upper-middle and high groups, respectively. Obesity ORs for rural women were 1.71, 1.71 and 2.02 for the lower-middle, upper-middle and high groups, respectively. The obesity OR was 0.48 for the active workforce compared with the inactive group, and 0.70 for women who smoked compared with nonsmokers. Using non-married women as the reference group, obesity ORs were 1.23 and 2.34 for married urban and rural women, respectively. Our results on the associations between age, smoking, education level, workforce status and obesity are consistent with most studies, whereas those for economic level and obesity are consistent with some studies in developing countries.
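
    The odds ratios quoted above come from logistic regression with explicit reference categories. A minimal sketch of that kind of analysis, with entirely synthetic data and invented category labels (not the survey's variables), might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "obese": rng.integers(0, 2, size=n),
    "education": rng.choice(["basic", "moderate", "high"], size=n),
    "economy": rng.choice(["low", "lower_middle", "upper_middle", "high"], size=n),
    "smoker": rng.integers(0, 2, size=n),
    "age": rng.integers(20, 70, size=n),
})

# Logistic regression; 'basic' education and 'low' economy index are the references.
fit = smf.logit(
    "obese ~ C(education, Treatment('basic'))"
    " + C(economy, Treatment('low')) + smoker + age",
    data=df,
).fit(disp=False)

odds_ratios = np.exp(fit.params)        # exponentiated coefficients = odds ratios
conf_int = np.exp(fit.conf_int())       # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```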

  17. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method.

    PubMed

    Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A

    2018-02-01

    To evaluate a novel methodology using industrial scanners as a reference, and assess in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions. Further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference-scans, ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken, 3M Impregum Penta Soft, and poured models were digitized with laboratory scanner 3shape D1000 (D1000). Best-fit alignment of reference-bodies and 3D Compare Analysis was performed. Precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. Accuracy of IOS and IMPR were analyzed using ATOS as reference. Precision of IOS was evaluated through intra-system comparison. Precision of ATOS reference scanner (mean 0.6 μm) and D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference of accuracy between two scanner-groups: 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference to IOS. However, deviations of IOS and IMPR were within a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing accuracy of IOS and IMPR in vivo in up to five units bilaterally from midline. 3M and TRIOS had a higher accuracy than OMNI. IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Nocturnal oxygen saturation profiles of healthy term infants.

    PubMed

    Terrill, Philip Ian; Dakin, Carolyn; Hughes, Ian; Yuill, Maggie; Parsley, Chloe

    2015-01-01

    Pulse oximetry is used extensively in hospital and home settings to measure arterial oxygen saturation (SpO2). Interpretation of the trend and range of SpO2 values observed in infants is currently limited by a lack of reference ranges using current devices, and may be augmented by development of cumulative frequency (CF) reference-curves. This study aims to provide reference oxygen saturation values from a prospective longitudinal cohort of healthy infants. Prospective longitudinal cohort study. Sleep-laboratory. 34 healthy term infants were enrolled, and studied at 2 weeks, 3, 6, 12 and 24 months of age (N=30, 25, 27, 26, 20, respectively). Full overnight polysomnography, including 2 s averaging pulse oximetry (Masimo Radical). Summary SpO2 statistics (mean, median, 5th and 10th percentiles) and SpO2 CF plots were calculated for each recording. CF reference-curves were then generated for each study age. Analyses were repeated with sleep-state stratifications and inclusion of manual artefact removal. Median nocturnal SpO2 values ranged between 98% and 99% over the first 2 years of life and the CF reference-curves shift right by 1% between 2 weeks and 3 months. CF reference-curves did not change with manual artefact removal during sleep and did not vary between rapid eye movement (REM) and non-REM sleep. Manual artefact removal did significantly change summary statistics and CF reference-curves during wake. SpO2 CF curves provide an intuitive visual tool for evaluating whether an individual's nocturnal SpO2 distribution falls within the range of healthy age-matched infants, thereby complementing summary statistics in the interpretation of extended oximetry recordings in infants. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
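
    The summary statistics and cumulative frequency (CF) curves described above are straightforward to compute from an overnight recording. Below is a minimal sketch using a synthetic night of 2-s SpO2 samples; it is not the study's data or analysis software.

```python
import numpy as np

def spo2_summary(spo2, percentiles=(5, 10, 50)):
    """Summary statistics and a cumulative-frequency (CF) curve for an
    overnight SpO2 recording (one value per 2-s averaging window)."""
    spo2 = np.asarray(spo2, dtype=float)
    stats = {
        "mean": spo2.mean(),
        "median": np.median(spo2),
        **{f"p{p}": np.percentile(spo2, p) for p in percentiles},
    }
    grid = np.arange(70, 101)                            # integer SpO2 values, %
    cf = np.array([(spo2 <= g).mean() for g in grid])    # fraction of epochs <= value
    return stats, grid, cf

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Synthetic night: ~6 h of 2-s samples clustered near 98%, clipped to a plausible range.
    night = np.clip(rng.normal(98, 1.0, size=6 * 3600 // 2), 80, 100)
    summary, grid, cf = spo2_summary(night)
    print(summary)
```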

  19. Some Tests of Randomness with Applications

    DTIC Science & Technology

    1981-02-01

    freedom. For further details, the reader is referred to Gnanadesikan (1977, p. 169) wherein other relevant tests are also given, Graphical tests, as...sample from a gamma distribution. J. Am. Statist. Assoc. 71, 480-7. Gnanadesikan, R. (1977). Methods for Statistical Data Analysis of Multivariate

  20. Network Polymers Formed Under Nonideal Conditions.

    DTIC Science & Technology

    1986-12-01

    the system or the limited ability of the statistical model to account for stochastic correlations. The viscosity of the reacting system was measured as...based on competing reactions (ring, chain) and employs equilibrium chain statistics. The work thus far has been limited to single cycle growth on an...polymerizations, because a large number of differential equations must be solved. The Markovian approach (sometimes referred to as the statistical or

  1. Common pitfalls in statistical analysis: Measures of agreement.

    PubMed

    Ranganathan, Priya; Pramesh, C S; Aggarwal, Rakesh

    2017-01-01

    Agreement between measurements refers to the degree of concordance between two (or more) sets of measurements. Statistical methods to test agreement are used to assess inter-rater variability or to decide whether one technique for measuring a variable can substitute for another. In this article, we look at statistical measures of agreement for different types of data and discuss the differences between these and those for assessing correlation.

  2. Humidity-corrected Arrhenius equation: The reference condition approach.

    PubMed

    Naveršnik, Klemen; Jurečič, Rok

    2016-03-16

    Accelerated and stress stability data are often used to predict the shelf life of pharmaceuticals. Temperature, combined with humidity, accelerates chemical decomposition, and the Arrhenius equation is used to extrapolate accelerated stability results to long-term stability. Statistical estimation of the humidity-corrected Arrhenius equation is not straightforward due to its non-linearity. A two-stage nonlinear fitting approach is used in practice, followed by a prediction stage. We developed a single-stage statistical procedure, called the reference condition approach, which has better statistical properties (less collinearity, direct estimation of uncertainty, narrower prediction interval) and is significantly easier to use than the existing approaches. Our statistical model was populated with data from a 35-day stress stability study on a laboratory batch of vitamin tablets and required a mere 30 laboratory assay determinations. The stability prediction agreed well with the actual 24-month long term stability of the product. The approach has high potential to assist product formulation, specification setting and stability statements. Copyright © 2016 Elsevier B.V. All rights reserved.
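
    The reference condition idea can be sketched as a reparameterization of the humidity-corrected Arrhenius model about a chosen reference temperature and humidity, so that one fitted parameter is directly the log rate at that condition. The model form, the data and the starting values below are assumptions made for illustration, not the published procedure or dataset.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314                      # J mol^-1 K^-1
T_REF, RH_REF = 298.15, 60.0   # assumed reference condition: 25 degC / 60% RH

def ln_rate_rc(X, ln_k_ref, Ea, B):
    """Humidity-corrected Arrhenius model, reparameterized about the reference
    condition so that ln_k_ref is the log degradation rate at (T_REF, RH_REF)."""
    T, RH = X
    return ln_k_ref - (Ea / R) * (1.0 / T - 1.0 / T_REF) + B * (RH - RH_REF)

# Hypothetical stress-study design: temperatures (K), relative humidities (%),
# and observed log degradation rates.
T = np.array([313.15, 313.15, 323.15, 323.15, 333.15, 333.15])
RH = np.array([20.0, 75.0, 20.0, 75.0, 20.0, 75.0])
ln_k_obs = np.array([-4.8, -4.1, -4.0, -3.3, -3.2, -2.5])

popt, pcov = curve_fit(ln_rate_rc, (T, RH), ln_k_obs, p0=(-6.0, 8.0e4, 0.01))
ln_k_ref, Ea, B = popt
print(f"ln k at reference condition: {ln_k_ref:.2f}")
print(f"Ea: {Ea / 1000:.1f} kJ/mol,  humidity coefficient B: {B:.3f} per %RH")
```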

  3. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is for justifying a biowaiver for post-approval changes which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to or in many cases superior to the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
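
    For context, the classical f2 similarity factor that the F2 parameter extends is simple to compute; the Bayesian test procedure itself is not reproduced here. A minimal sketch with hypothetical dissolution profiles:

```python
import numpy as np

def f2(reference, test):
    """Classical similarity factor f2 between two mean dissolution profiles
    sampled at the same time points (percent dissolved)."""
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)
    msd = np.mean((reference - test) ** 2)                # mean squared difference
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

# Hypothetical mean profiles (% dissolved at 10, 20, 30 and 45 minutes).
ref = np.array([35.0, 58.0, 76.0, 91.0])
tst = np.array([30.0, 55.0, 74.0, 90.0])
print(f"f2 = {f2(ref, tst):.1f}  (f2 >= 50 is the usual similarity criterion)")
```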

  4. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve but, rather, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays including transfection efficiency analysis and pathogen quantification.
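
    Two of the designs outlined above, the external standard curve and the comparison against an internal reference gene, can be sketched briefly. The Ct values below are invented, and the delta-delta-Ct shortcut assumes roughly 100% amplification efficiency; the paper's regression and ANOVA quality-control models are not reproduced.

```python
import numpy as np

# --- External standard curve (hypothetical data) ------------------------------
log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)     # serially diluted template
ct_standard = np.array([15.1, 18.5, 21.9, 25.2, 28.6])    # observed Ct values

slope, intercept = np.polyfit(log10_copies, ct_standard, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                    # ~1.0 means 100% efficient
print(f"slope={slope:.2f}, intercept={intercept:.1f}, efficiency={efficiency:.2f}")

def copies_from_ct(ct):
    """Interpolate an absolute copy number from the standard curve."""
    return 10 ** ((ct - intercept) / slope)

# --- Delta-delta-Ct against a single-copy internal reference gene -------------
ct = {"calib_tg": 24.6, "calib_ref": 23.1,   # known single-copy calibrator event
      "test_tg": 23.5, "test_ref": 23.0}     # putative event under test
ddct = (ct["test_tg"] - ct["test_ref"]) - (ct["calib_tg"] - ct["calib_ref"])
relative_copies = 2.0 ** (-ddct)
print(f"estimated copy number relative to the calibrator: {relative_copies:.2f}")
```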

  5. Sexually Transmitted Diseases: A Selective, Annotated Bibliography.

    ERIC Educational Resources Information Center

    Planned Parenthood Federation of America, Inc., New York, NY. Education Dept.

    This document contains a reference sheet and an annotated bibliography concerned with sexually transmitted diseases (STD). The reference sheet provides a brief, accurate overview of STDs which includes both statistical and background information. The bibliography contains 83 entries, listed alphabetically, that deal with STDs. Books and articles…

  6. Determination of Reference Catalogs for Meridian Observations Using Statistical Method

    NASA Astrophysics Data System (ADS)

    Li, Z. Y.

    2014-09-01

    The meridian observational data are useful for developing high-precision planetary ephemerides of the solar system. These historical data are provided by the Jet Propulsion Laboratory (JPL) or the Institut De Mecanique Celeste Et De Calcul Des Ephemerides (IMCCE). However, we find that the reference systems (realized by the fundamental catalogs FK3 (Third Fundamental Catalogue), FK4 (Fourth Fundamental Catalogue), and FK5 (Fifth Fundamental Catalogue), or Hipparcos), to which the observations are referred, are not given explicitly for some sets of data. The incompleteness of information prevents us from eliminating the systematic effects due to the different fundamental catalogs. The purpose of this paper is to clearly specify the reference catalogs of the observations whose records have this problem, using the JPL DE421 ephemeris. The data for the corresponding planets in the geocentric celestial reference system (GCRS) obtained from DE421 are transformed to apparent places under different hypotheses regarding the reference catalogs. The validity of each hypothesis is then tested with two kinds of statistical quantities that indicate the significance of the difference between the original and transformed data series. As a result, this method proves effective for specifying the reference catalogs, and the missing information is determined unambiguously. Finally these meridian data are transformed to the GCRS for further applications in the development of planetary ephemerides.

  7. Simplified estimation of age-specific reference intervals for skewed data.

    PubMed

    Wright, E M; Royston, P

    1997-12-30

    Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, the entire density is estimated, and an explicit formula is available for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
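
    A much-simplified sketch of the mean/standard-deviation regression idea is given below; the published method additionally models skewness through a transformation step, which is omitted here. The data, polynomial degree and function names are invented for illustration.

```python
import numpy as np

def age_specific_reference_interval(age, value, degree=2, z=1.96):
    """Centile curves from polynomial models of the mean and SD as smooth
    functions of age (assumes approximate normality, e.g. after a log
    transformation of the raw measurements)."""
    age, value = np.asarray(age, float), np.asarray(value, float)
    mean_coef = np.polyfit(age, value, degree)
    resid = value - np.polyval(mean_coef, age)
    # Regress |residuals| on age; under normality E|resid| = SD * sqrt(2/pi),
    # so scaling by sqrt(pi/2) makes the fitted curve estimate the SD.
    sd_coef = np.polyfit(age, np.abs(resid) * np.sqrt(np.pi / 2.0), degree)

    def centiles(a):
        mu, sd = np.polyval(mean_coef, a), np.polyval(sd_coef, a)
        return mu - z * sd, mu, mu + z * sd      # 2.5th, 50th, 97.5th centiles

    return centiles

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    age = rng.uniform(16, 40, size=400)
    value = 10 + 0.4 * age + rng.normal(0, 1 + 0.05 * age)   # spread widens with age
    ri = age_specific_reference_interval(age, value)
    print(np.round(ri(np.array([20.0, 30.0, 40.0])), 2))
```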

  8. Statistical Analyses of Hydrophobic Interactions: A Mini-Review

    DOE PAGES

    Pratt, Lawrence R.; Chaudhari, Mangesh I.; Rempe, Susan B.

    2016-07-14

    Here this review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. First, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Second, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, nontrivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Lastly, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an interpretive danger.

  9. Indirect methods for reference interval determination - review and recommendations.

    PubMed

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kogalovskii, M.R.

    This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Databases that are used for statistical analysis are referred to as statistical databases (SDBs). Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMS) satisfy SDB requirements. Some current research directions in SDB systems are considered.

  11. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    PubMed

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, the PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product to the reference product when both possess multimodal PSDs.
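
    The EMD for one-dimensional profiles is available off the shelf; the sketch below uses SciPy's 1-D Wasserstein distance on two invented bimodal PSD profiles, exactly the situation where D50 and SPAN are least informative. The PBE test layered on top of the EMD values is not reproduced.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Hypothetical size grid (micrometres) and bimodal density-style PSD profiles.
sizes = np.linspace(0.1, 10.0, 200)
ref_psd = np.exp(-0.5 * ((sizes - 1.0) / 0.2) ** 2) + 0.6 * np.exp(-0.5 * ((sizes - 4.0) / 0.5) ** 2)
test_psd = np.exp(-0.5 * ((sizes - 1.1) / 0.2) ** 2) + 0.5 * np.exp(-0.5 * ((sizes - 4.3) / 0.5) ** 2)

# Earth mover's distance, treating the (internally normalized) PSD values as
# weights placed on the size grid.
emd = wasserstein_distance(sizes, sizes, u_weights=ref_psd, v_weights=test_psd)
print(f"EMD between reference and test PSD profiles: {emd:.3f} um")
```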

  12. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    PubMed Central

    Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.

    2009-01-01

    In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212

  13. Modeling Cross-Situational Word–Referent Learning: Prior Questions

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
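
    The "dumb" associative mechanism discussed above amounts to little more than co-occurrence counting across ambiguous scenes. A toy sketch (invented vocabulary, two words and two referents per scene) shows how the correct word-referent pairs come to dominate the counts; it is not one of the article's simulation models.

```python
import numpy as np

words = ["wug", "dax", "blick", "toma"]
objects = ["ball", "cup", "dog", "shoe"]
true_map = dict(zip(words, objects))             # ground truth: word i names object i

rng = np.random.default_rng(5)
counts = np.zeros((len(words), len(objects)))

for _ in range(40):                              # 40 ambiguous learning situations
    present = rng.choice(len(words), size=2, replace=False)
    for w in present:                            # two words heard ...
        for o in present:                        # ... two referents visible, unordered
            counts[w, o] += 1.0                  # associate every word with every object

learned = {words[w]: objects[int(np.argmax(counts[w]))] for w in range(len(words))}
accuracy = np.mean([learned[w] == true_map[w] for w in words])
print(learned, f"accuracy = {accuracy:.2f}")
```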

  14. Statistical complexity without explicit reference to underlying probabilities

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the statistical complexity's notion to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  15. Evaluation of Modification of the 3M™ Molecular Detection Assay (MDA) Salmonella Method (2013.09) for the Detection of Salmonella in Selected Foods: Collaborative Study.

    PubMed

    Bird, Patrick; Fisher, Kiel; Boyle, Megan; Huffman, Travis; Benzinger, M Joseph; Bedinghaus, Paige; Flannery, Jonathon; Crowley, Erin; Agin, James; Goins, David; Benesh, DeAnn; David, John

    2014-01-01

    The 3M(™) Molecular Detection Assay (MDA) Salmonella utilizes isothermal amplification of nucleic acid sequences with high specificity, efficiency, rapidity and bioluminescence to detect amplification of Salmonella spp. in food, food-related, and environmental samples after enrichment. A method modification and matrix extension study of the previously approved AOAC Official Method(SM) 2013.09 was conducted, and approval of the modification was received on March 20, 2014. Using an unpaired study design in a multilaboratory collaborative study, the 3M MDA Salmonella method was compared to the U.S. Department of Agriculture/Food Safety and Inspection Service (USDA/FSIS) Microbiology Laboratory Guidebook (MLG) 4.05 (2011), Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg, and Catfish Products for raw ground beef and the U.S. Food and Drug Administration (FDA)/Bacteriological Analytical Manual (BAM) Chapter 5, Salmonella reference method for wet dog food following the current AOAC guidelines. A total of 20 laboratories participated. For the 3M MDA Salmonella method, raw ground beef was analyzed using 25 g test portions, and wet dog food was analyzed using 375 g test portions. For the reference methods, 25 g test portions of each matrix were analyzed. Each matrix was artificially contaminated with Salmonella at three inoculation levels: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). In this study, 1512 unpaired replicate samples were analyzed. Statistical analysis was conducted according to the probability of detection (POD). For the low-level raw ground beef test portions, the following dLPOD (difference between the LPODs of the reference and candidate method) values with 95% confidence intervals were obtained: -0.01 (-0.14, +0.12). For the low-level wet dog food test portions, the following dLPOD with 95% confidence intervals were obtained: -0.04 (-0.16, +0.09). No significant differences were observed in the number of positive samples detected by the 3M MDA Salmonella method versus either the USDA/FSIS-MLG or FDA/BAM methods.

  16. Evaluation of 3M molecular detection assay (MDA) Salmonella for the detection of Salmonella in selected foods: collaborative study.

    PubMed

    Bird, Patrick; Fisher, Kiel; Boyle, Megan; Huffman, Travis; Benzinger, M Joseph; Bedinghaus, Paige; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Benesh, DeAnn; David, John

    2013-01-01

    The 3M Molecular Detection Assay (MDA) Salmonella is used with the 3M Molecular Detection System for the detection of Salmonella spp. in food, food-related, and environmental samples after enrichment. The assay utilizes loop-mediated isothermal amplification to rapidly amplify Salmonella target DNA with high specificity and sensitivity, combined with bioluminescence to detect the amplification. The 3M MDA Salmonella method was compared using an unpaired study design in a multilaboratory collaborative study to the U.S. Department of Agriculture/Food Safety and Inspection Service-Microbiology Laboratory Guidebook (USDA/FSIS-MLG 4.05), Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg and Catfish Products for raw ground beef and the U.S. Food and Drug Administration/Bacteriological Analytical Manual (FDA/BAM) Chapter 5 Salmonella reference method for wet dog food following the current AOAC guidelines. A total of 20 laboratories participated. For the 3M MDA Salmonella method, raw ground beef was analyzed using 25 g test portions, and wet dog food was analyzed using 375 g test portions. For the reference methods, 25 g test portions of each matrix were analyzed. Each matrix was artificially contaminated with Salmonella at three inoculation levels: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). In this study, 1512 unpaired replicate samples were analyzed. Statistical analysis was conducted according to the probability of detection (POD). For the low-level raw ground beef test portions, the following dLPOD (difference between the POD of the reference and candidate method) values with 95% confidence intervals were obtained: -0.01 (-0.14, +0.12). For the low-level wet dog food test portions, the following dLPOD with 95% confidence intervals were obtained: -0.04 (-0.16, +0.09). No significant differences were observed in the number of positive samples detected by the 3M MDA Salmonella method versus either the USDA/FSIS-MLG or FDA/BAM methods.
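
    As a rough guide to how the dLPOD figures above are read, the sketch below computes PODs as simple positive fractions and a Wald-type interval for their difference. The AOAC calculations use laboratory-adjusted LPODs and a more refined interval, so this is only an approximation, and the counts are invented.

```python
import numpy as np
from scipy import stats

def dlpod_wald(pos_ref, n_ref, pos_cand, n_cand, alpha=0.05):
    """Reference-minus-candidate difference in probability of detection with
    a simple Wald confidence interval (approximation, not the AOAC formula)."""
    p_ref, p_cand = pos_ref / n_ref, pos_cand / n_cand
    d = p_ref - p_cand
    se = np.sqrt(p_ref * (1 - p_ref) / n_ref + p_cand * (1 - p_cand) / n_cand)
    z = stats.norm.ppf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

# Hypothetical low-level counts: positives out of replicate test portions.
d, ci = dlpod_wald(pos_ref=46, n_ref=72, pos_cand=45, n_cand=72)
print(f"dLPOD = {d:+.2f}, 95% CI = ({ci[0]:+.2f}, {ci[1]:+.2f})")
# No significant difference is concluded when the interval contains zero.
```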

  17. Relationship Between Occlusal Plane and Three Levels of Ala Tragus line in Dentulous and Partially Dentulous Patients in Different Age Groups: A Pilot Study

    PubMed Central

    Shaikh, Saquib Ahmed; K, Lekha

    2015-01-01

    Statement of problem: Correct orientation of the occlusal plane plays a vital role in achieving optimal aesthetics, occlusal balance and function of complete dentures. The use of the ala-tragus line for determination of the occlusal plane has been a topic of debate for many years. Also, the effect of age on the level of the ala-tragus line has not been investigated in the past. Purpose: To determine the effect of age on the location of the ala-tragus line. Materials and Methods: A total of 180 patients (90 males and 90 females) with complete dentition were selected and grouped according to their age into three age groups with 60 subjects in each (Group A: 20-35 y, Group B: 36-50 y, Group C: 51-65 y). Right lateral profile photographs were taken with subjects having a Fox plane placed intraorally parallel to the occlusal plane. Reference points corresponding to the inferior border, middle or superior border of the tragus and the inferior border of the ala of the nose were marked on the photographs. These were joined to obtain three different levels of the ala-tragus line. Images were analysed photometrically and the most parallel relationship was determined between the arms of the Fox plane (which represented the occlusal plane) and the three levels of the ala-tragus line. Data obtained were subjected to statistical analysis using Pearson chi-square and likelihood-ratio chi-square tests. Results: A significant correlation was found between age and the level of the ala-tragus line. The occlusal plane was found to be more parallel to the ala-tragus line when the inferior border of the tragus was taken as the posterior reference point in the young adult age group (20-35 y). In the older age groups, the occlusal plane was more parallel to the ala-tragus line when the middle of the tragus was taken as the posterior reference point. Conclusion: Within the limitations of this study, it can be concluded that a definite relationship exists between age and the level of the ala-tragus line. PMID:25859523

  18. The epistemological status of general circulation models

    NASA Astrophysics Data System (ADS)

    Loehle, Craig

    2018-03-01

    Forecasts of both likely anthropogenic effects on climate and consequent effects on nature and society are based on large, complex software tools called general circulation models (GCMs). Forecasts generated by GCMs have been used extensively in policy decisions related to climate change. However, the relation between underlying physical theories and results produced by GCMs is unclear. In the case of GCMs, many discretizations and approximations are made, and simulating Earth system processes is far from simple and currently leads to some results with unknown energy balance implications. Statistical testing of GCM forecasts for degree of agreement with data would facilitate assessment of fitness for use. If model results need to be put on an anomaly basis due to model bias, then both visual and quantitative measures of model fit depend strongly on the reference period used for normalization, making testing problematic. Epistemology is here applied to problems of statistical inference during testing, the relationship between the underlying physics and the models, the epistemic meaning of ensemble statistics, problems of spatial and temporal scale, the existence or not of an unforced null for climate fluctuations, the meaning of existing uncertainty estimates, and other issues. Rigorous reasoning entails carefully quantifying levels of uncertainty.

  19. Back disorders and health problems among subway train operators exposed to whole-body vibration.

    PubMed

    Johanning, E

    1991-12-01

    Back disease associated with whole-body vibration has not been evaluated for subway train operators. A recent study demonstrated that this group is exposed to whole-body vibration at levels above the international standard. To investigate this risk further, a self-administered questionnaire survey was conducted among subway train operators (N = 492) and a similar reference group (N = 92). The operators had a higher prevalence than the referents in all aspects of back problems, particularly for cervical and lower back pain. In a multiple logistic regression model, the odds ratio for sciatic pain among subway train operators was 3.9 (95% CI 1.7-8.6); the operators also had a higher risk of hearing-related problems (odds ratio 3.2, 95% CI 0.6-17.4) and of gastrointestinal problems (odds ratio 1.6, 95% CI 1.1-2.5). Although a cumulative dose-response relationship could not be statistically demonstrated, the findings appear to be related to exposure to whole-body vibration and inadequate ergonomic conditions.

  20. Circles South East: the first 10 years 2002-2012.

    PubMed

    Bates, Andrew; Williams, Dominic; Wilson, Chris; Wilson, Robin J

    2014-07-01

    This article describes the first 10 years of the implementation of Circles of Support and Accountability (Circles) in the management of sexual offenders in South-East England by Circles South East (CSE). The Circles of 71 core members are reviewed in detail, with reference to demographic data, offense and sentencing histories, risk assessment data, and considerations regarding Multi-Agency Public Protection Arrangements. A group of 71 comparison subjects who were referred to CSE and deemed suitable for but did not receive the service was identified. Follow-up behaviors of both groups are examined (including all forms of reconviction, breach of orders, and prison recall). Over a comparable follow-up period of 55 months, the incidence of violent and contact sexual reconviction in the comparison group was significantly higher than for the Circles cohort. Comparisons are made between expected and actual levels of sexual reconviction, with the Circles cohort showing a lower than expected rate of sexual reconviction, but not to a statistically significant degree. © The Author(s) 2013.

  1. Component Models for Fuzzy Data

    ERIC Educational Resources Information Center

    Coppi, Renato; Giordani, Paolo; D'Urso, Pierpaolo

    2006-01-01

    The fuzzy perspective in statistical analysis is first illustrated with reference to the "Informational Paradigm" allowing us to deal with different types of uncertainties related to the various informational ingredients (data, model, assumptions). The fuzzy empirical data are then introduced, referring to "J" LR fuzzy variables as observed on "I"…

  2. Public Administration: A Bibliography of Selected Reference Sources.

    ERIC Educational Resources Information Center

    Brustman, Mary Jane

    This guide presents an annotated list of selected reference sources in public administration. All of the sources listed are found at the Graduate Library for Public Affairs and Policy (GLPP) located at the State University of New York, Albany. Detailed, exhaustive guides in literature, research, indexes, abstracts, statistical sources, government…

  3. Evaluation of the Reference Envelope Approach for Assessing Toxicity in Contaminated Surficial Urban Freshwater Sediments

    EPA Science Inventory

    The reference envelope (RE) has been proposed as an alternative approach to assess sediment toxicity to overcome limitations imposed by the use of control sediments including differences in non-contaminant characteristics and low statistical power when many test sediments are com...

  4. Cross-Situational Learning of Minimal Word Pairs

    ERIC Educational Resources Information Center

    Escudero, Paola; Mulak, Karen E.; Vlach, Haley A.

    2016-01-01

    "Cross-situational statistical learning" of words involves tracking co-occurrences of auditory words and objects across time to infer word-referent mappings. Previous research has demonstrated that learners can infer referents across sets of very phonologically distinct words (e.g., WUG, DAX), but it remains unknown whether learners can…

  5. Evaluation of Reference Genes for Quantitative Real-Time PCR in Oil Palm Elite Planting Materials Propagated by Tissue Culture

    PubMed Central

    Chan, Pek-Lan; Rose, Ray J.; Abdul Murad, Abdul Munir; Zainal, Zamri; Leslie Low, Eng-Ti; Ooi, Leslie Cheng-Li; Ooi, Siew-Eng; Yahya, Suzaini; Singh, Rajinder

    2014-01-01

    Background The somatic embryogenesis tissue culture process has been utilized to propagate high yielding oil palm. Due to the low callogenesis and embryogenesis rates, molecular studies were initiated to identify genes regulating the process, and their expression levels are usually quantified using reverse transcription quantitative real-time PCR (RT-qPCR). With the recent release of oil palm genome sequences, it is crucial to establish a proper strategy for gene analysis using RT-qPCR. Selection of the most suitable reference genes should be performed for accurate quantification of gene expression levels. Results In this study, eight candidate reference genes selected from cDNA microarray study and literature review were evaluated comprehensively across 26 tissue culture samples using RT-qPCR. These samples were collected from two tissue culture lines and media treatments, which consisted of leaf explants cultures, callus and embryoids from consecutive developmental stages. Three statistical algorithms (geNorm, NormFinder and BestKeeper) confirmed that the expression stability of novel reference genes (pOP-EA01332, PD00380 and PD00569) outperformed classical housekeeping genes (GAPDH, NAD5, TUBULIN, UBIQUITIN and ACTIN). PD00380 and PD00569 were identified as the most stably expressed genes in total samples, MA2 and MA8 tissue culture lines. Their applicability to validate the expression profiles of a putative ethylene-responsive transcription factor 3-like gene demonstrated the importance of using the geometric mean of two genes for normalization. Conclusions Systematic selection of the most stably expressed reference genes for RT-qPCR was established in oil palm tissue culture samples. PD00380 and PD00569 were selected for accurate and reliable normalization of gene expression data from RT-qPCR. These data will be valuable to the research associated with the tissue culture process. Also, the method described here will facilitate the selection of appropriate reference genes in other oil palm tissues and in the expression profiling of genes relating to yield, biotic and abiotic stresses. PMID:24927412
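
    The stability ranking produced by geNorm can be mimicked in a few lines: for each candidate gene, the M value is the average standard deviation of its pairwise log2 expression ratios against the other candidates, and lower M means more stable. The sketch below is an illustrative re-implementation on synthetic expression values, not the geNorm, NormFinder or BestKeeper software used in the study.

```python
import numpy as np

def genorm_m(expression):
    """geNorm-style stability measure M for each candidate reference gene.
    `expression` is a (samples x genes) array of relative quantities."""
    log_expr = np.log2(expression)
    n_genes = log_expr.shape[1]
    m_values = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m_values[j] = np.mean(sds)
    return m_values

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    n_samples, n_genes = 26, 8                     # e.g. 26 samples, 8 candidate genes
    shared = rng.lognormal(0.0, 0.2, size=(n_samples, 1))            # sample-wide effect
    gene_noise = rng.lognormal(0.0, rng.uniform(0.05, 0.5, n_genes),
                               size=(n_samples, n_genes))            # gene-specific scatter
    expr = shared * gene_noise
    for gene, m in sorted(enumerate(genorm_m(expr)), key=lambda t: t[1]):
        print(f"candidate {gene}: M = {m:.3f}")
```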

  6. Reference scenarios for deforestation and forest degradation in support of REDD: a review of data and methods

    NASA Astrophysics Data System (ADS)

    Olander, Lydia P.; Gibbs, Holly K.; Steininger, Marc; Swenson, Jennifer J.; Murray, Brian C.

    2008-04-01

    Global climate policy initiatives are now being proposed to compensate tropical forest nations for reducing carbon emissions from deforestation and forest degradation (REDD). These proposals have the potential to include developing countries more actively in international greenhouse gas mitigation and to address a substantial share of the world's emissions which come from tropical deforestation. For such a policy to be viable it must have a credible benchmark against which emissions reduction can be calculated. This benchmark, sometimes termed a baseline or reference emissions scenario, can be based directly on historical emissions or can use historical emissions as input for business as usual projections. Here, we review existing data and methods that could be used to measure historical deforestation and forest degradation reference scenarios including FAO (Food and Agricultural Organization of the United Nations) national statistics and various remote sensing sources. The freely available and corrected global Landsat imagery for 1990, 2000 and soon to come for 2005 may be the best primary data source for most developing countries with other coarser resolution high frequency or radar data as a valuable complement for addressing problems with cloud cover and for distinguishing larger scale degradation. While sampling of imagery has been effectively useful for pan-tropical and continental estimates of deforestation, wall-to-wall (or full coverage) allows more detailed assessments for measuring national-level reference emissions. It is possible to measure historical deforestation with sufficient certainty for determining reference emissions, but there must be continued calls at the international level for making high-resolution imagery available, and for financial and technical assistance to help countries determine credible reference scenarios. The data available for past years may not be sufficient for assessing all forms of forest degradation, but new data sources will have greater potential in 2007 and after. This paper focuses only on the methods for measuring changes in forest area, but this information must be coupled with estimates of change in forest carbon stocks in order to quantify emissions from deforestation and forest degradation.

  7. Elevated blood lipids are uncommon in patients with post-polio syndrome--a cross sectional study.

    PubMed

    Melin, Eva; Kahan, Thomas; Borg, Kristian

    2015-04-29

    The post-polio syndrome occurs in people who previously have had poliomyelitis. After the initial recovery, new or increasing neurologic symptoms occur. Inflammation and dyslipidaemia may play an important role in the development of atherosclerotic complications, for example, myocardial infarction and angina pectoris. Previous studies on cardiovascular risk factors in the post-polio syndrome have found a higher prevalence of hypertension, ischemic heart disease, hyperlipidaemia, and stroke in these patients. The present study was undertaken in order to evaluate whether post-polio patients have elevated lipid values and whether blood lipid abnormalities could be correlated to signs of inflammation. Cross-sectional study of 89 consecutive post-polio patients (53 women, mean age 65 years) from the Post-Polio Outpatient Clinic, Danderyd University Hospital, Stockholm, Sweden. The lipid profiles of post-polio patients were compared to age- and sex-matched reference values from two earlier studies. Statistical analyses were performed with Student's t-test, and linear regression analyses were assessed by Pearson's correlation coefficient. Mean total cholesterol levels (5.7 mmol/L) were low or normal in post-polio patients, whereas low density lipoprotein levels (3.6 mmol/L) were normal, and high density lipoprotein (1.5 mmol/L) and triglycerides (1.4 mmol/L) lower than reference values. The prevalence of diabetes (7%), hypertension (38%), concomitant cardiovascular disease, (including angina pectoris, myocardial infarction, heart failure, atrial fibrillation and stroke) (7%), and calculated 10 year risk of coronary heart disease according to Framingham risk score algorithm (8%) was not increased in post-polio patients. Compared to reference populations, post-polio patients in Sweden appear to have low or normal total cholesterol and low density lipoprotein levels, whereas high density lipoprotein and triglyceride levels are low. Hence, a possible persisting inflammatory process in post-polio syndrome does not seem to be associated with increased lipids and an increased risk for coronary heart disease events.

  8. Statistical considerations for harmonization of the global multicenter study on reference values.

    PubMed

    Ichihara, Kiyoshi

    2014-05-15

    The global multicenter study on reference values coordinated by the Committee on Reference Intervals and Decision Limits (C-RIDL) of the IFCC was launched in December 2011, targeting 45 commonly tested analytes with the following objectives: 1) to derive reference intervals (RIs) country by country using a common protocol, and 2) to explore regionality/ethnicity of reference values by aligning test results among the countries. To achieve these objectives, it is crucial to harmonize 1) the protocol for recruitment and sampling, 2) statistical procedures for deriving the RI, and 3) test results through measurement of a panel of sera in common. For harmonized recruitment, very lenient inclusion/exclusion criteria were adopted in view of differences in interpretation of what constitutes healthiness by different cultures and investigators. This policy may require secondary exclusion of individuals according to the standard of each country at the time of deriving RIs. An iterative optimization procedure, called the latent abnormal values exclusion (LAVE) method, can be applied to automate the process of refining the choice of reference individuals. For global comparison of reference values, test results must be harmonized, based on the among-country, pair-wise linear relationships of test values for the panel. Traceability of reference values can be ensured based on values assigned indirectly to the panel through collaborative measurement of certified reference materials. The validity of the adopted strategies is discussed in this article, based on interim results obtained to date from five countries. Special considerations are made for dissociation of RIs by parametric and nonparametric methods and between-country difference in the effect of body mass index on reference values. Copyright © 2014 Elsevier B.V. All rights reserved.
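
    The dissociation between parametric and nonparametric reference intervals mentioned above can be illustrated on a skewed synthetic analyte. The parametric branch below uses a Box-Cox transformation followed by mean plus or minus 1.96 SD, a simplified stand-in for common parametric practice rather than the study's LAVE-based procedure.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

def reference_interval(values, method="nonparametric"):
    """Central 95% reference interval (2.5th to 97.5th percentile)."""
    x = np.asarray(values, dtype=float)
    if method == "nonparametric":
        return np.percentile(x, [2.5, 97.5])
    # Parametric: transform toward normality, take mean +/- 1.96 SD on the
    # transformed scale, then back-transform.
    shift = x.min() - 1e-6                        # Box-Cox requires positive data
    t, lam = stats.boxcox(x - shift)
    lo = t.mean() - 1.96 * t.std(ddof=1)
    hi = t.mean() + 1.96 * t.std(ddof=1)
    return inv_boxcox(np.array([lo, hi]), lam) + shift

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    analyte = rng.lognormal(mean=3.0, sigma=0.35, size=600)   # skewed synthetic analyte
    print("nonparametric:", np.round(reference_interval(analyte), 1))
    print("parametric   :", np.round(reference_interval(analyte, "parametric"), 1))
```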

  9. Multi-pulse multi-delay (MPMD) multiple access modulation for UWB

    DOEpatents

    Dowla, Farid U.; Nekoogar, Faranak

    2007-03-20

    A new modulation scheme in UWB communications is introduced. This modulation technique utilizes multiple orthogonal transmitted-reference pulses for UWB channelization. The proposed UWB receiver samples the second order statistical function at both zero and non-zero lags and matches the samples to stored second order statistical functions, thus sampling and matching the shape of second order statistical functions rather than just the shape of the received pulses.
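
    The receiver idea can be illustrated with a short, hypothetical Python sketch: sample the autocorrelation (a second-order statistic) of the received pulse at zero and non-zero lags and match it to stored templates. This is only a toy illustration of the matching principle, not the patented implementation:

      import numpy as np

      def autocorr_at_lags(x, lags):
          """Biased sample autocorrelation of x evaluated at the given lags."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          return np.array([np.dot(x[:n - k], x[k:]) / n for k in lags])

      def classify(received, templates, lags=(0, 1, 2, 3)):
          """Match the received pulse's second-order statistics to stored templates."""
          r = autocorr_at_lags(received, lags)
          scores = {name: -np.linalg.norm(r - autocorr_at_lags(t, lags))
                    for name, t in templates.items()}
          return max(scores, key=scores.get)   # template with the closest statistics

      # Hypothetical stored reference pulses for two channels.
      rng = np.random.default_rng(0)
      templates = {"ch1": rng.standard_normal(64), "ch2": rng.standard_normal(64)}
      print(classify(templates["ch1"] + 0.1 * rng.standard_normal(64), templates))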

  10. Tests of Statistical Significance Made Sound

    ERIC Educational Resources Information Center

    Haig, Brian D.

    2017-01-01

    This article considers the nature and place of tests of statistical significance (ToSS) in science, with particular reference to psychology. Despite the enormous amount of attention given to this topic, psychology's understanding of ToSS remains deficient. The major problem stems from a widespread and uncritical acceptance of null hypothesis…

  11. 49 CFR 1007.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Officer refers to the individual designated to process requests and handle various other matters relating... finger or voice print or a photograph. Statistical Record means a record in a system of records maintained for statistical research or reporting purposes only and not used in whole or in part in making any...

  12. Teaching Primary School Mathematics and Statistics: Evidence-Based Practice

    ERIC Educational Resources Information Center

    Averill, Robin; Harvey, Roger

    2010-01-01

    Here is the only reference book you will ever need for teaching primary school mathematics and statistics. It is full of exciting and engaging snapshots of excellent classroom practice relevant to "The New Zealand Curriculum" and national mathematics standards. There are many fascinating examples of investigative learning experiences,…

  13. Publication Bias: The Achilles' Heel of Systematic Reviews?

    ERIC Educational Resources Information Center

    Torgerson, Carole J.

    2006-01-01

    The term "publication bias" usually refers to the tendency for a greater proportion of statistically significant positive results of experiments to be published and, conversely, a greater proportion of statistically significant negative or null results not to be published. It is widely accepted in the fields of healthcare and psychological…

  14. Fast and global authenticity screening of honey using ¹H-NMR profiling.

    PubMed

    Spiteri, Marc; Jamin, Eric; Thomas, Freddy; Rebours, Agathe; Lees, Michèle; Rogers, Karyne M; Rutledge, Douglas N

    2015-12-15

    An innovative analytical approach was developed to tackle the most common adulterations and quality deviations in honey. Using proton-NMR profiling coupled to suitable quantification procedures and statistical models, analytical criteria were defined to check the authenticity of both mono- and multi-floral honey. The reference data set used was a worldwide collection of more than 800 honeys, covering most of the economically significant botanical and geographical origins. Typical plant nectar markers can be used to check monofloral honey labeling. Spectral patterns and natural variability were established for multifloral honeys, and marker signals for sugar syrups were identified by statistical comparison with a commercial dataset of ca. 200 honeys. Although the results are qualitative, spiking experiments have confirmed the ability of the method to detect sugar addition down to 10% levels in favorable cases. Within the same NMR experiments, quantification of glucose, fructose, sucrose and 5-HMF (regulated parameters) was performed. Finally markers showing the onset of fermentation are described. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Summary Diagrams for Coupled Hydrodynamic-Ecosystem Model Skill Assessment

    DTIC Science & Technology

    2009-01-01

    reference point have the smallest unbiased RMSD value (Fig. 3). It would appear that the cluster of model points closest to the reference point may...total RMSD values. This is particularly the case for phytoplankton absorption (Fig. 3B) where the cluster of points closest to the reference...pattern statistics and the bias (difference of mean values) each magnitude of the total Root-Mean-Square Difference (RMSD). An alternative skill score and
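
    The decomposition referred to in this excerpt can be written as total RMSD squared = bias squared + unbiased (centered) RMSD squared. A small Python sketch with hypothetical model and reference series verifies the identity:

      import numpy as np

      model = np.array([1.2, 1.9, 3.1, 4.2, 5.3])      # hypothetical model output
      ref   = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # hypothetical reference data

      bias = model.mean() - ref.mean()
      unbiased_rmsd = np.sqrt(np.mean(((model - model.mean()) - (ref - ref.mean()))**2))
      total_rmsd = np.sqrt(np.mean((model - ref)**2))

      # The pattern statistic and the bias together account for the total RMSD.
      assert np.isclose(total_rmsd**2, bias**2 + unbiased_rmsd**2)
      print(bias, unbiased_rmsd, total_rmsd)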

  16. Effects of Omega-3 Fatty Acid Supplementation on Glucose Control and Lipid Levels in Type 2 Diabetes: A Meta-Analysis

    PubMed Central

    Chen, Cai; Yu, Xuefeng; Shao, Shiying

    2015-01-01

    Background Many studies have assessed the impact of marine omega-3 fatty acids on glycemic homeostasis and lipid profiles in patients with type 2 diabetes (T2DM), but have reported controversial results. Our goal was to systematically evaluate the effects of omega-3 on glucose control and lipid levels. Methods Medline, Pubmed, Cochrane Library, Embase, the National Research Register, and SIGLE were searched to identify eligible randomized clinical trials (RCTs). Data extracted from the RCTs were analyzed using STATA 11.0 statistical software with a fixed or random effects model. Effect sizes were presented as weighted mean differences (WMD) with 95% confidence intervals (95% CI). Heterogeneity was assessed using the Chi-square test with the significance level set at p < 0.1. Results Twenty RCTs were included in this meta-analysis. Among patients with omega-3 supplementation, triglyceride (TG) levels were significantly decreased, by 0.24 mmol/L. No marked change in total cholesterol (TC), HbA1c, fasting plasma glucose, postprandial plasma glucose, BMI or body weight was observed. A high EPA/DHA ratio was associated with a greater tendency toward decreases in plasma insulin, HbA1c, TC, TG, and BMI, although statistical significance was reached only for TG. FPG levels were increased by 0.42 mmol/L in Asians. No evidence of publication bias was observed in this meta-analysis. Conclusions The EPA/DHA ratio and early intervention with omega-3 fatty acids may influence their effects on glucose control and lipid levels, which may serve as a dietary reference for clinicians or nutritionists who manage diabetic patients. PMID:26431431
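
    For readers unfamiliar with the pooling step, the following minimal Python sketch shows fixed-effect (inverse-variance) pooling of mean differences with a 95% CI and Cochran's Q for heterogeneity; the per-trial numbers are hypothetical, not the extracted study data:

      import numpy as np

      # Hypothetical per-trial mean differences in TG (mmol/L) and their standard errors.
      md = np.array([-0.30, -0.18, -0.25, -0.22])
      se = np.array([0.10, 0.08, 0.12, 0.09])

      w = 1.0 / se**2                           # inverse-variance weights
      wmd = np.sum(w * md) / np.sum(w)          # pooled weighted mean difference
      se_pooled = np.sqrt(1.0 / np.sum(w))
      ci = (wmd - 1.96 * se_pooled, wmd + 1.96 * se_pooled)

      # Cochran's Q for heterogeneity (compared to a chi-square with k-1 df).
      q = np.sum(w * (md - wmd)**2)
      print(wmd, ci, q)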

  17. Pediatric patient safety events during hospitalization: approaches to accounting for institution-level effects.

    PubMed

    Slonim, Anthony D; Marcin, James P; Turenne, Wendy; Hall, Matt; Joseph, Jill G

    2007-12-01

    To determine the rates and the patient and institutional characteristics associated with the occurrence of patient safety indicators (PSIs) in hospitalized children, and the degree of statistical difference arising from three approaches to controlling for institution-level effects. Pediatric Health Information System Dataset consisting of all pediatric discharges (<21 years of age) from 34 academic, freestanding children's hospitals for calendar year 2003. The rates of PSIs were computed for all discharges. The patient and institutional characteristics associated with these PSIs were calculated. The analyses sequentially applied three increasingly conservative methods to control for institution-level effects: robust standard error estimation, a fixed effects model, and a random effects model. The degree of difference from a "base state," which excluded institution-level variables, and between the models was calculated. The effects of these analyses on the interpretation of the PSIs are presented. PSIs are relatively infrequent events in hospitalized children, ranging from 0 per 10,000 (postoperative hip fracture) to 87 per 10,000 (postoperative respiratory failure). Significant variables associated with PSIs included age (neonates), race (Caucasians), payor status (public insurance), severity of illness (extreme), and hospital size (>300 beds), all of which had higher rates of PSIs than their reference groups in the bivariable logistic regression results. The three approaches to adjusting for institution-level effects demonstrated similarities in both clinical and statistical significance across the models. Institution-level effects can be appropriately controlled for by using a variety of methods in the analyses of administrative data. Whenever possible, resource-conservative methods should be used in the analyses, especially if the clinical implications are minimal.
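
    A hedged sketch of the two lighter-weight adjustments mentioned above (cluster-robust standard errors and a fixed-effects specification) is shown below using statsmodels on synthetic data; the column names are hypothetical, and a random-effects analysis would require a mixed-effects estimator not shown here:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic discharge-level data (hypothetical): one row per discharge.
      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "hospital_id": rng.integers(0, 34, n),
          "severity": rng.integers(0, 4, n),
          "neonate": rng.integers(0, 2, n),
      })
      logit_p = -2.5 + 0.5 * df["severity"] + 0.4 * df["neonate"] + 0.2 * (df["hospital_id"] % 3)
      df["psi"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

      # 1) Robust (cluster) standard errors, clustering on hospital.
      m_robust = smf.logit("psi ~ severity + neonate", data=df).fit(
          disp=0, cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})

      # 2) Fixed effects: hospital indicator variables absorb institution-level differences.
      m_fixed = smf.logit("psi ~ severity + neonate + C(hospital_id)", data=df).fit(disp=0)

      print(m_robust.bse[["severity", "neonate"]])
      print(m_fixed.params[["severity", "neonate"]])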

  18. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.

    1977-01-01

    Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.

  19. Using CRANID to test the population affinity of known crania.

    PubMed

    Kallenberger, Lauren; Pilbrow, Varsha

    2012-11-01

    CRANID is a statistical program used to infer the source population of a cranium of unknown origin by comparing its cranial dimensions with a worldwide craniometric database. It has great potential for estimating ancestry in archaeological, forensic and repatriation cases. In this paper we test the validity of CRANID in classifying crania of known geographic origin. Twenty-three crania of known geographic origin but unknown sex were selected from the osteological collections of the University of Melbourne. Only 18 crania showed good statistical match with the CRANID database. Without considering accuracy of sex allocation, 11 crania were accurately classified into major geographic regions and nine were correctly classified to geographically closest available reference populations. Four of the five crania with poor statistical match were nonetheless correctly allocated to major geographical regions, although none was accurately assigned to geographically closest reference samples. We conclude that if sex allocations are overlooked, CRANID can accurately assign 39% of specimens to geographically closest matching reference samples and 48% to major geographic regions. Better source population representation may improve goodness of fit, but known sex-differentiated samples are needed to further test the utility of CRANID. © 2012 The Authors Journal of Anatomy © 2012 Anatomical Society.

  20. Linking sounds to meanings: infant statistical learning in a natural language.

    PubMed

    Hay, Jessica F; Pelucchi, Bruna; Graf Estes, Katharine; Saffran, Jenny R

    2011-09-01

    The processes of infant word segmentation and infant word learning have largely been studied separately. However, the ease with which potential word forms are segmented from fluent speech seems likely to influence subsequent mappings between words and their referents. To explore this process, we tested the link between the statistical coherence of sequences presented in fluent speech and infants' subsequent use of those sequences as labels for novel objects. Notably, the materials were drawn from a natural language unfamiliar to the infants (Italian). The results of three experiments suggest that there is a close relationship between the statistics of the speech stream and subsequent mapping of labels to referents. Mapping was facilitated when the labels contained high transitional probabilities in the forward and/or backward direction (Experiment 1). When no transitional probability information was available (Experiment 2), or when the internal transitional probabilities of the labels were low in both directions (Experiment 3), infants failed to link the labels to their referents. Word learning appears to be strongly influenced by infants' prior experience with the distribution of sounds that make up words in natural languages. Copyright © 2011 Elsevier Inc. All rights reserved.
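
    The statistic at the heart of this work, the transitional probability between adjacent syllables, can be computed with a few lines of Python; the syllable stream below is a toy example, not the Italian stimuli used in the experiments:

      from collections import Counter

      # Toy syllable stream (hypothetical); the real stimuli were Italian words in fluent speech.
      stream = "bi du pa do ti bu bi du pa do ti bu bi du".split()

      pairs = Counter(zip(stream, stream[1:]))
      first = Counter(stream[:-1])    # counts of the first syllable of each pair
      second = Counter(stream[1:])    # counts of the second syllable of each pair

      def forward_tp(x, y):
          """P(y | x): how predictive syllable x is of the following syllable y."""
          return pairs[(x, y)] / first[x]

      def backward_tp(x, y):
          """P(x | y): how predictive syllable y is of the preceding syllable x."""
          return pairs[(x, y)] / second[y]

      print(forward_tp("bi", "du"), backward_tp("bi", "du"))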

  1. pplacer: linear time maximum-likelihood and Bayesian phylogenetic placement of sequences onto a fixed reference tree

    PubMed Central

    2010-01-01

    Background Likelihood-based phylogenetic inference is generally considered to be the most reliable classification method for unknown sequences. However, traditional likelihood-based phylogenetic methods cannot be applied to large volumes of short reads from next-generation sequencing due to computational complexity issues and lack of phylogenetic signal. "Phylogenetic placement," where a reference tree is fixed and the unknown query sequences are placed onto the tree via a reference alignment, is a way to bring the inferential power offered by likelihood-based approaches to large data sets. Results This paper introduces pplacer, a software package for phylogenetic placement and subsequent visualization. The algorithm can place twenty thousand short reads on a reference tree of one thousand taxa per hour per processor, has essentially linear time and memory complexity in the number of reference taxa, and is easy to run in parallel. Pplacer features calculation of the posterior probability of a placement on an edge, which is a statistically rigorous way of quantifying uncertainty on an edge-by-edge basis. It also can inform the user of the positional uncertainty for query sequences by calculating expected distance between placement locations, which is crucial in the estimation of uncertainty with a well-sampled reference tree. The software provides visualizations using branch thickness and color to represent number of placements and their uncertainty. A simulation study using reads generated from 631 COG alignments shows a high level of accuracy for phylogenetic placement over a wide range of alignment diversity, and the power of edge uncertainty estimates to measure placement confidence. Conclusions Pplacer enables efficient phylogenetic placement and subsequent visualization, making likelihood-based phylogenetics methodology practical for large collections of reads; it is freely available as source code, binaries, and a web service. PMID:21034504
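
    Not pplacer's actual code, but the edge-level uncertainty it reports rests on normalizing per-edge likelihoods into weights. A generic Python sketch of that step, with hypothetical log-likelihood values, is:

      import numpy as np

      def placement_weights(log_likelihoods):
          """Normalize per-edge log-likelihoods into weights that sum to one."""
          ll = np.asarray(log_likelihoods, dtype=float)
          w = np.exp(ll - ll.max())        # subtract the max for numerical stability
          return w / w.sum()

      # Hypothetical log-likelihoods of one query read attached to four reference edges.
      print(placement_weights([-1204.2, -1206.9, -1203.8, -1210.5]))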

  2. Prevalence of short stature in Saudi children and adolescents

    PubMed Central

    El Mouzan, Mohammad I.; Al Herbish, Abdullah S.; Al Salloum, Abdullah A.; Foster, Peter J.; Al Omer, Ahmad A.; Qurachi, Mansour M.

    2011-01-01

    BACKGROUND AND OBJECTIVE: Data on stature in Saudi children and adolescents are limited. The objective of this report was to establish the national prevalence of short stature in Saudi children and adolescents. DESIGN AND SETTING: Community-based, cross-sectional study conducted over 2 years (2004, 2005). PATIENTS AND METHODS: The national data set of the Saudi reference was used to calculate stature for age for children and adolescents 5 to 18 years of age. Using the 2007 World Health Organization (WHO) reference, the prevalence of moderate and severe short stature was defined as the proportion of children whose standard deviation score for stature for age was less than -2 and -3, respectively. In addition, the 2000 Centers for Disease Control and Prevention (CDC) and the older 1978 National Center for Health Statistics (NCHS)/WHO references were used for comparison. RESULTS: The Saudi reference sample comprised 19 372 healthy children and adolescents 5 to 17 years of age, 50.8% of whom were boys. Using the 2007 WHO reference, the overall prevalence of moderate and severe short stature was 11.3% and 1.8%, respectively, in boys, and 10.5% and 1.2%, respectively, in girls. The prevalence of moderate short stature was 12.1%, 11% and 11.3% in boys and 10.9%, 11.3% and 10.5% in girls when the 1978 NCHS/WHO, the 2000 CDC and the 2007 WHO references were used, respectively. CONCLUSIONS: The national prevalence of short stature in Saudi children and adolescents is intermediate compared with the international level. Improvement in the socioeconomic and health status of children and adolescents should lead to a reduction in the prevalence of short stature. PMID:21911988
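
    The prevalence calculation amounts to converting stature to standard deviation scores against a reference mean and SD for age and sex and counting the proportions below -2 and -3. A minimal Python sketch, with placeholder heights and reference values rather than the Saudi or WHO data, is:

      import numpy as np

      def short_stature_prevalence(heights, ref_mean, ref_sd):
          """Proportion of children with stature-for-age SD scores below -2 and -3."""
          z = (np.asarray(heights) - ref_mean) / ref_sd
          return {"moderate (<-2 SD)": np.mean(z < -2), "severe (<-3 SD)": np.mean(z < -3)}

      # Placeholder heights (cm) for one age/sex group and placeholder reference values.
      heights = np.array([128.0, 131.5, 137.2, 140.0, 120.4, 118.9, 133.3])
      print(short_stature_prevalence(heights, ref_mean=137.8, ref_sd=5.8))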

  3. SPM analysis of parametric (R)-[11C]PK11195 binding images: plasma input versus reference tissue parametric methods.

    PubMed

    Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald

    2007-05-01

    (R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).

  4. Identification of Suitable Reference Genes for mRNA Studies in Bone Marrow in a Mouse Model of Hematopoietic Stem Cell Transplantation.

    PubMed

    Li, H; Chen, C; Yao, H; Li, X; Yang, N; Qiao, J; Xu, K; Zeng, L

    2016-10-01

    The bone marrow micro-environment changes during hematopoietic stem cell transplantation (HSCT), with subsequent alteration of gene expression. Quantitative polymerase chain reaction (q-PCR) is a reliable and reproducible technique for the analysis of gene expression. To obtain accurate results, it is essential to identify a stable reference gene during HSCT; however, which gene is suitable remains unclear. This study aimed to identify suitable reference genes for mRNA studies in bone marrow after HSCT. C57BL/6 mice were treated with either total body irradiation (group T) or busulfan/cyclophosphamide (BU/CY) (group B) followed by infusion of bone marrow cells. Normal mice without treatment served as controls. All samples (group T + group B + control) together were defined as group G. On days 7, 14, and 21 after transplantation, transcription levels of 7 candidate genes, ACTB, B2M, GAPDH, HMBS, HPRT, SDHA, and YWHAZ, in bone marrow cells were measured by real-time quantitative PCR. The expression stability of these 7 candidate reference genes was analyzed with 2 statistical software programs, GeNorm and NormFinder. ACTB displayed the highest expression in group G, with the lowest expression of SDHA in group T and of HPRT in groups B and G. Analysis of expression stability with GeNorm or NormFinder demonstrated that expression of B2M in bone marrow was much more stable during HSCT than that of the other candidate genes, including the commonly used reference genes GAPDH and ACTB. B2M could therefore be used as a suitable reference gene for mRNA studies in bone marrow after HSCT. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment for new portable PM10 monitoring devices: the comparability of their hourly concentration series with reference station measurements, evaluated using statistical methods. The technical aspects of the new portable meters are described. The emphasis is placed on assessing the comparability of results within a combined stochastic and exploratory modeling framework. The concept is based on the observation that a simple comparison of the series in the time domain is insufficient; agreement should be examined in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on models of five annual series of measurement results from the new mobile devices and from the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The results obtained indicate both the completeness of the comparison methodology and the high correspondence between measurements from the new devices and the reference station.
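
    Two of the three comparison domains mentioned (time and frequency) can be illustrated with a short Python sketch that correlates an hourly portable-meter series with a reference series and computes their spectral coherence; the series below are synthetic, not the Nowy Sacz measurements:

      import numpy as np
      from scipy.signal import coherence

      rng = np.random.default_rng(1)
      hours = 24 * 365
      reference = 30 + 10 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 3, hours)
      portable = 0.9 * reference + rng.normal(0, 4, hours)   # hypothetical portable meter

      # Time domain: ordinary correlation of the hourly series.
      r = np.corrcoef(reference, portable)[0, 1]

      # Frequency domain: magnitude-squared coherence (fs = 1 sample per hour).
      f, cxy = coherence(reference, portable, fs=1.0, nperseg=24 * 7)

      print(f"correlation: {r:.2f}, coherence at the diurnal frequency: "
            f"{cxy[np.argmin(np.abs(f - 1/24))]:.2f}")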

  6. Correlation between Post-LASIK Starburst Symptom and Ocular Wavefront Aberrations

    NASA Astrophysics Data System (ADS)

    Liu, Yong-Ji; Mu, Guo-Guang; Wang, Zhao-Qi; Wang-Yan

    2006-06-01

    Monochromatic aberrations in post laser in-situ keratomileusis (LASIK) eyes are measured. The data are categorized into a reference group and a starburst group according to the visual symptoms. Statistical analysis was performed to find the correlation between the ocular wavefront aberrations and the starburst symptom. The rms aberrations of the 3rd and 4th orders for the starburst group are significantly larger than those for the reference group. The starburst symptom shows a strong correlation with vertical coma, total coma and spherical aberration. For 3-mm and 5.8-mm pupil sizes, the modulation transfer function (MTF) of the starburst group is lower than that of the reference group, but the visual acuities of the two groups are close. MTF and PSF analyses were made for the two groups, and the results are consistent with the statistical analysis, indicating that the difference between the two groups is mainly due to the third- and fourth-order Zernike aberrations.

  7. [Effects of Gushen Antai pills combined with progestin on serum β-HCG, P, E2 and CA125 in patients with threatened abortion].

    PubMed

    Tian, Chun-Man; Chen, Bo

    2016-01-01

    To investigate the clinical effect of Gushen Antai pills combined with progesterone in the treatment of threatened abortion, in order to provide references for early clinical intervention in threatened abortion. A total of 112 patients with threatened abortion were randomly divided into a control group and an observation group, with 56 cases in each group. Patients in the control group were injected with progesterone; the observation group was treated with Gushen Antai pills in addition to the therapy of the control group. Both groups were treated for two weeks. Venous blood (5 mL) was collected before treatment and at 1 and 2 weeks after treatment to determine serum levels of β-HCG, P, E2 and CA125. Differences between the two groups after treatment were compared. The total effective rates of the control group and the observation group were 79% and 91.9%, respectively, with a statistically significant difference between the two groups (P<0.05). Two weeks after treatment, the serum levels of P and E2 in the observation group were significantly higher than before treatment, whereas the serum CA125 levels decreased significantly after treatment (P<0.05). These indicators showed statistically significant differences compared with those of the control group (P<0.05). After treatment, the serum β-HCG levels of the two groups were significantly higher than before treatment (P<0.05), but there was no statistically significant difference between the two groups. Gushen Antai pills combined with progesterone had a better clinical curative effect in treating threatened abortion, significantly raising serum β-HCG, P and E2, reducing serum CA125 and increasing the tocolysis efficiency; the combination is therefore worth promoting in clinical practice. Copyright© by the Chinese Pharmaceutical Association.

  8. Electronic Resource Expenditure and the Decline in Reference Transaction Statistics in Academic Libraries

    ERIC Educational Resources Information Center

    Dubnjakovic, Ana

    2012-01-01

    The current study investigates factors influencing increase in reference transactions in a typical week in academic libraries across the United States of America. Employing multiple regression analysis and general linear modeling, variables of interest from the "Academic Library Survey (ALS) 2006" survey (sample size 3960 academic libraries) were…

  9. Towards a new tool for the evaluation of the quality of ultrasound compressed images.

    PubMed

    Delgorge, Cécile; Rosenberger, Christophe; Poisson, Gérard; Vieyres, Pierre

    2006-11-01

    This paper presents a new tool for the evaluation of ultrasound image compression. The goal is to measure the image quality as easily as with a statistical criterion, and with the same reliability as the one provided by the medical assessment. An initial experiment is proposed to medical experts and represents our reference value for the comparison of evaluation criteria. Twenty-one statistical criteria are selected from the literature. A cumulative absolute similarity measure is defined as a distance between the criterion to evaluate and the reference value. A first fusion method based on a linear combination of criteria is proposed to improve the results obtained by each of them separately. The second proposed approach combines different statistical criteria and uses the medical assessment in a training phase with a support vector machine. Some experimental results are given and show the benefit of fusion.
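
    A hedged sketch of the second fusion approach, training a support vector machine to map a vector of statistical criteria onto the expert quality score, is shown below with scikit-learn on synthetic data; the criteria and the paper's actual kernel settings are not reproduced:

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      n_images, n_criteria = 120, 21
      X = rng.normal(size=(n_images, n_criteria))            # statistical criteria per image
      expert_score = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.2, n_images)  # synthetic target

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
      model.fit(X[:100], expert_score[:100])                 # training phase with expert assessment
      predicted = model.predict(X[100:])                     # fused quality estimate for new images
      print(np.corrcoef(predicted, expert_score[100:])[0, 1])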

  10. Insights from analysis for harmful and potentially harmful constituents (HPHCs) in tobacco products.

    PubMed

    Oldham, Michael J; DeSoi, Darren J; Rimmer, Lonnie T; Wagner, Karl A; Morton, Michael J

    2014-10-01

    A total of 20 commercial cigarette and 16 commercial smokeless tobacco products were assayed for 96 compounds listed as harmful and potentially harmful constituents (HPHCs) by the US Food and Drug Administration. For each product, a single lot was used for all testing. Both International Organization for Standardization and Health Canada smoking regimens were used for cigarette testing. For those HPHCs detected, measured levels were consistent with levels reported in the literature; however, substantial assay variability (measured as average relative standard deviation) was found for most results. Using an abbreviated list of HPHCs, statistically significant differences for most of these HPHCs occurred when results were obtained 4-6 months apart (i.e., temporal variability). The assay variability and temporal variability demonstrate the need for standardized analytical methods with defined repeatability and reproducibility for each HPHC using certified reference standards. Temporal variability also means that simple conventional comparisons, such as two-sample t-tests, are inappropriate for comparing products tested at different points in time from the same laboratory or from different laboratories. Until capable laboratories use standardized assays with established repeatability, reproducibility, and certified reference standards, the resulting HPHC data will be unreliable for product comparisons or other decision making in regulatory science. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Animal movement: Statistical models for telemetry data

    USGS Publications Warehouse

    Hooten, Mevin B.; Johnson, Devin S.; McClintock, Brett T.; Morales, Juan M.

    2017-01-01

    The study of animal movement has always been a key element in ecological science, because it is inherently linked to critical processes that scale from individuals to populations and communities to ecosystems. Rapid improvements in biotelemetry data collection and processing technology have given rise to a variety of statistical methods for characterizing animal movement. The book serves as a comprehensive reference for the types of statistical models used to study individual-based animal movement. 

  12. Social Capital and Human Mortality: Explaining the Rural Paradox with County-Level Mortality Data

    PubMed Central

    Jensen, Leif; Haran, Murali

    2014-01-01

    The “rural paradox” refers to standardized mortality rates in rural areas that are unexpectedly low in view of well-known economic and infrastructural disadvantages there. We explore this paradox by incorporating social capital, a promising explanatory factor that has seldom been incorporated into residential mortality research. We do so while being attentive to spatial dependence, a statistical problem often ignored in mortality research. Analyzing data for counties in the contiguous United States, we find that: (1) the rural paradox is confirmed with both metro/non-metro and rural-urban continuum codes, (2) social capital significantly reduces the impacts of residence on mortality after controlling for race/ethnicity and socioeconomic covariates, (3) this attenuation is greater when a spatial perspective is imposed on the analysis, (4) social capital is negatively associated with mortality at the county level, and (5) spatial dependence is strongly in evidence. A spatial approach is necessary in county-level analyses such as ours to yield unbiased estimates and optimal model fit. PMID:25392565

  13. A data base and analysis program for shuttle main engine dynamic pressure measurements

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.

  14. Detection of Undocumented Changepoints Using Multiple Test Statistics and Composite Reference Series.

    NASA Astrophysics Data System (ADS)

    Menne, Matthew J.; Williams, Claude N., Jr.

    2005-10-01

    An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single-test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, each of the evaluated composite series is not equally susceptible to the presence of changepoints in its components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
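
    A minimal Python sketch of a single-test detection step, locating the candidate changepoint that maximizes a two-sample t statistic in a target-minus-reference difference series, is given below; the series is synthetic and the skill comparisons in the paper are not reproduced:

      import numpy as np

      def max_t_changepoint(diff_series, min_seg=5):
          """Return the split point and t statistic maximizing the two-sample t test."""
          d = np.asarray(diff_series, dtype=float)
          best_t, best_k = 0.0, None
          for k in range(min_seg, len(d) - min_seg):
              a, b = d[:k], d[k:]
              sp = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                           / (len(a) + len(b) - 2))
              t = abs(a.mean() - b.mean()) / (sp * np.sqrt(1 / len(a) + 1 / len(b)))
              if t > best_t:
                  best_t, best_k = t, k
          return best_k, best_t

      # Synthetic target-minus-reference series with a 0.8-unit shift after year 30.
      rng = np.random.default_rng(2)
      series = np.concatenate([rng.normal(0, 0.5, 30), rng.normal(0.8, 0.5, 30)])
      print(max_t_changepoint(series))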

  15. Reference Charts for Fetal Cerebellar Vermis Height: A Prospective Cross-Sectional Study of 10605 Fetuses

    PubMed Central

    Cignini, Pietro; Giorlandino, Maurizio; Brutti, Pierpaolo; Mangiafico, Lucia; Aloisi, Alessia; Giorlandino, Claudio

    2016-01-01

    Objective To establish reference charts for fetal cerebellar vermis height in an unselected population. Methods A prospective cross-sectional study between September 2009 and December 2014 was carried out at ALTAMEDICA Fetal–Maternal Medical Centre, Rome, Italy. Of 25203 fetal biometric measurements, 12167 (48%) measurements of the cerebellar vermis were available. After excluding 1562 (12.8%) measurements, a total of 10605 (87.2%) fetuses were considered, each analyzed once only. Parametric and nonparametric quantile regression models were used for the statistical analysis. To evaluate the robustness of the proposed reference charts with respect to various distributional assumptions on the ultrasound measurements, we compared the gestational age-specific reference curves produced by the different statistical methods. Normal mean heights based on the parametric and nonparametric methods were defined for each week of gestation, and the regression equation expressing the height of the cerebellar vermis as a function of gestational age was calculated. Finally, the correlation between dimension and gestational age was measured. Results The mean height of the cerebellar vermis was 12.7 mm (SD 1.6 mm; 95% confidence interval, 12.7–12.8 mm). The regression equation expressing the height of the cerebellar vermis as a function of gestational age was: height (mm) = -4.85 + 0.78 × gestational age. The correlation between dimension and gestational age was expressed by the coefficient r = 0.87. Conclusion This is the first prospective cross-sectional study of fetal cerebellar vermis biometry with such a large sample size reported in the literature. It is a detailed statistical survey and contains new centile-based reference charts for fetal cerebellar vermis height measurements. PMID:26812238
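
    Centile curves of this kind can be fitted with quantile regression in statsmodels; the sketch below uses synthetic data generated from the published mean equation rather than the study measurements:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic example data (hypothetical): gestational age in weeks and
      # cerebellar vermis height in mm, simulated around the published mean equation.
      rng = np.random.default_rng(0)
      ga = rng.uniform(18, 32, 800)
      height = -4.85 + 0.78 * ga + rng.normal(0, 1.6, 800)
      df = pd.DataFrame({"ga_weeks": ga, "vermis_height": height})

      # Gestational age-specific centile curves via quantile regression.
      centiles = {q: smf.quantreg("vermis_height ~ ga_weeks", df).fit(q=q).params
                  for q in (0.05, 0.50, 0.95)}

      # The median curve is comparable in form to: height (mm) = -4.85 + 0.78 x gestational age
      print(centiles[0.50])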

  16. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655

  17. Sport concussion assessment tool-Third edition normative reference values for professional Rugby Union players.

    PubMed

    Fuller, G W; Govind, O; Tucker, R; Raftery, M

    2018-04-01

    To establish normative reference data for the SCAT3 in professional Rugby Union players. A cross sectional study in professional Rugby Union players competing in national and international professional competitions between 2015 and 2016. The SCAT3 was administered pre-season or prior to tournaments. Data was collected electronically using a custom tablet application. SCAT3 subcomponents distributions were described and normative ranges determined using percentile cut-offs for average, unusually low/high, and extremely low/high scores. The association between player characteristics and performance in SCAT3 subcomponents was also investigated in exploratory analyses. A total of 3611 professional Rugby Union players were included. The most common baseline symptom was fatigue (14%). The symptom score median (md) was 0 (interquartile range (IQR)=0-1). Symptom severity md was 0 (IQR=0-1). The md of the SAC score was 28 (IQR=26-29). The md of the MBESS was 2 (IQR=0-4). The Tandem gait md was 11.1s (IQR=10.0-12.7s). Upper limb coordination was normal in 98.4%. Younger age and lower educational level were associated with worse performance on delayed recall and reverse month sub-components of the SCAT3 (p<0.0001). No statistically significant differences in SCAT3 subcomponents were evident across gender. Representative normative reference values for the SCAT3 among professional Rugby Union players are provided. Baseline performance on concentration and delayed recall tests may be lower in younger athletes or in those with lower educational level. Copyright © 2017. Published by Elsevier Ltd.

  18. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study

    PubMed Central

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2015-01-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
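
    Fitting the lognormal reference model to a set of area-equivalent diameters can be sketched with scipy; the diameters below are synthetic, not the RM8012 measurements, and the interlaboratory error analysis is not reproduced:

      import numpy as np
      from scipy import stats

      # Synthetic area-equivalent diameters (nm); not the RM8012 measurements.
      rng = np.random.default_rng(3)
      diameters = rng.lognormal(mean=np.log(27.6), sigma=0.09, size=500)

      # Fit the lognormal reference model with the location fixed at zero.
      shape, loc, scale = stats.lognorm.fit(diameters, floc=0)
      geo_mean, geo_sd = scale, np.exp(shape)          # geometric mean and geometric SD

      # Compare candidate reference models by log-likelihood on the same data.
      ll_lognorm = stats.lognorm.logpdf(diameters, shape, loc, scale).sum()
      ll_norm = stats.norm.logpdf(diameters, *stats.norm.fit(diameters)).sum()
      print(geo_mean, geo_sd, ll_lognorm > ll_norm)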

  19. Statistics in three biomedical journals.

    PubMed

    Pilcík, T

    2003-01-01

    In this paper we analyze the use of statistics, and the associated problems, in three Czech biological journals in the year 2000. We investigated 23 articles in Folia Biologica, 60 articles in Folia Microbiologica, and 88 articles in Physiological Research. Descriptive statistics and the t-test were the methods most frequently used in publications with statistical content. The most common shortcomings were the absence of a reference to the statistical software used and insufficient description of the data. We compared our results with the results of similar studies of other medical journals. The use of important statistical methods is comparable with that in most medical journals, and the proportion of articles in which the applied method is insufficiently described is moderately low.

  20. Application of an Online Reference for Reviewing Basic Statistical Principles of Operating Room Management

    ERIC Educational Resources Information Center

    Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.

    2010-01-01

    Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…

  1. Critical Values and Transforming Data: Teaching Statistics with Social Justice

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.

    2007-01-01

    Despite the dearth of literature specifically on teaching statistics using social justice, there is precedent in the more general realm of teaching using social justice, or even in teaching mathematics using social justice. This article offers an overview of content examples, resources, and references that can be used in the specific area of…

  2. Information Distribution Practices of Federal Statistical Agencies: The Census Bureau Example.

    ERIC Educational Resources Information Center

    Gey, Frederick C.

    1993-01-01

    Describes the current and historical distribution channels of the U.S. Bureau of the Census within a framework of distribution policies and practices for federal statistical information. The issues of reasonable distribution policies and the impact of technological change are discussed, and guidelines are offered. (Contains 26 references.) (EAM)

  3. Comparison of probability statistics for automated ship detection in SAR imagery

    NASA Astrophysics Data System (ADS)

    Henschel, Michael D.; Rey, Maria T.; Campbell, J. W. M.; Petrovic, D.

    1998-12-01

    This paper discusses the initial results of a recent operational trial of the Ocean Monitoring Workstation's (OMW) ship detection algorithm, which is essentially a Constant False Alarm Rate filter applied to Synthetic Aperture Radar data. The choice of probability distribution and the methodologies for calculating scene-specific statistics are discussed in some detail. An empirical basis for the choice of probability distribution is presented. We compare the results obtained using a 1-look K-distribution with various parameter choices and methods of estimation. As a special case of sea clutter statistics, the application of a χ2-distribution is also discussed. Comparisons are made with reference to RADARSAT data collected during the Maritime Command Operation Training exercise conducted in Atlantic Canadian waters in June 1998. Reference is also made to previously collected statistics. The OMW is a commercial software suite that provides modules for automated vessel detection, oil spill monitoring, and environmental monitoring. This work has been undertaken to fine-tune the OMW algorithms, with special emphasis on the false alarm rate of each algorithm.
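
    A basic one-dimensional cell-averaging CFAR detector, the generic principle behind filters of this kind, can be sketched in Python as follows; this is not the OMW implementation and omits its K-distribution parameterization:

      import numpy as np

      def ca_cfar(intensity, guard=2, train=8, pfa=1e-6):
          """1-D cell-averaging CFAR: flag cells exceeding a clutter-scaled threshold."""
          x = np.asarray(intensity, dtype=float)
          n_train = 2 * train
          # Threshold multiplier for exponentially distributed (single-look) clutter.
          alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)
          detections = np.zeros_like(x, dtype=bool)
          for i in range(guard + train, len(x) - guard - train):
              window = np.r_[x[i - guard - train:i - guard],
                             x[i + guard + 1:i + guard + train + 1]]
              detections[i] = x[i] > alpha * window.mean()
          return detections

      # Synthetic sea-clutter line with one bright target.
      rng = np.random.default_rng(4)
      clutter = rng.exponential(1.0, 500)
      clutter[250] = 40.0
      print(np.flatnonzero(ca_cfar(clutter)))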

  4. Organic food consumption during pregnancy and its association with health-related characteristics: the KOALA Birth Cohort Study.

    PubMed

    Simões-Wüst, Ana Paula; Moltó-Puigmartí, Carolina; Jansen, Eugene Hjm; van Dongen, Martien Cjm; Dagnelie, Pieter C; Thijs, Carel

    2017-08-01

    To investigate the associations of organic food consumption with maternal pre-pregnancy BMI, hypertension and diabetes in pregnancy, and several blood biomarkers of pregnant women. Prospective cohort study. Pregnant women were recruited at midwives' practices and through channels related to consumption of food of organic origin. Pregnant women who filled in an FFQ and donated a blood sample (n = 1339). Participant groups were defined based on the share of consumed organic products; to discriminate between effects of food origin and food patterns, healthy diet indicators were considered in some statistical models. Consumption of organic food was associated with a more favourable pre-pregnancy BMI and lower prevalence of gestational diabetes. Compared with participants consuming no organic food (reference group), a marker of dairy product intake (pentadecanoic acid) and trans-fatty acids of natural origin (vaccenic and rumenic acids) were higher among participants consuming organic food (organic groups), whereas elaidic acid, a marker of the intake of trans-fatty acids found in industrially hydrogenated fats, was lower. Plasma levels of homocysteine and 25-hydroxyvitamin D were lower in the organic groups than in the reference group. Differences in pentadecanoic acid, vaccenic acid and vitamin D retained statistical significance when correcting for indicators of the healthy diet pattern associated with the consumption of organic food. Consumption of organic food during pregnancy is associated with several health-related characteristics and blood biomarkers. Part of the observed associations is explained by food patterns accompanying the consumption of organic food.

  5. Prediction equations for maximal respiratory pressures of Brazilian adolescents.

    PubMed

    Mendes, Raquel E F; Campos, Tania F; Macêdo, Thalita M F; Borja, Raíssa O; Parreira, Verônica F; Mendonça, Karla M P P

    2013-01-01

    The literature emphasizes the need for studies to provide reference values and equations able to predict respiratory muscle strength of Brazilian subjects at different ages and from different regions of Brazil. To develop prediction equations for maximal respiratory pressures (MRP) of Brazilian adolescents. In total, 182 healthy adolescents (98 boys and 84 girls) aged between 12 and 18 years, enrolled in public and private schools in the city of Natal-RN, were evaluated using an MVD300 digital manometer (Globalmed®) according to a standardized protocol. Statistical analysis was performed using SPSS Statistics 17.0 software, with a significance level of 5%. Data normality was verified using the Kolmogorov-Smirnov test, and descriptive analysis results were expressed as the mean and standard deviation. To verify the correlation between the MRP and the independent variables (age, weight, height and sex), the Pearson correlation test was used. To obtain the prediction equations, stepwise multiple linear regression was used. The variables height, weight and sex were correlated to MRP. However, weight and sex explained part of the variability of MRP, and the regression analysis in this study indicated that these variables contributed significantly in predicting maximal inspiratory pressure, and only sex contributed significantly to maximal expiratory pressure. This study provides reference values and two models of prediction equations for maximal inspiratory and expiratory pressures and sets the necessary normal lower limits for the assessment of the respiratory muscle strength of Brazilian adolescents.

  6. A hybrid hydrologically complemented warning model for shallow landslides induced by extreme rainfall in Korean Mountain

    NASA Astrophysics Data System (ADS)

    Singh Pradhan, Ananta Man; Kang, Hyo-Sub; Kim, Yun-Tae

    2016-04-01

    This study uses a physically based approach to evaluate the factor of safety of hillslopes under different hydrological conditions in Mt Umyeon, south of Seoul. The hydrological conditions were determined using rainfall intensity and duration from a known landslide inventory covering the whole of Korea. A quantile regression statistical method was used to ascertain different probability warning levels on the basis of rainfall thresholds. Physically based models are easily interpreted and have high predictive capabilities but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical probabilistic methods can include other causative factors that influence slope stability, such as forest, soil and geology, but rely on good landslide inventories for the site. In this study, a hybrid approach is described that combines physically based landslide susceptibility estimates for different hydrological conditions. A presence-only maximum entropy model was used to build the hybrid and to analyze the relation of landslides to the conditioning factors. About 80% of the landslides were listed among the unstable sites identified in the proposed model, demonstrating its effectiveness and accuracy in determining unstable areas and areas that require evacuation. These cumulative rainfall thresholds provide a valuable reference to guide disaster prevention authorities in issuing warning levels, with the ability to reduce losses and save lives.

  7. Confidence intervals for single-case effect size measures based on randomization test inversion.

    PubMed

    Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick

    2017-02-01

    In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
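
    A minimal Python sketch of RTI for an unstandardized mean difference, assuming a simple shift model and a completely randomized design, is given below; the scores and the grid of candidate effect sizes are hypothetical:

      import numpy as np
      from itertools import combinations

      def randomization_p(a, b):
          """Two-sided randomization test p value for the difference in means."""
          data, n_a = np.concatenate([a, b]), len(a)
          obs = abs(a.mean() - b.mean())
          idx = np.arange(len(data))
          count, total = 0, 0
          for comb in combinations(idx, n_a):          # all assignments of A time points
              mask = np.zeros(len(data), dtype=bool)
              mask[list(comb)] = True
              if abs(data[mask].mean() - data[~mask].mean()) >= obs - 1e-12:
                  count += 1
              total += 1
          return count / total

      def rti_confidence_interval(a, b, grid, alpha=0.05):
          """All shift values theta0 that the randomization test fails to reject."""
          kept = [t for t in grid if randomization_p(a - t, b) > alpha]
          return min(kept), max(kept)

      # Hypothetical scores from a completely randomized single-case design (A vs B).
      a = np.array([8.0, 9.0, 7.5, 8.5, 9.5])          # treatment A measurements
      b = np.array([5.0, 6.0, 5.5, 6.5, 4.5])          # treatment B measurements
      print(rti_confidence_interval(a, b, grid=np.arange(0.0, 6.01, 0.05)))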

  8. The script concordance test in radiation oncology: validation study of a new tool to assess clinical reasoning

    PubMed Central

    Lambert, Carole; Gagnon, Robert; Nguyen, David; Charlin, Bernard

    2009-01-01

    Background The Script Concordance test (SCT) is a reliable and valid tool to evaluate clinical reasoning in complex situations where experts' opinions may be divided. Scores reflect the degree of concordance between the performance of examinees and that of a reference panel of experienced physicians. The purpose of this study is to demonstrate SCT's usefulness in radiation oncology. Methods A 90 items radiation oncology SCT was administered to 155 participants. Three levels of experience were tested: medical students (n = 70), radiation oncology residents (n = 38) and radiation oncologists (n = 47). Statistical tests were performed to assess reliability and to document validity. Results After item optimization, the test comprised 30 cases and 70 questions. Cronbach alpha was 0.90. Mean scores were 51.62 (± 8.19) for students, 71.20 (± 9.45) for residents and 76.67 (± 6.14) for radiation oncologists. The difference between the three groups was statistically significant when compared by the Kruskall-Wallis test (p < 0.001). Conclusion The SCT is reliable and useful to discriminate among participants according to their level of experience in radiation oncology. It appears as a useful tool to document the progression of reasoning during residency training. PMID:19203358

  9. Translation and Validation of the Knee Society Score - KSS for Brazilian Portuguese

    PubMed Central

    Silva, Adriana Lucia Pastore e; Demange, Marco Kawamura; Gobbi, Riccardo Gomes; da Silva, Tânia Fernanda Cardoso; Pécora, José Ricardo; Croci, Alberto Tesconi

    2012-01-01

    Objective To translate, culturally adapt and validate the "Knee Society Score" (KSS) for the Portuguese language and determine its measurement properties, reproducibility and validity. Methods We analyzed 70 patients of both sexes, aged between 55 and 85 years, in a cross-sectional clinical trial, with a diagnosis of primary osteoarthritis, undergoing total knee arthroplasty surgery. We assessed the patients with the English version of the KSS questionnaire and, after 30 minutes, with the Portuguese version of the KSS questionnaire, administered by a different evaluator. All the patients were assessed preoperatively, and again at three and six months postoperatively. Results There was no statistical difference, using Cronbach's alpha index and the Bland-Altman graphical analysis, for the knee score during the preoperative period (p = 1), at three months (p = 0.991) or at six months postoperatively (p = 0.985). There was no statistical difference for the knee function score for all three periods (p = 1.0). Conclusion The Brazilian version of the Knee Society Score is easy to apply, as well as providing a valid and reliable instrument for measuring the knee score and function of Brazilian patients undergoing TKA. Level of Evidence: Level I - Diagnostic Studies - Investigating a Diagnostic Test - Testing of previously developed diagnostic criteria on consecutive patients (with universally applied 'gold' reference standard). PMID:24453576

  10. Consistent integration of experimental and ab initio data into molecular and coarse-grained models

    NASA Astrophysics Data System (ADS)

    Vlcek, Lukas

    As computer simulations are increasingly used to complement or replace experiments, highly accurate descriptions of physical systems at different time and length scales are required to achieve realistic predictions. The questions of how to objectively measure model quality in relation to reference experimental or ab initio data, and how to transition seamlessly between different levels of resolution are therefore of prime interest. To address these issues, we use the concept of statistical distance to define a measure of similarity between statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the systems' measurable properties. Through systematic coarse-graining, we arrive at appropriate expressions for optimization loss functions consistently incorporating microscopic ab initio data as well as macroscopic experimental data. The design of coarse-grained and multiscale models is then based on factoring the model system partition function into terms describing the system at different resolution levels. The optimization algorithm takes advantage of thermodynamic perturbation expressions for fast exploration of the model parameter space, enabling us to scan millions of parameter combinations per hour on a single CPU. The robustness and generality of the new model optimization framework and its efficient implementation are illustrated on selected examples including aqueous solutions, magnetic systems, and metal alloys.
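
    The notion of a statistical distance between a model and its target can be illustrated with one common definition, the Bhattacharyya angle between the two systems' sampled probability distributions over the same set of states or observable bins. The snippet below is our own sketch of that idea, not the author's implementation.

    ```python
    import numpy as np

    def statistical_distance(p, q):
        """Bhattacharyya-angle distance between two discrete probability
        distributions defined over the same set of bins."""
        p = np.asarray(p, float)
        q = np.asarray(q, float)
        p /= p.sum()
        q /= q.sum()
        bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
        return np.arccos(np.clip(bc, 0.0, 1.0))
    ```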

  11. Transcriptome profiling of a Saccharomyces cerevisiae mutant with a constitutively activated Ras/cAMP pathway.

    PubMed

    Jones, D L; Petty, J; Hoyle, D C; Hayes, A; Ragni, E; Popolo, L; Oliver, S G; Stateva, L I

    2003-12-16

    Often changes in gene expression levels have been considered significant only when above/below some arbitrarily chosen threshold. We investigated the effect of applying a purely statistical approach to microarray analysis and demonstrated that small changes in gene expression have biological significance. Whole genome microarray analysis of a pde2Delta mutant, constructed in the Saccharomyces cerevisiae reference strain FY23, revealed altered expression of approximately 11% of protein encoding genes. The mutant, characterized by constitutive activation of the Ras/cAMP pathway, has increased sensitivity to stress, reduced ability to assimilate nonfermentable carbon sources, and some cell wall integrity defects. Applying the Munich Information Centre for Protein Sequences (MIPS) functional categories revealed increased expression of genes related to ribosome biogenesis and downregulation of genes in the cell rescue, defense, cell death and aging category, suggesting a decreased response to stress conditions. A reduced level of gene expression in the unfolded protein response pathway (UPR) was observed. Cell wall genes whose expression was affected by this mutation were also identified. Several of the cAMP-responsive orphan genes, upon further investigation, revealed cell wall functions; others had previously unidentified phenotypes assigned to them. This investigation provides a statistical global transcriptome analysis of the cellular response to constitutive activation of the Ras/cAMP pathway.

  12. Small Aircraft RF Interference Path Loss

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Szatkowski, George N.; Mielnik, John J.; Salud, Maria Theresa P.

    2007-01-01

    Interference to aircraft radio receivers is an increasing concern as more portable electronic devices are allowed onboard. Interference signals are attenuated as they propagate from inside the cabin to aircraft radio antennas mounted on the outside of the aircraft. The attenuation level is referred to as the interference path loss (IPL) value. Significant published IPL data exists for transport and regional category airplanes. This report fills a void by providing data for small business/corporate and general aviation aircraft. In this effort, IPL measurements are performed on ten small aircraft of different designs and manufacturers. Multiple radio systems are addressed. Along with the typical worst-case coupling values, statistical distributions are also reported that could lead to better interference risk assessment.

  13. Pattern-reversal electroretinograms in unilateral glaucoma.

    PubMed

    Wanger, P; Persson, H E

    1983-06-01

    Pattern-reversal and flash electroretinograms (ERG) and oscillatory potentials (OP) were recorded from 11 patients with unilateral glaucoma. All glaucomatous eyes had reduced amplitudes both compared to the opposite eye in the same patient and to reference values. In 10 of the 11 cases this reduction was below the level of normal variation. The difference in pattern-reversal ERG amplitude means from glaucomatous and opposite eyes was statistically significant. No differences were observed in flash ERGs or OPs. The histopathologic correlate to the visual field defects in glaucoma is retinal ganglion cell degeneration. The present electrophysiologic findings support the view, based on results from animal experiments, that the pattern-reversal ERG reflects ganglion cell activity.

  14. [Genetic diseases in pediatric patients hospitalised in the town of Ubaté, Colombia].

    PubMed

    Páez, Paola; Suárez-Obando, Fernando; Zarante, Ignacio

    2008-01-01

    The aim was to describe genetic disease frequency in a second-level hospital's in-patient paediatric service. The hospital's statistical department records for 2005 were comprehensively reviewed; the study was carried out in the town of Ubaté during 2006. Complex diseases, including multifactorial diseases and congenital malformations, accounted for nearly 25% of all hospitalisations. However, an aetiological study and/or geneticist consultation or referral took place on only a few occasions. Primary care hospitals should become more relevant reference centres for detecting genetic diseases amongst the paediatric population. New mechanisms are needed to allow patients access to a geneticist, an aetiological diagnosis to be made and suitable genetic counselling to be provided.

  15. Feasibility study of new energy projects on three-level indicator system

    NASA Astrophysics Data System (ADS)

    Zhan, Zhigang

    2018-06-01

    With the rapid development of the new energy industry, many new energy development projects are being carried out all over the world. To analyze project feasibility, we build a feasibility assessment model for new energy projects based on abundant gathered data about progress in new energy projects. Twelve indicators are selected by principal component analysis (PCA). We then construct a three-level indicator system, in which the first level has 1 indicator, the second level has 5 indicators and the third level has 12 indicators. Moreover, we use the entropy weight method (EWM) to obtain the weight vector of the third-level indicators and multivariate statistical analysis (MVA) to obtain the weight vector of the second-level indicators. We use this evaluation model to evaluate the feasibility of new energy projects and to provide a reference for subsequent new energy investment. This could contribute to the world's low-carbon and green development by encouraging investment in sustainable new energy projects. In future work we will introduce new variables and improve the weighting model. We also conduct a sensitivity analysis of the model and discuss its strengths and weaknesses.
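
    The entropy weight method (EWM) mentioned above follows a standard recipe: normalize each indicator across projects, compute each indicator's information entropy, and give larger weights to indicators with lower entropy (more discriminating power). The function below is a generic sketch with our own variable names, not the paper's code.

    ```python
    import numpy as np

    def entropy_weights(X):
        """X: n_projects x n_indicators matrix of positive, benefit-type
        indicator values. Returns one EWM weight per indicator."""
        X = np.asarray(X, float)
        n = X.shape[0]
        P = X / X.sum(axis=0, keepdims=True)            # each project's share
        with np.errstate(divide="ignore", invalid="ignore"):
            plogp = np.where(P > 0, P * np.log(P), 0.0)
        e = -plogp.sum(axis=0) / np.log(n)              # entropy per indicator
        d = 1.0 - e                                     # degree of diversification
        return d / d.sum()                              # normalized weights
    ```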

  16. Health Care Wide Hazards

    MedlinePlus


  17. Statistical analysis of electromagnetic radiation measurements in the vicinity of GSM/UMTS base station antenna masts.

    PubMed

    Koprivica, Mladen; Neskovic, Natasa; Neskovic, Aleksandar; Paunovic, George

    2014-01-01

    As a result of dense installations of public mobile base stations, additional electromagnetic radiation occurs in the living environment. In order to determine the level of radio-frequency radiation generated by base stations, extensive electromagnetic field strength measurements were carried out at 664 base station locations. Base station locations were classified into three categories: indoor, masts and locations with installations on buildings. Given the large percentage (47%) of sites with antenna masts, a detailed analysis of this location category was performed and the measurement results are presented. It was concluded that the total electric field strength in the vicinity of base station antenna masts in no case exceeded 10 V m⁻¹, which is well below the International Commission on Non-Ionizing Radiation Protection reference levels. At horizontal distances >50 m from the mast bottom, the median and maximum values were <1 and 2 V m⁻¹, respectively.

  18. Inductive reasoning and judgment interference: experiments on Simpson's paradox.

    PubMed

    Fiedler, Klaus; Walther, Eva; Freytag, Peter; Nickel, Stefanie

    2003-01-01

    In a series of experiments on inductive reasoning, participants assessed the relationship between gender, success, and a covariate in a situation akin to Simpson's paradox: Although women were less successful than men according to overall statistics, they actually fared better than men at either of two universities. Understanding trivariate relationships of this kind requires cognitive routines similar to analysis of covariance. Across the first five experiments, however, participants generalized the disadvantage of women at the aggregate level to judgments referring to the different levels of the covariate, even when motivation was high and appropriate mental models were activated. The remaining three experiments demonstrated that Simpson's paradox could be mastered when the salience of the covariate was increased and when the salience of gender was decreased by the inclusion of temporal cues that disambiguate the causal status of the covariate. Copyright 2003 Society for Personality and Social Psychology, Inc.
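
    A minimal numerical illustration of the reversal described above (all counts invented for clarity): women can have the lower overall success rate even though they have the higher rate at each university, because the two groups apply to the universities in different proportions.

    ```python
    # Hypothetical admission counts illustrating Simpson's paradox.
    data = {
        # (successes, applicants) at each university
        "University A": {"women": (18, 20), "men": (70, 80)},   # 90% vs 87.5%
        "University B": {"women": (20, 80), "men": (4, 20)},    # 25% vs 20%
    }

    for group in ("women", "men"):
        s = sum(data[u][group][0] for u in data)
        n = sum(data[u][group][1] for u in data)
        print(group, f"overall success rate: {s / n:.0%}")
    # women: 38/100 = 38%, men: 74/100 = 74% -- lower overall for women
    # despite higher rates at both universities.
    ```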

  19. Statistical analysis of electromagnetic radiation measurements in the vicinity of indoor microcell GSM/UMTS base stations in Serbia.

    PubMed

    Koprivica, Mladen; Petrić, Majda; Nešković, Nataša; Nešković, Aleksandar

    2016-01-01

    To determine the level of radiofrequency radiation generated by base stations of Global System for Mobile Communications and Universal Mobile Telecommunication System, extensive electromagnetic field strength measurements were carried out in the vicinity of 664 base station locations. These were classified into three categories: indoor, masts, and locations with installations on buildings. Although microcell base stations with antennas installed indoors typically emit less power than outdoor macrocell base stations, the fact that people can be found close to antennas requires exposure originating from these base stations to be carefully considered. Measurement results showed that maximum recorded value of electric field strength exceeded International Commission on Non-Ionizing Radiation Protection reference levels at 7% of indoor base station locations. At the same time, this percentage was much lower in the case of masts and installations on buildings (0% and 2.5%, respectively). © 2015 Wiley Periodicals, Inc.

  20. Ninety-three pictures and 108 questions for the elicitation of homophones

    PubMed Central

    FERREIRA, VICTOR S.; CUTTING, J. COOPER

    2007-01-01

    Homographs and homophones have interesting linguistic properties that make them useful in many experiments involving language. To assist researchers in the elicitation of homophones, this paper presents a set of 93 line-drawn pictures of objects with homophonic names and a set of 108 questions with homophonic answers. Statistics are also included for each picture and question: Picture statistics include name-agreement percentages, dominance, and frequency statistics of depicted referents, and picture-naming latencies both with and without study of the picture names. For questions, statistics include answer-agreement percentages, difficulty ratings, dominance, frequency statistics, and naming latencies for 60 of the most consistently answered questions. PMID:18185842

  1. Statistical methods for nuclear material management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, W. M.; Bennett, C. A.

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  2. Chaotic Dynamics of Linguistic-Like Processes at the Syntactical and Semantic Levels: in the Pursuit of a Multifractal Attractor

    NASA Astrophysics Data System (ADS)

    Nicolis, John S.; Katsikas, Anastassis A.

    Collective parameters such as the Zipf's law-like statistics, the Transinformation, the Block Entropy and the Markovian character are compared for natural, genetic, musical and artificially generated long texts from generating partitions (alphabets) on homogeneous as well as on multifractal chaotic maps. It appears that minimal requirements for a language at the syntactical level such as memory, selectivity of few keywords and broken symmetry in one dimension (polarity) are more or less met by dynamically iterating simple maps or flows, e.g., very simple chaotic hardware. The same selectivity is observed at the semantic level where the aim refers to partitioning a set of environmental impinging stimuli onto coexisting attractors-categories. Under the regime of pattern recognition and classification, few key features of a pattern or few categories claim the lion's share of the information stored in this pattern and practically, only these key features are persistently scanned by the cognitive processor. A multifractal attractor model can in principle explain this high selectivity, both at the syntactical and the semantic levels.

  3. Serum levels of C-reactive protein in adolescents with periodontitis.

    PubMed

    López, Rodrigo; Baelum, Vibeke; Hedegaard, Chris Juul; Bendtzen, Klaus

    2011-04-01

    The results of several cross-sectional studies suggested a relationship between periodontitis and higher serum levels of C-reactive protein (CRP). Most of these studies were restricted to adult study groups with severe periodontal inflammation, and the potential effects of confounding factors were frequently overlooked. A case-referent study comprised of 87 adolescent cases who presented with clinical attachment loss ≥3 mm recorded in ≥2 of 16 teeth and 73 controls who did not fulfill these criteria was nested in a fully enumerated adolescent population. Venous blood samples were obtained, and CRP levels were quantified, using a high-sensitive bead-based flow cytometric assay. The Mann-Whitney U test was used to assess overall differences between groups. The median serum CRP values for cases and controls were 64 ng/ml (interquartile range: 27 to 234 ng/ml) and 55 ng/ml (31 to 183 ng/ml), respectively (P = 0.8). Serum levels of CRP were not significantly higher among subjects with periodontitis than among controls. However, a statistically significant positive association between percentages of sites with bleeding on probing and log-transformed CRP values was observed.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedure techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.

  5. Using a detailed uncertainty analysis to adjust mapped rates of forest disturbance derived from Landsat time series data (Invited)

    NASA Astrophysics Data System (ADS)

    Cohen, W. B.; Yang, Z.; Stehman, S.; Huang, C.; Healey, S. P.

    2013-12-01

    Forest ecosystem process models require spatially and temporally detailed disturbance data to accurately predict fluxes of carbon or changes in biodiversity over time. A variety of new mapping algorithms using dense Landsat time series show great promise for providing disturbance characterizations at an annual time step. These algorithms provide unprecedented detail with respect to timing, magnitude, and duration of individual disturbance events, and causal agent. But all maps have error and disturbance maps in particular can have significant omission error because many disturbances are relatively subtle. Because disturbance, although ubiquitous, can be a relatively rare event spatially in any given year, omission errors can have a great impact on mapped rates. Using a high quality reference disturbance dataset, it is possible to not only characterize map errors but also to adjust mapped disturbance rates to provide unbiased rate estimates with confidence intervals. We present results from a national-level disturbance mapping project (the North American Forest Dynamics project) based on the Vegetation Change Tracker (VCT) with annual Landsat time series and uncertainty analyses that consist of three basic components: response design, statistical design, and analyses. The response design describes the reference data collection, in terms of the tool used (TimeSync), a formal description of interpretations, and the approach for data collection. The statistical design defines the selection of plot samples to be interpreted, whether stratification is used, and the sample size. Analyses involve derivation of standard agreement matrices between the map and the reference data, and use of inclusion probabilities and post-stratification to adjust mapped disturbance rates. Because for NAFD we use annual time series, both mapped and adjusted rates are provided at an annual time step from ~1985-present. Preliminary evaluations indicate that VCT captures most of the higher intensity disturbances, but that many of the lower intensity disturbances (thinnings, stress related to insects and disease, etc.) are missed. Because lower intensity disturbances are a large proportion of the total set of disturbances, adjusting mapped disturbance rates to include these can be important for inclusion in ecosystem process models. The described statistical disturbance rate adjustments are aspatial in nature, such that the basic underlying map is unchanged. For spatially explicit ecosystem modeling, such adjustments, although important, can be difficult to directly incorporate. One approach for improving the basic underlying map is an ensemble modeling approach that uses several different complementary maps, each derived from a different algorithm and having their own strengths and weaknesses relative to disturbance magnitude and causal agent of disturbance. We will present results from a pilot study associated with the Landscape Change Monitoring System (LCMS), an emerging national-level program that builds upon NAFD and the well-established Monitoring Trends in Burn Severity (MTBS) program.
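
    The rate adjustment described above amounts to a standard post-stratified estimator: reference-data (TimeSync) disturbance proportions estimated within map-defined strata are weighted by the strata's areal shares. The sketch below is our own illustration under that assumption, not NAFD project code, and it omits the variance and confidence-interval calculation.

    ```python
    import numpy as np

    def poststratified_rate(strata_weights, ref_labels_by_stratum):
        """strata_weights: {stratum: areal proportion of the map}.
        ref_labels_by_stratum: {stratum: array of 0/1 reference-plot labels
        (1 = disturbed according to the reference interpretation)}."""
        rate = 0.0
        for h, w_h in strata_weights.items():
            y = np.asarray(ref_labels_by_stratum[h], float)
            rate += w_h * y.mean()        # stratum area share x reference rate
        return rate

    # Invented example: the map calls 2% of the area disturbed, but reference
    # plots reveal omission in the 'undisturbed' stratum, raising the estimate.
    adjusted = poststratified_rate(
        {"mapped_disturbed": 0.02, "mapped_undisturbed": 0.98},
        {"mapped_disturbed": [1] * 45 + [0] * 5,       # 90% confirmed
         "mapped_undisturbed": [1] * 3 + [0] * 97},    # 3% omitted disturbance
    )
    print(f"adjusted annual disturbance rate: {adjusted:.3f}")
    ```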

  6. A comparison of the National Center for Health Statistics and new World Health Organization growth references for school-age children and adolescents with the use of data from 11 low-income countries.

    PubMed

    Rousham, Emily K; Roschnik, Natalie; Baylon, Melba Andrea B; Bobrow, Emily A; Burkhanova, Mavzuna; Campion, M Gerda; Adle-Chua, Teresita; Degefie, Tedbabe; Hilari, Caroline; Kalengamaliro, Humphreys; Kassa, Tamiru; Maiga, Fadima; Mahumane, Bonifacio J; Mukaka, Mary; Ouattara, Fatimata; Parawan, Amado R; Sacko, Moussa; Patterson, David W; Sobgo, Gaston; Khandaker, Ikhtiar Uddin; Hall, Andrew

    2011-08-01

    In 2007 new World Health Organization (WHO) growth references for children aged 5-19 y were introduced to replace the National Center for Health Statistics (NCHS) references. This study aimed to compare the prevalence of stunting, wasting, and thinness estimated by the NCHS and WHO growth references. NCHS and WHO height-for-age z scores were calculated with the use of cross-sectional data from 20,605 schoolchildren aged 5-17 y in 11 low-income countries. The differences in the percentage of stunted children were estimated for each year of age and sex. The z scores of body mass index-for-age and weight-for-height were calculated with the use of the WHO and NCHS references, respectively, to compare differences in the prevalence of thinness and wasting. No systematic differences in mean z scores of height-for-age were observed between the WHO and NCHS growth references. However, z scores of height-for-age varied by sex and age, particularly during early adolescence. In children for whom weight-for-height could be calculated, the estimated prevalence of thinness (WHO reference) was consistently higher than the prevalence of wasting (NCHS reference) by as much as 9% in girls and 18% in boys. In undernourished populations, the application of the WHO (2007) references may result in differences in the prevalence of stunting for each sex compared with results shown when the NCHS references are used as well as a higher estimated prevalence of thinness than of wasting. An awareness of these differences is important for comparative studies or the evaluation of programs. For school-age children and adolescents across all ranges of anthropometric status, the same growth references should be applied when such studies are undertaken.

  7. Plasma creatinine in dogs: intra- and inter-laboratory variation in 10 European veterinary laboratories

    PubMed Central

    2011-01-01

    Background There is substantial variation in reported reference intervals for canine plasma creatinine among veterinary laboratories, thereby influencing the clinical assessment of analytical results. The aims of the study were to determine the inter- and intra-laboratory variation in plasma creatinine among 10 veterinary laboratories, and to compare results from each laboratory with the upper limit of its reference interval. Methods Samples were collected from 10 healthy dogs, 10 dogs with expected intermediate plasma creatinine concentrations, and 10 dogs with azotemia. Overlap was observed for the first two groups. The 30 samples were divided into 3 batches and shipped in random order by postal delivery for plasma creatinine determination. Statistical testing was performed in accordance with ISO standard methodology. Results Inter- and intra-laboratory variation was clinically acceptable as plasma creatinine values for most samples were usually of the same magnitude. A few extreme outliers caused three laboratories to fail statistical testing for consistency. Laboratory sample means above or below the overall sample mean did not unequivocally reflect high or low reference intervals in that laboratory. Conclusions In spite of close analytical results, further standardization among laboratories is warranted. The discrepant reference intervals seem to largely reflect different populations used in establishing the reference intervals, rather than analytical variation due to different laboratory methods. PMID:21477356

  8. Construction and comparative evaluation of different activity detection methods in brain FDG-PET.

    PubMed

    Buchholz, Hans-Georg; Wenzel, Fabian; Gartenschläger, Martin; Thiele, Frank; Young, Stewart; Reuss, Stefan; Schreckenberger, Mathias

    2015-08-18

    We constructed and evaluated reference brain FDG-PET databases for usage by three software programs (Computer-aided diagnosis for dementia (CAD4D), Statistical Parametric Mapping (SPM) and NEUROSTAT), which allow a user-independent detection of dementia-related hypometabolism in patients' brain FDG-PET. Thirty-seven healthy volunteers were scanned in order to construct brain FDG reference databases, which reflect the normal, age-dependent glucose consumption in human brain, with each software package. Databases were compared to each other to assess the impact of the different stereotactic normalization algorithms used by each software package. In addition, performance of the new reference databases in the detection of altered glucose consumption in the brains of patients was evaluated by calculating statistical maps of regional hypometabolism in FDG-PET of 20 patients with confirmed Alzheimer's dementia (AD) and of 10 non-AD patients. Extent (hypometabolic volume referred to as cluster size) and magnitude (peak z-score) of detected hypometabolism were statistically analyzed. Differences between the reference databases built by CAD4D, SPM or NEUROSTAT were observed. Due to the different normalization methods, altered spatial FDG patterns were found. When analyzing patient data with the reference databases created using CAD4D, SPM or NEUROSTAT, similar characteristic clusters of hypometabolism in the same brain regions were found in the AD group with each software package. However, larger z-scores were observed with CAD4D and NEUROSTAT than those reported by SPM. Better concordance with CAD4D and NEUROSTAT was achieved using the spatially normalized images of SPM and an independent z-score calculation. The three software packages identified the peak z-scores in the same brain region in 11 of 20 AD cases, and there was concordance between CAD4D and SPM in 16 AD subjects. The clinical evaluation of brain FDG-PET of 20 AD patients with CAD4D-, SPM- or NEUROSTAT-generated databases from an identical reference dataset showed similar patterns of hypometabolism in the brain regions known to be involved in AD. The extent of hypometabolism and peak z-score appeared to be influenced by the calculation method used in each software package rather than by different spatial normalization parameters.

  9. The Performance of a PN Spread Spectrum Receiver Preceded by an Adaptive Interference Suppression Filter.

    DTIC Science & Technology

    1982-12-01

    Glossary fragment: ... sequence; dj, estimate of the desired signal; DEL, sampling time interval; DS, direct sequence; c, sufficient statistic; E/T, signal power; Erfc, complementary error function ... Namely, a white Gaussian noise (WGN) generator was added. Also, a statistical subroutine was added in order to assess performance improvement at the ... reference code and then passed through a correlation detector whose output is the sufficient statistic, c. Using a threshold device and the sufficient ...

  10. The influence of lidocaine topical anesthesia during transesophageal echocardiography on blood methemoglobin level and risk of methemoglobinemia.

    PubMed

    Filipiak-Strzecka, Dominika; Kasprzak, Jarosław D; Wiszniewska, Marta; Walusiak-Skorupa, Jolanta; Lipiec, Piotr

    2015-04-01

    Methemoglobinemia is a relatively rare, but potentially life-threatening medical condition, which may be induced by application of topical anaesthetic agents commonly used during endoscopic procedures. The aim of our study was to assess the influence of lidocaine used prior to transesophageal echocardiography (TEE) on the blood level of methemoglobin in vivo. Additionally, we attempted to establish the occurrence rate of clinically evident lidocaine-induced methemoglobinemia on the basis of data collected in our institution. We retrospectively analyzed patient records from 3,354 TEEs performed in our echocardiographic laboratory over the course of 13 years in search of clinically evident methemoglobinemia cases. Additionally, 18 consecutive patients referred for TEE were included in the prospective part of our analysis. Blood samples were tested before and 60 min after pre-TEE lidocaine anesthesia application. Information concerning concomitant conditions and pharmacotherapy was also obtained. In the 3,354 patients who underwent TEE in our institution, no cases of clinically evident methemoglobinemia occurred. In the prospective part of the study, none of the 18 patients [16 (89%) men, mean age 63 ± 13] showed clinical symptoms of methemoglobinemia or a blood methemoglobin concentration exceeding normal values. The initial mean methemoglobin level was 0.5 ± 0.1%, with a mild, statistically (but not clinically) significant rise to 0.6 ± 0.1% after 60 min (p = 0.02). Among the analyzed factors, only the relation between proton pump inhibitor intake and the rise in blood methemoglobin level was identified as statistically significant (p = 0.03). In adults, pre-TEE lidocaine anesthesia with the recommended dosage results in a significant increase in blood methemoglobin level, which, however, does not exceed normal values and does not result in clinically evident methemoglobinemia.

  11. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091

  12. Community-oriented medical education and clinical training: comparison by medical students in hospitals.

    PubMed

    Ali, Azizi

    2012-10-01

    To determine how students compared their one-month educational training in Community-Oriented Medical Education with hospital clinical education. Observational study. Kermanshah Community-Oriented Medical Education Field, Kermanshah University of Medical Sciences, Kermanshah, Iran, from April 2000 to February 2009. Since 2000, medical interns of Kermanshah University of Medical Sciences have spent one month in the field of community-oriented medical education. At the end of the one-month period, the interns completed an 11-question questionnaire (based on the Likert scale) to assess the level of education in the field compared to hospital clinics. Data from questionnaires collected and completed from 2000 through 2009 (948 questionnaires) were analyzed in SPSS 18 using descriptive statistics (percentages) and analytic statistics (chi-square test). The 948 students consisted of 66.4% males (n = 666) and 33.6% females (n = 282). All 11 comparison variables were rated more favourably for field education than for hospital training. The greatest differences pertained to referring patients to the relevant health units (82% vs. 23.3%), patience in education (84.6% vs. 37.1%), consideration given to the three levels of prevention (77.2% vs. 33.6%) and the attention paid to the presence of students (91.7% vs. 51.8%), all of which were statistically significant (p < 0.0001). According to the interns, the educational status of the specialized clinics in the field was superior to that of the corresponding hospital clinics (p < 0.0001). From the standpoint of medical students, training in community-oriented medical education in the field was better than training in hospital clinics.

  13. How to Build a Desk Statistics Tracker in Less than an Hour Using Forms in Google Docs

    ERIC Educational Resources Information Center

    Carter, Sunshine; Ambrosi, Thomas

    2011-01-01

    The University of Minnesota-Duluth is the second largest campus in the University of Minnesota system. The UMD library, which serves more than 11,000 students and 500 faculty members, is primarily an undergraduate library. The reference team consists of eight librarians, including author Sunshine Carter, reference and electronics resources…

  14. 76 FR 31787 - United States Standards for Grades of Potatoes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... from the National Agricultural Statistics Service (NASS), the average potato crop value for 2006-2008... Table VII in 5Sec. 1.1565. 0 8. Section 51.1564 is amended by: 0 A. Amending the introductory text by... introductory text by removing the reference ``Table IV'', and by adding the reference ``Table VII'', in its...

  15. [Research of Odo Bujwid (1857-1942) concerning the vaccine against rabies-historical characterisation].

    PubMed

    Wasiewicz, Barbara

    2016-01-01

    The present article provides a historical characterisation of Odo Bujwid's (1857-1942) research concerning the vaccine against rabies. The introduction refers to the treatment methods applied before Ludwik Pasteur's discovery. The following part covers Odo Bujwid's own research, including diagnostics, characterisation of the symptoms of the disease, modification of Ludwik Pasteur's original method and statistical information. The summary emphasizes that Odo Bujwid's scientific research introduced and disseminated worldwide microbiological knowledge in the Polish lands.

  16. Is the 2000 CDC growth reference appropriate for developing countries?

    PubMed

    Roberfroid, Dominique; Lerude, Marie-Paule; Pérez-Cueto, Armando; Kolsteren, Patrick

    2006-04-01

    In 2000, the Centers for Disease Control and Prevention (CDC) produced a revised growth reference. This has already been used in different settings outside the USA. Using data obtained during a nutritional survey in Madagascar, we compare results produced by using both the 2000 CDC and the 1978 National Center for Health Statistics (NCHS)/World Health Organization (WHO) growth references. We show that changing the reference has an important impact on nutritional diagnosis. In particular, the prevalence of wasting is greatly increased. This could generate substantial operational and clinical difficulties. We recommend continued use of the 1978 NCHS/WHO reference until release of the new WHO multi-country growth charts.

  17. Exercise capacity in pediatric patients with inflammatory bowel disease.

    PubMed

    Ploeger, Hilde E; Takken, Tim; Wilk, Boguslaw; Issenman, Robert M; Sears, Ryan; Suri, Soni; Timmons, Brian W

    2011-05-01

    To examine exercise capacity in youth with Crohn's disease (CD) and ulcerative colitis (UC). Eleven males and eight females with CD and six males and four females with UC participated. Patients performed standard exercise tests to assess peak power (PP) and mean power (MP) and peak aerobic mechanical power (W(peak)) and peak oxygen uptake (VO(2peak)). Fitness variables were compared with reference data and also correlated with relevant clinical outcomes. Pediatric patients with inflammatory bowel disease had lower PP (∼90% of predicted), MP (∼88% of predicted), W(peak) (∼91% of predicted), and VO(2peak) (∼75% of predicted) compared with reference values. When patients with CD or UC were compared separately to reference values, W(peak) was significantly lower only in the CD group. No statistically significant correlations were found between any exercise variables and disease duration (r = 0.01 to 0.14, P = .47 to .95) or disease activity (r = -0.19 to -0.31, P = .11 to .38), measured by pediatric CD activity index or pediatric ulcerative colitis activity index. After controlling for chronological age, recent hemoglobin levels were significantly correlated with PP (r = 0.45, P = .049), MP (r = 0.63, P = .003), VO(2peak) (r = 0.62, P = .004), and W(peak) (r = 0.70, P = .001). Pediatric patients with inflammatory bowel disease exhibit impaired aerobic and anaerobic exercise capacity compared with reference values. Copyright © 2011 Mosby, Inc. All rights reserved.

  18. Factors affecting members' evaluation of agri-business ventures' effectiveness.

    PubMed

    Hashemi, Seyyed Mahmoud; Hedjazi, Yousef

    2011-02-01

    This paper presents work to identify factors affecting the effectiveness of agri-business ventures (A-BVs), on the provider side, as perceived by their members. A survey was conducted among 95 members of A-BVs in Zanjan province, Iran, using a questionnaire designed to collect the data. Two distinct groups of A-BVs with low (group 1) and high (group 2) perceived (evaluated) levels of effectiveness were revealed. The study showed significant differences between the two groups on important characteristics of A-BVs and their members. It also found statistically significant relationships between A-BVs' governance structure and capacity, management and organization characteristics and the perceived effectiveness, whereas there were no statistically significant relationships between the advisory methods applied by members and the perceived effectiveness. Logistic regression results also showed that the level of application of rules encouraging members' active participation in important decision making, clear terms of reference to guide contracting procedures, roles and responsibilities of the parties involved, the type of people served and geographical area of program coverage, and members' ability to use Information and Communication Technologies (ICTs) were predictors of the perceived (evaluated) effectiveness of A-BVs. The study showed that members' evaluations of the effectiveness of A-BVs are not uniform. It is suggested that the Iranian public agricultural extension organization, as the organization responsible for monitoring and evaluating services conducted by A-BVs, take these differences between members into account. 2010 Elsevier Ltd. All rights reserved.

  19. The Effect of Risk Factors on the Levels of Chemical Elements in the Tibial Plateau of Patients with Osteoarthritis following Knee Surgery.

    PubMed

    Lanocha-Arendarczyk, Natalia; Kosik-Bogacka, Danuta Izabela; Prokopowicz, Adam; Kalisinska, Elzbieta; Sokolowski, Sebastian; Karaczun, Maciej; Zietek, Pawel; Podlasińska, Joanna; Pilarczyk, Bogumila; Tomza-Marciniak, Agnieszka; Baranowska-Bosiacka, Irena; Gutowska, Izabela; Safranow, Krzysztof; Chlubek, Dariusz

    2015-01-01

    The aim of this study was to evaluate the levels of selected chemical elements in tibial plateau samples obtained during knee arthroplasty. The gender-specific analysis of chemical element levels in the bone samples revealed statistically significant differences in the concentration of Pb and in the Se/Pb ratio. The contents of elements in the tibial plateau in the patients with osteoarthritis (OA) can be arranged in the following descending order: F(-) > K > Zn > Fe > Sr > Pb > Mn > Se > Cd > THg. We observed statistically significant effects of environmental factors, including smoking, seafood diet and geographical distribution, on the levels of the elements in tibial bone. Significant positive correlation coefficients were found for the relationships K-Cd, Zn-Sr, Zn-F(-), THg-Pb, Pb-Cd, Se-Se/Pb, Se-Se/Cd, Se/Pb-Se/Cd, Pb-Cd/Ca, Cd-Cd/Ca, and F(-)-F(-)/Ca·1000. Significant negative correlations were found for the relationships THg-Se/Pb, Pb-Se/Pb, Cd-Se/Pb, K-Se/Cd, Pb-Se/Cd, Cd-Se/Cd, THg-Se/THg, Pb-Se/THg, Se-Pb/Cd, Zn-Cd/Ca, and Se/Cd-Cd/Ca. The results reported here may provide a basis for establishing reference values for the tibial plateau in patients with OA who had undergone knee replacement surgery. The concentrations of elements in the bone with OA were determined by age, presence of implants, smoking, fish and seafood diet, and sport activity.

  20. A hybrid downscaling procedure for estimating the vertical distribution of ambient temperature in local scale

    NASA Astrophysics Data System (ADS)

    Yiannikopoulou, I.; Philippopoulos, K.; Deligiorgi, D.

    2012-04-01

    The vertical thermal structure of the atmosphere is defined by a combination of dynamic and radiation transfer processes and plays an important role in describing the meteorological conditions at local scales. The scope of this work is to develop and quantify the predictive ability of a hybrid dynamic-statistical downscaling procedure to estimate the vertical profile of ambient temperature at finer spatial scales. The study focuses on the warm period of the year (June - August) and the method is applied to an urban coastal site (Hellinikon), located in the eastern Mediterranean. The two-step methodology initially involves the dynamic downscaling of coarse resolution climate data via the RegCM4.0 regional climate model and subsequently the statistical downscaling of the modeled outputs by developing and training site-specific artificial neural networks (ANN). The 2.5° × 2.5° gridded NCEP-DOE Reanalysis 2 dataset is used as initial and boundary conditions for the dynamic downscaling element of the methodology, which enhances the regional representativeness of the dataset to 20 km and provides modeled fields at 18 vertical levels. The regional climate modeling results are compared against the upper-air Hellinikon radiosonde observations and the mean absolute error (MAE) is calculated between the four grid point values nearest to the station and the ambient temperature at the standard and significant pressure levels. The statistical downscaling element of the methodology consists of an ensemble of ANN models, one for each pressure level, which are trained separately and employ the regional scale RegCM4.0 output. The ANN models are theoretically capable of estimating any measurable input-output function to any desired degree of accuracy. In this study they are used as non-linear function approximators for identifying the relationship between a number of predictor variables and the ambient temperature at the various vertical levels. An insight into the statistically derived input-output transfer functions is obtained by utilizing the ANN weights method, which quantifies the relative importance of the predictor variables in the estimation procedure. The overall downscaling performance evaluation incorporates a set of correlation and statistical measures along with appropriate statistical tests. The hybrid downscaling method presented in this work can be extended to various locations by training different site-specific ANN models and the results, depending on the application, can be used for assisting the understanding of the past, present and future climatology. Acknowledgement: This research has been co-financed by the European Union and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: Heracleitus II: Investing in knowledge society through the European Social Fund.
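
    The statistical element of the methodology amounts to fitting one nonlinear regressor per pressure level that maps regional-model predictors to the observed temperature at that level. The scikit-learn sketch below uses placeholder data and our own variable names; it illustrates the idea and is not the study's ANN configuration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # X_level: predictors derived from the regional model at one pressure level
    # y_level: observed temperature at that level; placeholder values here.
    rng = np.random.default_rng(1)
    X_level = rng.normal(size=(300, 6))
    y_level = X_level @ rng.normal(size=6) + rng.normal(scale=0.5, size=300)

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
    )
    model.fit(X_level[:250], y_level[:250])                  # train on early days
    mae = np.mean(np.abs(model.predict(X_level[250:]) - y_level[250:]))
    print(f"hold-out MAE at this level: {mae:.2f}")
    ```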

  1. Eliminating traditional reference services in an academic health sciences library: a case study

    PubMed Central

    Schulte, Stephanie J

    2011-01-01

    Question: How were traditional librarian reference desk services successfully eliminated at one health sciences library? Setting: The analysis was done at an academic health sciences library at a major research university. Method: A gap analysis was performed, evaluating changes in the first eleven months through analysis of reference transaction and instructional session data. Main Results: Substantial increases were seen in the overall number of specialized reference transactions and those conducted by librarians lasting more than thirty minutes. The number of reference transactions overall increased after implementing the new model. Several new small-scale instructional initiatives began, though perhaps not directly related to the new model. Conclusion: Traditional reference desk services were eliminated at one academic health sciences library without negative impact on reference and instructional statistics. Eliminating ties to the confines of the physical library due to staffing reference desk hours removed one significant barrier to a more proactive liaison program. PMID:22022221

  2. Review of Army Officer Educational System. Volume 1. Summary Report

    DTIC Science & Technology

    1971-12-01

    Table-of-contents fragment: V. Staffing Guide; VI. Academic Facilities; VII. Educational Innovations; Chapter 14 ... The report does not lean heavily on statistical support. It gives references, research, and statistical data only when essential to validity, accuracy, or ... Leadership, History, Interbranch and Interservice Education, Facilities, Regulations, Staffing Guide, Educational Innovations; Chapter 14 - Concluding ...

  3. 2.5-Year-Olds Use Cross-Situational Consistency to Learn Verbs under Referential Uncertainty

    ERIC Educational Resources Information Center

    Scott, Rose M.; Fisher, Cynthia

    2012-01-01

    Recent evidence shows that children can use cross-situational statistics to learn new object labels under referential ambiguity (e.g., Smith & Yu, 2008). Such evidence has been interpreted as support for proposals that statistical information about word-referent co-occurrence plays a powerful role in word learning. But object labels represent only…

  4. Measuring Classroom Management Expertise (CME) of Teachers: A Video-Based Assessment Approach and Statistical Results

    ERIC Educational Resources Information Center

    König, Johannes

    2015-01-01

    The study aims at developing and exploring a novel video-based assessment that captures classroom management expertise (CME) of teachers and for which statistical results are provided. CME measurement is conceptualized by using four video clips that refer to typical classroom management situations in which teachers are heavily challenged…

  5. Zero Autocorrelation Waveforms: A Doppler Statistic and Multifunction Problems

    DTIC Science & Technology

    2006-01-01

    It is natural to refer to A as the ambiguity function of u, since in the usual setting on the real line R, the analogue ambiguity ... Doppler statistic |C_{u,ue_k}(j)| is excellent and provable for detecting ... Doppler frequency shift [11] (see Fig. 2). Also, if one graphs only ...

  6. Bayesian statistics in medicine: a 25 year review.

    PubMed

    Ashby, Deborah

    2006-11-15

    This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.

  7. Reference values for the CAVIPRES-30 questionnaire, a global questionnaire on the health-related quality of life of patients with prostate cancer.

    PubMed

    Gómez-Veiga, F; Silmi-Moyano, A; Günthner, S; Puyol-Pallas, M; Cózar-Olmo, J M

    2014-06-01

    To define and establish the reference values of the CAVIPRES-30 Questionnaire, a health-related quality of life questionnaire specific to prostate cancer patients. The CAVIPRES-30 was administered to 2,630 males with prostate cancer enrolled by 238 urologists belonging to the Spanish National Healthcare System. Descriptive analyses of socio-demographic and clinical data were performed, and multivariate analyses were used to corroborate that the stratification variables were statistically significantly and independently associated with the overall score of the questionnaire. The variables Time since diagnosis of the illness, whether the patient had a Stable partner or not, and whether or not he was undergoing Symptomatic treatment were statistically significantly and independently associated (P < .001) with the overall score of the questionnaire. The reference values table of the CAVIPRES-30 questionnaire is made up of different kinds of information for each patient profile: sample size, descriptive statistics with regard to the overall score, Cronbach's alpha value (between .791 and .875), and the questionnaire's values reported by deciles. The results of this study provide further evidence of the suitability and usefulness of the CAVIPRES-30 questionnaire as an instrument for individually assessing the quality of life of prostate cancer patients. Copyright © 2013 AEU. Published by Elsevier Espana. All rights reserved.

  8. From plastic to gold: a unified classification scheme for reference standards in medical image processing

    NASA Astrophysics Data System (ADS)

    Lehmann, Thomas M.

    2002-05-01

    Reliable evaluation of medical image processing is of major importance for routine applications. Nonetheless, evaluation is often omitted or methodically defective when novel approaches or algorithms are introduced. Adopted from medical diagnosis, we define the following criteria to classify reference standards: 1. Reliance, if the generation or capturing of test images for evaluation follows an exactly determined and reproducible protocol. 2. Equivalence, if the image material or relationships considered within an algorithmic reference standard equal real-life data with respect to structure, noise, or other parameters of importance. 3. Independence, if any reference standard relies on a different procedure than that to be evaluated, or on other images or image modalities than those used routinely. This criterion bans the simultaneous use of one image for both the training and the test phase. 4. Relevance, if the algorithm to be evaluated is self-reproducible. If random parameters or optimization strategies are applied, reliability of the algorithm must be shown before the reference standard is applied for evaluation. 5. Significance, if the number of reference standard images that are used for evaluation is sufficiently large to enable statistically sound analysis. We demand that a true gold standard must satisfy Criteria 1 to 3. Any standard only satisfying two criteria, i.e., Criterion 1 and Criterion 2 or Criterion 1 and Criterion 3, is referred to as a silver standard. Other standards are termed plastic. Before exhaustive evaluation based on gold or silver standards is performed, its relevance must be shown (Criterion 4) and sufficient tests must be carried out to provide a sound statistical basis (Criterion 5). In this paper, examples are given for each class of reference standards.
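
    The classification scheme itself can be encoded directly from the definitions above: gold if Criteria 1 to 3 all hold, silver if Criterion 1 holds together with exactly one of Criteria 2 and 3, and plastic otherwise. The function name below is our own, used only for illustration.

    ```python
    def classify_reference_standard(reliance, equivalence, independence):
        """Map the three defining criteria (booleans) to the class names
        used in the paper: gold, silver, or plastic."""
        if reliance and equivalence and independence:
            return "gold"
        if reliance and (equivalence or independence):
            return "silver"
        return "plastic"

    assert classify_reference_standard(True, True, True) == "gold"
    assert classify_reference_standard(True, False, True) == "silver"
    assert classify_reference_standard(False, True, True) == "plastic"
    ```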

  9. New Korean reference for birth weight by gestational age and sex: data from the Korean Statistical Information Service (2008-2012).

    PubMed

    Lim, Jung Sub; Lim, Se Won; Ahn, Ju Hyun; Song, Bong Sub; Shim, Kye Shik; Hwang, Il Tae

    2014-09-01

    To construct new Korean reference curves for birth weight by sex and gestational age using contemporary Korean birth weight data and to compare them with the Lubchenco and the 2010 United States (US) intrauterine growth curves. Data on 2,336,727 newborns from the Korean Statistical Information Service (2008-2012) were used. Smoothed percentile curves were created by the Lambda Mu Sigma method using a subsample of singletons. The new Korean reference curves were compared with the Lubchenco and the 2010 US intrauterine growth curves. Reference values for the 3rd, 10th, 25th, 50th, 75th, 90th, and 97th percentiles of birth weight by gestational age were produced using 2,249,804 (male, 1,159,070) singleton newborns with gestational ages of 23-43 weeks. Separate birth weight curves were constructed for males and females. The Korean reference curves are similar to the 2010 US intrauterine growth curves. However, the cutoff values for small for gestational age (<10th percentile) of the new Korean curves differed from those of the Lubchenco curves for each gestational age. The Lubchenco curves underestimated the percentage of infants who were born small for gestational age. The new Korean reference curves for birth weight show a different pattern from the Lubchenco curves, which were made from white neonates more than 60 years ago. Further research on short-term and long-term health outcomes of small for gestational age babies based on the new Korean reference data is needed.
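
    Under the Lambda Mu Sigma (LMS) method used above, a percentile curve at a given gestational age and sex follows directly from that age's L, M and S parameters: the value at percentile p is M(1 + L·S·z_p)^{1/L}, or M·exp(S·z_p) when L = 0. The helper below uses invented parameter values for illustration; it is not the published Korean reference table.

    ```python
    import math
    from scipy.stats import norm

    def lms_percentile(L, M, S, p):
        """Birth weight at percentile p for one gestational age and sex,
        given LMS parameters (illustrative only, not the published values)."""
        z = norm.ppf(p)
        if abs(L) < 1e-8:
            return M * math.exp(S * z)
        return M * (1.0 + L * S * z) ** (1.0 / L)

    # Hypothetical L, M (grams), S at 40 weeks: approximate 10th-percentile weight.
    print(round(lms_percentile(0.3, 3300, 0.11, 0.10)))
    ```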

  10. A 1-year randomized study to evaluate the effects of a dose reduction in oral contraceptives on lipids and carbohydrate metabolism: 20 microg ethinyl estradiol combined with 100 microg levonorgestrel.

    PubMed

    Skouby, Sven O; Endrikat, Jan; Düsterberg, Bernd; Schmidt, Werner; Gerlinger, Christoph; Wessel, Jens; Goldstein, Henri; Jespersen, Joergen

    2005-02-01

    To evaluate the impact on lipid and carbohydrate variables of a combined one-third ethinyl estradiol (EE)/levonorgestrel (LNG) dose reduction in oral contraceptives. In an open-label, randomized study, a dose-reduced oral contraceptive containing 20 microg EE and 100 microg LNG (20 EE/100 LNG) was compared with a reference preparation containing 30 microg EE and 150 microg LNG (30 EE/150 LNG). One-year data from 48 volunteers were obtained. We found a decrease of HDL2 cholesterol and increases of low-density lipoprotein cholesterol, very low-density lipoprotein cholesterol and total triglycerides in both treatment groups from baseline to the 13th treatment cycle. Although for four of six variables, the changes in the 20 EE group were lower compared with the 30 EE group, none of the differences between the two treatments were statistically significant. The median values for the fasting levels of insulin, C-peptide and free fatty acids slightly increased or remained unchanged while the fasting glucose levels slightly decreased after 13 treatment cycles. While the glucose area under the curve (AUC) (0-3 h) was similar in both groups during the OGTT, the insulin AUC(0-3 h) was less increased in the 20 EE/100 LNG group compared with the 30 EE/150 LNG group. None of the differences between the treatment groups for any of the carbohydrate metabolism variables were statistically significant at any time point. Both study treatments were safe and well tolerated by the volunteers. Similar effects on the lipid and carbohydrate profiles were found for both preparations. The balanced one-third EE dose reduction in this new oral contraceptive caused slightly lower, but insignificant, changes in the lipid and carbohydrate variables compared with the reference treatment.

  11. SU-E-I-33: Establishment of CT Diagnostic Reference Levels in Province Nova Scotia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonkopi, E; Abdolell, M; Duffy, S

    2015-06-15

    Purpose: To evaluate patient radiation dose from the most frequently performed CT examinations and to establish provincial diagnostic reference levels (DRLs) as a tool for protocol optimization. Methods: The study investigated the following CT examinations: head, chest, abdomen/pelvis, and chest/abdomen/pelvis (CAP). Dose data, the volume CT dose index (CTDIvol) and the dose-length product (DLP), were collected from 15 CT scanners installed during 2004–2014 in 11 hospital sites of Nova Scotia. All scanners had dose modulation options and multislice capability (16–128 detector rows). The sample for each protocol included 15 average-size patients (70±20 kg). Provincial DRLs were calculated as the 75th percentile of the patient dose distributions. The differences in dose between hospitals were evaluated with a single-factor ANOVA test. Generalized linear modeling was used to determine the factors associated with higher radiation dose. A sample of 36 abdominal studies performed on three different scanners was blinded and randomized for assessment by an experienced radiologist, who graded the image quality of anatomic structures. Results: Data for 900 patients were collected. The DRLs were proposed using CTDIvol (mGy) and DLP (mGy*cm) values for CT head (67 and 1049, respectively), chest (12 and 393), abdomen/pelvis (16 and 717), and CAP (14 and 1034). These DRLs were lower than the published national data except for the head CTDIvol. The differences between the means of the dose distributions from each scanner were statistically significant (p<0.05) for all examinations. A very weak correlation was found between the dose and the scanner age or the number of slices, with Pearson's correlation coefficients of 0.011–0.315. The blinded analysis of image quality demonstrated no clinically significant difference except for the noise category. Conclusion: Provincial DRLs were established for typical CT examinations. The variations in dose between the hospitals suggested a large potential for optimization of examinations. Supported by a Radiology Research Foundation grant.
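
    The two statistics at the core of this study, a DRL defined as the 75th percentile of a dose-indicator distribution and a single-factor ANOVA across hospital sites, can be reproduced in a few lines. The sketch below is illustrative only; the CTDIvol values and site groupings are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical CTDIvol (mGy) samples for one protocol at three sites
site_a = np.array([10.8, 12.1, 11.5, 13.0, 12.4])
site_b = np.array([14.2, 13.6, 15.1, 14.8, 13.9])
site_c = np.array([11.0, 11.9, 12.7, 12.2, 11.4])

# Provincial DRL: 75th percentile of the pooled dose distribution
drl = np.percentile(np.concatenate([site_a, site_b, site_c]), 75)

# Single-factor ANOVA testing whether mean dose differs between sites
f_stat, p_value = stats.f_oneway(site_a, site_b, site_c)
print(f"DRL = {drl:.1f} mGy, ANOVA p = {p_value:.3f}")
```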

  12. Dentascan – Is the Investment Worth the Hype ???

    PubMed Central

    Shah, Monali A; Shah, Sneha S; Dave, Deepak

    2013-01-01

    Background: Open Bone Measurement (OBM) and Bone Sounding (BS) are the most reliable but invasive clinical methods for Alveolar Bone Level (ABL) assessment, causing discomfort to the patient. Routinely, intraoral periapical radiographs (IOPAs) and orthopantomograms (OPGs) are the commonest radiographic techniques used, which tend to underestimate bone loss and obscure buccal/lingual defects. A newer technique such as Dentascan (cone-beam computed tomography, CBCT) eliminates this limitation by giving images in three planes: sagittal, coronal and axial. Aim: To compare and correlate the non-invasive 3D radiographic technique of Dentascan with BS, OBM, IOPA and OPG in assessing the ABL. Settings and Design: Cross-sectional diagnostic study. Materials and Methods: Two hundred and five sites were subjected to clinical and radiographic diagnostic techniques. The relative distance between the alveolar bone crest and a reference wire was measured. All measurements were compared and tested against the OBM. Statistical Analysis: Student's t-test, ANOVA, Pearson correlation coefficient. Results: There was a statistically significant difference between Dentascan and OBM; only BS showed agreement with OBM (p < 0.05). Dentascan correlated only weakly with OBM and BS lingually. All other techniques showed statistically significant differences between them (p = 0.00). Conclusion: Within the limitations of this study, only BS appears comparable with OBM, with no superior result of Dentascan over the conventional techniques except for lingual measurements. PMID:24551722

  13. The importance of vegetation change in the prediction of future tropical cyclone flood statistics

    NASA Astrophysics Data System (ADS)

    Irish, J. L.; Resio, D.; Bilskie, M. V.; Hagen, S. C.; Weiss, R.

    2015-12-01

    Global sea level rise is a near certainty over the next century (e.g., Stocker et al. 2013 [IPCC] and references therein). With sea level rise, coastal topography and land cover (hereafter "landscape") are expected to change, and tropical cyclone flood hazard is expected to accelerate (e.g., Irish et al. 2010 [Ocean Eng], Woodruff et al. 2013 [Nature], Bilskie et al. 2014 [Geophys Res Lett], Ferreira et al. 2014 [Coast Eng], Passeri et al. 2015 [Nat Hazards]). Yet, the relative importance of sea-level-rise-induced landscape change for future tropical cyclone flood hazard assessment is not known. In this paper, idealized scenarios are used to evaluate the relative impact of one class of landscape change on future tropical cyclone extreme-value statistics in back-barrier regions: sea-level-rise-induced vegetation migration and loss. The joint probability method with optimal sampling (JPM-OS) (Resio et al. 2009 [Nat Hazards]) with idealized surge response functions (e.g., Irish et al. 2009 [Nat Hazards]) is used to quantify the present-day and future flood hazard under various sea level rise scenarios. Results are evaluated in terms of their impact on the flood statistics (a) when projected flood elevations are included directly in the JPM analysis (Figure 1) and (b) when represented as additional uncertainty within the JPM integral (Resio et al. 2013 [Nat Hazards]), i.e., as random error. Findings are expected to aid in determining the level of effort required to reasonably account for future landscape change in hazard assessments, namely in determining when such processes are sufficiently captured by added uncertainty and when sea-level-rise-induced vegetation changes must be considered dynamically, via detailed modeling initiatives. Acknowledgements: This material is based upon work supported by the National Science Foundation under Grant No. CMMI-1206271 and by the National Sea Grant College Program of the U.S. Department of Commerce's National Oceanic and Atmospheric Administration under Grant No. NA10OAR4170099. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of these organizations. The STOKES ARCC at the University of Central Florida provided computational resources for storm surge simulations.
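
    The joint probability method referred to above estimates the annual exceedance rate of a flood level by integrating the conditional exceedance probability of the surge response over the joint probability of the storm parameters. The following is a minimal discrete-sum sketch of that idea, not the JPM-OS implementation cited in the abstract; the parameter grid, occurrence rates and response function are hypothetical stand-ins.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical storm-parameter grid: central pressure deficit dp (mb) and storm size R (km)
dp_vals = np.linspace(20, 100, 9)
R_vals = np.linspace(20, 60, 5)

# Hypothetical annual occurrence rate assigned to each (dp, R) combination
rate = np.full((dp_vals.size, R_vals.size), 0.002)

def surge(dp, R):
    """Hypothetical surge response function eta(dp, R) in metres (stand-in for SRFs)."""
    return 0.03 * dp + 0.01 * R

def exceedance_rate(eta0, sigma_eps=0.3):
    """Annual rate of peak surge exceeding eta0.

    Discrete JPM sum: lambda(eta0) = sum_i rate_i * P[eta(x_i) + eps > eta0],
    with eps ~ N(0, sigma_eps) representing added uncertainty in the integral.
    """
    lam = 0.0
    for i, d in enumerate(dp_vals):
        for j, r in enumerate(R_vals):
            p_exceed = 1.0 - norm.cdf(eta0, loc=surge(d, r), scale=sigma_eps)
            lam += rate[i, j] * p_exceed
    return lam

print(f"annual exceedance rate of a 3.0 m flood level: {exceedance_rate(3.0):.4f}")
```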

  14. A Survey of Probabilistic Methods for Dynamical Systems with Uncertain Parameters.

    DTIC Science & Technology

    1986-05-01

    J., "An Approach to the Theoretical Background of Statistical Energy Analysis Applied to Structural Vibration," Journ. Acoust. Soc. Amer., Vol. 69...1973, Sect. 8.3. 80. Lyon, R.H., " Statistical Energy Analysis of Dynamical Systems," M.I.T. Press, 1975. e) Late References added in Proofreading !! 81...Dowell, E.H., and Kubota, Y., "Asymptotic Modal Analysis and ’~ y C-" -165- Statistical Energy Analysis of Dynamical Systems," Journ. Appi. - Mech

  15. Validation of Reference Genes for Gene Expression Studies in Virus-Infected Nicotiana benthamiana Using Quantitative Real-Time PCR

    PubMed Central

    Han, Chenggui; Yu, Jialin; Li, Dawei; Zhang, Yongliang

    2012-01-01

    Nicotiana benthamiana is the most widely-used experimental host in plant virology. The recent release of the draft genome sequence for N. benthamiana consolidates its role as a model for plant–pathogen interactions. Quantitative real-time PCR (qPCR) is commonly employed for quantitative gene expression analysis. For valid qPCR analysis, accurate normalisation of gene expression against an appropriate internal control is required. Yet there has been little systematic investigation of reference gene stability in N. benthamiana under conditions of viral infections. In this study, the expression profiles of 16 commonly used housekeeping genes (GAPDH, 18S, EF1α, SAMD, L23, UK, PP2A, APR, UBI3, SAND, ACT, TUB, GBP, F-BOX, PPR and TIP41) were determined in N. benthamiana and those with acceptable expression levels were further selected for transcript stability analysis by qPCR of complementary DNA prepared from N. benthamiana leaf tissue infected with one of five RNA plant viruses (Tobacco necrosis virus A, Beet black scorch virus, Beet necrotic yellow vein virus, Barley stripe mosaic virus and Potato virus X). Gene stability was analysed in parallel by three commonly-used dedicated algorithms: geNorm, NormFinder and BestKeeper. Statistical analysis revealed that the PP2A, F-BOX and L23 genes were the most stable overall, and that the combination of these three genes was sufficient for accurate normalisation. In addition, the suitability of PP2A, F-BOX and L23 as reference genes was illustrated by expression-level analysis of AGO2 and RdR6 in virus-infected N. benthamiana leaves. This is the first study to systematically examine and evaluate the stability of different reference genes in N. benthamiana. Our results not only provide researchers studying these viruses a shortlist of potential housekeeping genes to use as normalisers for qPCR experiments, but should also guide the selection of appropriate reference genes for gene expression studies of N. benthamiana under other biotic and abiotic stress conditions. PMID:23029521
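
    geNorm, one of the algorithms named above, ranks candidate reference genes by an average pairwise-variation measure M: for each gene, the standard deviation of the log2 expression ratio against every other candidate is averaged, and lower M indicates higher stability. The sketch below is a minimal re-implementation of that calculation on hypothetical expression values, not the published geNorm software.

```python
import numpy as np

def genorm_m(expr):
    """Gene-stability measure M for each gene.

    expr: array of shape (n_samples, n_genes) of relative expression values.
    M_j = mean over k != j of the stdev across samples of log2(expr[:, j] / expr[:, k]).
    Lower M indicates a more stable reference gene.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return m

# Hypothetical relative expression of three candidate genes across four samples
expr = np.array([[1.0, 2.1, 0.9],
                 [1.2, 2.0, 1.1],
                 [0.9, 2.3, 1.0],
                 [1.1, 1.9, 0.8]])
print(genorm_m(expr))
```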

  16. "Hyperstat": an educational and working tool in epidemiology.

    PubMed

    Nicolosi, A

    1995-01-01

    The work of a researcher in epidemiology is based on studying the literature, planning studies, gathering data, analyzing data and writing up results. The researcher therefore needs to perform more or less simple calculations, to consult or quote the literature, to consult textbooks about particular issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of a researcher's work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on the Apple Macintosh using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated table of contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". To perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can move from a procedure to other cards, following the preferred order of succession and according to built-in associations, and can access different levels of knowledge or information from any stack being consulted or operated. From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with an automated table of contents, to statistical tables with an automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills; it reflects the Macintosh philosophy of using windows, buttons and the mouse. This allows the user to perform complicated calculations, weigh alternatives and run simulations without losing the "feel" of the data. This program shares many features with hypertexts: it has an underlying network database whose nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes are visible as "active" text or icons in the windows; and the text is read by following links and opening new windows. The program is especially useful as an educational tool directed at medical and epidemiology students. The combination of computing capabilities with a textbook and with databases of formulas and literature references makes the program versatile and attractive as a learning tool. The program is also helpful for work done at the desk, where the researcher examines results, consults the literature, explores different analytic approaches, plans new studies, or writes grant proposals and scientific articles.

  17. [Nickel levels in female dermatological patients].

    PubMed

    Schwegler, U; Twardella, D; Fedorov, M; Darsow, U; Schaller, K-H; Habernegg, R; Behrendt, H; Fromme, H

    2009-07-01

    Nickel levels in urine were determined in 163 female dermatological patients aged 18 to 46 years. Data on lifestyle factors were collected in parallel via a questionnaire. Urinary nickel excretion was in the normal range of the German female population (0.2-46.1 microg Ni/g creatinine). The 95th percentile (3.9 microg Ni/l urine) exceeded the German reference value (3.0 microg Ni/l urine). In the multivariate regression analyses we found a statistically significant increase of ln-transformed nickel levels with increasing age and in women using dietary supplements. The following variables were not associated with urinary nickel levels: suffering from nickel eczema, smoking, drinking stagnant water, eating foods with a high nickel content, and using nickel-containing kitchen utensils such as an electric kettle with an open heating coil. We conclude that personal urinary levels should be assessed with simultaneous consideration of habits and lifestyle factors. A German national survey would be useful. Patients who experience an exacerbation of their eczema after oral provocation, for example by a high-nickel diet, should be aware of potential sources of nickel, such as supplements.
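
    The multivariate analysis described above regresses ln-transformed urinary nickel on age and supplement use. The sketch below shows one way such a model could be set up with statsmodels; the data, effect sizes and column names are entirely hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: urinary nickel (microg/l), age (years), supplement use (0/1)
rng = np.random.default_rng(0)
n = 150
df = pd.DataFrame({
    "age": rng.uniform(18, 46, n),
    "supplements": rng.integers(0, 2, n),
})
df["nickel"] = np.exp(0.01 * df["age"] + 0.2 * df["supplements"]
                      + rng.normal(0, 0.5, n))

# Linear regression on the ln-transformed outcome
model = smf.ols("np.log(nickel) ~ age + supplements", data=df).fit()
print(model.params)
print(model.pvalues)
```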

  18. Multicenter field trial on possible health effects of toluene. II. Cross-sectional evaluation of acute low-level exposure.

    PubMed

    Neubert, D; Gericke, C; Hanke, B; Beckmann, G; Baltes, M M; Kühl, K P; Bochert, G; Hartmann, J

    2001-11-15

    Data on possible acute effects of today's relevant low-level exposure to toluene are contradictory, and information on possible effects of exposure under occupational conditions is largely lacking. In a controlled, multi-center, blinded field trial, effects possibly associated with acute toluene exposure were evaluated in workers of 12 German rotogravure factories. Medical examinations (inquiries on subjective symptoms, and standard tests of psycho-physiological and psycho-motor functions) were performed on almost 1500 volunteers, of whom 1290 were toluene-exposed (1178 men and 112 women) and about 200 served as references (157 men and 37 women); the main aim of the trial, however, was to reveal dose-response relationships. All volunteers were from the morning work-shift (6 h exposure). Both individual ambient air concentrations (time-weighted averages) during the work-shift and blood toluene concentrations after the work-shift were measured. Therefore, the medical data could for the first time be correlated with the actual individual body burden (blood toluene level) at the time of testing. In order to largely exclude confounding by chronic toluene exposure, kinetic measurements as well as the psycho-physiological and psycho-motor tests were performed before and after the work-shift. Except for minor statistical deviations, no convincing dose-dependent acute effects could be demonstrated with regression analyses in male volunteers at the exposure levels evaluated, nor were significant differences found when applying group statistics (highly toluene-exposed group versus volunteers with negligible exposure). Due to the rather large number of participants, the predictive power of the study is high, especially when compared with previous publications. In two psycho-physiological tests, a few more female volunteers with quite low toluene body burdens (<340 microg/l blood) showed relatively low scores compared with participants of the reference group. Although evidence for medical relevance is meager, the small numbers of participants in both the exposure and the reference groups hamper a reliable interpretation of the results concerning exposure levels above 85 microg toluene/l blood, and it is difficult to take confounding factors adequately into account. For the end points evaluated and under occupational conditions, neither blood toluene levels of 850 to 1700 microg/l (in the highest exposure group [EXPO-IV] with 56 participants), as measured 1/2 (+/-1/2) h after the work-shift, nor ambient air concentrations (time-weighted average over 6 h) between 50 and 100 ppm (188-375 mg/m(3)) were convincingly associated with alterations in psycho-physiological and psycho-motor performance or with an increased frequency of subjective complaints in male volunteers. For higher dose ranges of toluene exposure (i.e., >1700 microg toluene/l blood [or >100 ppm in ambient air]), our data set is too small for far-reaching conclusions. Our data are insufficient for conclusions on a possibly higher susceptibility of some female workers to toluene. Results of kinetic studies and possible effects of long-term exposure are discussed in two accompanying publications (Neubert et al., 2001; Gericke et al., 2001).

  19. Dental age assessment of southern Chinese using the United Kingdom Caucasian reference dataset.

    PubMed

    Jayaraman, Jayakumar; Roberts, Graham J; King, Nigel M; Wong, Hai Ming

    2012-03-10

    Dental age assessment is one of the most accurate methods for estimating the age of an unknown person. Demirjian's dataset on a French-Canadian population has been widely tested for its applicability to various ethnic groups, including southern Chinese. Following inaccurate results from these studies, investigators are now confronted with using alternative datasets for comparison. Testing the applicability of other reliable datasets that yield accurate findings might limit the need to develop population-specific standards. Recently, a Reference Data Set (RDS) similar to Demirjian's was prepared in the United Kingdom (UK) and has subsequently been validated. The advantages of the UK Caucasian RDS include coverage of both the maxillary and mandibular dentitions, the involvement of a wide age range of subjects, and the possibility of precise age estimation with the mathematical technique of meta-analysis. The aim of this study was to evaluate the applicability of the United Kingdom Caucasian RDS to southern Chinese subjects. Dental panoramic tomographs (DPTs) of 266 subjects (133 males and 133 females) aged 2-21 years, previously taken for clinical diagnostic purposes, were selected and scored by a single calibrated examiner based on Demirjian's classification of tooth developmental stages (A-H). The ages corresponding to each tooth developmental stage were obtained from the UK dataset. Intra-examiner reproducibility was tested, and the Cohen kappa (0.88) showed that the level of agreement was 'almost perfect'. The estimated dental age was then compared with the chronological age using a paired t-test, with statistical significance set at p<0.01. The results showed that the UK dataset underestimated the age of southern Chinese subjects by 0.24 years, but the results were not statistically significant. In conclusion, the UK Caucasian RDS may not be suitable for estimating the age of southern Chinese subjects, and there is a need for an ethnic-specific reference dataset for southern Chinese. Copyright © 2011. Published by Elsevier Ireland Ltd.
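
    The core comparison described above, estimated dental age versus chronological age evaluated with a paired t-test, can be illustrated as follows; the ages shown are hypothetical stand-ins for the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical chronological and estimated dental ages (years) for ten subjects
chronological = np.array([5.2, 7.8, 9.1, 11.4, 12.0, 13.6, 15.2, 16.8, 18.1, 20.3])
estimated     = np.array([5.0, 7.5, 8.8, 11.1, 11.9, 13.2, 15.0, 16.5, 17.8, 20.0])

# Mean under- or over-estimation and paired t-test (the study used p < 0.01)
diff = estimated - chronological
t_stat, p_value = stats.ttest_rel(estimated, chronological)
print(f"mean difference = {diff.mean():.2f} years, p = {p_value:.3f}")
```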

  20. Statistical analysis of regulatory ecotoxicity tests.

    PubMed

    Isnard, P; Flammarion, P; Roman, G; Babut, M; Bastien, P; Bintein, S; Esserméant, L; Férard, J F; Gallotti-Schmitt, S; Saouter, E; Saroli, M; Thiébaud, H; Tomassone, R; Vindimian, E

    2001-11-01

    ANOVA-type data analysis, i.e., determination of lowest-observed-effect concentrations (LOECs) and no-observed-effect concentrations (NOECs), has been widely used for statistical analysis of chronic ecotoxicity data. However, it is increasingly criticised for several reasons, the most important probably being that the NOEC depends on the choice of test concentrations and the number of replicates, and rewards poor experiments, i.e., those with high variability, with high NOEC values. Thus, a recent OECD workshop concluded that the use of the NOEC should be phased out and that a regression-based estimation procedure should be used. Following this workshop, a working group was established at the French level between government, academia and industry representatives. Twenty-seven sets of chronic data (algae, daphnia, fish) were collected and analysed by ANOVA and regression procedures. Several regression models were compared, and relations between NOECs and ECx, for different values of x, were established in order to find an alternative summary parameter to the NOEC. Biological arguments for defining a negligible level of effect x for the ECx are scarce. With regard to their use in risk assessment procedures, a convenient methodology would be to choose x so that ECx values are on average similar to the present NOEC. This would entail no major change in the risk assessment procedure. However, experimental data show that ECx values depend on the regression model and that their accuracy decreases in the low-effect zone. This disadvantage could probably be reduced by adapting existing experimental protocols, but it could mean more experimental effort and higher cost. ECx values (derived with existing test guidelines, e.g., regarding the number of replicates) whose lower confidence bounds are on average similar to the present NOEC would improve this approach by a priori encouraging more precise experiments. However, narrow confidence intervals are not only linked to good experimental practice but also depend on the distance between the best model fit and the experimental data. Nevertheless, these approaches still use the NOEC as a reference, although that reference is statistically unsound. By contrast, EC50 values are the most precise to estimate on a concentration-response curve, but they are clearly different from the NOEC, and their use would require a modification of the existing assessment factors.
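
    The regression-based alternative to the NOEC discussed above fits a concentration-response model and reads ECx values off the fitted curve. The sketch below fits a three-parameter log-logistic model, one common choice, with scipy and inverts it for an arbitrary x; the concentration-response data are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical concentration-response data (e.g., algal growth relative to control)
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
response = np.array([0.98, 0.95, 0.85, 0.55, 0.20, 0.05])  # fraction of control

def log_logistic(c, top, ec50, slope):
    """Three-parameter log-logistic model, response declining from `top` to 0."""
    return top / (1.0 + (c / ec50) ** slope)

params, _ = curve_fit(log_logistic, conc, response, p0=[1.0, 3.0, 1.0])
top, ec50, slope = params

def ecx(x):
    """Concentration giving x% effect relative to the fitted top level."""
    frac = 1.0 - x / 100.0            # remaining response fraction
    return ec50 * (1.0 / frac - 1.0) ** (1.0 / slope)

print(f"EC10 = {ecx(10):.2f}, EC50 = {ecx(50):.2f}")
```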
