A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
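A minimal sketch of the core Bland-Altman computation on paired assay values (illustrative numbers and variable names; a full analysis would also regress the differences on the means to expose proportional error, and would use a Deming errors-in-variables fit for the method-comparison regression):

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman statistics for paired measurements x (reference) and y (test)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = y - x                      # per-sample disagreement
    bias = diff.mean()                # constant (systematic) error
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
    return bias, loa

# Example: variant allele fractions from two NGS assays (illustrative numbers).
ref = [0.12, 0.25, 0.33, 0.48, 0.51]
new = [0.14, 0.24, 0.36, 0.50, 0.55]
bias, loa = bland_altman(ref, new)
print(f"bias={bias:.3f}, 95% LoA=({loa[0]:.3f}, {loa[1]:.3f})")
```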
Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.
Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C
2015-02-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.
ERIC Educational Resources Information Center
Huang, Liuli
2018-01-01
Research frequently uses the quantitative approach to explore undergraduate students' anxiety regarding statistics. However, few studies of adults' statistics anxiety use qualitative methods, or they focus solely on graduate students. Moreover, even fewer studies focus on a comparison of adults' anxiety levels before and after an introductory…
A Backscatter-Lidar Forward-Operator
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland
2015-04-01
We have developed a forward operator capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations in terms of the same measured parameter: the lidar backscatter profile. This simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented in an aerosol-capable model system, the operator will serve as a component for assimilating backscatter-lidar measurements. Since many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we performed sensitivity studies on several scattering parameters, such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e. applying the backscatter-lidar forward operator to model output.
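The core of such a forward operator can be sketched in a few lines under strong simplifying assumptions (elastic single scattering, a prescribed lidar ratio linking extinction to backscatter; all profiles and constants below are illustrative, not the operator described above):

```python
import numpy as np

def attenuated_backscatter(z, beta, alpha):
    """Virtual lidar profile from model profiles:
    beta_att(z) = beta(z) * exp(-2 * integral_0^z alpha(z') dz')."""
    tau = np.concatenate(([0.0], np.cumsum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(z))))
    return beta * np.exp(-2.0 * tau)

# Illustrative profiles: exponential background plus an aerosol layer at 1-2 km.
z = np.linspace(0.0, 5000.0, 501)            # height grid [m]
beta = 1e-6 * np.exp(-z / 8000.0)            # backscatter coefficient [1/(m sr)]
alpha = 50.0 * beta                          # extinction via an assumed lidar ratio
layer = (z > 1000.0) & (z < 2000.0)
beta = beta + 4e-6 * layer
alpha = alpha + 50.0 * 4e-6 * layer
print(attenuated_backscatter(z, beta, alpha)[:3])
```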
Portillo, M C; Gonzalez, J M
2008-08-01
Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed that simulates coverage of the analyzed communities as a function of sampling size, applying a Cramér-von Mises statistic; comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
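The abstract does not give the exact construction, but a Monte Carlo test of this general shape, a Cramér-von Mises-type distance between relative-abundance profiles with resampling under the null, can be sketched as follows (band counts are invented):

```python
import numpy as np
rng = np.random.default_rng(0)

def cvm_stat(p, q):
    """Cramer-von Mises-type distance between two abundance profiles:
    summed squared differences of their cumulative distributions."""
    P, Q = np.cumsum(p / p.sum()), np.cumsum(q / q.sum())
    return np.sum((P - Q) ** 2)

def monte_carlo_test(counts_a, counts_b, n_sim=9999):
    """Pool the two profiles and redraw profiles of the same depth under
    the null of no difference; p-value is the exceedance fraction."""
    obs = cvm_stat(counts_a, counts_b)
    pooled = counts_a + counts_b
    p = pooled / pooled.sum()
    hits = 0
    for _ in range(n_sim):
        a = rng.multinomial(counts_a.sum(), p)
        b = rng.multinomial(counts_b.sum(), p)
        hits += cvm_stat(a, b) >= obs
    return obs, (hits + 1) / (n_sim + 1)

a = np.array([30, 12, 8, 5, 0, 2])   # illustrative phylotype band intensities
b = np.array([18, 15, 9, 1, 6, 4])
print(monte_carlo_test(a, b))
```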
Statistical design of quantitative mass spectrometry-based proteomic experiments.
Oberg, Ann L; Vitek, Olga
2009-05-01
We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of disease profiling.
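As a concrete illustration of blocking and replicate planning (not code from the review; column names and effect sizes are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.power import TTestIndPower

# Randomized block design: 'group' is the biological contrast of interest,
# 'run' is a blocking factor absorbing day-to-day instrument drift.
rng = np.random.default_rng(1)
d = pd.DataFrame({
    "group": np.tile(["control", "disease"], 12),
    "run":   np.repeat([f"run{i}" for i in range(4)], 6),
})
d["abundance"] = (10 + 0.8 * (d["group"] == "disease")
                  + d["run"].map({f"run{i}": b for i, b in enumerate([0, .5, -.3, .2])})
                  + rng.normal(0, 0.5, len(d)))

print(anova_lm(smf.ols("abundance ~ group + run", data=d).fit()))

# Replicates for a future study: samples per group to detect a one-SD shift.
print(TTestIndPower().solve_power(effect_size=1.0, alpha=0.05, power=0.8))
```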
ERIC Educational Resources Information Center
Jamison, Joseph A.
2013-01-01
This quantitative study sought to determine whether there were statistically significant differences between the performance scores of special education and general education students in team-taught or solo-taught environments, as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology for measuring protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical methods for quantitative comparison of multiple ChIP-seq datasets that account for control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs remain underdeveloped. In this work, we develop a statistical method to perform quantitative comparisons of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks in all datasets and then merge them into a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution, with the underlying Poisson rates modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results than existing ones. An R software package, ChIPComp, is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html.
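ChIPComp itself is an R package and its model is richer than this, but the central idea, a Poisson log-linear model per candidate region with the background entering as an offset and a test on the condition coefficient, can be sketched as follows (illustrative counts):

```python
import numpy as np
import statsmodels.api as sm

# One candidate region: IP read counts from two conditions with replicates.
counts = np.array([85, 92, 150, 160])            # reads in the region
condition = np.array([0, 0, 1, 1])               # 0 = condition A, 1 = condition B
background = np.array([40.0, 45.0, 42.0, 44.0])  # control/input expectation

X = sm.add_constant(condition.astype(float))
# log E[count] = offset + b0 + b1*condition; b1 is the log fold change in
# biological signal after accounting for background via the offset.
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(background)).fit()
print(fit.params[1], fit.pvalues[1])             # log fold change, Wald p-value
```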
Clinical applications of a quantitative analysis of regional left ventricular wall motion
NASA Technical Reports Server (NTRS)
Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.
1975-01-01
Observations that may have clinical application are summarized. These were obtained from a quantitative analysis of wall motion used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients found not to have heart disease.
ERIC Educational Resources Information Center
Barrows, Russell D.
2007-01-01
A one-way ANOVA experiment is performed to determine whether the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
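A sketch of the corresponding analysis (concentrations are invented; the three groups stand for three common standardization approaches such as external standard, internal standard, and standard addition):

```python
from scipy import stats

# Concentration of one paraffin analyte as determined by three methods.
external = [5.1, 5.3, 4.9, 5.2]
internal = [5.0, 5.4, 5.1, 5.3]
addition = [5.6, 5.5, 5.8, 5.4]

f, p = stats.f_oneway(external, internal, addition)
print(f"F = {f:.2f}, p = {p:.4f}")  # small p -> at least one method differs
```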
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, T.; Ungers, L.; Briggs, T.
1980-08-01
The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.
OdorMapComparer: an application for quantitative analyses and comparisons of fMRI brain odor maps.
Liu, Nian; Xu, Fuqiang; Miller, Perry L; Shepherd, Gordon M
2007-01-01
Brain odor maps are reconstructed flat images that describe the spatial activity patterns in the glomerular layer of the olfactory bulbs in animals exposed to different odor stimuli. We have developed a software application, OdorMapComparer, to carry out quantitative analyses and comparisons of fMRI odor maps. This application is an open-source Windows program that first loads the two odor map images being compared. It allows image transformations including scaling, flipping, rotating, and warping so that the two images can be appropriately aligned to each other. It performs simple subtraction, addition, and averaging of the signals in the two images. It also provides comparative statistics, including the normalized correlation (NC) and spatial correlation coefficient. Experimental studies showed that the rodent fMRI odor maps for aliphatic aldehydes displayed spatial activity patterns that are similar in gross outline but somewhat different in specific subregions. Analyses with OdorMapComparer indicate that the similarity between odor maps decreases with increasing difference in carbon-chain length. For example, the map of butanal is more closely related to that of pentanal (NC = 0.617) than to that of octanal (NC = 0.082), which is consistent with animal behavioral studies. The study also indicates that fMRI odor maps are statistically odor-specific and repeatable across both intra- and intersubject trials. OdorMapComparer thus provides a tool for quantitative, statistical analyses and comparisons of fMRI odor maps in a fashion that is integrated with the overall odor-mapping techniques.
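One common definition of the normalized correlation, mean-centered over the shared valid pixels, can be sketched as follows (synthetic maps for illustration; the application's exact definition may differ):

```python
import numpy as np

def normalized_correlation(a, b):
    """Normalized correlation of two aligned odor maps; NaNs outside the
    bulb outline are ignored. 1.0 means identical spatial patterns."""
    m = ~(np.isnan(a) | np.isnan(b))
    a, b = a[m] - a[m].mean(), b[m] - b[m].mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2)))

rng = np.random.default_rng(2)
base = rng.normal(size=(64, 64))
similar = base + 0.5 * rng.normal(size=(64, 64))   # e.g. butanal vs pentanal
different = rng.normal(size=(64, 64))              # e.g. butanal vs octanal
print(normalized_correlation(base, similar),
      normalized_correlation(base, different))
```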
Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P
2015-02-01
Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies.
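Two of the metrics named here, percent bias against known phantom truth and the repeatability coefficient RC = 2.77 times the within-case SD, reduce to a few lines (volumes are invented):

```python
import numpy as np

def bias_and_rc(measurements, truth):
    """measurements: n_nodules x n_replicates volumes from one algorithm;
    truth: known phantom volumes. Returns percent bias and the repeatability
    coefficient RC = 2.77 * within-nodule SD, i.e. the difference expected
    between two repeat measurements in 95% of cases."""
    m = np.asarray(measurements, float)
    bias = 100.0 * np.mean((m.mean(axis=1) - truth) / truth)
    wvar = m.var(axis=1, ddof=1).mean()        # pooled within-nodule variance
    rc = 2.77 * np.sqrt(wvar)
    return bias, rc

truth = np.array([500.0, 1000.0, 2000.0])      # known phantom volumes, mm^3
meas = np.array([[510, 525, 498],
                 [980, 1015, 1002],
                 [2105, 2040, 2080]], float)   # three repeats per nodule
print(bias_and_rc(meas, truth))
```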
EvolQG - An R package for evolutionary quantitative genetics
Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel
2016-01-01
We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352
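EvolQG is an R package; as a flavor of the matrix comparisons it provides, here is a Python sketch of the random-skewers idea (comparing the responses two covariance matrices predict for the same random selection gradients; the matrices below are illustrative):

```python
import numpy as np

def random_skewers(G1, G2, n=10000, rng=None):
    """Random skewers: draw random unit selection gradients beta, compute
    response vectors dz = G beta for each matrix, and average the vector
    correlation between responses; 1.0 means identical orientation."""
    rng = rng or np.random.default_rng(0)
    beta = rng.normal(size=(n, G1.shape[0]))
    beta /= np.linalg.norm(beta, axis=1, keepdims=True)
    r1, r2 = beta @ G1.T, beta @ G2.T
    cos = np.sum(r1 * r2, axis=1) / (np.linalg.norm(r1, axis=1)
                                     * np.linalg.norm(r2, axis=1))
    return cos.mean()

G1 = np.array([[1.0, 0.6], [0.6, 1.0]])
G2 = np.array([[1.0, 0.5], [0.5, 1.2]])
print(random_skewers(G1, G2))
```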
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-02-01
To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
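The paired comparison described here amounts to the following (illustrative γ-oryzanol values, paired by oil sample):

```python
from scipy import stats

# gamma-oryzanol content (illustrative, mg/g) of the same samples measured
# by TLC-densitometry and TLC-image analysis.
densitometric = [2.81, 3.02, 2.95, 3.10, 2.88]
image_analysis = [2.79, 3.05, 2.92, 3.12, 2.90]

t, p = stats.ttest_rel(densitometric, image_analysis)
print(f"t = {t:.3f}, p = {p:.3f}")  # large p -> no significant method difference
```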
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby
2016-06-01
This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through 'doing' quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a 'magic bullet' and that a wider programme of content and assessment diversification across the curriculum is preferential.
[A comparison of convenience sampling and purposive sampling].
Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien
2014-06-01
Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences to infer their structural, functional and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use the information in the k-word distributions, a Markov model, or both. Motivated by adding k-word distributions to the Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences, and phylogenetic analysis, offering a systematic and quantitative experimental assessment. Moreover, we compared our results with those of alignment-based and alignment-free methods. We grouped our experiments into two sets. The first, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second aims at assessing how well our statistical measures serve phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into a Markov model, are more efficient.
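The proposed wre.k.r and S2.k.r measures are not reproduced here, but the ingredient they build on, a k-word (k-mer) frequency distribution compared between sequences, can be sketched with toy data:

```python
import numpy as np
from itertools import product

def kmer_freqs(seq, k):
    """Frequency vector over all 4^k DNA k-words (the k-word distribution)."""
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    idx = {w: i for i, w in enumerate(words)}
    v = np.zeros(len(words))
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        if w in idx:
            v[idx[w]] += 1
    return v / max(v.sum(), 1)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = "ACGTACGTGGGCCCATATAT" * 5
s2 = "ACGTACGTGGGCCCATTTAT" * 5
s3 = "GGGGGGCCCCCCAAAAAATT" * 5
print(cosine(kmer_freqs(s1, 3), kmer_freqs(s2, 3)))   # related: high similarity
print(cosine(kmer_freqs(s1, 3), kmer_freqs(s3, 3)))   # unrelated: lower
```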
Using Alien Coins to Test Whether Simple Inference Is Bayesian
ERIC Educational Resources Information Center
Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.
2016-01-01
Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten
2017-01-01
To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
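The paper's exact parameterization is not reproduced in the abstract; the following is only an illustrative score of the stated general form, a priority-weighted sum of normalized (regularized) incomplete gamma functions, with hypothetical shape and scale choices:

```python
import numpy as np
from scipy.special import gammainc   # regularized lower incomplete gamma

def gem(metrics, limits, priorities, shape=4.0):
    """Illustrative GEM-style score: each DVH metric is mapped through a
    normalized incomplete gamma function that rises as the metric approaches
    or exceeds its clinical limit, then combined as a priority-weighted sum.
    Shape/scale choices here are hypothetical, not the published ones."""
    m, L = np.asarray(metrics, float), np.asarray(limits, float)
    w = np.asarray(priorities, float)
    scores = gammainc(shape, shape * m / L)   # ~0 well below limit, ~1 above
    return float(np.sum(w * scores) / np.sum(w))

# e.g. mean parotid dose, cord max, mandible max vs their limits (Gy)
print(gem(metrics=[22.0, 38.0, 60.0], limits=[26.0, 45.0, 70.0],
          priorities=[2.0, 3.0, 1.0]))
```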
Numerical and Qualitative Contrasts of Two Statistical Models ...
Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and products. This study provided an empirical and qualitative comparison of both models using 29 years of data for two discrete time series of chlorophyll-a (chl-a) in the Patuxent River estuary. Empirical descriptions of each model were based on predictive performance against the observed data, ability to reproduce flow-normalized trends with simulated data, and comparisons of performance with validation datasets. Between-model differences were apparent but minor and both models had comparable abilities to remove flow effects from simulated time series. Both models similarly predicted observations for missing data with different characteristics. Trends from each model revealed distinct mainstem influences of the Chesapeake Bay with both models predicting a roughly 65% increase in chl-a over time in the lower estuary, whereas flow-normalized predictions for the upper estuary showed a more dynamic pattern, with a nearly 100% increase in chl-a in the last 10 years. Qualitative comparisons highlighted important differences in the statistical structure, available products, and characteristics of the data and desired analysis. This manuscript describes a quantitative comparison of two recently-
Probability of Detection (POD) as a statistical model for the validation of qualitative methods.
Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T
2011-01-01
A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
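A common way to realize a POD curve, hedged here as one plausible implementation rather than the model specified in the paper, is a logistic fit of detect/non-detect outcomes against log concentration (simulated validation data):

```python
import numpy as np
import statsmodels.api as sm

# Qualitative-method validation data: spiked concentration (e.g. CFU/g) and
# detected (1) / not detected (0) outcomes across replicates.
conc = np.repeat([0.5, 1.0, 2.0, 5.0, 10.0], 12)
rng = np.random.default_rng(4)
true_pod = 1 / (1 + np.exp(-3 * (np.log(conc) - np.log(2.0))))
detected = rng.binomial(1, true_pod)

X = sm.add_constant(np.log(conc))
fit = sm.Logit(detected, X).fit(disp=0)
b0, b1 = fit.params
lod50 = np.exp(-b0 / b1)     # concentration with POD = 0.5
print(f"POD(c) = 1/(1+exp(-({b0:.2f} + {b1:.2f} ln c))); LOD50 ~ {lod50:.2f}")
```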
Of pacemakers and statistics: the actuarial method extended.
Dussel, J; Wolbarst, A B; Scott-Millar, R N; Obel, I W
1980-01-01
Pacemakers cease functioning because of either natural battery exhaustion (nbe) or component failure (cf). A study of four series of pacemakers shows that a simple extension of the actuarial method, so as to incorporate Normal statistics, makes possible a quantitative differentiation between the two modes of failure. This involves the separation of the overall failure probability density function PDF(t) into constituent parts pdf_nbe(t) and pdf_cf(t). The approach should allow a meaningful comparison of the characteristics of different pacemaker types.
Quantitative impact of pediatric sinus surgery on facial growth.
Senior, B; Wirtschafter, A; Mai, C; Becker, C; Belenky, W
2000-11-01
To quantitatively evaluate the long-term impact of sinus surgery on paranasal sinus development in the pediatric patient. Longitudinal review of eight pediatric patients treated with unilateral sinus surgery for periorbital or orbital cellulitis with an average follow-up of 6.9 years. Control subjects consisted of two groups, 9 normal adult patients with no computed tomographic evidence of sinusitis and 10 adult patients with scans consistent with sinusitis and a history of sinus-related symptoms extending to childhood. Application of computed tomography (CT) volumetrics, a technique allowing for precise calculation of volumes using thinly cut CT images, to the study and control groups. Paired Student t test analyses of side-to-side volume comparisons in the normal patients, patients with sinusitis, and patients who had surgery revealed no statistically significant differences. Comparisons between the orbital volumes of patients who did and did not have surgery revealed a statistically significant increase in orbital volume in patients who had surgery. Only minimal changes in facial volume measurements have been found, confirming clinical impressions that sinus surgery in children is safe and without significant cosmetic sequelae.
Computerized EEG analysis for studying the effect of drugs on the central nervous system.
Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A
1977-11-01
Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications that are evident also on visual analysis. Power spectral analysis is suitable for identifying and quantifying EEG effects not evident on visual inspection. It demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by statistically comparing (F test) the spectral values of the EEG from a single lead across the different samples (longitudinal comparison), or the spectral values from different leads at any one sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant sample of the population. They do not seem reliable for monitoring or directing neuropsychiatric therapies in single patients, owing to the individual variability of drug effects.
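Hjorth's descriptors mentioned above are simple time-domain statistics; a sketch on a synthetic EEG epoch:

```python
import numpy as np

def hjorth(x, fs=256.0):
    """Hjorth descriptors of an EEG epoch: activity (variance), mobility
    (RMS of the normalized first derivative) and complexity (mobility of
    the derivative relative to that of the signal)."""
    dx = np.diff(x) * fs
    ddx = np.diff(dx) * fs
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

fs = 256.0
t = np.arange(0, 4, 1 / fs)
epoch = (np.sin(2 * np.pi * 10 * t)                       # 10 Hz alpha-like wave
         + 0.3 * np.random.default_rng(5).normal(size=t.size))
print(hjorth(epoch, fs))
```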
The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses II
NASA Astrophysics Data System (ADS)
Kirkman, Thomas W.; Jensen, Ellen
2017-06-01
The Quantitative Reasoning for College Science (QuaRCS) Assessment [1] aims to measure the pre-algebra mathematical skills that are often part of "general education" science courses like Astro 101. In STEM classes from four majors, we report comparisons between QuaRCS metrics, ACT math, GPAO, and the course grade. In three of the four classes, the QuaRCS QR score and ACT math were statistically significantly correlated (with r ≈ .6); however, in the fourth course, a senior-level microbiology course, there was no statistically significant correlation (in fact, r < 0). In all courses, even those with seemingly little quantitative content, course grade was statistically significantly correlated with GPAO and QR. A QuaRCS metric aiming to report the students' belief in the importance of math in science was seen to grow with the course level. Pre/post QuaRCS testing in physics courses showed fractional-sigma gains in QR, self-estimated math fluency, and math importance, but not all of those increases were statistically significant. Using a QuaRCS map relating the questions to skill areas, we found graph reading, percentages, and proportional reasoning to be the most misunderstood skills in all four courses. [1] Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2
Environmental Health Practice: Statistically Based Performance Measurement
Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.
2007-01-01
Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
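A sketch of the kind of comparison reported, Fisher exact tests per performance category with a Bonferroni adjustment (all counts are invented):

```python
from scipy import stats

# Facilities in compliance vs not, at baseline and post-intervention,
# for each performance category: [[yes, no] baseline, [yes, no] post].
categories = {
    "occupational health": [[45, 37], [70, 12]],
    "air pollution":       [[50, 32], [68, 14]],
    "hazardous waste":     [[40, 42], [65, 17]],
    "wastewater":          [[55, 27], [66, 16]],
}
alpha, m = 0.05, len(categories)
for name, table in categories.items():
    _, p = stats.fisher_exact(table)
    print(f"{name}: p = {p:.4f}, "
          f"significant after Bonferroni: {p < alpha / m}")
```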
The resolving power of in vitro genotoxicity assays for cigarette smoke particulate matter.
Scott, K; Saul, J; Crooks, I; Camacho, O M; Dillon, D; Meredith, C
2013-06-01
In vitro genotoxicity assays are often used to compare tobacco smoke particulate matter (PM) from different cigarettes. The quantitative aspect of the comparisons requires appropriate statistical methods and replication levels to support the interpretation in terms of power and significance. This paper recommends a uniform statistical analysis for the Ames test, the mouse lymphoma mammalian cell mutation assay (MLA) and the in vitro micronucleus test (IVMNT), involving a hierarchical decision process with respect to slope, fixed-effect and single-dose comparisons. With these methods, replication levels of 5 (Ames test TA98), 4 (Ames test TA100), 10 (Ames test TA1537), 6 (MLA) and 4 (IVMNT) resolved a 30% difference in PM genotoxicity.
NASA Astrophysics Data System (ADS)
Ghezelbash, Reza; Maghsoudi, Abbas
2018-05-01
The delineation of populations of stream sediment geochemical data is a crucial task in regional exploration surveys. In this contribution, uni-element stream sediment geochemical data of Cu, Au, Mo, and Bi have been subjected to two reliable anomaly-background separation methods, namely, the concentration-area (C-A) fractal and the U-spatial statistics methods to separate geochemical anomalies related to porphyry-type Cu mineralization in northwest Iran. The quantitative comparison of the delineated geochemical populations using the modified success-rate curves revealed the superiority of the U-spatial statistics method over the fractal model. Moreover, geochemical maps of investigated elements revealed strongly positive correlations between strong anomalies and Oligocene-Miocene intrusions in the study area. Therefore, follow-up exploration programs should focus on these areas.
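The concentration-area relation at the heart of the C-A fractal method is easy to compute; the anomaly threshold is drawn where the log-log slope breaks (synthetic background plus anomaly; break detection is left to inspection here):

```python
import numpy as np

def concentration_area(values, cell_area=1.0, n_thresholds=30):
    """Concentration-area (C-A) relation: for each threshold c, A(>=c) is
    the total area of cells at or above c. Fractal populations plot as
    straight segments in log-log space; slope breaks separate background
    from anomaly."""
    v = np.asarray(values, float)
    cs = np.quantile(v, np.linspace(0.05, 0.99, n_thresholds))
    areas = np.array([(v >= c).sum() * cell_area for c in cs])
    return np.log10(cs), np.log10(areas)

rng = np.random.default_rng(6)
background = rng.lognormal(mean=3.0, sigma=0.4, size=5000)   # e.g. Cu, ppm
anomaly = rng.lognormal(mean=4.2, sigma=0.3, size=200)
logc, loga = concentration_area(np.concatenate([background, anomaly]))
slopes = np.diff(loga) / np.diff(logc)
print(slopes)   # a visible change in slope marks the anomaly threshold
```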
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results they produce, should be connected to the educational context. In this connecting process, issues of educational models are often raised. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are other methods that do consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and purely mathematical methods, based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory, and that the IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail.
Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
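Fitting IRT models requires dedicated software, but the CTT item statistics contrasted with them in the first part reduce to a few lines (simulated 0/1 responses):

```python
import numpy as np

def ctt_item_stats(responses):
    """Classical test theory item statistics from a 0/1 response matrix
    (rows = students, columns = items): difficulty = proportion correct,
    discrimination = corrected item-total point-biserial correlation."""
    R = np.asarray(responses, float)
    total = R.sum(axis=1)
    out = []
    for j in range(R.shape[1]):
        rest = total - R[:, j]                  # item-removed total score
        out.append((R[:, j].mean(), np.corrcoef(R[:, j], rest)[0, 1]))
    return out

rng = np.random.default_rng(7)
ability = rng.normal(size=200)
difficulty = np.array([-1.0, 0.0, 1.0])
p_correct = 1 / (1 + np.exp(-(ability[:, None] - difficulty)))
R = (rng.random((200, 3)) < p_correct).astype(int)
for j, (p, r) in enumerate(ctt_item_stats(R)):
    print(f"item {j}: difficulty={p:.2f}, discrimination={r:.2f}")
```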
NASA Astrophysics Data System (ADS)
Kistenev, Yu. V.; Kuzmin, D. A.; Sandykova, E. A.; Shapovalov, A. V.
2015-11-01
An approach to reducing the space of absorption spectra, based on an original criterion for profile analysis of the spectra, is proposed. The criterion derives from Pearson's well-known chi-square statistic and allows the differences between spectral curves to be quantified.
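The authors' exact criterion is not given in the abstract; a chi-square-type profile distance in the spirit described might look like this (synthetic spectra):

```python
import numpy as np

def chi2_profile_distance(s1, s2):
    """Pearson chi-square-type criterion between two absorption spectra
    normalized to unit area: sum of (s1 - s2)^2 / (s1 + s2)."""
    p, q = np.asarray(s1, float), np.asarray(s2, float)
    p, q = p / p.sum(), q / q.sum()
    m = (p + q) > 0
    return float(np.sum((p[m] - q[m]) ** 2 / (p[m] + q[m])))

x = np.linspace(0, 10, 200)
a = np.exp(-(x - 4) ** 2)       # reference absorption profile
b = np.exp(-(x - 4.2) ** 2)     # slightly shifted profile
c = np.exp(-(x - 7) ** 2)       # different absorber
print(chi2_profile_distance(a, b), chi2_profile_distance(a, c))
```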
[Clinical research IV. Relevancy of the statistical test chosen].
Talavera, Juan O; Rivas-Ruiz, Rodolfo
2011-01-01
When we look at the difference between two therapies or the association of a risk factor or prognostic indicator with its outcome, we need to evaluate the accuracy of the result. This assessment is based on a judgment that uses information about the study design and the statistical management of the information. This paper specifically addresses the relevance of the statistical test selected. Statistical tests are chosen mainly on two characteristics: the objective of the study and the type of variables. The objective can be divided into three test groups: a) those in which you want to show differences between groups, or within a group before and after a maneuver; b) those that seek to show the relationship (correlation) between variables; and c) those that aim to predict an outcome. The types of variables are divided in two: quantitative (continuous and discontinuous) and qualitative (ordinal and dichotomous). For example, if we seek to demonstrate differences in age (a quantitative variable) among patients with systemic lupus erythematosus (SLE) with and without neurological disease (two groups), the appropriate test is the Student t test for independent samples. But if the comparison is about the frequency of females (a binomial variable), then the appropriate statistical test is the χ² test.
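The two worked examples from the abstract, sketched with simulated data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Quantitative outcome, two groups: age in SLE with vs without neuro disease.
age_neuro = rng.normal(38, 10, 40)
age_no_neuro = rng.normal(42, 10, 60)
print(stats.ttest_ind(age_neuro, age_no_neuro))   # Student t test

# Dichotomous outcome, two groups: frequency of females -> chi-square test.
table = np.array([[28, 12],    # neuro disease: female, male
                  [35, 25]])   # no neuro disease: female, male
chi2, p, dof, _ = stats.chi2_contingency(table)
print(chi2, p)
```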
Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan
2016-04-01
To determine if radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured size of lung nodules and renal stones with MBIR was significantly different than those for the other two algorithms (P < .002 for all comparisons). Although lesion texture was significantly affected by the reconstruction algorithm used (average of 3.33 features affected by MBIR throughout lesion types; P < .002, for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.
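A linear mixed-effects model of the kind described, fixed effects for dose and reconstruction algorithm with a random effect per lesion, can be sketched as follows (simulated feature values; column names are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Each lesion measured under every dose x reconstruction combination; lesion
# is the random (repeated-measures) effect, dose/algorithm are fixed effects.
rng = np.random.default_rng(9)
lesions = np.repeat(np.arange(20), 6)
dose = np.tile(np.repeat(["routine", "reduced"], 3), 20)
algo = np.tile(["FBP", "ASIR", "MBIR"], 40)
lesion_effect = rng.normal(0, 2, 20)[lesions]
feature = (10 + lesion_effect + 0.3 * (dose == "reduced")
           + 0.8 * (algo == "MBIR") + rng.normal(0, 0.5, 120))
d = pd.DataFrame({"feature": feature, "dose": dose,
                  "algo": algo, "lesion": lesions})

fit = smf.mixedlm("feature ~ dose + algo", d, groups=d["lesion"]).fit()
print(fit.summary())
```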
Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin
NASA Astrophysics Data System (ADS)
Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.
2013-12-01
Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
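Mean field bias adjustment, mentioned above, is the simplest gauge-based correction: scale the whole radar field by the ratio of gauge to radar totals at the gauge sites (synthetic field and gauges for illustration):

```python
import numpy as np

def mean_field_bias(radar_at_gauges, gauge_accum):
    """Mean field bias: ratio of summed gauge accumulation to summed radar
    estimates at the gauge locations; applied multiplicatively to the field."""
    return np.sum(gauge_accum) / np.sum(radar_at_gauges)

radar_field = np.random.default_rng(10).gamma(2.0, 2.0, size=(100, 100))  # mm
gauge_ij = [(10, 12), (40, 55), (77, 30)]          # gauge grid locations
radar_at_gauges = np.array([radar_field[i, j] for i, j in gauge_ij])
gauge_accum = 1.15 * radar_at_gauges \
              + np.random.default_rng(11).normal(0, 0.3, 3)

bias = mean_field_bias(radar_at_gauges, gauge_accum)
corrected = bias * radar_field                     # bias-adjusted QPE field
print(bias)
```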
Design and analysis issues in quantitative proteomics studies.
Karp, Natasha A; Lilley, Kathryn S
2007-09-01
Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed here show how robust design and analysis lead to confident results, ensuring that quantitative proteomics delivers on its promise.
Multistrip western blotting to increase quantitative data output.
Kiyatkin, Anatoly; Aksamitiene, Edita
2009-01-01
The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.
Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme
2008-01-01
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
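A minimal numerical sketch of the neutral expectation D = [2Fst/(1 − Fst)]G follows: it compares a simple proportionality-coefficient estimate with its neutral value. The toy matrices and the least-squares estimator are assumptions for illustration; the paper's full test uses CPC analysis with a Bartlett adjustment.

```python
# Sketch: compare an empirical proportionality coefficient for D ~ rho * G
# with the neutral value 2*Fst/(1 - Fst). Matrices are toy inputs.
import numpy as np

fst = 0.15
G = np.array([[1.00, 0.30], [0.30, 0.80]])   # within-population covariance
D = np.array([[0.40, 0.11], [0.11, 0.30]])   # among-population covariance

rho_neutral = 2 * fst / (1 - fst)
# If D = rho * G exactly, then trace(D G^-1) / p recovers rho.
rho_hat = np.trace(D @ np.linalg.inv(G)) / G.shape[0]
print(rho_neutral, rho_hat)  # similar values are consistent with neutrality
```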
ERIC Educational Resources Information Center
Abouelenein, Mahmoud S.
2012-01-01
The purpose of this quantitative, descriptive research study was to determine, through statistical analysis, any correlation between the perceived transformational leadership traits of CIOs at two-year community colleges in Kansas and measures of the job satisfaction among IT workers at those community colleges. The objectives of this research…
Goto, Masami; Abe, Osamu; Hata, Junichi; Fukunaga, Issei; Shimoji, Keigo; Kunimatsu, Akira; Gomi, Tsutomu
2017-02-01
Background Diffusion tensor imaging (DTI) is a magnetic resonance imaging (MRI) technique that reflects the Brownian motion of water molecules constrained within brain tissue. Fractional anisotropy (FA) is one of the most commonly measured DTI parameters, and can be applied to quantitative analysis of white matter as tract-based spatial statistics (TBSS) and voxel-wise analysis. Purpose To show an association between metallic implants and the results of statistical analysis (voxel-wise group comparison and TBSS) for fractional anisotropy (FA) mapping, in DTI of healthy adults. Material and Methods Sixteen healthy volunteers were scanned with 3-Tesla MRI. A magnetic keeper type of dental implant was used as the metallic implant. DTI was acquired three times in each participant: (i) without a magnetic keeper (FAnon1); (ii) with a magnetic keeper (FAimp); and (iii) without a magnetic keeper (FAnon2) as reproducibility of FAnon1. Group comparisons with paired t-test were performed as FAnon1 vs. FAnon2, and as FAnon1 vs. FAimp. Results Regions of significantly reduced and increased local FA values were revealed by voxel-wise group comparison analysis (a P value of less than 0.05, corrected with family-wise error), but not by TBSS. Conclusion Metallic implants existing outside the field of view produce artifacts that affect the statistical analysis (voxel-wise group comparisons) for FA mapping. When statistical analysis for FA mapping is conducted by researchers, it is important to pay attention to any dental implants present in the mouths of the participants.
Chumbley, Scott; Zhang, Song; Morris, Max; ...
2016-11-16
Since the development of the striagraph, various attempts have been made to enhance forensic investigation through the use of measuring and imaging equipment. This study describes the development of a prototype system employing an easy-to-use software interface designed to provide forensic examiners with the ability to measure topography of a toolmarked surface and then conduct various comparisons using a statistical algorithm. Acquisition of the data is carried out using a portable 3D optical profilometer, and comparison of the resulting data files is made using software named “MANTIS” (Mark and Tool Inspection Suite). The system has been tested on laboratory-produced markings that include fully striated marks (e.g., screwdriver markings), quasistriated markings produced by shear-cut pliers, impression marks left by chisels, rifling marks on bullets, and cut marks produced by knives. Using the system, an examiner has the potential to (i) visually compare two toolmarked surfaces in a manner similar to a comparison microscope and (ii) use the quantitative information embedded within the acquired data to obtain an objective statistical comparison of the data files. Finally, this study shows that, based on the results from laboratory samples, the system has great potential for aiding examiners in conducting comparisons of toolmarks.
Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia
2015-04-26
The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
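A minimal sketch of this load-capacity comparison is shown below: the failure probability is estimated as P(load ≥ capacity) over Monte Carlo samples, with a classical confidence interval attached. The distributions and their parameters are illustrative placeholders, not values from any RISMC study.

```python
# Sketch: Monte Carlo failure probability with a frequentist 95% CI.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                # number of simulations
load = rng.normal(500.0, 40.0, n)         # sampled load metric (illustrative)
capacity = rng.normal(650.0, 30.0, n)     # sampled capacity metric

p_hat = np.mean(load >= capacity)         # estimated failure probability

# Normal-approximation interval; it narrows as n grows, quantifying the
# accuracy gained from extra simulations. For very rare failures an exact
# (e.g., Clopper-Pearson) interval would be preferable.
half_width = 1.96 * np.sqrt(p_hat * (1.0 - p_hat) / n)
print(p_hat, (p_hat - half_width, p_hat + half_width))
```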
Ion induced electron emission statistics under Agm- cluster bombardment of Ag
NASA Astrophysics Data System (ADS)
Breuers, A.; Penning, R.; Wucher, A.
2018-05-01
The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion-induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and implicitly by the analysis of the electron emission statistics on the other hand. Successful measurements of the electron emission spectra provide a deeper understanding of the ion-induced kinetic electron emission process, with particular emphasis on the effect of projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison to computer simulations performed for silver atoms and clusters impinging onto a silver surface.
Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting
2014-01-01
To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, and demonstrated a prediction accuracy of 91.8% for benign versus malignant breast tumor lumps; the quantitative indicator-generated regression equation contained only one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, a difference that was statistically significant by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.
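The analysis pattern described (logistic regression on candidate indicators, then ROC-AUC comparison) can be sketched in a few lines; the toy data below are random placeholders, not the study's measurements.

```python
# Sketch: logistic regression on indicator variables plus ROC-AUC scoring.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
X_qual = rng.normal(size=(73, 3))   # e.g., homogeneity, expansion, peak grade
y = rng.integers(0, 2, 73)          # benign (0) vs malignant (1), invented

model = LogisticRegression().fit(X_qual, y)
auc = roc_auc_score(y, model.predict_proba(X_qual)[:, 1])
print(auc)  # area under the ROC curve for the qualitative-indicator model
```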
Lee, Ellen E; Della Selva, Megan P; Liu, Anson; Himelhoch, Seth
2015-01-01
Given the significant disability, morbidity and mortality associated with depression, the promising recent trials of ketamine highlight a novel intervention. A meta-analysis was conducted to assess the efficacy of ketamine in comparison with placebo for the reduction of depressive symptoms in patients who meet criteria for a major depressive episode. Two electronic databases were searched in September 2013 for English-language studies that were randomized placebo-controlled trials of ketamine treatment for patients with major depressive disorder or bipolar depression and utilized a standardized rating scale. Studies including participants receiving electroconvulsive therapy and adolescent/child participants were excluded. Five studies were included in the quantitative meta-analysis. The quantitative meta-analysis showed that ketamine significantly reduced depressive symptoms. The overall effect size at day 1 was large and statistically significant with an overall standardized mean difference of 1.01 (95% confidence interval 0.69-1.34) (P<.001), with the effects sustained at 7 days postinfusion. The heterogeneity of the studies was low and not statistically significant, and the funnel plot showed no publication bias. The large and statistically significant effect of ketamine on depressive symptoms supports a promising, new and effective pharmacotherapy with rapid onset, high efficacy and good tolerability. Copyright © 2015. Published by Elsevier Inc.
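The pooled effect reported above comes from inverse-variance weighting of per-study standardized mean differences; the sketch below shows that aggregation step with invented effect sizes and variances, under a fixed-effect assumption.

```python
# Sketch: inverse-variance (fixed-effect) pooling of standardized mean
# differences; the five SMDs and variances are illustrative only.
import numpy as np

smd = np.array([0.9, 1.2, 0.8, 1.1, 1.0])       # per-study effect sizes
var = np.array([0.10, 0.15, 0.08, 0.12, 0.09])  # their variances

w = 1.0 / var
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(pooled, ci)  # pooled SMD with 95% confidence interval
```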
Frank, C; Bray, D; Rademaker, A; Chrusch, C; Sabiston, P; Bodie, D; Rangayyan, R
1989-01-01
To establish a normal baseline for comparison, thirty-one thousand collagen fibril diameters were measured in calibrated transmission electron microscopy (TEM) photomicrographs of normal rabbit medial collateral ligaments (MCLs). A new automated method of quantitation was used to statistically compare fibril minimum-diameter distributions at one midsubstance location in both MCLs from six animals at 3 months of age (immature) and three animals at 10 months of age (mature). Pooled results demonstrate that rabbit MCLs have statistically different (p less than 0.001) mean minimum diameters at these two ages. Interanimal differences in mean fibril minimum diameters were also significant (p less than 0.001) and varied by 20% to 25% in both mature and immature animals. Finally, there were significant differences (p less than 0.001) in mean diameters and distributions from side to side in all animals. These mean left-to-right differences were less than 10% in all mature animals but as much as 62% in some immature animals. Statistical analysis of these data demonstrates that animal-to-animal comparisons using these protocols require a large number of animals, with appropriate numbers of fibrils being measured, to detect small intergroup differences. With experiments that compare left to right ligaments, far fewer animals are required to detect similarly small differences. These results demonstrate the necessity for rigorous control of sampling, an extensive normal baseline and statistically confirmed experimental designs in any TEM comparison of collagen fibril diameters.
Quantitative evaluation of lumbar intervertebral disc degeneration by axial T2* mapping.
Huang, Leitao; Liu, Yuan; Ding, Yi; Wu, Xia; Zhang, Ning; Lai, Qi; Zeng, Xianjun; Wan, Zongmiao; Dai, Min; Zhang, Bin
2017-12-01
To quantitatively evaluate the clinical value and demonstrate the potential benefits of biochemical axial T2* mapping-based grading of early stages of degenerative disc disease (DDD) using 3.0-T magnetic resonance imaging (MRI) in a clinical setting. Fifty patients with low back pain and 20 healthy volunteers (control) underwent standard MRI protocols including axial T2* mapping. All the intervertebral discs (IVDs) were classified morphologically. Lumbar IVDs were graded using the Pfirrmann score (I to IV). The T2* values of the anterior annulus fibrosus (AF), posterior AF, and nucleus pulposus (NP) of each lumbar IVD were measured. The differences between groups were analyzed regarding the specific T2* pattern at different regions of interest. The T2* values of the NP and posterior AF in the patient group were significantly lower than those in the control group (P < .01). The T2* value of the anterior AF was not significantly different between the patients and the controls (P > .05). The mean T2* values of the lumbar IVD in the patient group were significantly lower, especially the posterior AF, followed by the NP, and finally, the anterior AF. In the anterior AF, comparison of grade I with grade III and grade I with grade IV showed statistically significant differences (P = .07 and P = .08, respectively). Similarly, in the NP, comparison of grade I with grade III, grade I with grade IV, grade II with grade III, and grade II with grade IV showed statistically significant differences (P < .001). In the posterior AF, comparison of grade II with grade IV showed a statistically significant difference (P = .032). T2* values decreased linearly with increasing degeneration based on the Pfirrmann scoring system (ρ < -0.5, P < .001). Changes in the T2* value can signify early degenerative IVD diseases. Hence, T2* mapping can be used as a diagnostic tool for quantitative assessment of IVD degeneration. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
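A minimal sketch of the chemometric grouping idea follows: standardize refractive index and elemental concentrations, then cluster fragments so that candidate "matches" fall into the same group. The data values and the choice of Ward hierarchical clustering are assumptions for illustration.

```python
# Sketch: hierarchical clustering of glass fragments on standardized
# physicochemical variables; numbers are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# rows: fragments; columns: RI, Mg, Al, Ca, Fe (arbitrary units)
X = np.array([
    [1.5181, 3.2, 0.9, 8.1, 0.30],
    [1.5183, 3.1, 1.0, 8.0, 0.31],
    [1.5220, 1.1, 2.3, 6.5, 0.90],
])
Z = linkage(zscore(X, axis=0), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # fragments 1 and 2 group together, fragment 3 stands apart
```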
Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar
2009-08-25
Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95 respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9% respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant.
Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.
Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly
2015-01-01
The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and therefore is advantageous to apply in biomedical diagnostics, systems biology, and cell signaling research.
SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.
Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P
2016-07-01
The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
A Method for Label-Free, Differential Top-Down Proteomics.
Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L
2016-01-01
Biomarker discovery in translational research has relied heavily on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole-protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.
Statistical Characterization and Classification of Edge-Localized Plasma Instabilities
NASA Astrophysics Data System (ADS)
Webster, A. J.; Dendy, R. O.
2013-04-01
The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
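The core statistical claim (Weibull-distributed waiting times, checked against data) can be sketched directly; the waiting-time sample below is synthetic, and the Kolmogorov-Smirnov check is a simple stand-in for the paper's statistically rigorous comparison.

```python
# Sketch: fit a Weibull distribution to waiting times and test the fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
waits = rng.weibull(1.8, 500) * 12.0   # synthetic waiting times (ms)

shape, loc, scale = stats.weibull_min.fit(waits, floc=0.0)
ks = stats.kstest(waits, "weibull_min", args=(shape, loc, scale))
print(shape, scale, ks.pvalue)  # a large p-value is consistent with Weibull
```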
ERIC Educational Resources Information Center
Benítez, Isabel; Padilla, José-Luis
2014-01-01
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While many efficient statistics for detecting DIF are available, few general findings have emerged to explain DIF results. The objective of the article was to study DIF sources by using a mixed-method design. The design involves a quantitative phase…
NASA Astrophysics Data System (ADS)
El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.
2018-01-01
Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
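The "finite band spectrum with random Fourier phases" construction used in both experiments can be sketched in a few lines; a Gaussian-shaped spectrum stands in for the JONSWAP form, and all parameters are illustrative.

```python
# Sketch: one realization of a partially coherent 1D wave field built
# from a prescribed spectrum with independent random Fourier phases.
import numpy as np

rng = np.random.default_rng(2)
n, df = 1024, 0.01
f = np.arange(1, n + 1) * df                 # frequency axis (Hz)
S = np.exp(-0.5 * ((f - 1.0) / 0.1) ** 2)    # stand-in for a JONSWAP shape

amps = np.sqrt(2.0 * S * df)                 # component amplitudes
phases = rng.uniform(0, 2 * np.pi, n)        # independent random phases

t = np.linspace(0.0, 100.0, 2048)
eta = (amps[:, None]
       * np.cos(2 * np.pi * f[:, None] * t + phases[:, None])).sum(axis=0)
# eta is one realization; heavy-tail statistics emerge over many draws.
```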
Love, Milton S.; Saiki, Michael K.; May, Thomas W.; Yee, Julie L.
2013-01-01
elements. Forty-two elements were excluded from statistical comparisons as they (1) consisted of major cations that were unlikely to accumulate to potentially toxic concentrations; (2) were not detected by the analytical procedures; or (3) were detected at concentrations too low to yield reliable quantitative measurements. The remaining 21 elements consisted of aluminum, arsenic, barium, cadmium, chromium, cobalt, copper, gallium, iron, lead, lithium, manganese, mercury, nickel, rubidium, selenium, strontium, tin, titanium, vanadium, and zinc. Statistical comparisons of these elements indicated that none consistently exhibited higher concentrations at oil platforms than at natural areas. However, the concentrations of copper, selenium, titanium, and vanadium in Pacific sanddab were unusual because small individuals exhibited either no differences between oil platforms and natural areas or significantly lower concentrations at oil platforms than at natural areas, whereas large individuals exhibited significantly higher concentrations at oil platforms than at natural areas.
NASA Astrophysics Data System (ADS)
Li, Y.; Robertson, C.
2018-06-01
The influence of irradiation defect dispersions on plastic strain spreading is investigated by means of three-dimensional dislocation dynamics (DD) simulations, accounting for thermally activated slip and cross-slip mechanisms in Fe-2.5%Cr grains. The defect-induced evolutions of the effective screw dislocation mobility are evaluated by means of statistical comparisons, for various defect number density and defect size cases. Each comparison is systematically associated with a quantitative Defect-Induced Apparent Straining Temperature shift (or «ΔDIAT»), calculated without any adjustable parameters. In the investigated cases, the ΔDIAT level associated with a given defect dispersion closely replicates the measured ductile to brittle transition temperature shift (ΔDBTT) due to the same, actual defect dispersion. The results are further analyzed in terms of dislocation-based plasticity mechanisms and their possible relations with the dose-dependent changes of the ductile to brittle transition temperature.
Comparison of GEANT4 very low energy cross section models with experimental data in water.
Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C
2010-09-01
The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
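The model-versus-data comparison idea (Kolmogorov-Smirnov testing of compatibility) can be illustrated with a two-sample test; the "cross section" samples below are synthetic placeholders, not GEANT4-DNA output.

```python
# Sketch: two-sample Kolmogorov-Smirnov comparison of simulated and
# measured values; data are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
model_xs = rng.normal(1.00, 0.05, 200)      # simulated cross sections
experiment_xs = rng.normal(1.02, 0.07, 60)  # measured cross sections

stat, p = stats.ks_2samp(model_xs, experiment_xs)
print(stat, p)  # small p-values flag model/data incompatibility
```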
QGene 4.0, an extensible Java QTL-analysis platform.
Joehanes, Roby; Nelson, James C
2008-12-01
Of the many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.
Creation of 0.10-cm-1 resolution quantitative infrared spectral libraries for gas samples
NASA Astrophysics Data System (ADS)
Sharpe, Steven W.; Sams, Robert L.; Johnson, Timothy J.; Chu, Pamela M.; Rhoderick, George C.; Guenther, Franklin R.
2002-02-01
The National Institute of Standards and Technology (NIST) and the Pacific Northwest National Laboratory (PNNL) are independently creating quantitative, approximately 0.10 cm-1 resolution, infrared spectral libraries of vapor phase compounds. The NIST library will consist of approximately 100 vapor phase spectra of volatile hazardous air pollutants (HAPs) and suspected greenhouse gases. The PNNL library will consist of approximately 400 vapor phase spectra associated with DOE's remediation mission. A critical part of creating and validating any quantitative work involves independent verification based on inter-laboratory comparisons. The two laboratories use significantly different sample preparation and handling techniques. NIST uses gravimetric dilution and a continuous flowing sample while PNNL uses partial pressure dilution and a static sample. Agreement is generally found to be within the statistical uncertainties of the Beer's law fit and less than 3 percent of the total integrated band areas for the 4 chemicals used in this comparison. There does appear to be a small systematic difference between the PNNL and NIST data, however. Possible sources of the systematic difference will be discussed as well as technical details concerning the sample preparation and the procedures for overcoming instrumental artifacts.
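The Beer's-law fit underlying such quantitative libraries can be sketched as a calibration of integrated band area against concentration; the numbers below are invented, and the through-the-origin least-squares fit is one simple choice among several.

```python
# Sketch: Beer's-law calibration (band area vs. concentration) with the
# fit's statistical uncertainty; all values are illustrative.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0])              # e.g., ppm-m
band_area = np.array([0.021, 0.043, 0.104, 0.212])  # integrated absorbance

# least-squares slope through the origin and its standard error
slope = (conc @ band_area) / (conc @ conc)
resid = band_area - slope * conc
se = np.sqrt((resid @ resid) / (len(conc) - 1) / (conc @ conc))
print(slope, se)
```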
A two-factor error model for quantitative steganalysis
NASA Astrophysics Data System (ADS)
Böhme, Rainer; Ker, Andrew D.
2006-02-01
Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
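The two-factor idea (a per-image error component shared by repeated attacks on the same cover, plus a within-image random term) can be sketched as a one-way random-effects variance split; the layout and numbers below are assumptions for illustration only.

```python
# Sketch: method-of-moments split of estimator error into between-image
# and within-image variance components; data are synthetic.
import numpy as np

rng = np.random.default_rng(9)
n_images, n_embeddings = 50, 20
image_bias = rng.normal(0, 0.03, n_images)     # between-image factor
errors = image_bias[:, None] + rng.normal(0, 0.01, (n_images, n_embeddings))

within_var = errors.var(axis=1, ddof=1).mean()
# Var(row means) = sigma_b^2 + sigma_w^2 / n  =>  solve for sigma_b^2
between_var = errors.mean(axis=1).var(ddof=1) - within_var / n_embeddings
print(between_var, within_var)  # respective weights of the two sources
```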
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first order selfsimilarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
An Overview of data science uses in bioimage informatics.
Chessel, Anatole
2017-02-15
This review aims at providing a practical overview of the use of statistical features and associated data science methods in bioimage informatics. To achieve a quantitative link between images and biological concepts, one typically replaces an object coming from an image (a segmented cell or intracellular object, a pattern of expression or localisation, even a whole image) by a vector of numbers. They range from carefully crafted biologically relevant measurements to features learnt through deep neural networks. This replacement allows for the use of practical algorithms for visualisation, comparison and inference, such as the ones from machine learning or multivariate statistics. While originating mainly, for biology, in high content screening, those methods are integral to the use of data science for the quantitative analysis of microscopy images to gain biological insight, and they are sure to gather more interest as the need to make sense of the increasing amount of acquired imaging data grows more pressing. Copyright © 2017 Elsevier Inc. All rights reserved.
Statistical methodology: II. Reliability and validity assessment in study design, Part B.
Karras, D J
1997-02-01
Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
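The kappa statistic mentioned above corrects raw percent agreement for chance; a minimal sketch, with invented ratings, follows.

```python
# Sketch: chance-corrected agreement between two raters on categorical
# labels; ratings are invented.
from sklearn.metrics import cohen_kappa_score

rater_a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no"]
rater_b = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(kappa)  # 1.0 = perfect agreement; 0 = chance-level agreement
```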
Use of simulation-based learning in undergraduate nurse education: An umbrella systematic review.
Cant, Robyn P; Cooper, Simon J
2017-02-01
To conduct a systematic review to appraise and review evidence on the impact of simulation-based education for undergraduate/pre-licensure nursing students, using existing reviews of literature. An umbrella review (review of reviews). Cumulative Index of Nursing and Allied Health Literature (CINAHL Plus), PubMed, and Google Scholar. Reviews of literature conducted between 2010 and 2015 regarding simulation-based education for pre-licensure nursing students. The Joanna Briggs Institute methodology for conduct of an umbrella review was used to inform the review process. Twenty-five systematic reviews of literature were included, of which 14 were recent (2013-2015). Most described the level of evidence of component studies as a mix of experimental and quasi-experimental designs. The reviews measured around 14 different main outcome variables, thus limiting the number of primary studies that each individual review could pool to appraise. Many reviews agreed on the key learning outcome of knowledge acquisition, although no overall quantitative effect was derived. Three of four high-quality reviews found that simulation supported psychomotor development; a fourth found too few high-quality studies to make a statistical comparison. Simulation statistically improved self-efficacy in pretest-posttest studies, and in experimental designs self-efficacy was superior to that of other teaching methods, although lower-level research designs limited further comparison. The reviews commonly reported strong student satisfaction with simulation education and some reported improved confidence and/or critical thinking. This umbrella review took a global view of 25 reviews of simulation research in nursing education, comprising over 700 primary studies. To discern overall outcomes across reviews, statistical comparison of quantitative results (effect size) must be the key comparator. Simulation-based education contributes to students' learning in a number of ways when integrated into pre-licensure nursing curricula. Overall, use of a constellation of instruments and a lack of high-quality study designs mean that there are still some gaps in evidence of effects that need to be addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods, in comparison also with spatial interpolation methods, is investigated for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water, and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict whether the chloride concentration in a water well will exceed the allowable concentration, making the water unfit for the intended use. A statistical classification algorithm achieved the best predictive performance, and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems in hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris
2011-10-20
Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down-to-earth quantitative analysis works well for the CluPA-aligned spectra. The whole workflow is embedded into a modular and statistically sound framework that is implemented as an R package called "speaq" ("spectrum alignment and quantitation"), which is freely available from http://code.google.com/p/speaq/.
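The BW-ratio described above (per data point, between-group over within-group sum of squares across aligned spectra) is easy to sketch; array shapes and the synthetic "peak" are illustrative, and the real implementation, including the bootstrap null, lives in the authors' R package speaq.

```python
# Sketch: BW-ratio across aligned spectra; synthetic data.
import numpy as np

def bw_ratio(spectra, groups):
    """spectra: (n_samples, n_points); groups: length-n_samples labels."""
    grand = spectra.mean(axis=0)
    between = np.zeros(spectra.shape[1])
    within = np.zeros(spectra.shape[1])
    for g in np.unique(groups):
        sub = spectra[groups == g]
        between += len(sub) * (sub.mean(axis=0) - grand) ** 2
        within += ((sub - sub.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

rng = np.random.default_rng(3)
X = rng.normal(size=(20, 500))
X[:10, 240:260] += 1.0                      # group difference at a "peak"
labels = np.array([0] * 10 + [1] * 10)
print(bw_ratio(X, labels)[240:260].mean())  # elevated where groups differ
```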
Quantitative analysis of tympanic membrane perforation: a simple and reliable method.
Ibekwe, T S; Adeosun, A A; Nwaorgu, O G
2009-01-01
Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T x 100 per cent = percentage perforation, where P is the area (in pixels2) of the tympanic membrane perforation and T is the total area (in pixels2) for the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
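The percentage-perforation equation above is plain arithmetic; in the tiny sketch below the pixel areas are stand-ins for values measured in Image J.

```python
# The formula (P/T) x 100% with hypothetical pixel areas.
perforation_px = 5_400    # P: perforation area (pixels^2)
membrane_px = 48_000      # T: total tympanic membrane area (pixels^2)

percentage = perforation_px / membrane_px * 100
print(f"{percentage:.1f}% perforation")  # 11.2% perforation
```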
Yamaguchi, Hiromi; Matsumoto, Sawako; Ishibashi, Mariko; Hasegawa, Kiyoshi; Sugitani, Masahiko; Takayama, Tadatoshi; Esumi, Mariko
2013-10-01
The level of expression of housekeeping genes is generally considered stable, and a representative gene such as glyceraldehyde-3-phosphate dehydrogenase is commonly used as an internal control for quantitating mRNA. However, expression of housekeeping genes is not always constant under pathological conditions. To determine which genes would be most suitable as internal controls for quantitative gene expression studies in human liver diseases, we quantified 12 representative housekeeping genes in 27 non-cancerous liver tissues (normal, and chronic hepatitis C with and without liver cirrhosis). We identified β-glucuronidase as the most suitable gene for studies on liver by rigorous statistical analysis of inter- and intra-group comparisons. We conclude that it is important to determine the most appropriate control gene for the particular condition to be analyzed.
LOD significance thresholds for QTL analysis in experimental populations of diploid species
Van Ooijen, J. W.
1999-11-01
Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. Because there has been no easy way to determine which areas of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest, predicting the rate of false positives has been difficult. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations with diploid species, for any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
One-dimensional turbulence modeling of a turbulent counterflow flame with comparison to DNS
Jozefik, Zoltan; Kerstein, Alan R.; Schmidt, Heiko; ...
2015-06-01
The one-dimensional turbulence (ODT) model is applied to a reactant-to-product counterflow configuration and results are compared with DNS data. The model employed herein solves conservation equations for momentum, energy, and species on a one-dimensional (1D) domain corresponding to the line spanning the domain between nozzle orifice centers. The effects of turbulent mixing are modeled via a stochastic process, while the Kolmogorov and reactive length and time scales are explicitly resolved and a detailed chemical kinetic mechanism is used. Comparisons between model and DNS results for spatial mean and root-mean-square (RMS) velocity, temperature, and major and minor species profiles are shown. The ODT approach shows qualitatively and quantitatively reasonable agreement with the DNS data. Scatter plots and statistics conditioned on temperature are also compared for heat release rate and all species. ODT is able to capture the range of results depicted by DNS. However, conditional statistics show signs of underignition.
Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring.
Everss-Villalba, Estrella; Melgarejo-Meseguer, Francisco Manuel; Blanco-Velasco, Manuel; Gimeno-Blanes, Francisco Javier; Sala-Pla, Salvador; Rojo-Álvarez, José Luis; García-Alberola, Arcadi
2017-10-25
Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected for several days in patients following their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, noise has been dealt with as a problem of removing undesirable components by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the ECG clinical evaluation. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of the noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals is assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to be corrupted with noise during long-term periods. The ECG noise is then characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to achieve this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria represents an accurate enough estimate of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the gold standard for clinical noise. The proposed noise maps, together with the statistical consistency of the characterization of the clinical severity of noise, pave the way towards systems that provide noise maps of clinical severity, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.
Comparison of Housing Construction Development in Selected Regions of Central Europe
NASA Astrophysics Data System (ADS)
Dvorský, Ján; Petráková, Zora; Hollý, Ján
2017-12-01
In fast-growing countries, the economic growth that came after the global financial crisis ought to be manifested in the development of housing policy. The development of a region is directly related to an increase in the quality of living of its inhabitants, and housing construction and its relation to the availability of housing is a key issue for the population overall. Comparison of housing construction development across selected regions is important for experts in the field of construction, mayors of the regions, and the state, but especially for the inhabitants themselves. The aim of the article is to compare the number of new dwellings with building permits and completed dwellings with final building approval between selected regions by using a method of mathematical statistics, analysis of variance. The article also uses tools of descriptive statistics, such as point graphs, graphs of deviations from the average, and basic statistical characteristics of central tendency and variability. Qualitative factors influencing the construction of flats, as well as the causes of quantitative differences in the numbers of started and completed apartments in selected regions of Central Europe, are the subjects of the article's conclusions.
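For readers unfamiliar with the method, a minimal one-way analysis of variance on completed-dwelling counts might look like the sketch below (all numbers invented for illustration; the article's regional data are not reproduced here):

```python
# One-way ANOVA comparing yearly counts of completed dwellings across regions
from scipy import stats

region_a = [1320, 1480, 1255, 1390, 1510]   # completed dwellings per year
region_b = [980, 1105, 1040, 1210, 1150]
region_c = [1610, 1725, 1590, 1680, 1770]

f_stat, p_value = stats.f_oneway(region_a, region_b, region_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p -> regional means differ
```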
Statistics of velocity fluctuations of Geldart A particles in a circulating fluidized bed riser
Vaidheeswaran, Avinash; Shaffer, Franklin; Gopalan, Balaji
2017-11-21
Here, the statistics of fluctuating velocity components are studied in the riser of a closed-loop circulating fluidized bed with fluid catalytic cracking catalyst particles. Our analysis shows distinct similarities as well as deviations compared to existing theories and bench-scale experiments. The study confirms anisotropic and non-Maxwellian distributions of the fluctuating velocity components. The velocity distribution functions (VDFs) corresponding to transverse fluctuations exhibit symmetry, and follow a stretched-exponential behavior up to three standard deviations. The form of the transverse VDF is largely determined by interparticle interactions, and the tails become more overpopulated with an increase in particle loading. The observed deviations from the Gaussian distribution are represented using the leading-order term in the Sonine expansion, which is commonly used to approximate VDFs in the kinetic theory of granular flows. The vertical fluctuating VDFs are asymmetric, and their skewness shifts as the wall is approached; in comparison to the transverse fluctuations, the vertical VDF is determined by the local hydrodynamics. This constitutes an observation of particle velocity fluctuations in a large-scale system and a quantitative comparison with Maxwell-Boltzmann statistics.
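As background on the Sonine expansion mentioned above, this sketch (ours, not the authors' code) evaluates a Gaussian for a single scaled velocity component corrected by the second Sonine polynomial in its one-dimensional form; the coefficient a2 is an illustrative value, not one fitted to the riser data:

```python
import numpy as np

def sonine_vdf(c, a2=0.1):
    """Gaussian VDF in scaled velocity c with leading-order Sonine correction.

    In one dimension the second Sonine polynomial evaluated at c**2 is
    S2(c**2) = c**4/2 - 3*c**2/2 + 3/8; a positive a2 overpopulates the tails
    relative to the pure Gaussian while leaving the normalization unchanged.
    """
    gauss = np.exp(-c ** 2) / np.sqrt(np.pi)
    s2 = 0.5 * c ** 4 - 1.5 * c ** 2 + 3.0 / 8.0
    return gauss * (1.0 + a2 * s2)

c = np.linspace(-4.0, 4.0, 801)
mass = (sonine_vdf(c) * (c[1] - c[0])).sum()
print(f"normalization ≈ {mass:.4f}")   # ~1: the correction integrates to zero
```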
A comparison study of image features between FFDM and film mammogram images
Jing, Hao; Yang, Yongyi; Wernick, Miles N.; Yarusso, Laura M.; Nishikawa, Robert M.
2012-01-01
Purpose: This work provides a direct, quantitative comparison of image features measured by film and full-field digital mammography (FFDM). The purpose is to investigate whether there is any systematic difference between film and FFDM in terms of quantitative image features and their influence on the performance of a computer-aided diagnosis (CAD) system. Methods: The authors make use of a set of matched film-FFDM image pairs acquired from cadaver breast specimens with simulated microcalcifications consisting of bone and teeth fragments, using both a GE digital mammography system and a screen-film system. To quantify the image features, the authors consider a set of 12 textural features of lesion regions and six image features of individual microcalcifications (MCs). The authors first conduct a direct comparison of these quantitative features extracted from film and FFDM images. The authors then study the performance of a CAD classifier for discriminating between MCs and false positives (FPs) when the classifier is trained on images of different types (film, FFDM, or both). Results: For all the features considered, the quantitative results show a high degree of correlation between features extracted from film and FFDM, with the correlation coefficients ranging from 0.7326 to 0.9602 for the different features. Based on a Fisher sign rank test, there was no significant difference observed between the features extracted from film and those from FFDM. For both MC detection and discrimination of FPs from MCs, FFDM had a slight but statistically significant advantage in performance; however, when the classifiers were trained on different types of images (acquired with FFDM or SFM) for discriminating MCs from FPs, there was little difference. Conclusions: The results indicate good agreement between film and FFDM in quantitative image features. While FFDM images provide better detection performance for MCs, FFDM and film images may be interchangeable for the purposes of training CAD algorithms, and a single CAD algorithm may be applied to either type of image. PMID:22830771
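A minimal sketch of this kind of paired-feature comparison (feature values are synthetic; a plain sign test stands in here for the Fisher sign rank test named in the abstract):

```python
# Correlate a texture feature measured on matched film/FFDM pairs and test for
# a systematic difference between modalities with a simple sign test.
import numpy as np
from scipy import stats

film = np.array([0.62, 0.71, 0.55, 0.80, 0.67, 0.74, 0.59, 0.69])
ffdm = np.array([0.60, 0.73, 0.54, 0.78, 0.70, 0.72, 0.61, 0.68])

r, _ = stats.pearsonr(film, ffdm)            # agreement between modalities
n_greater = int(np.sum(ffdm > film))         # sign test on paired differences
p = stats.binomtest(n_greater, n=len(film), p=0.5).pvalue
print(f"r = {r:.3f}, sign-test p = {p:.3f}")  # large p -> no systematic shift
```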
78 FR 70059 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... (as opposed to quantitative statistical methods). In consultation with research experts, we have...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radeleff, Boris, E-mail: Boris.radeleff@med.uni-heidelberg.de; Thierjung, Heidi; Stampfl, Ulrike
2008-09-15
Purpose: To date no direct experimental comparison between the CYPHER-Select and TAXUS-Express stents is available. Therefore, we investigated late in-stent stenosis, thrombogenicity, and inflammation, comparing the CYPHER-Select, TAXUS-Express, and custom-made cobalt chromium Polyzene-F nanocoated stents (CCPS) in the minipig coronary artery model. Methods: The three stent types were implanted in the right coronary artery of 30 minipigs. The primary endpoint was in-stent stenosis assessed by quantitative angiography and microscopy. Secondary endpoints were inflammation and thrombogenicity evaluated by scores for inflammation and immunoreactivity (C-reactive protein and transforming growth factor beta). Follow-up was at 4 and 12 weeks. Results: Stent placement was successful in all animals; no thrombus deposition occurred. Quantitative angiography did not depict statistically significant differences between the three stent types after 4 and 12 weeks. Quantitative microscopy at 4 weeks showed a statistically significantly thicker neointima (p = 0.0431) for the CYPHER (105.034 ± 62.52 μm) versus the TAXUS (74.864 ± 66.03 μm) and versus the CCPS (63.542 ± 39.57 μm). At 12 weeks there were no statistically significant differences. Inflammation scores at 4 weeks were significantly lower for the CCPS and CYPHER compared with the TAXUS stent (p = 0.0431). After 12 weeks statistical significance was only found for the CYPHER versus the TAXUS stent (p = 0.0431). The semiquantitative immunoreactivity scores for C-reactive protein and transforming growth factor beta showed no statistically significant differences between the three stent types after 4 and 12 weeks. Conclusions: The CCPS provided effective control of late in-stent stenosis and thrombogenicity in this porcine model compared with the two drug-eluting stents. Its low inflammation score underscores its noninflammatory potential and might explain its equivalence to the two DES.
Schuch, Klaus; Logothetis, Nikos K.; Maass, Wolfgang
2011-01-01
A major goal of computational neuroscience is the creation of computer models for cortical areas whose response to sensory stimuli resembles that of cortical areas in vivo in important aspects. It is seldom considered whether the simulated spiking activity is realistic (in a statistical sense) in response to natural stimuli. Because certain statistical properties of spike responses were suggested to facilitate computations in the cortex, acquiring a realistic firing regimen in cortical network models might be a prerequisite for analyzing their computational functions. We present a characterization and comparison of the statistical response properties of the primary visual cortex (V1) in vivo and in silico in response to natural stimuli. We recorded from multiple electrodes in area V1 of 4 macaque monkeys and developed a large state-of-the-art network model for a 5 × 5-mm patch of V1 composed of 35,000 neurons and 3.9 million synapses that integrates previously published anatomical and physiological details. By quantitative comparison of the model response to the “statistical fingerprint” of responses in vivo, we find that our model for a patch of V1 responds to the same movie in a way which matches the statistical structure of the recorded data surprisingly well. The deviations between the firing regimen of the model and the in vivo data are on the same level as deviations among monkeys and sessions. This suggests that, despite strong simplifications and abstractions of cortical network models, they are nevertheless capable of generating realistic spiking activity. To reach a realistic firing state, it was not only necessary to include both N-methyl-d-aspartate and GABAB synaptic conductances in our model, but also to markedly increase the strength of excitatory synapses onto inhibitory neurons (>2-fold) in comparison to literature values, hinting at the importance of carefully adjusting the effect of inhibition for achieving realistic dynamics in current network models. PMID:21106898
Silver, Matt; Montana, Giovanni
2012-01-01
Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682
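To illustrate the core model (a sketch only, under simplifying assumptions: non-overlapping groups, squared-error loss, fixed per-group weights, plain proximal gradient descent; P-GLAW itself adds adaptive weighting, overlap handling, and bootstrap ranking):

```python
import numpy as np

def group_lasso(X, y, groups, weights, lam=0.1, n_iter=500):
    """Proximal gradient descent for (1/2n)||y - Xb||^2 + lam * sum_g w_g ||b_g||."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta -= step * X.T @ (X @ beta - y) / n
        for g, w in zip(groups, weights):     # group soft-thresholding (proximal step)
            norm = np.linalg.norm(beta[g])
            if norm > 0.0:
                beta[g] *= max(0.0, 1.0 - step * lam * w / norm)
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))                # columns play the role of SNPs
y = X[:, :3] @ np.array([1.0, -2.0, 1.5]) + rng.normal(size=100)
groups = [np.arange(0, 3), np.arange(3, 6), np.arange(6, 12)]  # "pathways"
print(np.round(group_lasso(X, y, groups, weights=[1.0, 1.0, 1.0]), 2))
# only the first group should remain clearly nonzero
```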
Arıkan, Fatma İnci; Kara, Semra; Bilgin, Hüseyin; Özkan, Fatma; Bilge, Yıldız Dallar
2017-07-01
The purpose of the current study was to investigate the possible effects of phototherapy on the bone status of term infants, evaluated by measurement of tibial bone speed of sound (SOS). The phototherapy group (n = 30) consisted of children who had undergone phototherapy for at least 24 h and the control group (n = 30) comprised children who had not received phototherapy. Blood samples were obtained from all infants for serum calcium, phosphorus, magnesium, alkaline phosphatase, parathyroid hormone and vitamin D concentrations. The left tibial quantitative ultrasound (QUS) measurements were performed using a commercial device. There was no statistically significant difference between phototherapy-exposed and nonexposed infants in terms of Ca, P, ALP, PTH and vitamin D levels. Comparison of bone SOS between the phototherapy-exposed and control groups revealed no statistically significant difference. Also, no significant difference in Z-score for SOS was observed between those with and without exposure. The data of our study indicate that phototherapy treatment has no impact on bone status in hyperbilirubinemic infants. Although there is no statistically significant evidence of an excess risk of bone damage following phototherapy, studies with larger sample sizes and longer duration of follow-up are needed to gain a better understanding of its effects.
PET imaging and quantitation of Internet-addicted patients and normal controls
NASA Astrophysics Data System (ADS)
Jeong, Ha-Kyu; Kim, Hee-Joung; Jung, Haijo; Son, Hye-Kyung; Kim, Dong-Hyeon; Yun, Mijin; Shin, Yee-Jin; Lee, Jong-Doo
2002-04-01
The number of Internet-addicted patients (IAPs) has increased greatly as Internet games have become very popular in daily life. The purpose of this study was to investigate regional brain activation patterns associated with excessive use of Internet games in adolescents. Six normal controls (NCs) and eight IAPs, classified as addicted by an adapted version of the DSM-IV criteria for pathological gambling, participated. 18F-FDG PET studies were performed for all adolescents at rest and in an activated condition after 20 minutes of playing each subject's favorite Internet game. To investigate quantitative metabolic differences between the two groups, all possible combinations of group comparisons were carried out using Statistical Parametric Mapping (SPM 99). Regional brain activation foci were identified in Talairach coordinates. SPM results showed increased metabolic activation in the occipital lobes for both groups. Higher metabolism was seen at the resting condition in IAPs than in NCs. IAPs also showed different patterns of regional brain metabolic activation compared with NCs. This suggests that addictive use of Internet games may result in functional alteration of the developing brain in adolescents.
Quantitative tests for plate tectonics on Venus
NASA Technical Reports Server (NTRS)
Kaula, W. M.; Phillips, R. J.
1981-01-01
Quantitative comparisons are made between the characteristics of plate tectonics on the earth and those which are possible on Venus. Considerations of the factors influencing rise height, and relating the decrease in rise height to plate velocity, indicate that the rate of topographic dropoff from spreading centers should be about half that on earth, due to the greater rock-fluid density contrast and the lower temperature differential between the surface and interior. Statistical analyses of Pioneer Venus radar altimetry data and global earth elevation data are used to identify 21,000 km of ridge on Venus and 33,000 km on earth, and reveal Venus ridges to have a less well-defined mode in crest heights and a greater concavity than earth ridges. Comparison of the Venus results with the spreading rates and associated heat flow on earth reveals plate creation rates on Venus to be 0.7 sq km/year or less and indicates that not more than 15% of Venus's energy is delivered to the surface by plate tectonics, in contrast to values of 2.9 sq km/year and 70% for earth.
A comment on the PCAST report: Skip the "match"/"non-match" stage.
Morrison, Geoffrey Stewart; Kaye, David H; Balding, David J; Taylor, Duncan; Dawid, Philip; Aitken, Colin G G; Gittelson, Simone; Zadora, Grzegorz; Robertson, Bernard; Willis, Sheila; Pope, Susan; Neil, Martin; Martire, Kristy A; Hepler, Amanda; Gill, Richard D; Jamieson, Allan; de Zoete, Jacob; Ostrum, R Brent; Caliebe, Amke
2017-03-01
This letter comments on the report "Forensic science in criminal courts: Ensuring scientific validity of feature-comparison methods" recently released by the President's Council of Advisors on Science and Technology (PCAST). The report advocates a procedure for evaluation of forensic evidence that is a two-stage procedure in which the first stage is "match"/"non-match" and the second stage is empirical assessment of sensitivity (correct acceptance) and false alarm (false acceptance) rates. Almost always, quantitative data from feature-comparison methods are continuously-valued and have within-source variability. We explain why a two-stage procedure is not appropriate for this type of data, and recommend use of statistical procedures which are appropriate.
Knobel, LeRoy L.
2006-01-01
This report presents qualitative and quantitative comparisons of water-quality data from the Idaho National Laboratory (INL), Idaho, to determine if the change from purging three wellbore volumes to one wellbore volume has a discernible effect on the comparability of the data. Historical water-quality data for 30 wells were visually compared to water-quality data collected after purging only 1 wellbore volume from the same wells. Of the 322 qualitatively examined constituent plots, 97.5 percent met 1 or more of the criteria established for determining data comparability. A simple statistical equation, used to determine whether water-quality data collected from 28 wells at the INL with long purge times (after pumping 1 and 3 wellbore volumes of water) were statistically the same at the 95-percent confidence level, indicated that 97.9 percent of 379 constituent pairs were equivalent. The comparability of water-quality data determined from both the qualitative (97.5 percent comparable) and quantitative (97.9 percent comparable) evaluations after purging 1 and 3 wellbore volumes of water indicates that the change from purging 3 to 1 wellbore volumes had no discernible effect on the comparability of water-quality data at the INL. However, the qualitative evaluation was limited because only October-November 2003 data were available for comparison to historical data. This report was prepared by the U.S. Geological Survey in cooperation with the U.S. Department of Energy.
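The report's exact equation is not reproduced in this abstract; one common form of such a check, assumed here purely for illustration, treats a constituent pair as statistically equivalent at roughly the 95-percent confidence level when its difference lies within the combined uncertainty expanded by a coverage factor near 2:

```python
import math

def equivalent(x1, u1, x2, u2, k=1.96):
    """True if two results agree within their combined expanded uncertainty."""
    return abs(x1 - x2) <= k * math.sqrt(u1 ** 2 + u2 ** 2)

# Illustrative constituent pair: chloride (mg/L) after 1 vs. 3 wellbore volumes
print(equivalent(12.4, 0.6, 12.9, 0.5))  # True -> pair counted as comparable
```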
IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.
Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd
2011-02-04
Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
Multibaseline gravitational wave radiometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talukder, Dipongkar; Bose, Sukanta; Mitra, Sanjit
2011-03-15
We present a statistic for the detection of stochastic gravitational wave backgrounds (SGWBs) using radiometry with a network of multiple baselines. We also quantitatively compare the sensitivities of existing baselines and their network to SGWBs. We assess how the measurement accuracy of signal parameters, e.g., the sky position of a localized source, can improve when using a network of baselines, as compared to any of the single participating baselines. The search statistic itself is derived from the likelihood ratio of the cross correlation of the data across all possible baselines in a detector network and is optimal in Gaussian noise. Specifically, it is the likelihood ratio maximized over the strength of the SGWB and is called the maximized-likelihood ratio (MLR). One of the main advantages of using the MLR over past search strategies for inferring the presence or absence of a signal is that the former does not require the deconvolution of the cross correlation statistic. Therefore, it does not suffer from errors inherent to the deconvolution procedure and is especially useful for detecting weak sources. In the limit of a single baseline, it reduces to the detection statistic studied by Ballmer [Classical Quantum Gravity 23, S179 (2006)] and Mitra et al. [Phys. Rev. D 77, 042002 (2008)]. Unlike past studies, here the MLR statistic enables us to compare quantitatively the performances of a variety of baselines searching for a SGWB signal in (simulated) data. Although we use simulated noise and SGWB signals for making these comparisons, our method can be straightforwardly applied on real data.
Fu, Rongwei; Gartlehner, Gerald; Grant, Mark; Shamliyan, Tatyana; Sedrakyan, Art; Wilt, Timothy J; Griffith, Lauren; Oremus, Mark; Raina, Parminder; Ismaila, Afisi; Santaguida, Pasqualina; Lau, Joseph; Trikalinos, Thomas A
2011-11-01
This article establishes recommendations for conducting quantitative synthesis, or meta-analysis, using study-level data in comparative effectiveness reviews (CERs) for the Evidence-based Practice Center (EPC) program of the Agency for Healthcare Research and Quality. We focused on recurrent issues in the EPC program, and the recommendations were developed using group discussion and consensus based on current knowledge in the literature. We first discuss considerations for deciding whether to combine studies, followed by discussions of indirect comparison and incorporation of indirect evidence. We then describe our recommendations on choosing effect measures and statistical models, giving special attention to combining studies with rare events, and on testing and exploring heterogeneity. Finally, we briefly present recommendations on combining studies of mixed design and on sensitivity analysis. Quantitative synthesis should be conducted in a transparent and consistent way. Inclusion of multiple alternative interventions in CERs increases the complexity of quantitative synthesis, but the basic issues in quantitative synthesis remain crucial considerations in quantitative synthesis for a CER. We will cover more issues in future versions and update and improve the recommendations as new research accumulates, to advance the goal of transparency and consistency.
NASA Technical Reports Server (NTRS)
Merceret, Francis J.; Crawford, Winifred C.
2010-01-01
Knowledge of peak wind speeds is important to the safety of personnel and flight hardware at Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS), but peak speeds are more difficult to forecast than mean wind speeds. Development of a reliable model for the gust factor (GF) relating the peak to the mean wind speed motivated a previous study of GF in tropical storms (TS). The same motivation inspired a climatological study of non-TS peak wind speed statistics without the use of GF. Both studies presented their respective statistics as functions of mean wind speed and height. The few comparisons of TS and non-TS GF in the literature suggest that the non-TS GF at a given height and mean wind speed is smaller than the corresponding TS GF. The investigation reported here converted the non-TS peak wind statistics mentioned above to the equivalent GF statistics and compared the results with the previous TS GF results. The advantage of this effort over all previously reported studies of its kind is that the TS and non-TS data are taken from the same towers in the same locations. That eliminates differing surface attributes, including roughness length and thermal properties, as a major source of variance in the comparison. The results are consistent with the literature, but include much more detailed, quantitative information on the nature of the relationship between TS and non-TS GF as a function of height and mean wind speed. In addition, the data suggest the possibility of providing an operational model for non-TS GF as a function of height and wind speed in a manner similar to the one previously developed for TS GF.
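A minimal sketch of the basic quantity involved (the tower records below are invented; the study binned GF by mean wind speed and sensor height):

```python
import numpy as np

def gust_factor(peak, mean):
    """GF relating the peak wind speed to the mean wind speed."""
    return peak / mean

# Hypothetical tower records: (height_m, mean_speed_mps, peak_speed_mps)
records = np.array([(10, 6.2, 9.5), (10, 8.1, 11.9), (60, 7.4, 10.1),
                    (60, 9.3, 12.4), (150, 10.8, 13.7)])
heights, means, peaks = records.T
gf = gust_factor(peaks, means)
for h in np.unique(heights):
    sel = heights == h
    print(f"{int(h):>4} m: mean GF = {gf[sel].mean():.2f}")  # GF falls with height
```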
NASA Astrophysics Data System (ADS)
Suwannasri, A.; Kaewlai, R.; Asavaphatiboon, S.
2016-03-01
This study aimed to determine whether administration of a low-volume, high-concentration iodinated contrast medium can preserve image quality in comparison with a regular-concentration intravenous contrast medium in patients undergoing contrast-enhanced abdominal computed tomography (CT). Eighty-four patients were randomly divided into 3 groups of similar iodine delivery rate; A: 1.2 cc/kg of iomeprol-400, B: 1.0 cc/kg of iomeprol-400 and C: 1.5 cc/kg of ioversol-350. Contrast enhancement of the liver parenchyma, pancreas and aorta was quantitatively measured in Hounsfield units and qualitatively assessed by a radiologist. The t-test was used to evaluate contrast enhancement, and the Chi-square test was used to evaluate the qualitative image assessment, at a significance level of 0.05 with 95% confidence intervals. There were no statistically significant differences in contrast enhancement of the liver parenchyma and pancreas between group A and group C in both quantitative and qualitative analyses. Group C showed vascular enhancement superior to that of groups A and B on quantitative analysis.
Wiesmüller, Marco; Quick, Harald H; Navalpakkam, Bharath; Lell, Michael M; Uder, Michael; Ritt, Philipp; Schmidt, Daniela; Beck, Michael; Kuwert, Torsten; von Gall, Carl C
2013-01-01
PET/MR hybrid scanners have recently been introduced, but not yet validated. The aim of this study was to compare the PET components of a PET/CT hybrid system and of a simultaneous whole-body PET/MR hybrid system with regard to reproducibility of lesion detection and quantitation of tracer uptake. A total of 46 patients underwent a whole-body PET/CT scan 1 h after injection and, an average of 88 min later, a second scan using a hybrid PET/MR system. The radioactive tracers used were (18)F-deoxyglucose (FDG), (18)F-ethylcholine (FEC) and (68)Ga-DOTATATE (Ga-DOTATATE). The PET images from PET/CT (PET(CT)) and from PET/MR (PET(MR)) were analysed for tracer-positive lesions. Regional tracer uptake in these foci was quantified using volumes of interest, and maximal and average standardized uptake values (SUV(max) and SUV(avg), respectively) were calculated. Of the 46 patients, 43 were eligible for comparison and statistical analysis. All lesions except one identified by PET(CT) were also identified by PET(MR) (99.2%). In 38 patients (88.4%), the same number of foci were identified by PET(CT) and by PET(MR). In four patients, more lesions were identified by PET(MR) than by PET(CT); in one patient PET(CT) revealed an additional focus compared to PET(MR). The mean SUV(max) and SUV(avg) of all lesions determined by PET(MR) were 21% and 11% lower, respectively, than the values determined by PET(CT) (p < 0.05), and a strong correlation between these variables was identified (Spearman rho 0.835; p < 0.01). PET/MR showed performance equivalent to PET/CT in terms of qualitative lesion detection. The differences demonstrated in quantitation of tracer uptake between PET(CT) and PET(MR) were minor, but statistically significant. Nevertheless, a more detailed study of the quantitative accuracy of PET(MR) and the factors governing it is needed to ultimately assess its accuracy in measuring tissue tracer concentrations.
Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M
2016-01-01
Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative, compound-cognizant interpretation beyond target compound analysis, with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers, with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on the biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining a "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit a statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. A detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. The proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
Müller, Hans-Peter; Grön, Georg; Sprengelmeyer, Reiner; Kassubek, Jan; Ludolph, Albert C; Hobbs, Nicola; Cole, James; Roos, Raymund A C; Duerr, Alexandra; Tabrizi, Sarah J; Landwehrmeyer, G Bernhard; Süssmuth, Sigurd D
2013-01-01
Assessment of the feasibility of averaging diffusion tensor imaging (DTI) metrics from MRI data acquired in the course of a multicenter study. Sixty-one early-stage Huntington's disease patients and forty healthy controls were studied using four different MR scanners at four European sites, with acquisition protocols as close as possible to a given standard protocol. The potential and feasibility of averaging data acquired at different sites were evaluated quantitatively by region-of-interest (ROI) based statistical comparisons of coefficients of variation (CV) across centers, as well as by testing for significant group-by-center differences in averaged fractional anisotropy (FA) values between patients and controls. In addition, a whole-brain based statistical between-group comparison was performed using FA maps. The ex post facto statistical evaluation of CV and FA values in a priori defined ROIs showed no differences between sites above chance, indicating that the data were not systematically biased by center-specific factors. Averaging FA maps from DTI data acquired at different study sites and different MR scanner types does not appear to be systematically biased. A suitable recipe for testing the possibility of pooling multicenter DTI data is provided to permit averaging of DTI-derived metrics to differentiate patients from healthy controls at a larger scale.
NASA Astrophysics Data System (ADS)
Grulke, Eric A.; Wu, Xiaochun; Ji, Yinglu; Buhr, Egbert; Yamamoto, Kazuhiro; Song, Nam Woong; Stefaniak, Aleksandr B.; Schwegler-Berry, Diane; Burchett, Woodrow W.; Lambert, Joshua; Stromberg, Arnold J.
2018-04-01
Size and shape distributions of gold nanorod samples are critical to their physico-chemical properties, especially their longitudinal surface plasmon resonance. This interlaboratory comparison study developed methods for measuring and evaluating size and shape distributions for gold nanorod samples using transmission electron microscopy (TEM) images. The objective was to determine whether two different samples, which had different performance attributes in their application, were different with respect to their size and/or shape descriptor distributions. Touching particles in the captured images were identified using a ruggedness shape descriptor. Nanorods could be distinguished from nanocubes using an elongational shape descriptor. A non-parametric statistical test showed that cumulative distributions of an elongational shape descriptor, that is, the aspect ratio, were statistically different between the two samples for all laboratories. While the scale parameters of size and shape distributions were similar for both samples, the width parameters of size and shape distributions were statistically different. This protocol fulfills an important need for a standardized approach to measure gold nanorod size and shape distributions for applications in which quantitative measurements and comparisons are important. Furthermore, the validated protocol workflow can be automated, thus providing consistent and rapid measurements of nanorod size and shape distributions for researchers, regulatory agencies, and industry.
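A two-sample Kolmogorov-Smirnov test is one standard non-parametric comparison of cumulative distributions; the sketch below (synthetic aspect ratios, not the study's measurements) shows the pattern:

```python
# Compare aspect-ratio distributions of two nanorod samples with a KS test
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample_1 = rng.normal(3.8, 0.4, 200)   # aspect ratios, sample 1 (synthetic)
sample_2 = rng.normal(4.1, 0.6, 200)   # aspect ratios, sample 2 (synthetic)

ks_stat, p_value = stats.ks_2samp(sample_1, sample_2)
print(f"KS = {ks_stat:.3f}, p = {p_value:.2e}")  # small p -> distributions differ
```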
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques, which under certain circumstances may contribute a significant amount of bias in quantitative applications. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
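A minimal sketch of the described procedure (ours, not the authors' code): take the pixelwise ratio of the two images, histogram it, and read the scaling factor off the histogram peak contributed by the background:

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=256):
    """Scaling factor from the mode of the pixelwise ratio histogram."""
    ratio = img_a / np.clip(img_b, 1e-6, None)    # avoid division by zero
    counts, edges = np.histogram(ratio, bins=bins)
    peak = np.argmax(counts)                      # background-dominated mode
    return 0.5 * (edges[peak] + edges[peak + 1])  # bin-center scaling factor

rng = np.random.default_rng(1)
base = rng.poisson(50, (128, 128)).astype(float)          # synthetic background
scaled = 1.8 * base + rng.normal(0, 2, (128, 128))        # rescaled, noisy copy
print(background_ratio_scale(scaled, base))               # ~1.8
```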
The Earthquake‐Source Inversion Validation (SIV) Project
Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf
2016-01-01
Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.
Application of pedagogy reflective in statistical methods course and practicum statistical methods
NASA Astrophysics Data System (ADS)
Julie, Hongki
2017-08-01
The subjects Elementary Statistics, Statistical Methods and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive and inferential statistics. Students' understanding of descriptive and inferential statistics is important in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish the relationships between the independent and dependent variables defined in their research. In fact, when students carried out their final projects involving quantitative research, it was not rare to find students making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions. This is a serious mistake for those doing quantitative research. Several things were gained from the implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was an A, earned by 18 students. 3. According to all students, the course allowed them to develop their critical stance. 4. All students agreed that through the learning process they underwent in the course, they could build care for each other.
Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée
2014-01-01
Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862
McMullin, Brian T; Leung, Ming-Ying; Shanbhag, Arun S; McNulty, Donald; Mabrey, Jay D; Agrawal, C Mauli
2006-02-01
A total of 750 images of individual ultra-high molecular weight polyethylene (UHMWPE) particles isolated from periprosthetic failed hip, knee, and shoulder arthroplasties were extracted from archival scanning electron micrographs. Particle size and morphology were subsequently analyzed using computerized image analysis software utilizing five descriptors found in ASTM F1877-98, a standard for quantitative description of wear debris. An online survey application was developed to display particle images, and allowed ten respondents to classify particle morphologies according to commonly used terminology as fibers, flakes, or granules. Particles were categorized based on a simple majority of responses. All descriptors were evaluated using a one-way ANOVA and the Tukey-Kramer test for all-pairs comparison among each class of particles. A logistic regression model using half of the particles included in the survey was then used to develop a mathematical scheme to predict whether a given particle should be classified as a fiber, flake, or granule based on its quantitative measurements. The validity of the model was then assessed using the other half of the survey particles and compared with the human responses. Comparison of the quantitative measurements of isolated particles showed that the morphologies of each particle type classified by respondents were statistically different from one another (p < 0.05). The average agreement between mathematical prediction and human respondents was 83.5% (standard error 0.16%). These data suggest that computerized descriptors can be feasibly correlated with subjective terminology, thus providing a basis for a common vocabulary for particle description which can be translated into quantitative dimensions.
Ning, Lei; Song, Li-Jiang; Fan, Shun-Wu; Zhao, Xing; Chen, Yi-Lei; Li, Zhao-Zhi; Hu, Zi-Ang
2017-10-11
This study established gender-specific reference values for vertebral dimensions in mainland Chinese (MC), which is important for quantitative morphometry in the diagnosis and epidemiological study of osteoporotic vertebral compressive fracture (OVCF). Comparisons of reference values among different racial populations were then performed to demonstrate the MC-specific characteristics. OVCF is a common complication of osteoporosis in the elderly population. Clinical diagnosis and epidemiological study of OVCF often employ quantitative morphometry, which relies heavily on the comparison of patients' vertebral parameters to existing reference values derived from the normal population. Thus, reference values are crucial in clinical diagnosis. To our knowledge, this is the first study to establish reference values of the mainland Chinese for quantitative morphometry. Vertebral heights, including anterior (Ha), middle (Hm) and posterior (Hp) heights and predicted posterior height (pp), from T4 to L5 were obtained, and the ratios Ha/Hp, Hm/Hp and Hp/pp were calculated from 585 MC subjects (both female and male) for establishing reference values and subsequent comparisons with other studies. Vertebral heights increased progressively from T4 to L3 but then decreased at L4 and L5. Both genders showed similar ratios of vertebral dimensions, but male vertebrae were statistically larger than those of females (P < 0.01). The vertebral size of the MC population was smaller than that of the US and UK populations, but was surprisingly larger than that of Hong Kong Chinese, although these two are commonly considered one race. Data from different racial populations showed similar dimensional ratios in all vertebrae. We established gender-specific reference values for MC. Our results also indicate the necessity of establishing reference values that are not only race- and gender-specific, but also population- or region-specific, for accurate quantitative morphometric assessment of OVCF.
Effectiveness of feature and classifier algorithms in character recognition systems
NASA Astrophysics Data System (ADS)
Wilson, Charles L.
1993-04-01
At the first Census Optical Character Recognition Systems Conference, NIST generated accuracy data for numerous character recognition systems. Most systems were tested on the recognition of isolated digits and upper- and lower-case alphabetic characters. The recognition experiments were performed on sample sizes of 58,000 digits and 12,000 upper- and lower-case alphabetic characters. The algorithms used by the 26 conference participants included rule-based methods, image-based methods, statistical methods, and neural networks. The neural network methods included multi-layer perceptrons, learning vector quantization, neocognitrons, and cascaded neural networks. In this paper, 11 different systems are compared using correlations between the answers of different systems, comparing the decrease in error rate as a function of confidence of recognition, and comparing the writer dependence of recognition. This comparison shows that methods that used different algorithms for feature extraction and recognition performed with very high levels of correlation. This is true for neural network systems, hybrid systems, and statistically based systems, and leads to the conclusion that neural networks have not yet demonstrated a clear superiority over more conventional statistical methods. Comparison of these results with the models of Vapnik (for estimation problems), MacKay (for Bayesian statistical models), Moody (for effective parameterization), and Boltzmann models (for information content) demonstrates that as the limits of training data variance are approached, all classifier systems have similar statistical properties. The limiting condition can only be approached for sufficiently rich feature sets because the accuracy limit is controlled by the available information content of the training set, which must pass through the feature extraction process prior to classification.
MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giger, M; Petrick, N; Obuchowski, N
The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 imagery is to estimate the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, thresholding, non-cloudy pixel re-examination, and a cross-band filter method are implemented in sequence to determine the cloud statistics. For post-processing analysis, a box-counting fractal method is implemented. In other words, the cloud statistics are first determined via pre-processing analysis, and the correctness of the cloud statistics across the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracted the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
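Because the choice of thresholding method is central to the ACCA result, a compact sketch of Otsu's method may help. This is a generic implementation run on a synthetic image, not the authors' Formosat-2 pipeline; the image values and the bright "cloud" patch are invented.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray, bins: int = 256) -> float:
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(gray, bins=bins)
    p = hist.astype(float) / hist.sum()          # probability mass per bin
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                            # weight of the "dark" class
    w1 = 1.0 - w0
    mu0 = np.cumsum(p * centers) / np.where(w0 > 0, w0, 1)
    mu_total = (p * centers).sum()
    mu1 = (mu_total - np.cumsum(p * centers)) / np.where(w1 > 0, w1, 1)
    between = w0 * w1 * (mu0 - mu1) ** 2         # between-class variance
    return centers[np.argmax(between)]

# Synthetic "image": dark ground plus a bright cloudy patch.
rng = np.random.default_rng(1)
img = rng.normal(0.2, 0.05, (512, 512))
img[100:200, 150:300] = rng.normal(0.8, 0.05, (100, 150))

t = otsu_threshold(img)
cloud_fraction = (img > t).mean()
print(f"threshold={t:.2f}, cloud cover={100 * cloud_fraction:.1f}%")
```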
ERIC Educational Resources Information Center
Beck, Christopher W.
2018-01-01
Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin
2015-08-01
To determine the utility of the Quantitative Real Time PCR (RQ-PCR) assay in the follow-up of Chronic Myeloid Leukemia (CML) patients. Cross-sectional observational. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH, and RQ-PCR test results from the materials of 177 CML patients selected between 2009 and 2013 were set up for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype, and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis against RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6 - 98% for karyotyping (p=0.118, p > 0.05) and 22.5 - 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than karyotyping, and a strong correlation was found between them (p < 0.001). The RQ-PCR test failure rate did not correlate with the other two tests (p > 0.05); however, the difference between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations requiring karyotype analysis, the RQ-PCR assay can be used alone in the follow-up of CML disease.
NASA Astrophysics Data System (ADS)
Brereton, Carol A.; Johnson, Matthew R.
2012-05-01
Fugitive pollutant sources in the oil and gas industry are typically quite difficult to find within industrial plants and refineries, yet they are a significant contributor to global greenhouse gas emissions. A novel approach for locating fugitive emission sources using computationally efficient trajectory statistical methods (TSMs) was investigated in detailed proof-of-concept simulations. Four TSMs were examined in a variety of source emission scenarios developed using transient CFD simulations on the simplified geometry of an actual gas plant: potential source contribution function (PSCF), concentration weighted trajectory (CWT), residence time weighted concentration (RTWC), and quantitative transport bias analysis (QTBA). Quantitative comparisons were made using a correlation measure based on search area from the source(s). PSCF, CWT, and RTWC could all distinguish areas near major sources from the surroundings. QTBA successfully located sources in only some cases, even when provided with a large data set. RTWC, given sufficient domain trajectory coverage, distinguished source areas best, but otherwise could produce false source predictions. Using RTWC in conjunction with CWT could overcome this issue as well as reduce sensitivity to noise in the data. The results demonstrate that TSMs are a promising approach for identifying fugitive emission sources within complex facility geometries.
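To make the trajectory-statistics idea concrete, here is a minimal concentration weighted trajectory (CWT) sketch on synthetic data. The grid size, trajectory counts, and receptor concentrations are all invented; a real application would use measured concentrations paired with computed back-trajectories.

```python
import numpy as np

# Hypothetical back-trajectories: each is a sequence of (x, y) grid positions
# paired with the pollutant concentration measured at the receptor.
rng = np.random.default_rng(2)
n_traj, n_steps, grid = 200, 50, 20
trajs = rng.integers(0, grid, size=(n_traj, n_steps, 2))
conc = rng.lognormal(mean=0.0, sigma=1.0, size=n_traj)

# CWT: each grid cell's value is the residence-weighted mean of the receptor
# concentrations of all trajectories that passed through it.
num = np.zeros((grid, grid))
den = np.zeros((grid, grid))
for c, traj in zip(conc, trajs):
    for x, y in traj:
        num[x, y] += c      # concentration weighted by residence (one step each)
        den[x, y] += 1.0
cwt = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
print("cell with highest implied source strength:",
      np.unravel_index(cwt.argmax(), cwt.shape))
```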
Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun
2018-05-01
Thermographic inspection has been widely applied to non-destructive testing and evaluation, with the capability of rapid, contactless, large-surface-area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high level of performance, specific physics-based models that describe defect generation and enable precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns via an unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in the laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold and render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.
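The F-score used for objective evaluation is straightforward to compute from a predicted mask and a ground-truth mask; a minimal sketch follows. The toy crack geometry is invented, and the function is a generic F-measure rather than the authors' exact scoring code.

```python
import numpy as np

def f_score(pred: np.ndarray, truth: np.ndarray, beta: float = 1.0) -> float:
    """F-measure between a binary segmentation mask and ground truth."""
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Toy crack mask vs. a slightly dilated prediction.
truth = np.zeros((64, 64), bool); truth[30:34, 5:60] = True
pred = np.zeros_like(truth);      pred[29:35, 5:60] = True
print(f"F1 = {f_score(pred, truth):.3f}")
```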
Thompson, Holly M; Minamimoto, Ryogo; Jamali, Mehran; Barkhodari, Amir; von Eyben, Rie; Iagaru, Andrei
2016-07-01
As quantitative (18)F-FDG PET numbers and the pooling of results from different PET/CT scanners become more influential in the management of patients, it becomes imperative that we interrogate differences between scanners to fully understand the degree of scanner bias on the statistical power of studies. Participants with body mass index (BMI) greater than 25, scheduled on a time-of-flight (TOF)-capable PET/CT scanner, had a consecutive scan on a non-TOF-capable PET/CT scanner and vice versa. SUVmean in various tissues and SUVmax of malignant lesions were measured from both scans, matched to each subject. Data were analyzed using a mixed-effects model, and statistical significance was determined using equivalence testing, with P < 0.05 being significant. Equivalence was established in all baseline organs, except the cerebellum, matched per patient between scanner types. Mixed-effects analysis of lesions, repeated between scan types and matched per patient, demonstrated good concordance between scanner types. Patients could be scanned on either a TOF- or non-TOF-capable PET/CT scanner without clinical compromise to quantitative SUV measurements.
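Equivalence testing reverses the usual burden of proof: rather than failing to find a difference, one must actively demonstrate the difference lies within a margin. A minimal paired two one-sided tests (TOST) sketch is below; the SUV values, sample size, and ±0.2 equivalence margin are invented, and the study itself used a mixed-effects model rather than this simple paired form.

```python
import numpy as np
from scipy import stats

def tost_paired(x: np.ndarray, y: np.ndarray, margin: float, alpha: float = 0.05):
    """Two one-sided tests (TOST) for equivalence of paired measurements."""
    d = x - y
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    t_low = (d.mean() + margin) / se     # H0: mean difference <= -margin
    t_high = (d.mean() - margin) / se    # H0: mean difference >= +margin
    p_low = 1 - stats.t.cdf(t_low, n - 1)
    p_high = stats.t.cdf(t_high, n - 1)
    p = max(p_low, p_high)               # both one-sided tests must reject
    return p, p < alpha

# Hypothetical paired liver SUVmean from TOF and non-TOF scans of 30 patients.
rng = np.random.default_rng(3)
suv_tof = rng.normal(2.2, 0.3, 30)
suv_nontof = suv_tof + rng.normal(0.02, 0.1, 30)
p, equivalent = tost_paired(suv_tof, suv_nontof, margin=0.2)
print(f"TOST p = {p:.4f}, equivalent at 5% level: {equivalent}")
```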
Sempa, Joseph B; Ujeneza, Eva L; Nieuwoudt, Martin
2017-01-01
In Sub-Saharan African (SSA) resource-limited settings, Cluster of Differentiation 4 (CD4) counts continue to be used for clinical decision making in antiretroviral therapy (ART). Here, HIV-infected people often remain with CD4 counts <350 cells/μL even after 5 years of viral load suppression, so ongoing immunological monitoring is necessary. Due to varying statistical modeling methods, comparing immune response to ART across different cohorts is difficult. We systematically review such models and detail the similarities, differences and problems. 'Preferred Reporting Items for Systematic Review and Meta-Analyses' guidelines were used. Only studies of immune response after ART initiation in adults from SSA were included. Data were extracted from each study and tabulated. Outcomes were categorized into 3 groups: 'slope', 'survival', and 'asymptote' models. Wordclouds were drawn wherein the frequency of variables occurring in the reviewed models is indicated by their size and color. 69 covariates were identified in the final models of 35 studies. Effect sizes of covariates were not directly quantitatively comparable in view of the combination of differing variables and scale transformation methods across models. Wordclouds enabled the identification of qualitative and semi-quantitative covariate sets for each outcome category. Comparison across categories identified sex, baseline age, baseline log viral load, baseline CD4, ART initiation regimen and ART duration as a minimal consensus set. Most models differed with respect to covariates included, variable transformations and scales, model assumptions, modelling strategies and reporting methods, even for the same outcomes. To enable comparison across cohorts, statistical models would benefit from the application of more uniform modelling techniques. Historic efforts have produced results that are anecdotal to individual cohorts only. This study was able to define 'prior' knowledge in the Bayesian sense; such information has value for prospective modelling efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevcik, R. S.; Hyman, D. A.; Basumallich, L.
2013-01-01
A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.
Determination of precipitation profiles from airborne passive microwave radiometric measurements
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Hakkarinen, Ida M.; Pierce, Harold F.; Weinman, James A.
1991-01-01
This study presents the first quantitative retrievals of vertical profiles of precipitation derived from multispectral passive microwave radiometry. Measurements of microwave brightness temperature (Tb) obtained by a NASA high-altitude research aircraft are related to profiles of rainfall rate through a multichannel piecewise-linear statistical regression procedure. Statistics for Tb are obtained from a set of cloud radiative models representing a wide variety of convective, stratiform, and anvil structures. The retrieval scheme itself determines which cloud model best fits the observed meteorological conditions. Retrieved rainfall rate profiles are converted to equivalent radar reflectivity for comparison with observed reflectivities from a ground-based research radar. Results for two case studies, a stratiform rain situation and an intense convective thunderstorm, show that the radiometrically derived profiles capture the major features of the observed vertical structure of hydrometeor density.
Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie
2015-01-01
To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by the semi-quantitative index of tumor to non-tumor (T/N) ratio, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher: 87.5%, 82.9%, and 85.4%, respectively, with an area under the curve of 0.891. The results of the present study suggest that semi-quantitative and visual analysis gave statistically similar results, while semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. When the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
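Deriving an optimal cut-off such as the T/N ratio of 2.01 from an ROC analysis is often done by maximizing Youden's J. The sketch below shows the idea on simulated scores; the lesion counts match the abstract, but the score distributions are invented.

```python
import numpy as np

def best_cutoff(scores: np.ndarray, labels: np.ndarray):
    """Scan candidate cut-offs and return the one maximizing
    Youden's J = sensitivity + specificity - 1."""
    best = (None, -1.0, 0.0, 0.0)
    for c in np.unique(scores):
        pred = scores >= c
        sens = pred[labels == 1].mean()     # fraction of malignant called positive
        spec = (~pred[labels == 0]).mean()  # fraction of benign called negative
        j = sens + spec - 1
        if j > best[1]:
            best = (c, j, sens, spec)
    return best

# Hypothetical T/N ratios: 48 malignant vs. 41 benign lesions, as in the abstract.
rng = np.random.default_rng(4)
scores = np.concatenate([rng.normal(2.6, 0.5, 48), rng.normal(1.8, 0.4, 41)])
labels = np.concatenate([np.ones(48, int), np.zeros(41, int)])
c, j, sens, spec = best_cutoff(scores, labels)
print(f"cut-off={c:.2f}, sensitivity={sens:.1%}, specificity={spec:.1%}")
```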
Selecting the most appropriate inferential statistical test for your quantitative research study.
Bettany-Saltikov, Josette; Whittaker, Victoria Jane
2014-06-01
To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
Analysis of longitudinal "time series" data in toxicology.
Cox, C; Cory-Slechta, D A
1987-02-01
Studies focusing on chronic toxicity or on the time course of toxicant effect often involve repeated measurements or longitudinal observations of endpoints of interest. Experimental design considerations frequently necessitate between-group comparisons of the resulting trends. Typically, procedures such as the repeated-measures analysis of variance have been used for statistical analysis, even though the required assumptions may not be satisfied in some circumstances. This paper describes an alternative analytical approach which summarizes curvilinear trends by fitting cubic orthogonal polynomials to individual profiles of effect. The resulting regression coefficients serve as quantitative descriptors which can be subjected to group significance testing. Randomization tests based on medians are proposed to provide a comparison of treatment and control groups. Examples from the behavioral toxicology literature are considered, and the results are compared to more traditional approaches, such as repeated-measures analysis of variance.
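A compact sketch of the analysis style described: fit a cubic polynomial to each subject's longitudinal profile, then compare one coefficient between groups with a randomization test on medians. The data are simulated, and numpy's ordinary least-squares polynomial fit stands in for true orthogonal polynomials (the parameterization differs but the fitted curves are the same).

```python
import numpy as np

rng = np.random.default_rng(5)
times = np.arange(8, dtype=float)            # repeated measurement occasions

# Hypothetical longitudinal profiles: 12 control and 12 treated animals.
control = 10 + 0.5 * times + rng.normal(0, 1, (12, 8))
treated = 10 + 0.5 * times - 0.05 * times**2 + rng.normal(0, 1, (12, 8))

# Cubic fit per animal; coefficients summarize each curvilinear trend.
coef = lambda y: np.polynomial.polynomial.polyfit(times, y, 3)
c_ctrl = np.array([coef(y) for y in control])   # rows: animals, cols: 4 coefs
c_trt = np.array([coef(y) for y in treated])

# Randomization test on the quadratic coefficient: difference of group medians
# compared against its permutation distribution.
obs = np.median(c_trt[:, 2]) - np.median(c_ctrl[:, 2])
pooled = np.concatenate([c_trt[:, 2], c_ctrl[:, 2]])
count = 0
for _ in range(10_000):
    rng.shuffle(pooled)
    d = np.median(pooled[:12]) - np.median(pooled[12:])
    count += abs(d) >= abs(obs)
print(f"observed median difference = {obs:.3f}, permutation p = {count / 10_000:.4f}")
```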
Shahlaei, Mohsen; Sabet, Razieh; Ziari, Maryam Bahman; Moeinifard, Behzad; Fassihi, Afshin; Karbakhsh, Reza
2010-10-01
Quantitative relationships between molecular structure and the methionine aminopeptidase-2 inhibitory activity of a series of cytotoxic anthranilic acid sulfonamide derivatives were discovered. We demonstrate the detailed application of two efficient nonlinear methods for evaluating quantitative structure-activity relationships of the studied compounds. Components produced by principal component analysis were used as inputs to the developed nonlinear models. The performance of the developed models, namely PC-GRNN and PC-LS-SVM, was tested by several validation methods. The resulting PC-LS-SVM model had high statistical quality (R(2)=0.91 and R(CV)(2)=0.81) for predicting the cytotoxic activity of the compounds. Comparison between the predictability of PC-GRNN and PC-LS-SVM indicates that the latter method has a higher ability to predict the activity of the studied molecules. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
Mothering occupations when parenting children with feeding concerns: a mixed methods study.
Winston, Kristin A; Dunbar, Sandra B; Reed, Carol N; Francis-Connolly, Elizabeth
2010-06-01
The occupations of mothering have gained attention in occupation-based research and literature; however, many aspects of mothering remain unexplored. The purpose of this study was to gain insight into mothers' perceptions of their occupations when mothering a child with feeding difficulties. The study design used mixed methodology utilizing the Parental Stress Scale (PSS), the Life Satisfaction Index for Parents (LSI-P), and phenomenological interviews. Comparison of the datasets illuminated the quantitative findings with the words of the women interviewed. Although there was only one statistically significant finding in the quantitative data, concerning satisfaction with leisure and recreation, the qualitative data provided rich descriptions of mothers' perceptions of stress and life satisfaction. Mixed methods data analysis revealed the complex nature of the interaction between mothering occupations and mothering a child with feeding concerns, as well as how these concerns might influence occupational therapy practice.
CSGRqtl: A Comparative Quantitative Trait Locus Database for Saccharinae Grasses.
Zhang, Dong; Paterson, Andrew H
2017-01-01
Conventional biparental quantitative trait locus (QTL) mapping has led to some successes in the identification of causal genes in many organisms. QTL likelihood intervals not only provide "prior information" for finer-resolution approaches such as GWAS but also provide better statistical power than GWAS to detect variants with low/rare frequency in a natural population. Here, we describe a new element of an ongoing effort to provide online resources to facilitate study and improvement of the important Saccharinae clade. The primary goal of this new resource is the anchoring of published QTLs for this clade to the Sorghum genome. Genetic map alignments translate a wealth of genomic information from sorghum to Saccharum spp., Miscanthus spp., and other taxa. In addition, genome alignments facilitate comparison of the Saccharinae QTL sets to those of other taxa that enjoy comparable resources, exemplified herein by rice.
Müller, Hans-Peter; Grön, Georg; Sprengelmeyer, Reiner; Kassubek, Jan; Ludolph, Albert C.; Hobbs, Nicola; Cole, James; Roos, Raymund A.C.; Duerr, Alexandra; Tabrizi, Sarah J.; Landwehrmeyer, G. Bernhard; Süssmuth, Sigurd D.
2013-01-01
Purpose: To assess the feasibility of averaging diffusion tensor imaging (DTI) metrics from MRI data acquired in the course of a multicenter study. Materials and methods: Sixty-one early stage Huntington's disease patients and forty healthy controls were studied using four different MR scanners at four European sites with acquisition protocols as close as possible to a given standard protocol. The potential and feasibility of averaging data acquired at different sites were evaluated quantitatively by region-of-interest (ROI) based statistical comparisons of coefficients of variation (CV) across centers, as well as by testing for significant group-by-center differences in averaged fractional anisotropy (FA) values between patients and controls. In addition, a whole-brain based statistical between-group comparison was performed using FA maps. Results: The ex post facto statistical evaluation of CV and FA values in a priori defined ROIs showed no differences between sites above chance, indicating that data were not systematically biased by center-specific factors. Conclusion: Averaging FA maps from DTI data acquired at different study sites and different MR scanner types does not appear to be systematically biased. A suitable recipe for testing the possibility of pooling multicenter DTI data is provided to permit averaging of DTI-derived metrics to differentiate patients from healthy controls at a larger scale. PMID:24179771
Shang, Xiaoyan; Carlson, Michelle C; Tang, Xiaoying
2018-04-30
Total intracranial volume (TIV) is often used as a measure of brain size to correct for individual variability in magnetic resonance imaging (MRI) based morphometric studies. An adjustment of TIV can greatly increase the statistical power of brain morphometry methods. As such, an accurate and precise TIV estimation is of great importance in MRI studies. In this paper, we compared three automated TIV estimation methods (multi-atlas likelihood fusion (MALF), Statistical Parametric Mapping 8 (SPM8) and FreeSurfer (FS)) using longitudinal T1-weighted MR images in a cohort of 70 older participants at elevated sociodemographic risk for Alzheimer's disease. Statistical group comparisons in terms of four different metrics were performed. Furthermore, sex, education level, and intervention status were investigated separately for their impacts on the TIV estimation performance of each method. According to our experimental results, MALF was the least susceptible to atrophy, while SPM8 and FS suffered a loss in precision. In group-wise analysis, MALF was the least sensitive method to group variation, whereas SPM8 was particularly sensitive to sex and FS was unstable with respect to education level. In terms of effectiveness, both MALF and SPM8 delivered a user-friendly performance, while FS was relatively computationally intensive. Copyright © 2018 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Chamberlain, John Martyn; Hillier, John; Signoretta, Paola
2015-01-01
This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…
Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing
Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak
2012-01-01
This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
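The core of the D-spacing measurement, recovering a periodic banding distance from the dominant spatial frequency of an image, can be sketched briefly. This is a simplified one-dimensional stand-in for the paper's 2D FFT analysis, run on a synthetic image whose 67 nm period and noise level are invented.

```python
import numpy as np

# Synthetic AFM-like image of a fibril with 67 nm axial banding:
# 512 px spanning 1,340 nm along the fibril axis.
n, extent_nm = 512, 1340.0
x_nm = np.linspace(0, extent_nm, n, endpoint=False)
row = 1 + 0.5 * np.sin(2 * np.pi * x_nm / 67.0)
img = np.tile(row, (n, 1)) + np.random.default_rng(6).normal(0, 0.1, (n, n))

# Collapse perpendicular to the axis, then take a 1D power spectrum; the
# banding produces a peak whose spatial frequency gives the D-spacing.
power = np.abs(np.fft.rfft(img.mean(axis=0)))
freqs = np.fft.rfftfreq(n, d=extent_nm / n)     # cycles per nm
peak = freqs[1 + np.argmax(power[1:])]          # skip the DC component
print(f"estimated D-spacing: {1 / peak:.1f} nm")
```

Repeating this per fibril yields the D-spacing distribution the authors emphasize, rather than a single mean value.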
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Apical extrusion of debris and irrigant using hand and rotary systems: A comparative study
Ghivari, Sheetal B; Kubasad, Girish C; Chandak, Manoj G; Akarte, NR
2011-01-01
Aim: To evaluate and compare the amount of debris and irrigant extruded quantitatively by using two hand and rotary nickel–titanium (Ni–Ti) instrumentation techniques. Materials and Methods: Eighty freshly extracted mandibular premolars having similar canal length and curvature were selected and mounted in a debris collection apparatus. After each instrument change, 1 ml of distilled water was used as an irrigant and the amount of irrigant extruded was measured using the Meyers and Montgomery method. After drying, the debris was weighed using an electronic microbalance to determine its weight. Statistical analysis used: The data were analyzed statistically to determine the mean difference between the groups. The mean weight of the dry debris and irrigant within each group and between the groups was calculated by one-way ANOVA and the multiple comparison (Dunnett D) test. Results: The step-back technique extruded a greater quantity of debris and irrigant in comparison to the other hand and rotary Ni–Ti systems. Conclusions: Since all instrumentation techniques extrude debris and irrigant, it is prudent on the part of the clinician to select the instrumentation technique that extrudes the least amount of debris and irrigant, to prevent flare-up phenomena. PMID:21814364
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pezner, R.D.; Patterson, M.P.; Hill, L.R.
Breast Retraction Assessment (BRA) is an objective evaluation of the amount of cosmetic retraction of the treated breast in comparison to the untreated breast in patients who receive conservative treatment for breast cancer. A clear acrylic sheet supported vertically and marked as a grid at 1 cm intervals is employed to perform the measurements. Average BRA value in 29 control patients without breast cancer was 1.2 cm. Average BRA value in 27 patients treated conservatively for clinical Stage I or II unilateral breast cancer was 3.7 cm. BRA values in breast cancer patients ranged from 0.0 to 8.5 cm. Patients who received a local radiation boost to the primary tumor bed site had statistically significantly less retraction than those who did not receive a boost. Patients who had an extensive primary tumor resection had statistically significantly more retraction than those who underwent a more limited resection. In comparison to qualitative forms of cosmetic analysis, BRA is an objective test that can quantitatively evaluate factors which may be related to cosmetic retraction in patients treated conservatively for breast cancer.
Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T
The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P < 0.05). Adaptive statistical iterative reconstruction-V 90% showed superior LCD and had the highest CNR in the liver, aorta, and pancreas, measuring 7.32 ± 3.22, 11.60 ± 4.25, and 4.60 ± 2.31, respectively, compared with the next best series of ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P < 0.0001). Veo 3.0 and ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.
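The CNR metric used throughout this comparison is simple to compute from ROI statistics; a minimal sketch follows. The HU means, noise levels, and ROI choices below are invented to illustrate why a lower-noise reconstruction yields a higher CNR, not the study's actual measurements.

```python
import numpy as np

def cnr(roi: np.ndarray, background: np.ndarray) -> float:
    """Contrast-to-noise ratio: ROI/background HU difference over background noise."""
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

# Hypothetical HU samples from liver and background ROIs on two reconstructions
# of the same scan; the iterative reconstruction is simulated as less noisy.
rng = np.random.default_rng(7)
liver_fbp, bg_fbp = rng.normal(60, 14, 500), rng.normal(45, 14, 500)
liver_v90, bg_v90 = rng.normal(60, 6, 500), rng.normal(45, 6, 500)
print(f"CNR FBP       = {cnr(liver_fbp, bg_fbp):.2f}")
print(f"CNR ASIR-V 90 = {cnr(liver_v90, bg_v90):.2f}")
```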
Shi, Ximin; Li, Nan; Ding, Haiyan; Dang, Yonghong; Hu, Guilan; Liu, Shuai; Cui, Jie; Zhang, Yue; Li, Fang; Zhang, Hui; Huo, Li
2018-01-01
Kinetic modeling of dynamic (11)C-acetate PET imaging provides quantitative information for myocardium assessment. The quality and quantitation of PET images are known to be dependent on PET reconstruction methods. This study aims to investigate the impacts of reconstruction algorithms on the quantitative analysis of dynamic (11)C-acetate cardiac PET imaging. Suspected alcoholic cardiomyopathy patients (N = 24) underwent (11)C-acetate dynamic PET imaging after a low dose CT scan. PET images were reconstructed using four algorithms: filtered backprojection (FBP), ordered subsets expectation maximization (OSEM), OSEM with time-of-flight (TOF), and OSEM with both time-of-flight and point-spread-function (TPSF). Standardized uptake values (SUVs) at different time points were compared among images reconstructed using the four algorithms. Time-activity curves (TACs) in the myocardium and ventricular blood pools were generated from the dynamic image series. Kinetic parameters K1 and k2 were derived using a 1-tissue-compartment model for kinetic modeling of cardiac flow from (11)C-acetate PET images. Significant image quality improvement was found in the images reconstructed using the iterative OSEM-type algorithms (OSEM, TOF, and TPSF) compared with FBP. However, no statistical differences in SUVs were observed among the four reconstruction methods at the selected time points. Kinetic parameters K1 and k2 also exhibited no statistical difference among the four reconstruction algorithms in terms of mean value and standard deviation. However, in the correlation analysis, OSEM reconstruction presented relatively higher residuals in correlation with FBP reconstruction compared with TOF and TPSF reconstruction, and TOF and TPSF reconstructions were highly correlated with each other. All the tested reconstruction algorithms performed similarly for quantitative analysis of (11)C-acetate cardiac PET imaging. TOF and TPSF yielded highly consistent kinetic parameter results with superior image quality compared with FBP. OSEM was relatively less reliable. Both TOF and TPSF are recommended for cardiac (11)C-acetate kinetic analysis.
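The 1-tissue-compartment model behind K1 and k2 solves dCt/dt = K1·Cp(t) − k2·Ct(t), i.e. the tissue curve is the plasma input convolved with K1·exp(−k2·t). The sketch below simulates and refits such a curve; the analytic input function, frame timing, and parameter values are invented, not the study's acquisition protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 10, 121)                 # minutes, frame mid-times (assumed)
cp = 50 * t * np.exp(-1.5 * t)              # assumed analytic plasma input Cp(t)

def one_tissue(t, K1, k2):
    """C_t(t) = K1 * exp(-k2 t) convolved with the plasma input C_p(t)."""
    dt = t[1] - t[0]
    return K1 * np.convolve(cp, np.exp(-k2 * t))[: len(t)] * dt

# Simulate a noisy myocardial time-activity curve, then recover K1 and k2.
rng = np.random.default_rng(8)
tac = one_tissue(t, K1=0.8, k2=0.15) + rng.normal(0, 0.2, len(t))
(K1, k2), _ = curve_fit(one_tissue, t, tac, p0=[0.5, 0.1])
print(f"K1 = {K1:.3f} mL/min/g, k2 = {k2:.3f} 1/min")
```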
Wirtzfeld, Lauren A; Ghoshal, Goutam; Rosado-Mendez, Ivan M; Nam, Kibo; Park, Yeonjoo; Pawlicki, Alexander D; Miller, Rita J; Simpson, Douglas G; Zagzebski, James A; Oelze, Michael L; Hall, Timothy J; O'Brien, William D
2015-08-01
Quantitative ultrasound estimates such as the frequency-dependent backscatter coefficient (BSC) have the potential to enhance noninvasive tissue characterization and to identify tumors better than traditional B-mode imaging. Thus, investigating system independence of BSC estimates from multiple imaging platforms is important for assessing their capabilities to detect tissue differences. Mouse and rat mammary tumor models, 4T1 and MAT, respectively, were used in a comparative experiment using 3 imaging systems (Siemens, Ultrasonix, and VisualSonics) with 5 different transducers covering a range of ultrasonic frequencies. Functional analysis of variance of the MAT and 4T1 BSC-versus-frequency curves revealed statistically significant differences between the two tumor types. Variations also were found among results from different transducers, attributable to frequency range effects. At 3 to 8 MHz, tumor BSC functions using different systems showed no differences between tumor type, but at 10 to 20 MHz, there were differences between 4T1 and MAT tumors. Fitting an average spline model to the combined BSC estimates (3-22 MHz) demonstrated that the BSC differences between tumors increased with increasing frequency, with the greatest separation above 15 MHz. Confining the analysis to larger tumors resulted in better discrimination over a wider bandwidth. Confining the comparison to higher ultrasonic frequencies or larger tumor sizes allowed for separation of BSC-versus-frequency curves from 4T1 and MAT tumors. These constraints ensure that a greater fraction of the backscattered signals originated from within the tumor, thus demonstrating that statistically significant tumor differences were detected. © 2015 by the American Institute of Ultrasound in Medicine.
NASA Astrophysics Data System (ADS)
Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.
2017-12-01
The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades' worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also serve as a model-to-model intercomparison tool, simply by swapping in outputs from another model as the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
FAMILY ANALYSIS OF IMMUNOGLOBULIN CLASSES AND SUBCLASSES IN CHILDREN WITH AUTISTIC DISORDER
Spiroski, Mirko; Trajkovski, Vladimir; Trajkov, Dejan; Petlichkovski, Aleksandar; Efinska-Mladenovska, Olivija; Hristomanova, Slavica; Djulejic, Eli; Paneva, Meri; Bozhikov, Jadranka
2009-01-01
Autistic disorder is a severe neurodevelopment disorder characterized by a triad of impairments in reciprocal social interaction, verbal and nonverbal communication, and a pattern of repetitive stereotyped activities, behaviours and interests. There are strong lines of evidence to suggest that the immune system plays an important role in the pathogenesis of autistic disorder. The aim of this study was to analyze quantitative plasma concentrations of immunoglobulin classes and subclasses in autistic patients and their families. The investigation was performed retrospectively in 50 persons with autistic disorder in the Republic of Macedonia. Infantile autistic disorder was diagnosed by DSM-IV and ICD-10 criteria. Plasma immunoglobulin classes (IgM, IgA, and IgG) and subclasses (IgG1, IgG2, IgG3, and IgG4) were determined using a Nephelometer Analyzer BN-100. Multiple comparisons for the IgA variable showed statistically significant differences between three pairs: male autistic patients and their fathers (p = 0.001), female autistic patients and their mothers (p = 0.008), and healthy sisters and their fathers (p = 0.011). Statistically significant differences found between the three groups regarding autistic disorder (person with autistic disorder, father/mother of a person with autistic disorder, and brother/sister), independent of sex, belong to the IgA, IgG2, and IgG3 variables. Multiple comparisons for the IgA variable also showed statistically significant differences between children with autistic disorder and their fathers and mothers (p < 0.001), and between healthy brothers and sisters and their fathers and mothers (p < 0.001). Comparisons between healthy children and children with autistic disorder from the same family should be tested for immunoglobulin classes and subclasses in order to avoid differences between generations. PMID:20001993
ERIC Educational Resources Information Center
Owens, Susan T.
2017-01-01
Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…
ERIC Educational Resources Information Center
Hardy, Melissa
2005-01-01
This article presents a response to Timothy Patrick Moran's article "The Sociology of Teaching Graduate Statistics." In his essay, Moran argues that exciting developments in techniques of quantitative analysis are currently coupled with a much less exciting formulaic approach to teaching sociology graduate students about quantitative analysis. The…
On Quantitative Comparative Research in Communication and Language Evolution
Oller, D. Kimbrough; Griebel, Ulrike
2014-01-01
Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057
Nodal portraits of quantum billiards: Domains, lines, and statistics
NASA Astrophysics Data System (ADS)
Jain, Sudhir Ranjan; Samajdar, Rhine
2017-10-01
This is a comprehensive review of the nodal domains and lines of quantum billiards, emphasizing a quantitative comparison of theoretical findings to experiments. The nodal statistics are shown to distinguish not only between regular and chaotic classical dynamics but also between different geometric shapes of the billiard system itself. How a random superposition of plane waves can model chaotic eigenfunctions is discussed and the connections of the complex morphology of the nodal lines thereof to percolation theory and Schramm-Loewner evolution are highlighted. Various approaches to counting the nodal domains—using trace formulas, graph theory, and difference equations—are also illustrated with examples. The nodal patterns addressed pertain to waves on vibrating plates and membranes, acoustic and electromagnetic modes, wave functions of a "particle in a box" as well as to percolating clusters, and domains in ferromagnets, thus underlining the diversity and far-reaching implications of the problem.
Global estimates of shark catches using trade records from commercial markets.
Clarke, Shelley C; McAllister, Murdoch K; Milner-Gulland, E J; Kirkwood, G P; Michielsens, Catherine G J; Agnew, David J; Pikitch, Ellen K; Nakano, Hideki; Shivji, Mahmood S
2006-10-01
Despite growing concerns about overexploitation of sharks, lack of accurate, species-specific harvest data often hampers quantitative stock assessment. In such cases, trade studies can provide insights into exploitation unavailable from traditional monitoring. We applied Bayesian statistical methods to trade data in combination with genetic identification to estimate by species, the annual number of globally traded shark fins, the most commercially valuable product from a group of species often unrecorded in harvest statistics. Our results provide the first fishery-independent estimate of the scale of shark catches worldwide and indicate that shark biomass in the fin trade is three to four times higher than shark catch figures reported in the only global data base. Comparison of our estimates to approximated stock assessment reference points for one of the most commonly traded species, blue shark, suggests that current trade volumes in numbers of sharks are close to or possibly exceeding the maximum sustainable yield levels.
NASA Astrophysics Data System (ADS)
Beam, Margery Elizabeth
The combination of increasing enrollment and the importance of providing transfer students a solid foundation in science calls for science faculty to evaluate teaching methods in rural community colleges. The purpose of this study was to examine and compare the effectiveness of two teaching methods, inquiry teaching methods and didactic teaching methods, applied in a rural community college earth science course. Two groups of students were taught the same content via inquiry and didactic teaching methods. Analysis of quantitative data included a non-parametric ranking statistical testing method in which the difference between the rankings and the median of the post-test scores was analyzed for significance. Results indicated there was not a significant statistical difference between the teaching methods for the group of students participating in the research. The practical and educational significance of this study provides valuable perspectives on teaching methods and student learning styles in rural community colleges.
NASA Technical Reports Server (NTRS)
Green, Del L.; Walker, Eric L.; Everhart, Joel L.
2006-01-01
Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses a maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
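The estimation idea, Poisson-distributed colony counts whose mean follows a linear-quadratic survival model, can be sketched in a few lines. This is a generic illustration and not RAD-ADAPT itself (which is built on R and ADAPT); the dose levels, plating numbers, and colony counts below are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Hypothetical clonogenic assay data: dose (Gy), cells plated, colonies counted.
dose = np.array([0, 1, 2, 4, 6, 8], float)
plated = np.array([100, 200, 400, 1000, 4000, 20000], float)
counts = np.array([55, 80, 95, 90, 110, 60], float)

def neg_loglik(params):
    """Poisson negative log-likelihood under the linear-quadratic model:
    expected colonies = plated * PE * exp(-(alpha*D + beta*D^2))."""
    log_pe, alpha, beta = params
    mu = plated * np.exp(log_pe) * np.exp(-(alpha * dose + beta * dose**2))
    return np.sum(mu - counts * np.log(mu) + gammaln(counts + 1))

fit = minimize(neg_loglik, x0=[np.log(0.5), 0.3, 0.03], method="Nelder-Mead")
log_pe, alpha, beta = fit.x
print(f"plating efficiency = {np.exp(log_pe):.3f}, alpha = {alpha:.3f} /Gy, "
      f"beta = {beta:.4f} /Gy^2")
```

Fitting each treatment group this way and comparing the resulting alpha and beta estimates mirrors the between-group comparisons the program automates.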
Närhi, Mikko; Wetzel, Benjamin; Billet, Cyril; Toenger, Shanti; Sylvestre, Thibaut; Merolla, Jean-Marc; Morandotti, Roberto; Dias, Frederic; Genty, Goëry; Dudley, John M.
2016-01-01
Modulation instability is a fundamental process of nonlinear science, leading to the unstable breakup of a constant amplitude solution of a physical system. There has been particular interest in studying modulation instability in the cubic nonlinear Schrödinger equation, a generic model for a host of nonlinear systems including superfluids, fibre optics, plasmas and Bose–Einstein condensates. Modulation instability is also a significant area of study in the context of understanding the emergence of high amplitude events that satisfy rogue wave statistical criteria. Here, exploiting advances in ultrafast optical metrology, we perform real-time measurements in an optical fibre system of the unstable breakup of a continuous wave field, simultaneously characterizing emergent modulation instability breather pulses and their associated statistics. Our results allow quantitative comparison between experiment, modelling and theory, and are expected to open new perspectives on studies of instability dynamics in physics. PMID:27991513
SDAR 1.0: A New Quantitative Toolkit to Analyze Stratigraphic Data
NASA Astrophysics Data System (ADS)
Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos
2015-04-01
Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences and about ancient sedimentary environmental and paleobiological dynamics. Despite tremendous advances in the way geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot readily be used for quantitative analysis, so any attempt to examine the stratigraphic data in an analytical fashion necessarily requires further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses that help users quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale). Finally, because the SDAR analysis module is written in the open-source, high-level R graphics/statistics language [R Development Core Team, 2014], it already provides many of the features required to accomplish both basic and complex statistical analyses (the R language offers more than a hundred spatial libraries that allow users to explore geostatistics and spatial analysis). Consequently, SDAR allows a deeper exploration of stratigraphic data collected in the field and will allow the geoscientific community in the near future to develop complex analyses of the distribution of rock sequences in space and time, such as lithofacies correlations based on multivariate comparison between empirical SCs and quantitative lithofacies models established from modern sedimentary environments.
Lazar, Cosmin; Gatto, Laurent; Ferro, Myriam; Bruley, Christophe; Burger, Thomas
2016-04-01
Missing values are a genuine issue in label-free quantitative proteomics. Recent works have surveyed the different statistical methods used to conduct imputation, compared them on real or simulated data sets, and recommended a list of missing value imputation methods for proteomics applications. Although insightful, these comparisons do not account for two important facts: (i) depending on the proteomics data set, the missingness mechanism may be of a different nature, and (ii) each imputation method is devoted to a specific type of missingness mechanism. As a result, we believe that the question at stake is not to find the most accurate imputation method in general, but instead the most appropriate one. We describe a series of comparisons that support our view: for instance, we show that a supposedly "under-performing" method (i.e., giving baseline average results), if applied at the "appropriate" time in the data-processing pipeline (before or after peptide aggregation) to a data set with the "appropriate" nature of missing values, can outperform a blindly applied, supposedly "better-performing" method (i.e., the reference method from the state of the art). This leads us to formulate a few practical guidelines regarding the choice and application of an imputation method in a proteomics context.
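The point about matching method to mechanism can be illustrated with a small simulation; the sketch below is not the authors' code, and the missingness rates and imputation rules are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
true = rng.normal(loc=25, scale=2, size=(1000, 6))     # "true" log2 intensities

def imputation_rmse(data, mask, method):
    x = data.copy()
    x[mask] = np.nan
    if method == "mean":                 # a rule suited to values missing at random
        fill = np.nanmean(x, axis=0)
    else:                                # "min": a left-censored rule suited to MNAR
        fill = np.nanmin(x, axis=0)
    x = np.where(np.isnan(x), fill, x)
    return np.sqrt(np.mean((x[mask] - data[mask]) ** 2))

mcar = rng.random(true.shape) < 0.2                    # missing completely at random
mnar = true < np.quantile(true, 0.2)                   # low values missing (censored)

for mech_name, mask in (("MCAR", mcar), ("MNAR", mnar)):
    for method in ("mean", "min"):
        print(mech_name, method, round(imputation_rmse(true, mask, method), 2))
```

Running this shows mean imputation winning under MCAR and the left-censored rule winning under MNAR, which is exactly the mismatch the abstract warns about.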
Quantitative framework for prospective motion correction evaluation.
Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert
2016-02-01
To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while taking into account motion variability between scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance of comparisons between marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.
Impact of pediatric burn camps on participants' self esteem and body image: an empirical study.
Bakker, Anne; Van der Heijden, Peter G M; Van Son, Maarten J M; Van de Schoot, Rens; Van Loey, Nancy E E
2011-12-01
This study focuses on possible effects of specialized summer camps on young burn survivors' self esteem and body image. Quantitative as well as qualitative measures were used. To study possible effects, a pretest-posttest comparison group design with a follow-up was employed. Self-report questionnaires were used to measure self esteem and body image in a burn camp group (n=83, 8-18 years) and in a comparison group of children with burns who did not attend a burn camp during the course of the study (n=90, 8-18 years). Additionally, burn camp participants and parents completed an evaluation form about benefits derived from burn camp. A small positive short-term effect of burn camp participation was found on the 'satisfaction with appearance' component of body image. Overall, participants and parents showed high appreciation of the burn camps and reported several benefits, particularly concerning meeting other young burn survivors. Albeit statistically modest, this is the first quantitative study to document a significant short-term impact of burn camp on young burn survivors' body image. Implications of this result for future research and burn camp organization are discussed, including the strengths of residential camps for young burn survivors. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.
NASA Astrophysics Data System (ADS)
Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo
2014-05-01
This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer), which analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds have been reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches remains a relatively undebated research topic.
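A minimal sketch of the SIGMA idea as described here (multiples of the standard deviation of cumulative rainfall as thresholds) applied to a synthetic record; the window lengths and sigma multiples are illustrative assumptions, not the model's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(1)
daily_rain = rng.gamma(shape=0.4, scale=8.0, size=5000)   # synthetic daily record, mm

for window in (1, 3, 7):                  # cumulative rainfall over n-day windows
    cum = np.convolve(daily_rain, np.ones(window), mode="valid")
    mu, sigma = cum.mean(), cum.std()
    for k in (1.5, 2.0, 3.0):             # sigma multiples used as thresholds
        exceeded = np.mean(cum > mu + k * sigma)
        print(f"{window}-day total > mu + {k}*sigma on {100 * exceeded:.2f}% of days")
```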
Nadin-Davis, Susan; Knowles, Margaret K; Burke, Teresa; Böse, Reinhard; Devenish, John
2015-07-01
A quantitative real-time polymerase chain reaction method (qPCR) was developed and tested for the detection of Taylorella equigenitalis. It was shown to have an analytical sensitivity of 5 colony-forming units (CFU) of T. equigenitalis when applied to the testing of culture swabs that mimicked field samples, and a high analytical specificity in not reacting to 8 other commensal bacterial species associated with horses. As designed, it could also differentiate specifically between T. equigenitalis and T. asinigenitalis. The qPCR was compared to standard culture in a study that included 45 swab samples from 6 horses (1 stallion, 5 mares) naturally infected with T. equigenitalis in Canada, 39 swab samples from 5 naturally infected stallions in Germany, and 311 swab samples from 87 culture negative horses in Canada. When the comparison was conducted on an individual sample swab basis, the qPCR had a statistical sensitivity and specificity of 100% and 96.4%, respectively, and 100% and 99.1% when the comparison was conducted on a sample set basis. A comparison was also made on 203 sample swabs from the 5 German stallions taken over a span of 4 to 9 mo following antibiotic treatment. The qPCR was found to be highly sensitive and at least as good as culture in detecting the presence of T. equigenitalis in post-treatment samples. The work demonstrates that the qPCR assay described here can potentially be used to detect the presence of T. equigenitalis directly from submitted sample swabs taken from infected horses and also for determining T. equigenitalis freedom following treatment.
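As a worked check of the per-swab figures, hypothetical confusion-matrix counts chosen to approximate the reported percentages (the paper's actual tabulation may differ) reproduce the sensitivity and specificity:

```python
def sens_spec(tp, fn, tn, fp):
    # sensitivity: detected positives / all true positives
    # specificity: negatives called negative / all true negatives
    return tp / (tp + fn), tn / (tn + fp)

# all 84 positive swabs detected; 11 false positives among 311 negative swabs
sensitivity, specificity = sens_spec(tp=84, fn=0, tn=300, fp=11)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
# -> 100.0%, ~96.5% (close to the reported 96.4%; exact counts are assumptions)
```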
NASA Technical Reports Server (NTRS)
Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.
1985-01-01
Nonparametric statistical methods are used to carry out a quantitative comparison of the model and the experimental data. The same methods make it possible to select the events initiated by heavy nuclei and to calculate the proportion of the corresponding events. For this purpose it is necessary to have data on artificial events that describe the experiment sufficiently well. At present, the model with small scaling violation in the fragmentation region is the closest to the experiments. Therefore, the treatment of gamma families obtained in the 'Pamir' experiment is currently being carried out with the application of these models.
Analysis of conditional genetic effects and variance components in developmental genetics.
Zhu, J
1995-12-01
A genetic model with additive-dominance effects and genotype x environment interactions is presented for quantitative traits with time-dependent measures. The genetic model for phenotypic means at time t conditional on phenotypic means measured at previous time (t-1) is defined. Statistical methods are proposed for analyzing conditional genetic effects and conditional genetic variance components. Conditional variances can be estimated by minimum norm quadratic unbiased estimation (MINQUE) method. An adjusted unbiased prediction (AUP) procedure is suggested for predicting conditional genetic effects. A worked example from cotton fruiting data is given for comparison of unconditional and conditional genetic variances and additive effects.
Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.
Coleman, Priscilla K
2011-09-01
Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent more accurately the published literature and to provide clarity to clinicians. To measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877,181 participants (163,831 experienced an abortion). Random effects pooled odds ratios were computed using adjusted odds ratios from the original studies and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
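The two computations named here, a random-effects (DerSimonian-Laird) pooled odds ratio and a PAR derived from it, can be sketched as follows. The study-level ORs and standard errors are invented; only the exposure prevalence is taken from the review, and the PAR formula treats the OR as an approximation of relative risk.

```python
import numpy as np

log_or = np.log(np.array([1.5, 2.1, 1.2, 1.9]))  # invented study-level adjusted ORs
se = np.array([0.20, 0.25, 0.15, 0.30])          # invented standard errors

w = 1 / se ** 2                                  # fixed-effect weights
theta_fe = np.sum(w * log_or) / w.sum()
q = np.sum(w * (log_or - theta_fe) ** 2)         # Cochran's Q
tau2 = max(0.0, (q - (len(w) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
w_re = 1 / (se ** 2 + tau2)                      # random-effects weights
pooled_or = np.exp(np.sum(w_re * log_or) / w_re.sum())

p_exposed = 163831 / 877181                      # exposure prevalence from the review
# Levin's formula, using the pooled OR as an approximation of relative risk
par = p_exposed * (pooled_or - 1) / (1 + p_exposed * (pooled_or - 1))
print(f"pooled OR = {pooled_or:.2f}, PAR = {100 * par:.1f}%")
```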
Bogema, D. R.; Deutscher, A. T.; Fell, S.; Collins, D.; Eamens, G. J.
2015-01-01
Theileria orientalis is an emerging pathogen of cattle in Asia, Australia, and New Zealand. This organism is a vector-borne hemoprotozoan that causes clinical disease characterized by anemia, abortion, and death, as well as persistent subclinical infections. Molecular methods of diagnosis are preferred due to their sensitivity and utility in differentiating between pathogenic and apathogenic genotypes. Conventional PCR (cPCR) assays for T. orientalis detection and typing are laborious and do not provide an estimate of parasite load. Current real-time PCR assays cannot differentiate between clinically relevant and benign genotypes or are only semiquantitative without a defined clinical threshold. Here, we developed and validated a hydrolysis probe quantitative PCR (qPCR) assay which universally detects and quantifies T. orientalis and identifies the clinically associated Ikeda and Chitose genotypes (UIC assay). Comparison of the UIC assay results with previously validated universal and genotype-specific cPCR results demonstrated that qPCR detects and differentiates T. orientalis with high sensitivity and specificity. Comparison of the quantitative results with percent parasitemia determined via blood film analysis, and with packed cell volume (PCV), revealed significant positive and negative correlations, respectively. One-way analysis of variance (ANOVA) indicated that blood samples from animals with clinical signs of disease contained statistically higher concentrations of T. orientalis DNA than those from animals with subclinical infections. We propose clinical thresholds to assist in classifying high-, moderate-, and low-level infections and describe how parasite load and the presence of the Ikeda and Chitose genotypes relate to disease. PMID:25588653
NASA Technical Reports Server (NTRS)
Cummins, Kenneth L.; Carey, Lawrence D.; Schultz, Christopher J.; Bateman, Monte G.; Cecil, Daniel J.; Rudlosky, Scott D.; Petersen, Walter Arthur; Blakeslee, Richard J.; Goodman, Steven J.
2011-01-01
In order to produce useful proxy data for the GOES-R Geostationary Lightning Mapper (GLM) in regions not covered by VLF lightning mapping systems, we intend to employ data produced by ground-based (regional or global) VLF/LF lightning detection networks. Before using these data in GLM Risk Reduction tasks, it is necessary to have a quantitative understanding of the performance of these networks, in terms of CG flash/stroke DE, cloud flash/pulse DE, location accuracy, and CLD/CG classification error. This information is being obtained through inter-comparison with LMAs and well-quantified VLF/LF lightning networks. One of our approaches is to compare "bulk" counting statistics on the spatial scale of convective cells, in order to both quantify relative performance and observe variations in cell-based temporal trends provided by each network. In addition, we are using microsecond-level stroke/pulse time correlation to facilitate detailed inter-comparisons at a more-fundamental level. The current development status of our ground-based inter-comparison and evaluation tools will be presented, and performance metrics will be discussed through a comparison of Vaisala's Global Lightning Dataset (GLD360) with the NLDN at locations within and outside the U.S.
ERIC Educational Resources Information Center
Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan
2017-01-01
Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research method and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…
Baka, Zeliha Muge; Akin, Mehmet; Ucar, Faruk Izzet; Ileri, Zehra
2015-01-01
The aims of this study were to quantitatively evaluate the changes in arch widths and buccolingual inclinations of the posterior teeth after asymmetric rapid maxillary expansion (ARME) and to compare the measurements between the crossbite and the noncrossbite sides with cone-beam computed tomography (CBCT). From our clinic archives, we selected the CBCT records of 30 patients with unilateral skeletal crossbite (13 boys, 14.2 ± 1.3 years old; 17 girls, 13.8 ± 1.3 years old) who underwent ARME treatment. A modified acrylic bonded rapid maxillary expansion appliance including an occlusal locking mechanism was used in all patients. CBCT records had been taken before ARME treatment and after a 3-month retention period. Fourteen angular and 80 linear measurements were taken for the maxilla and the mandible. Frontally clipped CBCT images were used for the evaluation. Paired-sample and independent-sample t tests were used for statistical comparisons. Comparisons of the before-treatment and after-retention measurements showed that the arch widths and buccolingual inclinations of the posterior teeth increased significantly on the crossbite side of the maxilla and on the noncrossbite side of the mandible (P < 0.05). Comparison of the 2 sides showed statistically significant differences in both the maxilla and the mandible (P < 0.05). After ARME treatment, the crossbite side of the maxilla and the noncrossbite side of the mandible were more affected than were the opposite sides. Copyright © 2015. Published by Elsevier Inc.
Gender in the allocation of organs in kidney transplants: meta-analysis
Santiago, Erika Vieira Almeida e; Silveira, Micheline Rosa; de Araújo, Vânia Eloisa; Farah, Katia de Paula; Acurcio, Francisco de Assis; Ceccato, Maria das Graças Braga
2015-01-01
OBJECTIVE To analyze whether gender influences the survival of kidney transplant grafts and patients. METHODS Systematic review with meta-analysis of cohort studies available in the Medline (PubMed), LILACS, CENTRAL, and Embase databases, supplemented by manual searching and the grey literature. The selection of studies and the collection of data were conducted in duplicate by independent reviewers, and disagreements were settled by a third reviewer. Graft and patient survival rates were evaluated as effectiveness measurements. Meta-analysis was conducted with the Review Manager® 5.2 software, applying a random effects model. Recipient, donor, and donor-recipient gender comparisons were evaluated. RESULTS: Twenty-nine studies involving 765,753 patients were included. Regarding graft survival, grafts from male donors were observed to have longer survival than those from female donors, but only over a 10-year follow-up period. Comparison between recipient genders showed no significant differences over any of the evaluated follow-up periods. In the evaluation of donor-recipient gender combinations, male donor-male recipient transplants were favored in a statistically significant way. No statistically significant differences in patient survival were observed for gender comparisons over any of the follow-up periods evaluated. CONCLUSIONS The quantitative analysis of the studies suggests that donor or recipient gender, when evaluated in isolation, does not influence patient or graft survival rates. However, the combination of donor and recipient genders may be a determining factor for graft survival. PMID:26465666
Instrumental and statistical methods for the comparison of class evidence
NASA Astrophysics Data System (ADS)
Liszewski, Elisa Anne
Trace evidence is a major field within forensic science. Association of trace evidence samples can be problematic due to sample heterogeneity and a lack of quantitative criteria for comparing spectra or chromatograms. The aim of this study is to evaluate different types of instrumentation for their ability to discriminate among samples of various types of trace evidence. Chemometric analysis, including techniques such as Agglomerative Hierarchical Clustering, Principal Components Analysis, and Discriminant Analysis, was employed to evaluate the instrumental data. First, automotive clear coats were analyzed using microspectrophotometry to collect UV absorption data. In total, 71 samples were analyzed, with a classification accuracy of 91.61%. An external validation was performed, resulting in a prediction accuracy of 81.11%. Next, fiber dyes were analyzed using UV-visible microspectrophotometry. While several physical characteristics of cotton fiber can be identified and compared, fiber color is considered to be an excellent source of variation and thus was examined in this study. Twelve dyes were employed, some being visually indistinguishable. Several different analyses and comparisons were done, including an inter-laboratory comparison and external validations. Lastly, common plastic samples and other polymers were analyzed using pyrolysis-gas chromatography/mass spectrometry, and their pyrolysis products were examined using multivariate statistics. The classification accuracy varied depending on the number of classes chosen, but the plastics were grouped based on composition. The polymers were used as an external validation; misclassifications occurred with chlorinated samples, all of which were placed into the category containing PVC.
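A typical chemometric pipeline of the kind described, PCA for dimension reduction followed by discriminant analysis with cross-validated accuracy, can be sketched on synthetic spectra as follows (this is an illustration, not the thesis code):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
wavelengths = np.linspace(200, 400, 201)

def spectrum(center):
    # one synthetic absorption band plus instrument noise
    peak = np.exp(-0.5 * ((wavelengths - center) / 15) ** 2)
    return peak + rng.normal(scale=0.02, size=wavelengths.size)

X = np.array([spectrum(c) for c in [260] * 30 + [275] * 30 + [310] * 30])
y = np.array([0] * 30 + [1] * 30 + [2] * 30)     # three "evidence classes"

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
print(f"cross-validated accuracy: {cross_val_score(model, X, y, cv=5).mean():.2%}")
```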
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
Infrared thermography quantitative image processing
NASA Astrophysics Data System (ADS)
Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB
2017-11-01
Infrared thermography is an imaging technique that provides a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although a thermographic image can give a fairly good visual estimation of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI), and a number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices evaluate the temperature distribution pattern of the back trunk against that expected in subjects who are healthy with regard to spinal problems.
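A sketch of this kind of index computation, first-order statistics for two symmetric ROIs plus a simple asymmetry measure, is shown below; the ROI positions and index definition are illustrative assumptions rather than the paper's exact formulas.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
thermogram = rng.normal(33.0, 0.4, size=(240, 320))   # synthetic surface temps, C
thermogram[:, 160:] += 0.6                            # warmer right half

left = thermogram[60:180, 40:140].ravel()             # two symmetric back ROIs
right = thermogram[60:180, 180:280].ravel()

for name, roi in (("left", left), ("right", right)):
    print(name, f"mean={roi.mean():.2f}", f"sd={roi.std():.2f}",
          f"skew={stats.skew(roi):.2f}", f"kurtosis={stats.kurtosis(roi):.2f}")

print(f"mean-temperature asymmetry: {abs(left.mean() - right.mean()):.2f} C")
```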
Detailed Quantitative Classifications of Galaxy Morphology
NASA Astrophysics Data System (ADS)
Nair, Preethi
2018-01-01
Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in a galaxy’s detailed morphology. The bulge-to-total ratio of galaxies, the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning. They cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies. However, large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys, like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will still remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.
NASA Astrophysics Data System (ADS)
Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich
2017-06-01
Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording proximal to more distal carbonate ramp environments of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) applied to point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed-nature allochem percentages and their centred log-ratio (clr) coordinates were used. Both approaches allow lowstand, transgressive, and highstand systems tracts to be distinguished within the Praha Formation, which shows a gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of the deep-water environment below the storm wave base. The quantitative compositional data also indicate progradational-retrogradational trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends in the clr data are more sensitive to subtle changes in allochem composition than the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
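The clr transform at the heart of the second approach is straightforward to compute; a minimal sketch with hypothetical point-count percentages follows.

```python
import numpy as np

def clr(composition):
    # zero counts must be replaced (e.g., by a small detection-limit value)
    # before taking logs
    x = np.asarray(composition, dtype=float)
    x = x / x.sum(axis=1, keepdims=True)                    # close to proportions
    g = np.exp(np.mean(np.log(x), axis=1, keepdims=True))   # geometric mean per sample
    return np.log(x / g)

# hypothetical point-count percentages: crinoids, dacryoconarids, other
samples = [[70, 10, 20],
           [30, 50, 20],
           [10, 80, 10]]
print(clr(samples).round(2))
```

Because clr coordinates are free of the constant-sum constraint, downstream statistics such as PCA respond to relative changes rather than to the closure artifact.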
NASA Astrophysics Data System (ADS)
Emam, Aml A.; Abdelaleem, Eglal A.; Naguib, Ibrahim A.; Abdallah, Fatma F.; Ali, Nouruddin W.
2018-03-01
Furosemide and spironolactone are commonly prescribed antihypertensive drugs. Canrenone is the main degradation product and main metabolite of spironolactone. Ratio subtraction and extended ratio subtraction spectrophotometric methods were previously applicable only to the quantitation of binary mixtures. An extension of these methods, successive ratio subtraction, is introduced in the present work for the quantitative determination of ternary mixtures, exemplified by furosemide, spironolactone, and canrenone. Manipulating the ratio spectra of the ternary mixture allowed their determination at 273.6 nm, 285 nm, and 240 nm, in the concentration ranges of 2-16 μg mL^-1, 4-32 μg mL^-1, and 1-18 μg mL^-1 for furosemide, spironolactone, and canrenone, respectively. Method specificity was confirmed by application to laboratory-prepared mixtures, and the method was shown to be accurate and precise. The developed method was validated with respect to ICH guidelines, and its validity was further confirmed by application to the pharmaceutical formulation. Statistical comparison between the obtained results and those of a reported HPLC method, using Student's t-test and the F-ratio test, showed no significant difference.
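The basic ratio-subtraction step that the successive method extends can be sketched on synthetic spectra as follows; the Gaussian bands and concentrations are invented, and the real method operates on measured ratio spectra.

```python
import numpy as np

wl = np.linspace(220, 320, 501)

def band(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

X, Y = band(240, 8), band(290, 12)     # pure-component spectra (invented)
mixture = 0.6 * X + 1.4 * Y            # hypothetical binary mixture

ratio = mixture / Y                    # = 0.6*X/Y + 1.4
plateau = ratio[wl > 305].mean()       # region where only Y absorbs
recovered = (ratio - plateau) * Y      # recovers 0.6*X

print("recovered concentration factor:", round(recovered.max() / X.max(), 3))
```

Successive ratio subtraction repeats this divide-subtract-multiply cycle with a second divisor spectrum to peel the third component out of a ternary mixture.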
Laban, Mohamed; Ibrahim, Eman Abdel-Salam; Elsafty, Mohammed Saeed Eldin; Hassanin, Alaa Sayed
2014-10-01
Placenta accreta is a general term that describes a placenta abnormally adherent to the uterine wall; when the chorionic villi invade the myometrium, the term placenta increta is appropriate. It is an increasingly frequent cause of maternal and fetal morbidity and mortality. The aim of this research was to evaluate the density of decidual natural killer cells (dNK, CD56+bright) in the decidua basalis of patients with placenta accreta. We recruited 76 patients from Ain Shams Maternity Hospital between June 2012 and August 2013. They were divided into study subgroup (A), which included 10 patients who underwent cesarean hysterectomy due to unseparated placenta accreta; study subgroup (B), which included 16 patients with separated placenta accreta; a comparison group, which included 25 patients with placenta previa; and a control group, which included 25 patients with a normally situated placenta. All patients underwent elective cesarean delivery, and decidual biopsies were taken during the operation. Immunohistochemical staining for dNK (CD56+bright) cells and semiquantitative scoring were performed. One-way ANOVA and Fisher exact tests were used for statistical correlation. The mean dNK cell scores were 0.4±0.5, 1.9±1, 3.3±0.5, and 3.5±0.5 for study subgroups (A) and (B), the comparison group, and the control group, respectively, with a highly significant statistical difference (P<0.001). There was a significant statistical difference between study subgroups (A) and (B) (P=0.002), but no significant correlation between dNK scores and the number of previous uterine scars (P=0.46). These findings suggest that a low dNK score is associated with cases of morbidly adherent placenta accreta. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Optimization of Statistical Methods Impact on Quantitative Proteomics Data.
Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L
2015-10-02
As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application.
McClelland, G R; Sutton, J A
1986-01-01
Eight healthy male volunteers participated in a single-blind, random allocation, crossover comparison of intravenous metoclopramide (10 mg), the peripherally acting gastrointestinal stimulant BRL 20627 (10 mg), and saline. Central nervous system effects were assessed by quantitative electroencephalography (EEG) and by visual analogue scales. Gastric motility and emptying were assessed by epigastric impedance. Metoclopramide increased the EEG amplitude by 10.4% (a statistically significant effect, P < 0.05) and increased frequencies above 22 Hz, whereas both BRL 20627 and placebo had only minor effects on the EEG frequencies and slightly decreased the EEG amplitude. Ratings on visual analogue scales showed that metoclopramide caused statistically significant restlessness (P < 0.01 vs. placebo) and a slight but significant decrease in feelings of happiness (P < 0.05 vs. placebo). Epigastric impedance changes indicated that both metoclopramide and BRL 20627 increased gastric contractile activity, but the rate of gastric emptying was not significantly altered by either drug, although it tended to be shortened following metoclopramide but not BRL 20627 treatment. It is concluded that, since published animal data show that BRL 20627 has only weak dopamine antagonistic properties, this study further implicates dopamine receptor blockade in the akathisia but not in the gastric effects of metoclopramide. PMID:3755051
Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C
2017-06-01
The morphologies contained in 3D third harmonic generation (THG) images of human brain tissue can report on the pathological state of the tissue. However, the complexity of THG brain images makes the use of modern image processing tools, especially those for image filtering, segmentation, and validation, to extract this information challenging. We developed a salient edge-enhancing model of anisotropic diffusion for image filtering, based on higher-order statistics. We split the intrinsic 3-phase segmentation problem into two 2-phase segmentation problems, each of which we solved with a dedicated model, an active contour weighted by prior extreme. We applied the proposed algorithms to THG images of structurally normal ex-vivo human brain tissue, revealing key tissue components (brain cells, microvessels, and neuropil) and enabling statistical characterization of these components. Comprehensive comparison to manually delineated ground truth validated the proposed algorithms. Quantitative comparison to second harmonic generation/auto-fluorescence images, acquired simultaneously from the same tissue area, confirmed the correctness of the main THG features detected. The software and test datasets are available from the authors (z.zhang@vu.nl). Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
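The authors' filter builds on anisotropic diffusion; the classic Perona-Malik scheme below is a baseline sketch of that idea (their salient edge-enhancing, higher-order-statistics model is more elaborate than this).

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # conduction: ~0 across strong edges
    for _ in range(n_iter):
        # finite differences toward the four neighbours (np.roll wraps the
        # image edges, which is acceptable for this sketch)
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(4)
noisy = rng.normal(0.5, 0.1, (64, 64))
noisy[:, 32:] += 0.3                          # a step edge that should survive
smoothed = perona_malik(noisy)
print(noisy.std(), smoothed.std())            # noise is reduced, edge preserved
```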
Enzinger, Ewald; Morrison, Geoffrey Stewart; Ochoa, Felipe
2016-01-01
The new paradigm for the evaluation of the strength of forensic evidence includes: the use of the likelihood-ratio framework; the use of relevant data, quantitative measurements, and statistical models; empirical testing of validity and reliability under conditions reflecting those of the case under investigation; and transparency as to decisions made and procedures employed. The present paper illustrates the use of the new paradigm to evaluate strength of evidence under conditions reflecting those of a real forensic-voice-comparison case. The offender recording was from a landline telephone system, had background office noise, and was saved in a compressed format. The suspect recording included substantial reverberation and ventilation system noise, and was saved in a different compressed format. The present paper includes descriptions of the selection of the relevant hypotheses, sampling of data from the relevant population, simulation of suspect and offender recording conditions, and acoustic measurement and statistical modelling procedures. The present paper also explores the use of different techniques to compensate for the mismatch in recording conditions. It also examines how system performance would have differed had the suspect recording been of better quality. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements
NASA Astrophysics Data System (ADS)
Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.
2017-12-01
Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
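As a highly simplified illustration of the simulation machinery described, the sketch below evolves a 1-D radial-diffusion equation with an assumed power-law D_LL; the actual study uses 3-D diffusion with event-specific, wave-derived coefficients, so every number here is an assumption.

```python
import numpy as np

L = np.linspace(3.0, 6.5, 71)                 # L-shell grid
dL = L[1] - L[0]
f = np.exp(-((L - 4.0) ** 2) / 0.5)           # initial phase-space density
Lm = 0.5 * (L[:-1] + L[1:])                   # cell interfaces
Dm = 1e-3 * (Lm / 4.0) ** 6                   # assumed D_LL power law, per day

dt = 0.4 * dL ** 2 / Dm.max()                 # explicit stability limit
for _ in range(int(5.0 / dt)):                # evolve for about 5 days
    flux = Dm / Lm ** 2 * np.diff(f) / dL     # D_LL L^-2 df/dL at interfaces
    f[1:-1] += dt * L[1:-1] ** 2 * np.diff(flux) / dL   # endpoints held fixed
print(f"density peak now near L = {L[np.argmax(f)]:.2f}")
```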
The occurrence and distribution of trace metals in the Mississippi River and its tributaries
Taylor, Howard E.; Garbarino, J.R.; Brinton, T.I.
1990-01-01
Quantitative and semiquantitative analyses of dissolved trace metals are reported for designated sampling sites on the Mississippi River and its main tributaries utilizing depth-integrated and width-integrated sampling technology to collect statistically representative samples. Data are reported for three sampling periods, including: July-August 1987, November-December 1987, and May-June 1988. Concentrations of Al, As, Ba, Be, Cd, Co, Cr, Cu, Fe, Li, Mn, Mo, Pb, Sr, Tl, U, V, and Zn are reported quantitatively, with the remainder of the stable metals in the periodic table reported semiquantitatively. Correlations between As and V, Ba and U, Cu and Zn, Li and Ba, and Li and U are significant at the 99% confidence level for each of the sampling trips. Comparison of the results of this study for selected metals with other published data show generally good agreement for Cr, Cu, Fe, and Zn, moderate agreement for Mo, and poor agreement for Cd and V.
Monitoring Peptidase Activities in Complex Proteomes by MALDI-TOF Mass Spectrometry
Villanueva, Josep; Nazarian, Arpi; Lawlor, Kevin; Tempst, Paul
2009-01-01
Measuring enzymatic activities in biological fluids is a form of activity-based proteomics and may be utilized as a means of developing disease biomarkers. Activity-based assays allow amplification of output signals, thus potentially visualizing low-abundance enzymes against a virtually transparent whole-proteome background. The protocol presented here describes a semi-quantitative in vitro assay of proteolytic activities in complex proteomes by monitoring the breakdown of designer peptide substrates using robotic extraction and a MALDI-TOF mass spectrometric read-out. Relative quantitation of the peptide metabolites is done by comparison with spiked internal standards, followed by statistical analysis of the resulting mini-peptidome. Partial automation provides the reproducibility and throughput essential for comparing large sample sets. The approach may be employed for diagnostic or predictive purposes and enables profiling of 96 samples in 30 hours. It could be tailored to many diagnostic and pharmacodynamic purposes, as a read-out of catalytic and metabolic activities in body fluids or tissues. PMID:19617888
Quantitative analysis of single-molecule force spectroscopy on folded chromatin fibers
Meng, He; Andresen, Kurt; van Noort, John
2015-01-01
Single-molecule techniques allow for picoNewton manipulation and nanometer accuracy measurements of single chromatin fibers. However, the complexity of the data, the heterogeneity of the composition of individual fibers and the relatively large fluctuations in extension of the fibers complicate a structural interpretation of such force-extension curves. Here we introduce a statistical mechanics model that quantitatively describes the extension of individual fibers in response to force on a per nucleosome basis. Four nucleosome conformations can be distinguished when pulling a chromatin fiber apart. A novel, transient conformation is introduced that coexists with single wrapped nucleosomes between 3 and 7 pN. Comparison of force-extension curves between single nucleosomes and chromatin fibers shows that embedding nucleosomes in a fiber stabilizes the nucleosome by 10 kBT. Chromatin fibers with 20- and 50-bp linker DNA follow a different unfolding pathway. These results have implications for accessibility of DNA in fully folded and partially unwrapped chromatin fibers and are vital for understanding force unfolding experiments on nucleosome arrays. PMID:25779043
Ma, Ruoshui; Zhang, Xiumei; Wang, Yi; Zhang, Xiao
2018-04-27
The heterogeneous and complex structural characteristics of lignin present a significant challenge in predicting its processability (e.g., depolymerization, modification) into valuable products. This study provides a detailed characterization and comparison of the structural properties of seven representative biorefinery lignin samples derived from forest and agricultural residues that were subjected to representative pretreatment methods. A range of wet chemistry and spectroscopy methods were applied to determine specific lignin structural characteristics such as functional groups, inter-unit linkages, and peak molecular weight. In parallel, oxidative depolymerization of these lignin samples to either monomeric phenolic compounds or dicarboxylic acids was conducted, and the product yields were quantified. Based on these results (lignin structural characteristics and monomer yields), we demonstrate for the first time the application of a multiple-variable linear estimation (MVLE) approach, using R statistics, to gain insight into a quantitative correlation between lignin structural properties and conversion reactivity toward oxidative depolymerization to monomers. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
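The MVLE step can be sketched as an ordinary least-squares regression of monomer yield on structural descriptors; the study used R, and the Python version below, with invented descriptor values and yields, is illustrative only.

```python
import numpy as np

# columns: beta-O-4 linkages (%), phenolic OH (mmol/g), peak MW (kDa); invented
X = np.array([[60, 1.2, 4.1],
              [45, 2.0, 6.3],
              [38, 2.4, 7.0],
              [55, 1.5, 5.2],
              [30, 2.8, 8.5],
              [50, 1.8, 5.9],
              [42, 2.2, 6.6]], dtype=float)
y = np.array([18.0, 11.5, 9.0, 15.2, 6.8, 13.1, 10.4])  # monomer yield (%), invented

A = np.column_stack([np.ones(len(y)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", coef.round(3), " R^2 =", round(r2, 3))
```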
Bruining, Nico; Tanimoto, Shuzou; Otsuka, Masato; Weustink, Annick; Ligthart, Jurgen; de Winter, Sebastiaan; van Mieghem, Carlos; Nieman, Koen; de Feyter, Pim J; van Domburg, Ron T; Serruys, Patrick W
2008-08-01
To investigate whether three-dimensional (3D) quantitative techniques are comparable to each other, to explore possible differences with respect to the reference method of 2D-QCA in the acute phase, and to study whether non-invasive MSCT could potentially be applied to quantify the luminal dimensions of a coronary segment stented with a novel bioabsorbable drug-eluting stent made of poly-L-lactic acid (PLLA). Quantitative imaging data from 16 patients enrolled at our institution in a first-in-man trial (ABSORB), who received a biodegradable stent and were imaged with standard coronary angiography and intravascular ultrasound, were compared. Shortly after stenting, the patients also underwent an MSCT procedure. Standard 2D-QCA showed significantly smaller stent lengths (p < 0.01). Although the stent diameters and areas measured by 2D-QCA tended to be smaller, the differences were not statistically significant when compared with the 3D-based quantitative modalities. Measurements of the implanted PLLA stents made by non-invasive QMSCT-CA were comparable to the other 3D modalities, with no significant differences. Three-dimensional quantitative analyses gave results similar to 2D-QCA in quantifying luminal dimensions during an evaluation of a new bioabsorbable coronary stent design in the acute phase. Furthermore, in biodegradable stents made of PLLA, non-invasive QMSCT-CA can be used to quantify luminal dimensions.
Panzer, Stephanie; Mc Coy, Mark R; Hitzl, Wolfgang; Piombino-Mascali, Dario; Jankauskas, Rimantas; Zink, Albert R; Augat, Peter
2015-01-01
The purpose of this study was to develop a checklist for standardized assessment of soft tissue preservation in human mummies based on whole-body computed tomography examinations, and to add a scoring system to facilitate quantitative comparison of mummies. Computed tomography examinations of 23 mummies from the Capuchin Catacombs of Palermo, Sicily (17 adults, 6 children; 17 anthropogenically and 6 naturally mummified) and 7 mummies from the crypt of the Dominican Church of the Holy Spirit of Vilnius, Lithuania (5 adults, 2 children; all naturally mummified) were used to develop the checklist following previously published guidelines. The scoring system was developed by assigning equal scores for checkpoints with equivalent quality. The checklist was evaluated by intra- and inter-observer reliability. The finalized checklist was applied to compare the groups of anthropogenically and naturally mummified bodies. The finalized checklist contains 97 checkpoints and was divided into two main categories, "A. Soft Tissues of Head and Musculoskeletal System" and "B. Organs and Organ Systems", each including various subcategories. The complete checklist had an intra-observer reliability of 98% and an inter-observer reliability of 93%. Statistical comparison revealed significantly higher values in anthropogenically compared to naturally mummified bodies for the total score and for three subcategories. In conclusion, the developed checklist allows for a standardized assessment and documentation of soft tissue preservation in whole-body computed tomography examinations of human mummies. The scoring system facilitates a quantitative comparison of the soft tissue preservation status between single mummies or mummy collections.
Less label, more free: approaches in label-free quantitative mass spectrometry.
Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A
2011-02-01
In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
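As a concrete example of the spectral-counting side of such workflows, one widely used normalization, the normalized spectral abundance factor (NSAF), divides each protein's spectral count by its length and normalizes per sample; the counts below are invented.

```python
import numpy as np

counts = np.array([120.0, 30.0, 8.0])      # spectral counts per protein, one sample
lengths = np.array([450.0, 210.0, 380.0])  # protein lengths in residues

saf = counts / lengths                     # spectral abundance factor
nsaf = saf / saf.sum()                     # normalized to sum to 1 per sample
print(nsaf.round(3))
```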
Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret
2006-04-01
This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992 to 2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior, and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. Of the 1,025 articles, 591 (57.7%) used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1%, and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.
What Good Are Statistics that Don't Generalize?
ERIC Educational Resources Information Center
Shaffer, David Williamson; Serlin, Ronald C.
2004-01-01
Quantitative and qualitative inquiry are sometimes portrayed as distinct and incompatible paradigms for research in education. Approaches to combining qualitative and quantitative research typically "integrate" the two methods by letting them co-exist independently within a single research study. Here we describe intra-sample statistical analysis…
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
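The "traditional" calibration that the Bayesian mixed model is compared against can be sketched as a straight-line fit of assay signal versus log10 density on standards, inverted to estimate density in unknowns; all measurements below are simulated.

```python
import numpy as np

rng = np.random.default_rng(5)
log_density = np.repeat(np.arange(0, 6), 3).astype(float)  # standards, log10 per mL
signal = 2.0 + 1.5 * log_density + rng.normal(0, 0.25, log_density.size)

slope, intercept = np.polyfit(log_density, signal, 1)      # calibration line

def estimate_density(measured_signal):
    # invert the calibration line to recover pathogen density
    return 10 ** ((measured_signal - intercept) / slope)

print(f"signal 8.0 -> about {estimate_density(8.0):.2e} per mL")
```

Because this inversion ignores inter-assay variability, replica assays can yield visibly different density estimates for the same sample, which is the weakness the Bayesian mixed model addresses.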
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
Developing Sampling Frame for Case Study: Challenges and Conditions
ERIC Educational Resources Information Center
Ishak, Noriah Mohd; Abu Bakar, Abu Yazid
2014-01-01
Because quantitative studies rely on inferential statistical analysis, the issue of random sampling is pertinent to any quantitative study. Unlike quantitative studies, the elimination of inferential statistical analysis allows qualitative researchers to be more creative in dealing with sampling issues. Since results from a qualitative study cannot be generalized to the bigger population,…
NASA Astrophysics Data System (ADS)
Mayes, R.; Lyford, M. E.; Myers, J. D.
2009-12-01
The Quantitative Reasoning in STEM (QR STEM) project is a state-level Mathematics and Science Partnership (MSP) project with a focus on the mathematics and statistics that underlie the understanding of complex global scientific issues. This session is a companion session to the QR STEM: The Science presentation. The focus of this session is the quantitative reasoning aspects of the project. As students move from local to global understandings of energy and environment issues, there is a significant increase in the need for mathematical and statistical conceptual understanding. These understandings must be accessible to the students within the scientific context, requiring the special understandings that are endemic to quantitative reasoning. The QR STEM project brings together interdisciplinary teams of higher education faculty and middle/high school teachers to explore complex problems in energy and environment. The disciplines include life sciences, physics, chemistry, earth science, statistics, and mathematics. These interdisciplinary teams develop open-ended performance tasks to implement in the classroom, based on scientific concepts that underpin energy and environment. Quantitative reasoning is broken down into three components: Quantitative Literacy, Quantitative Interpretation, and Quantitative Modeling. Quantitative Literacy is composed of arithmetic concepts such as proportional reasoning, numeracy, and descriptive statistics. Quantitative Interpretation includes algebraic and geometric concepts that underlie the ability to interpret a model of natural phenomena that is provided for the student. This model may be a table, graph, or equation from which the student is to make predictions or identify trends, or from which they would use statistics to explore correlations or patterns in data. Quantitative Modeling is the ability to develop the model from data, including the ability to test hypotheses using statistical procedures. We use the term model very broadly, so it includes visual models such as box models, as well as best-fit equation models and hypothesis testing. One of the powerful outcomes of the project is the conversation that takes place between science teachers and mathematics teachers. First, they realize that though they are teaching concepts that cross their disciplines, the barrier of scientific language within their subjects restricts students from applying the concepts across subjects. Second, the mathematics teachers discover the context of science as a means of providing real-world situations that engage students in the utility of mathematics as a tool for solving problems. Third, the science teachers discover the barrier to understanding science that is presented by poor quantitative reasoning ability. Finally, the students are engaged in exploring energy and environment in a manner which exposes the importance of seeing a problem from multiple interdisciplinary perspectives. The outcome is a democratic citizen capable of making informed decisions, and perhaps a future scientist.
Tanuja, Penmatsa; Venugopal, Namburi; Sashidhar, Rao Beedu
2007-01-01
A simple thin-layer chromatography-digital image-based analytical method has been developed for the quantitation of the botanical pesticide, azadirachtin. The method was validated by analyzing azadirachtin in spiked food matrixes and processed commercial pesticide formulations, using acidified vanillin reagent as a postchromatographic derivatizing agent. The separated azadirachtin was clearly identified as a green spot. The Rf value was found to be 0.55, which was similar to that of a reference standard. A standard calibration plot was established using a reference standard, based on linear regression analysis [r2 = 0.996; y = 371.43 + (634.82)x]. The sensitivity of the method was found to be 0.875 microg azadirachtin. Spiking studies conducted at the 1 ppm (microg/g) level in various agricultural matrixes, such as brinjal, tomato, coffee, and cotton seeds, revealed recoveries of azadirachtin in the range of 67-92%. The azadirachtin content of commercial neem formulations analyzed by the method was in the range of 190-1825 ppm (microg/mL). Further, the present method was compared with an immunoanalytical method (enzyme-linked immunosorbent assay) developed earlier in our laboratory. Statistical comparison of the 2 methods, using Fisher's F-test, indicated no significant difference in variance, suggesting that both methods are comparable.
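A minimal sketch of how the reported calibration line converts a densitometric response into an azadirachtin amount and a spiked-sample recovery; the regression coefficients are those quoted above, while the response reading and spike level are invented.

```python
# Calibration line reported above: y = 371.43 + 634.82 * x, r^2 = 0.996,
# where y is the densitometric response and x the azadirachtin amount (ug).

def azadirachtin_amount(response, intercept=371.43, slope=634.82):
    """Invert the calibration line to estimate amount from response."""
    return (response - intercept) / slope

# Hypothetical spiked-sample readings (values invented for illustration).
spiked_ug = 1.0                      # amount added to the matrix
measured_response = 920.0            # densitometric reading of the spot
recovered = azadirachtin_amount(measured_response)
recovery_pct = 100.0 * recovered / spiked_ug
print(f"estimated amount: {recovered:.3f} ug, recovery: {recovery_pct:.0f}%")
```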
Preparing and Presenting Effective Research Posters
Miller, Jane E
2007-01-01
Objectives Posters are a common way to present results of a statistical analysis, program evaluation, or other project at professional conferences. Often, researchers fail to recognize the unique nature of the format, which is a hybrid of a published paper and an oral presentation. This methods note demonstrates how to design research posters to convey study objectives, methods, findings, and implications effectively to varied professional audiences. Methods A review of existing literature on research communication and poster design is used to identify and demonstrate important considerations for poster content and layout. Guidelines on how to write about statistical methods, results, and statistical significance are illustrated with samples of ineffective writing annotated to point out weaknesses, accompanied by concrete examples and explanations of improved presentation. A comparison of the content and format of papers, speeches, and posters is also provided. Findings Each component of a research poster about a quantitative analysis should be adapted to the audience and format, with complex statistical results translated into simplified charts, tables, and bulleted text to convey findings as part of a clear, focused story line. Conclusions Effective research posters should be designed around two or three key findings with accompanying handouts and narrative description to supply additional technical detail and encourage dialog with poster viewers. PMID:17355594
NASA Astrophysics Data System (ADS)
Andersson, C. David; Hillgren, J. Mikael; Lindgren, Cecilia; Qian, Weixing; Akfur, Christine; Berg, Lotta; Ekström, Fredrik; Linusson, Anna
2015-03-01
Scientific disciplines such as medicinal and environmental chemistry, pharmacology, and toxicology deal with questions related to the effects small organic compounds exert on biological targets and the compounds' physicochemical properties responsible for these effects. A common strategy in this endeavor is to establish structure-activity relationships (SARs). The aim of this work was to illustrate the benefits of performing a statistical molecular design (SMD) and proper statistical analysis of the molecules' properties before SAR and quantitative structure-activity relationship (QSAR) analysis. Our SMD followed by synthesis yielded a set of inhibitors of the enzyme acetylcholinesterase (AChE) that had very few inherent dependencies between the substructures in the molecules. If such dependencies exist, they cause severe errors in SAR interpretation and predictions by QSAR models, and leave a set of molecules less suitable for future decision-making. In our study, SAR and QSAR models could show which molecular substructures and physicochemical features were advantageous for AChE inhibition. Finally, the QSAR model was used for the prediction of the inhibition of AChE by an external prediction set of molecules. The accuracy of these predictions was assessed by statistical significance tests and by comparisons to simple but relevant reference models.
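A small sketch of the kind of dependency check that motivates statistical molecular design: inspecting correlations among physicochemical descriptors before any SAR/QSAR fitting; the descriptor matrix below is invented and not from the study.

```python
import numpy as np

# Hypothetical descriptor matrix: rows = candidate inhibitors,
# columns = physicochemical descriptors (e.g., logP, molar refractivity,
# polar surface area). Values are invented for illustration.
X = np.array([
    [1.2, 40.1, 55.0],
    [2.3, 45.7, 48.2],
    [0.8, 38.9, 62.4],
    [3.1, 52.3, 41.0],
    [1.9, 44.0, 50.5],
])

# Pairwise correlations among descriptors: values near +/-1 signal the
# inherent dependencies that statistical molecular design tries to avoid.
R = np.corrcoef(X, rowvar=False)
print(np.round(R, 2))

# A crude condition-number check of the design: large values indicate
# a nearly collinear design that will destabilize QSAR coefficients.
print("condition number:", round(np.linalg.cond(X - X.mean(0)), 1))
```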
3D analysis of bone formation around titanium implants using micro-computed tomography (μCT)
NASA Astrophysics Data System (ADS)
Bernhardt, Ricardo; Scharnweber, Dieter; Müller, Bert; Beckmann, Felix; Goebbels, Jürgen; Jansen, John; Schliephake, Henning; Worch, Hartmut
2006-08-01
The quantitative analysis of bone formation around biofunctionalised metallic implants is an important tool for the further development of implants with higher success rates. This is nowadays especially important in cases of additional diseases like diabetes or osteoporosis. Micro-computed tomography (μCT), as a non-destructive technique, offers the possibility of quantitative three-dimensional recording of bone close to the implant's surface with micrometer resolution, which is the range of the relevant bony structures. Within different animal models using cylindrical and screw-shaped Ti6Al4V implants, we have compared visualization and quantitative analysis of newly formed bone by the use of synchrotron-radiation-based CT systems in comparison with histological findings. The SRμCT experiments were performed at the beamline BW 5 (HASYLAB at DESY, Hamburg, Germany) and at the BAMline (BESSY, Berlin, Germany). For the experiments, PMMA-embedded samples were prepared with diameters of about 8 mm, which contain the implant in the center, surrounded by the bony tissue. To (locally) quantify the bone formation, models were developed and optimized. The comparison of the results obtained by SRμCT and histology demonstrates the advantages and disadvantages of both approaches, although the bone formation values for the different biofunctionalized implants are identical within the error bars. SRμCT allows the clear identification of fully mineralized bone around the different titanium implants. As hundreds of virtual slices were easily generated for the individual samples, the quantification and interactive bone detection led to conclusions of high precision and statistical relevance. In this way, SRμCT in combination with interactive data analysis is shown to be statistically more powerful than classical histology.
Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich
2017-01-01
One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and the host tissue have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity for studying interactions repeatedly with individual animals, along with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging. This includes the presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for treatment of a cleft-like maxillary bone defect has been evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant bias in the case of the bone volume (bias(Histo-MRI), bone volume = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias(Histo-MRI), defect width = -6.73%, p << 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result. It could be shown that the bias between the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional independent animal study. PMID:28666026
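The graphical concordance analysis described above is essentially a Bland-Altman-style comparison; a minimal sketch on invented paired histology/MRI values follows.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements (% bone volume) from histology and MRI.
histo = np.array([32.1, 28.4, 35.0, 30.2, 27.8, 33.5])
mri   = np.array([29.6, 26.3, 31.9, 28.0, 25.1, 31.2])

diff = histo - mri                     # per-sample disagreement
bias = diff.mean()                     # mean difference (Histo - MRI)
loa = 1.96 * diff.std(ddof=1)          # 95% limits-of-agreement half-width

# One-sample t-test of whether the bias differs from zero.
t, p = stats.ttest_1samp(diff, 0.0)
print(f"bias = {bias:.2f}%, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}], p = {p:.4f}")
```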
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
Bruxism: is it a new sign of the cardiovascular diseases?
Atilgan, Z; Buyukkaya, R; Yaman, F; Tekbas, G; Atilgan, S; Gunay, A; Palanci, Y; Guven, S
2011-12-01
To determine the relationship between bruxism and cardiovascular diseases, 120 patients who were referred to the Dentistry Faculty with the complaint of bruxism were selected. All patients gave informed consent for participation in the study. All of the patients were examined and bruxism was classified. The patients were also examined by B-mode ultrasound to measure the intima-media thickness (IMT) at the far wall of the common carotid artery. A wide range of vascular risk factors, including age, gender, body mass index, and previous history, was surveyed. Spearman correlation analysis was performed for quantitative comparison; the Mann-Whitney U and Kruskal-Wallis tests were used for comparison of means. There were 66 (55%) male and 54 (45%) female patients, with a female to male ratio of 1/1.2. The mean age was 35.6 +/- 1.25 years (range 18-65 years). In the analysis of bruxism classification and IMT, there was a statistically significant association between bruxism classification subgroups 1, 2, and 3 and IMT. There was no statistically significant association between bruxism classification subgroup 4 and IMT, due to the small number of patients (n = 12). Stressful situations can cause both bruxism and cardiovascular diseases such as coronary artery disease, hypertension, arrhythmias, and cardiomyopathy. The statistical analysis supported this hypothesis. However, new studies with larger samples are needed to confirm this hypothesis. Clearly, future studies in this field will need to take into consideration the influence of the following variables: age, use of medication or drugs, smoking habits, and other sleep disorders.
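A sketch of the test battery named above (Spearman correlation, Mann-Whitney U, Kruskal-Wallis) using scipy.stats on invented IMT values; the group means and sizes are illustrative assumptions, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical IMT (mm) for three bruxism subgroups (values invented).
imt_g1 = rng.normal(0.62, 0.05, 30)
imt_g2 = rng.normal(0.68, 0.05, 30)
imt_g3 = rng.normal(0.74, 0.05, 30)

# Spearman correlation: bruxism severity grade (1-3) vs IMT.
grade = np.repeat([1, 2, 3], 30)
imt = np.concatenate([imt_g1, imt_g2, imt_g3])
rho, p_rho = stats.spearmanr(grade, imt)

# Mann-Whitney U for two groups, Kruskal-Wallis across all three.
u, p_u = stats.mannwhitneyu(imt_g1, imt_g3)
h, p_h = stats.kruskal(imt_g1, imt_g2, imt_g3)
print(f"Spearman rho={rho:.2f} (p={p_rho:.3g}); "
      f"Mann-Whitney p={p_u:.3g}; Kruskal-Wallis p={p_h:.3g}")
```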
Algorithms for constructing optimal paths and statistical analysis of passenger traffic
NASA Astrophysics Data System (ADS)
Trofimov, S. P.; Druzhinina, N. G.; Trofimova, O. G.
2018-01-01
Several existing information systems of urban passenger transport (UPT) are considered. The authors' UPT network model is presented. A new service is offered to passengers: the best path from one stop to another at a specified time. The algorithm and software implementation for finding the optimal path are presented. The algorithm uses the current UPT schedule. The article also describes the algorithm for statistical analysis of trip payments made with electronic E-cards. The algorithm yields the density of passenger traffic during the day. This density is independent of the network topology and UPT schedules. The resulting traffic-flow density can be used to solve a number of practical problems, in particular, forecasting the overcrowding of passenger transport at rush hours, quantitative comparison of transport networks with different topologies, and construction of the best UPT timetable. The efficiency of the proposed integrated approach is demonstrated by the example of a model town of arbitrary dimensions.
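A minimal time-dependent shortest-path sketch in the spirit of the optimal-path algorithm described above; the timetable, stop names, and the earliest_arrival helper are invented for illustration and do not reproduce the authors' implementation.

```python
import heapq

# Timetable edges: stop -> list of (departure_min, arrival_min, next_stop).
# Times are minutes after midnight; values are invented for illustration.
timetable = {
    "A": [(480, 490, "B"), (485, 505, "C")],
    "B": [(495, 510, "C"), (500, 520, "D")],
    "C": [(510, 525, "D")],
    "D": [],
}

def earliest_arrival(start, goal, depart_at):
    """Dijkstra over arrival times: take each ride only if we are at the
    stop before it departs; returns the earliest arrival at the goal."""
    best = {start: depart_at}
    heap = [(depart_at, start)]
    while heap:
        t, stop = heapq.heappop(heap)
        if stop == goal:
            return t
        if t > best.get(stop, float("inf")):
            continue
        for dep, arr, nxt in timetable[stop]:
            if dep >= t and arr < best.get(nxt, float("inf")):
                best[nxt] = arr
                heapq.heappush(heap, (arr, nxt))
    return None

print(earliest_arrival("A", "D", depart_at=480))  # -> 520
```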
[Quantitative and qualitative changes in the sex chromatin of diabetic women of different ages].
Kaiumov, E G; Dmitrieva, E N
1975-01-01
A statistically significant reduction in the frequency of occurrence of sex chromatin (SC) was revealed in female patients aged 15 to 65 years suffering from diabetes mellitus before treatment, in comparison with healthy women. After compensation of the carbohydrate metabolism, a further reduction was noted in patients aged 25 to 65 years. In women aged 15-65 years who contracted diabetes mellitus, there was an increase in the circular form of the SC bodies, which resemble thickenings of the nuclear membrane; SC bodies of round shape were enlarged as well in women aged 25 to 65 years. Oval, triangular, and semicircular forms decreased in all the age groups. After compensation of the carbohydrate metabolism, the content of the SC bodies of various shapes remained the same as at the beginning of the disease, without returning to the normal level. The enlargement of the SC body area was statistically significant in women who developed diabetes mellitus.
Handwriting Examination: Moving from Art to Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.
In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by FDEs in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing, and both are currently used by FDEs in comparative analyses.) Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of the statistical significance of the groups generated.
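A small sketch of the profile-grouping step using hierarchical (Ward) clustering; the writer profiles and feature choices below are invented, and the statistical-significance step of the actual procedure is omitted.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical writer profiles: rows = writers, columns = quantitative
# handwriting measurements (means and variances of slant, height, spacing).
profiles = np.array([
    [12.0, 1.1, 3.2, 0.4],
    [12.3, 1.0, 3.1, 0.5],   # similar to writer 0
    [18.5, 2.4, 4.8, 0.9],
    [18.9, 2.2, 4.9, 1.0],   # similar to writer 2
    [15.1, 1.7, 4.0, 0.7],
])

# Ward linkage on standardized features; cut the dendrogram into groups
# within which profiles are effectively indistinguishable.
Z = linkage((profiles - profiles.mean(0)) / profiles.std(0), method="ward")
groups = fcluster(Z, t=3, criterion="maxclust")
print("group labels per writer:", groups)
```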
Gersberg, Richard; Tiedge, Jürgen; Gottstein, Dana; Altmann, Sophie; Watanabe, Kayo; Lüderitz, Volker
2008-04-01
In early 1999, primary treatment and discharge of sewage from Tijuana, Mexico (approximately 95 million liters per day) began through the South Bay Ocean Outfall (SBOO) into the ocean 4.3 km offshore. In this study, statistical comparisons were made of the bacterial water quality (total and fecal coliform and enterococci densities) of the ocean, both before and after discharge of sewage to the SBOO began, so that the effect of this ocean discharge on nearshore ocean water quality could be quantitatively assessed. The frequency of exceedance of bacterial indicator thresholds was statistically analyzed for 11 shore (surfzone) stations throughout the US and Mexico using Fisher's exact test, for the years before (1995-1998) as compared to after the SBOO discharge began (1999-2003). Only four of the 11 shoreline stations (S2, S3, S11, and S12) showed significant improvement (decreased frequency of exceedance of bacterial indicator thresholds) after SBOO discharge began.
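A minimal sketch of the before/after comparison at a single station using Fisher's exact test; the exceedance counts are invented, not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical counts at one shoreline station (invented for illustration):
# rows = before / after discharge began, columns = samples exceeding /
# not exceeding the bacterial indicator threshold.
table = [[46, 154],   # before: 46 of 200 samples exceeded
         [22, 228]]   # after:  22 of 250 samples exceeded

odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.4f}")
```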
Calibrating genomic and allelic coverage bias in single-cell sequencing.
Zhang, Cheng-Zhong; Adalsteinsson, Viktor A; Francis, Joshua; Cornils, Hauke; Jung, Joonil; Maire, Cecile; Ligon, Keith L; Meyerson, Matthew; Love, J Christopher
2015-04-16
Artifacts introduced in whole-genome amplification (WGA) make it difficult to derive accurate genomic information from single-cell genomes and require different analytical strategies from bulk genome analysis. Here, we describe statistical methods to quantitatively assess the amplification bias resulting from whole-genome amplification of single-cell genomic DNA. Analysis of single-cell DNA libraries generated by different technologies revealed universal features of the genome coverage bias predominantly generated at the amplicon level (1-10 kb). The magnitude of coverage bias can be accurately calibrated from low-pass sequencing (∼0.1 × ) to predict the depth-of-coverage yield of single-cell DNA libraries sequenced at arbitrary depths. We further provide a benchmark comparison of single-cell libraries generated by multi-strand displacement amplification (MDA) and multiple annealing and looping-based amplification cycles (MALBAC). Finally, we develop statistical models to calibrate allelic bias in single-cell whole-genome amplification and demonstrate a census-based strategy for efficient and accurate variant detection from low-input biopsy samples.
Sargin, Mehmet Akif; Yassa, Murat; Taymur, Bilge Dogan; Taymur, Bulent; Akca, Gizem; Tug, Niyazi
2017-04-01
To compare the status of female sexual dysfunction (FSD) between women with a history of previous gestational diabetes mellitus (GDM) and those with follow-up of a healthy pregnancy, using the female sexual function index (FSFI) questionnaire. Cross-sectional study. Department of Obstetrics and Gynecology, Fatih Sultan Mehmet Training and Research Hospital, Istanbul, Turkey, from September to December 2015. Healthy, sexually active, adult parous females were included. Participants were asked to complete the validated Turkish versions of the FSFI and Hospital Anxiety and Depression Scale (HADS) questionnaires. Student's t-test was used for two-group comparisons of normally distributed quantitative variables. The Mann-Whitney U-test was used for two-group comparisons of non-normally distributed variables. Pearson's chi-squared test, the Fisher-Freeman-Halton test, Fisher's exact test, and Yates' continuity correction test were used for comparison of qualitative data. The mean FSFI score of the 179 participants was 23.50 ±3.94. FSFI scores and scores of desire, arousal, lubrication, orgasm, satisfaction, and pain were not statistically significantly different (p>0.05) according to a history of GDM and types of FSD (none, mild, severe). HADS scores and anxiety and depression types did not statistically significantly differ according to the history of GDM (p>0.05). No association was found in FSFI scores between participants with a history of previous GDM and those with a healthy pregnancy; however, subclinical sexual dysfunction may be observed in the late postpartum period among women with a history of previous GDM. This may adversely affect their sexual health.
Steinka-Fry, Katarzyna T; Tanner-Smith, Emily E; Dakof, Gayle A; Henderson, Craig
2017-04-01
This systematic review and meta-analysis synthesized findings from studies examining culturally sensitive substance use treatment for racial/ethnic minority youth. An extensive literature search located eight eligible studies using experimental or quasi-experimental designs. The meta-analysis quantitatively synthesized findings comparing seven culturally sensitive treatment conditions to seven alternative conditions on samples composed of at least 90% racial/ethnic minority youth. The results from the meta-analysis indicated that culturally sensitive treatments were associated with significantly larger reductions in post-treatment substance use levels relative to their comparison conditions (g=0.37, 95% CI [0.12, 0.62], k=7, total number of participants=723). The average time between pretest and posttest was 21 weeks (SD=11.79). There was a statistically significant amount of heterogeneity across the seven studies (Q=26.5, p=0.00, τ²=0.08, I²=77.4%). Differential effects were not statistically significant when the comparison conditions were active generic counterparts of the treatment conditions (direct "bona fide" comparisons; g=-0.08, 95% CI [-0.51, 0.35]) or 'treatment as usual' conditions (g=0.39, 95% CI [-0.14, 0.91]). Strong conclusions from the review were hindered by the small number of available studies for synthesis, variability in comparison conditions across studies, and lack of diversity in the adolescent clients served in the studies. Nonetheless, this review suggests that culturally sensitive treatments offer promise as an effective way to address substance use among racial/ethnic minority youth. Copyright © 2017 Elsevier Inc. All rights reserved.
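The heterogeneity statistics quoted above (Q, τ², I²) and the random-effects summary can be computed from per-study effect sizes as in this sketch; the Hedges' g values and variances below are invented, not the studies' data, and the DerSimonian-Laird estimator is one standard choice.

```python
import numpy as np

# Hypothetical per-study Hedges' g and variances (invented; k = 7 studies).
g = np.array([0.55, 0.20, 0.48, 0.10, 0.62, 0.35, 0.28])
v = np.array([0.04, 0.05, 0.06, 0.03, 0.08, 0.05, 0.04])

w = 1.0 / v                               # fixed-effect weights
g_fixed = np.sum(w * g) / np.sum(w)
Q = np.sum(w * (g - g_fixed) ** 2)        # Cochran's Q
df = len(g) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)             # DerSimonian-Laird tau^2
I2 = max(0.0, 100.0 * (Q - df) / Q)       # % variance from heterogeneity

w_re = 1.0 / (v + tau2)                   # random-effects weights
g_re = np.sum(w_re * g) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"g = {g_re:.2f} (95% CI {g_re - 1.96*se_re:.2f} to "
      f"{g_re + 1.96*se_re:.2f}); Q = {Q:.1f}, tau^2 = {tau2:.2f}, "
      f"I^2 = {I2:.0f}%")
```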
NASA Astrophysics Data System (ADS)
Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn
2016-07-01
Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result, which examined 10 mathematical and statistical sub-topics. Second, the study established an evidential baseline of students' quantitative skills performance and confidence levels by piloting the QSASS with 187 final-year biosciences students at a research-intensive university. The study is framed within the planned-enacted-experienced curriculum model and contributes to science reform efforts focused on enhancing the quantitative skills of university graduates, particularly in the biosciences. The results found, on average, weak performance and low confidence on the QSASS, suggesting divergence between academics' intentions and students' experiences of learning quantitative skills. Implications for curriculum design and future studies are discussed.
Horger, M; Fritz, J; Thaiss, W M; Ditt, H; Weisel, K; Haap, M; Kloth, Christopher
2018-03-01
To compare qualitative and quantitative computed tomography (CT) and magnetic resonance imaging (MRI) parameters for longitudinal disease monitoring of multiple myeloma (MM) of the axial skeleton. We included 31 consecutive patients (17 male; mean age 59.20 ± 8.08 years) with MM, who underwent all baseline (n = 31) and at least one or more (n = 47) follow-up examinations consisting of multi-parametric non-enhanced whole-body MRI (WB-MRI) and non-enhanced whole-body reduced-dose thin-section MDCT (NEWBMDCT) between 06/2013 and 09/2016. We classified response according to qualitative CT criteria into progression (PD), stable disease (SD), partial/very good partial response (PR/VGPR) and complete response (CR), grouping the latter three together for statistical analysis because CT cannot reliably assess PR and CR. Qualitative MR response criteria were defined and grouped similarly to CT, using longitudinal quantification of signal-intensity changes on T1w/STIR/T2*w and calculating ADC values. The standard of reference was the hematological laboratory (M-gradient). Hematological response categories were CR (14/47, 29.7%), PR (2/47, 4.2%), SD (16/47, 34.0%) and PD (15/47, 29.9%). Qualitative CT evaluation showed PD in 12/47 (25.5%) and SD/PR/VGPR/CR in 35/47 (74.5%) cases. These results were confirmed by quantitative CT in all focal lytic lesions (p < 0.001). Quantitative CT at sites with diffuse bone involvement showed a significant increase of maximum bone attenuation (p < 0.001) and a significant decrease of minimal bone attenuation (p < 0.002) in the SD/PR/VGPR/CR group. Qualitative MRI showed PD in 14/47 (29.7%) and SD/PR/VGPR/CR in 33/47 (70.3%). Quantitative MRI showed a statistically significant decrease in signal intensity on short tau inversion recovery (STIR) sequences in bone marrow in patients with diffuse bone marrow involvement achieving SD/PR/VGPR/CR (p < 0.001). Imaging response monitoring using MRI is superior to CT only if qualitative parameters are used, whereas there was no definite benefit from using quantitative parameters with either CT or MRI.
Testing for nonrandom shape similarity between sister cells using automated shape comparison
NASA Astrophysics Data System (ADS)
Guo, Monica; Marshall, Wallace F.
2009-02-01
Several reports in the biological literature have indicated that when a living cell divides, the two daughter cells have a tendency to be mirror images of each other in terms of their overall cell shape. This phenomenon would be consistent with inheritance of spatial organization from mother cell to daughters. However, the published data rely on a small number of examples that were visually chosen, raising potential concerns about inadvertent selection bias. We propose to revisit this issue using automated quantitative shape comparison methods, which would have no contribution from the observer and which would allow statistical testing of similarity in large numbers of cells. In this report we describe a first-order approach to the problem using rigid curve matching. Using test images, we compare a pointwise-correspondence-based distance metric with a chamfer matching strategy and find that the latter provides better correspondence and smaller distances between aligned curves, especially when we allow nonrigid deformation of the outlines in addition to rotation.
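A minimal chamfer-distance sketch using the Euclidean distance transform, in the spirit of the matching strategy described above; the toy outlines and the symmetrization choice are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def chamfer_distance(outline_a, outline_b):
    """Mean distance from each point of one outline to the nearest point
    of the other, symmetrized; inputs are boolean edge images."""
    # Distance from every pixel to the nearest True pixel of each outline.
    d_to_b = distance_transform_edt(~outline_b)
    d_to_a = distance_transform_edt(~outline_a)
    return 0.5 * (d_to_b[outline_a].mean() + d_to_a[outline_b].mean())

# Two toy outlines: a square ring and a slightly shifted square ring.
def square_ring(size, lo, hi):
    img = np.zeros((size, size), dtype=bool)
    img[lo:hi, lo] = img[lo:hi, hi - 1] = True
    img[lo, lo:hi] = img[hi - 1, lo:hi] = True
    return img

a = square_ring(64, 16, 48)
b = square_ring(64, 18, 50)
print(f"chamfer distance: {chamfer_distance(a, b):.2f} pixels")
```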
NASA Astrophysics Data System (ADS)
Kirstetter, P.; Hong, Y.; Gourley, J. J.; Chen, S.; Flamig, Z.; Zhang, J.; Howard, K.; Petersen, W. A.
2011-12-01
Proper characterization of the error structure of TRMM Precipitation Radar (PR) quantitative precipitation estimation (QPE) is needed for its use in TRMM combined products, water budget studies, and hydrological modeling applications. Due to the variety of sources of error in spaceborne radar QPE (attenuation of the radar signal, influence of the land surface, impact of off-nadir viewing angle, etc.) and the impact of correction algorithms, the problem is addressed by comparison of PR QPEs with reference values derived from ground-based measurements (GV) using NOAA/NSSL's National Mosaic QPE (NMQ) system. An investigation of this subject has been carried out at the PR estimation scale (instantaneous and 5 km) on the basis of a 3-month-long data sample. A significant effort has been made to derive a bias-corrected, robust reference rainfall source from NMQ. The GV processing details will be presented along with preliminary results of PR's error characteristics using contingency table statistics, probability distribution comparisons, scatter plots, semi-variograms, and systematic biases and random errors.
NASA Astrophysics Data System (ADS)
Li, Min; Yuan, Yunbin; Wang, Ningbo; Li, Zishen; Liu, Xifeng; Zhang, Xiao
2018-07-01
This paper presents a quantitative comparison of several widely used interpolation algorithms, i.e., Ordinary Kriging (OrK), Universal Kriging (UnK), planar fit and Inverse Distance Weighting (IDW), based on a grid-based single-shell ionosphere model over China. The experimental data were collected from the Crustal Movement Observation Network of China (CMONOC) and the International GNSS Service (IGS), covering the days of year 60-90 in 2015. The quality of these interpolation algorithms was assessed by cross-validation in terms of both the ionospheric correction performance and Single-Frequency (SF) Precise Point Positioning (PPP) accuracy on an epoch-by-epoch basis. The results indicate that the interpolation models perform better at mid-latitudes than low latitudes. For the China region, the performance of OrK and UnK is relatively better than the planar fit and IDW model for estimating ionospheric delay and positioning. In addition, the computational efficiencies of the IDW and planar fit models are better than those of OrK and UnK.
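Of the four algorithms compared, IDW is the simplest to sketch; the following minimal implementation and the station VTEC values are invented for illustration and make no claim about the authors' code.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_query, power=2.0):
    """Inverse Distance Weighting: weight each observation by
    1/distance^power; an exact hit returns the observed value."""
    d = np.linalg.norm(xy_obs[None, :, :] - xy_query[:, None, :], axis=2)
    hit = d < 1e-12
    w = 1.0 / np.maximum(d, 1e-12) ** power
    z = (w * z_obs).sum(axis=1) / w.sum(axis=1)
    z[hit.any(axis=1)] = z_obs[hit.argmax(axis=1)][hit.any(axis=1)]
    return z

# Hypothetical vertical TEC values (TECU) at four stations (lon, lat).
stations = np.array([[110.0, 30.0], [115.0, 35.0],
                     [105.0, 25.0], [112.0, 40.0]])
vtec = np.array([18.5, 22.1, 15.2, 25.4])
query = np.array([[111.0, 32.0]])
print(f"IDW VTEC estimate: {idw(stations, vtec, query)[0]:.1f} TECU")
```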
Using Statistics to Lie, Distort, and Abuse Data
ERIC Educational Resources Information Center
Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca
2009-01-01
Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…
QuantCrit: Education, Policy, "Big Data" and Principles for a Critical Race Theory of Statistics
ERIC Educational Resources Information Center
Gillborn, David; Warmington, Paul; Demack, Sean
2018-01-01
Quantitative research enjoys heightened esteem among policy-makers, media, and the general public. Whereas qualitative research is frequently dismissed as subjective and impressionistic, statistics are often assumed to be objective and factual. We argue that these distinctions are wholly false; quantitative data is no less socially constructed…
Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A
2018-02-01
To evaluate a novel methodology using industrial scanners as a reference, and to assess the in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions; further, to evaluate IOS precision in vivo. Four reference bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference scans with the industrial scanner ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken with 3M Impregum Penta Soft, and poured models were digitized with the laboratory scanner 3shape D1000 (D1000). Best-fit alignment of the reference bodies and 3D compare analysis were performed. The precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. The accuracy of IOS and IMPR was analyzed using ATOS as the reference. The precision of IOS was evaluated through intra-system comparison. The precision of the ATOS reference scanner (mean 0.6 μm) and of D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference bodies located in different tooth positions displayed a statistically significant difference in accuracy between two scanner groups: 3M and TRIOS over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference from IOS; however, the deviations of IOS and IMPR were of a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing the accuracy of IOS and IMPR in vivo in up to five units bilaterally from the midline. 3M and TRIOS had a higher accuracy than OMNI; IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.
Gao, Yi; Bouix, Sylvain
2016-05-01
Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions
NASA Technical Reports Server (NTRS)
Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.
2016-01-01
Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. In an effort to improve the power of detecting differences in this validation study, the totals over each program, ISS and STS, will serve as the main quantitative comparison objective, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: 1) where each point reflects a mission, and 2) where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R2) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between IMM and the real-world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for mission-critical applications.
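A sketch of the agreement metric described above: a linear regression with the intercept fixed at zero and its coefficient of determination; the per-mission totals are invented, and the uncentered R² convention is a standard choice for no-intercept models.

```python
import numpy as np

# Hypothetical per-mission totals (invented): IMM-predicted medians vs
# observed total medical event counts.
predicted = np.array([14.0, 9.5, 21.0, 6.0, 11.5, 17.0])
observed = np.array([15.0, 8.0, 23.0, 7.0, 10.0, 19.0])

# Least squares with the intercept fixed at zero: slope = sum(xy)/sum(x^2).
slope = np.sum(predicted * observed) / np.sum(predicted ** 2)
fitted = slope * predicted

# R^2 for a no-intercept model is conventionally computed against
# uncentered totals (sum of y^2), not the mean-centered version.
r2 = 1.0 - np.sum((observed - fitted) ** 2) / np.sum(observed ** 2)
print(f"slope = {slope:.3f}, R^2 (no intercept) = {r2:.4f}")
```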
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
Fisher statistics for analysis of diffusion tensor directional information.
Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P
2012-04-30
A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
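A sketch of the descriptive half of the Fisher framework: the mean direction, resultant length, and an approximate concentration parameter for a set of unit vectors; the simulated directions stand in for ROI principal diffusion directions, and the hypothesis-testing step (Watson's F-statistic) is omitted.

```python
import numpy as np

# Hypothetical principal diffusion directions (unit vectors) from one ROI.
rng = np.random.default_rng(3)
base = np.array([0.0, 0.0, 1.0])
dirs = base + rng.normal(0, 0.1, (20, 3))          # cluster around +z
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

R_vec = dirs.sum(axis=0)                  # resultant vector
R = np.linalg.norm(R_vec)                 # resultant length
mean_dir = R_vec / R                      # Fisher mean direction
n = len(dirs)
kappa = (n - 1) / (n - R)                 # approx. concentration parameter
print("mean direction:", np.round(mean_dir, 3))
print(f"resultant length R = {R:.2f} of n = {n}, kappa ~ {kappa:.1f}")
```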
Multiple alignment-free sequence comparison
Ren, Jie; Song, Kai; Sun, Fengzhu; Deng, Minghua; Reinert, Gesine
2013-01-01
Motivation: Recently, a range of new statistics have become available for the alignment-free comparison of two sequences based on k-tuple word content. Here, we extend these statistics to the simultaneous comparison of more than two sequences. Our suite of statistics contains, first, extensions of pairwise-comparison statistics to the joint k-tuple content of all the sequences, and second, averages of sums of pairwise comparison statistics. The two tasks we consider are, first, to identify sequences that are similar to a set of target sequences, and, second, to measure the similarity within a set of sequences. Results: Our investigation uses both simulated data as well as cis-regulatory module data where the task is to identify cis-regulatory modules with similar transcription factor binding sites. We find that although for real data all of our statistics show a similar performance, on simulated data the Shepp-type statistics are in some instances outperformed by star-type statistics. The multiple alignment-free statistics are more sensitive to contamination in the data than the pairwise average statistics. Availability: Our implementation of the five statistics is available as an R package named 'multiAlignFree' at http://www-rcf.usc.edu/∼fsun/Programs/multiAlignFree/multiAlignFreemain.html. Contact: reinert@stats.ox.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23990418
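A minimal sketch of the simplest pairwise word-count statistic, D2 (the inner product of k-tuple count vectors), which is the building block the multi-sequence statistics extend; the sequences and choice of k are invented toy inputs.

```python
from collections import Counter
from itertools import combinations

def kmer_counts(seq, k):
    """Sliding-window k-tuple counts for one sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    """D2: inner product of the two k-tuple count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

# Toy sequences (invented); pairwise D2 over all pairs is the building
# block that the multi-sequence statistics aggregate.
seqs = {"s1": "ACGTACGTAC", "s2": "ACGTTCGTAA", "s3": "TTTTGGGGCC"}
for (na, a), (nb, b) in combinations(seqs.items(), 2):
    print(na, nb, d2(a, b, k=3))
```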
Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja
2015-02-01
The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system based on magnetic resonance imaging (MRI) using the Dixon technique, and to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon-technique sequence (repetition time/echo times, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic; 3 minutes 38 seconds). To test reproducibility, a second MRI examination was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than both quantitative and qualitative MG BD assessments (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, radiation- and compression-free volumetric quantitative BD assessment across different levels of BD. The AUQV MR BD measurements were significantly lower than the currently used qualitative and quantitative MG-based approaches, implying that the current assessment might overestimate breast density with MG.
NASA Astrophysics Data System (ADS)
O'Shea, Thomas T.; Beale, Kristy L. C.; Brucker, Kyle A.; Wyatt, Donald C.; Drazen, David; Fullerton, Anne M.; Fu, Tom C.; Dommermuth, Douglas G.
2010-11-01
Numerical Flow Analysis (NFA) predictions of the flow around a transom-stern hull form are compared to laboratory measurements collected at NSWCCD. The simulations are two-phase, three-dimensional, and unsteady. Each required 1.15 billion grid cells and 200,000 CPU hours to accurately resolve the unsteady flow and obtain a sufficient statistical ensemble size. Two speeds, 7 and 8 knots, are compared. The 7-knot case (Fr = U0/√(g L0) = 0.38) is a partially wetted transom condition and the 8-knot case (Fr = 0.43) is a dry transom condition. The results of a detailed comparison of the mean free surface elevation, surface roughness (RMS), and spectra of the breaking stern-waves, measured by Light Detection And Ranging (LiDAR) and Quantitative Visualization (QViz) sensors, are presented. All of the comparisons showed excellent agreement. The concept of height-function processing is introduced, and the application of this type of processing to the simulation data shows a k^(-5/3) power-law behavior for both the 7- and 8-knot cases. The simulations also showed that a multiphase shear layer forms in the rooster-tail region and that its thickness depends on the Froude number.
Dyjack, D T; Levine, S P; Holtshouser, J L; Schork, M A
1998-06-01
Numerous manufacturing and service organizations have integrated or are considering integration of their respective occupational health and safety management and audit systems into the International Organization for Standardization-based (ISO) audit-driven Quality Management Systems (ISO 9000) or Environmental Management Systems (ISO 14000) models. Companies considering one of these options will likely need to identify and evaluate several key factors before embarking on such efforts. The purpose of this article is to identify and address the key factors through a case study approach. Qualitative and quantitative comparisons of the key features of the American Industrial Hygiene Association ISO-9001 harmonized Occupational Health and Safety Management System with The Goodyear Tire & Rubber Co. management and audit system were conducted. The comparisons showed that the two management systems and their respective audit protocols, although structured differently, were not substantially statistically dissimilar in content. The authors recommend that future studies continue to evaluate the advantages and disadvantages of various audit protocols. Ideally, these studies would identify those audit outcome measures that can be reliably correlated with health and safety performance.
NASA Astrophysics Data System (ADS)
Reineker, P.; Kenkre, V. M.; Kühne, R.
1981-08-01
A quantitative comparison is given of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with experiments in naphthalene by Schein et al. and Karl et al.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Research study demonstrates computer simulation can predict warpage and assist in its elimination
NASA Astrophysics Data System (ADS)
Glozer, G.; Post, S.; Ishii, K.
1994-10-01
Programs for predicting warpage in injection molded parts are relatively new. Commercial software for simulating the flow and cooling stages of injection molding have steadily gained acceptance; however, warpage software is not yet as readily accepted. This study focused on gaining an understanding of the predictive capabilities of the warpage software. The following aspects of this study were unique. (1) Quantitative results were found using a statistically designed set of experiments. (2) Comparisons between experimental and simulation results were made with parts produced in a well-instrumented and controlled injection molding machine. (3) The experimental parts were accurately measured on a coordinate measuring machine with a non-contact laser probe. (4) The effect of part geometry on warpage was investigated.
Color normalization for robust evaluation of microscopy images
NASA Astrophysics Data System (ADS)
Švihlík, Jan; Kybic, Jan; Habart, David
2015-09-01
This paper deals with color normalization of microscopy images of Langerhans islets in order to increase robustness of the islet segmentation to illumination changes. The main application is automatic quantitative evaluation of the islet parameters, useful for determining the feasibility of islet transplantation in diabetes. First, background illumination inhomogeneity is compensated and a preliminary foreground/background segmentation is performed. The color normalization itself is done in either lαβ or logarithmic RGB color spaces, by comparison with a reference image. The color-normalized images are segmented using color-based features and pixel-wise logistic regression, trained on manually labeled images. Finally, relevant statistics such as the total islet area are evaluated in order to determine the success likelihood of the transplantation.
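The reference-based color normalization step is sketched below as per-channel moment matching in logarithmic RGB space; this is one common variant offered as an illustration under assumed conventions, not the authors' exact procedure:

```python
import numpy as np

def normalize_to_reference(img, ref, eps=1e-6):
    """Match per-channel mean/std of `img` to a reference image in log-RGB.

    img, ref: float arrays with values in (0, 1], shape (H, W, 3).
    """
    log_img = np.log(img + eps)
    log_ref = np.log(ref + eps)
    out = np.empty_like(log_img)
    for c in range(3):  # normalize each color channel independently
        mu_i, sd_i = log_img[..., c].mean(), log_img[..., c].std()
        mu_r, sd_r = log_ref[..., c].mean(), log_ref[..., c].std()
        out[..., c] = (log_img[..., c] - mu_i) / (sd_i + eps) * sd_r + mu_r
    return np.clip(np.exp(out), 0.0, 1.0)
```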
Teleradiology Via The Naval Remote Medical Diagnosis System (RMDS)
NASA Astrophysics Data System (ADS)
Rasmussen, Will; Stevens, Ilya; Gerber, F. H.; Kuhlman, Jayne A.
1982-01-01
Testing was conducted to obtain qualitative and quantitative (statistical) data on radiology performance using the Remote Medical Diagnosis System (RMDS) Advanced Development Models (ADMs). Based upon data collected during testing with professional radiologists, this analysis addresses the clinical utility of radiographic images transferred through six possible RMDS transmission modes. These radiographs were also viewed under closed-circuit television (CCTV) and lightbox conditions to provide a basis for comparison. The analysis indicates that the RMDS ADM terminals (with a system video resolution of 525 x 256 x 6) would provide satisfactory radiographic images for radiology consultations in emergency cases with gross pathological disorders. However, in cases involving more subtle findings, a system video resolution of 525 x 512 x 8 would be preferable.
Leadership and Culture-Building in Schools: Quantitative and Qualitative Understandings.
ERIC Educational Resources Information Center
Sashkin, Marshall; Sashkin, Molly G.
Understanding effective school leadership as a function of culture building through quantitative and qualitative analyses is the purpose of this paper. The two-part quantitative phase of the research focused on statistical measures of culture and leadership behavior directed toward culture building in the school. The first quantitative part…
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and to ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed.
References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504; F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207; A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports.
[Figure: Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces; time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows. This bistability is predicted by statistical mechanics.]
Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.
Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F
2015-02-01
The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
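The stiffness measure described (slope of the moment-displacement curve) reduces to a linear fit; a minimal sketch with invented data standing in for the linear loading region of an L4-L5 segment:

```python
import numpy as np

# Hypothetical four-point-bending data from the linear loading region
displacement = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])  # mm
moment = np.array([0.0, 2.1, 4.0, 6.2, 7.9, 10.1])       # N*mm

# Stiffness = slope of the moment-displacement curve
stiffness, intercept = np.polyfit(displacement, moment, deg=1)
print(f"flexion stiffness ~ {stiffness:.1f} N*mm/mm")
```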
Song, Fujian; Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-08-16
Objective To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Design Meta-epidemiological study based on a sample of meta-analyses of randomised controlled trials. Data sources Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria Systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Main outcome measure Inconsistency measured by the difference in the log odds ratio between the direct and indirect methods. Results The study included 112 independent trial networks (including 1552 trials with 478,775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Conclusions Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence. PMID:21846695
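The inconsistency measure used here (the difference in log odds ratios between direct and indirect estimates, with the indirect estimate formed via the common comparator) is commonly computed with the Bucher method; a sketch, with all input values hypothetical:

```python
import numpy as np
from scipy import stats

def inconsistency_test(lor_ac, se_ac, lor_bc, se_bc, lor_ab, se_ab):
    """Bucher indirect comparison of A vs B via common comparator C,
    plus a z-test on the direct-vs-indirect difference in log odds ratios."""
    lor_indirect = lor_ac - lor_bc                 # indirect A-vs-B estimate
    se_indirect = np.sqrt(se_ac**2 + se_bc**2)
    diff = lor_ab - lor_indirect                   # inconsistency
    se_diff = np.sqrt(se_ab**2 + se_indirect**2)
    z = diff / se_diff
    return diff, 2 * stats.norm.sf(abs(z))         # difference and two-sided p

# hypothetical log odds ratios and standard errors
print(inconsistency_test(-0.40, 0.15, -0.10, 0.20, -0.55, 0.18))
```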
Parker, Elizabeth O; Chang, Jennifer; Thomas, Volker
2016-01-01
We examined the trends in quantitative research over the past 10 years in the Journal of Marital and Family Therapy (JMFT). Specifically, within the JMFT, we investigated the types and trends of research design and statistical analysis within the quantitative research published in JMFT from 2005 to 2014. We found that while the number of peer-reviewed articles has increased over time, the percentage of quantitative research has remained constant. We discussed the types and trends of statistical analysis and the implications for clinical work and training programs in the field of marriage and family therapy. © 2016 American Association for Marriage and Family Therapy.
Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA
Andrews, Brian D.; Brothers, Laura L.; Barnhardt, Walter A.
2010-01-01
Seafloor pockmarks occur worldwide and may represent millions of m3 of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km2 of bathymetry collected in the Belfast Bay, Maine (USA), pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter ratio field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as of similar concave features, such as impact craters, dolines, or salt pools.
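Field-wide clustering of point features such as pockmarks is often screened with a nearest-neighbor statistic; the Clark-Evans ratio below is a generic illustration of such a spatial statistic, not necessarily the authors' descriptive model:

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans_ratio(points, area):
    """Clark-Evans ratio R: R < 1 suggests clustering, R ~ 1 randomness,
    R > 1 dispersion, for `points` (n, 2) within a region of given area."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)                # k=2: nearest neighbor besides self
    observed = d[:, 1].mean()
    expected = 0.5 / np.sqrt(len(points) / area)  # mean NN distance under randomness
    return observed / expected

rng = np.random.default_rng(1)
pts = rng.uniform(0, 5000, size=(1767, 2))        # hypothetical pockmark centers (m)
print(clark_evans_ratio(pts, area=5000.0**2))
```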
Path analysis of the genetic integration of traits in the sand cricket: a novel use of BLUPs.
Roff, D A; Fairbairn, D J
2011-09-01
This study combines path analysis with quantitative genetics to analyse a key life history trade-off in the cricket, Gryllus firmus. We develop a path model connecting five traits associated with the trade-off between flight capability and reproduction and test this model using phenotypic data and estimates of breeding values (best linear unbiased predictors) from a half-sibling experiment. Strong support by both types of data validates our causal model and indicates concordance between the phenotypic and genetic expression of the trade-off. Comparisons of the trade-off between sexes and wing morphs reveal that these discrete phenotypes are not genetically independent and that the evolutionary trajectories of the two wing morphs are more tightly constrained to covary than those of the two sexes. Our results illustrate the benefits of combining a quantitative genetic analysis, which examines statistical correlations between traits, with a path model that focuses upon the causal components of variation. © 2011 The Authors. Journal of Evolutionary Biology © 2011 European Society For Evolutionary Biology.
Barkla, Bronwyn J.
2016-01-01
Modern day agriculture practice is narrowing the genetic diversity in our food supply. This may compromise the ability to obtain high yield under extreme climactic conditions, threatening food security for a rapidly growing world population. To identify genetic diversity, tolerance mechanisms of cultivars, landraces and wild relatives of major crops can be identified and ultimately exploited for yield improvement. Quantitative proteomics allows for the identification of proteins that may contribute to tolerance mechanisms by directly comparing protein abundance under stress conditions between genotypes differing in their stress responses. In this review, a summary is provided of the data accumulated from quantitative proteomic comparisons of crop genotypes/cultivars which present different stress tolerance responses when exposed to various abiotic stress conditions, including drought, salinity, high/low temperature, nutrient deficiency and UV-B irradiation. This field of research aims to identify molecular features that can be developed as biomarkers for crop improvement, however without accurate phenotyping, careful experimental design, statistical robustness and appropriate biomarker validation and verification it will be challenging to deliver what is promised. PMID:28248236
Comparison of a direct and indirect ELISA for quantitating antisperm antibody in semen.
Lynch, D M; Howe, S E
1987-01-01
A direct and an indirect quantitative ELISA for antisperm antibody were compared using the spermatozoa and cell-free seminal fluid of 66 infertile males. The normal concentration of sperm binding immunoglobulin was less than or equal to 1.5 fg Ig per spermatozoon for the indirect seminal plasma assay and less than or equal to 1.5 fg Ig per spermatozoon by the direct assay. Of the 66 infertile males, 21% (14/66) had elevated levels of antisperm antibody in their seminal plasma and 26% (17/66) had elevated levels bound directly to their spermatozoa. The direct correlation between the results of these assays was 94%. A simple linear regression analysis between the indirect and direct measurements of antisperm antibody resulted in a correlation coefficient of r = 0.907. There was no statistically significant difference between results from the direct and indirect methods of the patients as a group. However, there was evidence of autospecificity in a small percentage of males who had elevated levels of antisperm antibody by the direct assay that was not detected by the indirect assay using pooled donor spermatozoa.
Cielecka-Piontek, Judyta
2013-07-01
A simple and selective derivative spectrophotometric method was developed for the quantitative determination of faropenem in pure form and in pharmaceutical dosage form. The method is based on the zero-crossing effect of first-derivative spectrophotometry (λ = 324 nm), which eliminates the overlapping effect caused by the excipients present in the pharmaceutical preparation, as well as by degradation products formed during hydrolysis, oxidation, photolysis, and thermolysis. The method was linear in the concentration range 2.5-300 μg/mL (r = 0.9989) at λ = 341 nm; the limits of detection and quantitation were 0.16 and 0.46 μg/mL, respectively. The method had good precision (relative standard deviation from 0.68 to 2.13%). Recovery of faropenem ranged from 97.9 to 101.3%. The first-order rate constants of the degradation of faropenem in pure form and in pharmaceutical dosage form were determined using first-derivative spectrophotometry. A statistical comparison of the validation results and the observed rate constants for faropenem degradation with those obtained by the high-performance liquid chromatography method demonstrated that the two methods were compatible.
Qiu, Shanshan; Wang, Jun; Gao, Liping
2014-07-09
An electronic nose (E-nose) and an electronic tongue (E-tongue) have been used to characterize five types of strawberry juices based on processing approach (i.e., microwave pasteurization, steam blanching, high-temperature short-time pasteurization, frozen-thawed, and freshly squeezed). Juice quality parameters (vitamin C, pH, total soluble solids, total acid, and sugar/acid ratio) were measured by traditional methods. Multivariate statistical methods (linear discriminant analysis (LDA) and partial least squares regression (PLSR)) and machine-learning methods (random forest (RF) and support vector machines) were employed for qualitative classification and quantitative regression. The E-tongue system reached higher accuracy rates than the E-nose did, and simultaneous utilization of the two had an advantage in LDA classification and PLSR regression. According to cross-validation, RF showed outstanding and indisputable performance in both the qualitative and the quantitative analysis. This work indicates that the simultaneous utilization of E-nose and E-tongue can successfully discriminate processed fruit juices and predict quality parameters for the beverage industry.
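A rough sketch of the cross-validated random forest classification reported here, using scikit-learn with synthetic stand-in features (the study used fused E-nose and E-tongue signals and five juice classes):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 24))   # stand-in for fused E-nose + E-tongue features
y = rng.integers(0, 5, size=100)     # five hypothetical processing classes

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```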
Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki
2017-06-01
Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of plant materials (curled leaves, faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and distinctively quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
Reconstruction of three-dimensional porous media using a single thin section
NASA Astrophysics Data System (ADS)
Tahmasebi, Pejman; Sahimi, Muhammad
2012-06-01
The purpose of any reconstruction method is to generate realizations of two- or multiphase disordered media that honor limited data for them, with the hope that the realizations provide accurate predictions for those properties of the media for which there are no data available, or their measurement is difficult. An important example of such stochastic systems is porous media for which the reconstruction technique must accurately represent their morphology—the connectivity and geometry—as well as their flow and transport properties. Many of the current reconstruction methods are based on low-order statistical descriptors that fail to provide accurate information on the properties of heterogeneous porous media. On the other hand, due to the availability of high resolution two-dimensional (2D) images of thin sections of a porous medium, and at the same time, the high cost, computational difficulties, and even unavailability of complete 3D images, the problem of reconstructing porous media from 2D thin sections remains an outstanding unsolved problem. We present a method based on multiple-point statistics in which a single 2D thin section of a porous medium, represented by a digitized image, is used to reconstruct the 3D porous medium to which the thin section belongs. The method utilizes a 1D raster path for inspecting the digitized image, and combines it with a cross-correlation function, a grid splitting technique for deciding the resolution of the computational grid used in the reconstruction, and the Shannon entropy as a measure of the heterogeneity of the porous sample, in order to reconstruct the 3D medium. It also utilizes an adaptive technique for identifying the locations and optimal number of hard (quantitative) data points that one can use in the reconstruction process. The method is tested on high resolution images for Berea sandstone and a carbonate rock sample, and the results are compared with the data. To make the comparison quantitative, two sets of statistical tests consisting of the autocorrelation function, histogram matching of the local coordination numbers, the pore and throat size distributions, multiple-points connectivity, and single- and two-phase flow permeabilities are used. The comparison indicates that the proposed method reproduces the long-range connectivity of the porous media, with the computed properties being in good agreement with the data for both porous samples. The computational efficiency of the method is also demonstrated.
ERIC Educational Resources Information Center
Liau, Albert K.; Kiat, John E.; Nie, Youyan
2015-01-01
The purpose of this study was to examine the extent to which the pedagogical approaches used in the course were related to improvements in students' attitudes toward statistics in a Quantitative Methods course for psychology undergraduate students in a Malaysian University. The study examined whether increasing availability of the instructor and…
Supe, S; Milicić, J; Pavićević, R
1997-06-01
Recent studies on the etiopathogenesis of multiple sclerosis (MS) all point out that there is a polygenetic predisposition for this illness. The so-called "MS trait" determines the reactivity of the immunological system to ecological factors. The development of glyphological science and the study of the characteristics of the digito-palmar dermatoglyphic complex (which have been established to be polygenetically determined characteristics) enable a better insight into genetic development during early embryogenesis. The aim of this study was to estimate differences in the dermatoglyphics of digito-palmar complexes between a group with multiple sclerosis and comparable, phenotypically healthy groups of both sexes. This study is based on the analysis of 18 quantitative characteristics of the digito-palmar complex in 125 patients with multiple sclerosis (41 males and 84 females) in comparison to a group of 400 phenotypically healthy controls (200 males and 200 females). The analysis pointed towards a statistically significant decrease in the number of digital and palmar ridges, as well as lower values of atd angles, in the MS patients of both sexes. The main discriminators were the characteristic palmar dermatoglyphics, and discriminant analysis correctly classified over 80% of the examinees. The results of this study suggest a possible discrimination of patients with MS from the phenotypically healthy population through analysis of dermatoglyphic status, and therefore support the possibility that multiple sclerosis is a genetically predisposed disease.
Lee, Ji Won; Lee, Geewon; Lee, Nam Kyung; Moon, Jin Il; Ju, Yun Hye; Suh, Young Ju; Jeong, Yeon Joo
2016-01-01
The aim of the study was to assess the effectiveness of the adaptive statistical iterative reconstruction (ASIR) for dual-energy computed tomography pulmonary angiography (DE-CTPA) with a reduced iodine load. One hundred forty patients referred for chest CT were randomly divided into a DE-CTPA group with a reduced iodine load or a standard CTPA group. Quantitative and qualitative image qualities of virtual monochromatic spectral (VMS) images with filtered back projection (VMS-FBP) and those with 50% ASIR (VMS-ASIR) in the DE-CTPA group were compared. Image qualities of VMS-ASIR images in the DE-CTPA group and ASIR images in the standard CTPA group were also compared. All quantitative and qualitative indices, except attenuation value of pulmonary artery in the VMS-ASIR subgroup, were superior to those in the VMS-FBP subgroup (all P < 0.001). Noise and signal-to-noise ratio of VMS-ASIR images were superior to those of ASIR images in the standard CTPA group (P < 0.001 and P = 0.007, respectively). Regarding qualitative indices, noise was significantly lower in VMS-ASIR images of the DE-CTPA group than in ASIR images of the standard CTPA group (P = 0.001). The ASIR technique tends to improve the image quality of VMS imaging. Dual-energy computed tomography pulmonary angiography with ASIR can reduce contrast medium volume and produce images of comparable quality with those of standard CTPA.
Watanabe, Hiroshi
2012-01-01
Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
Kim, Min Soon; Rodney, William N; Cooper, Tara; Kite, Chris; Reece, Gregory P; Markey, Mia K
2009-02-01
Scarring is a significant cause of dissatisfaction for women who undergo breast surgery. Scar tissue may be clinically distinguished from normal skin by aberrant colour, rough surface texture, increased thickness (hypertrophy) and firmness. Colorimeters or spectrophotometers can be used to quantitatively assess scar colour, but they require direct patient interaction and can cost thousands of dollars. By comparison, digital photography is already in widespread use to document clinical outcomes and requires less patient interaction. Thus, assessment of scar coloration by digital photography is an attractive alternative. The goal of this study was to compare colour measurements obtained by digital photography and colorimetry. Agreements between photographic and colorimetric measurements of colour were evaluated. Experimental conditions were controlled by performing measurements on artificial scars created by a make-up artist. The colorimetric measurements of the artificial scars were compared with those reported in the literature for real scars in order to confirm the validity of this approach. We assessed the agreement between the colorimetric and photographic measurements of colour using a hypothesis test for equivalence, the intraclass correlation coefficient and the Bland-Altman method. Overall, good agreement was obtained for three parameters (L*a*b*) measured by colorimetry and photography from the results of three statistical analyses. Colour measurements obtained by digital photography were equivalent to those obtained using colorimetry. Thus, digital photography is a reliable, cost-effective measurement method of skin colour and should be further investigated for quantitative analysis of surgical outcomes.
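The Bland-Altman analysis used above reduces to a bias and 95% limits of agreement computed on paired differences; a minimal sketch for one color parameter (values invented):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

colorimeter_L = [62.1, 58.4, 65.0, 60.2, 59.7]   # hypothetical L* readings
photograph_L = [61.5, 59.0, 64.1, 60.8, 58.9]
print(bland_altman(colorimeter_L, photograph_L))
```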
Comparison of quantitative EEG characteristics of quiet and active sleep in newborns.
Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil
2003-11-01
The aim of the present study was to verify whether the proposed method of computer-supported EEG analysis is able to differentiate the EEG activity in quiet sleep (QS) from that in active sleep (AS) in newborns. A quantitative description of the neonatal EEG may contribute to a more exact evaluation of the functional state of the brain, as well as to a refinement of diagnostics of brain dysfunction manifesting itself frequently as 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy newborns (10 full-term and 11 pre-term) were examined polygraphically (eight-channel EEG, respiration, ECG, EOG and EMG) in the course of sleep. From each EEG record, two 5-min samples (one from QS and one from AS) were subject to an off-line computerized analysis. The obtained data were averaged with respect to the sleep state and to the conceptional age. The number of variables was reduced by means of factor analysis. All factors identified by factor analysis were highly significantly influenced by sleep states in both developmental periods. Likewise, a comparison of the measured variables between QS and AS revealed many statistically significant differences. The variables describing (a) the number and length of quasi-stationary segments, (b) voltage and (c) power in the delta and theta bands contributed to the greatest degree to the differentiation of EEGs between the two sleep states. The presented method of computerized EEG analysis, which has good discriminative potential, is adequately sensitive and describes the neonatal EEG with convenient accuracy.
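Spectral power in the delta and theta bands, one of the discriminating variable groups above, can be estimated from a Welch periodogram; a sketch with white noise standing in for an EEG segment and an assumed sampling rate:

```python
import numpy as np
from scipy.signal import welch

fs = 128                                     # Hz, assumed sampling rate
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 300)          # stand-in for a 5-min EEG sample

f, psd = welch(eeg, fs=fs, nperseg=4 * fs)   # power spectral density

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])      # integrate PSD over the band

print("delta:", band_power(f, psd, 0.5, 4.0))
print("theta:", band_power(f, psd, 4.0, 8.0))
```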
Single-cell real-time imaging of transgene expression upon lipofection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiume, Giuseppe; Di Rienzo, Carmine (NEST, Scuola Normale Superiore and Istituto Nanoscienze-CNR, Piazza San Silvestro 12, 56127 Pisa)
2016-05-20
Here we address the process of lipofection by quantifying the expression of a genetically-encoded fluorescent reporter at the single-cell level, and in real-time, by confocal imaging in live cells. The Lipofectamine gold-standard formulation is compared to the alternative promising DC-Chol/DOPE formulation. In both cases, we report that only dividing cells are able to produce a detectable amount of the fluorescent reporter protein. Notably, by measuring fluorescence over time in each pair of daughter cells, we find that Lipofectamine-based transfection statistically yields a remarkably higher degree of “symmetry” in protein expression between daughter cells as compared to DC-Chol/DOPE. A model is envisioned in which the degree of symmetry of protein expression is linked to the number of bioavailable DNA copies within the cell before nuclear breakdown. Reported results open new perspectives for the understanding of the lipofection mechanism and define a new experimental platform for the quantitative comparison of transfection reagents. Highlights: • The process of lipofection is followed by quantifying the transgene expression in real time. • The Lipofectamine gold-standard is compared to the promising DC-Chol/DOPE formulation. • We report that only dividing cells are able to produce the fluorescent reporter protein. • The degree of symmetry of protein expression in daughter cells is linked to DNA bioavailability. • A new experimental platform for the quantitative comparison of transfection reagents is proposed.
Arabidopsis phenotyping through Geometric Morphometrics.
Manacorda, Carlos A; Asurmendi, Sebastian
2018-06-18
Recently, much technical progress was achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation make it now possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources make it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of platform and segmentation software used are still lacking and shape descriptions still rely on ad hoc or even sometimes contradictory descriptors, which could make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics studies. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape distance units. Here, a particular scheme of landmarks placement on Arabidopsis rosette images is proposed to study shape variation in the case of viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic features extraction and processing in an objective, reproducible manner.
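Landmark-based shape comparison of the kind described rests on Procrustes superimposition, which removes location, scale, and rotation before measuring shape distance; a minimal sketch with invented rosette landmarks (scipy's disparity is one of several shape-distance conventions):

```python
import numpy as np
from scipy.spatial import procrustes

# Hypothetical (n_landmarks, 2) coordinates for a control and an infected rosette
control = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 1.5]])
infected = np.array([[0.0, 0.0], [1.1, 0.0], [1.0, 0.9], [-0.1, 1.0], [0.5, 1.3]])

# procrustes() standardizes both configurations (location, scale, rotation)
m1, m2, disparity = procrustes(control, infected)
print("Procrustes disparity (sum of squared differences):", disparity)
```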
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
Singh, Param Priya; Arora, Jatin; Isambert, Hervé
2015-07-01
Whole genome duplications (WGD) have now been firmly established in all major eukaryotic kingdoms. In particular, all vertebrates descend from two rounds of WGDs, that occurred in their jawless ancestor some 500 MY ago. Paralogs retained from WGD, also coined 'ohnologs' after Susumu Ohno, have been shown to be typically associated with development, signaling and gene regulation. Ohnologs, which amount to about 20 to 35% of genes in the human genome, have also been shown to be prone to dominant deleterious mutations and frequently implicated in cancer and genetic diseases. Hence, identifying ohnologs is central to better understand the evolution of vertebrates and their susceptibility to genetic diseases. Early computational analyses to identify vertebrate ohnologs relied on content-based synteny comparisons between the human genome and a single invertebrate outgroup genome or within the human genome itself. These approaches are thus limited by lineage specific rearrangements in individual genomes. We report, in this study, the identification of vertebrate ohnologs based on the quantitative assessment and integration of synteny conservation between six amniote vertebrates and six invertebrate outgroups. Such a synteny comparison across multiple genomes is shown to enhance the statistical power of ohnolog identification in vertebrates compared to earlier approaches, by overcoming lineage specific genome rearrangements. Ohnolog gene families can be browsed and downloaded for three statistical confidence levels or recompiled for specific, user-defined, significance criteria at http://ohnologs.curie.fr/. In the light of the importance of WGD on the genetic makeup of vertebrates, our analysis provides a useful resource for researchers interested in gaining further insights on vertebrate evolution and genetic diseases. PMID:26181593
Physical validation of a patient-specific contact finite element model of the ankle.
Anderson, Donald D; Goldsworthy, Jane K; Li, Wendy; James Rudert, M; Tochigi, Yuki; Brown, Thomas D
2007-01-01
A validation study was conducted to determine the extent to which computational ankle contact finite element (FE) results agreed with experimentally measured tibio-talar contact stress. Two cadaver ankles were loaded in separate test sessions, during which ankle contact stresses were measured with a high-resolution (Tekscan) pressure sensor. Corresponding contact FE analyses were subsequently performed for comparison. The agreement was good between FE-computed and experimentally measured mean (3.2% discrepancy for one ankle, 19.3% for the other) and maximum (1.5% and 6.2%) contact stress, as well as for contact area (1.7% and 14.9%). There was also excellent agreement between histograms of fractional areas of cartilage experiencing specific ranges of contact stress. Finally, point-by-point comparisons between the computed and measured contact stress distributions over the articular surface showed substantial agreement, with correlation coefficients of 90% for one ankle and 86% for the other. In the past, general qualitative agreement, but little direct quantitative agreement, has been demonstrated for articular joint contact FE models. The methods used for this validation enable formal comparison of computational and experimental results, and open the way for objective statistical measures of regional correlation between FE-computed contact stress distributions from comparison articular joint surfaces (e.g., those from an intact joint versus those with residual intra-articular fracture incongruity).
Quantitative trait nucleotide analysis using Bayesian model selection.
Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D
2005-10-01
Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
Identification of common coexpression modules based on quantitative network comparison.
Jo, Yousang; Kim, Sanghyeon; Lee, Doheon
2018-06-13
Finding common molecular interactions from different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and methods developed for other network types cannot be applied to coexpression networks. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use the new method to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for quantitative comparison of coexpression networks. Then, we performed experiments using known coexpression networks. We showed the validity of the two measures and derived threshold values for similar coexpression network pairs from the experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell signalling related to cell death and immune/inflammation response may be a common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.
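The abstract does not specify the two proposed similarity measures; as a generic stand-in, gene-overlap similarity between module pairs is often quantified with a Jaccard index:

```python
def jaccard(module_a, module_b):
    """Gene-overlap similarity between two coexpression modules."""
    a, b = set(module_a), set(module_b)
    return len(a & b) / len(a | b)

# hypothetical module gene sets
hd_module = {"GFAP", "AQP4", "C3", "CD44"}
aging_module = {"GFAP", "C3", "CD44", "B2M", "HLA-A"}
print(f"Jaccard similarity: {jaccard(hd_module, aging_module):.2f}")
```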
Making predictions of mangrove deforestation: a comparison of two methods in Kenya.
Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A
2013-11-01
Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk; a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.
McEvoy, Maureen P; Lewis, Lucy K; Luker, Julie
2018-05-11
Dedicated Evidence-Based Practice (EBP) courses are often included in health professional education programs, and it is important to understand the effectiveness of this training. This study investigated EBP outcomes in entry-level physiotherapy students from baseline to completion of all EBP training (graduation). Mixed methods with an explanatory sequential design. Physiotherapy students completed two psychometrically-tested health professional EBP instruments at baseline and graduation. The Evidence-Based Practice Profile questionnaire collected self-reported data (Terminology, Confidence, Practice, Relevance, Sympathy), and the Knowledge of Research Evidence Competencies instrument collected objective data (Actual Knowledge). Focus groups with students were conducted at graduation to gain a deeper understanding of the factors impacting changes in students' EBP knowledge, attitudes, behaviour and competency. Descriptive statistics, paired t-tests, 95% CI and effect sizes (ES) were used to examine changes in outcome scores from baseline to graduation. Transcribed focus group data were analysed following a qualitative descriptive approach with thematic analysis. A second stage of merged data analysis for mixed methods studies was undertaken using side-by-side comparisons to explore quantitatively assessed EBP measures against participants' personal perceptions. Data were analysed from 56 participants who completed both instruments at baseline and graduation, and from 21 focus group participants. Large ES were reported across most outcomes: Relevance (ES 2.29, p ≤ 0.001), Practice (1.8, p ≤ 0.001), Confidence (1.67, p ≤ 0.001), Terminology (3.13, p ≤ 0.001) and Actual Knowledge (4.3, p ≤ 0.001). A medium ES was found for Sympathy (0.49, p = 0.008). Qualitative and quantitative findings mostly aligned, but for statistical terminology participants' self-reported understanding was at odds with the experiences reported in focus groups. Qualitative findings highlighted the importance of providing relevant context and positive role models for students during EBP training. Following EBP training across an entry-level physiotherapy program, there were qualitative and significant quantitative changes in participants' knowledge and perceptions of EBP. The qualitative and quantitative findings were mainly well-aligned, with the exception of the Terminology domain, where the qualitative findings did not support the strength of the effect reported quantitatively. The findings of this study have implications for the timing and content of EBP curricula in entry-level health professional programs.
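The paired baseline-to-graduation comparisons with effect sizes correspond to a paired t-test plus a standardized mean difference; a sketch with invented scores (effect-size conventions for paired designs vary, and the denominator below, the SD of the differences, is one common choice):

```python
import numpy as np
from scipy import stats

baseline = np.array([45, 50, 38, 60, 55, 42], float)    # hypothetical scores
graduation = np.array([70, 72, 65, 80, 78, 66], float)

t, p = stats.ttest_rel(graduation, baseline)            # paired t-test
diff = graduation - baseline
d = diff.mean() / diff.std(ddof=1)                      # paired effect size
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```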
Investigating Children's Abilities to Count and Make Quantitative Comparisons
ERIC Educational Resources Information Center
Lee, Joohi; Md-Yunus, Sham'ah
2016-01-01
This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…
Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.
Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C
2018-01-01
Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of substantial image pairs, we present a segmentation workflow that is applicable to both THG and fluorescence images, achieving precisions of 91.3% and 95.8%, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are two times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and with more specific staining experiments for cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2015-07-15
Long-term effects on cancer survivors' quality of life of physical training versus physical training combined with cognitive-behavioral therapy.
Comparison of Neural Network and Linear Regression Models in Statistically Predicting Mental and Physical Health Status of Breast Cancer Survivors.
40 CFR 796.2750 - Sediment and soil adsorption isotherm.
Code of Federal Regulations, 2014 CFR
2014-07-01
... are highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...
40 CFR 796.2750 - Sediment and soil adsorption isotherm.
Code of Federal Regulations, 2013 CFR
2013-07-01
... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...
40 CFR 796.2750 - Sediment and soil adsorption isotherm.
Code of Federal Regulations, 2012 CFR
2012-07-01
... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
The Local Geometry of Multiattribute Tradeoff Preferences
McGeachie, Michael; Doyle, Jon
2011-01-01
Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018
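In standard economic notation (an assumption here, since the abstract does not reproduce the paper's formalism), the marginal rate of substitution between attributes for a differentiable utility function is a ratio of partial derivatives, and a quantitative tradeoff statement bounds that ratio:

```latex
\[
\operatorname{MRS}_{ij}(p) \;=\; \left.\frac{\partial u/\partial x_i}{\partial u/\partial x_j}\right|_{p},
\qquad
\text{``one unit of } x_i \text{ is worth at least } k \text{ units of } x_j \text{ at } p\text{''}
\;\Longleftrightarrow\;
\operatorname{MRS}_{ij}(p) \ge k.
\]
```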
Strifling, Kelly M B; Lu, Na; Wang, Mei; Cao, Kevin; Ackman, Jeffrey D; Klein, John P; Schwab, Jeffrey P; Harris, Gerald F
2008-10-01
This prospective study analyzes the upper extremity kinematics of 10 children with spastic diplegic cerebral palsy using anterior and posterior walkers. Although both types of walkers are commonly prescribed by clinicians, no quantitative data comparing the two with regard to upper extremity motion have been published. The study methodology included testing of each subject with both types of walkers in a motion analysis laboratory after an acclimation period of at least 1 month. Overall results showed that, statistically, the two walkers are relatively similar. With both anterior and posterior walkers, the shoulders were extended, elbows flexed, and wrists extended. Energy expenditure, walking speed and stride length were also similar with both walker types. Several differences, although not statistically significant, were also noted. Anterior torso tilt was reduced with the posterior walker, and shoulder extension and elbow flexion were increased. Outcomes analysis indicated that differences in upper extremity torso and joint motion were not dependent on spasticity or hand dominance. These findings may help to build an understanding of upper extremity motion in walker-assisted gait and potentially to improve walker prescription.
The exposure-crossover design is a new method for studying sustained changes in recurrent events.
Redelmeier, Donald A
2013-09-01
To introduce a new design that explores how an acute exposure might lead to a sustained change in the risk of a recurrent outcome. The exposure-crossover design uses self-matching to control within-person confounding due to genetics, personality, and all other stable patient characteristics. The design is demonstrated using population-based individual-level health data from Ontario, Canada, for three separate medical conditions (n > 100,000 for each) related to the risk of a motor vehicle crash (total outcomes, >2,000 for each). The exposure-crossover design yields numerical risk estimates during the baseline interval before an intervention, the induction interval immediately ahead of the intervention, and the subsequent interval after the intervention. Accompanying graphs summarize results, provide an intuitive display to readers, and show risk comparisons (absolute and relative). Self-matching increases statistical efficiency, reduces selection bias, and yields quantitative analyses. The design has potential limitations related to confounding, artifacts, pragmatics, survivor bias, statistical models, potential misunderstandings, and serendipity. The exposure-crossover design may help in exploring selected questions in epidemiology science. Copyright © 2013 Elsevier Inc. All rights reserved.
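To make the interval contrasts concrete, here is a minimal numeric sketch of the exposure-crossover comparison described above; the event counts, person-time denominators, and interval labels are hypothetical illustrations, not the study's Ontario data.

    # Hypothetical exposure-crossover contrasts: event rates in the baseline,
    # induction, and subsequent intervals, compared within the same persons.
    def rate(events, person_time):
        return events / person_time

    baseline = rate(120, 10000.0)    # before the intervention
    induction = rate(45, 2000.0)     # immediately ahead of the intervention
    subsequent = rate(80, 10000.0)   # after the intervention

    print("induction vs baseline relative risk:", round(induction / baseline, 2))
    print("subsequent vs baseline relative risk:", round(subsequent / baseline, 2))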
A carcinogenic potency database of the standardized results of animal bioassays
Gold, Lois Swirsky; Sawyer, Charles B.; Magaw, Renae; Backman, Georganne M.; De Veciana, Margarita; Levinson, Robert; Hooper, N. Kim; Havender, William R.; Bernstein, Leslie; Peto, Richard; Pike, Malcolm C.; Ames, Bruce N.
1984-01-01
The preceding paper described our numerical index of carcinogenic potency, the TD50 and the statistical procedures adopted for estimating it from experimental data. This paper presents the Carcinogenic Potency Database, which includes results of about 3000 long-term, chronic experiments of 770 test compounds. Part II is a discussion of the sources of our data, the rationale for the inclusion of particular experiments and particular target sites, and the conventions adopted in summarizing the literature. Part III is a guide to the plot of results presented in Part IV. A number of appendices are provided to facilitate use of the database. The plot includes information about chronic cancer tests in mammals, such as dose and other aspects of experimental protocol, histopathology and tumor incidence, TD50 and its statistical significance, dose response, author's opinion and literature reference. The plot readily permits comparisons of carcinogenic potency and many other aspects of cancer tests; it also provides quantitative information about negative tests. The range of carcinogenic potency is over 10 million-fold. PMID:6525996
NASA Astrophysics Data System (ADS)
Yu, Fu-Yun; Liu, Yu-Hsin
2005-09-01
This study examined the potential value of a multiple-choice question-construction instructional strategy for supporting students' learning of physics experiments. Forty-two university freshmen participated for a whole semester. A constant comparison method, adopted to categorize students' qualitative data, indicated that the influence of multiple-choice question construction was evident in several significant ways (promoting constructive and productive study habits; reflecting on and previewing course-related materials; increasing in-group communication and interaction; breaking passive learning styles and habits, etc.), which, taken together, not only enhanced students' comprehension and retention of the acquired knowledge but also helped instill a sense of empowerment and learning community among the participants. Analysis of the quantitative data with one-group t-tests, using 3 as the expected mean, further found that students' satisfaction with their past learning experience and perceptions of this strategy's potential for promoting learning were statistically significant at the 0.0005 level, while learning anxiety was not. Suggestions are offered for incorporating question-generation activities in the classroom and for topics of future study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu
2014-01-15
According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: • A new method of clinker characterization. • Combination of electron probe technique with cluster analysis. • Simultaneous assessment of phase abundance, composition and bulk chemistry. • Experimental validation performed on industrial clinkers.
Johnston, Iain G; Burgstaller, Joerg P; Havlicek, Vitezslav; Kolbe, Thomas; Rülicke, Thomas; Brem, Gottfried; Poulton, Jo; Jones, Nick S
2015-01-01
Dangerous damage to mitochondrial DNA (mtDNA) can be ameliorated during mammalian development through a highly debated mechanism called the mtDNA bottleneck. Uncertainty surrounding this process limits our ability to address inherited mtDNA diseases. We produce a new, physically motivated, generalisable theoretical model for mtDNA populations during development, allowing the first statistical comparison of proposed bottleneck mechanisms. Using approximate Bayesian computation and mouse data, we find most statistical support for a combination of binomial partitioning of mtDNAs at cell divisions and random mtDNA turnover, meaning that the debated exact magnitude of mtDNA copy number depletion is flexible. New experimental measurements from a wild-derived mtDNA pairing in mice confirm the theoretical predictions of this model. We analytically solve a mathematical description of this mechanism, computing probabilities of mtDNA disease onset, efficacy of clinical sampling strategies, and effects of potential dynamic interventions, thus developing a quantitative and experimentally-supported stochastic theory of the bottleneck. DOI: http://dx.doi.org/10.7554/eLife.07464.001 PMID:26035426
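As an illustration of the approximate Bayesian computation (ABC) approach named above, the following toy rejection-ABC sketch infers the number of segregating cell divisions in a simple binomial-partitioning model of heteroplasmy; the prior, population sizes, tolerance, and "observed" variance are all invented for illustration and do not reproduce the paper's model or mouse data.

    import numpy as np

    # Toy rejection ABC: sample from the prior, simulate, accept draws whose
    # simulated summary statistic lands close to the observed value.
    rng = np.random.default_rng(0)
    observed_var = 0.04                      # hypothetical heteroplasmy variance

    def simulate(divisions, cells=500, copies=100, h0=0.5):
        h = np.full(cells, h0)
        for _ in range(divisions):
            h = rng.binomial(copies, h) / copies   # binomial sampling per division
        return h.var()

    draws = rng.integers(1, 30, size=2000)         # uniform prior on divisions
    accepted = [d for d in draws if abs(simulate(d) - observed_var) < 0.005]
    print("posterior mean number of divisions:", np.mean(accepted))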
Racial Differences in Quantitative Measures of Area and Volumetric Breast Density
McCarthy, Anne Marie; Keller, Brad M.; Pantalone, Lauren M.; Hsieh, Meng-Kang; Synnestvedt, Marie; Conant, Emily F.; Armstrong, Katrina; Kontos, Despina
2016-01-01
Background: Increased breast density is a strong risk factor for breast cancer and also decreases the sensitivity of mammographic screening. The purpose of our study was to compare breast density for black and white women using quantitative measures. Methods: Breast density was assessed among 5282 black and 4216 white women screened using digital mammography. Breast Imaging-Reporting and Data System (BI-RADS) density was obtained from radiologists’ reports. Quantitative measures for dense area, area percent density (PD), dense volume, and volume percent density were estimated using validated, automated software. Breast density was categorized as dense or nondense based on BI-RADS categories or based on values above and below the median for quantitative measures. Logistic regression was used to estimate the odds of having dense breasts by race, adjusted for age, body mass index (BMI), age at menarche, menopause status, family history of breast or ovarian cancer, parity and age at first birth, and current hormone replacement therapy (HRT) use. All statistical tests were two-sided. Results: There was a statistically significant interaction of race and BMI on breast density. After accounting for age, BMI, and breast cancer risk factors, black women had statistically significantly greater odds of high breast density across all quantitative measures (eg, PD nonobese odds ratio [OR] = 1.18, 95% confidence interval [CI] = 1.02 to 1.37, P = .03, PD obese OR = 1.26, 95% CI = 1.04 to 1.53, P = .02). There was no statistically significant difference in BI-RADS density by race. Conclusions: After accounting for age, BMI, and other risk factors, black women had higher breast density than white women across all quantitative measures previously associated with breast cancer risk. These results may have implications for risk assessment and screening. PMID:27130893
Nakagami-based total variation method for speckle reduction in thyroid ultrasound images.
Koundal, Deepika; Gupta, Savita; Singh, Sukhwinder
2016-02-01
A good statistical model is necessary for the reduction in speckle noise. The Nakagami model is more general than the Rayleigh distribution for statistical modeling of speckle in ultrasound images. In this article, the Nakagami-based noise removal method is presented to enhance thyroid ultrasound images and to improve clinical diagnosis. The statistics of log-compressed image are derived from the Nakagami distribution following a maximum a posteriori estimation framework. The minimization problem is solved by optimizing an augmented Lagrange and Chambolle's projection method. The proposed method is evaluated on both artificial speckle-simulated and real ultrasound images. The experimental findings reveal the superiority of the proposed method both quantitatively and qualitatively in comparison with other speckle reduction methods reported in the literature. The proposed method yields an average signal-to-noise ratio gain of more than 2.16 dB over the non-convex regularizer-based speckle noise removal method, 3.83 dB over the Aubert-Aujol model, 1.71 dB over the Shi-Osher model and 3.21 dB over the Rudin-Lions-Osher model on speckle-simulated synthetic images. Furthermore, visual evaluation of the despeckled images shows that the proposed method suppresses speckle noise well while preserving the textures and fine details. © IMechE 2015.
Wathen, John B; Lazorchak, James M; Olsen, Anthony R; Batt, Angela
2015-03-01
The U.S. EPA conducted a national statistical survey of fish fillet tissue with a sample size of 541 sites on boatable rivers of ≥5th order in 2008-2009. This is the first such study of mercury (Hg) in fish tissue from river sites focused on potential impacts to human health from fish consumption that also addresses wildlife impacts. Sample sites were identified as urban or non-urban. All sample mercury concentrations were above the 3.33 µg kg(-1) (ppb) quantitation limit, and an estimated 25.4% (±4.4%) of the 51,663 river miles assessed exceeded the U.S. EPA 300 µg kg(-1) fish-tissue-based water quality criterion for mercury, representing 13,144 ± 181.8 river miles. Estimates of river miles exceeding comparable aquatic life thresholds (translated from fillet concentrations to whole-fish equivalents) for avian species were similar to the number of river miles exceeding the human health threshold, whereas some mammalian species were at risk from lower mercury concentrations than humans. A comparison of means between the non-urban and urban data and among three ecoregions did not indicate a statistically significant difference in fish tissue Hg concentrations at p<0.05. Published by Elsevier Ltd.
Hofman, Abe D.; Visser, Ingmar; Jansen, Brenda R. J.; van der Maas, Han L. J.
2015-01-01
We propose and test three statistical models for the analysis of children’s responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development. PMID:26505905
Selected 1966-69 interior Alaska wildfire statistics with long-term comparisons.
Richard J. Barney
1971-01-01
This paper presents selected interior Alaska forest and range wildfire statistics for the period 1966-69. Comparisons are made with the decade 1956-65 and the 30-year period 1940-69, which are essentially the total recorded statistical history on wildfires available for Alaska.
Some aspects of robotics calibration, design and control
NASA Technical Reports Server (NTRS)
Tawfik, Hazem
1990-01-01
The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.
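A minimal sketch of the kind of three-dimensional repeatability statistic described above, assuming repeatability is summarized as the RMS distance of repeated end-effector positions from their centroid; the poses are hypothetical, and the exact statistic used in the original analysis may differ.

    import numpy as np

    # RMS distance of repeated end-effector positions from their centroid,
    # one common way to summarize three-dimensional repeatability.
    def repeatability(positions):
        p = np.asarray(positions, dtype=float)
        centroid = p.mean(axis=0)
        return np.sqrt(((p - centroid) ** 2).sum(axis=1).mean())

    # Hypothetical repeated moves to the same commanded pose (mm).
    poses = [[100.02, 50.01, 24.99], [99.98, 49.97, 25.03], [100.05, 50.02, 25.01]]
    print(f"repeatability = {repeatability(poses):.3f} mm")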
Lopes, Vagner José; Shmeil, Marcos Augusto Hochuli
2017-04-27
To compare computer-generated guidelines with and without the use of a Clinical Decision Support System - Oncology Care and Healthcare for Chemotherapy Patients, for the caregivers of children undergoing chemotherapy. This is a descriptive, evaluative, and quantitative study conducted at a paediatric hospital in Curitiba, Paraná, Brazil, from December 2015 to January 2016. The sample consisted of 58 participants divided into two groups: Group 1, without the aid of the software, and Group 2, with the aid of the software. The data were analysed using the Mann-Whitney U test. The comparison of guidelines revealed statistical significance (p<0.05), with higher mean concordance in Group 2 than in Group 1. Computer-generated guidelines are a valuable qualitative support tool for nurses.
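A minimal sketch of the two-group comparison reported above, using SciPy's Mann-Whitney U test on hypothetical concordance scores (the study's data are not reproduced here).

    from scipy.stats import mannwhitneyu

    group1 = [3, 4, 4, 5, 3, 4, 2, 5]  # without software support (hypothetical)
    group2 = [5, 5, 4, 5, 5, 4, 5, 5]  # with software support (hypothetical)

    stat, p = mannwhitneyu(group1, group2, alternative="two-sided")
    print(f"U = {stat}, p = {p:.4f}")  # p < 0.05 would indicate a group difference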
Comparison of in silico models for prediction of mutagenicity.
Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas
2013-01-01
Using a dataset of more than 6000 compounds, the performance of eight quantitative structure-activity relationship (QSAR) models was evaluated: ACD/Tox Suite; Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) Predictor; Derek; Toxicity Estimation Software Tool (T.E.S.T.); TOxicity Prediction by Komputer Assisted Technology (TOPKAT); Toxtree; CAESAR; and SARpy (SAR in python). In general, the results showed a high level of performance. To obtain a realistic estimate of predictive ability, the results for chemicals inside and outside the training set of each model were considered. The effect of applicability domain tools (when available) on prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and a combination of both methods. Models based on statistical QSAR methods gave better results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamachi La Commare, Kristina
Metrics for reliability, such as the frequency and duration of power interruptions, have been reported by electric utilities for many years. This study examines current utility practices for collecting and reporting electricity reliability information and discusses challenges that arise in assessing reliability because of differences among these practices. The study is based on reliability information for year 2006 reported by 123 utilities in 37 states representing over 60 percent of total U.S. electricity sales. We quantify the effects that inconsistencies among current utility reporting practices have on comparisons of the System Average Interruption Duration Index (SAIDI) and System Average Interruption Frequency Index (SAIFI) reported by utilities. We recommend immediate adoption of IEEE Std. 1366-2003 as a consistent method for measuring and reporting reliability statistics.
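For reference, a short sketch of the two indices under their usual IEEE Std 1366 definitions: SAIDI is total customer-minutes of interruption divided by customers served, and SAIFI is total customer interruptions divided by customers served. The outage records below are hypothetical.

    # 'events' is a hypothetical list of (customers_interrupted, duration_minutes).
    def saidi_saifi(events, customers_served):
        total_customer_minutes = sum(n * d for n, d in events)
        total_customer_interruptions = sum(n for n, _ in events)
        saidi = total_customer_minutes / customers_served        # minutes per customer
        saifi = total_customer_interruptions / customers_served  # interruptions per customer
        return saidi, saifi

    events = [(1200, 90), (300, 45), (5000, 15)]  # hypothetical outage records
    print(saidi_saifi(events, customers_served=50000))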
Sequential Inverse Problems: Bayesian Principles and the Logistic Map Example
NASA Astrophysics Data System (ADS)
Duan, Lian; Farmer, Chris L.; Moroz, Irene M.
2010-09-01
Bayesian statistics provides a general framework for solving inverse problems, but is not without interpretation and implementation problems. This paper discusses difficulties arising from the fact that forward models are always in error to some extent. Using a simple example based on the one-dimensional logistic map, we argue that, when implementation problems are minimal, the Bayesian framework is quite adequate. In this paper the Bayesian Filter is shown to be able to recover excellent state estimates in the perfect model scenario (PMS) and to distinguish the PMS from the imperfect model scenario (IMS). Through a quantitative comparison of the way in which the observations are assimilated in both the PMS and the IMS scenarios, we suggest that one can, sometimes, measure the degree of imperfection.
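A minimal sketch of a Bayesian filter for the logistic map in the perfect model scenario: a histogram (grid) filter that pushes a discretized state density through the map and reweights it by a Gaussian observation likelihood. The grid size, noise level, and map parameter are illustrative assumptions, not the paper's configuration.

    import numpy as np

    # Grid-based Bayes filter for the logistic map x' = r*x*(1-x)
    # with Gaussian observation noise (all parameters illustrative).
    r, obs_sigma = 4.0, 0.05
    grid = np.linspace(1e-3, 1 - 1e-3, 2000)
    prior = np.ones_like(grid) / grid.size          # flat prior over the state

    rng = np.random.default_rng(0)
    x = 0.3                                         # true (hidden) state
    for _ in range(20):
        x = r * x * (1 - x)                         # true dynamics
        y = x + rng.normal(0, obs_sigma)            # noisy observation
        # predict: push the grid through the map and re-bin the probability mass
        pred = np.zeros_like(prior)
        idx = np.clip(np.searchsorted(grid, r * grid * (1 - grid)), 0, grid.size - 1)
        np.add.at(pred, idx, prior)
        # update: weight by the Gaussian likelihood of the observation
        post = pred * np.exp(-0.5 * ((y - grid) / obs_sigma) ** 2)
        prior = post / post.sum()

    print("posterior mean:", (grid * prior).sum(), "true state:", x)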
Lecompte, Emily; Baril, Mireille
2008-01-01
To meet the unique health needs of Aboriginal peoples (First Nations, Inuit and Métis), it is important to increase and encourage Aboriginal representation in health care. One Federal initiative, the Aboriginal Health Human Resource Initiative (AHHRI) at Health Canada, focuses on: (1) increasing the number of Aboriginal people working in health careers; (2) adapting health care educational curricula to support the development of cultural competencies; and (3) improving the retention of health care workers in Aboriginal communities. A health care system that focuses on understanding the unique challenges, concerns, and needs of Aboriginal people can better respond to this specific population, which suffers disproportionately from ill health in comparison to their non-Aboriginal counterparts. This report examines the supply of Aboriginal health care providers in Canada, based on geographic region, area of residence, Aboriginal identity, and occupation. Findings are drawn from the 1996 and 2001 censuses from Statistics Canada. Quantitative results provide a greater understanding of labour force characteristics of First Nation, Inuit, Métis, and non-Aboriginal health providers.
Friesen, Melissa C; Benke, Geza; Del Monaco, Anthony; Dennekamp, Martine; Fritschi, Lin; de Klerk, Nick; Hoving, Jan L; MacFarlane, Ewan; Sim, Malcolm R
2009-08-01
We examined the risk of mortality and cancer incidence with quantitative exposure to benzene-soluble fraction (BSF), benzo(a)pyrene (BaP), fluoride, and inhalable dust in two Australian prebake smelters. A total of 4,316 male smelter workers were linked to mortality and cancer incidence registries and followed from 1983 through 2002 (mean follow-up: 15.9 years, maximum: 20 years). Internal comparisons using Poisson regression were undertaken based on quantitative exposure levels. Smoking-adjusted, monotonic relationships were observed between respiratory cancer and cumulative inhalable dust exposure (trend p = 0.1), cumulative fluoride exposure (p = 0.1), and cumulative BaP exposure (p = 0.2). The exposure-response trends were stronger when examined across the exposed categories (BaP p = 0.1; inhalable dust p = 0.04). A monotonic, but not statistically significant trend was observed between cumulative BaP exposure and stomach cancer (n = 14). Bladder cancer was not associated with BaP or BSF exposure. No other cancer and no mortality outcomes were associated with these smelter exposures. The carcinogenicity of Söderberg smelter exposures is well established; in these prebake smelters we observed an association between smelter exposures and respiratory cancer, but not bladder cancer. The exploratory finding for stomach cancer needs confirmation. These results are preliminary due to the young cohort and short follow-up time.
[Self-perception of health care team leaders in Andalusia. A quantitative and qualitative study].
García-Romera, I; Danet, A; March-Cerdà, J C
Objective: To determine the perception and self-assessment of leadership among health care team leaders in Andalusia. Design: Exploratory descriptive study using quantitative and qualitative methodology, developed between 2013 and 2015, using a questionnaire and semi-structured interviews. Setting: Andalusia. Participants: All health managers from the Primary Care Management Units and Health Management Areas of the Departments of Paediatrics, Emergency and Internal Medicine for the quantitative study; a purposive sample of 24 health managers for the qualitative study. Analysis: Descriptive statistics and bivariate comparison of means; content analysis of the semi-structured interviews with codification, category tree, and triangulation of results. Results: The best self-assessment dimension relates to support, and the worst to considering oneself a 'good leader'. The definition of a 'good leader' includes honesty, trust, and attitudes of good communication, closeness, appreciation, and reinforcement of the health team members. Different leadership styles were perceived. The main difficulties for leadership relate to the economic crisis and the management of personal conflicts. Conclusions: Health managers describe an adaptive leadership style, based on personal and professional support, and using communication as the main cohesive element for the team project. More studies on leaders' perspectives are important in order to better understand their experiences, needs, and expectations. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... and opinions, but are not statistical surveys that yield quantitative results that can be generalized... generic clearance for qualitative information will not be used for quantitative information collections... for submission for other generic mechanisms that are designed to yield quantitative results. The...
76 FR 13018 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that.... This type of generic clearance for qualitative information will not be used for quantitative... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Teaching Statistics to Social Science Students: Making It Valuable
ERIC Educational Resources Information Center
North, D.; Zewotir, T.
2006-01-01
In this age of rapid information expansion and technology, statistics is playing an ever increasing role in education, particularly also in the training of social scientists. Statistics enables the social scientist to obtain a quantitative awareness of socio-economic phenomena hence is essential in their training. Statistics, however, is becoming…
Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas
2017-01-01
The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
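To make the baseline concrete, here is a sketch of the dictionary-matching step that the CNN is trained to replace: each measured signal is assigned the parameters of the dictionary entry with the largest normalized inner product. The dictionary, signal length, and (T1, T2) parameter ranges below are toy assumptions.

    import numpy as np

    # D: (n_entries, n_timepoints) simulated fingerprints;
    # params: (n_entries, 2) hypothetical (T1, T2) values per entry.
    def match_fingerprint(signal, D, params):
        Dn = D / np.linalg.norm(D, axis=1, keepdims=True)
        s = signal / np.linalg.norm(signal)
        best = np.argmax(Dn @ s)          # maximum normalized inner product
        return params[best]

    rng = np.random.default_rng(1)
    D = rng.standard_normal((1000, 500))        # toy dictionary (illustrative only)
    params = rng.uniform(0.1, 3.0, (1000, 2))   # hypothetical (T1, T2) pairs, seconds
    signal = D[42] + 0.1 * rng.standard_normal(500)
    print(match_fingerprint(signal, D, params), "vs true", params[42])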
ERIC Educational Resources Information Center
Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn
2016-01-01
Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result,…
Bono, Gioacchino; Okpala, Charles Odilichukwu R; Alberio, Giuseppina R A; Messina, Concetta M; Santulli, Andrea; Giacalone, Gabriele; Spagna, Giovanni
2016-04-15
The combined effects of freezing and modified atmosphere packaging (MAP) (100% N2 and 50% N2 + 50% CO2) on some quality characteristics of Giant Red Shrimp (GRS) (Aristaeomorpha foliacea) were studied during 12-month storage. The quality characteristics determined included proximate and gas compositions, melanosis scores, pH, total volatile basic nitrogen (TVB-N), thiobarbituric acid (TBA), and free amino acids (FAA). The resulting data were compared with those from samples subjected to vacuum packaging and to the conventional preservative method of sulphite treatment (SUL). Most of the determined qualities exhibited quantitative differences with storage. While pH and TVB-N varied statistically between treatments (P<0.05), TBA, which ranged between ∼0.15 and 0.30 mg MDA/kg, was lowest at the end of storage for the 100% N2-treated group; this group also had decreased melanosis scores, showing high promise for preserving the colour of GRS samples and hence as a potential replacement for the SUL treatment. Likewise, while some individual FAA values increased, especially in the 100% N2-treated group, total FAAs differed statistically with storage (P<0.05). The combination of freezing and MAP as a preservation method shows high promise for maintaining the quality characteristics of the GRS samples of this study. Copyright © 2015 Elsevier Ltd. All rights reserved.
Silveira-Neto, Nicolau; Flores, Mateus Ericson; De Carli, João Paulo; Costa, Max Dória; Matos, Felipe de Souza; Paranhos, Luiz Renato; Linden, Maria Salete Sandini
2017-11-01
This research evaluated detail registration in peri-implant bone using two different cone beam computed tomography systems and a digital periapical radiograph. Three different image acquisition protocols were established for each cone beam computed tomography apparatus, and three clinical situations were simulated in an ex vivo fresh pig mandible: buccal bone defect, peri-implant bone defect, and bone contact. Data were subjected to two analyses: quantitative and qualitative. The quantitative analyses involved a comparison of real specimen measures using a digital caliper in three regions of the preserved buccal bone - A, B and E (control group) - to cone beam computed tomography images obtained with different protocols (kp1, kp2, kp3, ip1, ip2, and ip3). In the qualitative analyses, the ability to register peri-implant details via tomography and digital periapical radiography was verified, as indicated by twelve evaluators. Data were analyzed with ANOVA and Tukey's test (α=0.05). The quantitative assessment showed means statistically equal to those of the control group under the following conditions: buccal bone defect B and E with kp1 and ip1, peri-implant bone defect E with kp2 and kp3, and bone contact A with kp1, kp2, kp3, and ip2. Qualitatively, only bone contacts were significantly different among the assessments, and the p3 results differed from the p1 and p2 results. The other results were statistically equivalent. The registration of peri-implant details was influenced by the image acquisition protocol, although metal artifacts were produced in all situations. The evaluators preferred the Kodak 9000 3D cone beam computed tomography in most cases. The evaluators identified buccal bone defects better with cone beam computed tomography and identified peri-implant bone defects better with digital periapical radiography.
Middle school students' earthquake content and preparedness knowledge - A mixed method study
NASA Astrophysics Data System (ADS)
Henson, Harvey, Jr.
The purpose of this study was to assess the effect of earthquake instruction on students' earthquake content and preparedness knowledge. This study combined direct instruction on earthquake science content and concepts with an inquiry-based group activity on earthquake safety, followed by an earthquake simulation and preparedness video, to help middle school students understand and prepare for the regional seismic threat. A convenience sample of 384 sixth and seventh grade students at two small middle schools in southern Illinois was used in this study. Qualitative information was gathered using open-ended survey questions, classroom observations, and semi-structured interviews. Quantitative data were collected using a 21-item content questionnaire administered to test students' General Earthquake Knowledge, Local Earthquake Knowledge, and Earthquake Preparedness Knowledge before and after instruction. A 21-item pre-test and post-test Likert-scale survey was used to collect students' perceptions and attitudes. Qualitative data analysis included quantification of student responses to the open-ended questions and thematic analysis of observation notes and interview transcripts. Quantitative datasets were analyzed using descriptive and inferential statistical methods, including t tests to evaluate the differences in mean scores between paired groups before and after interventions and one-way analysis of variance (ANOVA) to test for differences between mean scores of the comparison groups. Significant mean differences between groups were further examined using Dunnett's C post hoc analysis. Integration and interpretation of the qualitative and quantitative results of the study revealed a significant increase in general, local and preparedness earthquake knowledge among middle school students after the interventions. The findings specifically indicated that these students felt most aware and prepared for an earthquake after an intervention that consisted of an inquiry-based group discussion on safety, earthquake content presentation and earthquake simulation video presentation on preparedness. Variations of the intervention, including no intervention, were not as effective in significantly increasing students' conceptual learning of earthquake knowledge.
The impact of injector-based contrast agent administration in time-resolved MRA.
Budjan, Johannes; Attenberger, Ulrike I; Schoenberg, Stefan O; Pietsch, Hubertus; Jost, Gregor
2018-05-01
Time-resolved contrast-enhanced MR angiography (4D-MRA), which allows the simultaneous visualization of the vasculature and blood-flow dynamics, is widely used in clinical routine. In this study, the impact of two different contrast agent injection methods on 4D-MRA was examined in a controlled, standardized setting in an animal model. Six anesthetized Goettingen minipigs underwent two identical 4D-MRA examinations at 1.5 T in a single session. The contrast agent (0.1 mmol/kg body weight gadobutrol, followed by 20 ml saline) was injected using either manual injection or an automated injection system. A quantitative comparison of vascular signal enhancement and quantitative renal perfusion analyses were performed. Analysis of signal enhancement revealed higher peak enhancements and shorter time to peak intervals for the automated injection. Significantly different bolus shapes were found: automated injection resulted in a compact first-pass bolus shape clearly separated from the recirculation while manual injection resulted in a disrupted first-pass bolus with two peaks. In the quantitative perfusion analyses, statistically significant differences in plasma flow values were found between the injection methods. The results of both qualitative and quantitative 4D-MRA depend on the contrast agent injection method, with automated injection providing more defined bolus shapes and more standardized examination protocols. • Automated and manual contrast agent injection result in different bolus shapes in 4D-MRA. • Manual injection results in an undefined and interrupted bolus with two peaks. • Automated injection provides more defined bolus shapes. • Automated injection can lead to more standardized examination protocols.
A comparison of cosegregation analysis methods for the clinical setting.
Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H
2018-04-01
Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform as counting meioses is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and Counting Meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
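For intuition, the simple counting-meioses method mentioned above reduces to a few lines: if a variant fully cosegregates with disease through m informative meioses, that pattern has probability (1/2)^m under random segregation, giving a likelihood ratio of 2^m in favour of cosegregation. The sketch below states that arithmetic only; the FLB and CSLR methods are substantially more involved.

    # Counting-meioses likelihood ratio under full cosegregation.
    def counting_meioses_lr(informative_meioses: int) -> float:
        return 2.0 ** informative_meioses

    for m in (3, 5, 7):
        print(m, "informative meioses -> LR =", counting_meioses_lr(m))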
Clinical and quantitative analysis of patients with crowned dens syndrome.
Takahashi, Teruyuki; Tamura, Masato; Takasu, Toshiaki; Kamei, Satoshi
2017-05-15
Crowned dens syndrome (CDS) is a radioclinical entity defined by calcium deposition on the transverse ligament of the atlas (TLA). In this study, novel semi-quantitative diagnostic criteria for CDS, which evaluate the degree of calcification on the TLA by cervical CT, are proposed. From January 2010 to September 2014, 35 patients diagnosed with CDS by cervical CT were enrolled as subjects in this study. Based on the novel criteria, calcium deposition on the TLA was classified by "Stage" and "Grade" to produce a score, which was evaluated semi-quantitatively. The correlation between calcification score and CRP level or pain score, and the effects of treatments such as NSAIDs and corticosteroids, were statistically analyzed. The total calcification score from the added "Stage" and "Grade" scores demonstrated a significantly strong and linear correlation with CRP level (R2 = 0.823, **p<0.01). In the multiple comparison test for the treatment effects, significant improvements in CRP level and pain score were demonstrated after corticosteroid therapy (**p<0.01) compared with NSAIDs. In the conditional logistic regression analysis, the rapid end of corticosteroid therapy was an independent risk factor for relapse of cervico-occipital pain [OR = 50.761, *p = 0.0419]. The degree of calcification on the TLA evaluated by the novel semi-quantitative criteria significantly correlated with CRP level. In the treatment of CDS, it is recommended that a low dosage (15-30 mg) of corticosteroids be used as first-line drugs rather than conventional NSAID therapy. Additionally, it is also recommended to gradually decrease the dosage of corticosteroids. Copyright © 2017 Elsevier B.V. All rights reserved.
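A brief sketch of the score-versus-CRP correlation analysis described above, run on hypothetical values (the paper reports R2 = 0.823 on its own 35-patient cohort).

    import numpy as np
    from scipy.stats import linregress

    score = np.array([2, 3, 4, 5, 6, 7, 8, 9])  # hypothetical total calcification scores
    crp = 1.5 * score + np.random.default_rng(0).normal(0, 1.0, score.size)  # hypothetical CRP

    fit = linregress(score, crp)
    print(f"slope={fit.slope:.2f}, R^2={fit.rvalue**2:.3f}, p={fit.pvalue:.4f}")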
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... not statistical surveys that yield quantitative results that can be generalized to the population of... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. No comments were received in response...
77 FR 75498 - Request for Comments on a New Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-20
... statistical surveys that yield quantitative results that can be generalized to the population of study. DATES... surveys that yield quantitative results that can be generalized to the population of study. This feedback... qualitative information will not be used for quantitative information collections that are designed to yield...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that.... This type of generic clearance for qualitative information will not be used for quantitative... quantitative results. The Digital Government Strategy released by the White House in May 2012 drives agencies...
ERIC Educational Resources Information Center
Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…
Econophysical visualization of Adam Smith’s invisible hand
NASA Astrophysics Data System (ADS)
Cohen, Morrel H.; Eliazar, Iddo I.
2013-02-01
Consider a complex system whose macrostate is statistically observable, but whose operating mechanism is an unknown black box. In this paper we address the problem of inferring, from the system’s macrostate statistics, the system’s intrinsic force yielding the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith’s invisible hand, shaping the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of “poor” and “rich”, and a quantitative characterization of the “poor-get-poorer” and the “rich-get-richer” phenomena.
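A small numerical sketch of the bottom-up (Langevin) direction of this inference, under the standard assumption that for overdamped Langevin dynamics with unit noise the stationary density p(x) satisfies F(x) = d ln p(x)/dx, so an observed macrostate histogram yields the intrinsic force up to scale. The exponential "wealth" sample below is a toy stand-in, not empirical data.

    import numpy as np

    samples = np.random.default_rng(0).exponential(scale=2.0, size=100_000)  # toy "wealth" data
    hist, edges = np.histogram(samples, bins=100, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    log_p = np.log(hist + 1e-12)                 # avoid log(0) in empty bins
    force = np.gradient(log_p, centers)          # F(x) ~ d ln p / dx
    print(force[5:10])                           # close to -0.5 for this exponential toy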
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. Availability and implementation: PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
Zhi, Naiqian; Jaeger, Beverly Kris; Gouldstone, Andrew; Sipahi, Rifat; Frank, Samuel
2017-03-01
Detection of changes in micrographia as a manifestation of symptomatic progression or therapeutic response in Parkinson's disease (PD) is challenging as such changes can be subtle. A computerized toolkit based on quantitative analysis of handwriting samples would be valuable as it could supplement and support clinical assessments, help monitor micrographia, and link it to PD. Such a toolkit would be especially useful if it could detect subtle yet relevant changes in handwriting morphology, thus enhancing resolution of the detection procedure. This would be made possible by developing a set of metrics sensitive enough to detect and discern micrographia with specificity. Several metrics that are sensitive to the characteristics of micrographia were developed, with minimal sensitivity to confounding handwriting artifacts. These metrics capture character size-reduction, ink utilization, and pixel density within a writing sample from left to right. They are used here to "score" handwritten signatures of 12 different individuals corresponding to healthy and symptomatic PD conditions, and sample control signatures that had been artificially reduced in size for comparison purposes. Moreover, metric analyses of samples from ten of the 12 individuals for which clinical diagnosis time is known show considerable informative variations when applied to static signature samples obtained before and after diagnosis. In particular, a measure called pixel density variation showed statistically significant differences between two comparison groups of remote signature recordings: earlier versus recent, based on independent and paired t-test analyses on a total of 40 signature samples. The quantitative framework developed here has the potential to be used in future controlled experiments to study micrographia and links to PD from various aspects, including monitoring and assessment of applied interventions and treatments. The inherent value in this methodology is further enhanced by its reliance solely on static signatures, not requiring dynamic sampling with specialized equipment.
Sarrami-Foroushani, Ali; Nasr Esfahany, Mohsen; Nasiraei Moghaddam, Abbas; Saligheh Rad, Hamidreza; Firouznia, Kavous; Shakiba, Madjid; Ghanaati, Hossein; Wilkinson, Iain David; Frangi, Alejandro Federico
2015-01-01
Background: Understanding the hemodynamic environment in vessels is important for realizing the mechanisms leading to vascular pathologies. Objectives: The three-dimensional velocity vector field in the carotid bifurcation is visualized using TR 3D phase-contrast magnetic resonance imaging (TR 3D PC MRI) and computational fluid dynamics (CFD). This study aimed to present a qualitative and quantitative comparison of the velocity vector field obtained by each technique. Subjects and Methods: MR imaging was performed on a normal 30-year-old male subject. TR 3D PC MRI was performed on a 3 T scanner to measure velocity in the carotid bifurcation. A 3D anatomical model for CFD was created using images obtained from time-of-flight MR angiography. The velocity vector field in the carotid bifurcation was predicted using CFD and PC MRI techniques. A statistical analysis was performed to assess the agreement between the two methods. Results: Although the main flow patterns were the same for both techniques, CFD showed a greater resolution in mapping the secondary and circulating flows. Overall root mean square (RMS) errors for all the corresponding data points in PC MRI and CFD were 14.27% in peak systole and 12.91% in end diastole relative to the maximum velocity measured at each cardiac phase. Bland-Altman plots showed a very good agreement between the two techniques. However, this study did not aim to validate either method; instead, consistency was assessed to accentuate the similarities and differences between time-resolved PC MRI and CFD. Conclusion: Both techniques provided quantitatively consistent results for in vivo velocity vector fields in the right internal carotid artery (RCA). PC MRI represented a good estimation of main flow patterns inside the vasculature, which seems to be acceptable for clinical use. However, limitations of each technique should be considered while interpreting results. PMID:26793288
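A minimal sketch of the Bland-Altman agreement computation used above, on hypothetical paired velocity estimates: the bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 SD of the differences.

    import numpy as np

    def bland_altman(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)       # 95% limits of agreement
        return bias, (bias - loa, bias + loa)

    pc_mri = [0.42, 0.55, 0.61, 0.38, 0.70]  # m/s, hypothetical paired estimates
    cfd = [0.45, 0.52, 0.66, 0.35, 0.74]
    print(bland_altman(pc_mri, cfd))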
Golditz, T; Steib, S; Pfeifer, K; Uder, M; Gelse, K; Janka, R; Hennig, F F; Welsch, G H
2014-10-01
The aim of this study was to investigate, using T2-mapping, the impact of functional instability in the ankle joint on the development of early cartilage damage. Ethical approval for this study was provided. Thirty-six volunteers from the university sports program were divided into three groups according to their ankle status: functional ankle instability (FAI, initial ankle sprain with residual instability); ankle sprain Copers (initial sprain, without residual instability); and controls (without a history of ankle injuries). Quantitative T2-mapping magnetic resonance imaging (MRI) was performed at the beginning ('early-unloading') and at the end ('late-unloading') of the MR-examination, with a mean time span of 27 min. Zonal region-of-interest T2-mapping was performed on the talar and tibial cartilage in the deep and superficial layers. The inter-group comparisons of T2-values were analyzed using paired and unpaired t-tests. Statistical analysis of variance was performed. T2-values showed significant to highly significant differences in 11 of 12 regions throughout the groups. In early-unloading, the FAI-group showed a significant increase in quantitative T2-values in the medial talar regions (P = 0.008, P = 0.027), whereas the Coper-group showed this enhancement in the central-lateral regions (P = 0.05). In particular, the comparison of early-unloading to late-unloading values revealed significantly decreasing T2-values over time laterally and significantly increasing T2-values medially in the FAI-group, which were not present in the Coper- or control-group. Functional instability causes unbalanced loading in the ankle joint, resulting in cartilage alterations as assessed by quantitative T2-mapping. This approach can visualize and localize early cartilage abnormalities, possibly enabling specific treatment options to prevent osteoarthritis in young athletes. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Majid, Farjana; Jahan, Munira; Lutful Moben, Ahmed; Tabassum, Shahina
2014-01-01
Both real-time polymerase chain reaction (PCR) and the hybrid capture 2 (HC2) assay can detect and quantify hepatitis B virus (HBV) DNA. However, real-time PCR can detect a wide range of HBV DNA, while the HC2 assay cannot detect lower levels of viremia. The present study was designed to detect and quantify HBV DNA by real-time PCR and the HC2 assay and to compare the quantitative data of these two assays. A cross-sectional study was conducted between July 2010 and June 2011. A total of 66 serologically diagnosed chronic hepatitis B (CHB) patients were selected for the study. Real-time PCR and the HC2 assay were performed to detect HBV DNA. Data were analyzed with the Statistical Package for the Social Sciences (SPSS). Among the 66 serologically diagnosed chronic hepatitis B patients, 40 (60.61%) had detectable and 26 (39.39%) had undetectable HBV DNA by the HC2 assay. Concordant results were obtained for 40 (60.61%) of these 66 patients by real-time PCR and the HC2 assay, with mean viral loads of 7.06 ± 1.13 log10 copies/ml and 6.95 ± 1.08 log10 copies/ml, respectively. Among the remaining 26 patients, HBV DNA was detectable by real-time PCR in 20 patients (mean HBV DNA level 3.67 ± 0.72 log10 copies/ml), while HBV DNA could not be detected in six cases by either assay. The study showed a strong correlation (r = 0.915) between real-time PCR and the HC2 assay for the detection and quantification of HBV DNA. The HC2 assay may be used as an alternative to real-time PCR for CHB patients. How to cite this article: Majid F, Jahan M, Moben AL, Tabassum S. Comparison of Hybrid Capture 2 Assay with Real-time-PCR for Detection and Quantitation of Hepatitis B Virus DNA. Euroasian J Hepato-Gastroenterol 2014;4(1):31-35.
NASA Technical Reports Server (NTRS)
Merceret, Francis J.; Crawford, Winifred C.
2010-01-01
Peak wind speed is an important forecast element to ensure the safety of personnel and flight hardware at Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) in East-Central Florida. The 45th Weather Squadron (45 WS), the organization that issues forecasts for the KSC/CCAFS area, finds that peak winds are more difficult to forecast than mean winds. This difficulty motivated the 45 WS to request two independent studies. The first (Merceret 2009) was the development of a reliable model for gust factors (GF) relating the peak to the mean wind speed in tropical storms (TS). The second (Lambert et al. 2008) was a climatological study of non-TS cool season (October-April) mean and peak wind speeds by the Applied Meteorology Unit (AMU; Bauman et al. 2004) without the use of GF. Both studies presented their statistics as functions of mean wind speed and height. Most of the few comparisons of TS and non-TS GF in the literature suggest that non-TS GF at a given height and mean wind speed are smaller than the corresponding TS GF. The investigation reported here converted the non-TS peak wind statistics calculated by the AMU to the equivalent GF statistics and compared them with the previous TS GF results. The advantage of this effort over all previously reported studies of its kind is that the TS and non-TS data were taken from the same towers in the same locations. This eliminates differing surface attributes, including roughness length and thermal properties, as a major source of variance in the comparison. The goal of this study is two-fold: to determine the relationship between the non-TS and TS GF and their standard deviations (GFSD) and to determine if models similar to those developed for TS data in Merceret (2009) could be developed for the non-TS environment. The results are consistent with the literature, but include much more detailed, quantitative information on the nature of the relationship between TS and non-TS GF and GFSD as a function of height and mean wind speed.
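A short sketch of gust-factor statistics of the kind compared above, assuming GF is defined as the ratio of peak to mean wind speed within an averaging window and summarized (GF mean and GFSD) in bins of mean speed; the tower records below are simulated stand-ins, not KSC/CCAFS data.

    import numpy as np

    rng = np.random.default_rng(0)
    mean_speed = rng.uniform(2, 20, 500)                 # m/s, hypothetical tower records
    peak_speed = mean_speed * (1.3 + 0.15 * rng.standard_normal(500))

    gf = peak_speed / mean_speed                         # gust factor per record
    bins = np.digitize(mean_speed, [5, 10, 15])          # bin by mean wind speed
    for b in range(4):
        sel = bins == b
        print(f"bin {b}: GF mean={gf[sel].mean():.2f}, GFSD={gf[sel].std(ddof=1):.2f}")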
Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A
2014-12-01
Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
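One natural reading of an isothermal doubling time metric, analogous to PCR efficiency, is the time for signal to double during the exponential phase of amplification; the sketch below estimates it from the slope of ln(signal) versus time. The exact IDT definition adapted in the study may differ, and the amplification curve here is synthetic.

    import numpy as np

    t = np.arange(0, 10, 0.5)                    # minutes
    signal = 100 * 2 ** (t / 1.8)                # toy curve with a 1.8 min doubling time
    slope = np.polyfit(t, np.log(signal), 1)[0]  # slope of ln(signal) vs time
    print("IDT =", np.log(2) / slope, "min")     # recovers ~1.8 min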
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos
2006-02-01
The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
Bortoluzzi, C; Paras, K L; Applegate, T J; Verocai, G G
2018-04-30
Monitoring Eimeria shedding has become more important due to the recent restrictions on the use of antibiotics within the poultry industry. Therefore, there is a need for the implementation of more precise and accurate quantitative diagnostic techniques. The objective of this study was to compare the precision and accuracy of the Mini-FLOTAC and McMaster techniques for quantitative diagnosis of Eimeria maxima oocysts in poultry. Twelve pools of excreta samples of broiler chickens experimentally infected with E. maxima were analyzed for the comparison between the Mini-FLOTAC and McMaster techniques, using detection limits (dl) of 23 and 25, respectively. Additionally, six excreta samples were used to compare the precision of different dl (5, 10, 23, and 46) using the Mini-FLOTAC technique. For precision comparisons, five technical replicates of each sample (five replicate slides on one excreta slurry) were read to calculate the mean oocysts per gram of excreta (OPG) count, standard deviation (SD), coefficient of variation (CV), and precision of both aforementioned comparisons. To compare accuracy between the methods (McMaster, and Mini-FLOTAC dl 5 and 23), excreta from uninfected chickens was spiked with 100, 500, 1,000, 5,000, or 10,000 OPG; additional samples remained unspiked (negative control). For each spiking level, three samples were read in triplicate, totaling nine reads per spiking level per technique. Data were transformed using log10 to obtain normality and homogeneity of variances. A significant correlation (R = 0.74; p = 0.006) was observed between the mean OPG of the McMaster dl 25 and the Mini-FLOTAC dl 23. Mean OPG, CV, SD, and precision were not statistically different between the McMaster dl 25 and Mini-FLOTAC dl 23. Despite the absence of statistical difference (p > 0.05), Mini-FLOTAC dl 5 showed a numerically lower SD and CV than Mini-FLOTAC dl 23. The Pearson correlation coefficient revealed a significant and positive correlation among the four dl (p ≤ 0.05). In the accuracy study, it was observed that the Mini-FLOTAC dl 5 and 23 were more accurate than the McMaster for 100 OPG, and the Mini-FLOTAC dl 23 had the highest accuracy for 500 OPG. The McMaster and Mini-FLOTAC dl 23 techniques were more accurate than the Mini-FLOTAC dl 5 for 5,000 OPG, and both dl of the Mini-FLOTAC were less accurate for 10,000 OPG counts than the McMaster technique. However, the overall accuracy of the Mini-FLOTAC dl 23 was higher than the McMaster and Mini-FLOTAC dl 5 techniques. Copyright © 2018 Elsevier B.V. All rights reserved.
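For clarity, a small sketch of the quantities compared above: an oocysts-per-gram (OPG) estimate from a counting-chamber read is the raw count multiplied by the detection limit (dl), and precision across technical replicates is summarized by the coefficient of variation. The replicate counts below are hypothetical.

    import statistics

    def opg(count, dl):
        return count * dl

    # Five hypothetical Mini-FLOTAC replicate reads of one excreta slurry.
    replicates = [opg(c, dl=23) for c in (41, 38, 45, 40, 43)]
    mean = statistics.mean(replicates)
    cv = statistics.stdev(replicates) / mean * 100
    print(f"mean OPG = {mean:.0f}, CV = {cv:.1f}%")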
Recent advances by the EPA in the sequencing of relevant water-intrusion fungi, combined with the development of probes and primers, have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.
In this pilot study, quantitative...
Akman, Cigdem Inan; Provenzano, Frank; Wang, Dong; Engelstad, Kristin; Hinton, Veronica; Yu, Julia; Tikofsky, Ronald; Ichese, Masonari; De Vivo, Darryl C
2015-02-01
(18)F fluorodeoxyglucose positron emission tomography ((18)F FDG-PET) facilitates examination of glucose metabolism. Previously, we described regional cerebral glucose hypometabolism using (18)F FDG-PET in patients with Glucose transporter 1 Deficiency Syndrome (Glut1 DS). We now expand this observation in Glut1 DS using quantitative image analysis to identify the epileptic network based on the regional distribution of glucose hypometabolism. (18)F FDG-PET scans of 16 Glut1 DS patients and 7 healthy participants were examined using Statistical Parametric Mapping (SPM). Summed images were preprocessed for statistical analysis using MATLAB 7.1 and SPM 2 software. Region of interest (ROI) analysis was performed to validate the SPM results. Visual analysis of the (18)F FDG-PET images demonstrated prominent regional glucose hypometabolism in the thalamus, neocortical regions and cerebellum bilaterally. Group comparison using SPM analysis confirmed that regional glucose hypometabolism was present in the thalamus, cerebellum, temporal cortex and central lobule. Two mildly affected patients without epilepsy had hypometabolism in the cerebellum, inferior frontal cortex, and temporal lobe, but not the thalamus. Glucose hypometabolism did not correlate with age at the time of PET imaging, head circumference, CSF glucose concentration at the time of diagnosis, RBC glucose uptake, or CNS score. Quantitative analysis of (18)F FDG-PET imaging in Glut1 DS patients confirmed that hypometabolism was present symmetrically in the thalamus, cerebellum, and frontal and temporal cortex. The hypometabolism in the thalamus correlated with a clinical history of epilepsy. Copyright © 2014. Published by Elsevier B.V.
Quantitative metrics for assessment of chemical image quality and spatial resolution
Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.
2016-02-28
Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
NASA Astrophysics Data System (ADS)
Xu, Z.; Zhu, L.; Sojka, J. J.; Kokoszka, P.; Jach, A.
2006-12-01
A wavelet-based index of storm activities (WISA) has recently been developed (Jach et al., 2006) to complement the traditional Dst index. The new index can be computed automatically using a wavelet-based statistical procedure, without human intervention in the selection of quiet days or the removal of secular variations. In addition, the WISA is flexible with respect to data stretch and has a higher temporal resolution (one minute), which can provide a better description of the dynamical variations of magnetic storms. In this work, we perform a systematic assessment of the WISA index. First, we statistically compare the WISA to the Dst for various quiet and disturbed periods and analyze the differences in their spectral features. Then we quantitatively assess the flexibility of the WISA with respect to data stretch and study the effects of a varying number of stations on the index. In addition, how well the WISA handles missing data is also quantitatively assessed. The assessment results show that the hourly-averaged WISA index can describe storm activities as well as the Dst index, while its full automation, high flexibility on data stretch, ease of using data from a varying number of stations, high temporal resolution, and high tolerance of missing data from individual stations can be very valuable for real-time monitoring of the dynamical variations of magnetic storm activities and for space weather applications, thus significantly complementing the existing Dst index. Jach, A., P. Kokoszka, J. Sojka, and L. Zhu, Wavelet-based index of magnetic storm activity, J. Geophys. Res., in press, 2006.
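The comparison of the hourly-averaged WISA with the Dst is straightforward to reproduce once both series are on the same time base. A Python sketch with synthetic one-minute and hourly series standing in for the real indices:

```python
import numpy as np

# Hypothetical arrays: one month of 1-min WISA values and hourly Dst values.
rng = np.random.default_rng(0)
wisa_1min = rng.normal(-20, 15, size=30 * 24 * 60)
dst_hourly = rng.normal(-20, 15, size=30 * 24)

# Average the 1-min index into hourly bins before comparing with Dst.
wisa_hourly = wisa_1min.reshape(-1, 60).mean(axis=1)

# Simple measures of agreement between the two hourly series.
r = np.corrcoef(wisa_hourly, dst_hourly)[0, 1]
rmse = np.sqrt(np.mean((wisa_hourly - dst_hourly) ** 2))
print(f"r = {r:.2f}, RMSE = {rmse:.1f} nT")
```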
Glasby, Michael A; Tsirikos, Athanasios I; Henderson, Lindsay; Horsburgh, Gillian; Jordan, Brian; Michaelson, Ciara; Adams, Christopher I; Garrido, Enrique
2017-08-01
To compare measurements of motor evoked potential (MEP) latency, elicited either magnetically (mMEP) or electrically (eMEP), and of central motor conduction time (CMCT), made pre-operatively in conscious patients using transcranial magnetic stimulation and intra-operatively using transcranial electrical stimulation, before and after successful instrumentation for the treatment of adolescent idiopathic scoliosis. A group initially of 51 patients with adolescent idiopathic scoliosis aged 12-19 years was evaluated pre-operatively in the outpatients' department with transcranial magnetic stimulation. The neurophysiological data were then compared statistically with intra-operative responses elicited by transcranial electrical stimulation both before and after successful surgical intervention. MEPs were measured as the cortically evoked compound action potentials of abductor hallucis. Minimum F-waves were measured using conventional nerve conduction methods; the lower motor neuron conduction time was calculated and subtracted from MEP latency to give CMCT. Pre-operative testing was well tolerated in our paediatric/adolescent patients. No neurological injury occurred in any patient in this series. There was no significant difference between the values of mMEP and eMEP latencies seen pre-operatively in conscious patients and intra-operatively in patients under anaesthetic. The calculated quantities mCMCT and eCMCT showed the same statistical correlations as the quantities mMEP and eMEP latency. The congruency of mMEP and eMEP and of mCMCT and eCMCT suggests that these measurements may be used comparatively and semi-quantitatively for the comparison of pre-, intra-, and post-operative spinal cord function in spinal deformity surgery.
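CMCT is obtained here by subtracting a lower motor neuron (peripheral) conduction time, derived from F-wave measurements, from the MEP latency. A Python sketch assuming the conventional F-wave formula, peripheral time = (F + M - 1)/2, which the abstract implies but does not state explicitly; the latencies are invented:

```python
def cmct(mep_latency_ms, f_latency_ms, m_latency_ms):
    """Central motor conduction time via the conventional F-wave method.

    The peripheral (lower motor neuron) conduction time is estimated as
    (F + M - 1) / 2, where the 1 ms term approximates the anterior horn
    cell turnaround delay; this is the standard formula and is assumed,
    not stated, to be the one used in the study.
    """
    pmct = (f_latency_ms + m_latency_ms - 1.0) / 2.0
    return mep_latency_ms - pmct

# Illustrative values for abductor hallucis recordings (ms).
print(cmct(mep_latency_ms=38.5, f_latency_ms=48.0, m_latency_ms=12.0))  # 9.0
```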
Kelly, Martin J; Feeley, Iain H; O'Byrne, John M
2016-10-01
Direct to consumer (DTC) advertising, targeting the public over the physician, is an increasingly pervasive presence in medical clinics. It is trending toward a format of online interaction rather than that of traditional print and television advertising. We analyze patient-focused Web pages from the top 5 companies supplying prostheses for total hip arthroplasties, comparing them to the top 10 independent medical websites. Quantitative comparison is performed using the Journal of the American Medical Association (JAMA) benchmark and DISCERN criteria, and for comparative readability, we use the Flesch-Kincaid grade level, the Flesch reading ease, and the Gunning fog index. Content is analyzed for information on type of surgery and surgical approach. There is a statistically significant difference between the independent and DTC websites in both the mean DISCERN score (independent 74.6, standard deviation [SD] = 4.77; DTC 32.2, SD = 10.28; P = .0022) and the mean JAMA score (independent 3.45, SD = 0.49; DTC 1.9, SD = 0.74; P = .004). The difference between the readability scores is not statistically significant. The commercial content is found to be heavily biased in favor of the direct anterior approach and minimally invasive surgical techniques. We demonstrate that the quality of information on commercial websites is inferior to that of the independent sites. The advocacy of surgical approaches by industry to the patient group is a concern. This study underlines the importance of future regulation of commercial patient education Web pages. Copyright © 2016 Elsevier Inc. All rights reserved.
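The three readability indices used in this comparison are defined by fixed published formulas over sentence, word, syllable and complex-word counts. A Python sketch implementing all three with a crude vowel-group syllable counter (published tools count syllables more carefully, so values will differ slightly):

```python
import re

def readability(text):
    """Approximate Flesch reading ease, Flesch-Kincaid grade, Gunning fog."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = lambda w: max(1, len(re.findall(r"[aeiouy]+", w.lower())))
    n_syll = sum(syllables(w) for w in words)
    complex_words = sum(1 for w in words if syllables(w) >= 3)

    wps, spw = n_words / sentences, n_syll / n_words
    ease = 206.835 - 1.015 * wps - 84.6 * spw          # Flesch reading ease
    grade = 0.39 * wps + 11.8 * spw - 15.59            # Flesch-Kincaid grade
    fog = 0.4 * (wps + 100 * complex_words / n_words)  # Gunning fog index
    return ease, grade, fog

print(readability("The hip implant restores motion. Recovery varies by patient."))
```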
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Wagler, Amy E.; Esquinca, Alberto; Valenzuela, M. Guadalupe
2013-01-01
The framework of linguistic register and case study research on Spanish-speaking English language learners (ELLs) learning statistics informed the construction of a quantitative instrument, the Communication, Language, And Statistics Survey (CLASS). CLASS aims to assess whether ELLs and non-ELLs approach the learning of statistics differently with…
ERIC Educational Resources Information Center
Lovett, Jennifer Nickell
2016-01-01
The purpose of this study is to provide researchers, mathematics educators, and statistics educators information about the current state of preservice secondary mathematics teachers' preparedness to teach statistics. To do so, this study employed an explanatory mixed methods design to quantitatively examine the statistical knowledge and statistics…
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin
2016-03-01
How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment visceral fat areas (VFA) and subcutaneous fat areas (SFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied respectively to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that for all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association with either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
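Of the three models, the Cox proportional hazards regression is the one tied most directly to PFS and OS. A Python sketch of how such a model could be fit with the lifelines library; the library choice, column names and values are illustrative, not the authors':

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical table: CT-derived adiposity features plus outcome columns.
df = pd.DataFrame({
    "vfa_cm2":    [110.2, 95.4, 143.8, 88.1, 120.6, 101.9, 134.2, 90.7],
    "sfa_cm2":    [210.5, 180.3, 260.9, 150.2, 230.7, 175.4, 244.8, 163.0],
    "pfs_months": [14.0, 9.5, 22.1, 6.3, 18.4, 11.2, 20.7, 7.9],
    "progressed": [1, 1, 0, 1, 0, 1, 0, 1],   # 1 = event observed
})

# Cox proportional hazards model relating adiposity features to PFS.
cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()  # hazard ratios, confidence intervals, p-values
```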
Statistics attack on `quantum private comparison with a malicious third party' and its improvement
NASA Astrophysics Data System (ADS)
Gu, Jun; Ho, Chih-Yung; Hwang, Tzonelih
2018-02-01
Recently, Sun et al. (Quantum Inf Process 14:2125-2133, 2015) proposed a quantum private comparison protocol allowing two participants to compare the equality of their secrets via a malicious third party (TP). They designed an interesting trap comparison method to prevent the TP from knowing the final comparison result. However, this study shows that the malicious TP can use the statistics attack to reveal the comparison result. A simple modification is hence proposed to solve this problem.
Applying Knowledge of Quantitative Design and Analysis
ERIC Educational Resources Information Center
Baskas, Richard S.
2011-01-01
This study compared and contrasted two quantitative scholarly articles in relation to their research designs. The designs were analyzed by comparing research references and research-specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…
NASA Technical Reports Server (NTRS)
Bell, Thomas L.; Kundu, Prasun K.; Einaudi, Franco (Technical Monitor)
2000-01-01
Estimates from TRMM satellite data of monthly total rainfall over an area are subject to substantial sampling errors due to the limited number of visits to the area by the satellite during the month. Quantitative comparisons of TRMM averages with data collected by other satellites and by ground-based systems require some estimate of the size of this sampling error. A method of estimating this sampling error based on the actual statistics of the TRMM observations and on some modeling work has been developed. "Sampling error" in TRMM monthly averages is defined here relative to the monthly total a hypothetical satellite permanently stationed above the area would have reported. "Sampling error" therefore includes contributions from the random and systematic errors introduced by the satellite remote sensing system. As part of our long-term goal of providing error estimates for each grid point accessible to the TRMM instruments, sampling error estimates for TRMM based on rain retrievals from TRMM Microwave Imager (TMI) data are compared for different times of the year and different oceanic areas (to minimize changes in the statistics due to algorithmic differences over land and ocean). Changes in sampling error estimates caused by changes in rain statistics due 1) to evolution of the official algorithms used to process the data and 2) to differences from other remote sensing systems such as the Defense Meteorological Satellite Program (DMSP) Special Sensor Microwave/Imager (SSM/I) are analyzed.
Mager, P P; Rothe, H
1990-10-01
Multicollinearity of physicochemical descriptors leads to serious consequences in quantitative structure-activity relationship (QSAR) analysis, such as incorrect estimators and test statistics for the regression coefficients of the ordinary least-squares (OLS) model usually applied to QSARs. Besides the diagnosis of simple collinearity, principal component regression analysis (PCRA) also allows the diagnosis of various types of multicollinearity. Only if the absolute values of the PCRA estimators are order statistics that decrease monotonically can the effects of multicollinearity be circumvented. Otherwise, obscure phenomena may be observed, such as good data recognition but low predictive power of a QSAR model.
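The diagnostic described, checking whether the absolute values of the component-space estimators decrease monotonically with component rank, is easy to illustrate. A Python sketch of principal component regression on standardized descriptors, with one deliberately near-collinear column; this is a generic PCR implementation, not the authors' code:

```python
import numpy as np

def pcra(X, y, n_components):
    """Principal component regression on standardized descriptors.

    Returns the component-space coefficients; per the abstract,
    multicollinearity is only safely circumvented when their absolute
    values decrease monotonically with component rank.
    """
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # Principal components from the SVD of the standardized design matrix.
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = Xs @ Vt.T[:, :n_components]
    beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
X[:, 4] = X[:, 0] + 0.01 * rng.normal(size=20)   # near-collinear descriptor
y = X @ np.array([1.0, 0.5, 0.0, 0.0, 1.0]) + rng.normal(scale=0.1, size=20)
print(np.abs(pcra(X, y, n_components=4)))        # check for monotone decrease
```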
NASA Astrophysics Data System (ADS)
Jackson, S.; Szpaklewicz, M.; Tomutsa, L.
1987-09-01
The primary objective of this research is to develop a methodology for constructing accurate quantitative models of reservoir heterogeneities. The resulting models are expected to improve predictions of flow patterns, the spatial distribution of residual oil after secondary and tertiary recovery operations, and ultimate oil recovery. The purpose of this study is to provide a preliminary evaluation of the usefulness of outcrop information in characterizing analogous reservoirs and to develop the research techniques necessary for model development. The Shannon Sandstone, a shelf sand ridge deposit in the Powder River Basin, Wyoming, was studied. Sedimentologic and petrophysical features of an outcrop exposure of the High-Energy Ridge-Margin (HERM) facies within the Shannon were compared with those from a Shannon sandstone reservoir in Teapot Dome field. Comparisons of outcrop and subsurface permeability and porosity histograms, cumulative distribution functions, correlation lengths, and natural-logarithm-of-permeability versus porosity plots indicate a strong similarity between Shannon outcrop and Teapot Dome HERM facies petrophysical properties. Permeability classes found in outcrop samples can be related to crossbedded zones and shaley, rippled, and bioturbated zones. Similar permeability classes related to similar sedimentologic features were found in Teapot Dome field. The similarities of outcrop and Teapot Dome petrophysical properties, which are from the same geologic facies but from different depositional episodes, suggest that rocks deposited under similar depositional processes within a given deposystem have similar reservoir properties. The results of the study indicate that the use of quantitative outcrop information in characterizing reservoirs may provide a significant improvement in reservoir characterization.
Kim, Min Soon; Rodney, William N.; Cooper, Tara; Kite, Chris; Reece, Gregory P.; Markey, Mia K.
2011-01-01
Rationale, aims and objectives Scarring is a significant cause of dissatisfaction for women who undergo breast surgery. Scar tissue may be clinically distinguished from normal skin by aberrant color, rough surface texture, increased thickness (hypertrophy), and firmness. Colorimeters or spectrophotometers can be used to quantitatively assess scar color, but they require direct patient interaction and can cost thousands of dollars. By comparison, digital photography is already in widespread use to document clinical outcomes and requires less patient interaction. Thus, assessment of scar coloration by digital photography is an attractive alternative. The goal of this study was to compare color measurements obtained by digital photography and colorimetry. Method Agreement between photographic and colorimetric measurements of color was evaluated. Experimental conditions were controlled by performing measurements on artificial scars created by a makeup artist. The colorimetric measurements of the artificial scars were compared to those reported in the literature for real scars in order to confirm the validity of this approach. We assessed the agreement between the colorimetric and photographic measurements of color using a hypothesis test for equivalence, the intra-class correlation coefficient (ICC), and the Bland-Altman method. Results Overall, good agreement was obtained for the three parameters (L*a*b*) measured by colorimetry and photography across the three statistical analyses. Conclusion Color measurements obtained by digital photography were equivalent to those obtained using colorimetry. Thus, digital photography is a reliable, cost-effective measurement method for skin color and should be further investigated for quantitative analysis of surgical outcomes. PMID:19239578
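The Bland-Altman part of the analysis reduces to the bias and 95% limits of agreement of the paired differences. A Python sketch with invented L* readings (the same computation applies to the a* and b* channels):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical L* values for the same artificial scars measured by a
# colorimeter and extracted from calibrated digital photographs.
colorimeter = [62.1, 58.4, 65.0, 60.2, 57.8, 63.5]
photograph  = [61.5, 59.0, 64.2, 60.8, 57.1, 63.9]
print(bland_altman(colorimeter, photograph))
```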
Ultrasound-guided injection for MR arthrography of the hip: comparison of two different techniques.
Kantarci, Fatih; Ozbayrak, Mustafa; Gulsen, Fatih; Gencturk, Mert; Botanlioglu, Huseyin; Mihmanli, Ismail
2013-01-01
The purpose of this study was to prospectively evaluate two different ultrasound-guided injection techniques for MR arthrography of the hip. Fifty-nine consecutive patients (21 men, 38 women) referred for MR arthrography of the hip were prospectively included in the study. Three patients underwent bilateral MR arthrography. The two injection techniques were compared quantitatively and qualitatively. Quantitative analysis was performed by comparing the volume of contrast material injected into the hip joint. Qualitative analysis was performed with regard to extraarticular leakage of contrast material into the soft tissues. Extraarticular leakage of contrast material was graded as none, minimal, moderate, or severe according to the MR images. Each patient rated discomfort after the procedure using a visual analogue scale (VAS). The injected contrast material volume was lower with the femoral head puncture technique (mean 8.9 ± 3.4 ml) than with the femoral neck puncture technique (mean 11.2 ± 2.9 ml) (p < 0.05). The chi-squared test showed significantly more contrast leakage with the femoral head puncture technique (p < 0.05). Statistical analysis showed no difference between the head and neck puncture groups in terms of reported pain (p = 0.744) or in the body mass index (p = 0.658) of the patients. The femoral neck injection technique provides a higher intraarticular contrast volume and produces less extraarticular contrast leakage than the femoral head injection technique when US guidance is used for MR arthrography of the hip.
Reflective terahertz (THz) imaging: system calibration using hydration phantoms
NASA Astrophysics Data System (ADS)
Bajwa, Neha; Garritano, James; Lee, Yoon Kyung; Tewari, Priyamvada; Sung, Shijun; Maccabi, Ashkan; Nowroozi, Bryan; Babakhanian, Meghedi; Sanghvi, Sajan; Singh, Rahul; Grundfest, Warren; Taylor, Zachary
2013-02-01
Terahertz (THz) hydration sensing continues to gain traction in the medical imaging community due to its unparalleled sensitivity to tissue water content. Rapid and accurate detection of fluid shifts following induction of thermal skin burns, as well as remote corneal hydration sensing, have been previously demonstrated in vivo using reflective, pulsed THz imaging. The hydration contrast sensing capabilities of this technology were recently confirmed in a parallel 7 Tesla Magnetic Resonance (MR) imaging study, in which burn areas are associated with increases in local mobile water content. Successful clinical translation of THz sensing, however, still requires quantitative assessment of system performance, specifically hydration concentration sensitivity, with tissue substitutes. This research aims to calibrate the sensitivity of a novel, reflective THz system to tissue water content through the use of hydration phantoms for quantitative comparisons of THz hydration imagery. Gelatin phantoms were identified as an appropriate tissue-mimicking model for reflective THz applications, and gel composition, comprising mixtures of water and protein, was varied between 83% and 95% hydration, a physiologically relevant range. A comparison of four series of gelatin phantom studies demonstrated a positive linear relationship between THz reflectivity and water concentration, with statistically significant hydration sensitivities (p < .01) ranging from 0.0209 to 0.038 (reflectivity per % hydration). The THz-phantom interaction is simulated with a three-layer model using the Transfer Matrix Method, with agreement in hydration trends. Having demonstrated the ability to accurately and noninvasively measure water content in tissue-equivalent targets with high sensitivity, reflective THz imaging is explored as a potential tool for early detection and intervention of corneal pathologies.
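The reported hydration sensitivity is simply the slope of a linear fit of reflectivity against water concentration. A Python sketch with invented phantom data chosen to land near the reported sensitivity range:

```python
from scipy import stats

# Hypothetical phantom data: gelatin hydration (%) vs THz reflectivity.
hydration    = [83, 86, 89, 92, 95]
reflectivity = [0.210, 0.275, 0.338, 0.401, 0.468]

# The slope estimates the hydration sensitivity (reflectivity per % hydration),
# analogous to the 0.0209-0.038 range reported in the abstract.
fit = stats.linregress(hydration, reflectivity)
print(f"sensitivity = {fit.slope:.4f} per %hydration, p = {fit.pvalue:.3g}")
```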
Singh, Abhinav; Sequiera, Peter; Acharya, Shashidhar; Bhat, Maghashyam
2011-01-01
The aim of the present study was to compare and assess the oral health status of 12-year-old children from two socially disadvantaged groups in the Udupi district of South India. A total of 327 children were examined in Ashrama schools, and 340 children were randomly selected for comparison from other government schools. A modified WHO proforma was used for clinical examination. Oral hygiene practices, dental fluorosis, periodontal status, dentition status and dentofacial anomalies were assessed and compared. The chi-square test was used for comparisons between categorical variables and the Mann-Whitney test for comparisons between the two groups for quantitative variables. P ≤ 0.05 was considered statistically significant. Dental fluorosis was detected in 22.9% of children from Ashrama schools, whereas in the comparison group 14.4% of children had dental fluorosis (P ≤ 0.001). Mean decayed teeth and DMFT values in Ashrama school children were 1.15 ± 1.62 and 1.15 ± 1.62, respectively. In the comparison group, the corresponding values were 0.46 ± 0.98 and 0.48 ± 1.04, respectively (P ≤ 0.001). The mean number of sextants in the Ashrama school children with Community Periodontal Index score 2 was 2.00 ± 1.53, whereas in the comparison group it was 1.31 ± 1.53 (P ≤ 0.001). No significant differences were noted between the two groups with respect to Dental Aesthetic Index scores. The present study revealed higher levels of dental caries experience, untreated dental disease and social disadvantage among the children attending Ashrama schools, providing evidence for the need to address the health inequalities of these children.
Burroni, L; Aucone, A M; Volterrani, D; Hayek, Y; Bertelli, P; Vella, A; Zappella, M; Vattimo, A
1997-06-01
Rett syndrome is a progressive neurological paediatric disorder associated with severe mental deficiency, which affects only girls. The aim of this study was to determine whether brain blood flow abnormalities detected with 99Tc(m)-ethyl-cysteinate-dimer (99Tc[m]-ECD) single photon emission tomography (SPET) can explain the clinical manifestation and progression of the disease. Qualitative and quantitative global and regional brain blood flow was evaluated in 12 girls with Rett syndrome and compared with an age-matched reference group of children. In comparison with the reference group, SPET revealed a considerable global reduction in cerebral perfusion in the group of girls with Rett syndrome. A large statistically significant difference was noted, which was more evident when comparing the control group with girls with stage IV Rett syndrome than with girls with stage III Rett syndrome. The reduction in cerebral perfusion reflects functional disturbance in the brain of children with Rett syndrome. These data confirm that 99Tc(m)-ECD brain SPET is sensitive in detecting hypoperfused areas in girls with Rett syndrome that may be associated with brain atrophy, even when magnetic resonance imaging appears normal.
NASA Astrophysics Data System (ADS)
Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru
2007-10-01
Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative transform (or classical derivative spectrophotometry). In this quantitative spectral analysis, the two proposed analytical methods do not require any chemical separation process. In the first step, several wavelet families were tested to find an optimal CWT for processing the overlapping signals of the analyzed compounds. Subsequently, we observed that the coiflets (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of these two analytical approaches was verified by analyzing various synthetic mixtures consisting of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained from the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.
Effects of complex aural stimuli on mental performance.
Vij, Mohit; Aghazadeh, Fereydoun; Ray, Thomas G; Hatipkarasulu, Selen
2003-06-01
The objective of this study is to investigate the effect of complex aural stimuli on mental performance. A series of experiments was designed to obtain data for two different analyses. The first analysis is a "Stimulus" versus "No-stimulus" comparison for each of the four dependent variables, i.e. the quantitative ability, reasoning ability, spatial ability and memory of an individual, comparing the control treatment with the rest of the treatments. The second analysis is a multivariate analysis of variance for component-level main effects and interactions. The two component factors are the tempo of the complex aural stimuli and the sound volume level, each administered at three discrete levels for all four dependent variables. Ten experiments were conducted on eleven subjects. It was found that complex aural stimuli influence the quantitative and spatial aspects of the mind, while reasoning ability was unaffected by the stimuli. Although memory showed a trend toward being worse in the presence of complex aural stimuli, the effect was statistically insignificant. Variation in the tempo and sound volume level of an aural stimulus did not significantly affect the mental performance of an individual. The results of these experiments can be effectively used in designing work environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.
Measurement of ice number concentration in clouds is important but still challenging. Stratiform mixed-phase clouds (SMCs) provide a simple scenario for retrieving ice number concentration from remote sensing measurements. The simple ice generation and growth pattern in SMCs offers opportunities to use cloud radar reflectivity (Ze) measurements and other cloud properties to infer ice number concentration quantitatively. To understand the strong temperature dependency of ice habit and growth rate quantitatively, we develop a 1-D ice growth model to calculate the ice diffusional growth along its falling trajectory in SMCs. The radar reflectivity and fall velocity profiles of ice crystals calculated from the 1-D ice growth model are evaluated with the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) ground-based high-vertical-resolution radar measurements. Combining Ze measurements and 1-D ice growth model simulations, we develop a method to retrieve the ice number concentrations in SMCs at given cloud top temperature (CTT) and liquid water path (LWP). The retrieved ice concentrations in SMCs are evaluated with in situ measurements and with a three-dimensional cloud-resolving model simulation with a bin microphysical scheme. These comparisons show that the retrieved ice number concentrations are within an uncertainty of a factor of 2, statistically.
Patel, Rashmin B; Patel, Nilay M; Patel, Mrunali R; Solanki, Ajay B
2017-03-01
The aim of this work was to develop and optimize a robust HPLC method for the separation and quantitation of ambroxol hydrochloride and roxithromycin utilizing a Design of Experiments (DoE) approach. A Plackett-Burman design was used to assess the impact of independent variables (concentration of the organic phase, mobile phase pH, flow rate and column temperature) on peak resolution, USP tailing and number of plates. A central composite design was utilized to evaluate the main, interaction, and quadratic effects of the independent variables on the selected dependent variables. The optimized HPLC method was validated based on the ICH Q2(R1) guideline and was used to separate and quantify ambroxol hydrochloride and roxithromycin in tablet formulations. The findings showed that the DoE approach could be effectively applied to optimize a robust HPLC method for quantification of ambroxol hydrochloride and roxithromycin in tablet formulations. Statistical comparison between the results of the proposed and a previously reported HPLC method revealed no significant difference, indicating the suitability of the proposed HPLC method for the analysis of ambroxol hydrochloride and roxithromycin in pharmaceutical formulations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe
2013-08-01
Calcium imaging has become a routine technique in neuroscience for subcellular- to network-level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A background-subtracted fluorescence transient estimation method that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to recordings from both single-cell and bulk-stained tissues, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Comparison of two trajectory based models for locating particle sources for two rural New York sites
NASA Astrophysics Data System (ADS)
Zhou, Liming; Hopke, Philip K.; Liu, Wei
Two back-trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their capabilities in identifying likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. Quantitative transport bias analysis attempts to take into account the distribution of concentrations around the directions of the back trajectories. In the full QTBA approach, deposition processes (wet and dry) are also considered; simplified QTBA omits deposition. It is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained by a previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six common sources for the two sites, sulfate, soil, zinc smelter, nitrate, wood smoke and copper smelter, were analyzed. The results of the two methods are consistent and locate large, clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of source locations.
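The RTWC idea is to spread receptor concentrations over grid cells in proportion to how long the back trajectories reside in them. A much-simplified Python sketch of that weighting on a lat/lon grid; the gridding, weighting and example numbers are illustrative, not the published algorithm:

```python
import numpy as np

def rtwc(trajectories, concentrations, lat_edges, lon_edges):
    """Residence-time weighted concentrations on a lat/lon grid.

    trajectories: list of (lat, lon) arrays of hourly back-trajectory
    endpoints; concentrations: the source contribution measured at the
    receptor for each trajectory's arrival time. Each grid cell receives
    the residence-time-weighted mean of the concentrations whose
    trajectories passed through it.
    """
    nlat, nlon = len(lat_edges) - 1, len(lon_edges) - 1
    weighted = np.zeros((nlat, nlon))
    residence = np.zeros((nlat, nlon))
    for (lats, lons), conc in zip(trajectories, concentrations):
        # Hourly endpoints per cell approximate residence time in hours.
        h, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
        weighted += h * conc
        residence += h
    with np.errstate(invalid="ignore", divide="ignore"):
        return weighted / residence   # NaN where no trajectory passed

lat_edges = np.linspace(35, 55, 21)
lon_edges = np.linspace(-95, -65, 31)
traj = [(np.array([44.6, 45.0, 45.5]), np.array([-75.0, -76.2, -77.5]))]
print(np.nanmax(rtwc(traj, [12.3], lat_edges, lon_edges)))
```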
Quantitative Reporting Practices in Middle-Grades Research Journals: Lessons to Learn
ERIC Educational Resources Information Center
Capraro, Robert M.; Capraro, Mary Margaret
2009-01-01
This study examines two journals specific to the middle grades where original quantitative empirical articles are published, Research in Middle Level Education and Middle Grades Research Journal, to determine what quantitative statistics are used, how they are used, and what study designs are used. Important for those who write for the…
Scientists and Mathematicians Collaborating to Build Quantitative Skills in Undergraduate Science
ERIC Educational Resources Information Center
Rylands, Leanne; Simbag, Vilma; Matthews, Kelly E.; Coady, Carmel; Belward, Shaun
2013-01-01
There is general agreement in Australia and beyond that quantitative skills (QS) in science, the ability to use mathematics and statistics in context, are important for science. QS in the life sciences are becoming ever more important as these sciences become more quantitative. Consequently, undergraduates studying the life sciences require better…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. The FHWA received no comments in...
Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research
ERIC Educational Resources Information Center
Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah
2013-01-01
Statistical analysis is one component that cannot be avoided in quantitative research. Initial observations noted that students in higher education institutions faced difficulty analysing quantitative data, which was attributed to confusion over various variable measurements. This paper aims to compare the outcomes of two approaches applied in…
Fouchard, A; Bréchat, P-H; Castiel, D; Pascal, J; Sass, C; Lebas, J; Chauvin, P
2014-08-01
Inequality in health care is a growing problem, leading to the development of different tools for the assessment of individual deprivation. In France, three tools are mainly used: Epices (which stands for "score for the evaluation of social deprivation and health inequities among the centers for medical examination"), a score called "Handicap social", and a screening tool built for medical consultations by Pascal et al. at the Nantes hospital. The purpose of this study was to make a metrological assessment of those tools and a quantitative comparison by using them on a single deprived population. In order to assess the metrological properties of the three scores, we used the quality criteria published by Terwee et al., which are: content validity, internal consistency, criterion validity, construct validity, reproducibility (agreement and reliability), responsiveness, floor and ceiling effects, and interpretability. For the comparison, we used data from the patients who had attended a free hospital outpatient clinic dedicated to socially deprived people in Paris during one month in 2010. The "Handicap social" survey was first filled in by the 721 outpatients before being recoded to allow comparison with the other scores. While the population of interest was quite well defined by all three scores, the other quality criteria were less satisfactory. For this outpatient population, the "Handicap social" score classed 3.2% as non-deprived (class 1), 32.7% as socially deprived (class 2) and 64.7% as very deprived (class 3). With the Epices score, the rates of deprivation varied from 97.9% to 100% depending on the way the score was estimated. For the Pascal score, rates ranged from 83.4% to 88.1%. At the subgroup level, only the Pascal score showed statistically significant associations with gender, occupation, education and origin. These three scores have very different goals and meanings. They are not interchangeable. Users should be aware of their advantages and disadvantages in order to use them wisely. Much remains to be done to fully assess their metrological performance. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan
2014-05-30
We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
Predictive value of PD-L1 based on mRNA level in the treatment of stage IV melanoma with ipilimumab.
Brüggemann, C; Kirchberger, M C; Goldinger, S M; Weide, B; Konrad, A; Erdmann, M; Schadendorf, D; Croner, R S; Krähenbühl, L; Kähler, K C; Hafner, C; Leisgang, W; Kiesewetter, F; Dummer, R; Schuler, G; Stürzl, M; Heinzerling, L
2017-10-01
PD-L1 is established as a predictive marker for therapy of non-small cell lung cancer with pembrolizumab. Furthermore, PD-L1-positive melanoma has shown more favorable outcomes when treated with anti-PD1 antibodies and dacarbazine compared to PD-L1-negative melanoma. However, the role of PD-L1 expression with regard to response to checkpoint inhibition with anti-CTLA-4 is not yet clear. In addition, the lack of standardization in the immunohistochemical assessment of PD-L1 makes the comparison of results difficult. In this study, we investigated PD-L1 gene expression with a new fully automated technique via RT-PCR and correlated the findings with response to the anti-CTLA-4 antibody ipilimumab. Within a retrospective multi-center trial, PD-L1 gene expression was evaluated in 78 melanoma patients in a total of 111 pre-treatment tumor samples from 6 skin cancer centers and analyzed with regard to response to ipilimumab. For meaningful statistical analysis, the cohort was enriched for responders, with 30 responders and 48 non-responders. Gene expression was assessed by quantitative RT-PCR after extracting mRNA from formalin-fixed paraffin-embedded tumor tissue and correlated with results from immunohistochemical (IHC) staining. The evaluation of PD-L1 expression based on mRNA level is feasible. Correlation between PD-L1 expression as assessed by IHC and by RT-PCR showed varying levels of concordance depending on the antibody employed. RT-PCR should be further investigated as a means of measuring PD-L1 expression, since it is a semi-quantitative method with observer-independent evaluation. With this approach, there was no statistically significant difference in PD-L1 expression between responders and non-responders to therapy with ipilimumab.
Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim
2009-01-01
Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...
How to Compare Parametric and Nonparametric Person-Fit Statistics Using Real Data
ERIC Educational Resources Information Center
Sinharay, Sandip
2017-01-01
Person-fit assessment (PFA) is concerned with uncovering atypical test performance as reflected in the pattern of scores on individual items on a test. Existing person-fit statistics (PFSs) include both parametric and nonparametric statistics. Comparison of PFSs has been a popular research topic in PFA, but almost all comparisons have employed…
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
The other half of the story: effect size analysis in quantitative research.
Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane
2013-01-01
Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
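For the common case of two independent groups compared with a t test, the paired effect size index is Cohen's d with a pooled standard deviation. A Python sketch with invented scores:

```python
import numpy as np

def cohens_d(group1, group2):
    """Cohen's d with a pooled standard deviation (independent groups)."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_sd = np.sqrt(((n1 - 1) * g1.var(ddof=1) +
                         (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2))
    return (g1.mean() - g2.mean()) / pooled_sd

# Illustrative exam scores for treatment and control sections.
print(cohens_d([78, 85, 92, 88, 75, 81], [70, 74, 69, 80, 72, 77]))
```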
Determining absolute protein numbers by quantitative fluorescence microscopy.
Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry
2014-01-01
Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
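The ratiometric comparison described scales a standard of known copy number by the ratio of fluorescence intensities. A Python sketch of the core arithmetic; the intensities and the 306-copy standard are purely illustrative:

```python
def copies_by_ratio(target_intensity, standard_intensity, standard_copies):
    """Ratiometric count: scale a known standard by relative fluorescence.

    Assumes both measurements are background-subtracted and acquired under
    identical imaging settings, as the method requires.
    """
    return standard_copies * target_intensity / standard_intensity

# E.g., a protein cluster of interest versus a hypothetical standard of
# known GFP copy number (illustrative values only).
print(copies_by_ratio(target_intensity=5.4e4,
                      standard_intensity=1.8e4,
                      standard_copies=306))  # -> 918.0 copies
```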
Comparative policy analysis for alcohol and drugs: Current state of the field.
Ritter, Alison; Livingston, Michael; Chalmers, Jenny; Berends, Lynda; Reuter, Peter
2016-05-01
A central policy research question concerns the extent to which specific policies produce certain effects, and cross-national (or between-state/province) comparisons appear to be an ideal way to answer such a question. This paper explores the current state of comparative policy analysis (CPA) with respect to alcohol and drug policies. We created a database of journal articles published between 2010 and 2014 as the body of CPA work for analysis. We used this database of 57 articles to clarify, extract and analyse the ways in which CPA has been defined. Quantitative and qualitative analyses of the CPA methods employed, the policy areas that have been studied, and differences between alcohol CPA and drug CPA are explored. There is a lack of clear definition as to what counts as a CPA. The two criteria for a CPA (explicit study of a policy, and comparison across two or more geographic locations) exclude descriptive epidemiology and single-state comparisons. With the strict definition, most CPAs were with reference to alcohol (42%), although the most common policy to be analysed was medical cannabis (23%). The vast majority of papers undertook quantitative data analysis, with a variety of advanced statistical methods. We identified five approaches to policy specification: classification or categorical coding of policy as present or absent; the use of an index; implied policy differences; described policy differences; and data-driven policy coding. Each of these has limitations, but perhaps the most common limitation was the inability of the methods to account for the difference between policy-as-stated and policy-as-implemented. There is significant diversity in CPA methods for the analysis of alcohol and drug policy, and some substantial challenges with the currently employed methods. The absence of clear boundaries to a definition of what counts as a 'comparative policy analysis' may account for the methodological plurality but also appears to stand in the way of advancing the techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses
Khavrutskii, Ilja V.; Chaudhury, Sidhartha
2017-05-10
…repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared and… …B-cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype…
Noufal, Ahammed; George, Antony; Jose, Maji; Khader, Mohasin Abdul; Jayapalan, Cheriyanthal Sisupalan
2014-01-01
Tobacco in any form (smoking or chewing), arecanut chewing, and alcohol are considered to be the major extrinsic etiological factors for potentially malignant disorders of the oral cavity and for squamous cell carcinoma, the most common oral malignancy in India. An increase in nuclear diameter (ND) and nucleus-cell ratio (NCR) with a reduction in cell diameter (CD) are early cytological indicators of dysplastic change. The authors sought to identify cytomorphometric changes in the ND, CD, and NCR of oral buccal cells in tobacco and arecanut chewers who chewed with or without betel leaf. Participants were assigned to 3 groups. Group I consisted of 30 individuals who chewed tobacco and arecanut with betel leaf (BQT chewers). Group II consisted of 30 individuals who chewed tobacco and arecanut without betel leaf (Gutka chewers). Group III comprised 30 apparently healthy nonabusers. Cytological smears were prepared and stained with modified Papanicolaou stain. Comparisons between Groups I and II and between Groups II and III showed that ND was increased, with P values of .054 and .008, respectively, whereas the comparison of Groups I and III showed no statistical significance. Comparisons between Groups I and II and between Groups II and III showed that CD was significantly reduced, with P values of .037 and <.001, respectively, whereas the comparison of Groups I and III showed no statistical significance. Comparisons between Groups I and II and between Groups II and III showed that NCR was significantly increased, with P values of <.001, whereas the comparison of Groups I and III showed no statistical significance. CD, ND, and NCR showed statistically significant changes in Group II in comparison with Group I, which could indicate a greater and earlier risk of carcinoma for Gutka chewers than for BQT chewers.
75 FR 68468 - List of Fisheries for 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...
ERIC Educational Resources Information Center
Mirick, Rebecca G.; Davis, Ashley
2017-01-01
Although statistics and research are key components of social work education, students are often described as reluctant consumers and users of statistics. Self-efficacy theory has been used to understand students' engagement with the statistical knowledge needed for practice. This quantitative study explores the relationship between self-efficacy,…
A Simple Illustration for the Need of Multiple Comparison Procedures
ERIC Educational Resources Information Center
Carter, Rickey E.
2010-01-01
Statistical adjustments to accommodate multiple comparisons are routinely covered in introductory statistical courses. The fundamental rationale for such adjustments, however, may not be readily understood. This article presents a simple illustration to help remedy this.
Clinical importance of voluntary and induced Bennett movement.
Tupac, R G
1978-07-01
A total of 136 dentulous patients were divided into three groups for purposes of quantitative pantographic comparison of voluntary and induced Bennett movement. The effects of patient age and operator experience on recording the Bennett movement were also studied. The results indicate that, for the patients studied with Bennett movement induced in the manner described: 1. Experienced operators can obtain more induced Bennett movement than inexperienced operators. 2. Inducing Bennett movement has a greater effect on the immediate side shift component than on the progressive side shift component. 3. For older individuals, the amount of induced immediate side shift is greater, and its direction differs, compared with younger patients; this difference is statistically highly significant and therefore clinically important. In conclusion, if the objective of a pantographic survey is to record the complete capacity of the joint to move, lateral jaw movements must be induced.
Complexity-entropy causality plane: A useful approach for distinguishing songs
NASA Astrophysics Data System (ADS)
Ribeiro, Haroldo V.; Zunino, Luciano; Mendes, Renio S.; Lenzi, Ervin K.
2012-04-01
Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies. This is particularly true when dealing with music databases. In this context, it is essential to have techniques and tools able to discriminate properties from these massive sets. In this work, we report on a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on the estimation of the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results obtained indicate that this representation space is very promising for discriminating songs as well as for allowing relative quantitative comparisons among songs. Additionally, we believe that the method reported here may be applied in practical situations since it is simple, robust, and has a fast numerical implementation.
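The core computation can be sketched in a few lines. The Python code below is an editorial illustration, not the authors' implementation: it estimates the Bandt-Pompe ordinal distribution, the normalized permutation entropy, and a Jensen-Shannon statistical complexity; the embedding order d=4 and the test signals are arbitrary choices.

```python
import numpy as np
from itertools import permutations

def ordinal_distribution(x, d=4):
    """Empirical distribution of order-d ordinal patterns (Bandt-Pompe)."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def shannon(q):
    nz = q[q > 0]
    return -np.sum(nz * np.log(nz))

def permutation_entropy(p):
    """Normalized Shannon entropy of the ordinal distribution, in [0, 1]."""
    return shannon(p) / np.log(len(p))

def js_complexity(p):
    """Jensen-Shannon statistical complexity C = Q_J(p, uniform) * H(p)."""
    n = len(p)
    u = np.full(n, 1.0 / n)
    js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) + np.log(n) - 2 * np.log(2 * n))
    return (js / js_max) * permutation_entropy(p)

# Arbitrary test signals: a noisy periodic tone vs. pure white noise
rng = np.random.default_rng(0)
t = np.arange(5000)
for name, x in [("noisy tone", np.sin(0.05 * t) + 0.3 * rng.standard_normal(5000)),
                ("white noise", rng.standard_normal(5000))]:
    p = ordinal_distribution(x, d=4)
    print(f"{name}: H={permutation_entropy(p):.3f}, C={js_complexity(p):.3f}")
```

Structured signals sit away from both the low-entropy and high-entropy extremes of the plane, which is what makes the (H, C) pair useful for ranking songs by complexity.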
Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.
2018-01-01
Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549
Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C
NASA Technical Reports Server (NTRS)
Natesh, R.; Guyer, T.; Stringfellow, G. B.
1982-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.
ERIC Educational Resources Information Center
Yousef, Darwish Abdulrahamn
2017-01-01
Purpose: This paper aims to investigate the impacts of teaching style, English language and communication and assessment methods on the academic performance of undergraduate business students in introductory quantitative courses such as Statistics for Business 1 and 2, Quantitative Methods for Business, Operations and Production Management and…
ERIC Educational Resources Information Center
Flanagan, K. M.; Einarson, J.
2017-01-01
In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we…
New powerful statistics for alignment-free sequence comparison under a pattern transfer model.
Liu, Xuemei; Wan, Lin; Li, Jing; Reinert, Gesine; Waterman, Michael S; Sun, Fengzhu
2011-09-07
Alignment-free sequence comparison is widely used for comparing gene regulatory regions and for identifying horizontally transferred genes. Recent studies on the power of a widely used alignment-free comparison statistic D2 and its variants D2* and D2s showed that their power approximates a limit smaller than 1 as the sequence length tends to infinity under a pattern transfer model. We develop new alignment-free statistics based on D2, D2* and D2s by comparing local sequence pairs and then summing over all the local sequence pairs of certain length. We show that the new statistics are much more powerful than the corresponding statistics and the power tends to 1 as the sequence length tends to infinity under the pattern transfer model. Copyright © 2011 Elsevier Ltd. All rights reserved.
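To make the word-count statistic concrete, here is a minimal Python sketch (an editorial illustration, not the authors' code): classical D2 as the inner product of k-mer counts, plus a simplified local-window summation in the spirit of the new statistics; k, the window length, and the sequences are arbitrary.

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(a, b, k=5):
    """Classical D2: sum over shared k-mers of the product of their counts."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

def local_d2_sum(a, b, k=5, window=50):
    """Sum D2 over all pairs of non-overlapping local windows (a simplified
    stand-in for the local-pair summation described in the abstract)."""
    wins_a = [a[i:i + window] for i in range(0, len(a) - window + 1, window)]
    wins_b = [b[i:i + window] for i in range(0, len(b) - window + 1, window)]
    return sum(d2(wa, wb, k) for wa in wins_a for wb in wins_b)

x = "ACGTACGTGGTTACGTACGA" * 20
y = "TTGACGTACGTGCATGCATG" * 20
print("global D2:", d2(x, y), " local sum:", local_d2_sum(x, y))
```

The variants D2* and D2s additionally standardize the counts by their expected values under a background model, which is what gives them (and their local versions) better statistical power.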
NASA Technical Reports Server (NTRS)
Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.
1983-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important correlation was obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was also determined. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using the quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
NASA Astrophysics Data System (ADS)
He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui
2017-03-01
As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies such as the detection of cancerous tissues. In previous works, it has been found that the structural information encoded in 2D Mueller matrix images can be presented by other transformed parameters with a more explicit relationship to certain microstructural features. In this paper, we present a statistical analysis method to transform the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results of porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation, and absorption abilities. It is shown in this paper that the statistical analysis of 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
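The transformation described can be illustrated with a short sketch (not from the paper): reduce one Mueller matrix element image to its frequency distribution histogram and low-order central moments. The bin count and the synthetic image are assumptions for demonstration.

```python
import numpy as np

def fdh_and_central_moments(element_img, bins=64, orders=(2, 3, 4)):
    """Histogram (FDH) and central moments of one Mueller matrix element."""
    v = np.asarray(element_img, dtype=float).ravel()
    hist, edges = np.histogram(v, bins=bins, density=True)
    mu = v.mean()
    moments = {n: float(np.mean((v - mu) ** n)) for n in orders}
    return hist, edges, moments

# Synthetic stand-in for a 2D image of one element (e.g. m22), values in [-1, 1]
rng = np.random.default_rng(1)
m22 = np.clip(0.6 + 0.1 * rng.standard_normal((256, 256)), -1, 1)
_, _, moments = fdh_and_central_moments(m22)
print({order: round(m, 5) for order, m in moments.items()})
```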
Rhee, Chang-Hoon; Shin, Sang Min; Choi, Yong-Seok; Yamaguchi, Tetsutaro; Maki, Koutaro; Kim, Yong-Il; Kim, Seong-Sik; Park, Soo-Byung; Son, Woo-Sung
2015-12-01
From computed tomographic images, the dentocentral synchondrosis can be identified in the second cervical vertebra. It demarcates the border between the odontoid process and the body of the 2nd cervical vertebra and can serve as a good model for the prediction of bone and forensic age. Nevertheless, until now, the 2nd cervical vertebra, delimited by the dentocentral synchondrosis, has not been applied to age estimation. In this study, statistical shape analysis was used to build bone and forensic age estimation regression models. Following the principles of statistical shape analysis and principal components analysis, we used cone-beam computed tomography (CBCT) to evaluate a Japanese population (35 males and 45 females, from 5 to 19 years old). The narrowest prediction intervals among the multivariate regression models were 19.63 for bone age and 2.99 for forensic age. There was no significant difference between form space and shape space in the bone and forensic age estimation models. However, in the gender comparison, the bone and forensic age estimation models for males had the higher explanatory power. This study derived an improved objective and quantitative method for bone and forensic age estimation based only on the 2nd, 3rd and 4th cervical vertebral shapes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
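The general shape-to-age pipeline can be sketched as follows (an illustration under assumed placeholder data, not the study's actual model): Procrustes-aligned landmark coordinates are reduced by principal components analysis, and the component scores feed a regression on age.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Placeholder data: 80 subjects, 20 aligned 2D landmarks each (flattened x, y)
rng = np.random.default_rng(2)
landmarks = rng.standard_normal((80, 40))   # hypothetical shape coordinates
ages = rng.uniform(5, 19, 80)               # ages 5-19, as in the study cohort

scores = PCA(n_components=5).fit_transform(landmarks)  # principal shape modes
model = LinearRegression().fit(scores, ages)
print("in-sample R^2 of the age regression:", round(model.score(scores, ages), 3))
```

With real landmark data the component scores capture the growth-related shape modes, and prediction intervals for new subjects would be derived from the fitted regression rather than the in-sample fit shown here.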
Johnson, Gregory R.; Kangas, Joshua D.; Dovzhenko, Alexander; Trojok, Rüdiger; Voigt, Karsten; Majarian, Timothy D.; Palme, Klaus; Murphy, Robert F.
2017-01-01
Quantitative image analysis procedures are necessary for the automated discovery of effects of drug treatment in large collections of fluorescent micrographs. When compared to their mammalian counterparts, the effects of drug conditions on protein localization in plant species are poorly understood and underexplored. To investigate this relationship, we generated a large collection of images of single plant cells after various drug treatments. For this, protoplasts were isolated from six transgenic lines of A. thaliana expressing fluorescently tagged proteins. Nine drugs at three concentrations were applied to protoplast cultures, followed by automated image acquisition. For image analysis, we developed a cell segmentation protocol for detecting drug effects using a Hough-transform based region of interest detector and a novel cross-channel texture feature descriptor. In order to determine treatment effects, we summarized differences between treated and untreated experiments with an L1 Cramér-von Mises statistic. The distribution of these statistics across all pairs of treated and untreated replicates was compared to the variation within control replicates to determine the statistical significance of observed effects. Using this pipeline, we report the dose-dependent drug effects in the first high-content Arabidopsis thaliana drug screen of its kind. These results can function as a baseline for comparison to other protein organization modeling approaches in plant cells. PMID:28245335
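A minimal sketch of the summary statistic mentioned above (editorial illustration; the exact form used in the paper may differ): an L1 distance between empirical CDFs of a feature in treated versus control cells, compared against the control-versus-control baseline.

```python
import numpy as np

def l1_cvm(a, b):
    """L1 distance between empirical CDFs, evaluated on the pooled sample."""
    pooled = np.sort(np.concatenate([a, b]))
    fa = np.searchsorted(np.sort(a), pooled, side="right") / len(a)
    fb = np.searchsorted(np.sort(b), pooled, side="right") / len(b)
    return float(np.mean(np.abs(fa - fb)))

rng = np.random.default_rng(3)
control_a = rng.normal(0.0, 1.0, 500)   # texture feature, control replicate A
control_b = rng.normal(0.0, 1.0, 500)   # control replicate B
treated = rng.normal(0.4, 1.0, 500)     # hypothetical drug-treated replicate

# A treatment effect shows up as a distance well above the control baseline
print("control vs control:", round(l1_cvm(control_a, control_b), 4))
print("treated vs control:", round(l1_cvm(treated, control_a), 4))
```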
ERIC Educational Resources Information Center
Bliss, Leonard B.; Tashakkori, Abbas
This paper discusses the objectives that would be appropriate for statistics classes for students who are not majoring in statistics, evaluation, or quantitative research design. These "non-majors" should be able to choose appropriate analytical methods for specific sets of data based on the research question and the nature of the data, and they…
Gunawardena, Harsha P.; Feltcher, Meghan E.; Wrobel, John A.; Gu, Sheng; Braunstein, Miriam; Chen, Xian
2015-01-01
The Mycobacterium tuberculosis (MTB) membrane is rich in antigens that are potential targets for diagnostics and the development of new vaccines. To better understand the mechanisms underlying MTB virulence and identify new targets for therapeutic intervention we investigated the differential composition of membrane proteomes between virulent M. tuberculosis H37Rv (MTB) and the Mycobacterium bovis BCG vaccine strain. To compare the membrane proteomes, we used LC-MS/MS analysis in combination with label-free quantitative (LFQ) proteomics, utilizing the area-under-curve (AUC) of the extracted ion chromatograms (XIC) of peptides obtained from m/z and retention time alignment of MS1 features. With this approach, we obtained relative abundance ratios for 2,203 identified membrane-associated proteins in high confidence. Of these proteins, 294 showed statistically significant differences of at least 2 fold, in relative abundance between MTB and BCG membrane fractions. Our comparative analysis detected several proteins associated with known genomic regions of difference between MTB and BCG as being absent, which validated the accuracy of our approach. In further support of our label-free quantitative data, we verified select protein differences by immunoblotting. To our knowledge we have generated the first comprehensive and high coverage profile of comparative membrane proteome changes between virulent MTB and its attenuated relative BCG, which helps elucidate the proteomic basis of the intrinsic virulence of the MTB pathogen. PMID:24093440
Ehrhardt, J; Säring, D; Handels, H
2007-01-01
Modern tomographic imaging devices enable the acquisition of spatial and temporal image sequences. However, the spatial and temporal resolution of such devices is limited, and therefore image interpolation techniques are needed to represent images at a desired level of discretization. This paper presents a method for structure-preserving interpolation between neighboring slices in temporal or spatial image sequences. In a first step, the spatiotemporal velocity field between image slices is determined using an optical flow-based registration method in order to establish spatial correspondence between adjacent slices. An iterative algorithm is applied using the spatial and temporal image derivatives and a spatiotemporal smoothing step. Afterwards, the calculated velocity field is used to generate an interpolated image at the desired time by averaging intensities between corresponding points. Three quantitative measures are defined to evaluate the performance of the interpolation method. The behavior and capability of the algorithm are demonstrated on synthetic images. A population of 17 temporal and spatial image sequences was utilized to compare the optical flow-based interpolation method with linear and shape-based interpolation. The quantitative results show that the optical flow-based method outperforms linear and shape-based interpolation in a statistically significant manner. The interpolation method presented is able to generate image sequences with the appropriate spatial or temporal resolution needed for image comparison, analysis or visualization tasks. Quantitative and qualitative measures extracted from synthetic phantoms and medical image data show that the new method has clear advantages over linear and shape-based interpolation.
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can, for instance, be used to obtain information from cultural heritage objects that is unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation, and to study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength-tunable narrow-bandwidth light source, the light energy used to illuminate the measured object is kept minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document where different types of ink have been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects on artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
Yehia, Ali Mohamed; Essam, Hebatallah Mohamed
2016-09-01
A generally applicable high-performance liquid chromatographic method for the qualitative and quantitative determination of pharmaceutical preparations containing phenylephrine hydrochloride, paracetamol, ephedrine hydrochloride, guaifenesin, doxylamine succinate, and dextromethorphan hydrobromide is developed. Optimization of chromatographic conditions was performed for the gradient elution using different buffer pH values, flow rates, and two C18 stationary phases. The method was developed using a Kinetex® C18 column as a core-shell stationary phase with a gradient profile using buffer pH 5.0 and acetonitrile at a 2.0 mL/min flow rate. Detection was carried out at 220 nm, and linear calibrations were obtained for all components within the studied ranges. The method was fully validated in agreement with ICH guidelines. The proposed method is specific, accurate and precise (RSD% < 3%). Limits of detection are lower than 2.0 μg/mL. Qualitative and quantitative responses were evaluated using experimental design to assess the method robustness. The method proved to be highly robust against a 10% change in buffer pH and flow rate (RSD% < 10%); however, the flow rate may significantly influence the quantitative responses of phenylephrine, paracetamol, and doxylamine (RSD% > 10%). Satisfactory results were obtained for analyses of commercial combinations. Statistical comparison between the proposed chromatographic and official methods revealed no significant difference. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Moulder, Robert; Filén, Jan-Jonas; Salmi, Jussi; Katajamaa, Mikko; Nevalainen, Olli S; Oresic, Matej; Aittokallio, Tero; Lahesmaa, Riitta; Nyman, Tuula A
2005-07-01
The options available for processing quantitative data from isotope coded affinity tag (ICAT) experiments have mostly been confined to software specific to the instrument of acquisition. However, recent developments in data format conversion have subsequently increased such processing opportunities. In the present study, data sets from ICAT experiments, analysed by liquid chromatography/tandem mass spectrometry (MS/MS) using an Applied Biosystems QSTAR Pulsar quadrupole-TOF mass spectrometer, were processed in triplicate using separate mass spectrometry software packages. The programs Pro ICAT, Spectrum Mill and SEQUEST with XPRESS were employed. Attention was paid to the extent of common identification and the agreement of quantitative results, with additional interest in the flexibility and productivity of these programs. The comparisons were made with data from the analysis of a specifically prepared test mixture of nine proteins at a range of relative concentration ratios from 0.1 to 10 (light to heavy labelled forms), as a known control, and data selected from an ICAT study involving the measurement of cytokine-induced protein expression in human lymphoblasts, as an applied example. Dissimilarities were detected in peptide identification that reflected how the associated scoring parameters favoured information from the MS/MS data sets. Accordingly, there were differences in the numbers of peptide and protein identifications, although from these it was apparent that both confirmatory and complementary information was present. In the quantitative results from the three programs, no statistically significant differences were observed.
Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian
2016-01-01
Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541
Pittaluga, Fabrizia; Allice, Tiziano; Abate, Maria Lorena; Ciancio, Alessia; Cerutti, Francesco; Varetto, Silvia; Colucci, Giuseppe; Smedile, Antonina; Ghisetti, Valeria
2008-02-01
Diagnosis and monitoring of HCV infection rely on sensitive and accurate HCV RNA detection and quantitation. The performance of the COBAS AmpliPrep/COBAS TaqMan 48 (CAP/CTM) (Roche, Branchburg, NJ), a fully automated, real-time PCR HCV RNA quantitative test, was assessed and compared with the branched-DNA (bDNA) assay. Clinical evaluation on 576 specimens obtained from patients with chronic hepatitis C showed a good correlation (r = 0.893) between the two tests, but the CAP/CTM scored higher HCV RNA titers than the bDNA across all viral genotypes. The mean bDNA versus CAP/CTM log10 IU/ml differences were -0.49, -0.4, -0.54, and -0.26 for genotypes 1a, 1b, 2a/2c, 3a, and 4, respectively. These differences reached statistical significance for genotypes 1b, 2a/c, and 3a. The ability of the CAP/CTM to monitor patients undergoing antiviral therapy and correctly identify the week 4 and week 12 rapid and early virological responses was confirmed. The broader dynamic range of the CAP/CTM compared with the bDNA allowed for a better definition of viral kinetics. In conclusion, the CAP/CTM appears to be a reliable and user-friendly assay to monitor HCV viremia during treatment of patients with chronic hepatitis. Its high sensitivity and wide dynamic range may help a better definition of viral load changes during antiviral therapy. Copyright © 2007 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric
2017-10-01
The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, communicating this information quantitatively is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis, including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
McCann, Una D; Szabo, Zsolt; Seckin, Esen; Rosenblatt, Peter; Mathews, William B; Ravert, Hayden T; Dannals, Robert F; Ricaurte, George A
2005-09-01
(+/-)3,4-Methylenedioxymethamphetamine (MDMA, 'Ecstasy') is a widely used illicit drug that produces toxic effects on brain serotonin axons and axon terminals in animals. The results of clinical studies addressing MDMA's serotonin neurotoxic potential in humans have been inconclusive. In the present study, 23 abstinent MDMA users and 19 non-MDMA controls underwent quantitative positron emission tomography (PET) studies using [11C]McN5652 and [11C]DASB, first- and second-generation serotonin transporter (SERT) ligands previously validated in baboons for detecting MDMA-induced brain serotonin neurotoxicity. Global and regional distribution volumes (DVs) and two additional SERT-binding parameters (DV(spec) and DVR) were compared in the two subject populations using parametric statistical analyses. Data from PET studies revealed excellent correlations between the various binding parameters of [11C]McN5652 and [11C]DASB, both in individual brain regions and individual subjects. Global SERT reductions were found in MDMA users with both PET ligands, using all three of the above-mentioned SERT-binding parameters. Preplanned comparisons in 15 regions of interest demonstrated reductions in selected cortical and subcortical structures. Exploratory correlational analyses suggested that SERT measures recover with time, and that loss of the SERT is directly associated with MDMA use intensity. These quantitative PET data, obtained using validated first- and second-generation SERT PET ligands, provide strong evidence of reduced SERT density in some recreational MDMA users.
Validating internal controls for quantitative plant gene expression studies.
Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H
2004-08-18
Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor do they compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
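As a simplified illustration of ranking candidate reference genes by expression stability (the paper's ANOVA- and regression-based measures are richer than this), the sketch below ranks hypothetical genes by the variance of their expression values across conditions; all data are synthetic.

```python
import numpy as np

# Synthetic placeholder: candidate reference gene -> expression values (e.g.
# normalized Ct) across 12 tissues/conditions, with differing variability
rng = np.random.default_rng(2)
expression = {f"gene{i}": 20 + rng.standard_normal(12) * s
              for i, s in enumerate([0.2, 0.5, 1.0, 2.0])}

# One simple stability proxy: variance across conditions (lower = more stable)
for gene in sorted(expression, key=lambda g: np.var(expression[g], ddof=1)):
    print(gene, "variance:", round(float(np.var(expression[gene], ddof=1)), 3))
```

A gene that tops this ranking across all experimental conditions is the safer normalization choice; a gene whose variance is driven by one tissue or treatment should be avoided for comparisons involving that condition.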
Valencia Arango, M; Torres Martí, A; Insausti Ordeñana, J; Alvarez Lerma, F; Carrasco Joaquinet, N; Herranz Casado, M; Tirapu León, J P
2003-09-01
To study the validity of quantitative cultures of tracheal aspirate (TA) in comparison with the plugged telescoping catheter (PTC) for the diagnosis of mechanical ventilator-associated pneumonia. Prospective multicenter study enrolling patients undergoing mechanical ventilation for longer than 72 hours. TA samples were collected from patients with suspected ventilator-associated pneumonia, followed by PTC sampling. Quantitative cultures were performed on all samples. Patients were classified according to the presence or not of pneumonia, based on clinical and radiologic criteria, clinical course and autopsy findings. The cutoff point was ≥10³ colony-forming units (cfu)/mL for PTC cultures; the TA cutoffs analyzed were ≥10⁵ and ≥10⁶ cfu/mL. Of the 120 patients studied, 84 had diagnoses of pneumonia and 36 did not (controls). The sensitivity values for TA ≥10⁶, TA ≥10⁵, and PTC, respectively, were 54% (95% confidence interval [CI], 42%-64%), 71% (95% CI, 60%-81%), and 68% (95% CI, 57%-78%). The specificity values were 75% (95% CI, 58%-88%), 58% (95% CI, 41%-74%), and 75% (95% CI, 58%-88%), respectively. Staphylococcus aureus was the microorganism most frequently isolated in both TA and PTC samples, followed in frequency by Pseudomonas aeruginosa in TA samples and Haemophilus influenzae in PTC samples. No significant differences were found between the sensitivity of TA ≥10⁵ and that of PTC, nor between the specificities of TA ≥10⁶ and PTC. Quantitative cultures of TA can therefore be considered acceptable for the diagnosis of ventilator-associated pneumonia.
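For readers reproducing such figures, a proportion such as sensitivity or specificity can be reported with a 95% confidence interval as sketched below; the Wilson score interval is one common choice, though the interval method actually used in the study is not stated.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a proportion (e.g. sensitivity/specificity)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Example: 60 of 84 pneumonia patients detected (roughly the 71% sensitivity
# reported for the TA >= 10^5 cutoff); yields approximately (0.61, 0.80)
print(wilson_ci(60, 84))
```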
Comparison of 18F-FDG PET/CT and PET/MRI in patients with multiple myeloma
Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Mosebach, Jennifer; Pan, Leyun; Schlemmer, Heinz-Peter; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2015-01-01
PET/MRI represents a promising hybrid imaging modality with several potential clinical applications. Although PET/MRI seems highly attractive in the diagnostic approach of multiple myeloma (MM), its role has not yet been evaluated. The aims of this prospective study are to evaluate the feasibility of 18F-FDG PET/MRI in the detection of MM lesions, and to investigate the reproducibility of bone marrow lesion detection and of quantitative 18F-FDG uptake data between the functional (PET) component of PET/CT and PET/MRI in MM patients. The study includes 30 MM patients. All patients initially underwent 18F-FDG PET/CT (60 min p.i.), followed by PET/MRI (120 min p.i.). PET/CT and PET/MRI data were assessed and compared based on qualitative (lesion detection) and quantitative (SUV) evaluation. The hybrid PET/MRI system provided good image quality in all cases without artefacts. PET/MRI identified 65 of the 69 lesions that were detectable with PET/CT (94.2%). Quantitative PET evaluations showed the following mean values in MM lesions: SUVaverage=5.5 and SUVmax=7.9 for PET/CT; SUVaverage=3.9 and SUVmax=5.8 for PET/MRI. Both SUVaverage and SUVmax were significantly higher on PET/CT than on PET/MRI. Spearman correlation analysis demonstrated a strong correlation between both lesional SUVaverage (r=0.744) and lesional SUVmax (r=0.855) values derived from PET/CT and PET/MRI. Regarding detection of myeloma skeletal lesions, PET/MRI exhibited equivalent performance to PET/CT. In terms of tracer uptake quantitation, a significant correlation between the two techniques was demonstrated, despite the statistically significant differences in lesional SUVs between PET/CT and PET/MRI. PMID:26550538
Park, Han Sang; Rinehart, Matthew T; Walzer, Katelyn A; Chi, Jen-Tsan Ashley; Wax, Adam
2016-01-01
Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that relies heavily on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at the trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors are highly statistically different between infected and uninfected cells, no single descriptor enables separation of the populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy, up to 99.7%, in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy, 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity, with early trophozoites most often mistaken for the late trophozoite or schizont stage, and the late trophozoite and schizont stages most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis.
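The classifier comparison described can be sketched with standard tools, as below (an editorial illustration; the feature matrix and labels are synthetic stand-ins for the 23 phase-derived descriptors and infection labels).

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic placeholder: 200 cells x 23 descriptors, two shifted populations
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, (100, 23)),   # uninfected cells
               rng.normal(0.8, 1.0, (100, 23))])  # infected cells
y = np.repeat([0, 1], 100)

for name, clf in [("LDC", LinearDiscriminantAnalysis()),
                  ("LR", LogisticRegression(max_iter=1000)),
                  ("NNC", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()  # 5-fold CV accuracy
    print(name, round(float(acc), 3))
```

Cross-validated accuracy, rather than training accuracy, is the appropriate basis for the kind of head-to-head comparison the abstract reports.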
Yin, Ji Yong; Huo, Jun Sheng; Ma, Xin Xin; Sun, Jing; Huang, Jian
2017-12-01
The aim was to develop a protein chip method that can simultaneously and quantitatively detect β-lactoglobulin (β-L) and lactoferrin (Lf) in a single run. A protein chip printer was used to print both anti-β-L antibodies and anti-Lf antibodies on each block of the protein chip. An improved sandwich detection method was then applied, in which the two detection antibodies for the antigens were mixed before being added to the block. The conditions for simultaneous quantitative measurement of β-L and Lf on the protein chip were optimized and evaluated. Under these conditions, standard curves for the two proteins were established simultaneously on one protein chip. Finally, the new method was evaluated for precision and accuracy. In comparison experiments, mouse monoclonal antibodies against the two antigens were chosen as the printing probes. The concentrations of the β-L and Lf probes were 0.5 mg/mL each, while the titers of the detection antibodies for both β-L and Lf were 1:2,000. Intra- and inter-assay variability was between 4.88% and 38.33% for all tests. The correlation coefficients between the protein chip and ELISA for β-L and Lf were 0.734 and 0.774, respectively, and both were statistically significant (r = 0.734, t = 2.644, P = 0.038; and r = 0.774, t = 2.998, P = 0.024). A protein chip method for the simultaneous quantitative detection of β-L and Lf has been established, and this method is worthy of further application. Copyright © 2017 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
Improved diagnosis of pulmonary emphysema using in vivo dark-field radiography.
Meinel, Felix G; Yaroshenko, Andre; Hellbach, Katharina; Bech, Martin; Müller, Mark; Velroyen, Astrid; Bamberg, Fabian; Eickelberg, Oliver; Nikolaou, Konstantin; Reiser, Maximilian F; Pfeiffer, Franz; Yildirim, Ali Ö
2014-10-01
The purpose of this study was to assess whether the recently developed method of grating-based x-ray dark-field radiography can improve the diagnosis of pulmonary emphysema in vivo. Pulmonary emphysema was induced in female C57BL/6N mice using endotracheal instillation of porcine pancreatic elastase and confirmed by in vivo pulmonary function tests, histopathology, and quantitative morphometry. The mice were anesthetized but breathing freely during imaging. Experiments were performed using a prototype small-animal x-ray dark-field scanner that was operated at 35 kilovolt (peak) with an exposure time of 5 seconds for each of the 10 grating steps. Images were compared visually. For quantitative comparison of signal characteristics, regions of interest were placed in the upper, middle, and lower zones of each lung. Receiver-operating-characteristic statistics were performed to compare the effectiveness of transmission and dark-field signal intensities and the combined parameter "normalized scatter" to differentiate between healthy and emphysematous lungs. A clear visual difference between healthy and emphysematous mice was found for the dark-field images. Quantitative measurements of x-ray dark-field signal and normalized scatter were significantly different between the mice with pulmonary emphysema and the control mice and showed good agreement with pulmonary function tests and quantitative histology. The normalized scatter showed a significantly higher discriminatory power (area under the receiver-operating-characteristic curve [AUC], 0.99) than dark-field (AUC, 0.90; P = 0.01) or transmission signal (AUC, 0.69; P < 0.001) alone did, allowing for an excellent discrimination of healthy and emphysematous lung regions. In a murine model, x-ray dark-field radiography is technically feasible in vivo and represents a substantial improvement over conventional transmission-based x-ray imaging for the diagnosis of pulmonary emphysema.
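Receiver-operating-characteristic comparisons of this kind reduce to computing an AUC per candidate signal. The sketch below does so on invented region-of-interest values; the signals are negated because, per the abstract, emphysema lowers rather than raises them, and the "normalized scatter" here is a simple illustrative ratio, not necessarily the study's exact definition.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Invented ROI values for healthy (0) vs emphysematous (1) lung regions
rng = np.random.default_rng(4)
labels = np.repeat([0, 1], 50)
transmission = np.concatenate([rng.normal(1.0, 0.3, 50), rng.normal(0.8, 0.3, 50)])
dark_field = np.concatenate([rng.normal(1.0, 0.2, 50), rng.normal(0.5, 0.2, 50)])
norm_scatter = dark_field / transmission  # illustrative combined parameter

# Negate so that higher score = more likely emphysema (signals decrease)
for name, score in [("transmission", -transmission),
                    ("dark-field", -dark_field),
                    ("normalized scatter", -norm_scatter)]:
    print(name, "AUC:", round(float(roc_auc_score(labels, score)), 2))
```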
Wang, Lin; Du, Jing; Li, Feng-Hua; Fang, Hua; Hua, Jia; Wan, Cai-Feng
2013-10-01
The purpose of this study was to evaluate the diagnostic efficacy of contrast-enhanced sonography for differentiation of breast lesions by combined qualitative and quantitative analyses in comparison with magnetic resonance imaging (MRI). Fifty-six patients with American College of Radiology Breast Imaging Reporting and Data System category 3 to 5 breast lesions on conventional sonography were evaluated by contrast-enhanced sonography and MRI. A comparative analysis of the diagnostic results of contrast-enhanced sonography and MRI was conducted in light of the pathologic findings. Pathologic analysis showed 26 benign and 30 malignant lesions. The predominant enhancement patterns of the benign lesions on contrast-enhanced sonography were homogeneous, centrifugal, and isoenhancement or hypoenhancement, whereas the patterns of the malignant lesions were mainly heterogeneous, centripetal, and hyperenhancement. The detection rates for perfusion defects and peripheral radial vessels in the malignant group were much higher than those in the benign group (P < .05). As to quantitative analysis, statistically significant differences were found in peak and time-to-peak values between the groups (P < .05). With pathologic findings as the reference standard, the sensitivity, specificity, and accuracy of contrast-enhanced sonography and MRI were 90.0%, 92.3%, and 91.1% and 96.7%, 88.5%, and 92.9%, respectively. The two methods had a concordance rate of 87.5% (49 of 56), and the concordance test gave a value of κ = 0.75, indicating high concordance in breast lesion assessment between the two diagnostic modalities. Contrast-enhanced sonography provided typical enhancement patterns and valuable quantitative parameters, which showed good agreement with MRI in diagnostic efficacy and may potentially improve characterization of breast lesions.
Quantitative Susceptibility Mapping after Sports-Related Concussion.
Koch, K M; Meier, T B; Karr, R; Nencka, A S; Muftuler, L T; McCrea, M
2018-06-07
Quantitative susceptibility mapping using MR imaging can assess changes in brain tissue structure and composition. This report presents preliminary results demonstrating changes in tissue magnetic susceptibility after sports-related concussion. Longitudinal quantitative susceptibility mapping metrics were produced from imaging data acquired from cohorts of concussed and control football athletes. One hundred thirty-six quantitative susceptibility mapping datasets were analyzed across 3 separate visits (24 hours after injury, 8 days postinjury, and 6 months postinjury). Longitudinal quantitative susceptibility mapping group analyses were performed on stability-thresholded brain tissue compartments and selected subregions. Clinical concussion metrics were also measured longitudinally in both cohorts and compared with the measured quantitative susceptibility mapping. Statistically significant increases in white matter susceptibility were identified in the concussed athlete group during the acute (24 hour) and subacute (day 8) period. These effects were most prominent at the 8-day visit but recovered and showed no significant difference from controls at the 6-month visit. The subcortical gray matter showed no statistically significant group differences. Observed susceptibility changes after concussion appeared to outlast self-reported clinical recovery metrics at a group level. At an individual subject level, susceptibility increases within the white matter showed statistically significant correlations with return-to-play durations. The results of this preliminary investigation suggest that sports-related concussion can induce physiologic changes to brain tissue that can be detected using MR imaging-based magnetic susceptibility estimates. In group analyses, the observed tissue changes appear to persist beyond those detected on clinical outcome assessments and were associated with return-to-play duration after sports-related concussion. © 2018 by American Journal of Neuroradiology.
NASA Astrophysics Data System (ADS)
Keown, Sandra L.
This study was devised to determine the effects of the use of interactive thematic organizers and concept maps in middle school science classes during a unit on minerals. The design, a pretest-posttest control-group design, consisted of matched groups (three experimental groups and one comparison group). It also included a student survey assessing qualitative aspects of the investigation. The 67 6th-grade students and one science teacher who participated in the study were from an independent K-12 school. Students represented a normal, well-distributed range of abilities. Group I (control) proceeded with their usual method of studying a unit: reading aloud the text and answering workbook questions. Group II worked with interactive thematic organizers, designed to activate prior knowledge and help students make inferences about target concepts, in three treatments. Group III created three interactive concept maps, which represented both understandings and misconceptions. Concept maps were reviewed and repaired as students completed each treatment. Group IV participated in both thematic organizer and concept map treatments. Statistical analyses were based on a pretest and a delayed-recall posttest essay for all four groups. Two scores were assigned: one quantitative raw score of correct explicit answers and one rubric score based on the quality of interpretive responses. Group II also received scores for thematic organizer responses. Group III received rubric scores for concept maps. Group IV received all possible scores. Paired t-tests reported comparisons of scores across the treatment groups. A linear regression indicated whether or not concept map misconceptions affected posttest scores. Finally, an ANCOVA tested statistical significance across the four treatment groups. Findings indicated statistically significant improvement in posttest scores among students in the three experimental groups. Students who participated in both treatments achieved the highest scores among the four groups. Results of the ANCOVA indicated a statistically significant difference in scores among the four treatments. Recommendations were made to further investigate the development of interactive thematic organizers with student-chosen hyperlinks to concepts, and for researchers to investigate teacher understandings of interpretive purpose and form in the creation of thematic organizers.
NASA Astrophysics Data System (ADS)
Baumgartner, Peter O.
A database on Middle Jurassic-Early Cretaceous radiolarians consisting of first and final occurrences of 110 species in 226 samples from 43 localities was used to compute Unitary Associations and probabilistic ranking and scaling (RASC), in order to test deterministic versus probabilistic quantitative biostratigraphic methods. Because the Mesozoic radiolarian fossil record is mainly dissolution-controlled, the sequence of events differs greatly from section to section. The scatter of local first and final appearances along a time scale is large compared to the species range; it is asymmetrical, with a maximum near the ends of the range and it is non-random. Thus, these data do not satisfy the statistical assumptions made in ranking and scaling. Unitary Associations produce maximum ranges of the species relative to each other by stacking cooccurrence data from all sections and therefore compensate for the local dissolution effects. Ranking and scaling, based on the assumption of a normal random distribution of the events, produces average ranges which are for most species much shorter than the maximum UA-ranges. There are, however, a number of species with similar ranges in both solutions. These species are believed to be the most dissolution-resistant and, therefore, the most reliable ones for the definition of biochronozones. The comparison of maximum and average ranges may be a powerful tool to test reliability of species for biochronology. Dissolution-controlled fossil data yield high crossover frequencies and therefore small, statistically insignificant interfossil distances. Scaling has not produced a useful sequence for this type of data.
Carvalho, Ana P; Malcata, F Xavier
2005-06-29
Assays for fatty acid composition in biological materials are commonly carried out by gas chromatography, after conversion of the lipid material into the corresponding fatty acid methyl esters (FAME) via suitable derivatization reactions. Quantitative derivatization depends on the type of catalyst and processing conditions employed, as well as the solubility of the sample in the reaction medium. Most literature pertinent to derivatization has focused on differential comparison between alternative methods; although useful for finding the best method for a particular sample, additional studies on the factors that may affect each step of FAME preparation are needed. In this work, the influence of various parameters in each step of the derivatization reactions was studied, using both cod liver oil and microalgal biomass as model systems. The accuracies of these methodologies were tested via comparison with the AOCS standard method, whereas their reproducibility was assessed by analysis of variance of (replicated) data. Alkaline catalysts generated lower levels of long-chain unsaturated FAME than acidic ones. Among the latter, acetyl chloride and BF3 were statistically equivalent to each other. The standard method, which involves alkaline treatment of samples before acidic methylation with BF3, provided equivalent results when compared with acidic methylation with BF3 alone. The polarity of the reaction medium was found to be of the utmost importance in the process: intermediate values of polarity [e.g., obtained with a 1:1 (v/v) mixture of methanol with diethyl ether or toluene] provided amounts of extracted polyunsaturated fatty acids statistically higher than those obtained via the standard method.
Jeon, Young J.; Kim, Jaeuk U.; Lee, Hae J.; Lee, Jeon; Ryu, Hyun H.; Lee, Yu J.; Kim, Jong Y.
2011-01-01
In this work, we analyze the baseline, signal strength, aortic augmentation index (AIx), radial AIx, time to reflection and P_T2 at Chon, Gwan, and Cheok, which are the three pulse diagnosis positions in Oriental medicine. For the pulse measurement, we used the SphygmoCor apparatus, which has been widely used for the evaluation of the arterial stiffness at the aorta. By two-way repeated measures analysis of variance, we tested two independent measurements for repeatability and investigated their mean differences among Chon, Gwan and Cheok. To characterize further the parameters that were shown to be different between each palpation position, we carried out Duncan's test for the multiple comparisons. The baseline and signal strength were statistically different (P < .05) among Chon, Gwan and Cheok, respectively, which supports the major hypothesis of Oriental medicine that all of the three palpation positions contain different clinical information. On the other hand, aortic AIx and time to reflection were found to be statistically different between Chon and the others, and radial AIx and P_T2 did not show any difference between pulse positions. In the clinical sense, however, the aortic AIx at each palpation position was found to fall within the 90% confidence interval of normal arterial compliance. The results of the multiple comparisons indicate that the parameters of arterial stiffness were independent of the palpation positions. This work is the first attempt to characterize quantitatively the pulse signals at Chon, Gwan and Cheok with some relevant parameters extracted from the SphygmoCor apparatus. PMID:19789213
Shemtov-Yona, K; Rittel, D
2016-09-01
The fatigue performance of dental implants is usually assessed on the basis of cyclic S/N curves. This neither provides information on the anticipated service performance of the implant, nor does it allow for detailed comparisons between implants unless a thorough statistical analysis is performed, of a kind not currently required by certification standards. The notion of an endurance limit is deemed to be of limited applicability, given the unavoidable stress concentrations and random load excursions that characterize dental implants and their service conditions. We propose a completely different approach, based on random spectrum loading, as long used in aeronautical design. The implant is randomly loaded by a sequence of loads encompassing all load levels it would endure during its service life. This approach provides a quantitative and comparable estimate of implant performance in terms of lifetime, based on the very fact that the implant will fracture sooner or later, instead of defining a fatigue endurance limit of limited practical application. Five commercial monolithic Ti-6Al-4V implants were tested under cyclic loading, and another 5 under spectrum loading conditions, at room temperature in dry air. The failure modes and fracture planes were identical for all implants. The approach is discussed, including its potential applications, for systematic, straightforward and reliable comparisons of various implant designs and environments, without the need for cumbersome statistical analyses. It is believed that spectrum loading can be considered for the generation of new standardization procedures and design applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
Clinical performance of the LCx HCV RNA quantitative assay.
Bertuzis, Rasa; Hardie, Alison; Hottentraeger, Barbara; Izopet, Jacques; Jilg, Wolfgang; Kaesdorf, Barbara; Leckie, Gregor; Leete, Jean; Perrin, Luc; Qiu, Chunfu; Ran, Iris; Schneider, George; Simmonds, Peter; Robinson, John
2005-02-01
This study was conducted to assess the performance of the Abbott Laboratories LCx HCV RNA Quantitative Assay (LCx assay) in the clinical setting. Four clinical laboratories measured LCx assay precision, specificity, and linearity. In addition, a method comparison was conducted between the LCx assay and the Roche HCV Amplicor Monitor, version 2.0 (Roche Monitor 2.0) and the Bayer VERSANT HCV RNA 3.0 Assay (Bayer bDNA 3.0) quantitative assays. For precision, the observed LCx assay intra-assay standard deviation (S.D.) was 0.060-0.117 log IU/ml, the inter-assay S.D. was 0.083-0.133 log IU/ml, the inter-lot S.D. was 0.105-0.177 log IU/ml, the inter-site S.D. was 0.099-0.190 log IU/ml, and the total S.D. was 0.113-0.190 log IU/ml. The specificity of the LCx assay was 99.4% (542/545; 95% CI, 98.4-99.9%). For linearity, the mean pooled LCx assay results were linear (r=0.994) over the range of the panel (2.54-5.15 log IU/ml). A method comparison demonstrated a correlation coefficient of 0.881 between the LCx assay and Roche Monitor 2.0, 0.872 between the LCx assay and Bayer bDNA 3.0, and 0.870 between Roche Monitor 2.0 and Bayer bDNA 3.0. The mean LCx assay result was 0.04 log IU/ml (95% CI, -0.08, 0.01) lower than the mean Roche Monitor 2.0 result, but 0.57 log IU/ml (95% CI, 0.53, 0.61) higher than the mean Bayer bDNA 3.0 result. The mean Roche Monitor 2.0 result was 0.60 log IU/ml (95% CI, 0.56, 0.65) higher than the mean Bayer bDNA 3.0 result. The LCx assay quantitated genotypes 1-4 with statistical equivalency. The vast majority (98.9%, 278/281) of paired LCx assay-Roche Monitor 2.0 specimen results were within 1 log IU/ml. Similarly, 86.6% (240/277) of paired LCx assay and Bayer bDNA 3.0 specimen results were within 1 log, as were 85.6% (237/277) of paired Roche Monitor 2.0 and Bayer specimen results. These data demonstrate that the LCx assay may be used for quantitation of HCV RNA in HCV-infected individuals.
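For illustration, the three kinds of summary statistics reported above (correlation, mean log difference with a 95% CI, and fraction of pairs within 1 log IU/ml) can be computed as in the following Python sketch; the paired data are simulated and the error levels are invented.

```python
# Sketch: paired method-comparison summaries for two quantitative assays
# on the log10 IU/ml scale. All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true = rng.uniform(2.5, 5.2, size=280)           # hypothetical true log10 IU/ml
assay_a = true + rng.normal(0.0, 0.15, 280)
assay_b = true + 0.05 + rng.normal(0.0, 0.15, 280)

diff = assay_a - assay_b
ci = stats.t.interval(0.95, len(diff) - 1, loc=diff.mean(), scale=stats.sem(diff))
r, _ = stats.pearsonr(assay_a, assay_b)

print(f"correlation r = {r:.3f}")
print(f"mean difference = {diff.mean():.2f} log IU/ml, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"pairs within 1 log: {np.mean(np.abs(diff) <= 1.0):.1%}")
```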
Lages Barbosa, Guilherme; Almeida Gadelha, Francisca Daiane; Kublik, Natalya; Proctor, Alan; Reichelm, Lucas; Weissinger, Emily; Wohlleb, Gregory M.; Halden, Rolf U.
2015-01-01
The land, water, and energy requirements of hydroponics were compared to those of conventional agriculture by example of lettuce production in Yuma, Arizona, USA. Data were obtained from crop budgets and governmental agricultural statistics, and contrasted with theoretical data for hydroponic lettuce production derived by using engineering equations populated with literature values. Yields of lettuce per greenhouse unit (815 m2) of 41 ± 6.1 kg/m2/y had water and energy demands of 20 ± 3.8 L/kg/y and 90,000 ± 11,000 kJ/kg/y (±standard deviation), respectively. In comparison, conventional production yielded 3.9 ± 0.21 kg/m2/y of produce, with water and energy demands of 250 ± 25 L/kg/y and 1100 ± 75 kJ/kg/y, respectively. Hydroponics offered 11 ± 1.7 times higher yields but required 82 ± 11 times more energy compared to conventionally produced lettuce. To the authors’ knowledge, this is the first quantitative comparison of conventional and hydroponic produce production by example of lettuce grown in the southwestern United States. It identified energy availability as a major factor in assessing the sustainability of hydroponics, and it points to water-scarce settings offering an abundance of renewable energy (e.g., from solar, geothermal, or wind power) as particularly attractive regions for hydroponic agriculture. PMID:26086708
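The reported ratio uncertainties (11 ± 1.7 for yield, 82 ± 11 for energy) are consistent with first-order error propagation on a ratio of the stated means; the short check below reproduces them from the numbers in the abstract. That the authors used exactly this propagation rule is an assumption.

```python
# Sketch: first-order error propagation for a ratio r = a/b:
# sd_r ≈ r * sqrt((sd_a/a)^2 + (sd_b/b)^2). Inputs are taken from the abstract.
import math

def ratio_with_sd(a, sd_a, b, sd_b):
    r = a / b
    return r, r * math.sqrt((sd_a / a) ** 2 + (sd_b / b) ** 2)

y, sd_y = ratio_with_sd(41, 6.1, 3.9, 0.21)        # yield, hydroponic/conventional
e, sd_e = ratio_with_sd(90000, 11000, 1100, 75)    # energy demand per kg

print(f"yield ratio:  {y:.1f} ± {sd_y:.1f}  (reported 11 ± 1.7)")
print(f"energy ratio: {e:.1f} ± {sd_e:.1f}  (reported 82 ± 11)")
```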
Gender differences in joint biomechanics during walking: normative study in young adults.
Kerrigan, D C; Todd, M K; Della Croce, U
1998-01-01
The effect of gender on specific joint biomechanics during gait has been largely unexplored. Given the perceived, subjective, and temporal differences in walking between genders, we hypothesized that quantitative analysis would reveal specific gender differences in joint biomechanics as well. Sagittal kinematic (joint motion) and kinetic (joint torque and power) data from the lower limbs during walking were collected and analyzed in 99 young adult subjects (49 females), aged 20 to 40 years, using an optoelectronic motion analysis and force platform system. Kinetic data were normalized for both height and weight. Female and male data were compared graphically and statistically to assess differences in all major peak joint kinematic and kinetic values. Females had significantly greater hip flexion and less knee extension before initial contact, greater knee flexion moment in pre-swing, and greater peak mechanical joint power absorption at the knee in pre-swing (P < 0.0019 for each parameter). Other differences were noted (P < 0.05) that were not statistically significant when accounting for multiple comparisons. These gender differences may provide new insights into walking dynamics and may be important for both clinical and research studies in motivating the development of separate biomechanical reference databases for males and females.
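The P < 0.0019 cutoff matches a Bonferroni correction of a familywise α = 0.05 over roughly 26 comparisons (0.05/26 ≈ 0.0019); the exact comparison count and the p-values in this short sketch are assumptions for illustration.

```python
# Sketch: Bonferroni-corrected per-test threshold for multiple comparisons.
n_comparisons = 26            # hypothetical number of peak kinematic/kinetic values
alpha_per_test = 0.05 / n_comparisons
print(f"per-test threshold: {alpha_per_test:.4f}")   # -> 0.0019

p_values = {"hip flexion": 0.0004, "knee extension": 0.0011, "ankle power": 0.021}
for name, p in p_values.items():
    ok = p < alpha_per_test
    print(f"{name}: p = {p} -> {'significant' if ok else 'not significant after correction'}")
```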
Jeukens, Cécile R L P N; Lalji, Ulrich C; Meijer, Eduard; Bakija, Betina; Theunissen, Robin; Wildberger, Joachim E; Lobbes, Marc B I
2014-10-01
Contrast-enhanced spectral mammography (CESM) shows promising initial results but comes at the cost of increased dose as compared with full-field digital mammography (FFDM). We aimed to quantitatively assess the dose increase of CESM in comparison with FFDM. Radiation exposure-related data (such as kilovoltage, compressed breast thickness, glandularity, entrance skin air kerma (ESAK), and average glandular dose (AGD)) were retrieved for 47 CESM and 715 FFDM patients. All examinations were performed on a single mammography unit. Radiation dose values reported by the unit were validated by phantom measurements. Descriptive statistics of the patient data were generated using a statistical software package. Dose values reported by the mammography unit were in good qualitative agreement with those of the phantom measurements. Mean ESAK was 10.5 mGy for a CESM exposure and 7.46 mGy for an FFDM exposure. Mean AGD for a CESM exposure was 2.80 mGy and 1.55 mGy for an FFDM exposure. Compared with our institutional FFDM, the AGD of a single CESM exposure is increased by 1.25 mGy (+81%), whereas ESAK is increased by 3.07 mGy (+41%). Dose values of both techniques meet the recommendations for maximum dose in mammography.
Moore, A. C.; DeLucca, J. F.; Elliott, D. M.; Burris, D. L.
2016-01-01
This paper describes a new method, based on a recent analytical model (Hertzian biphasic theory (HBT)), to simultaneously quantify cartilage contact modulus, tension modulus, and permeability. Standard Hertzian creep measurements were performed on 13 osteochondral samples from three mature bovine stifles. Each creep dataset was fit for material properties using HBT. A subset of the dataset (N = 4) was also fit using Oyen's method and FEBio, an open-source finite element package designed for soft tissue mechanics. The HBT method demonstrated statistically significant sensitivity to differences between cartilage from the tibial plateau and cartilage from the femoral condyle. Based on the four samples used for comparison, no statistically significant differences were detected between properties from the HBT and FEBio methods. While the finite element method is considered the gold standard for analyzing this type of contact, the expertise and time required to set up and solve a model can be prohibitive, especially for large datasets. The HBT method agreed quantitatively with FEBio but also offers ease of use by nonexperts, rapid solutions, and exceptional fit quality (R2 = 0.999 ± 0.001, N = 13). PMID:27536012
NASA Astrophysics Data System (ADS)
Callahan, Brendan E.
There is a distinct divide between theory and practice in American science education. Research indicates that a constructivist philosophy, in which students construct their own knowledge, is conducive to learning, while in many cases teachers continue to present science in a more traditional manner. This study sought to explore possible relationships between a socioscientific issues based curriculum and three outcome variables: nature of science understanding, reflective judgment, and argumentation skill. Both quantitative and qualitative methods were used to examine both whole-class differences as well as individual differences between the beginning and end of a semester of high school Biology I. Results indicated that the socioscientific issues based curriculum did not produce statistically significant changes over the course of one semester. However, the treatment group scored better on all three instruments than the comparison group. The small sample size may have contributed to the inability to find statistical significance in this study. The qualitative interviews did indicate that some students provided more sophisticated views on nature of science and reflective judgment, and were able to provide slightly more complex argumentation structures. Theoretical implications regarding the explicit use of socioscientific issues in the classroom are presented.
Sparse intervertebral fence composition for 3D cervical vertebra segmentation
NASA Astrophysics Data System (ADS)
Liu, Xinxin; Yang, Jian; Song, Shuang; Cong, Weijian; Jiao, Peifeng; Song, Hong; Ai, Danni; Jiang, Yurong; Wang, Yongtian
2018-06-01
Statistical shape models are capable of extracting shape prior information and are usually utilized to assist the segmentation of medical images. However, such models require large training datasets in the case of multi-object structures, and it is also difficult to achieve satisfactory results for complex shapes. This study proposes a novel statistical model for cervical vertebra segmentation, called sparse intervertebral fence composition (SiFC), which reconstructs the boundary between adjacent vertebrae by modeling intervertebral fences. The complex shape of the cervical spine is replaced by a simple intervertebral fence, which considerably reduces the difficulty of cervical segmentation. The final segmentation results are obtained by using a 3D active contour deformation model without shape constraint, which substantially enhances the recognition capability of the proposed method for objects with complex shapes. The proposed segmentation framework is tested on a dataset of CT images from 20 patients. A quantitative comparison against the corresponding reference vertebral segmentation yields an overall mean absolute surface distance of 0.70 mm and a Dice similarity index of 95.47% for cervical vertebral segmentation. The experimental results show that the SiFC method achieves competitive cervical vertebral segmentation performance and completely eliminates inter-process overlap.
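For reference, the Dice similarity index used above is the standard overlap measure between a segmentation and its reference mask; a minimal sketch on simulated binary volumes:

```python
# Sketch: Dice similarity index between two binary segmentation masks.
import numpy as np

def dice(seg: np.ndarray, ref: np.ndarray) -> float:
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())

rng = np.random.default_rng(2)
ref = rng.random((64, 64, 64)) > 0.7       # simulated reference mask
seg = ref.copy()
flip = rng.random(ref.shape) < 0.02        # perturb 2% of voxels
seg[flip] = ~seg[flip]
print(f"Dice = {dice(seg, ref):.4f}")
```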
NASA Astrophysics Data System (ADS)
Mussen, Kimberly S.
This quantitative research study evaluated the effectiveness of employing pedagogy based on the theory of multiple intelligences (MI). Currently, not all students are performing at the rate mandated by the government. When schools do not meet the required state standards, the school is labeled as not achieving adequate yearly progress (AYP), which may lead to the loss of funding. Any school not achieving AYP would be interested in this study. Due to low state standardized test scores in the district for science, student achievement and attitudes towards learning science were evaluated with a pretest, posttest, essay question, and one attitudinal survey. Statistical significance existed for one of the four research questions. Using analysis of covariance (ANCOVA) for data analysis, student attitudes towards learning science were found to be statistically significantly improved in the MI (experimental) group. No statistical significance was found in student achievement on the posttest, delayed posttest, or the essay question test. Social change can result from this study because studying the effects of multiple intelligence theory incorporated into classroom instruction can have a significant effect on how children learn, allowing them to compete in a knowledge society.
Miller, M; Coville, B; Abou-Madi, N; Olsen, J
1999-03-01
Serum samples from captive giraffe (Giraffa camelopardalis) were tested to assess passive transfer of immunoglobulins using in vitro methods developed for domestic ruminants. Estimated immunoglobulin levels were compared using five tests (protein electrophoresis, total protein refractometry, zinc sulfate turbidity, glutaraldehyde coagulation, and sodium sulfite turbidity). A linear relationship was observed among total protein, gamma globulin (electrophoretic measurement), and immunoglobulin level based on spectrophotometric measurement of zinc sulfate turbidity. Nonquantitative assays also demonstrated statistical correlation with the quantitative methods. Using criteria similar to those established for domestic species, cutoff values for failure of passive transfer (FPT) were established for these tests in neonatal giraffe: 1) total protein <6.0 g/dl; 2) gamma globulin < 0.5 g/dl; 3) estimated immunoglobulin level < 1,000 mg/dl (zinc sulfate turbidity); 4) glutaraldehyde coagulation test negative; or 5) no visually detectable turbidity in 16% sodium sulfite or Bova-S negative. Retrospective examination of the medical histories showed a strong statistical association between animals designated as having FPT and those that were removed from their dams based on clinical assessment to be hand-reared. Application of these tests in the field should allow earlier detection and intervention for FPT in neonatal giraffe.
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis method, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment relies on different criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e., Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
High pressure rinsing system comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Sertore; M. Fusetti; P. Michelato
2007-06-01
High pressure rinsing (HPR) is a key process for the surface preparation of high field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows quantitative parameters that characterize the HPR water jet to be collected. In this paper, we present a quantitative comparison of the different water jets produced by various nozzles routinely used in different laboratories for the HPR process.
2007-01-05
The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison.
Zhao, Binbin; Chen, Wei; Jiang, Rui; Zhang, Rui; Wang, Yan; Wang, Ling; Gordon, Lynn; Chen, Ling
2015-09-01
The purpose of this study was to evaluate the cytokine expression profile of specific IL-1 family members in the aqueous humor and sera of patients with HLA-B27 associated acute anterior uveitis (AAU) and idiopathic AAU. Following informed consent, a total of 13 patients with HLA-B27 associated AAU, 12 patients with idiopathic AAU and 9 controls were recruited to this study from May 2013 to July 2014. Each individual received a complete ophthalmologic examination. Aqueous humor and sera samples were collected and 11 inflammation-related cytokines of the IL-1 family (IL-1α, IL-1β, IL-1 receptor antagonist [IL-1Ra], IL-18, IL-36 receptor antagonist [IL-36Ra], IL-33, IL-36α, IL-36β, IL-36γ, IL-37, IL-38) were quantitatively measured and analyzed for statistical significance between groups. The degree of inflammation, anterior chamber cell or flare, correlated with expression of IL-1β, IL-1Ra, and IL-18. The highest levels of IL-1β, IL-1Ra, IL-18, and IL-36Ra were seen in the aqueous of patients with HLA-B27 associated AAU; this was statistically significant when compared to the controls, but not to idiopathic AAU. Expression of IL-18 was statistically higher in the aqueous of patients with HLA-B27 associated AAU in comparison to either idiopathic AAU or controls, but this may reflect greater inflammation in this patient group. In the sera, only IL-1α was statistically higher in the HLA-B27 associated AAU group in comparison to the controls. Cytokine analysis reveals elevation of multiple IL-1 family members in the aqueous humor of patients with AAU as compared to controls. This specific signature of inflammation may potentially be useful in developing future therapies for AAU. Copyright © 2015 Elsevier Ltd. All rights reserved.
Grid Resolution Study over Operability Space for a Mach 1.7 Low Boom External Compression Inlet
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.
2014-01-01
This paper presents a statistical methodology whereby the probability limits associated with CFD grid resolution of inlet flow analysis can be determined, providing quantitative information on the distribution of that error over the specified operability range. The objective of this investigation is to quantify the effects of both random (precision) and systemic (biasing) errors associated with grid resolution in the analysis of the Lockheed Martin Company (LMCO) N+2 Low Boom external compression supersonic inlet. The study covers the entire operability space as defined previously by the High Speed Civil Transport (HSCT) High Speed Research (HSR) program goals. The probability limits, in terms of a 95.0% confidence interval on the analysis data, were evaluated for four ARP1420 inlet metrics, namely (1) total pressure recovery (PFAIP), (2) radial hub distortion (DPH/P), (3) radial tip distortion (DPT/P), and (4) circumferential distortion (DPC/P). In general, the resulting +/-0.95 delta Y interval was unacceptably large in comparison to the stated goals of the HSCT program. Therefore, the conclusion was reached that the "standard grid" size was insufficient for this type of analysis. However, in examining the statistical data, it was determined that the CFD analysis results at the outer fringes of the operability space were the determining factor in the measure of statistical uncertainty. Adequate grids are grids that are free of biasing (systemic) errors and exhibit low random (precision) errors in comparison to their operability goals. In order to be 100% certain that the operability goals have indeed been achieved for each of the inlet metrics, the Y+/-0.95 delta Y limit must fall inside the stated operability goals. For example, if the operability goal for DPC/P circumferential distortion is <=0.06, then the forecast Y for DPC/P plus the 95% confidence interval on DPC/P, i.e. +/-0.95 delta Y, must be less than or equal to 0.06.
Higher-order nonclassicalities of finite dimensional coherent states: A comparative study
NASA Astrophysics Data System (ADS)
Alam, Nasir; Verma, Amit; Pathak, Anirban
2018-07-01
Conventional coherent states (CSs) are defined in various ways. For example, a CS is defined as an infinite Poissonian expansion in Fock states, as a displaced vacuum state, or as an eigenket of the annihilation operator. In the infinite dimensional Hilbert space, these definitions are equivalent. However, they are not equivalent for finite dimensional systems. In this work, we present a comparative description of the lower- and higher-order nonclassical properties of finite dimensional CSs, which are also referred to as qudit CSs (QCSs). For the comparison, nonclassical properties of two types of QCSs are used: (i) nonlinear QCS produced by applying a truncated displacement operator to the vacuum and (ii) linear QCS produced by the Poissonian expansion in Fock states of the CS truncated at the (d - 1)-photon Fock state. The comparison is performed using a set of nonclassicality witnesses (e.g., higher order antibunching, higher order sub-Poissonian statistics, higher order squeezing, the Agarwal-Tara parameter, Klyshko's criterion) and a set of quantitative measures of nonclassicality (e.g., negativity potential, concurrence potential and anticlassicality). The higher order nonclassicality witnesses are found to reveal the existence of higher order nonclassical properties of QCSs for the first time.
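For reference, the linear QCS in (ii) is the normalized truncation of the Glauber expansion; in the notation assumed here, with N_d the normalization constant:

```latex
% Linear qudit coherent state: Poissonian expansion in Fock states,
% truncated at the (d-1)-photon Fock state.
\[
  |\alpha\rangle_d = N_d \sum_{n=0}^{d-1} \frac{\alpha^{n}}{\sqrt{n!}}\,|n\rangle,
  \qquad
  N_d = \Biggl(\sum_{n=0}^{d-1} \frac{|\alpha|^{2n}}{n!}\Biggr)^{-1/2}.
\]
```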
Computerized resources in language therapy with children of the autistic spectrum.
Fernandes, Fernanda Dreux Miranda; Santos, Thaís Helena Ferreira; Amato, Cibelle Albuquerque de la Higuera; Molini-Avejonas, Daniela Regina
2010-01-01
To assess the interference of using computers and specific programs during language therapy on the functional communicative profile and socio-cognitive performance of children of the autistic spectrum. 23 children with ages ranging between 3 and 12 years were individually video recorded prior to and after a set of 10 regular language therapy sessions (i.e. a total of two video samples per subject) using computerized games according to the child's choice. The following expressions were used by the therapists to describe the children's performance during the use of computers: more attentive, more communicative initiatives, more eye contact, more interactive, more verbalizations, more attention and more action requests. Qualitative and quantitative progress was identified, although without statistical significance. This progress was observed after a time period shorter than that usually required for this kind of comparison, which seems to be a promising result. More controlled associations and comparisons were not possible due to the heterogeneity of the groups, so more definite conclusions cannot be drawn. It was clear, however, that the subjects presented different reactions to the use of computerized resources during language therapy.
AlKhalidi, Bashar A; Shtaiwi, Majed; AlKhatib, Hatim S; Mohammad, Mohammad; Bustanji, Yasser
2008-01-01
A fast and reliable method for the determination of repaglinide is highly desirable to support formulation screening and quality control. A first-derivative UV spectroscopic method was developed for the determination of repaglinide in tablet dosage form and for dissolution testing. First-derivative UV absorbance was measured at 253 nm. The developed method was validated for linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ) in comparison to the U.S. Pharmacopeia (USP) column high-performance liquid chromatographic (HPLC) method. The first-derivative UV spectrophotometric method showed excellent linearity [correlation coefficient (r) = 0.9999] in the concentration range of 1-35 microg/mL and precision (relative standard deviation < 1.5%). The LOD and LOQ were 0.23 and 0.72 microg/mL, respectively, and good recoveries were achieved (98-101.8%). Statistical comparison of results of the first-derivative UV spectrophotometric and the USP HPLC methods using the t-test showed that there was no significant difference between the 2 methods. Additionally, the method was successfully used for the dissolution test of repaglinide and was found to be reliable, simple, fast, and inexpensive.
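The statistical comparison described can be sketched as a paired t-test on determinations by the two methods; the paired design, sample size, and error levels in this Python sketch are assumptions.

```python
# Sketch: t-test comparison of two analytical methods on the same samples.
# All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_conc = rng.uniform(1, 35, size=12)        # µg/mL, hypothetical samples
uv = true_conc * rng.normal(1.0, 0.01, 12)     # first-derivative UV results
hplc = true_conc * rng.normal(1.0, 0.01, 12)   # reference HPLC results

t, p = stats.ttest_rel(uv, hplc)               # paired design assumed
print(f"t = {t:.3f}, p = {p:.3f} -> "
      f"{'no significant difference' if p >= 0.05 else 'methods differ'}")
```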
Assessment of apically extruded debris produced by the self-adjusting file system.
De-Deus, Gustavo André; Nogueira Leal Silva, Emmanuel João; Moreira, Edson Jorge; de Almeida Neves, Aline; Belladonna, Felipe Gonçalves; Tameirão, Michele
2014-04-01
This study was designed to quantitatively evaluate the amount of apically extruded debris produced by the Self-Adjusting File system (SAF; ReDent-Nova, Ra'anana, Israel). Hand and rotary instruments were used as references for comparison. Sixty mesial roots of mandibular molars were randomly assigned to 3 groups (n = 20). The root canals were instrumented with hand files using a crown-down technique. The ProTaper (Dentsply Maillefer, Ballaigues, Switzerland) and SAF systems were used according to the manufacturers' instructions. Sodium hypochlorite was used as an irrigant, and the apically extruded debris was collected in preweighed glass vials and dried afterward. The mean weight of debris was assessed with a microbalance and statistically analyzed using 1-way analysis of variance and the post hoc Tukey multiple comparison test. Hand file instrumentation produced significantly more debris compared with the ProTaper and SAF systems (P < .05). The ProTaper system produced significantly more debris compared with the SAF system (P < .05). Under the conditions of this study, all systems caused apical debris extrusion. SAF instrumentation was associated with less debris extrusion compared with the use of hand and rotary files. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
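The analysis pipeline named above (one-way ANOVA followed by Tukey's post hoc test) can be sketched as follows; the group means, spreads, and units are hypothetical.

```python
# Sketch: one-way ANOVA on debris mass across three instrumentation groups,
# followed by Tukey's HSD post hoc test. All data are simulated.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(4)
hand = rng.normal(1.2, 0.30, 20)       # mg of extruded debris, hypothetical
protaper = rng.normal(0.8, 0.20, 20)
saf = rng.normal(0.5, 0.15, 20)

f, p = stats.f_oneway(hand, protaper, saf)
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

data = np.concatenate([hand, protaper, saf])
groups = ["hand"] * 20 + ["ProTaper"] * 20 + ["SAF"] * 20
print(pairwise_tukeyhsd(data, groups, alpha=0.05))
```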
NASA Astrophysics Data System (ADS)
Guilloteau, C.; Foufoula-Georgiou, E.; Kummerow, C.; Kirstetter, P. E.
2017-12-01
A multiscale approach is used to compare precipitation fields retrieved from GMI using the latest version of the GPROF algorithm (GPROF-2017) to the DPR fields over the globe. Using a wavelet-based spectral analysis, which renders the multiscale decompositions of the original fields independent of each other spatially and across scales, we quantitatively assess the various scales of variability of the retrieved fields, and thus define the spatially variable "effective resolution" (ER) of the retrievals. Globally, a strong agreement is found between passive microwave and radar patterns at scales coarser than 80 km. Over oceans, the patterns match down to the 20 km scale. Over land, comparison statistics are spatially heterogeneous. In most areas a strong discrepancy is observed between passive microwave and radar patterns at scales finer than 40-80 km. The comparison is also supported by ground-based observations over the continental US derived from the NOAA/NSSL MRMS suite of products. While larger discrepancies over land than over oceans are classically explained by the complex surface emissivity of land perturbing the passive microwave retrieval, other factors are investigated here, such as intricate differences in storm structure over oceans and land. Differences in the statistical properties (PDF of intensities and spatial organization) of precipitation fields over land and oceans are assessed from radar data, as are differences in the relation between the 89 GHz brightness temperature and precipitation. Moreover, the multiscale approach allows quantifying the part of the discrepancies caused by mismatches in the location of intense cells and by instrument-related geometric effects. The objective is to diagnose shortcomings of current retrieval algorithms so that targeted improvements can be made to achieve over land the same retrieval performance as over oceans.
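A minimal sketch of the core multiscale step, assuming the PyWavelets package: decompose two fields, compute the detail-coefficient energy at each scale, and compare energies scale by scale. The simulated fields and the Haar wavelet are illustrative choices only; the actual GPROF/DPR processing chain is not reproduced.

```python
# Sketch: wavelet-based multiscale comparison of two 2D precipitation fields.
# Fields are simulated; requires the PyWavelets package (pywt).
import numpy as np
import pywt

rng = np.random.default_rng(5)
field_radar = rng.gamma(2.0, 1.0, size=(256, 256))              # reference field
field_pmw = field_radar + rng.normal(0, 0.5, size=(256, 256))   # noisy retrieval

def detail_energy(field, wavelet="haar", level=5):
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    # coeffs[0] is the coarse approximation; each following entry is a
    # (horizontal, vertical, diagonal) detail tuple, finest scale last.
    return [sum(float((c ** 2).sum()) for c in d) for d in coeffs[1:]]

for lvl, (a, b) in enumerate(zip(detail_energy(field_radar),
                                 detail_energy(field_pmw)), start=1):
    print(f"scale level {lvl} (coarse -> fine): energy ratio pmw/radar = {b / a:.2f}")
```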
ERIC Educational Resources Information Center
Madhere, Serge
An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…
On a Calculus-Based Statistics Course for Life Science Students
ERIC Educational Resources Information Center
Watkins, Joseph C.
2010-01-01
The choice of pedagogy in statistics should take advantage of the quantitative capabilities and scientific background of the students. In this article, we propose a model for a statistics course that assumes student competency in calculus and a broadening knowledge in biology. We illustrate our methods and practices through examples from the…
ERIC Educational Resources Information Center
Theoret, Julie M.; Luna, Andrea
2009-01-01
This action research combined qualitative and quantitative techniques to investigate two different types of writing assignments in an introductory undergraduate statistics course. The assignments were written in response to the same set of prompts but in two different ways: homework journal assignments or initial posts to a computer discussion…
Pertuz, Said; McDonald, Elizabeth S; Weinstein, Susan P; Conant, Emily F; Kontos, Despina
2016-04-01
To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board-approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration-cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging-based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VDB estimates from DBT and MR imaging were not significant (P = .26). Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment.
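The r (95% CI) values reported above are conventionally obtained via the Fisher z-transform; a minimal sketch on simulated paired density estimates, with n = 68 matching the cohort size:

```python
# Sketch: 95% confidence interval for a Pearson correlation via the
# Fisher z-transform. Paired data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = rng.normal(size=68)                        # e.g. DBT-based VBD estimates
y = 0.9 * x + rng.normal(scale=0.5, size=68)   # e.g. MR-based VBD estimates

r, _ = stats.pearsonr(x, y)
se = 1.0 / np.sqrt(len(x) - 3)
z = np.arctanh(r)                              # Fisher transform
lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)
print(f"r = {r:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```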
A quantitative assessment of patient and nurse outcomes of bedside nursing report implementation.
Sand-Jecklin, Kari; Sherman, Jay
2014-10-01
To quantify outcomes of a practice change to a blended form of bedside nursing report. The literature identifies several benefits of bedside nursing shift report. However, published studies have not adequately quantified outcomes related to this process change, having either small or unreported sample sizes or not testing for statistical significance. Quasi-experimental pre- and postimplementation design. Seven medical-surgical units in a large university hospital implemented a blend of recorded and bedside nursing report. Outcomes monitored included patient and nursing satisfaction, patient falls, nursing overtime and medication errors. We found statistically significant improvements postimplementation in four patient survey items specifically impacted by the change to bedside report. Nursing perceptions of report were significantly improved in the areas of patient safety and involvement in care and nurse accountability postimplementation. However, there was a decline in nurse perception that report took a reasonable amount of time after bedside report implementation; contrary to these perceptions, there was no significant increase in nurse overtime. Patient falls at shift change decreased substantially after the implementation of bedside report. An intervening variable during the study period invalidated the comparison of medication errors pre- and postintervention. There was some indication from both patients and nurses that bedside report was not always consistently implemented. Several positive outcomes were documented in relation to the implementation of a blended bedside shift report, with few drawbacks. Nurse attitudes about report at the final data collection were more positive than at the initial postimplementation data collection. If properly implemented, nursing bedside report can result in improved patient and nursing satisfaction and patient safety outcomes. However, managers should involve staff nurses in the implementation process and continue to monitor consistency in report format as well as satisfaction with the process. © 2014 John Wiley & Sons Ltd.
Quantification of liver fat with respiratory-gated quantitative chemical shift encoded MRI.
Motosugi, Utaroh; Hernando, Diego; Bannas, Peter; Holmes, James H; Wang, Kang; Shimakawa, Ann; Iwadate, Yuji; Taviani, Valentina; Rehm, Jennifer L; Reeder, Scott B
2015-11-01
To evaluate free-breathing chemical shift-encoded (CSE) magnetic resonance imaging (MRI) for quantification of hepatic proton density fat-fraction (PDFF). A secondary purpose was to evaluate hepatic R2* values measured using free-breathing quantitative CSE-MRI. Fifty patients (mean age, 56 years) were prospectively recruited and underwent the following four acquisitions to measure PDFF and R2*: 1) conventional breath-hold CSE-MRI (BH-CSE); 2) respiratory-gated CSE-MRI using respiratory bellows (BL-CSE); 3) respiratory-gated CSE-MRI using navigator echoes (NV-CSE); and 4) single voxel MR spectroscopy (MRS) as the reference standard for PDFF. Image quality was evaluated by two radiologists. MRI-PDFF values measured from the three CSE-MRI methods were compared with MRS-PDFF using linear regression. The PDFF and R2* values were compared using two one-sided t-tests (TOST) to evaluate statistical equivalence. There was no significant difference in the image quality scores among the three CSE-MRI methods for either PDFF (P = 1.000) or R2* maps (P = 0.359-1.000). Correlation coefficients (95% confidence interval [CI]) for the PDFF comparisons were 0.98 (0.96-0.99) for BH-, 0.99 (0.97-0.99) for BL-, and 0.99 (0.98-0.99) for NV-CSE. The statistical equivalence test revealed that the mean difference in PDFF and R2* between any two of the three CSE-MRI methods was less than ±1 percentage point (pp) and ±5 s^-1, respectively (P < 0.046). Respiratory-gated CSE-MRI with respiratory bellows or navigator echoes is a feasible method to quantify liver PDFF and R2* and is as valid as the standard breath-hold technique. © 2015 Wiley Periodicals, Inc.
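A minimal sketch of the equivalence test described, assuming statsmodels' ttost_paired with a ±1 percentage-point margin on simulated paired PDFF values:

```python
# Sketch: two one-sided t-tests (TOST) for equivalence of paired PDFF
# measurements within ±1 percentage point. Data are simulated.
import numpy as np
from statsmodels.stats.weightstats import ttost_paired

rng = np.random.default_rng(7)
pdff_bh = rng.uniform(1, 30, size=50)                 # breath-hold PDFF (%)
pdff_nv = pdff_bh + rng.normal(0.0, 0.4, size=50)     # navigator-gated PDFF

# Null hypothesis: |mean difference| >= 1 pp; rejecting it implies equivalence.
p_overall, lower_test, upper_test = ttost_paired(pdff_nv, pdff_bh, low=-1.0, upp=1.0)
print(f"TOST p = {p_overall:.4f} -> "
      f"{'equivalent within ±1 pp' if p_overall < 0.05 else 'equivalence not shown'}")
```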
NASA Astrophysics Data System (ADS)
Xu, Zhonghua; Zhu, Lie; Sojka, Jan; Kokoszka, Piotr; Jach, Agnieszka
2008-08-01
A wavelet-based index of storm activity (WISA) has recently been developed [Jach, A., Kokoszka, P., Sojka, L., Zhu, L., 2006. Wavelet-based index of magnetic storm activity. Journal of Geophysical Research 111, A09215, doi:10.1029/2006JA011635] to complement the traditional Dst index. The new index can be computed automatically using a wavelet-based statistical procedure, without human intervention in the selection of quiet days or the removal of secular variations. In addition, the WISA is flexible with respect to data stretch and has a higher temporal resolution (1 min), which can provide a better description of the dynamical variations of magnetic storms. In this work, we perform a systematic assessment of the WISA index. First, we statistically compare the WISA to the Dst for various quiet and disturbed periods and analyze the differences in their spectral features. Then we quantitatively assess the flexibility of the WISA with respect to data stretch and study the effects of a varying number of stations on the index. In addition, the ability of the WISA to handle missing data is also quantitatively assessed. The assessment results show that the hourly averaged WISA index can describe storm activity as well as the Dst index, but its full automation, high flexibility with respect to data stretch, ease of using data from a varying number of stations, high temporal resolution, and high tolerance to missing data from individual stations can be very valuable and essential for real-time monitoring of the dynamical variations of magnetic storm activity and space weather applications, thus significantly complementing the existing Dst index.
Al JABBARI, Youssef S.; TSAKIRIDIS, Peter; ELIADES, George; AL-HADLAQ, Solaiman M.; ZINELIS, Spiros
2012-01-01
Objective The aim of this study was to quantify the surface area, volume and specific surface area of endodontic files employing quantitative X-ray micro computed tomography (mXCT). Material and Methods Three sets (six files each) of the Flex-Master Ni-Ti system (Nº 20, 25 and 30, taper .04) were utilized in this study. The files were scanned by mXCT. The surface area and volume of all files were determined from the cutting tip up to 16 mm. The data for surface area, volume and specific surface area were statistically evaluated using one-way ANOVA and the SNK multiple comparison test at α=0.05, employing the file size as the discriminating variable. The correlation of surface area and volume with nominal ISO sizes was tested employing linear regression analysis. Results The surface area and volume of Nº 30 files showed the highest values, followed by Nº 25 and Nº 20, and the differences were statistically significant. The Nº 20 files showed a significantly higher specific surface area compared to Nº 25 and Nº 30. The increase in surface area and volume towards higher file sizes follows a linear relationship with the nominal ISO sizes (r2=0.930 for surface area and r2=0.974 for volume, respectively). The results indicated that the surface area and volume demonstrated an almost linear increase while the specific surface area exhibited an abrupt decrease towards higher sizes. Conclusions This study demonstrates that mXCT can be effectively applied to discriminate very small differences in the geometrical features of endodontic micro-instruments, while providing quantitative information on their geometrical properties. PMID:23329248
Kraft, Indra; Schreiber, Jan; Cafiero, Riccardo; Metere, Riccardo; Schaadt, Gesa; Brauer, Jens; Neef, Nicole E; Müller, Bent; Kirsten, Holger; Wilcke, Arndt; Boltze, Johannes; Friederici, Angela D; Skeide, Michael A
2016-12-01
Recent studies suggest that neurobiological anomalies are already detectable in pre-school children with a family history of developmental dyslexia (DD). However, there is a lack of longitudinal studies showing a direct link between those differences at a preliterate age and the subsequent literacy difficulties seen in school. It is also not clear whether the prediction of DD in pre-school children can be significantly improved when considering neurobiological predictors, compared to models based on behavioral literacy precursors only. We recruited 53 pre-reading children either with (N=25) or without a family risk of DD (N=28). Quantitative T1 MRI data and literacy precursor abilities were assessed at kindergarten age. A subsample of 35 children was tested for literacy skills either one or two years later, that is, either in first or second grade. The group comparison of quantitative T1 measures revealed significantly higher T1 intensities in the left anterior arcuate fascicle (AF), suggesting reduced myelin concentration in preliterate children at risk of DD. A logistic regression showed that DD can be predicted significantly better (p=.024) when neuroanatomical differences between groups are used as predictors (80%) compared to a model based on behavioral predictors only (63%). The Wald statistic confirmed that the T1 intensity of the left AF is a statistically significant predictor of DD (p<.05). Our longitudinal results provide evidence for the hypothesis that neuroanatomical anomalies in children with a family risk of DD are related to subsequent problems in acquiring literacy. In particular, solid white matter organization in the left anterior arcuate fascicle seems to play a pivotal role. Copyright © 2016 Elsevier Inc. All rights reserved.
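A minimal sketch of the model comparison described: logistic regression with behavioral predictors only versus behavioral plus a neuroanatomical predictor, scored by cross-validated accuracy. All data, effect sizes, and variable names are simulated and hypothetical.

```python
# Sketch: does adding a neuroanatomical predictor improve dyslexia prediction?
# All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n = 35
behavior = rng.normal(size=(n, 2))              # e.g. phonological precursor scores
t1_af = rng.normal(size=(n, 1))                 # e.g. left-AF T1 intensity
latent = 0.8 * behavior[:, 0] + 1.5 * t1_af[:, 0]
dd = (latent + rng.normal(scale=1.0, size=n)) > 0   # dyslexia outcome

acc_behav = cross_val_score(LogisticRegression(), behavior, dd, cv=5).mean()
acc_full = cross_val_score(LogisticRegression(),
                           np.hstack([behavior, t1_af]), dd, cv=5).mean()
print(f"behavioral only: {acc_behav:.0%}; behavioral + T1: {acc_full:.0%}")
```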
Stelzer, Erin A.; Strickler, Kriston M.; Schill, William B.
2012-01-01
During summer and early fall 2010, 15 river samples and 6 fecal-source samples were collected in West Virginia. These samples were analyzed by three laboratories for three microbial source tracking (MST) markers: AllBac, a general fecal indicator; BacHum, a human-associated fecal indicator; and BoBac, a ruminant-associated fecal indicator. MST markers were analyzed by means of the quantitative polymerase chain reaction (qPCR) method. The aim was to assess interlaboratory precision when the three laboratories used the same MST marker and shared deoxyribonucleic acid (DNA) extracts of the samples, but different equipment, reagents, and analyst experience levels. The term assay refers to both the markers and the procedural differences listed above. Interlaboratory precision was best for all three MST assays when the geometric mean absolute relative percent difference (ARPD) and Friedman's statistical test were used as measures of interlaboratory precision. Adjustment factors (one for each MST assay) were calculated using results from fecal-source samples analyzed by all three laboratories and applied retrospectively to sample concentrations to account for differences in qPCR results among labs using different standards and procedures. Following the application of adjustment factors to qPCR results, ARPDs were lower; however, statistically significant differences between labs were still observed for the BacHum and BoBac assays. This was a small study, and two of the MST assays had 52 percent of samples with concentrations at or below the limit of accurate quantification; hence, more testing could be done to determine whether the adjustment factors would work better if the majority of sample concentrations were above the quantification limit.
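The two precision summaries named above, geometric mean ARPD between laboratory pairs and Friedman's test across laboratories, can be sketched as follows on simulated concentrations:

```python
# Sketch: interlaboratory precision via geometric mean absolute relative
# percent difference (ARPD) and Friedman's test. Data are simulated.
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
true = rng.uniform(3, 7, size=21)                  # hypothetical log10 copies
labs = {name: true + rng.normal(0, sd, 21)
        for name, sd in [("lab1", 0.20), ("lab2", 0.25), ("lab3", 0.30)]}

def geometric_mean_arpd(a, b):
    rpd = 200.0 * np.abs(a - b) / (a + b)          # relative percent difference
    return stats.gmean(rpd)

for a, b in combinations(labs, 2):
    print(f"{a} vs {b}: geometric mean ARPD = {geometric_mean_arpd(labs[a], labs[b]):.1f}%")

chi2, p = stats.friedmanchisquare(*labs.values())
print(f"Friedman: chi2 = {chi2:.2f}, p = {p:.4f}")
```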
Dimova, Violeta; Oertel, Bruno G; Lötsch, Jörn
2017-01-01
Skin sensitivity to sensory stimuli varies among different body areas. A standardized clinical quantitative sensory testing (QST) battery, established for the diagnosis of neuropathic pain, was used to assess whether the magnitude of differences between test sites reaches clinical significance. Ten different sensory QST measures derived from thermal and mechanical stimuli were obtained from 21 healthy volunteers (10 men) and used to create somatosensory profiles bilaterally from the dorsum of the hands (the standard area for the assessment of normative values for the upper extremities, as proposed by the German Research Network on Neuropathic Pain) and bilaterally at the volar forearms as a neighboring nonstandard area. The parameters obtained were statistically compared between test sites. Three of the 10 QST parameters differed significantly with respect to body area, namely warmth detection, thermal sensory limen, and mechanical pain thresholds. After z-transformation and interpretation according to the QST battery's standard instructions, 22 abnormal values were obtained at the hand. Applying the same procedure to parameters assessed at the nonstandard forearm site, that is, z-transforming them against the reference values for the hand, 24 measurement values emerged as abnormal, which was not significantly different from the hand (P=0.4185). Sensory differences between neighboring body areas are statistically significant, reproducing prior knowledge. This has to be considered in scientific assessments where a small variation of the tested body areas may not be an option. However, the magnitude of these differences was below the difference in sensory parameters that is judged as abnormal, indicating robustness of the QST instrument against protocol deviations with respect to the test area when using the method of comparison with a 95% confidence interval of a reference dataset.
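The z-transformation step described can be sketched as follows; the reference mean/SD and the 95%-interval abnormality cutoff are hypothetical stand-ins for the published hand-dorsum norms.

```python
# Sketch: z-transform QST values against reference norms and count values
# falling outside the 95% interval. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(10)
ref_mean, ref_sd = 1.8, 0.4                        # hypothetical hand norms
forearm = rng.normal(2.0, 0.4, size=42)            # nonstandard-site measurements

z = (forearm - ref_mean) / ref_sd
abnormal = np.abs(z) > 1.96                        # outside the 95% interval
print(f"abnormal values: {abnormal.sum()} of {len(z)}")
```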
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and less than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.
Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain
2015-10-01
Clustering is a set of statistical learning techniques aimed at finding structure in heterogeneous data by partitioning it into homogeneous groups called clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in nature, especially in medicine, are often based on heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
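A minimal sketch, under simplifying assumptions, of the projection strategy described: PCA on the quantitative variables, followed by regression of a binary-coded qualitative variable onto the retained component scores.

```python
# Sketch: PCA on quantitative variables, then projection (regression) of a
# binary-coded qualitative variable onto the PC scores. Data are simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
n = 813
quant = rng.normal(size=(n, 6))                          # quantitative measurements
qual = (quant[:, 0] + rng.normal(scale=0.8, size=n) > 0).astype(float)  # binary-coded

scores = PCA(n_components=3).fit_transform(quant)        # reduced subspace
reg = LinearRegression().fit(scores, qual)
print(f"R^2 of qualitative variable on PC scores: {reg.score(scores, qual):.3f}")
```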
ERIC Educational Resources Information Center
Delaval, Marine; Michinov, Nicolas; Le Bohec, Olivier; Le Hénaff, Benjamin
2017-01-01
The aim of this study was to examine how social or temporal-self comparison feedback, delivered in real-time in a web-based training environment, could influence the academic performance of students in a statistics examination. First-year psychology students were given the opportunity to train for a statistics examination during a semester by…
Chahal, Gurparkash Singh; Chhina, Kamalpreet; Chhabra, Vipin; Bhatnagar, Rakhi; Chahal, Amna
2014-01-01
Background: A surface smear layer consisting of organic and inorganic material is formed on the root surface following mechanical instrumentation and may inhibit the formation of new connective tissue attachment to the root surface. Modification of the tooth surface by root conditioning has resulted in improved connective tissue attachment and has advanced the goal of reconstructive periodontal treatment. Aim: The aim of this study was to compare the effects of citric acid, tetracycline, and doxycycline on instrumented, periodontally involved root surfaces in vitro using a scanning electron microscope. Settings and Design: A total of 45 dentin samples obtained from 15 extracted, scaled, and root-planed teeth were divided into three groups. Materials and Methods: The root conditioning agents were applied with cotton pellets using the passive burnishing technique for 5 minutes. The samples were then examined by scanning electron microscopy. Statistical Analysis Used: The statistical analysis was carried out using the Statistical Package for the Social Sciences (SPSS Inc., Chicago, IL, version 15.0 for Windows). For all quantitative variables, means and standard deviations were calculated and compared. For comparisons of more than two groups, ANOVA was applied. For multiple comparisons, post hoc tests with Bonferroni correction were used. Results: Upon statistical analysis, the root conditioning agents used in this study were found to be effective in removing the smear layer, uncovering and widening the dentin tubules, and unmasking the dentin collagen matrix. Conclusion: Tetracycline HCl was found to be the best root conditioner among the three agents used. PMID:24744541
Turbulent/non-turbulent interfaces detected in DNS of incompressible turbulent boundary layers
NASA Astrophysics Data System (ADS)
Watanabe, T.; Zhang, X.; Nagata, K.
2018-03-01
The turbulent/non-turbulent interface (TNTI) detected in direct numerical simulations is studied for incompressible, temporally developing turbulent boundary layers at momentum thickness Reynolds number Reθ ≈ 2000. The outer edge of the TNTI layer is detected as an isosurface of the vorticity magnitude, with the threshold determined from the dependence of the turbulent volume on the threshold level. The spanwise vorticity magnitude and passive scalar are shown to be good markers of turbulent fluid, as the conditional statistics on distance from the outer edge of the TNTI layer are almost identical to those obtained with the vorticity magnitude. Significant differences are observed in the conditional statistics between the TNTI detected by the kinetic energy and that detected by the vorticity magnitude. A widely used grid setting determined solely from the wall unit results in insufficient resolution in the streamwise direction in the outer region, whose influence is found in the geometry of the TNTI and in the vorticity jump across the TNTI layer. The present results suggest that the grid spacing should be similar in the streamwise and spanwise directions. Comparison of the TNTI layer among different flows requires appropriate normalization of the conditional statistics. Reference quantities of the turbulence near the TNTI layer are obtained by averaging over turbulent fluid in the intermittent region. The conditional statistics normalized by these reference turbulence characteristics show good quantitative agreement between the turbulent boundary layer and a planar jet when plotted against the distance from the outer edge of the TNTI layer divided by the Kolmogorov scale defined for turbulent fluid in the intermittent region.
NASA Astrophysics Data System (ADS)
Santiago-Lona, Cynthia V.; Hernández-Montes, María del Socorro; Mendoza-Santoyo, Fernando; Esquivel-Tejeda, Jesús
2018-02-01
The study and quantification of the tympanic membrane (TM) displacements add important information to advance the knowledge about the hearing process. A comparative statistical analysis between two commonly used demodulation methods employed to recover the optical phase in digital holographic interferometry, namely the fast Fourier transform and phase-shifting interferometry, is presented as applied to study thin tissues such as the TM. The resulting experimental TM surface displacement data are used to contrast both methods through the analysis of variance and F tests. Data are gathered when the TMs are excited with continuous sound stimuli at levels 86, 89 and 93 dB SPL for the frequencies of 800, 1300 and 2500 Hz under the same experimental conditions. The statistical analysis shows repeatability in z-direction displacements with a standard deviation of 0.086, 0.098 and 0.080 μm using the Fourier method, and 0.080, 0.104 and 0.055 μm with the phase-shifting method at a 95% confidence level for all frequencies. The precision and accuracy are evaluated by means of the coefficient of variation; the results with the Fourier method are 0.06143, 0.06125, 0.06154 and 0.06154, 0.06118, 0.06111 with phase-shifting. The relative error between both methods is 7.143, 6.250 and 30.769%. On comparing the measured displacements, the results indicate that there is no statistically significant difference between both methods for frequencies at 800 and 1300 Hz; however, errors and other statistics increase at 2500 Hz.
Bartoo, G T; Nochlin, D; Chang, D; Kim, Y; Sumi, S M
1997-05-01
Using image analysis techniques to quantify the percentage area covered by the immunopositive marker for amyloid beta-peptide (A beta), we examined subjects with combinations of either early-onset or late-onset Alzheimer disease (AD) and either familial Alzheimer disease (FAD) or sporadic Alzheimer disease (SAD). We measured the mean and maximum A beta loads in the hippocampus of each subject. There were no statistically significant differences in the mean A beta load between familial and sporadic AD subjects. Although sample sizes were too small for statistical testing, subjects with the epsilon 4/epsilon 4 allele of the apolipoprotein E (ApoE) gene had higher mean A beta loads than those with the epsilon 3/epsilon 3 or epsilon 3/epsilon 4 alleles. Members of the Volga German families (recently linked to chromosome 1) all had high mean A beta loads, and one of the chromosome 14-linked subjects had the highest mean A beta load while the other had a relatively small load, but the sample was too small for statistical comparisons. The duration of dementia and neuropsychological test scores showed a statistically significant correlation with the mean A beta load in the hippocampus, but not with the maximum A beta load. This difference indicates that the mean A beta load may be a more useful feature than the maximum A beta load as an objective neuropathological measure for cognitive status. This finding may help to improve the established methods for quantitative assessment of the neuropathological changes in AD.
DAnTE: a statistical tool for quantitative analysis of –omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
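As a concrete illustration of the peptide-to-protein rollup step that DAnTE provides, the sketch below applies a median rollup to a toy table. This is a generic re-implementation in Python/pandas under assumed column names, not DAnTE's actual (R-based) code or API.

```python
# Hedged sketch of a median peptide-to-protein rollup on log2 intensities.
import pandas as pd

peptides = pd.DataFrame({
    "protein":  ["P1", "P1", "P1", "P2", "P2"],
    "sample_A": [20.1, 19.8, 20.5, 15.2, 15.0],
    "sample_B": [21.0, 20.6, 21.3, 15.1, 14.8],
})  # hypothetical log2 peptide ion intensities

# Roll peptide intensities up to the protein level with the median,
# one common choice among the rollup methods such tools offer:
proteins = peptides.groupby("protein")[["sample_A", "sample_B"]].median()
print(proteins)
```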
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, F.W.; Pagano, C.; Schneiderman, B.
1959-07-01
Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily during data interpretation; others, however, are more difficult to detect. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing significant effects of the factors involved. (auth)
Yin, Xiaoming; Guo, Yang; Li, Weiguo; Huo, Eugene; Zhang, Zhuoli; Nicolai, Jodi; Kleps, Robert A.; Hernando, Diego; Katsaggelos, Aggelos K.; Omary, Reed A.
2012-01-01
Purpose: To demonstrate the feasibility of using chemical shift magnetic resonance (MR) imaging fat-water separation methods for quantitative estimation of transcatheter lipiodol delivery to liver tissues. Materials and Methods: Studies were performed in accordance with institutional Animal Care and Use Committee guidelines. Proton nuclear MR spectroscopy was first performed to identify lipiodol spectral peaks and relative amplitudes. Next, phantoms were constructed with increasing lipiodol-water volume fractions. A multiecho chemical shift–based fat-water separation method was used to quantify lipiodol concentration within each phantom. Six rats served as controls; 18 rats underwent catheterization with digital subtraction angiography guidance for intraportal infusion of a 15%, 30%, or 50% by volume lipiodol-saline mixture. MR imaging measurements were used to quantify lipiodol delivery to each rat liver. Lipiodol concentration maps were reconstructed by using both single-peak and multipeak chemical shift models. Intraclass and Spearman correlation coefficients were calculated for statistical comparison of MR imaging–based lipiodol concentration and volume measurements to reference standards (known lipiodol phantom compositions and the infused lipiodol dose during rat studies). Results: Both single-peak and multipeak measurements were well correlated to phantom lipiodol concentrations (r2 > 0.99). Lipiodol volume measurements were progressively and significantly higher when comparing between animals receiving different doses (P < .05 for each comparison). MR imaging–based lipiodol volume measurements strongly correlated with infused dose (intraclass correlation coefficients > 0.93, P < .001) with both single- and multipeak approaches. Conclusion: Chemical shift MR imaging fat-water separation methods can be used for quantitative measurements of lipiodol delivery to liver tissues. © RSNA, 2012 PMID:22623693
NASA Astrophysics Data System (ADS)
Fan, X.; Chen, L.; Ma, Z.
2010-12-01
Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the process of downscaling, and that these variables are dynamically consistent because they are constrained by physical laws. While generating high resolution regional climate, the large scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and the conclusions are therefore controversial. Alongside a companion work that develops approaches for quantitative assessment of the downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides objectiveness of comparison. Three types of downscaling experiments were performed for a selected month. The first type serves as a baseline, in which the large scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and nudging of different variables in the grid analysis nudging, while in spectral nudging we focus on testing the nudging coefficients and different wave numbers on different model levels.
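At its core, grid analysis nudging adds a Newtonian relaxation term that pulls the model state toward the driving analysis; spectral nudging applies the same idea only to selected large-scale (low-wavenumber) components. The toy sketch below shows the relaxation term for a scalar field; the coefficient G and the fields are hypothetical, not WRF's actual implementation.

```python
# Hedged sketch of grid (analysis) nudging as Newtonian relaxation.
import numpy as np

def step(phi, phi_analysis, tendency, dt, G=3e-4):
    """Advance one time step with the physical tendency plus nudging.

    G [1/s] is the nudging coefficient: larger G pulls the model
    state more strongly toward the driving large-scale analysis.
    """
    return phi + dt * (tendency + G * (phi_analysis - phi))

phi = np.zeros(10)                  # toy model state
phi_an = np.linspace(0.0, 1.0, 10)  # toy large-scale analysis
for _ in range(1000):
    phi = step(phi, phi_an, tendency=np.zeros_like(phi), dt=60.0)
print(phi.round(3))  # the state relaxes toward the analysis
```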
Mayrovitz, H N; Weingrad, D N; Brlit, F; Lopez, L B; Desfor, R
2015-03-01
An easily measured, non-invasive, quantitative estimate of local skin tissue water is useful to assess local lymphedema and its change. One method uses skin tissue dielectric constant (TDC) values, which at 300 MHz depend on free and bound water within the measurement volume. In practice such measurements have been done with a research-type multi-probe, but recently a hand-held compact probe has become available that may be more clinically convenient. Because most available published data are based on multi-probe measurements, it is important to characterize possible differences between devices that, unless known, might lead to ambiguous quantitative comparisons between TDC values. Thus, our purpose was to evaluate potential differences in measured TDC values between multi-probe and compact-probe devices with respect to probe effective sampling depth, anatomical site, and gender, and also to compare compact-probe TDC values measured on women with and without breast cancer (BC). TDC was measured bilaterally on forearms and biceps of 32 male and 32 female volunteers and on 12 female patients awaiting surgery for breast cancer. Results show that 1) TDC values at 2.5 mm depth were significantly less than at 1.5 mm; 2) female TDC values were significantly less than male values; 3) TDC values were not different between females with and without BC; and 4) dominant/non-dominant arm TDC ratios were not significantly different for any probe with respect to gender or arm anatomical site. These findings indicate that probe-type differences in absolute TDC values are present and should be taken into account when TDC values are compared. However, comparisons based on inter-arm TDC ratios are not statistically different among probes with respect to gender or anatomical location.
Arneja, Jugpal S; Narasimhan, Kailash; Bouwman, David; Bridge, Patrick D
2009-12-01
In-training evaluations in graduate medical education have typically been challenging. Although the majority of standardized examination delivery methods have become computer-based, in-training examinations generally remain pencil-paper-based, if they are performed at all. Audience response systems present a novel way to stimulate and evaluate the resident-learner. The purpose of this study was to assess the outcomes of audience response systems testing as compared with traditional testing in a plastic surgery residency program. A prospective 1-year pilot study of 10 plastic surgery residents was performed using audience response systems-delivered testing for the first half of the academic year and traditional pencil-paper testing for the second half. Examination content was based on monthly "Core Quest" curriculum conferences. Quantitative outcome measures included comparison of pretest and posttest and cumulative test scores of both formats. Qualitative outcomes from the individual participants were obtained by questionnaire. When using the audience response systems format, pretest and posttest mean scores were 67.5 and 82.5 percent, respectively; using traditional pencil-paper format, scores were 56.5 percent and 79.5 percent. A comparison of the cumulative mean audience response systems score (85.0 percent) and traditional pencil-paper score (75.0 percent) revealed statistically significantly higher scores with audience response systems (p = 0.01). Qualitative outcomes revealed increased conference enthusiasm, greater enjoyment of testing, and no user difficulties with the audience response systems technology. The audience response systems modality of in-training evaluation captures participant interest and reinforces material more effectively than traditional pencil-paper testing does. The advantages include a more interactive learning environment, stimulation of class participation, immediate feedback to residents, and immediate tabulation of results for the educator. Disadvantages include start-up costs and lead-time preparation.
Ince, P; Irving, D; MacArthur, F; Perry, R H
1991-12-01
A Lewy body dementing syndrome in the elderly has been recently described and designated senile dementia of Lewy body type (SDLT) on the basis of a distinct clinicopathological profile. The pathological changes seen in SDLT include the presence of cortical Lewy bodies (LB) frequently, but not invariably, associated with senile plaque (SP) formation. Whilst neocortical neurofibrillary tangles (NFT) are sparse or absent, a proportion of these cases show involvement of the temporal archicortex by lesions comprising Alzheimer-type pathology (ATP, i.e. NFT, SP and granulovacuolar degeneration [GVD]). Thus the relationship between SDLT and senile dementia of Alzheimer type (SDAT) is complex and controversial. In this study quantitative neuropathology was used to compare the intensity and distribution of ATP in the hippocampus and entorhinal cortex of 53 patients from 3 disease groups (SDLT, SDAT, Parkinson's disease (PD)) and a group of neurologically and mentally normal elderly control patients. For most brain areas examined, the extent of ATP between the patient groups followed the trend SDAT > SDLT > PD > control. Statistical comparison of these groups revealed significant differences between the mean densities of NFT, SP and GVD although individual cases showed considerable variability. These results confirm additional pathological differences between SDAT and SDLT regarding the intensity of involvement of the temporal archicortex by ATP. Many patients with Lewy body disorders (LBdis) show a predisposition to develop ATP albeit in a more restricted distribution (e.g. low or absent neocortical NFT) and at lower densities than is found in SDAT. Some cases of SDLT show minimal SP and NFT formation in both neocortex and archicortex supporting previously published data distinguishing this group from Alzheimer's disease.
Kalra, Mannudeep K; Maher, Michael M; Blake, Michael A; Lucey, Brian C; Karau, Kelly; Toth, Thomas L; Avinash, Gopal; Halpern, Elkan F; Saini, Sanjay
2004-09-01
To assess the effect of noise reduction filters on detection and characterization of lesions on low-radiation-dose abdominal computed tomographic (CT) images. Low-dose CT images of abdominal lesions in 19 consecutive patients (11 women, eight men; age range, 32-78 years) were obtained at reduced tube currents (120-144 mAs). These baseline low-dose CT images were postprocessed with six noise reduction filters; the resulting postprocessed images were then randomly assorted with baseline images. Three radiologists performed independent evaluation of randomized images for presence, number, margins, attenuation, conspicuity, calcification, and enhancement of lesions, as well as image noise. Side-by-side comparison of baseline images with postprocessed images was performed by using a five-point scale for assessing lesion conspicuity and margins, image noise, beam hardening, and diagnostic acceptability. Quantitative noise and contrast-to-noise ratio were obtained for all liver lesions. Statistical analysis was performed by using the Wilcoxon signed rank test, Student t test, and kappa test of agreement. Significant reduction of noise was observed in images postprocessed with filter F compared with the noise in baseline nonfiltered images (P =.004). Although the number of lesions seen on baseline images and that seen on postprocessed images were identical, lesions were less conspicuous on postprocessed images than on baseline images. A decrease in quantitative image noise and contrast-to-noise ratio for liver lesions was noted with all noise reduction filters. There was good interobserver agreement (kappa = 0.7). Although the use of currently available noise reduction filters improves image noise and ameliorates beam-hardening artifacts at low-dose CT, such filters are limited by a compromise in lesion conspicuity and appearance in comparison with lesion conspicuity and appearance on baseline low-dose CT images. Copyright RSNA, 2004
Gartlehner, Gerald; Dobrescu, Andreea; Evans, Tammeka Swinson; Thaler, Kylie; Nussbaumer, Barbara; Sommer, Isolde; Lohr, Kathleen N
2016-01-01
The objective of our study was to use a diverse sample of medical interventions to assess empirically whether first trials rendered substantially different treatment effect estimates than reliable, high-quality bodies of evidence. We used a meta-epidemiologic study design using 100 randomly selected bodies of evidence from Cochrane reports that had been graded as high quality of evidence. To determine the concordance of effect estimates between first and subsequent trials, we applied both quantitative and qualitative approaches. For quantitative assessment, we used Lin's concordance correlation and calculated z-scores; to determine the magnitude of differences of treatment effects, we calculated standardized mean differences (SMDs) and ratios of relative risks. We determined qualitative concordance based on a two-tiered approach incorporating changes in statistical significance and magnitude of effect. First trials both overestimated and underestimated the true treatment effects in no discernible pattern. Nevertheless, depending on the definition of concordance, effect estimates of first trials were concordant with pooled subsequent studies in at least 33% but up to 50% of comparisons. The pooled magnitude of change as bodies of evidence advanced from single trials to high-quality bodies of evidence was 0.16 SMD [95% confidence interval (CI): 0.12, 0.21]. In 80% of comparisons, the difference in effect estimates was smaller than 0.5 SMDs. In first trials with large treatment effects (>0.5 SMD), however, estimates of effect substantially changed as new evidence accrued (mean change 0.68 SMD; 95% CI: 0.50, 0.86). Results of first trials often change, but the magnitude of change, on average, is small. Exceptions are first trials that present large treatment effects, which often dissipate as new evidence accrues. Copyright © 2016 Elsevier Inc. All rights reserved.
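Lin's concordance correlation coefficient, the quantitative concordance measure used above, has a closed form that is easy to compute. The sketch below is a generic implementation with made-up effect estimates, not the authors' data or code.

```python
# Hedged sketch of Lin's concordance correlation coefficient (CCC):
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

first      = [0.30, -0.10, 0.55, 0.20, 0.05]   # hypothetical first-trial SMDs
subsequent = [0.25, -0.05, 0.40, 0.18, 0.10]   # hypothetical pooled SMDs
print(f"CCC = {lins_ccc(first, subsequent):.3f}")
```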
Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu
2014-05-01
The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, constitutes a complex component system. Applications of mathematical statistics methods to research on the compatibility of traditional Chinese medicine formulae have great significance for promoting the modernization of traditional Chinese medicines and for improving clinical efficacy and the optimization of formulae. As a tool for quantitative analysis, data inference and the exploration of inherent rules of substances, mathematical statistics methods can be used to reveal the working mechanisms of the compatibility of traditional Chinese medicine formulae both qualitatively and quantitatively. By reviewing studies based on the application of mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes of chemical components, as well as the rules of incompatibility and contraindication of formulae, and will provide references for further studying and revealing the working mechanisms and the connotations of traditional Chinese medicines.
Poole, Kerry; Mason, Howard
2007-03-15
To establish the relationship between quantitative tests of hand function and upper limb disability, as measured by the Disability of the Arm, Shoulder and Hand (DASH) questionnaire, in hand-arm vibration syndrome (HAVS). A total of 228 individuals with HAVS were included in this study. Each had undergone a full HAVS assessment by an experienced physician, including quantitative tests of vibrotactile and thermal perception thresholds, maximal hand-grip strength (HG) and the Purdue pegboard (PP) test. Individuals were also asked to complete a DASH questionnaire. Of the quantitative tests, PP and HG gave the best, and statistically significant, individual correlations with the DASH disability score (r² = 0.168 and 0.096, respectively). Stepwise linear regression analysis revealed that only PP and HG measurements were statistically significant predictors of upper limb disability (r² = 0.178). Overall, a combination of the PP and HG measurements, rather than each alone, gave slightly better discrimination, although not statistically significant, between normal and abnormal DASH scores, with a sensitivity of 73.1% and specificity of 64.3%. Measurements of manual dexterity and hand-grip strength using PP and HG may be useful in helping to confirm lack of upper limb function and 'perceived' disability in HAVS.
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses
Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow estimates from direct and indirect comparisons to be combined, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons, which can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a spreadsheet for indirect and mixed comparisons that is easy to use for clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
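Bucher's adjusted indirect comparison is simple enough to express in a few lines, which is why a spreadsheet suffices. The sketch below restates the method generically in Python; the log odds ratios and standard errors are hypothetical inputs, not values from the paper.

```python
# Hedged sketch of the Bucher adjusted indirect comparison: with direct
# A-vs-B and C-vs-B estimates on the log scale, the indirect A-vs-C
# estimate is their difference and the variances add.
import math

def bucher(lnOR_AB, se_AB, lnOR_CB, se_CB):
    lnOR_AC = lnOR_AB - lnOR_CB
    se_AC = math.sqrt(se_AB**2 + se_CB**2)
    ci = (lnOR_AC - 1.96 * se_AC, lnOR_AC + 1.96 * se_AC)
    return math.exp(lnOR_AC), tuple(math.exp(c) for c in ci)

or_AC, (lo, hi) = bucher(lnOR_AB=-0.35, se_AB=0.12, lnOR_CB=-0.10, se_CB=0.15)
print(f"indirect OR(A vs C) = {or_AC:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```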
2015-08-01
…the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. SPSS was again used to conduct statistical analysis on the sample; this time, factor analysis was conducted, which attempts to identify underlying constructs. [Cited: 11. IBM SPSS Statistics (2012); 12. Burns, R.B. and Burns, R.A. (2008) Business Research Methods and Statistics using SPSS, p. 432.]
Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.
2016-01-01
A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and 16 patient samples were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
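Digital PCR owes its absolute quantitation to Poisson statistics on the partitioned reaction: the fraction of positive partitions determines the mean number of copies per partition. The sketch below shows this standard conversion with hypothetical partition counts and volume; it is not code from either vendor's analysis software.

```python
# Hedged sketch of digital PCR quantitation via Poisson statistics:
# lambda = -ln(1 - p), where p is the fraction of positive partitions.
import math

def dpcr_copies_per_ul(n_positive, n_total, partition_vol_nl):
    p = n_positive / n_total
    lam = -math.log(1.0 - p)                 # mean copies per partition
    return lam / (partition_vol_nl * 1e-3)   # copies per microliter

# Hypothetical run: 4200 of 20000 partitions positive, 0.85 nL partitions.
print(f"{dpcr_copies_per_ul(4200, 20000, 0.85):.0f} copies/uL")
```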
Purified oocysts of Cryptosporidium parvum were used to evaluate applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection treated water. Propidium monoazide-qPCR targeting hsp70 gene was compared to reverse transcription (RT)-...
3D-QSAR analysis of MCD inhibitors by CoMFA and CoMSIA.
Pourbasheer, Eslam; Aalizadeh, Reza; Ebadi, Amin; Ganjali, Mohammad Reza
2015-01-01
Three-dimensional quantitative structure-activity relationships were developed for a series of compounds acting as malonyl-CoA decarboxylase (MCD) antagonists using the CoMFA and CoMSIA methods. The statistical parameters for the CoMFA (q² = 0.558, r² = 0.841) and CoMSIA (q² = 0.615, r² = 0.870) models were derived from a training set of 38 compounds on the basis of the selected alignment. The external predictive abilities of the built models were evaluated using a test set of nine compounds. From the results obtained, the CoMSIA method was found to have higher predictive capability than the CoMFA method. Based on the CoMSIA and CoMFA contour maps, some features that can enhance the activity of compounds as MCD antagonists were identified and used to design new compounds with better inhibitory activity.
NASA Astrophysics Data System (ADS)
Shanmugavadivu, P.; Eliahim Jeevaraj, P. S.
2014-06-01
The Adaptive Iterated Function Systems (AIFS) filter presented in this paper has an outstanding potential to attenuate fixed-value impulse noise in images. This filter has two distinct phases, namely noise detection and noise correction, which use measures of statistics and Iterated Function Systems (IFS), respectively. The performance of the AIFS filter is assessed by three metrics, namely Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity Index (MSSIM) and Human Visual Perception (HVP). The quantitative measures PSNR and MSSIM endorse the merit of this filter in terms of the degree of noise suppression and details/edge preservation, respectively, in comparison with the high performing filters reported in the recent literature. The qualitative measure HVP confirms the noise suppression ability of the devised filter. This computationally simple noise filter broadly finds application wherever images are highly degraded by fixed-value impulse noise.
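PSNR, the first of the metrics above, is a one-line formula over the mean squared error. The sketch below is a generic implementation on synthetic images (MSSIM would typically come from a library such as scikit-image and is not shown).

```python
# Hedged sketch of the PSNR metric: 10*log10(MAX^2 / MSE).
import numpy as np

def psnr(reference, restored, max_val=255.0):
    mse = np.mean((reference.astype(float) - restored.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val**2 / mse)

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64))                    # synthetic image
noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255)
print(f"PSNR = {psnr(ref, noisy):.1f} dB")
```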
An X-ray and optical study of the cluster of galaxies Abell 754
NASA Technical Reports Server (NTRS)
Fabricant, D.; Beers, T. C.; Geller, M. J.; Gorenstein, P.; Huchra, J. P.
1986-01-01
X-ray and optical data for A754 are used to study the relative distribution of the luminous and dark matter in this dense, rich cluster of galaxies with X-ray luminosity comparable to that of the Coma Cluster. A quantitative statistical comparison is made of the galaxy positions with the total mass responsible for maintaining the X-ray emitting gas in hydrostatic equilibrium. A simple bimodal model which fits both the X-ray and optical data suggests that the galaxies are distributed consistently with the projected matter distribution within the region covered by the X-ray map (0.5-1 Mpc). The X-ray and optical estimates of the mass in the central region of the cluster are 2.9 × 10^14 and (3.6 ± 0.5) × 10^14 solar masses, respectively.
Alonso, E; Rubio, A; March, J C; Danet, A
2011-01-01
The aim of this study is to compare the emotional climate, quality of communication and performance indicators in a clinical management unit and two traditional hospital services. This was a quantitative study based on a 94-question questionnaire administered to 83 health professionals (63 responders) from the clinical management unit of breast pathology and the hospital services of medical oncology and radiation oncology. The analysis used descriptive statistics, comparison of means, correlation and linear regression models. The clinical management unit reached higher values than the hospital services on performance indicators, emotional climate, internal communication and evaluation of the leadership. An important gap between existing and desired sources, channels, media and subjects of communication appeared in both the clinical management unit and the traditional services. The clinical management organization promotes better internal communication and interpersonal relations, leading to improved performance indicators. Copyright © 2011 SECA. Published by Elsevier España. All rights reserved.
Pathway Towards Fluency: Using 'disaggregate instruction' to promote science literacy
NASA Astrophysics Data System (ADS)
Brown, Bryan A.; Ryoo, Kihyun; Rodriguez, Jamie
2010-07-01
This study examines the impact of Disaggregate Instruction on students' science learning. Disaggregate Instruction is the idea that science teaching and learning can be separated into conceptual and discursive components. Using randomly assigned experimental and control groups, 49 fifth-grade students received web-based science lessons on photosynthesis using our experimental approach. We supplemented quantitative statistical comparisons of students' performance on pre- and post-test questions (multiple choice and short answer) with a qualitative analysis of students' post-test interviews. The results revealed that students in the experimental group outscored their control group counterparts across all measures. In addition, students taught using the experimental method demonstrated an improved ability to write using scientific language as well as an improved ability to provide oral explanations using scientific language. This study has important implications for how science educators can prepare teachers to teach diverse student populations.
TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.
Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D
2018-05-08
Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
NASA Astrophysics Data System (ADS)
van Poppel, Bret; Owkes, Mark; Nelson, Thomas; Lee, Zachary; Sowell, Tyler; Benson, Michael; Vasquez Guzman, Pablo; Fahrig, Rebecca; Eaton, John; Kurman, Matthew; Kweon, Chol-Bum; Bravo, Luis
2014-11-01
In this work, we present high-fidelity Computational Fluid Dynamics (CFD) results of liquid fuel injection from a pressure-swirl atomizer and compare the simulations to experimental results obtained using both shadowgraphy and phase-averaged X-ray computed tomography (CT) scans. The CFD and experimental results focus on the dense near-nozzle region to identify the dominant mechanisms of breakup during primary atomization. Simulations are performed using the NGA code of Desjardins et al (JCP 227 (2008)) and employ the volume of fluid (VOF) method proposed by Owkes and Desjardins (JCP 270 (2013)), a second order accurate, un-split, conservative, three-dimensional VOF scheme providing second order density fluxes and capable of robust and accurate high density ratio simulations. Qualitative features and quantitative statistics are assessed and compared for the simulation and experimental results, including the onset of atomization, spray cone angle, and drop size and distribution.
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
Ion specific correlations in bulk and at biointerfaces.
Kalcher, I; Horinek, D; Netz, R R; Dzubiella, J
2009-10-21
Ion specific effects are ubiquitous in any complex colloidal or biological fluid in bulk or at interfaces. The molecular origins of these 'Hofmeister effects' are not well understood and their theoretical description poses a formidable challenge to the modeling and simulation community. On the basis of the combination of atomistically resolved molecular dynamics (MD) computer simulations and statistical mechanics approaches, we present a few selected examples of specific electrolyte effects in bulk, at simple neutral and charged interfaces, and on a short α-helical peptide. The structural complexity in these strongly Coulomb-correlated systems is highlighted and analyzed in the light of available experimental data. While in general the comparison of MD simulations to experiments often lacks quantitative agreement, mostly because molecular force fields and coarse-graining procedures remain to be optimized, the consensus as regards trends provides important insights into microscopic hydration and binding mechanisms.
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
On a Calculus-based Statistics Course for Life Science Students
2010-01-01
The choice of pedagogy in statistics should take advantage of the quantitative capabilities and scientific background of the students. In this article, we propose a model for a statistics course that assumes student competency in calculus and a broadening knowledge in biology. We illustrate our methods and practices through examples from the curriculum. PMID:20810962
Montague, J R; Frei, J K
1993-04-01
To determine whether significant correlations existed among quantitative and qualitative predictors of students' academic success and quantitative outcomes of such success over a 12-year period in a small university's premedical program. A database was assembled from information on the 199 graduates who earned BS degrees in biology from Barry University's School of Natural and Health Sciences from 1980 through 1991. The quantitative variables were year of BS degree, total score on the Scholastic Aptitude Test (SAT), various measures of undergraduate grade-point averages (GPAs), and total score on the Medical College Admission Test (MCAT); and the qualitative variables were minority (54% of the students) or majority status and transfer (about one-third of the students) or nontransfer status. The statistical methods were multiple analysis of variance and stepwise multiple regression. Statistically significant positive correlations were found among SAT total scores, final GPAs, biology GPAs versus nonbiology GPAs, and MCAT total scores. These correlations held for transfer versus nontransfer students and for minority versus majority students. Over the 12-year period there were significant fluctuations in mean MCAT scores. The students' SAT scores and GPAs proved to be statistically reliable predictors of MCAT scores, but the minority or majority status and the transfer or nontransfer status of the students were statistically insignificant.
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
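A Ward's-linkage cluster analysis of the kind mentioned above is readily reproduced with scipy. The sketch below uses a hypothetical table of major-element compositions, standardized before clustering as is usual for geochemical data; it is a generic illustration, not the authors' workflow.

```python
# Hedged sketch of hierarchical clustering with Ward's linkage rule.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

X = np.array([  # rows: samples; columns: e.g. SiO2, MgO, K2O (wt%, invented)
    [65.1, 2.1, 3.2], [64.8, 2.3, 3.0], [58.2, 4.9, 2.1],
    [57.9, 5.2, 2.0], [65.5, 1.9, 3.3],
])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each oxide

Z = linkage(Xs, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # e.g. a dacitic group vs an andesitic group
```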
A quantitative comparison of leading-edge vortices in incompressible and supersonic flows
DOT National Transportation Integrated Search
2002-01-14
When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...
Behavioral Assembly Required: Particularly for Quantitative Courses
ERIC Educational Resources Information Center
Mazen, Abdelmagid
2008-01-01
This article integrates behavioral approaches into the teaching and learning of quantitative subjects with application to statistics. Focusing on the emotional component of learning, the article presents a system dynamic model that provides descriptive and prescriptive accounts of learners' anxiety. Metaphors and the metaphorizing process are…
Power Analysis Software for Educational Researchers
ERIC Educational Resources Information Center
Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar
2012-01-01
Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…
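For readers unfamiliar with the practice whose reporting is examined above, an a priori power analysis typically looks like the sketch below, which solves for the per-group sample size of a two-sample t-test using statsmodels. The effect size, alpha, and power values are conventional choices, not values from the reviewed studies.

```python
# Hedged sketch of an a priori power analysis: sample size per group
# needed to detect a medium effect (Cohen's d = 0.5) at 80% power.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5, alpha=0.05, power=0.80, ratio=1.0)
print(f"n per group ≈ {n_per_group:.0f}")  # about 64
```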
In a previously published study, quantitative relationships were developed between landscape metrics and sediment contamination for 25 small estuarine systems within Chesapeake Bay. Nonparametric statistical analysis (rank transformation) was used to develop an empirical relation...
Lee, Eugene; Choi, Jung-Ah; Oh, Joo Han; Ahn, Soyeon; Hong, Sung Hwan; Chai, Jee Won; Kang, Heung Sik
2013-09-01
To retrospectively evaluate fatty degeneration (FD) of rotator cuff muscles on CTA using Goutallier's grading system and quantitative measurements, with comparison between pre- and postoperative states. IRB approval was obtained for this study. Two radiologists independently reviewed pre- and postoperative CTAs of 43 patients (24 males and 19 females; mean age, 58.1 years) with 46 shoulders confirmed as full-thickness tears with random distribution. FD of the supraspinatus, infraspinatus/teres minor, and subscapularis was assessed using Goutallier's system and by quantitative measurements of Hounsfield units (HUs) on sagittal images. Changes in FD grades and HUs were compared between pre- and postoperative CTAs and analyzed with respect to preoperative tear size and postoperative cuff integrity. The correlations between qualitative grades and quantitative measurements and their inter-observer reliabilities were also assessed. There was a statistically significant correlation between FD grades and HU measurements of all muscles on pre- and postoperative CTA (p < 0.05). Inter-observer reliability of fatty degeneration grades was excellent to substantial on both pre- and postoperative CTA for the supraspinatus (0.8685 and 0.8535) and subscapularis muscles (0.7777 and 0.7972), but fair for the infraspinatus/teres minor muscles (0.5791 and 0.5740); quantitative Hounsfield unit measurements, however, showed excellent reliability for all muscles (ICC: 0.7950 and 0.9346 for SST, 0.7922 and 0.8492 for SSC, and 0.9254 and 0.9052 for IST/TM). No muscle showed improvement of fatty degeneration after surgical repair on qualitative or quantitative assessment, and there was no difference in the change of fatty degeneration after surgical repair according to preoperative tear size or postoperative cuff integrity (p > 0.05). The average dose-length product (DLP) was 365.2 mGy · cm (range, 323.8-417.2 mGy · cm) and the estimated average effective dose was 5.1 mSv. Goutallier grades correlated well with HUs of rotator cuff muscles. Reliability was excellent for both systems, except for the FD grade of the IST/TM muscles, which may be more reliably assessed using quantitative measurements.
Ji, Qinqin; Salomon, Arthur R.
2015-01-01
The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive immune response. Although T cell receptor signaling has been extensively studied, the fundamental mechanisms of signal initiation are not fully understood. Reduced temperature initiated some of the hallmarks of TCR signaling, such as increased phosphorylation and activation of ERK and calcium release from the endoplasmic reticulum, as well as coalescence of T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity between the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that T cell cold treatment was sufficient to initiate signaling patterns nearly identical to soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225
Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.
Zauber, Henrik; Schulze, Waltraud X
2012-11-02
The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in the past years. Quantitative analysis by large scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions at the protein level. Postprocessing and combining the peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries of different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis, including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparison between treatments. Results are presented in editable graphic formats and in list files.
Filograna, Laura; Magarelli, Nicola; Leone, Antonio; Guggenberger, Roman; Winklhofer, Sebastian; Thali, Michael John; Bonomo, Lorenzo
2015-09-01
The aim of this ex vivo study was to assess the performance of monoenergetic dual-energy CT (DECT) reconstructions to reduce metal artefacts in bodies with orthopedic devices in comparison with standard single-energy CT (SECT) examinations in forensic imaging. Forensic and clinical impacts of this study are also discussed. Thirty metallic implants in 20 consecutive cadavers with metallic implants underwent both SECT and DECT with a clinically suitable scanning protocol. Extrapolated monoenergetic DECT images at 64, 69, 88, 105, 120, and 130 keV and individually adjusted monoenergy for optimized image quality (OPTkeV) were generated. Image quality of the seven monoenergetic images and of the corresponding SECT image was assessed qualitatively and quantitatively by visual rating and measurements of attenuation changes induced by streak artefact. Qualitative and quantitative analyses showed statistically significant differences between monoenergetic DECT extrapolated images and SECT, with improvements in diagnostic assessment in monoenergetic DECT at higher monoenergies. The mean value of OPTkeV was 137.6 ± 4.9 with a range of 130 to 148 keV. This study demonstrates that monoenergetic DECT images extrapolated at high energy levels significantly reduce metallic artefacts from orthopedic implants and improve image quality compared to SECT examination in forensic imaging.
Mander, Luke; Li, Mao; Mio, Washington; Fowlkes, Charless C; Punyasena, Surangi W
2013-11-07
Taxonomic identification of pollen and spores uses inherently qualitative descriptions of morphology. Consequently, identifications are restricted to categories that can be reliably classified by multiple analysts, resulting in the coarse taxonomic resolution of the pollen and spore record. Grass pollen represents an archetypal example; it is not routinely identified below family level. To address this issue, we developed quantitative morphometric methods to characterize surface ornamentation and classify grass pollen grains. This produces a means of quantifying morphological features that are traditionally described qualitatively. We used scanning electron microscopy to image 240 specimens of pollen from 12 species within the grass family (Poaceae). We classified these species by developing algorithmic features that quantify the size and density of sculptural elements on the pollen surface, and measure the complexity of the ornamentation they form. These features yielded a classification accuracy of 77.5%. In comparison, a texture descriptor based on modelling the statistical distribution of brightness values in image patches yielded a classification accuracy of 85.8%, and seven human subjects achieved accuracies between 68.33 and 81.67%. The algorithmic features we developed directly relate to biologically meaningful features of grass pollen morphology, and could facilitate direct interpretation of unsupervised classification results from fossil material.
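The brightness-distribution texture descriptor that performed best above can be approximated very simply: summarize each image patch by the statistical distribution of its grey values and classify against per-class references. The sketch below is a deliberately minimal stand-in, with synthetic patches and a nearest-centroid rule, not the authors' actual descriptor or classifier.

```python
# Hedged sketch of brightness-histogram texture classification.
import numpy as np

rng = np.random.default_rng(2)

def patch_features(patch, bins=16):
    # Normalized histogram of grey values as a texture feature vector.
    hist, _ = np.histogram(patch, bins=bins, range=(0, 255), density=True)
    return hist

# Two synthetic "species": brighter vs darker, more variable ornamentation.
species_a = [rng.normal(170, 25, (32, 32)).clip(0, 255) for _ in range(20)]
species_b = [rng.normal(110, 40, (32, 32)).clip(0, 255) for _ in range(20)]

feats = {name: np.array([patch_features(p) for p in patches])
         for name, patches in [("A", species_a), ("B", species_b)]}
centroids = {name: f.mean(axis=0) for name, f in feats.items()}

# Classify an unseen patch by nearest class centroid in feature space.
test = rng.normal(165, 25, (32, 32)).clip(0, 255)
f = patch_features(test)
label = min(centroids, key=lambda n: np.linalg.norm(f - centroids[n]))
print(f"classified as species {label}")
```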
Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?
NASA Astrophysics Data System (ADS)
Renner, Jonathan Christopher
Some teens may prefer using a self-directed, constructivist, and technologic approach to learning rather than traditional classroom instruction. If such a preference can be demonstrated to affect learning, educators may adjust their teaching methodology. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest and posttest methodology, while ensuring that the individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental design with a comparison group and an experimental group; inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction caused posttest scores to increase significantly relative to pretest scores, but it was not as effective as traditional classroom instruction. Scores improved under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality using mobile handheld digital devices that affected individual students' learning outcomes without group collaboration. This topic is important to the field of education as it may help educators understand how students learn, and it may also change the way students are taught.
Shi, Lei; Shuai, Jian; Xu, Kui
2014-08-15
Fire and explosion accidents of steel oil storage tanks (FEASOST) occur occasionally during petroleum and chemical industry production and storage processes and often have a devastating impact on lives, the environment and property. To contribute towards the development of a quantitative approach for assessing the occurrence probability of FEASOST, a fault tree of FEASOST is constructed that identifies various potential causes. Traditional fault tree analysis (FTA) can achieve quantitative evaluation if failure data for all of the basic events (BEs) are available, which is almost impossible due to the lack of detailed data, as well as other uncertainties. This paper makes an attempt to perform FTA of FEASOST by a hybrid application of an expert-elicitation-based improved analytic hierarchy process (AHP) and fuzzy set theory, and the occurrence possibility of FEASOST is estimated for an oil depot in China. A comparison between statistical data and data calculated using fuzzy fault tree analysis (FFTA) based on traditional and improved AHP is also made. Sensitivity and importance analysis has been performed to identify the most crucial BEs leading to FEASOST, which will provide insights into how managers should focus effective mitigation. Copyright © 2014 Elsevier B.V. All rights reserved.
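To make the FFTA machinery concrete, the sketch below propagates triangular fuzzy possibilities through OR and AND gates componentwise, a common simplification in the fuzzy fault tree literature. The gate structure and event values are invented for illustration and are not the paper's oil-depot fault tree.

```python
# Hedged sketch of fuzzy gate propagation over triangular fuzzy numbers
# represented as (low, mode, high) triples.
import numpy as np

def gate_or(events):
    """OR gate: P = 1 - prod(1 - p_i), applied per fuzzy component."""
    events = np.asarray(events, float)
    return 1.0 - np.prod(1.0 - events, axis=0)

def gate_and(events):
    """AND gate: P = prod(p_i), applied per fuzzy component."""
    return np.prod(np.asarray(events, float), axis=0)

leak     = (1e-4, 5e-4, 1e-3)   # hypothetical expert-elicited possibilities
ignition = (1e-3, 5e-3, 1e-2)
vapor    = (1e-2, 5e-2, 1e-1)

fire = gate_and([gate_or([leak, vapor]), ignition])
print("fuzzy possibility of fire (low, mode, high):", fire)
```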
Two High-Resolution, Quantitative, Infrared Spectral Libraries for Atmospheric Chemistry
NASA Astrophysics Data System (ADS)
Johnson, T. J.; Sharpe, S. W.; Sams, R. L.; Chu, P. M.
2001-12-01
The Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) are independently creating quantitative, 0.10 cm^-1 resolution, infrared spectral libraries of vapor phase compounds. Both libraries contain many species of use to the gas-phase spectroscopist, including for atmospheric chemistry. The NIST library will consist of approximately 100 vapor phase spectra primarily associated with volatile hazardous air pollutants (HAPs) and suspected greenhouse gases, whereas the PNNL library will consist of approximately 400 vapor phase spectra associated with DOE's remediation mission. Data are being recorded from 600 to 6500 cm^-1 to cover not only the classical fingerprint region, but much of the near-infrared as well. The wavelength axis is calibrated against published standards. To prepare the samples, the two laboratories use significantly different sample preparation and handling techniques: NIST uses gravimetric dilution and a continuous flowing sample, while PNNL uses partial pressure dilution and a static sample. The data are validated against one another, and agreement on the ordinate axis is generally found to be within the statistical uncertainties (2σ) of the Beer's law fit and less than 3% of the total integrated band areas for the 4 chemicals used in this comparison. The nature of the two databases and the rigorous procedures used to acquire the data will be briefly discussed.
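The Beer's-law fit mentioned above is an ordinary linear calibration of integrated band area against concentration-pathlength. The sketch below shows a minimal version with the slope forced through the origin and a 2σ slope uncertainty; the numbers are invented, not library values.

```python
# Hedged sketch of a Beer's-law calibration: area A = k * c, slope k
# fit by least squares through the origin with a 2-sigma uncertainty.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # ppm-m (hypothetical)
area = np.array([0.11, 0.21, 0.44, 0.85, 1.73]) # integrated absorbance

k = (conc @ area) / (conc @ conc)               # least-squares slope
resid = area - k * conc
se_k = np.sqrt((resid @ resid) / (len(conc) - 1) / (conc @ conc))
print(f"k = {k:.4f} ± {2 * se_k:.4f} (2σ)")
```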
Xia, Qiangwei; Wang, Tiansong; Park, Yoonsuk; Lamont, Richard J.; Hackett, Murray
2009-01-01
Differential analysis of whole cell proteomes by mass spectrometry has largely been applied using various forms of stable isotope labeling. While metabolic stable isotope labeling has been the method of choice, it is often not possible to apply such an approach. Four different label free ways of calculating expression ratios in a classic “two-state” experiment are compared: signal intensity at the peptide level, signal intensity at the protein level, spectral counting at the peptide level, and spectral counting at the protein level. The quantitative data were mined from a dataset of 1245 qualitatively identified proteins, about 56% of the protein encoding open reading frames from Porphyromonas gingivalis, a Gram-negative intracellular pathogen being studied under extracellular and intracellular conditions. Two different control populations were compared against P. gingivalis internalized within a model human target cell line. The q-value statistic, a measure of false discovery rate previously applied to transcription microarrays, was applied to proteomics data. For spectral counting, the most logically consistent estimate of random error came from applying the locally weighted scatter plot smoothing procedure (LOWESS) to the most extreme ratios generated from a control technical replicate, thus setting upper and lower bounds for the region of experimentally observed random error. PMID:19337574
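The LOWESS step described above can be reproduced generically: smooth the absolute log-ratios from a control technical replicate as a function of abundance to obtain an empirical envelope of random error. The sketch below uses simulated data and statsmodels' lowess; it is an illustration of the procedure, not the authors' code.

```python
# Hedged sketch: LOWESS envelope of random error from a control replicate.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(3)
abundance = rng.uniform(1, 6, 500)               # e.g. log10 spectral counts
log_ratio = rng.normal(0, 0.4 / abundance, 500)  # error shrinks with signal

order = np.argsort(abundance)
x = abundance[order]
upper = lowess(np.abs(log_ratio[order]), x, frac=0.3, return_sorted=False)

# In a treatment comparison, ratios falling outside ±upper(x) at a given
# abundance would be candidates for real (non-random) expression change.
print(upper[:5])
```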
Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.
Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan
2016-07-01
Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
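A minimal worked example may make the moment-equation idea concrete. The sketch below integrates the first two moment equations of a linear birth-death process, for which they happen to be exact (so no closure is needed); for nonlinear kinetics a moment closure such as MA, or the SSE, would be required. Parameter values are illustrative only.

```python
# Minimal sketch: mean and variance equations for a linear birth-death
# process (production k1, degradation k2), integrated with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 10.0, 0.5  # hypothetical rate constants

def moments(t, y):
    mean, var = y
    dmean = k1 - k2 * mean                # identical to the RRE
    dvar = k1 + k2 * mean - 2 * k2 * var  # fluctuations around it
    return [dmean, dvar]

sol = solve_ivp(moments, (0.0, 20.0), [0.0, 0.0], dense_output=True)
t = np.linspace(0, 20, 5)
mean, var = sol.sol(t)
print("mean:", np.round(mean, 2))  # RRE-level information
print("var :", np.round(var, 2))   # the extra information MA/SSE exploit
```

Fitting parameters to both the mean and the variance, rather than to the mean alone, is what improves identifiability when only population-average and fluctuation summaries are measured.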
Magnuson, Matthew L; Speth, Thomas F
2005-10-01
Granular activated carbon is a frequently explored technology for removing synthetic organic contaminants from drinking water sources. The success of this technology relies on a number of factors based not only on the adsorptive properties of the contaminant but also on properties of the water itself, notably the presence of substances in the water which compete for adsorption sites. Because it is impractical to perform field-scale evaluations for all possible contaminants, the pore surface diffusion model (PSDM) has been developed and used to predict activated carbon column performance using single-solute isotherm data as inputs. Many assumptions are built into this model to account for kinetics of adsorption and competition for adsorption sites. This work further evaluates and expands the model through the use of quantitative structure-property relationships (QSPRs) to predict the effect of natural organic matter fouling on activated carbon adsorption of specific contaminants. The QSPRs developed are based on a combination of calculated topographical indices and quantum chemical parameters. The QSPRs were evaluated in terms of their statistical predictive ability, the physical significance of the descriptors, and by comparison with field data. The QSPR-enhanced PSDM was judged to give better results than could previously be obtained.
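The QSPR component can be pictured as an ordinary multiple regression from molecular descriptors to a fouling correction factor. The sketch below is a schematic stand-in, not the published model; descriptor names, values, and responses are invented.

```python
# Illustrative QSPR-style regression: relate hypothetical descriptors
# (a connectivity index and a quantum-chemical charge descriptor) to a
# hypothetical fouling correction factor for isotherm capacity.
import numpy as np

# rows: compounds; cols: [intercept, connectivity index, max partial charge]
X = np.array([
    [1.0, 2.41, 0.12],
    [1.0, 3.05, 0.08],
    [1.0, 1.87, 0.21],
    [1.0, 2.96, 0.15],
    [1.0, 3.52, 0.05],
])
y = np.array([0.62, 0.48, 0.75, 0.55, 0.41])  # fouling correction factor

beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```

In the paper's framework, a fitted relationship of this kind would supply fouling-adjusted isotherm parameters as inputs to the PSDM in place of unadjusted single-solute values.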
Cross-Study Homogeneity of Psoriasis Gene Expression in Skin across a Large Expression Range
Kerkof, Keith; Timour, Martin; Russell, Christopher B.
2013-01-01
Background In psoriasis, only limited overlap between sets of genes identified as differentially expressed (psoriatic lesional [PP] vs. psoriatic non-lesional [PN]) was found using statistical and fold-change cut-offs. To provide a framework for utilizing prior psoriasis data sets we sought to understand the consistency of those sets. Methodology/Principal Findings Microarray expression profiling and qRT-PCR were used to characterize gene expression in PP and PN skin from psoriasis patients. cDNA (three new data sets) and cRNA hybridization (four existing data sets) data were compared using a common analysis pipeline. Agreement between data sets was assessed using varying qualitative and quantitative cut-offs to generate a DEG list in a source data set and then using the other data sets to validate the list. Concordance increased from 67% across all probe sets to over 99% across more than 10,000 probe sets when statistical filters were employed. The fold-change behavior of individual genes tended to be consistent across the multiple data sets. We found that genes with <2-fold change values were quantitatively reproducible between pairs of data sets. In a subset of transcripts with a role in inflammation, changes detected by microarray were confirmed by qRT-PCR with high concordance. For transcripts with both PN and PP levels within the microarray dynamic range, microarray and qRT-PCR were quantitatively reproducible, including minimal fold-changes in IL13, TNFSF11, and TNFRSF11B and genes with >10-fold changes in either direction such as CHRM3, IL12B and IFNG. Conclusions/Significance Gene expression changes in psoriatic lesions were consistent across different studies, despite differences in patient selection, sample handling, and microarray platforms, although between-study comparisons showed stronger agreement within than between platforms. We could use cut-offs as low as log10(ratio) = 0.1 (fold-change = 1.26), generating larger gene lists that validate on independent data sets. The reproducibility of PP signatures across data sets suggests that different sample sets can be productively compared. PMID:23308107
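The cut-off-then-validate procedure can be mimicked on simulated data: call a gene list in one study with a fold-change cut-off, then check direction-of-change concordance in a second study. Everything in the sketch below is simulated and purely illustrative.

```python
# Sketch of cross-study validation of a DEG list by fold-change
# direction concordance. All data are simulated.
import numpy as np

rng = np.random.default_rng(1)
true_log10_fc = rng.normal(0, 0.4, size=2000)       # shared biology
study1 = true_log10_fc + rng.normal(0, 0.1, 2000)   # platform noise
study2 = true_log10_fc + rng.normal(0, 0.1, 2000)

for cut in (0.1, 0.3, 0.5):                 # log10(ratio) cut-offs
    called = np.abs(study1) >= cut          # DEG list from source study
    agree = np.sign(study1[called]) == np.sign(study2[called])
    print(f"cut-off {cut}: {called.sum():4d} genes, "
          f"{100 * agree.mean():.1f}% concordant direction")
```

As in the study, lowering the cut-off enlarges the called list while direction-of-change agreement with the validation set degrades only gradually.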
Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G
2001-12-01
The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a now universally accepted technique; however, the literature on the subject is scant. Our study considered 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurements of voice intensity and in the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups at either high- or low-volume speech. In contrast, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.
Social Comparison and Body Image in Adolescence: A Grounded Theory Approach
ERIC Educational Resources Information Center
Krayer, A.; Ingledew, D. K.; Iphofen, R.
2008-01-01
This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…
Target Scattering Metrics: Model-Model and Model-Data Comparisons
2017-12-13
The comparison of target scattering signals from measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons of such signals. Candidate metrics for model-model comparisons are examined here, with a goal of considering raw data prior to its reduction to data products that may be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons.
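One simple candidate metric for raw-signal comparison is the peak normalized cross-correlation between a modeled and a measured signal. The sketch below is hypothetical and not drawn from the report; the signal shapes and noise level are invented.

```python
# Hypothetical signal-comparison metric: peak normalized
# cross-correlation between a modeled echo and a noisy, shifted copy.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
model = np.sin(2 * np.pi * 12 * t) * np.exp(-3 * t)   # modeled echo
measured = np.roll(model, 7) + 0.1 * rng.normal(size=t.size)

xc = np.correlate(measured - measured.mean(), model - model.mean(), "full")
xc /= (np.std(measured) * np.std(model) * t.size)
print(f"peak normalized cross-correlation: {xc.max():.3f}")
```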
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies, including study designs, markers genotyped, and covariates. The effects of genetic variants may also differ from population to population, i.e., heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult, and novel statistical methods are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies.
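As a simplified stand-in for such region-based tests (not the functional linear model itself), one can compare nested linear models with an F-test: covariates only versus covariates plus all variant genotypes in the region. The genotypes, covariate, and trait below are simulated.

```python
# Simplified region-based association test: F-test comparing a
# covariates-only model against covariates plus all variant genotypes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, m = 1000, 8                             # subjects, variants in region
G = rng.binomial(2, 0.2, size=(n, m)).astype(float)
age = rng.normal(50, 10, n)
trait = 0.3 * G[:, 2] + 0.02 * age + rng.normal(size=n)

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

X0 = np.column_stack([np.ones(n), age])    # null: covariates only
X1 = np.column_stack([X0, G])              # alternative: + genotypes
rss0, rss1 = rss(X0, trait), rss(X1, trait)
df1, df2 = m, n - X1.shape[1]
F = ((rss0 - rss1) / df1) / (rss1 / df2)
p = stats.f.sf(F, df1, df2)
print(f"F = {F:.2f}, p = {p:.2e}")
```

The functional linear model goes further by smoothing variant effects as a function of genomic position, which is what lets it pool rare and common variants within the region.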
Jahan, Munira; Lutful Moben, Ahmed; Tabassum, Shahina
2014-01-01
ABSTRACT Background Both real-time polymerase chain reaction (PCR) and the hybrid capture 2 (HC2) assay can detect and quantify hepatitis B virus (HBV) DNA. However, real-time PCR can detect a wide range of HBV DNA levels, while the HC2 assay cannot detect lower levels of viremia. The present study was designed to detect and quantify HBV DNA by real-time PCR and HC2 assay and to compare the quantitative data of the two assays. Materials and methods A cross-sectional study was conducted between July 2010 and June 2011. A total of 66 serologically diagnosed chronic hepatitis B (CHB) patients were selected for the study. Real-time PCR and HC2 assay were done to detect HBV DNA. Data were analyzed with the Statistical Package for the Social Sciences (SPSS). Results Among the 66 serologically diagnosed chronic hepatitis B patients, 40 (60.61%) had detectable and 26 (39.39%) had undetectable HBV DNA by HC2 assay. Concordant results were obtained for 40 (60.61%) of these 66 patients by real-time PCR and HC2 assay, with mean viral loads of 7.06 ± 1.13 log10 copies/ml and 6.95 ± 1.08 log10 copies/ml, respectively. Of the remaining 26 patients, HBV DNA was detectable by real-time PCR in 20 (mean HBV DNA level 3.67 ± 0.72 log10 copies/ml), while in six cases HBV DNA could not be detected by either assay. The study showed a strong correlation (r = 0.915) between real-time PCR and HC2 assay for the detection and quantification of HBV DNA. Conclusion HC2 assay may be used as an alternative to real-time PCR for CHB patients. How to cite this article: Majid F, Jahan M, Moben AL, Tabassum S. Comparison of Hybrid Capture 2 Assay with Real-time-PCR for Detection and Quantitation of Hepatitis B Virus DNA. Euroasian J Hepato-Gastroenterol 2014;4(1):31-35. PMID:29264316
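The headline statistic of such a method comparison is the correlation between paired log10 viral loads, supplemented by the mean paired difference. A minimal sketch with invented values:

```python
# Sketch of a two-assay comparison on paired log10 viral loads.
# All values are invented for illustration.
import numpy as np
from scipy import stats

real_time_pcr = np.array([5.2, 6.1, 7.3, 8.0, 6.8, 5.9, 7.7, 6.4])
hc2_assay     = np.array([5.0, 6.0, 7.1, 8.2, 6.7, 5.7, 7.8, 6.2])

r, p = stats.pearsonr(real_time_pcr, hc2_assay)
bias = np.mean(real_time_pcr - hc2_assay)  # mean paired difference
print(f"r = {r:.3f} (p = {p:.4f}), mean difference = {bias:.2f} log10")
```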
Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides
Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...
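One step where laboratories' qPCR protocols commonly diverge is the standard curve that converts quantification cycle (Cq) values into copy numbers. A generic sketch of that conversion, with hypothetical standards and Cq values:

```python
# Generic qPCR quantification via a standard curve: Cq is linear in
# log10(copies); the fitted line converts sample Cq values to copies.
# Standards and Cq values below are hypothetical.
import numpy as np

log_copies = np.array([2, 3, 4, 5, 6, 7], dtype=float)  # standards
cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])     # measured Cq

slope, intercept = np.polyfit(log_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1   # ~1.0 means 100% efficient
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")

sample_cq = np.array([24.2, 28.7])
sample_log_copies = (sample_cq - intercept) / slope
print("estimated log10 copies:", np.round(sample_log_copies, 2))
```

Differences in standards, curve fits, and extraction efficiency between laboratories propagate directly into the reported Dhc abundances, which is one reason interlaboratory comparisons of this kind are needed.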
Some Epistemological Considerations Concerning Quantitative Analysis
ERIC Educational Resources Information Center
Dobrescu, Emilian
2008-01-01
This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…
Quantitative Graphics in Newspapers.
ERIC Educational Resources Information Center
Tankard, James W., Jr.
The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…
78 FR 48681 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
.... Qualitative and quantitative data will be collected through progress reports, surveys, the health impact tracking tool, and interviews. Quantitative data will be analyzed using descriptive statistics. Qualitative... States (SOTS) online surveys, (3) Interviews, and (4) Online surveys related to the Regional Network...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-01
.... Currie, Program Analyst, Office of policy for Extramural Research Administration, 6705 Rockledge Drive... perceptions and opinions, but are not statistical surveys that yield quantitative results that can be... generic clearance for qualitative information will not be used for quantitative information collections...