Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology, applied in the context of site-specific crop yield response, contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of controlling nematodes site-specifically.
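A core building block behind such spatial econometric analyses is a test for spatial autocorrelation among neighboring yield observations. As a hedged illustration (the grid layout, rook-contiguity weights, and yield values below are hypothetical, not from the paper), Moran's I can be computed as:

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.

    values: 1-D array of observations (e.g., yield per grid cell).
    weights: (n, n) spatial weights matrix; weights[i, j] > 0 when cells
             i and j are neighbors, with a zero diagonal.
    """
    z = values - values.mean()
    n = len(values)
    return n * (z @ weights @ z) / (weights.sum() * (z @ z))

# Hypothetical 3x3 field of yield values with rook-contiguity weights.
vals = np.array([5.1, 5.3, 4.8, 5.0, 5.2, 4.9, 4.7, 5.4, 5.0])
W = np.zeros((9, 9))
for i in range(9):
    r, c = divmod(i, 3)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < 3 and 0 <= cc < 3:
            W[i, rr * 3 + cc] = 1.0

print(morans_i(vals, W))
```

A value well above the null expectation of -1/(n-1) signals positive spatial autocorrelation and motivates spatially explicit yield-response models rather than ordinary least squares.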
Improved specifications for hydraulic cement concrete : final report.
DOT National Transportation Integrated Search
1983-01-01
This is the final report of a study of the application of statistical concepts to specifications for hydraulic cement concrete as used in highway facilities. It reviews the general problems associated with the application of statistical techniques to...
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
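As a hedged sketch of the comparison being reported, the snippet below fits a toy exponential spectrum to low-count Poisson data by minimizing either the chi-squared statistic or Cash's C statistic; the model, the simulated data, and the use of SciPy's Powell routine (standing in for the minimization implementations of the paper) are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
truth = 2.0 * np.exp(-x / 4.0)        # hypothetical source spectrum
data = rng.poisson(truth)             # low counts per bin

def model(p):
    amp, tau = np.maximum(np.abs(p), 1e-9)  # keep the toy model positive
    return amp * np.exp(-x / tau)

def chi2(p):
    m = model(p)
    var = np.maximum(data, 1)         # common per-bin variance estimate
    return np.sum((data - m) ** 2 / var)

def cash_c(p):
    m = model(p)
    # Cash (1979) statistic, dropping the data-only constant term.
    return 2.0 * np.sum(m - data * np.log(m))

for stat in (chi2, cash_c):
    fit = minimize(stat, x0=[1.0, 1.0], method="Powell")
    print(stat.__name__, np.round(np.abs(fit.x), 3))
```

The chi-squared fit needs a per-bin variance estimate (here max(d_i, 1)), which is exactly what biases it at low counts, whereas the C statistic works directly from the Poisson likelihood.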
Chi-squared and C statistic minimization for low count per bin data. [sampling in X ray astronomy
NASA Technical Reports Server (NTRS)
Nousek, John A.; Shue, David R.
1989-01-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
NOVEL STATISTICAL APPROACH TO EVALUATE SPATIAL DISTRIBUTION OF PM FROM SPECIFIC SOURCE CATEGORIES
This task addresses aspects of NRC recommendations 10A and 10B. Positive matrix factorization (PMF) is a new statistical technique for determining the daily contribution to PM mass of specific source categories (auto exhaust, smelters, suspended soil, secondary sulfate, etc.). I...
Teaching the Meaning of Statistical Techniques with Microcomputer Simulation.
ERIC Educational Resources Information Center
Lee, Motoko Y.; And Others
Students in an introductory statistics course are often preoccupied with learning the computational routines of specific summary statistics and thereby fail to develop an understanding of the meaning of those statistics or their conceptual basis. To help students develop a better understanding of the meaning of three frequently used statistics,…
Oelze, Michael L.; Mamou, Jonathan
2017-01-01
Conventional medical imaging technologies, including ultrasound, have continued to improve over the years. For example, in oncology, medical imaging is characterized by high sensitivity, i.e., the ability to detect anomalous tissue features, but the ability to classify these tissue features from images often lacks specificity. As a result, a large number of biopsies of tissues with suspicious image findings are performed each year with a vast majority of these biopsies resulting in a negative finding. To improve specificity of cancer imaging, quantitative imaging techniques can play an important role. Conventional ultrasound B-mode imaging is mainly qualitative in nature. However, quantitative ultrasound (QUS) imaging can provide specific numbers related to tissue features that can increase the specificity of image findings leading to improvements in diagnostic ultrasound. QUS imaging techniques can encompass a wide variety of techniques including spectral-based parameterization, elastography, shear wave imaging, flow estimation and envelope statistics. Currently, spectral-based parameterization and envelope statistics are not available on most conventional clinical ultrasound machines. However, in recent years QUS techniques involving spectral-based parameterization and envelope statistics have demonstrated success in many applications, providing additional diagnostic capabilities. Spectral-based techniques include the estimation of the backscatter coefficient, estimation of attenuation, and estimation of scatterer properties such as the correlation length associated with an effective scatterer diameter and the effective acoustic concentration of scatterers. Envelope statistics include the estimation of the number density of scatterers and quantification of coherent to incoherent signals produced from the tissue. Challenges for clinical application include correctly accounting for attenuation effects and transmission losses and implementation of QUS on clinical devices. Successful clinical and pre-clinical applications demonstrating the ability of QUS to improve medical diagnostics include characterization of the myocardium during the cardiac cycle, cancer detection, classification of solid tumors and lymph nodes, detection and quantification of fatty liver disease, and monitoring and assessment of therapy. PMID:26761606
Salleh, Fatmah M; Al-Mekhlafi, Abdulsalam M; Nordin, Anisah; Yasin, 'Azlin M; Al-Mekhlafi, Hesham M; Moktar, Norhayati
2011-01-01
This study was conducted to evaluate a modification of the usual Gram-chromotrope staining technique developed in-house, known as Gram-chromotrope Kinyoun (GCK), in comparison with the Weber Modified Trichrome (WMT) staining technique as the reference technique. Two hundred and ninety fecal specimens received by the Microbiology Diagnostic Laboratory of Hospital Universiti Kebangsaan Malaysia were examined for the presence of microsporidial spores. The sensitivity and specificity of GCK compared to the reference technique were 98% and 98.3%, respectively. The positive and negative predictive values were 92.5% and 99.6%, respectively. The agreement between the reference technique and the GCK staining technique was statistically significant by kappa statistics (K = 0.941, P < 0.001). It is concluded that the GCK staining technique has high sensitivity and specificity in the detection of microsporidial spores in fecal specimens. Hence, it is recommended to be used in the diagnosis of intestinal microsporidiosis. Copyright © 2011 Elsevier Inc. All rights reserved.
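The reported agreement measures follow from a 2x2 contingency table. In the sketch below the cell counts are back-calculated from the published percentages and the 290-specimen total, so they are a plausible reconstruction rather than data taken from the paper:

```python
# 2x2 table: rows = GCK result, columns = WMT reference result.
tp, fp, fn, tn = 49, 4, 1, 236   # reconstructed counts summing to 290

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)            # positive predictive value
npv = tn / (tn + fn)            # negative predictive value

# Cohen's kappa: observed agreement corrected for chance agreement.
n = tp + fp + fn + tn
po = (tp + tn) / n
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (po - pe) / (1 - pe)
print(sensitivity, specificity, ppv, npv, kappa)
```

These counts give sensitivity 98.0%, specificity 98.3%, PPV 92.5%, NPV 99.6%, and kappa 0.941, consistent with the abstract.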
An Applied Statistics Course for Systematics and Ecology PhD Students
ERIC Educational Resources Information Center
Ojeda, Mario Miguel; Sosa, Victoria
2002-01-01
Statistics education is under review at all educational levels. Statistical concepts, as well as the use of statistical methods and techniques, can be taught in at least two contrasting ways. Specifically, (1) teaching can be theoretically and mathematically oriented, or (2) it can be less mathematically oriented being focused, instead, on…
Goddard trajectory determination subsystem: Mathematical specifications
NASA Technical Reports Server (NTRS)
Wagner, W. E. (Editor); Velez, C. E. (Editor)
1972-01-01
The mathematical specifications of the Goddard trajectory determination subsystem of the flight dynamics system are presented. These specifications include the mathematical description of the coordinate systems, dynamic and measurement model, numerical integration techniques, and statistical estimation concepts.
Oelze, Michael L; Mamou, Jonathan
2016-02-01
Conventional medical imaging technologies, including ultrasound, have continued to improve over the years. For example, in oncology, medical imaging is characterized by high sensitivity, i.e., the ability to detect anomalous tissue features, but the ability to classify these tissue features from images often lacks specificity. As a result, a large number of biopsies of tissues with suspicious image findings are performed each year with a vast majority of these biopsies resulting in a negative finding. To improve specificity of cancer imaging, quantitative imaging techniques can play an important role. Conventional ultrasound B-mode imaging is mainly qualitative in nature. However, quantitative ultrasound (QUS) imaging can provide specific numbers related to tissue features that can increase the specificity of image findings leading to improvements in diagnostic ultrasound. QUS imaging can encompass a wide variety of techniques including spectral-based parameterization, elastography, shear wave imaging, flow estimation, and envelope statistics. Currently, spectral-based parameterization and envelope statistics are not available on most conventional clinical ultrasound machines. However, in recent years, QUS techniques involving spectral-based parameterization and envelope statistics have demonstrated success in many applications, providing additional diagnostic capabilities. Spectral-based techniques include the estimation of the backscatter coefficient (BSC), estimation of attenuation, and estimation of scatterer properties such as the correlation length associated with an effective scatterer diameter (ESD) and the effective acoustic concentration (EAC) of scatterers. Envelope statistics include the estimation of the number density of scatterers and quantification of coherent to incoherent signals produced from the tissue. Challenges for clinical application include correctly accounting for attenuation effects and transmission losses and implementation of QUS on clinical devices. Successful clinical and preclinical applications demonstrating the ability of QUS to improve medical diagnostics include characterization of the myocardium during the cardiac cycle, cancer detection, classification of solid tumors and lymph nodes, detection and quantification of fatty liver disease, and monitoring and assessment of therapy.
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
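The note's central rule, nominal and ordinal data go to nonparametric tests while interval and ratio data go to parametric ones, can be captured in a toy dispatcher; this is a hedged illustration of the principle only, and in practice the usual test assumptions still need checking:

```python
from scipy import stats

def compare_two_groups(a, b, level):
    """Pick a two-sample test from the level of measurement.

    Nominal/ordinal data -> nonparametric; interval/ratio -> parametric.
    A toy dispatcher, not a substitute for verifying test assumptions.
    """
    if level in ("interval", "ratio"):
        return stats.ttest_ind(a, b)        # parametric
    if level == "ordinal":
        return stats.mannwhitneyu(a, b)     # nonparametric, rank-based
    raise ValueError("nominal data: compare counts, e.g. chi-squared test")

print(compare_two_groups([3.1, 2.9, 3.4], [3.6, 3.8, 3.5], "ratio"))
print(compare_two_groups([1, 2, 2, 3], [2, 3, 4, 4], "ordinal"))
```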
Simplified estimation of age-specific reference intervals for skewed data.
Wright, E M; Royston, P
1997-12-30
Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, estimates the entire density, and provides an explicit formula for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
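A stripped-down version of the idea, modeling only the mean and standard deviation as functions of age and omitting the skewness component of the Royston and Wright approach, might look like the following; all data here are synthetic:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
age = rng.uniform(20, 40, 500)                            # hypothetical ages
y = 1.0 + 0.08 * age + rng.normal(0, 0.05 + 0.004 * age)  # measurement

# Model the mean as a polynomial in age.
mean_coef = np.polyfit(age, y, deg=2)
resid = y - np.polyval(mean_coef, age)

# Model the SD via regression of scaled absolute residuals on age.
sd_coef = np.polyfit(age, np.abs(resid) * np.sqrt(np.pi / 2), deg=1)

def reference_interval(a, p=0.95):
    """Age-specific reference interval mean(a) +/- z * sd(a)."""
    mu = np.polyval(mean_coef, a)
    sd = np.polyval(sd_coef, a)
    z = norm.ppf(0.5 + p / 2)
    return mu - z * sd, mu + z * sd

print(reference_interval(30.0))
```

The sqrt(pi/2) factor converts the mean absolute residual of a normal variable into an estimate of its standard deviation.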
A Review of Meta-Analysis Packages in R
ERIC Educational Resources Information Center
Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.
2017-01-01
Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…
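The core computation such packages automate is inverse-variance weighting. A minimal fixed-effect sketch, written here in Python purely for illustration (the effect sizes and variances are hypothetical, and the snippet is not tied to any reviewed R package):

```python
import numpy as np

def fixed_effect_meta(effects, variances):
    """Inverse-variance-weighted fixed-effect meta-analysis."""
    w = 1.0 / np.asarray(variances)
    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical standardized mean differences from five primary studies.
effects = [0.30, 0.45, 0.12, 0.50, 0.28]
variances = [0.04, 0.09, 0.02, 0.12, 0.05]
print(fixed_effect_meta(effects, variances))
```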
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through the use of conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
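Among the conventional techniques named here, the iterative 2σ technique is the most mechanical to reproduce. A minimal sketch with simulated metal concentrations follows; a real survey would add the mineralogical and multivariate checks the authors recommend:

```python
import numpy as np

def iterative_2sigma(x, max_iter=100):
    """Iterative 2-sigma technique for a background range.

    Repeatedly discards values outside mean +/- 2*SD until the retained
    subset stabilises; the final interval is taken as the background range.
    """
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        m, s = x.mean(), x.std(ddof=1)
        kept = x[(x >= m - 2 * s) & (x <= m + 2 * s)]
        if len(kept) == len(x):
            break
        x = kept
    return x.mean() - 2 * x.std(ddof=1), x.mean() + 2 * x.std(ddof=1)

rng = np.random.default_rng(3)
conc = np.concatenate([rng.normal(30, 5, 200),    # background metal levels
                       rng.normal(120, 20, 15)])  # contaminated samples
print(iterative_2sigma(conc))
```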
Exploring Tree Age & Diameter to Illustrate Sample Design & Inference in Observational Ecology
ERIC Educational Resources Information Center
Casady, Grant M.
2015-01-01
Undergraduate biology labs often explore the techniques of data collection but neglect the statistical framework necessary to express findings. Students can be confused about how to use their statistical knowledge to address specific biological questions. Growth in the area of observational ecology requires that students gain experience in…
Computer Techniques for Studying Coverage, Overlaps, and Gaps in Collections.
ERIC Educational Resources Information Center
White, Howard D.
1987-01-01
Describes techniques for using the Statistical Package for the Social Sciences (SPSS) to create tables for cooperative collection development across a number of libraries. Specific commands are given to generate holdings profiles focusing on collection coverage, overlaps, gaps, or other areas of interest, from a master bibliographic list. (CLB)
Tahir, Muhammad; Jan, Bismillah; Hayat, Maqsood; Shah, Shakir Ullah; Amin, Muhammad
2018-04-01
Discriminative and informative feature extraction is the core requirement for accurate and efficient classification of protein subcellular localization images so that drug development can be more effective. The objective of this paper is to propose a novel modification of the Threshold Adjacency Statistics (TAS) technique and enhance its discriminative power. In this work, we utilized Threshold Adjacency Statistics from a novel perspective to enhance its discrimination power and efficiency. In this connection, we utilized seven threshold ranges to produce seven distinct feature spaces, which are then used to train seven SVMs. The final prediction is obtained through a majority voting scheme. The proposed ETAS-SubLoc system is tested on two benchmark datasets using the 5-fold cross-validation technique. We observed that our proposed novel utilization of the TAS technique improved the discriminative power of the classifier. The ETAS-SubLoc system achieved 99.2% accuracy, 99.3% sensitivity and 99.1% specificity for the Endogenous dataset, outperforming the classical Threshold Adjacency Statistics technique. Similarly, 91.8% accuracy, 96.3% sensitivity and 91.6% specificity values were achieved for the Transfected dataset. Simulation results validated the effectiveness of ETAS-SubLoc, which provides superior prediction performance compared to the existing technique. The proposed methodology aims at providing support to the pharmaceutical industry as well as the research community towards better drug design and innovation in the fields of bioinformatics and computational biology. The implementation code for replicating the experiments presented in this paper is available at: https://drive.google.com/file/d/0B7IyGPObWbSqRTRMcXI2bG5CZWs/view?usp=sharing. Copyright © 2018 Elsevier B.V. All rights reserved.
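The ensemble structure described, seven threshold ranges feeding seven SVMs combined by majority vote, can be sketched as below. The feature extractor is a toy stand-in for true threshold adjacency statistics and the dataset is synthetic, so this shows only the voting architecture, not the paper's ETAS-SubLoc features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=64, random_state=0)
X = (X - X.min()) / (X.max() - X.min())       # stand-in for image features

# Seven hypothetical threshold ranges; each yields its own feature space.
threshold_ranges = [(0.1 * k, 0.1 * k + 0.4) for k in range(7)]

def tas_like_features(X, lo, hi):
    """Toy stand-in for threshold adjacency statistics: fraction of
    feature values falling inside [lo, hi], per block of each sample."""
    mask = (X >= lo) & (X <= hi)
    return mask.reshape(len(X), 8, 8).mean(axis=2)

svms = [(SVC().fit(tas_like_features(X, lo, hi), y), lo, hi)
        for lo, hi in threshold_ranges]

# Majority vote across the seven SVMs.
votes = np.stack([clf.predict(tas_like_features(X, lo, hi))
                  for clf, lo, hi in svms])
prediction = (votes.mean(axis=0) > 0.5).astype(int)
print((prediction == y).mean())   # training accuracy, illustration only
```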
NASA Astrophysics Data System (ADS)
Kerr, Laura T.; Adams, Aine; O'Dea, Shirley; Domijan, Katarina; Cullen, Ivor; Hennelly, Bryan M.
2014-05-01
Raman microspectroscopy can be applied to the urinary bladder for highly accurate classification and diagnosis of bladder cancer. This technique can be applied in vitro to bladder epithelial cells obtained from urine cytology or in vivo as an "optical biopsy" to provide results in real-time with higher sensitivity and specificity than current clinical methods. However, there exists a high degree of variability across experimental parameters which need to be standardised before this technique can be utilized in an everyday clinical environment. In this study, we investigate different laser wavelengths (473 nm and 532 nm), sample substrates (glass, fused silica and calcium fluoride) and multivariate statistical methods in order to gain insight into how these various experimental parameters impact on the sensitivity and specificity of Raman cytology.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.
2014-01-01
Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of the DLR. If hot-fire speed data are unknown, a Monte Carlo method is developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism. High values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level. Closed-form curve fits were generated for the widely used 3σ and 2σ probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
Optimizing construction quality management of pavements using mechanistic performance analysis.
DOT National Transportation Integrated Search
2004-08-01
This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...
Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza
2015-01-01
To investigate the effect of preprocessing techniques including contrast enhancement and illumination correction on retinal image quality, a comparative study was carried out. We studied and implemented a few illumination correction and contrast enhancement techniques on color retinal images to find out the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method, using the median filter to estimate background illumination, showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The CLAHE technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation. PMID:25709940
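The two winning steps, background division with a large median filter and CLAHE, are short with standard imaging libraries. The sketch below assumes SciPy and scikit-image are available; the filter size and clip limit are illustrative guesses rather than the study's settings:

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage import exposure

def correct_illumination(green, size=51):
    """Dividing method: estimate the slow background with a large median
    filter and divide it out (values near 1 become background)."""
    background = median_filter(green, size=size)
    return green / np.maximum(background, 1e-6)

def enhance(green):
    corrected = correct_illumination(green)
    # Rescale to [0, 1] before applying CLAHE.
    corrected = (corrected - corrected.min()) / np.ptp(corrected)
    return exposure.equalize_adapthist(corrected, clip_limit=0.01)

# Hypothetical fundus green channel with a slow illumination gradient.
rng = np.random.default_rng(0)
img = rng.random((256, 256)) * np.linspace(0.5, 1.0, 256)
print(enhance(img).shape)
```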
NASA Astrophysics Data System (ADS)
Butlitsky, M. A.; Zelener, B. B.; Zelener, B. V.
2015-11-01
Earlier, a two-component pseudopotential plasma model, which we call a “shelf Coulomb” model, was developed. A Monte Carlo study of the canonical NVT ensemble with periodic boundary conditions was undertaken to calculate equations of state, pair distribution functions, internal energies and other thermodynamic properties of the model. In the present work, an attempt is made to apply the so-called hybrid Gibbs statistical ensemble Monte Carlo technique to this model. First simulation results show qualitatively similar behavior in the critical point region for both methods. The Gibbs ensemble technique lets us estimate the melting curve position and the triple point of the model (in reduced temperature and specific volume coordinates): T* ≈ 0.0476, v* ≈ 6 × 10^-4.
ERIC Educational Resources Information Center
Bradford, Jennifer; Mowder, Denise; Bohte, Joy
2016-01-01
The current project conducted an assessment of specific, directed use of student-centered teaching techniques in a criminal justice and criminology research methods and statistics class. The project sought to ascertain to what extent these techniques improved or impacted student learning and engagement in this traditionally difficult course.…
Urresti-Estala, Begoña; Carrasco-Cantos, Francisco; Vadillo-Pérez, Iñaki; Jiménez-Gavilán, Pablo
2013-03-15
Determining background levels is a key element in the further characterisation of groundwater bodies, according to Water Framework Directive 2000/60/EC and, more specifically, Groundwater Directive 2006/118/EC. In many cases, these levels present very high values for some parameters and types of groundwater, which makes their correct estimation important as a prior step to establishing thresholds, assessing the status of water bodies and subsequently identifying contaminant patterns. The Guadalhorce River basin presents widely varying hydrogeological and hydrochemical conditions. Therefore, its background levels are the result of the many factors represented in the natural chemical composition of water bodies in this basin. The question of determining background levels under objective criteria is generally addressed as a statistical problem, given the many aspects involved in their calculation. In the present study, we outline the advantages of two statistical techniques applied specifically for this purpose: (1) the iterative 2σ technique and (2) the distribution function, and examine whether the conclusions reached by these techniques are similar or whether they differ considerably. In addition, we identify the specific characteristics of each approach and the circumstances under which they should be used. Copyright © 2012 Elsevier Ltd. All rights reserved.
Theoretical limitations of quantification for noncompetitive sandwich immunoassays.
Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom
2015-11-01
Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single fluorescent molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single-molecule detection is possible through immunoassay techniques, it is demonstrated here that the fundamental limit of quantification (precision of 10% or better) for an immunoassay is approximately 131 molecules, a limit based on fundamental and unavoidable statistical limitations.
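The counting-statistics part of the argument is easy to reproduce in outline. The sketch below shows the bare Poisson bound and a crude background-subtraction variant; the paper's 131-molecule figure comes from a fuller equilibrium treatment, so the numbers here (including the hypothetical background count) are only indicative:

```python
def poisson_cv(n):
    """Coefficient of variation of a Poisson count with mean n: 1/sqrt(n)."""
    return n ** -0.5

# Bare counting statistics: 10% precision needs about 100 molecules.
n = 1
while poisson_cv(n) > 0.10:
    n += 1
print(n)   # 100

# With b nonspecifically bound molecules, background subtraction inflates
# the net-signal variance to roughly n + 2*b, pushing the requirement up,
# in the spirit of the ~131-molecule limit quoted above (the exact value
# depends on the full model, including instrumental noise).
b = 15     # hypothetical background count
n = 1
while (n + 2 * b) ** 0.5 / n > 0.10:
    n += 1
print(n)   # 125 for this illustrative background
```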
Modeling And Detecting Anomalies In Scada Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
JIGSAW: Preference-directed, co-operative scheduling
NASA Technical Reports Server (NTRS)
Linden, Theodore A.; Gaw, David
1992-01-01
Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods, allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.
Effect of different mixing methods on the physical properties of Portland cement.
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Samiei, Mohammad; Jafari, Farnaz
2016-12-01
Portland cement is a hydrophilic cement; as a result, the powder-to-liquid ratio affects the properties of the final mix. In addition, the mixing technique affects hydration. The aim of this study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic) on selected physical properties of Portland cement. The physical properties to be evaluated were determined using the ISO 6876:2001 specification. One hundred and sixty-two samples of Portland cement were prepared, covering the three mixing techniques for each physical property (six samples per group). Data were analyzed using descriptive statistics, one-way ANOVA and post hoc Tukey tests. Statistical significance was set at P < 0.05. The mixing technique had no significant effect on the compressive strength, film thickness and flow of Portland cement (P > 0.05). Dimensional changes (shrinkage), solubility and pH increased significantly with the amalgamator and ultrasonic mixing techniques (P < 0.05). The ultrasonic technique significantly decreased working time, and the amalgamator and ultrasonic techniques significantly decreased the setting time (P < 0.05). The mixing technique exerted no significant effect on the flow, film thickness and compressive strength of Portland cement samples. Key words: Physical properties, Portland cement, mixing methods.
León, Larry F; Cai, Tianxi
2012-04-01
In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
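For the uncensored special case, the flavor of the procedure can be shown in a few lines: order the residuals by the covariate, form the cumulative-sum process, and approximate its null distribution by perturbing residuals with standard normal multipliers. The censored-data machinery built from Kaplan-Meier estimators is omitted here, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)   # true model is linear in x

# Fit the working model y = a + b*x by least squares.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Observed cumulative-sum process of residuals over the covariate.
order = np.argsort(x)
W_obs = np.cumsum(resid[order]) / np.sqrt(n)

# Null realizations: perturb residuals with standard normal multipliers.
sup_null = np.array([
    np.abs(np.cumsum((resid * rng.standard_normal(n))[order])).max()
    / np.sqrt(n)
    for _ in range(500)
])
p_value = (sup_null >= np.abs(W_obs).max()).mean()
print(p_value)   # large p: no evidence of functional-form misspecification
```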
Fundamentals of nuclear medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alazraki, N.P.; Mishkin, F.S.
1988-01-01
The book begins with basic science and statistics relevant to nuclear medicine, and specific organ systems are addressed in separate chapters. A section of the text also covers imaging of groups of disease processes (e.g., trauma, cancer). The authors present a comparison between nuclear medicine techniques and other diagnostic imaging studies. A table is given which comments on the sensitivities and specificities of common nuclear medicine studies. The sensitivities and specificities are categorized as very high, high, moderate, and so forth.
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
Title 21 (Food and Drugs), revised as of 2011-04-01: Medical Devices, Quality System Regulation, Statistical Techniques. § 820.250 Statistical techniques. (a) ... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 21 (Food and Drugs), revised as of 2010-04-01: Medical Devices, Quality System Regulation, Statistical Techniques. § 820.250 Statistical techniques. (a) ... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Specific Physical Training in Elite Male Team Handball.
Wagner, Herbert; Gierlinger, Manuel; Adzamija, Nermin; Ajayi, Samuel; Bacharach, David W; von Duvillard, Serge P
2017-11-01
Wagner, H, Gierlinger, M, Adzamija, N, Ajayi, S, Bacharach, DW, and von Duvillard, SP. Specific physical training in elite male team handball. J Strength Cond Res 31(11): 3083-3093, 2017. Specific physical training in elite team handball is essential for optimal player performance; however, scientific knowledge is generally based on short-term training studies with subelite athletes. Therefore, the aim of the study was to analyze the effects of specific physical training in an elite male handball team over an entire season. Twelve players of a male handball team from the First Austrian Handball League conducted a 1-year specific physical training program in addition to their normal (team handball techniques and tactics) weekly training. Performance was measured with 5 general and 4 specific tests as well as game statistics during competition. Repeated-measures analysis of variance and paired-sample t-tests were used to analyze differences in performance during training. We found a significant increase in oxygen uptake, offense time, defense time, fast break time, and jump height in the specific tests. Game performance statistics revealed a lower throwing percentage in the hosting team (59%) compared with the rival teams (63%). Our results indicated that specific endurance and agility training is an acceptable modality in elite male team handball. However, performance in competition is strongly influenced by specific techniques and tactics. We recommend that strength and conditioning professionals tailor strength and power training, coordination and endurance work as specifically as possible, using free weights, agility exercises that include changes in direction and jumps, as well as short (10-15 seconds) high-intensity intervals.
Statistical EMC: A new dimension in electromagnetic compatibility of digital electronic systems
NASA Astrophysics Data System (ADS)
Tsaliovich, Anatoly
Electromagnetic compatibility compliance test results are used as a database for addressing three classes of electromagnetic compatibility (EMC) related problems: statistical EMC profiles of digital electronic systems, the effect of equipment-under-test (EUT) parameters on electromagnetic emission characteristics, and EMC measurement specifics. Open area test site (OATS) and absorber-lined shielded room (AR) results are compared for the highest radiated emissions of the equipment under test. The suggested statistical evaluation methodology can be utilized to correlate the results of different EMC test techniques, characterize the EMC performance of electronic systems and components, and develop recommendations for the optimal EMC design of electronic products.
Image correlation and sampling study
NASA Technical Reports Server (NTRS)
Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.
1972-01-01
The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.
Testing statistical isotropy in cosmic microwave background polarization maps
NASA Astrophysics Data System (ADS)
Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.
2018-04-01
We apply our symmetry-based power tensor technique to test the conformity of PLANCK polarization maps with statistical isotropy. On a wide range of angular scales (l = 40-150), our preliminary analysis detects many statistically anisotropic multipoles in the foreground-cleaned full-sky PLANCK polarization maps, viz. COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane, using both the common UPB77 polarization mask and the polarization masks specific to each component separation method. However, some of the statistically anisotropic modes still persist, significantly so in the NILC map. We further probed the data for any coherent alignments across multipoles in several bins from the chosen multipole range.
De Los Ríos, F. A.; Paluszny, M.
2015-01-01
We consider some methods to extract information about the rotator cuff based on magnetic resonance images; the study aims to define an alternative method of display that might facilitate the detection of partial tears in the supraspinatus tendon. Specifically, we use families of ellipsoidal triangular patches to cover the humerus head near the affected area. These patches are textured and displayed with the information of the magnetic resonance images using the trilinear interpolation technique. For the generation of points to texture each patch, we propose a new method that guarantees the uniform distribution of its points using a random statistical method. Its computational cost, defined as the average computing time to generate a fixed number of points, is significantly lower than that of deterministic and other standard statistical techniques. PMID:25650281
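The uniform-point-generation requirement has a classic closed form on a planar triangle, which serves here as a simplified stand-in for the ellipsoidal triangular patches of the paper; the vertices below are hypothetical:

```python
import numpy as np

def uniform_in_triangle(a, b, c, n, rng):
    """Uniformly distributed points in the triangle (a, b, c).

    The square-root transform corrects the area distortion of naive
    barycentric sampling.
    """
    r1 = np.sqrt(rng.random(n))
    r2 = rng.random(n)
    return ((1 - r1)[:, None] * a
            + (r1 * (1 - r2))[:, None] * b
            + (r1 * r2)[:, None] * c)

rng = np.random.default_rng(7)
pts = uniform_in_triangle(np.array([0.0, 0.0]), np.array([1.0, 0.0]),
                          np.array([0.0, 1.0]), 1000, rng)
print(pts.mean(axis=0))   # close to the centroid (1/3, 1/3)
```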
NASA Astrophysics Data System (ADS)
Nikitin, S. Yu.; Priezzhev, A. V.; Lugovtsov, A. E.; Ustinov, V. D.; Razgulin, A. V.
2014-10-01
The paper is devoted to the development of the laser ektacytometry technique for evaluation of the statistical characteristics of inhomogeneous ensembles of red blood cells (RBCs). We have analyzed theoretically laser beam scattering by inhomogeneous ensembles of elliptical discs, modeling red blood cells in the ektacytometer. The analysis shows that the laser ektacytometry technique allows for quantitative evaluation of such population characteristics of RBCs as the cells' mean shape, the variance of cell deformability, and the asymmetry of the cell distribution in deformability. Moreover, we show that the deformability distribution itself can be retrieved by solving a specific Fredholm integral equation of the first kind. At this stage we do not take into account the scatter in RBC sizes.
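Retrieving a distribution from a first-kind Fredholm equation is an ill-posed inverse problem, so a numerical sketch needs regularization. Below, a discretized version with a hypothetical Gaussian kernel and Tikhonov-regularized least squares; the paper's actual diffraction kernel is not reproduced here:

```python
import numpy as np

# Discretize the Fredholm equation g(s) = integral K(s, t) f(t) dt as g = K f.
t = np.linspace(0, 1, 80)                             # deformability axis
s = np.linspace(0, 1, 80)                             # measurement axis
K = np.exp(-((s[:, None] - t[None, :]) ** 2) / 0.01)  # hypothetical kernel
f_true = np.exp(-((t - 0.4) ** 2) / 0.005)            # deformability pdf
g = K @ f_true + np.random.default_rng(0).normal(0, 0.01, len(s))

# Tikhonov-regularized least squares: min ||K f - g||^2 + lam ||f||^2.
lam = 1e-2
f_hat = np.linalg.solve(K.T @ K + lam * np.eye(len(t)), K.T @ g)
# Compare the regularized estimate with the true curve.
print(np.abs(f_hat - f_true).max())
```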
NASA Astrophysics Data System (ADS)
El Kanawati, W.; Létang, J. M.; Dauvergne, D.; Pinto, M.; Sarrut, D.; Testa, É.; Freud, N.
2015-10-01
A Monte Carlo (MC) variance reduction technique is developed for prompt-γ emitter calculations in proton therapy. Prompt-γ emitted through nuclear fragmentation reactions and exiting the patient during proton therapy could play an important role in helping to monitor the treatment. However, the estimation of the number and energy of emitted prompt-γ per primary proton with MC simulations is a slow process. In order to estimate the local distribution of prompt-γ emission in a volume of interest for a given proton beam of the treatment plan, a MC variance reduction technique based on a specific track length estimator (TLE) has been developed. First, an elemental database of prompt-γ emission spectra is established in the clinical energy range of incident protons for all elements in the composition of human tissues. This database of prompt-γ spectra is built offline with high statistics. In the implementation of the prompt-γ TLE MC tally, each proton deposits along its track the expectation of the prompt-γ spectra from the database according to the proton kinetic energy and the local material composition. A detailed statistical study shows that the relative efficiency mainly depends on the geometrical distribution of the track length. Benchmarking of the proposed prompt-γ TLE MC technique with respect to an analogous MC technique is carried out. A large relative efficiency gain is reported, ca. 10^5.
How to Use Value-Added Measures Right
ERIC Educational Resources Information Center
Di Carlo, Matthew
2012-01-01
Value-added models are a specific type of "growth model," a diverse group of statistical techniques to isolate a teacher's impact on his or her students' testing progress while controlling for other measurable factors, such as student and school characteristics, that are outside that teacher's control. Opponents, including many teachers, argue…
The study uses statistical analysis techniques to determine the effects of four heavy metals (cadmium, lead, manganese, and zinc) on the macroinvertebrate community, using data collected in the fall of 1987.
1981-01-01
...explanatory variable has been omitted. Ramsey (1974) has developed a rather interesting test for detecting specification errors using estimates of the... Kennedy, Peter (1979), A Guide to Econometrics, Cambridge, MA: The MIT Press. Ramsey, J.B. (1974), "Classical Model Selection Through Specification Error Tests," in P. Zarembka, ed., Frontiers in Econometrics, New York: Academic Press. Theil, Henri (1971), Principles of Econometrics, New York: John Wiley.
Villamonte-Chevalier, A; van Bree, H; Broeckx, Bjg; Dingemanse, W; Soler, M; Van Ryssen, B; Gielen, I
2015-09-25
Diagnostic imaging is essential to assess the lame patient; lesions of the elbow joint have traditionally been evaluated radiographically, however computed tomography (CT) has been suggested as a useful technique to diagnose various elbow pathologies. The primary objective of this study was to determine the sensitivity and specificity of CT in assessing medial coronoid disease (MCD), using arthroscopy as the gold standard. The secondary objective was to ascertain the radiographic sensitivity and specificity for MCD compared with CT. For this study 180 elbow joints were assessed, of which 141 had been examined with radiography, CT and arthroscopy, and 39 joints had radiographic and CT assessment. Sensitivity and specificity were calculated for CT and radiographic findings using available statistical software. With arthroscopy as the gold standard, CT showed high sensitivity (100%) and specificity (93%) for the assessment of MCD. For the radiographic evaluation, a sensitivity of 98% and a specificity of 64-69% were found, using CT as the technique of reference. These results suggest that in case of doubt during radiographic assessment, CT could be used as a non-invasive technique to assess the presence of MCD. Based on the high sensitivity and specificity obtained in this study, CT, rather than arthroscopy, can be considered the preferred non-invasive technique to assess MCD lesions of the canine elbow joint.
Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for fluid-particle flows
NASA Astrophysics Data System (ADS)
Kong, Bo; Patel, Ravi G.; Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney O.
2017-11-01
In this work, we study the performance of three simulation techniques for fluid-particle flows: (1) a volume-filtered Euler-Lagrange approach (EL), (2) a quadrature-based moment method using the anisotropic Gaussian closure (AG), and (3) a traditional two-fluid model. By simulating two problems, particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT), we find that the convergence of the methods under grid refinement depends on the simulation method and the specific problem, with CIT simulations facing fewer difficulties than HIT. Although EL converges under refinement for both HIT and CIT, its statistical results exhibit dependence on the techniques used to extract statistics for the particle phase. For HIT, converging both EE methods (TFM and AG) poses challenges, while for CIT, AG and EL produce similar results. Overall, all three methods face challenges when trying to extract converged, parameter-independent statistics due to the presence of shocks in the particle phase. National Science Foundation and National Energy Technology Laboratory.
Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...
2017-09-01
Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances in quantitative applications, may introduce a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
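The method reduces to three steps: form the ratio image, histogram it, and read off the modal bin. A hedged sketch with simulated images follows; the scene, scale factor, and "hot" region are invented for illustration:

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=256):
    """Scaling factor from the modal value of the ratio image.

    Assumes the most frequent ratio values come from background pixels.
    """
    ratio = img_a / np.maximum(img_b, 1e-9)
    hist, edges = np.histogram(ratio, bins=bins)
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])   # bin center of the peak

rng = np.random.default_rng(5)
base = rng.poisson(20, (128, 128)).astype(float)   # background structure
img_b = base.copy()
img_b[40:60, 40:60] += 15                          # small "hot" region
img_a = 1.7 * base                                 # same scene, rescaled
print(background_ratio_scale(img_a, img_b))        # close to 1.7
```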
The Benefit of Modified Rehabilitation and Minimally Invasive Techniques in Total Hip Replacement
Lilikakis, Anastasios K; Gillespie, Beryl; Villar, Richard N
2008-01-01
INTRODUCTION We wished to assess whether an intensive rehabilitation regimen alone, or one combined with modified anaesthetic and surgical techniques, can change the speed of rehabilitation or the length of hospital stay after total hip replacement. PATIENTS AND METHODS We compared 44 patients who had followed a traditional care pathway with 38 patients who had rehabilitated under a new rehabilitation protocol and 40 patients who had also received modified, minimally invasive techniques. The speed of rehabilitation was measured in terms of the postoperative day on which three specific milestones were accomplished. RESULTS We found a statistically significant improvement in the postoperative day on which each activity became possible. The length of hospital stay was reduced from 6.5 days to 5.4 days to 4.1 days, a difference which was also statistically significant. CONCLUSIONS The data support the view that a new rehabilitation protocol alone can reduce the length of hospital stay and hasten rehabilitation. The combination of modified anaesthetic and minimally invasive surgical techniques with the new rehabilitation regimen can further improve short-term outcome after total hip replacement.
USDA-ARS?s Scientific Manuscript database
The generation of realistic future precipitation scenarios is crucial for assessing their impacts on a range of environmental and socio-economic impact sectors. A scale mismatch exists, however, between the coarse spatial resolution at which global climate models (GCMs) output future climate scenari...
ERIC Educational Resources Information Center
Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.
2005-01-01
Results obtained with interindividual techniques in a representative sample of a population are not necessarily generalizable to the individual members of this population. In this article the specific condition is presented that must be satisfied to generalize from the interindividual level to the intraindividual level. A way to investigate…
Precision of guided scanning procedures for full-arch digital impressions in vivo.
Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert
2017-11-01
System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems, with particular regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included, along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90%-10%)/2 interquantile range, and statistical analysis was performed with one-way repeated-measures ANOVA and a post hoc Bonferroni test. The conventional impression technique with alginate showed the lowest precision for full-arch impressions, at 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values were 74.5 ± 39.2 µm for the Cerec Omnicam Ortho group and 91.4 ± 48.8 µm for the Ormco Lythos group. The in vivo precision of guided scanning procedures exceeds that of conventional impressions with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.
Mølsted, Kirsten; Humerinta, Kirsti; Küseler, Annelise; Skaare, Pål; Bellardie, Haydn; Shaw, William; Karsten, Agneta; Kåre Sæle, Paul; Rizell, Sara; Marcusson, Agneta; Eyres, Philip; Semb, Gunvor
2017-02-01
Facial appearance is one of the most relevant measures of success in cleft lip and palate treatment. The aim was to assess nasolabial appearance at 5 years of age in all children in the project. In this part of the project the local protocol for lip closure continued to be used, because the primary lip and nose operations were not part of the randomisation. The great majority of the surgeons used Millard's technique together with McComb's technique for the nose. One center used the Tennison-Randall technique, and in one center the center's own technique as well as nose plugs were used. Three hundred and fifty-nine children participated in this part of the project. Standardised photos were taken according to a specific protocol developed for the Scandcleft project. Only the nasolabial area was shown; the surrounding facial features were masked. Three components were scored using a 5-point ordinal scale. A newly developed Scandcleft yardstick was used. The reliability of the method was tested using weighted kappa statistics. Both the interrater and intrarater reliability scores were good to very good. The Millard procedure combined with the McComb technique had been used in the majority of the cases in all three trials. There were statistically significant differences between the three trials concerning upper lip, nasal form, and cleft-side profile. ISRCTN29932826.
NASA Astrophysics Data System (ADS)
Sahoo, Sasmita; Jha, Madan K.
2013-12-01
The potential of multiple linear regression (MLR) and artificial neural network (ANN) techniques for predicting transient water levels over a groundwater basin was compared. MLR and ANN modeling was carried out at 17 sites in Japan, considering all significant inputs: rainfall, ambient temperature, river stage, 11 seasonal dummy variables, and influential lags of rainfall, ambient temperature, river stage and groundwater level. Seventeen site-specific ANN models were developed, using multi-layer feed-forward neural networks trained with the Levenberg-Marquardt backpropagation algorithm. The performance of the models was evaluated using statistical and graphical indicators. Comparison of the goodness-of-fit statistics of the MLR models with those of the ANN models indicated better agreement between the ANN-predicted groundwater levels and the observed groundwater levels at all the sites than for the MLR models. This finding was supported by the graphical indicators and the residual analysis. Thus, it is concluded that the ANN technique is superior to the MLR technique in predicting the spatio-temporal distribution of groundwater levels in a basin. However, considering the practical advantages of the MLR technique, it is recommended as an alternative and cost-effective groundwater modeling tool.
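A minimal sketch of the MLR-versus-ANN comparison on synthetic data. scikit-learn's `MLPRegressor` stands in for the feed-forward network (trained here with L-BFGS rather than the Levenberg-Marquardt algorithm used in the study), and the four predictors are hypothetical stand-ins for rainfall, temperature, river stage, and a lagged level:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))          # hypothetical inputs
y = 0.5 * X[:, 0] + np.sin(X[:, 2]) + 0.3 * X[:, 3] + rng.normal(0, 0.1, 500)
X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(X_tr, y_tr)

for name, model in [("MLR", mlr), ("ANN", ann)]:
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.3f} RMSE={rmse:.3f}")
```

The nonlinear term in the simulated response is what the ANN can capture and the MLR cannot, mirroring the comparison the abstract reports.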
Sampling methods to the statistical control of the production of blood components.
Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo
2017-12-01
The control of blood components specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology seems not to be systematically applied. Commonly, the sampling is intended to comply only with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model can be argued not to rest on a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
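For the second proposed methodology, sampling based on the proportion of a finite population, the classical sample-size formula with finite-population correction is a natural starting point; a minimal sketch in which the lot size, nonconformity proportion, and margin are illustrative choices:

```python
import math

def sample_size_finite(N, p=0.01, e=0.005, z=1.96):
    """Sample size to estimate a proportion p with margin e at ~95%
    confidence, corrected for a finite lot of N units."""
    n0 = z**2 * p * (1 - p) / e**2             # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # finite-population correction

# e.g. a production lot of 3,000 blood components
print(sample_size_finite(3000))
```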
Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation
NASA Astrophysics Data System (ADS)
Demir, Uygar; Toker, Cenk; Çenet, Duygu
2016-07-01
Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to some well-known pdfs such as Gaussian, Exponential, etc. However, constraining a histogram to fit a function with a fixed shape will increase the estimation error, and all the information extracted from such a pdf will carry this error. With such techniques, it is highly likely that the estimated pdf exhibits artificial characteristics that are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained compared to the techniques mentioned above. KDE is particularly good at representing the tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from GNSS measurements from the TNPGN-Active (Turkish National Permanent GNSS Network) network. This study is supported by TUBITAK 115E915 and the joint TUBITAK 114E092 and AS CR14/001 projects.
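A minimal sketch of the KDE step on hypothetical TEC values; `gaussian_kde` imposes no fixed functional form, and the summary statistics are computed directly from the sample:

```python
import numpy as np
from scipy.stats import gaussian_kde, kurtosis

# Hypothetical TEC sample (TECU); real values would come from GNSS data
rng = np.random.default_rng(2)
tec = np.concatenate([rng.normal(20, 3, 5000), rng.normal(35, 2, 500)])

kde = gaussian_kde(tec)                      # non-parametric pdf estimate
grid = np.linspace(tec.min(), tec.max(), 512)
pdf = kde(grid)                              # density evaluated on a grid

print("mean:", tec.mean(), "var:", tec.var(), "kurtosis:", kurtosis(tec))
```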
Prabitha, Vasumathi Gopala; Suchetha, Sambasivan; Jayanthi, Jayaraj Lalitha; Baiju, Kamalasanan Vijayakumary; Rema, Prabhakaran; Anuraj, Koyippurath; Mathews, Anita; Sebastian, Paul; Subhash, Narayanan
2016-01-01
Diffuse reflectance (DR) spectroscopy is a non-invasive, real-time, and cost-effective tool for early detection of malignant changes in squamous epithelial tissues. The present study aims to evaluate the diagnostic power of diffuse reflectance spectroscopy for non-invasive discrimination of cervical lesions in vivo. A clinical trial was carried out on 48 sites in 34 patients by recording DR spectra using a point-monitoring device with white light illumination. The acquired data were analyzed and classified using multivariate statistical analysis based on principal component analysis (PCA) and linear discriminant analysis (LDA). Diagnostic accuracies were validated using random number generators. The receiver operating characteristic (ROC) curves were plotted for evaluating the discriminating power of the proposed statistical technique. An algorithm was developed and used to classify non-diseased (normal) from diseased sites (abnormal) with a sensitivity of 72 % and specificity of 87 %. While low-grade squamous intraepithelial lesion (LSIL) could be discriminated from normal with a sensitivity of 56 % and specificity of 80 %, and high-grade squamous intraepithelial lesion (HSIL) from normal with a sensitivity of 89 % and specificity of 97 %, LSIL could be discriminated from HSIL with 100 % sensitivity and specificity. The areas under the ROC curves were 0.993 (95 % confidence interval (CI) 0.0 to 1) and 1 (95 % CI 1) for the discrimination of HSIL from normal and HSIL from LSIL, respectively. The results of the study show that DR spectroscopy could be used along with multivariate analytical techniques as a non-invasive technique to monitor cervical disease status in real time.
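A minimal sketch of the PCA-plus-LDA classification with cross-validated ROC AUC; the spectra, their dimensions, and the injected class difference below are all synthetic stand-ins, not the study's data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(48, 200))        # 48 sites x 200 wavelength bins
y = np.repeat([0, 1], 24)             # normal vs abnormal labels
X[y == 1] += 0.3                      # synthetic spectral difference

clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```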
Recchia, Gabriel L; Louwerse, Max M
2016-11-01
Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
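As a rough illustration of the general idea, not the authors' exact pipeline: relative locations can be recovered from textual co-occurrence by turning similarity into distance and applying multidimensional scaling. All counts below are hypothetical:

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical co-occurrence counts between five sites' inscriptions
co = np.array([[0, 9, 4, 2, 1],
               [9, 0, 5, 2, 1],
               [4, 5, 0, 6, 2],
               [2, 2, 6, 0, 7],
               [1, 1, 2, 7, 0]], dtype=float)

dist = 1.0 - co / co.max()            # frequent co-mention -> small distance
np.fill_diagonal(dist, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
print(coords)                          # relative, not absolute, placements
```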
Porter, P Steven; Rao, S Trivikrama; Zurbenko, Igor G; Dunker, Alan M; Wolff, George T
2001-02-01
Assessment of regulatory programs aimed at improving ambient O3 air quality is of considerable interest to the scientific community and to policymakers. Trend detection, the identification of statistically significant long-term changes, and attribution, linking change to specific climatological and anthropogenic forcings, are instrumental to this assessment. Detection and attribution are difficult because changes in pollutant concentrations of interest to policymakers may be much smaller than natural variations due to weather and climate. In addition, there are considerable differences in reported trends seemingly based on similar statistical methods and databases. Differences arise from the variety of techniques used to reduce nontrend variation in time series, including mitigating the effects of meteorology, and the variety of metrics used to track changes. In this paper, we review the trend assessment techniques being used in the air pollution field and discuss their strengths and limitations in discerning and attributing changes in O3 to emission control policies.
An overview of groundwater chemistry studies in Malaysia.
Kura, Nura Umar; Ramli, Mohammad Firuz; Sulaiman, Wan Nor Azmin; Ibrahim, Shaharin; Aris, Ahmad Zaharin
2018-03-01
In this paper, numerous studies on groundwater in Malaysia were reviewed with the aim of evaluating past trends and the current status for discerning the sustainability of the water resources in the country. It was found that most of the previous groundwater studies (44 %) focused on the islands and mostly concentrated on qualitative assessment, with more emphasis placed on seawater intrusion studies. This was followed by inland-based studies, with Selangor state leading the studies, which reflects the current water challenges facing the state. From a methodological perspective, geophysics, graphical methods, and statistical analysis are the dominant techniques (38, 25, and 25 %, respectively). The geophysical methods, especially the 2D resistivity method, cut across many subjects such as seawater intrusion studies, quantitative assessment, and hydraulic parameter estimation. The statistical techniques used include multivariate statistical analysis techniques and ANOVA, among others, most of which are quality-related studies using major ions, in situ parameters, and heavy metals. Conversely, numerical techniques like MODFLOW were somewhat less favoured, likely owing to their complexity and high data demand. This work will facilitate researchers in identifying the specific areas which need improvement and focus, while, at the same time, providing policymakers and managers with an executive summary and knowledge of the current situation in groundwater studies and where more work needs to be done for sustainable development.
Statistical detection of patterns in unidimensional distributions by continuous wavelet transforms
NASA Astrophysics Data System (ADS)
Baluev, R. V.
2018-04-01
Objective detection of specific patterns in statistical distributions, like groupings or gaps or abrupt transitions between different subsets, is a task with a rich range of applications in astronomy: Milky Way stellar population analysis, investigations of the exoplanets diversity, Solar System minor bodies statistics, extragalactic studies, etc. We adapt the powerful technique of the wavelet transforms to this generalized task, placing a strong emphasis on the assessment of the significance of the detected patterns. Among other things, our method also involves optimal minimum-noise wavelets and minimum-noise reconstruction of the distribution density function. Based on this development, we construct a self-contained algorithmic pipeline aimed at processing statistical samples. It is currently applicable to single-dimensional distributions only, but it is flexible enough to undergo further generalizations and development.
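A minimal sketch of the core wavelet step on a synthetic sample with two groupings; the Ricker (Mexican-hat) wavelet is implemented directly, and the significance assessment that is the paper's main contribution is omitted here:

```python
import numpy as np

def ricker(points, a):
    """Mexican-hat wavelet of width a, sampled at `points` positions."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, widths):
    """Continuous wavelet transform by direct convolution at each scale."""
    return np.array([np.convolve(signal, ricker(10 * w, w), mode="same")
                     for w in widths])

rng = np.random.default_rng(4)
sample = np.concatenate([rng.normal(-2, 0.3, 300), rng.normal(2, 0.3, 300)])
density, _ = np.histogram(sample, bins=256, density=True)

coeffs = cwt(density, widths=[2, 4, 8, 16])
print(coeffs.shape)   # large |coefficients| flag groupings at that scale
```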
Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions
NASA Astrophysics Data System (ADS)
Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás
2016-03-01
Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products in an attempt to test reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.
Statistical procedures for analyzing mental health services data.
Elhai, Jon D; Calhoun, Patrick S; Ford, Julian D
2008-08-15
In mental health services research, analyzing service utilization data often poses serious problems, given the presence of substantially skewed data distributions. This article presents a non-technical introduction to statistical methods specifically designed to handle the complexly distributed datasets that represent mental health service use, including Poisson, negative binomial, zero-inflated, and zero-truncated regression models. A flowchart is provided to assist the investigator in selecting the most appropriate method. Finally, a dataset of mental health service use reported by medical patients is described, and a comparison of results across several different statistical methods is presented. Implications of matching data analytic techniques appropriately with the often complexly distributed datasets of mental health services utilization variables are discussed.
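A minimal sketch of fitting competing count models to skewed service-use data with statsmodels; the simulated visit counts with structural zeros are illustrative, and an information criterion can then guide the choice the flowchart formalizes:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
X = sm.add_constant(x)
# Hypothetical visit counts with excess zeros, mimicking service-use data
counts = rng.poisson(np.exp(0.3 + 0.5 * x))
counts[rng.random(n) < 0.3] = 0        # structural zeros

poisson_fit = sm.Poisson(counts, X).fit(disp=False)
negbin_fit = sm.NegativeBinomial(counts, X).fit(disp=False)
# Lower AIC favours the better distributional assumption; statsmodels
# also provides zero-inflated variants for the excess zeros.
print("Poisson AIC:", poisson_fit.aic, " NegBin AIC:", negbin_fit.aic)
```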
Current Developments in Machine Learning Techniques in Biological Data Mining.
Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail
2017-01-01
This supplement is intended to focus on the use of machine learning techniques to generate meaningful information on biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts. Advances in the field of biology have generated massive opportunities for the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. Thus, they are broadly used to investigate the underlying mechanisms leading to a specific disease, as well as to support the biomarker discovery process. With growth in this specific area of science comes the need to access up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers in the various applications of machine learning techniques in mining biological data.
Boka, V; Arapostathis, K; Vretos, N; Kotsanos, N
2014-10-01
The aim of this study was to examine the acceptance by Greek parents of nine behaviour-management techniques and its association with several possible confounding factors. Following ethical approval, 106 parents whose 3- to 12-year-old children had been receiving treatment in a university postgraduate paediatric dental clinic, and 123 parents of children from a private paediatric dental practice, agreed to participate. After being shown a video with nine behaviour-management techniques, parents rated the acceptance of each technique on a 0-10 scale. They were then asked to complete a questionnaire about demographics, their previous dental experience and dental anxiety (modified Corah dental anxiety scale). The best accepted technique was tell-show-do (9.76 ± 0.69), followed by the parental presence/absence (PPA) technique (7.83 ± 3.06) and nitrous oxide inhalation sedation (7.09 ± 3.02). The least accepted techniques were passive restraint (4.21 ± 3.84) and general anaesthesia (4.21 ± 4.02). Statistical significance of differences was explored using the Tukey-Kramer method. No correlations were found between acceptance of any individual management technique and parental age, gender, income, education, dental experience and dental anxiety, or the child's age, gender and dental experience. Parents whose children had been treated at the university clinic had lower income and educational levels, and rated passive restraint, oral sedation and general anaesthesia higher than those from the private practice. When the parents were specifically asked to choose between general anaesthesia and any of the active or passive restraint, hand-over-mouth and voice control techniques, 10% preferred general anaesthesia, and these parents reported statistically significantly more negative dental experiences but not higher dental anxiety. There was no correlation between parental dental experience and dental anxiety and the acceptance of any specific behaviour-management technique. However, parents with negative dental experience would prefer general anaesthesia over any of the active or passive restraint, hand-over-mouth and voice control techniques. PPA is a highly acceptable technique for Greek parents.
Hauptmann, C; Roulet, J-C; Niederhauser, J J; Döll, W; Kirlangic, M E; Lysyansky, B; Krachkovskyi, V; Bhatti, M A; Barnikol, U B; Sasse, L; Bührle, C P; Speckmann, E-J; Götz, M; Sturm, V; Freund, H-J; Schnell, U; Tass, P A
2009-12-01
In the past decade deep brain stimulation (DBS)-the application of electrical stimulation to specific target structures via implanted depth electrodes-has become the standard treatment for medically refractory Parkinson's disease and essential tremor. These diseases are characterized by pathological synchronized neuronal activity in particular brain areas. We present an external trial DBS device capable of administering effectively desynchronizing stimulation techniques developed with methods from nonlinear dynamics and statistical physics according to a model-based approach. These techniques exploit either stochastic phase resetting principles or complex delayed-feedback mechanisms. We explain how these methods are implemented into a safe and user-friendly device.
Power Enhancement in High Dimensional Cross-Sectional Tests
Fan, Jianqing; Liao, Yuan; Yao, Jiawei
2016-01-01
We propose a novel technique to boost the power of testing a high-dimensional vector H0 : θ = 0 against sparse alternatives where the null hypothesis is violated only by a couple of components. Existing tests based on quadratic forms such as the Wald statistic often suffer from low powers due to the accumulation of errors in estimating high-dimensional parameters. More powerful tests for sparse alternatives such as thresholding and extreme-value tests, on the other hand, require either stringent conditions or bootstrap to derive the null distribution and often suffer from size distortions due to the slow convergence. Based on a screening technique, we introduce a “power enhancement component”, which is zero under the null hypothesis with high probability, but diverges quickly under sparse alternatives. The proposed test statistic combines the power enhancement component with an asymptotically pivotal statistic, and strengthens the power under sparse alternatives. The null distribution does not require stringent regularity conditions, and is completely determined by that of the pivotal statistic. As specific applications, the proposed methods are applied to testing the factor pricing models and validating the cross-sectional independence in panel data models. PMID:26778846
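A stylized numerical sketch of the idea, not the paper's exact statistic or theory: an asymptotically pivotal Wald-type quantity is combined with a screened component that is zero with high probability under the null but diverges under sparse alternatives; the threshold below is an illustrative choice:

```python
import numpy as np

def power_enhanced_stat(theta_hat, se, n):
    t = theta_hat / se                            # standardized components
    p = len(t)
    wald = (np.sum(t**2) - p) / np.sqrt(2 * p)    # asymptotically pivotal
    delta = np.sqrt(2 * np.log(p) * np.log(np.log(n)))  # screening threshold
    enhancement = np.sum(t[np.abs(t) > delta] ** 2)     # ~0 under the null
    return wald + enhancement

rng = np.random.default_rng(6)
p, n = 500, 1000
theta = np.zeros(p); theta[:3] = 0.3              # sparse alternative
se = np.full(p, 1 / np.sqrt(n))
theta_hat = theta + rng.normal(0, 1, p) * se
print(power_enhanced_stat(theta_hat, se, n))
```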
2010-01-01
Background: Methods for the calculation and application of quantitative electromyographic (EMG) statistics for the characterization of EMG data detected from forearm muscles of individuals with and without pain associated with repetitive strain injury are presented. Methods: A classification procedure using a multi-stage application of Bayesian inference is presented that characterizes a set of motor unit potentials acquired using needle electromyography. The utility of this technique in characterizing EMG data obtained from both normal individuals and those presenting with symptoms of "non-specific arm pain" is explored and validated. The efficacy of the Bayesian technique is compared with simple voting methods. Results: The aggregate Bayesian classifier presented is found to perform with accuracy equivalent to that of majority voting on the test data, with an overall accuracy greater than 0.85. Theoretical foundations of the technique are discussed and related to the observations found. Conclusions: Aggregation of motor unit potential conditional probability distributions estimated using quantitative electromyographic analysis may be successfully used to perform electrodiagnostic characterization of "non-specific arm pain." It is expected that these techniques will also be applicable to other types of electrodiagnostic data. PMID:20156353
Weak Value Amplification is Suboptimal for Estimation and Detection
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-01-01
We show by using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of single parameter estimation and signal detection. Specifically, we prove that postselection, a necessary ingredient for weak value amplification, decreases estimation accuracy and, moreover, arranging for anomalously large weak values is a suboptimal strategy. In doing so, we explicitly provide the optimal estimator, which in turn allows us to identify the optimal experimental arrangement to be the one in which all outcomes have equal weak values (all as small as possible) and the initial state of the meter is the maximal eigenvalue of the square of the system observable. Finally, we give precise quantitative conditions for when weak measurement (measurements without postselection or anomalously large weak values) can mitigate the effect of uncharacterized technical noise in estimation.
Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija
2017-12-02
Developing risk models in medicine is not only appealing but also associated with many obstacles across the different aspects of predictive model development. Initially, the association of single or multiple biomarkers with a specific outcome was established through statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. With logistic regression and Cox proportional hazards regression analysis, the calibration test and ROC curve analysis should be mandatory and eliminatory, while newer statistical techniques should take the central place. To obtain complete information about a new marker in a model, it is now recommended to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning of the Framingham risk score initiated the development of statistical analysis. A clinically applicable predictive model should be a trade-off between all the abovementioned statistical metrics: a trade-off between calibration and discrimination, accuracy and decision-making, costs and benefits, and the quality and quantity of the patient's life.
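A minimal sketch of the categorical net reclassification index mentioned above; the risk cutoffs and the simulated risks are illustrative:

```python
import numpy as np

def net_reclassification_index(old_risk, new_risk, event, cutoffs=(0.1, 0.2)):
    """Credit upward category moves for events and downward moves for
    non-events when a new marker is added to the model."""
    cat_old = np.digitize(old_risk, cutoffs)
    cat_new = np.digitize(new_risk, cutoffs)
    up, down = cat_new > cat_old, cat_new < cat_old
    ev, ne = event == 1, event == 0
    return (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())

rng = np.random.default_rng(7)
event = rng.integers(0, 2, 200)
old = np.clip(0.15 + 0.05 * event + rng.normal(0, 0.05, 200), 0, 1)
new = np.clip(old + 0.04 * (2 * event - 1), 0, 1)   # marker helps slightly
print(net_reclassification_index(old, new, event))
```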
Khedmat, S; Rouhi, N; Drage, N; Shokouhinejad, N; Nekoofar, M H
2012-11-01
To compare the accuracy of digital radiography (DR), multidetector computed tomography (MDCT) and cone beam computed tomography (CBCT) in detecting vertical root fractures (VRF) in the absence and presence of gutta-percha root filling. The root canals of 100 extracted human single-rooted teeth were prepared and randomly divided into four groups: two experimental groups with artificially fractured roots and two intact groups as controls. In one experimental and one control group, a size 40, 0.04 taper gutta-percha cone was inserted in the root canals. Then DR, MDCT and CBCT were performed and the images evaluated. The sensitivity, specificity and accuracy of each imaging technique in the presence and absence of gutta-percha were calculated and compared. In the absence of gutta-percha, the specificity of DR, MDCT and CBCT was similar. CBCT was the most accurate and sensitive imaging technique (P < 0.05). In the presence of gutta-percha, the accuracy of MDCT was higher than that of the other imaging techniques (P < 0.05). The sensitivity of CBCT and MDCT was significantly higher than that of DR (P < 0.05), whereas CBCT was the least specific technique. Under the conditions of this ex vivo study, CBCT was the most sensitive imaging technique in detecting vertical root fracture. The presence of gutta-percha reduced the accuracy, sensitivity and specificity of CBCT but not MDCT. The sensitivity of DR was reduced in the presence of gutta-percha. The use of MDCT as an alternative technique may be recommended when VRF are suspected in root filled teeth. However, as the radiation dose of MDCT is higher than that of CBCT, the technique could be considered at variance with the principles of ALARA. © 2012 International Endodontic Journal.
ERIC Educational Resources Information Center
Cantlon, Julie; And Others
1996-01-01
This study evaluated the use of "allegation blind" interviews (in which interviewers did not know the specific allegation involved) with children (n=1535) suspected of being victims of child sexual abuse. The "allegation blind" interview technique yielded a statistically higher disclosure rate than the "allegation informed" interviews. (Author/DB)
ERIC Educational Resources Information Center
Videtich, Patricia E.; Neal, William J.
2012-01-01
Using sieving and sample "unknowns" for instructional grain-size analysis and interpretation of sands in undergraduate sedimentology courses has advantages over other techniques. Students (1) learn to calculate and use statistics; (2) visually observe differences in the grain-size fractions, thereby developing a sense of specific size…
Rollins, Derrick K; Teh, Ailing
2010-12-17
Microarray data sets provide relative expression levels for thousands of genes across a comparatively small number of experimental conditions called assays. Data mining techniques are used to extract specific information about genes as they relate to the assays. The multivariate statistical technique of principal component analysis (PCA) has proven useful in providing effective data mining methods. This article extends the PCA approach of Rollins et al. to rank the genes of microarray data sets that are expressed most differently between two biologically different groupings of assays. The method is evaluated on real and simulated data and compared to a current approach on the basis of false discovery rate (FDR) and statistical power (SP), the ability to correctly identify important genes. This work developed and evaluated two new test statistics based on PCA and compared them to a popular method that is not PCA based. Both test statistics were found to be effective as evaluated in three case studies: (i) exposing E. coli cells to two different ethanol levels; (ii) application of myostatin to two groups of mice; and (iii) a simulated data study derived from the properties of (ii). The proposed method (PM) effectively identified critical genes in these studies in comparison with the current method (CM). The simulation study supports higher identification accuracy for PM over CM for both proposed test statistics when the gene variance is constant, and for one of the test statistics when the gene variance is non-constant. PM compares quite favorably to CM in terms of lower FDR and much higher SP. Thus, PM can be quite effective in producing accurate signatures from large microarray data sets for differential expression between assay groups identified in a preliminary step of the PCA procedure and is, therefore, recommended for use in these applications.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.
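A toy illustration of the equivalence-class idea described above (not the STAT implementation): tasks with identical call stacks are grouped, so a traditional debugger only needs one representative per class rather than every task:

```python
from collections import defaultdict

def equivalence_classes(task_traces):
    """Group task ids by their (identical) stack-trace signature."""
    classes = defaultdict(list)
    for task, trace in task_traces.items():
        classes[tuple(trace)].append(task)
    return classes

# Hypothetical stack traces gathered from four MPI ranks
traces = {0: ["main", "solver", "mpi_wait"],
          1: ["main", "solver", "mpi_wait"],
          2: ["main", "io_write"],
          3: ["main", "solver", "mpi_wait"]}

for trace, tasks in equivalence_classes(traces).items():
    print(trace, "-> debug representative", tasks[0], "of", len(tasks), "tasks")
```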
Shikanov, Sergey; Woo, Jason; Al-Ahmadie, Hikmat; Katz, Mark H; Zagaja, Gregory P; Shalhav, Arieh L; Zorn, Kevin C
2009-09-01
To evaluate the pathologic and functional outcomes of patients undergoing bilateral interfascial (IF) or extrafascial (EF) nerve-sparing (NSP) techniques. It is believed that the IF-NSP technique used during robotic-assisted radical prostatectomy (RARP) spares more nerve fibers, while EF dissection may lower the risk for positive surgical margins (PSM). A prospective database was analyzed for RARP patients with bilateral IF- or EF-NSP technique. Collected parameters included age, body mass index, prostate-specific antigen, clinical and pathologic Gleason score and stage, estimated blood loss, operative time, and PSM characteristics. Functional outcomes were evaluated with the use of the University of California Los Angeles Prostate Cancer Index questionnaire. Men receiving postoperative hormonal or radiation therapy were excluded from sexual function analysis. A total of 110 and 703 cases with bilateral EF- and IF-NSP, respectively, were analyzed. EF-NSP patients had higher prostate-specific antigen levels, clinical and pathologic stages, and pathologic Gleason scores. The PSM rate did not differ statistically significantly between groups. There was a trend toward lower pT3-PSM in the EF group (51% vs 28%; P = .08). Rates of mid- and posterolateral PSM locations were lower in the EF-NSP group, 11% vs 37% and 11% vs 29%, respectively (P < .001). The IF-NSP group patients achieved statistically significantly better sexual function (P = .02) and potency rates (P = .03) at 12 months after RARP. In lower risk patients, the bilateral IF-NSP technique does not result in significantly higher PSM rates. EF-NSP appears to reduce posterolateral and mid-prostate PSM. Men with bilateral IF-NSP demonstrate significantly better sexual function outcomes.
Li, Jing-Sheng; Tsai, Tsung-Yuan; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Freiberg, Andrew; Rubash, Harry E.; Li, Guoan
2014-01-01
Using computed tomography (CT) or magnetic resonance (MR) images to construct 3D knee models has been widely used in biomedical engineering research. The statistical shape modeling (SSM) method is an alternative that provides a fast, cost-efficient, and subject-specific knee modeling technique. This study aimed to evaluate the feasibility of using a combined dual-fluoroscopic imaging system (DFIS) and SSM method to investigate in vivo knee kinematics. Three subjects were studied during treadmill walking. The data were compared with the kinematics obtained using a CT-based modeling technique. Geometric root-mean-square (RMS) errors between the knee models constructed using the SSM and CT-based modeling techniques were 1.16 mm and 1.40 mm for the femur and tibia, respectively. For the kinematics of the knee during the treadmill gait, the SSM model predicted the knee kinematics with RMS errors within 3.3 deg for rotation and within 2.4 mm for translation throughout the stance phase of the gait cycle, compared with those obtained using the CT-based knee models. The data indicated that the combined DFIS and SSM technique could be used for quick evaluation of knee joint kinematics. PMID:25320846
Quantitative knowledge acquisition for expert systems
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) Expert System from Kalman filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since window boundaries are determined empirically and allowed to vary along the genome.
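A minimal sketch of the boundary-detection step, assuming a per-position test statistic along a chromosome: fit a cubic smoothing spline, locate the sign changes of its second derivative (the inflection points), and split positions into windows; the smoothing parameter is an illustrative choice:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(8)
pos = np.arange(2000).astype(float)                  # marker positions
stat = np.sin(pos / 150.0) + rng.normal(0, 0.3, pos.size)  # test statistic

spline = UnivariateSpline(pos, stat, s=len(pos))     # cubic smoothing spline
second = spline.derivative(2)(pos)                   # curvature along genome

# Window boundaries are the inflection points: sign changes of f''
cut = np.nonzero(np.sign(second[1:]) != np.sign(second[:-1]))[0] + 1
windows = np.split(pos, cut)
print(len(windows), "windows from", len(cut), "boundaries")
```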
Ensembles of radial basis function networks for spectroscopic detection of cervical precancer
NASA Technical Reports Server (NTRS)
Tumer, K.; Ramanujam, N.; Ghosh, J.; Richards-Kortum, R.
1998-01-01
The mortality related to cervical cancer can be substantially reduced through early detection and treatment. However, current detection techniques, such as Pap smear and colposcopy, fail to achieve a concurrently high sensitivity and specificity. In vivo fluorescence spectroscopy is a technique which quickly, noninvasively and quantitatively probes the biochemical and morphological changes that occur in precancerous tissue. A multivariate statistical algorithm was used to extract clinically useful information from tissue spectra acquired from 361 cervical sites from 95 patients at 337-, 380-, and 460-nm excitation wavelengths. The multivariate statistical analysis was also employed to reduce the number of fluorescence excitation-emission wavelength pairs required to discriminate healthy tissue samples from precancerous tissue samples. The use of connectionist methods such as multilayered perceptrons, radial basis function (RBF) networks, and ensembles of such networks was investigated. RBF ensemble algorithms based on fluorescence spectra potentially provide automated and near real-time implementation of precancer detection in the hands of nonexperts. The results are more reliable, direct, and accurate than those achieved by either human experts or multivariate statistical algorithms.
Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses
Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.
2014-01-01
The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
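A minimal sketch of the trial-resampling idea, assuming trial-structured recordings and correlation as the coupling measure; the coupling threshold is an illustrative choice:

```python
import numpy as np

def bootstrap_network_density(trials, n_boot=200, thresh=0.3, seed=0):
    """Resample trials with replacement, recompute the correlation
    network each time, and return a confidence interval for density."""
    rng = np.random.default_rng(seed)
    n_trials, _, n_sensors = trials.shape
    densities = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)
        data = trials[idx].reshape(-1, n_sensors)   # stack resampled trials
        adj = np.abs(np.corrcoef(data.T)) > thresh
        np.fill_diagonal(adj, False)
        densities.append(adj.mean())
    return np.percentile(densities, [2.5, 97.5])

# Hypothetical recordings: 60 trials x 500 samples x 8 sensors
trials = np.random.default_rng(9).normal(size=(60, 500, 8))
print("95% CI for network density:", bootstrap_network_density(trials))
```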
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
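A minimal sketch of the bootstrap overfitting check on a logistic model, using the standard optimism-correction recipe for the C-index (all risk factors are simulated stand-ins for the surveillance variables):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(10)
n = 2000
X = rng.normal(size=(n, 5))                 # hypothetical risk factors
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 2.5))))

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):                        # bootstrap optimism estimate
    idx = rng.integers(0, n, n)
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    test = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot - test)
print("optimism-corrected C-index:", apparent - np.mean(optimism))
```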
ERIC Educational Resources Information Center
Berenson, Mark L.
2013-01-01
There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…
NASA Astrophysics Data System (ADS)
Li, Xiaohui; Yang, Sibo; Fan, Rongwei; Yu, Xin; Chen, Deying
2018-06-01
In this paper, discrimination of soft tissues using laser-induced breakdown spectroscopy (LIBS) in combination with multivariate statistical methods is presented. Fresh pork fat, skin, ham, loin and tenderloin muscle tissues are manually cut into slices and ablated using a 1064 nm pulsed Nd:YAG laser. Discrimination analyses between fat, skin and muscle tissues, and further between highly similar ham, loin and tenderloin muscle tissues, are performed based on the LIBS spectra in combination with multivariate statistical methods, including principal component analysis (PCA), k nearest neighbors (kNN) classification, and support vector machine (SVM) classification. Performances of the discrimination models, including accuracy, sensitivity and specificity, are evaluated using 10-fold cross validation. The classification models are optimized to achieve the best discrimination performances. The fat, skin and muscle tissues can be definitely discriminated using both kNN and SVM classifiers, with accuracy of over 99.83%, sensitivity of over 0.995 and specificity of over 0.998. The highly similar ham, loin and tenderloin muscle tissues can also be discriminated with acceptable performances. The best performances are achieved with the SVM classifier using a Gaussian kernel function, with accuracy of 76.84%, sensitivity of over 0.742 and specificity of over 0.869. The results show that the LIBS technique combined with multivariate statistical methods could be a powerful tool for online discrimination of soft tissues, even for tissues of high similarity, such as muscles from different parts of the animal body. This technique could also be applied to tissues undergoing minor clinical changes, and may thus advance the diagnosis of early lesions and abnormalities.
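A minimal sketch of the classification step with 10-fold cross-validation on hypothetical spectra; the RBF-kernel SVM corresponds to the Gaussian-kernel classifier that performed best in the study, and the injected class-specific lines are synthetic:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
X = rng.normal(size=(150, 300))          # 150 shots x 300 spectral channels
y = np.repeat([0, 1, 2], 50)             # three tissue classes
X[y == 1, :10] += 1.0                    # synthetic class-specific lines
X[y == 2, 10:20] += 1.0

for clf in (make_pipeline(StandardScaler(), SVC(kernel="rbf")),
            make_pipeline(StandardScaler(), KNeighborsClassifier(5))):
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(type(clf[-1]).__name__, "10-fold accuracy:", round(acc, 3))
```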
Handling of Polyvinylsiloxane Versus Polyether for Implant Impressions.
Farhan, Daniel; Lauer, Wiebke; Heydecke, Guido; Aarabi, Ghazal; Reissmann, Daniel R
2016-01-01
This study compared polyvinylsiloxane with polyether in handling dental impressions. Each participant (N = 39) made four impressions, each a combination of pickup and reseating techniques with polyether or polyvinylsiloxane, of one implant cast representing a specific clinical situation (tooth gaps, limited residual dentition, or edentulous jaw). Handling of impressions was subsequently rated by using a 12-item questionnaire with 100-mm visual analog scales. While mean satisfaction scores were higher for polyvinylsiloxane than for polyether (69.5/63.0, P < .001), differences among subgroups were statistically significant only for pickup technique, limited residual dentition, and edentulous jaw. Implant impressions made with polyvinylsiloxane using a pickup technique seem to be the best option for most clinical situations.
Ben Chaabane, Salim; Fnaiech, Farhat
2014-01-23
Color image segmentation has so far been applied in many areas, and many different techniques have recently been developed and proposed. In medical imaging, segmentation can assist doctors in following up a patient's disease from processed breast cancer images. The main objective of this work is to rebuild and enhance each cell from the three component images provided by an input image. From an initial segmentation obtained using statistical features and histogram threshold techniques, the resulting segmentation can accurately represent incomplete and merged cells and enhance them; this provides real help to doctors, since the cells become clear and easy to count. A novel method for color edge extraction based on statistical features and an automatic threshold is presented. The traditional edge detector, based on the first- and second-order neighborhood describing the relationship between the current pixel and its neighbors, is extended to the statistical domain; color edges in an image are thus obtained by combining the statistical features and the automatic threshold techniques. Finally, on the obtained color edges with specific primitive color, a combination rule is used to integrate the edge results over the three color components. Breast cancer cell images were used to evaluate the performance of the proposed method both quantitatively and qualitatively. A visual and a numerical assessment based on the probability of correct classification (PC), the false classification (Pf), and the classification accuracy (Sens(%)) are presented and compared with existing techniques. The proposed method shows its superiority in the detection of points which really belong to the cells, and also in the facility of counting the number of the processed cells. Computer simulations highlight that the proposed method substantially enhances the segmented image, with smaller error rates than other existing algorithms under the same settings (patterns and parameters). Moreover, it provides high classification accuracy, reaching 97.94%. Additionally, the segmentation method may be extended to other medical imaging types having similar properties.
Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.
Fukurai, H
1991-01-01
This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.
Sayago, Ana; González-Domínguez, Raúl; Beltrán, Rafael; Fernández-Recamales, Ángeles
2018-09-30
This work explores the potential of multi-element fingerprinting in combination with advanced data mining strategies to assess the geographical origin of extra virgin olive oil samples. For this purpose, the concentrations of 55 elements were determined in 125 oil samples from multiple Spanish geographic areas. Several unsupervised and supervised multivariate statistical techniques were used to build classification models and investigate the relationship between mineral composition of olive oils and their provenance. Results showed that Spanish extra virgin olive oils exhibit characteristic element profiles, which can be differentiated on the basis of their origin in accordance with three geographical areas: Atlantic coast (Huelva province), Mediterranean coast and inland regions. Furthermore, statistical modelling yielded high sensitivity and specificity, principally when random forest and support vector machines were employed, thus demonstrating the utility of these techniques in food traceability and authenticity research. Copyright © 2018 Elsevier Ltd. All rights reserved.
Enhanced Higgs boson to τ(+)τ(-) search with deep learning.
Baldi, P; Sadowski, P; Whiteson, D
2015-03-20
The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.
NASA Astrophysics Data System (ADS)
Hapca, Simona
2015-04-01
Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that traditionally are developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, for the distribution of chemicals in soil the existing methods, based on scanning electron microscopy (SEM) and energy-dispersive X-ray detection (EDX), allow for characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking; sequential sectioning of soils, followed by 2D mapping of chemical elements and interpolation to 3D, is an alternative explored in this study. Specifically, we develop an integrated experimental and theoretical framework which combines the 3D X-ray CT imaging technique with 2D SEM-EDX and use spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: (1) scanning a resin-impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX; (2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube; and (3) development of spatial statistics methods to predict the chemical composition of 3D soil based on the observed 2D chemical and 3D physical data. Three statistical models, consisting of a regression tree, a regression-tree-kriging model and a cokriging model, were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing a good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Due to the spatial correlation between the physical and chemical data, the regression-tree model showed great potential in predicting chemical composition, in particular for iron, which is generally sparsely distributed in soil. For carbon, silicon and oxygen, which are more densely distributed, the additional kriging of the regression tree residuals improved the prediction significantly, whereas prediction based on cokriging was less consistent across replicates, underperforming regression-tree kriging. The present study shows great potential in integrating geostatistical methods with imaging techniques to unveil the 3D chemical structure of soil at very fine scales, the framework being suitable for further application to other types of imaging data, such as images of biological thin sections for characterization of microbial distribution. Key words: X-ray CT, SEM-EDX, segmentation techniques, spatial correlation, 3D soil images, 2D chemical maps.
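A minimal sketch of the regression-tree-kriging idea on simulated voxels: a tree predicts the element from grayscale intensity, and the spatially structured residuals are interpolated with Gaussian process regression, the machine-learning formulation of kriging; all data and hyperparameters are illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(12)
coords = rng.uniform(0, 1, size=(400, 3))     # voxel 3-D coordinates
gray = rng.uniform(size=(400, 1))             # X-ray grayscale intensity
carbon = 2.0 * gray[:, 0] + np.sin(4 * coords[:, 0]) + rng.normal(0, 0.1, 400)

tree = DecisionTreeRegressor(max_depth=4).fit(gray, carbon)
resid = carbon - tree.predict(gray)           # spatially structured residuals

gp = GaussianProcessRegressor(RBF(0.2) + WhiteKernel(0.01)).fit(coords, resid)
pred = tree.predict(gray) + gp.predict(coords)
print("RMSE:", np.sqrt(np.mean((pred - carbon) ** 2)))
```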
Automated Identification and Shape Analysis of Chorus Elements in the Van Allen Radiation Belts
NASA Astrophysics Data System (ADS)
Sen Gupta, Ananya; Kletzing, Craig; Howk, Robin; Kurth, William; Matheny, Morgan
2017-12-01
An important goal of the Van Allen Probes mission is to understand wave-particle interaction by chorus emissions in the terrestrial Van Allen radiation belts. To test models, statistical characterization of chorus properties, such as amplitude variation and sweep rates, is an important scientific goal. The Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrumentation suite provides measurements of wave electric and magnetic fields as well as DC magnetic fields for the Van Allen Probes mission. However, manual inspection across terabytes of EMFISIS data is not feasible and is prone to human confirmation bias. We present signal processing techniques for automated identification, shape analysis, and sweep rate characterization of high-amplitude whistler-mode chorus elements in the Van Allen radiation belts. Specifically, we develop signal processing techniques based on the Radon transform that disambiguate chorus elements with a dominant sweep rate from hiss-like chorus. We present representative results validating our techniques and also provide statistical characterization of detected chorus elements across a case study of a 6 s epoch.
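The core of the Radon-based disambiguation is that a chirp with a dominant sweep rate maps to a compact peak in projection angle, while hiss-like emission does not. A hedged sketch with a synthetic spectrogram standing in for EMFISIS data:

```python
# Sketch: locate the dominant sweep rate of a chirp-like ridge in a
# time-frequency image with the Radon transform. Conversion of the peak
# angle into a physical sweep rate depends on the axis scaling and
# should be calibrated on known chirps.
import numpy as np
from skimage.transform import radon

n = 128
spec = np.zeros((n, n))                 # rows = frequency, cols = time
for t in range(n):                      # rising tone, slope 0.5 bins/bin
    f = 20 + t // 2
    if f < n:
        spec[f, t] = 1.0

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(spec, theta=theta, circle=False)
# A straight ridge concentrates energy at one projection angle, so the
# variance across projection offsets peaks there; a diffuse, hiss-like
# patch shows no such peak, which disambiguates the two.
peak = theta[np.argmax(sinogram.var(axis=0))]
print(f"dominant projection angle: {peak:.1f} degrees")
```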
Lotfy, Hayam Mahmoud; Salem, Hesham; Abdelkawy, Mohammad; Samir, Ahmed
2015-04-05
Five spectrophotometric methods were successfully developed and validated for the determination of betamethasone valerate and fusidic acid in their binary mixture. These methods are the isoabsorptive point method combined with the first derivative (ISO point-D1); the recently developed and well-established ratio difference (RD) and constant center coupled with spectrum subtraction (CC) methods; the derivative ratio (1DD) method; and mean centering of ratio spectra (MCR). A new enrichment technique, spectrum addition, was used instead of the traditional spiking technique. The proposed spectrophotometric procedures do not require any separation steps. Accuracy, precision and linearity ranges of the proposed methods were determined, and specificity was assessed by analyzing synthetic mixtures of both drugs. The methods were applied to the drugs' pharmaceutical formulation, and the results obtained were statistically compared to those of the official methods. The statistical comparison showed no significant difference between the proposed methods and the official ones regarding either accuracy or precision. Copyright © 2015 Elsevier B.V. All rights reserved.
Rahman, Md Mahmudur; Bhattacharya, Prabir; Desai, Bipin C
2007-01-01
A content-based image retrieval (CBIR) framework is proposed for a diverse collection of medical images spanning different imaging modalities, anatomic regions with different orientations, and biological systems. The organization of images in such a database (DB) is well defined with predefined semantic categories; hence, it can be useful for category-specific searching. The proposed framework consists of machine learning methods for image prefiltering, similarity matching using statistical distance measures, and a relevance feedback (RF) scheme. To narrow the semantic gap and increase retrieval efficiency, we investigate both supervised and unsupervised learning techniques to associate low-level global image features (e.g., color, texture, and edge) in the projected PCA-based eigenspace with their high-level semantic and visual categories. Specifically, we explore the use of a probabilistic multiclass support vector machine (SVM) and fuzzy c-means (FCM) clustering for categorization and prefiltering of images to reduce the search space. A category-specific statistical similarity matching is then performed at a finer level on the prefiltered images. To better incorporate perceptual subjectivity, an RF mechanism is also added to update the query parameters dynamically and adjust the proposed matching functions. Experiments are based on a ground-truth DB consisting of 5000 diverse medical images in 20 predefined categories. Analysis of results based on cross-validation (CV) accuracy and precision-recall for image categorization and retrieval is reported, demonstrating the improvement, effectiveness, and efficiency achieved by the proposed framework.
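A minimal sketch of the two-stage retrieval idea follows: a probabilistic SVM prefilters candidate categories in a PCA eigenspace, and only images from the most probable categories are ranked by distance. The data, feature dimensions and thresholds are toy assumptions, not those of the reported system.

```python
# Sketch of prefiltered retrieval: probabilistic SVM narrows the search
# space to likely categories, then plain Euclidean distance ranks the
# remaining images. Synthetic features stand in for medical images.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=64, n_informative=20,
                           n_classes=5, n_clusters_per_class=1,
                           random_state=0)
pca = PCA(n_components=10).fit(X)
Z = pca.transform(X)

svm = SVC(probability=True, random_state=0).fit(Z, y)

query = pca.transform(X[:1])                 # treat the first image as a query
probs = svm.predict_proba(query)[0]
keep = np.argsort(probs)[-2:]                # two most probable categories
mask = np.isin(y, keep)                      # prefiltered search space

# Distance-based matching; a real system would exclude the query itself.
d = np.linalg.norm(Z[mask] - query, axis=1)
top = np.argsort(d)[:5]
print("searched", mask.sum(), "of", len(y), "images; top-5 dists:", d[top])
```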
Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Lauzon, N.; Lence, B. J.
2002-12-01
This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
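A compact fuzzy c-means implementation suffices to illustrate the flagging idea; in the sketch below, synthetic two-regime data stand in for hydrometric records, and points far from every cluster center are marked for review (an assumption of this sketch, not the exact criterion of the study).

```python
# Minimal fuzzy c-means (fuzzifier m = 2) with a simple anomaly flag:
# points far from every cluster center are marked for review.
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))            # memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.maximum(np.linalg.norm(X[:, None] - centers[None], axis=2),
                       1e-12)
        p = 2.0 / (m - 1.0)
        U = (d ** -p) / (d ** -p).sum(axis=1, keepdims=True)
    return centers, U, d

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)),                 # regime 1
               rng.normal(6, 1, (50, 2)),                 # regime 2
               [[3.0, 12.0]]])                            # gross outlier
centers, U, d = fuzzy_cmeans(X)
dmin = d.min(axis=1)                                      # nearest-center dist
flagged = np.where(dmin > np.quantile(dmin, 0.95))[0]     # top 5% for review
print("flagged indices:", flagged)
```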
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To identify systematically 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another.
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
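Of the complex-data methods listed, multilevel modelling is straightforward to sketch: operation time declines with case number, with surgeon-specific random intercepts. The simulated data and statsmodels model below are illustrative stand-ins for the report's analyses.

```python
# Sketch of a multilevel learning-curve model: operation time falls with
# log(case number); surgeons get random intercepts. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for surgeon in range(10):
    base = rng.normal(120, 15)                  # surgeon-specific baseline
    for case in range(1, 41):
        time = base - 25 * np.log(case) / np.log(40) + rng.normal(0, 8)
        rows.append({"surgeon": surgeon, "log_case": np.log(case),
                     "time": time})
df = pd.DataFrame(rows)

model = smf.mixedlm("time ~ log_case", df, groups=df["surgeon"]).fit()
print(model.summary())         # a negative log_case slope indicates learning
```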
MIDAS: Regionally linear multivariate discriminative statistical mapping.
Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos
2018-07-01
Statistical parametric maps formed via voxel-wise mass-univariate tests, such as the general linear model, are commonly used to test hypotheses about regionally specific effects in neuroimaging cross-sectional studies where each subject is represented by a single image. Despite being informative, these techniques remain limited as they ignore multivariate relationships in the data. Most importantly, the commonly employed local Gaussian smoothing, which is important for accounting for registration errors and making the data follow Gaussian distributions, is usually chosen in an ad hoc fashion. Thus, it is often suboptimal for the task of detecting group differences and correlations with non-imaging variables. Information mapping techniques, such as searchlight, which use pattern classifiers to exploit multivariate information and obtain more powerful statistical maps, have become increasingly popular in recent years. However, existing methods may lead to important interpretation errors in practice (i.e., misidentifying a cluster as informative, or failing to detect truly informative voxels), while often being computationally expensive. To address these issues, we introduce a novel efficient multivariate statistical framework for cross-sectional studies, termed MIDAS, seeking highly sensitive and specific voxel-wise brain maps, while leveraging the power of regional discriminant analysis. In MIDAS, locally linear discriminative learning is applied to estimate the pattern that best discriminates between two groups, or predicts a variable of interest. This pattern is equivalent to local filtering by an optimal kernel whose coefficients are the weights of the linear discriminant. By composing information from all neighborhoods that contain a given voxel, MIDAS produces a statistic that collectively reflects the contribution of the voxel to the regional classifiers as well as the discriminative power of the classifiers. Critically, MIDAS efficiently assesses the statistical significance of the derived statistic by analytically approximating its null distribution without the need for computationally expensive permutation tests. The proposed framework was extensively validated using simulated atrophy in structural magnetic resonance imaging (MRI) and further tested using data from a task-based functional MRI study as well as a structural MRI study of cognitive performance. The performance of the proposed framework was evaluated against standard voxel-wise general linear models and other information mapping methods. The experimental results showed that MIDAS achieves relatively higher sensitivity and specificity in detecting group differences. Together, our results demonstrate the potential of the proposed approach to efficiently map effects of interest in both structural and functional data. Copyright © 2018. Published by Elsevier Inc.
Transverse mode analysis of optofluidic intracavity spectroscopy of canine hemangiosarcoma
NASA Astrophysics Data System (ADS)
Wang, Weina; Thamm, Douglas H.; Kisker, David W.; Lear, Kevin L.
2010-02-01
The label-free technique of optofluidic intracavity spectroscopy (OFIS) uses the optical transmission spectrum of a cell in a microfluidic optical resonator to distinguish cancerous and non-cancerous cells. Based on their distinctive characteristic transmission spectra, canine hemangiosarcoma (HSA) cancer cells and normal peripheral blood mononuclear cells (PBMCs) have been differentiated using the OFIS technique with high statistical significance (p < 10^-6). Sensitivity of 95% and specificity of 98% were achieved simultaneously. A cell lens model explains trends in the transverse mode pattern in the transmission spectra of HSA cells and allows extraction of cell focal length.
Methods and application of system identification in shock and vibration.
NASA Technical Reports Server (NTRS)
Collins, J. D.; Young, J. P.; Kiefling, L.
1972-01-01
A logical picture is presented of current useful system identification techniques in the shock and vibration field. A technology tree diagram is developed for the purpose of organizing and categorizing the widely varying approaches according to the fundamental nature of each. Specific examples of accomplished activity for each identification category are noted and discussed. To provide greater insight into the most current trends in the system identification field, a somewhat detailed description is presented of the essential features of a recently developed technique that is based on making the maximum use of all statistically known information about a system.
Hu, Xiangdong; Liu, Yujiang; Qian, Linxue
2017-01-01
Abstract Background: Real-time elastography (RTE) and shear wave elastography (SWE) are noninvasive and readily available imaging techniques that measure tissue strain, and it has been reported that the sensitivity and specificity of elastography are better than those of conventional technologies for differentiating between benign and malignant thyroid nodules. Methods: Relevant articles were searched in multiple databases; the comparison of the elasticity index (EI) was conducted with Review Manager 5.0. Forest plots of sensitivity and specificity and the SROC curves of RTE and SWE were produced with STATA 10.0 software. In addition, sensitivity analysis and bias analysis of the studies were conducted to examine the quality of the articles; to estimate possible publication bias, a funnel plot was used and the Egger test was conducted. Results: Twenty-two articles satisfying the inclusion criteria were included in this study. After exclusion of ineligible cases, 2106 benign and 613 malignant nodules remained. The meta-analysis suggested that the difference in EI between benign and malignant nodules was statistically significant (SMD = 2.11, 95% CI [1.67, 2.55], P < .00001). The overall sensitivities of RTE and SWE were roughly comparable, whereas the difference in specificities between these 2 methods was statistically significant. In addition, a statistically significant difference in AUC was observed between RTE and SWE (P < .01). Conclusion: The specificity of RTE was statistically higher than that of SWE, which suggests that, compared with SWE, RTE may be more accurate in differentiating benign and malignant thyroid nodules. PMID:29068996
Vapor Hydrogen Peroxide as Alternative to Dry Heat Microbial Reduction
NASA Technical Reports Server (NTRS)
Cash, Howard A.; Kern, Roger G.; Chung, Shirley Y.; Koukol, Robert C.; Barengoltz, Jack B.
2006-01-01
The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide (VHP) sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPG 8020.12C as a low-temperature technique complementary to the dry heat sterilization process. A series of experiments was conducted in vacuum to determine VHP process parameters that provided significant reductions in spore viability while allowing survival of sufficient spores for statistically significant enumeration. With this knowledge of D values, sensible margins can be applied in a planetary protection specification. The outcome of this study was an optimization of test sterilizer process conditions: VHP concentration, process duration, a process temperature range for which the worst-case D value may be imposed, a process humidity range for which the worst-case D value may be imposed, and robustness to selected spacecraft material substrates.
ERIC Educational Resources Information Center
Haberman, Shelby J.; Lee, Yi-Hsuan
2017-01-01
In investigations of unusual testing behavior, a common question is whether a specific pattern of responses occurs unusually often within a group of examinees. In many current tests, modern communication techniques can permit quite large numbers of examinees to share keys, or common response patterns, to the entire test. To address this issue,…
The application of automatic recognition techniques in the Apollo 9 SO-65 experiment
NASA Technical Reports Server (NTRS)
Macdonald, R. B.
1970-01-01
A synoptic feature analysis is reported on Apollo 9 remote earth surface photographs that uses the methods of statistical pattern recognition to classify density points and clusterings in digital conversion of optical data. A computer derived geological map of a geological test site indicates that geological features of the range are separable, but that specific rock types are not identifiable.
ERIC Educational Resources Information Center
Moore, Andrea Lisa
2013-01-01
Toxic Release Inventory facilities are among the many environmental hazards shown to create environmental inequities in the United States. This project examined four factors associated with Toxic Release Inventory, specifically, manufacturing facility location at multiple spatial scales using spatial analysis techniques (i.e., O-ring statistic and…
Comparison analysis for classification algorithm in data mining and the study of model use
NASA Astrophysics Data System (ADS)
Chen, Junde; Zhang, Defu
2018-04-01
As a key technique in data mining, classification algorithms have received extensive attention. Through experiments with classification algorithms on UCI data sets, we present a comparative analysis method for the different algorithms, supported by statistical testing. Beyond that, an adaptive diagnosis model for the prevention of electricity stealing and leakage is presented as a specific case study.
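A minimal version of such a comparison, assuming sklearn's bundled breast-cancer data as a stand-in for the UCI sets, cross-validates two classifiers on identical folds and applies a paired t-test to the fold scores (with the caveat that treating dependent folds as paired observations is an approximation):

```python
# Sketch: compare two classifiers on identical CV folds, then test
# whether the fold-wise accuracy differences are significant.
from scipy.stats import ttest_rel
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
cv = KFold(n_splits=10, shuffle=True, random_state=0)   # same folds for both

a = cross_val_score(make_pipeline(StandardScaler(),
                                  LogisticRegression(max_iter=1000)),
                    X, y, cv=cv)
b = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=cv)

t, p = ttest_rel(a, b)
print(f"mean acc {a.mean():.3f} vs {b.mean():.3f}, paired t-test p = {p:.3f}")
```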
Measurement of the relationship between perceived and computed color differences
NASA Astrophysics Data System (ADS)
García, Pedro A.; Huertas, Rafael; Melgosa, Manuel; Cui, Guihua
2007-07-01
Using simulated data sets, we have analyzed some mathematical properties of different statistical measurements that have been employed in previous literature to test the performance of different color-difference formulas. Specifically, the properties of the combined index PF/3 (performance factor obtained as average of three terms), widely employed in current literature, have been considered. A new index named standardized residual sum of squares (STRESS), employed in multidimensional scaling techniques, is recommended. The main difference between PF/3 and STRESS is that the latter is simpler and allows inferences on the statistical significance of two color-difference formulas with respect to a given set of visual data.
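STRESS has a simple closed form: with computed differences ΔE and visual differences ΔV, the optimal scaling factor is F1 = ΣΔE²/Σ(ΔE·ΔV) and STRESS = 100·√(Σ(ΔE − F1ΔV)² / Σ F1²ΔV²), so that 0 indicates perfect agreement. A small sketch under that standard definition (the data below are made up):

```python
# Closed-form STRESS between computed color differences dE and visual
# differences dV; 0 = perfect agreement, larger = worse.
import numpy as np

def stress(dE, dV):
    dE, dV = np.asarray(dE, float), np.asarray(dV, float)
    F1 = (dE ** 2).sum() / (dE * dV).sum()        # optimal scaling factor
    num = ((dE - F1 * dV) ** 2).sum()
    den = (F1 ** 2 * dV ** 2).sum()
    return 100.0 * np.sqrt(num / den)

dE = np.array([1.2, 0.8, 2.5, 3.1, 0.4])          # formula predictions
dV = np.array([1.0, 1.0, 2.2, 3.5, 0.5])          # visual judgements
print(f"STRESS = {stress(dE, dV):.1f}")
```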
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
NASA Astrophysics Data System (ADS)
Boichuk, T. M.; Bachinskiy, V. T.; Vanchuliak, O. Ya.; Minzer, O. P.; Garazdiuk, M.; Motrich, A. V.
2014-08-01
This research presents the results of an investigation of laser-polarization fluorescence in biological layers (histological sections of the myocardium). The polarized structure of autofluorescence images of biological tissue layers was detected and investigated. A model describing the formation of polarization-inhomogeneous autofluorescence images of optically anisotropic biological layers is proposed. On this basis, the method of laser autofluorescence polarimetry is justified analytically and tested experimentally. The effectiveness of the method in the postmortem diagnosis of myocardial infarction is analyzed. Objective criteria (statistical moments) for differentiating autofluorescence images of histological sections of the myocardium were defined, and the operational characteristics (sensitivity, specificity, accuracy) of the technique were determined.
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao
2003-01-22
A novel computational technique for modeling crystal formation has been developed that combines three-dimensional (3-D) molecular representation and detailed energetics calculations of molecular mechanics techniques with the less-sophisticated probabilistic approach used by statistical techniques to study systems containing millions of molecules undergoing billions of interactions. Because our model incorporates both the structure of and the interaction energies between participating molecules, it enables the 3-D shape and surface properties of these molecules to directly affect crystal formation. This increase in model complexity has been achieved while simultaneously increasing the number of molecules in simulations by several orders of magnitude over previous statistical models. We have applied this technique to study the inhibitory effects of antifreeze proteins (AFPs) on ice-crystal formation. Modeling involving both fish and insect AFPs has produced results consistent with experimental observations, including the replication of ice-etching patterns, ice-growth inhibition, and specific AFP-induced ice morphologies. Our work suggests that the degree of AFP activity results more from AFP ice-binding orientation than from AFP ice-binding strength. This technique could readily be adapted to study other crystal and crystal inhibitor systems, or to study other noncrystal systems that exhibit regularity in the structuring of their component molecules, such as those associated with the new nanotechnologies.
Schramm, Jesper; Andersen, Morten; Vach, Kirstin; Kragstrup, Jakob; Peter Kampmann, Jens; Søndergaard, Jens
2007-01-01
Objective: To examine the extent and composition of pharmaceutical industry representatives' marketing techniques, with a particular focus on drug sampling in relation to drug age. Design: A group of 47 GPs prospectively collected data on drug promotional activities during a six-month period, and a sub-sample of 10 GPs also recorded the representatives' marketing techniques in detail. Setting: Primary healthcare. Subjects: General practitioners in the County of Funen, Denmark. Main outcome measures: Promotional visits and corresponding marketing techniques. Results: The 47 GPs recorded 1050 visits, corresponding to a median of 19 (range 3 to 63) per GP over the six months. The majority of drugs promoted (52%) had been marketed more than five years earlier. There was a statistically significant decline with drug age in the proportion of visits where drug samples were offered, but the decline was small (OR 0.97, 95% CI 0.95 to 0.98, per year). Leaflets (68%), suggestions on how to improve therapy for a specific patient registered with the practice (53%), drug samples (48%), and gifts (36%) were the most frequently used marketing techniques. Conclusion: Drug-industry representatives use a variety of promotional methods. The tendency to hand out drug samples was statistically significantly associated with drug age, but the decline was small. PMID:17497486
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, X. Sharon, E-mail: xqi@mednet.ucla.edu; Ruan, Dan; Lee, Steve P.
2015-03-15
Purpose: To develop a practical workflow for retrospectively analyzing target and normal tissue dose–volume endpoints for various intensity modulated radiation therapy (IMRT) delivery techniques; to develop technique-specific planning goals to improve plan consistency and quality when feasible. Methods and Materials: A total of 165 consecutive head-and-neck patients from our patient registry were selected and retrospectively analyzed. All IMRT plans were generated using the same dose–volume guidelines for TomoTherapy (Tomo, Accuray), TrueBeam (TB, Varian) using fixed-field IMRT (TB-IMRT) or RAPIDARC (TB-RAPIDARC), or Siemens Oncor (Siemens-IMRT, Siemens). A MATLAB-based dose–volume extraction and analysis tool was developed to export dosimetric endpoints for each patient. With a fair stratification of the patient cohort, the variation of achieved dosimetric endpoints was analyzed among different treatment techniques. Upon identification of statistically significant variations, technique-specific planning goals were derived from dynamically accumulated institutional data. Results: Retrospective analysis showed that although all techniques yielded comparable target coverage, the doses to the critical structures differed. The maximum cord doses were 34.1 ± 2.6, 42.7 ± 2.1, 43.3 ± 2.0, and 45.1 ± 1.6 Gy for Tomo, TB-IMRT, TB-RAPIDARC, and Siemens-IMRT plans, respectively. Analyses of variance showed significant differences for the maximum cord doses but no significant differences for other selected structures among the investigated IMRT delivery techniques. Subsequently, a refined technique-specific dose–volume guideline for maximum cord dose was derived at a confidence level of 95%. The dosimetric plans that failed the refined technique-specific planning goals were reoptimized according to the refined constraints. We observed better cord sparing with minimal variations in target coverage and other organ-at-risk sparing for the Tomo cases, and higher parotid doses for C-arm linear accelerator-based IMRT and RAPIDARC plans. Conclusion: Patient registry-based processes allowed easy and systematic dosimetric assessment of treatment plan quality and consistency. Our analysis revealed the dependence of certain dosimetric endpoints on the treatment techniques. Technique-specific refinement of planning goals may lead to improvement in plan consistency and plan quality.
A review of approaches to identifying patient phenotype cohorts using electronic health records
Shivade, Chaitanya; Raghavan, Preethi; Fosler-Lussier, Eric; Embi, Peter J; Elhadad, Noemie; Johnson, Stephen B; Lai, Albert M
2014-01-01
Objective To summarize literature describing approaches aimed at automatically identifying patients with a common phenotype. Materials and methods We performed a review of studies describing systems or reporting techniques developed for identifying cohorts of patients with specific phenotypes. Every full text article published in (1) Journal of American Medical Informatics Association, (2) Journal of Biomedical Informatics, (3) Proceedings of the Annual American Medical Informatics Association Symposium, and (4) Proceedings of Clinical Research Informatics Conference within the past 3 years was assessed for inclusion in the review. Only articles using automated techniques were included. Results Ninety-seven articles met our inclusion criteria. Forty-six used natural language processing (NLP)-based techniques, 24 described rule-based systems, 41 used statistical analyses, data mining, or machine learning techniques, while 22 described hybrid systems. Nine articles described the architecture of large-scale systems developed for determining cohort eligibility of patients. Discussion We observe that there is a rise in the number of studies associated with cohort identification using electronic medical records. Statistical analyses or machine learning, followed by NLP techniques, are gaining popularity over the years in comparison with rule-based systems. Conclusions There are a variety of approaches for classifying patients into a particular phenotype. Different techniques and data sources are used, and good performance is reported on datasets at respective institutions. However, no system makes comprehensive use of electronic medical records addressing all of their known weaknesses. PMID:24201027
Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.
McIntosh, Chris; Hamarneh, Ghassan
2012-01-01
We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optima, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and to globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Du, X.
2008-12-01
The Kaskaskia River basin contains 136,000 acres of bottomland forest, the largest contiguous tract of bottomland forest remaining in the state of Illinois. Since the 1960s, the Carlyle Lake Dam impoundment and channelization activities have altered the natural hydrologic and ecological equilibrium of the Kaskaskia River. Morphological changes of the river channel have necessitated conservation and restoration efforts to create and maintain the sustainability, diversity, health, and connectivity of the river watershed. This study utilized the specific gage technique and historical aerial photographs to investigate the spatial and temporal changes of the river. Historical daily discharge and daily stage data from the Carlyle (1966 to 2002) and Venedy Station (1984 to 2003) gages were analyzed. Logs of daily discharge data were used to generate annual rating curves, and best-fit equations were produced from the annual rating regressions. A stage associated with a chosen reference discharge, the minimum available discharge (MAD), was calculated; a decreasing or increasing trend in this stage was used as the primary indicator of channel bed incision or aggradation. Pseudo specific gage analysis (PSGA) was used to model changes in channel cross-sectional geometry over time. PSGA applies procedures similar to those of the specific gage technique but, instead of the stage variable, uses cross-sectional width, cross-sectional area, mean velocity and gage height individually. At each gage, the historical change of each cross-sectional parameter was plotted against the log of discharge. Ratings of specific stage, specific cross-sectional width, specific depth, specific area, and specific velocity associated with the chosen discharge, MAD, were produced. The decreasing or increasing trend of each parameter corresponded to changes in channel cross-sectional geometry over time. Historical aerial photographs were also used to assess bankfull channel width change rates during the pre- and post-modification periods. The statistical significance of the regression trendlines from the specific gage analysis and PSGA was tested. Results suggested that there was no significant channel bed incision trend near the river gages within the studied time period. A statistically significant increase in channel width change rates was found during the post-modification period. Following the channelization and dam construction on the Kaskaskia River, substantial channel widening has accelerated bank erosion and associated channel morphology change, which has consequently resulted in a net loss of riparian habitat in this important bottomland forest corridor in southern Illinois, USA.
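The specific gage computation itself is compact: fit each year's rating curve, evaluate it at a fixed reference discharge, and test the trend of the resulting stages. The sketch below uses simulated records as stand-ins for the Carlyle and Venedy Station files.

```python
# Sketch of the specific gage technique: an annual rating curve
# stage = a + b*log10(Q) is fit per year, evaluated at a fixed reference
# discharge, and the trend of that "specific stage" is tested over time.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
Q_ref = 500.0                                     # reference discharge
years = np.arange(1966, 2003)
specific_stage = []
for i, yr in enumerate(years):
    Q = rng.lognormal(6, 1, 365)                  # synthetic daily discharge
    stage = 2.0 + 1.5 * np.log10(Q) - 0.01 * i + rng.normal(0, 0.1, 365)
    fit = linregress(np.log10(Q), stage)          # annual rating curve
    specific_stage.append(fit.intercept + fit.slope * np.log10(Q_ref))

trend = linregress(years, specific_stage)         # incision/aggradation test
print(f"specific-stage trend = {trend.slope:.4f} per year, "
      f"p = {trend.pvalue:.2g}")
```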
ERIC Educational Resources Information Center
Li, Yuan H.; Yang, Yu N.; Tompkins, Leroy J.; Modarresi, Shahpar
2005-01-01
The statistical technique, "Zero-One Linear Programming," that has successfully been used to create multiple tests with similar characteristics (e.g., item difficulties, test information and test specifications) in the area of educational measurement, was deemed to be a suitable method for creating multiple sets of matched samples to be…
Kamath, Padmaja; Fernandez, Alberto; Giralt, Francesc; Rallo, Robert
2015-01-01
Nanoparticles are likely to interact in real-case application scenarios with mixtures of proteins and biomolecules that will adsorb onto their surface, forming the so-called protein corona. Information related to the composition of the protein corona and net cell association was collected from the literature for a library of surface-modified gold and silver nanoparticles. For each protein in the corona, sequence information was extracted and used to calculate physicochemical properties and statistical descriptors. Data cleaning and preprocessing techniques, including statistical analysis and feature selection methods, were applied to remove highly correlated, redundant and non-significant features. A weighting technique was applied to construct specific signatures that represent the corona composition for each nanoparticle. Using this basic set of protein descriptors, a new Protein Corona Structure-Activity Relationship (PCSAR) that relates net cell association with the physicochemical descriptors of the proteins that form the corona was developed and validated. The features that resulted from the feature selection were in line with already published literature, and the computational model constructed on these features had a good accuracy (R²(LOO) = 0.76 and R²(LMO,25%) = 0.72) and stability, with the advantage that the fingerprints based on physicochemical descriptors were independent of the specific proteins that form the corona.
Outcomes of role stress: a multisample constructive replication.
Kemery, E R; Bedeian, A G; Mossholder, K W; Touliatos, J
1985-06-01
Responses from four separate samples of accountants and hospital employees provided a constructive replication of the Bedeian and Armenakis (1981) model of the causal nexus between role stress and selected outcome variables. We investigated the relationship between both role ambiguity and role conflict--as specific forms of role stress--and job-related tension, job satisfaction, and propensity to leave, using LISREL IV, a technique capable of providing statistical data for a hypothesized population model, as well as for specific causal paths. Results, which support the Bedeian and Armenakis model, are discussed in light of previous research.
Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
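The Kaplan-Meier treatment of left-censored data rests on Helsel's flipping trick: subtracting each value from a constant turns censoring below into censoring above, where the standard estimator applies. A hedged sketch follows (the mean is restricted to the range of detected values, as is usual for KM):

```python
# Minimal Kaplan-Meier estimate of the mean for left-censored data via
# the "flipping" trick: left-censored values become right-censored after
# subtraction from a constant.
import numpy as np

def km_mean_left_censored(values, below_dl):
    values = np.asarray(values, float)
    below_dl = np.asarray(below_dl, bool)       # True = "<detection limit"
    M = values.max()                            # flipping constant
    t = M - values                              # flipped "survival times"
    order = np.argsort(t)
    t, cens = t[order], below_dl[order]
    S, area, prev = 1.0, 0.0, 0.0
    for i in range(len(t)):
        if not cens[i]:                         # a detected value = "event"
            area += S * (t[i] - prev)           # accumulate integral of S(t)
            S *= 1.0 - 1.0 / (len(t) - i)       # KM step; risk set = n - i
            prev = t[i]
    return M - area                             # unflip to the original scale

conc = np.array([0.5, 0.5, 1.2, 2.0, 3.5, 0.8, 4.1])   # e.g. mg/L
cens = np.array([True, True, False, False, False, True, False])
print(f"KM mean = {km_mean_left_censored(conc, cens):.2f}")
```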
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
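As a worked example of the diagnostic-test concepts reviewed, the sketch below computes sensitivity, specificity, likelihood ratios and ROC AUC from made-up test scores:

```python
# Worked example: diagnostic-test statistics from synthetic scores,
# with diseased subjects scoring higher on average than healthy ones.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
disease = np.r_[np.ones(100), np.zeros(400)].astype(int)
score = np.where(disease == 1,
                 rng.normal(2, 1, 500),          # diseased
                 rng.normal(0, 1, 500))          # healthy
positive = (score > 1.0).astype(int)             # choose a test threshold

tn, fp, fn, tp = confusion_matrix(disease, positive).ravel()
sens, spec = tp / (tp + fn), tn / (tn + fp)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")
print(f"LR+ {sens / (1 - spec):.1f}, LR- {(1 - sens) / spec:.2f}")
print(f"AUC {roc_auc_score(disease, score):.2f}")
```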
Gounder, Revathy; Vikas, B. V. J.
2016-01-01
Objective: To evaluate and compare the effect of 0.5% chlorhexidine gluconate, 1% sodium hypochlorite, and 2% glutaraldehyde, applied by immersion and by spray atomization, on the linear dimensional stability of Jet bite, Aluwax and Ramitec interocclusal recording materials. Materials and Methods: Three representative materials, Jet bite (addition silicone), Aluwax and Ramitec (polyether), were mixed according to the manufacturer's instructions, and specimens were prepared according to the specifications of ISO 4823. All specimens except the controls (distilled water) were treated with the disinfectant solutions (0.5% chlorhexidine gluconate, 1% sodium hypochlorite, and 2% glutaraldehyde) for 30 and 60 min (n = 10) by the spray and immersion techniques. Once removed from the solutions, the test samples were washed in water for 15 s, dried, and measured 3 times after 24 h using a measuring microscope with an accuracy of 0.0001 mm. Two-way ANOVA and Tukey's test were used to assess the data (α = 0.05). Results: No group showed a statistically significant difference in linear dimension when disinfected for 30 min by the spray or immersion technique. Polyether had significantly higher dimensional variation when immersed in sodium hypochlorite for 60 min. Addition silicone showed the least dimensional change, ranging from 0.024% to 0.05%, followed by polyether from 0.004% to 0.171% and Aluwax from 0.146% to 0.228%. Conclusion: To preserve the dimensions and surface of the recording materials and to achieve effective microbial elimination, restrictions should be applied to the method of disinfection and its duration. However, with the disinfectants applied by either the spray or immersion technique, the dimensional change was <0.5%, which is not clinically significant according to the American Dental Association specification no. 19 criteria within the first 24 h. PMID:27011733
Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup
2010-10-01
We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis technique. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
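A schematic version of the statistical pipeline, with simulated specific rates standing in for the measured consumption and production profiles, runs PCA for cluster structure and PLS regression for growth correlations:

```python
# Sketch of the multivariate analysis stage: PCA exposes clusters of
# amino-acid consumption profiles; PLS regression relates the profiles
# to a response such as specific growth rate. Rates are simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
rates = rng.normal(size=(20, 12))                 # 20 cultures x 12 amino acids
growth = rates[:, :3].sum(axis=1) + rng.normal(0, 0.5, 20)

scores = PCA(n_components=2).fit_transform(rates) # cluster structure in 2D
pls = PLSRegression(n_components=2).fit(rates, growth)
# Large |coefficient| marks amino acids most correlated with growth.
print("PC scores shape:", scores.shape)
print("PLS coefficients:", pls.coef_.ravel().round(2))
```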
NASA Technical Reports Server (NTRS)
Navard, Sharon E.
1989-01-01
In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
Efficient solid rocket propulsion for access to space
NASA Astrophysics Data System (ADS)
Maggi, Filippo; Bandera, Alessio; Galfetti, Luciano; De Luca, Luigi T.; Jackson, Thomas L.
2010-06-01
Space launch activity is expected to grow in the next few years, following the current trend of space exploitation for business purposes. Providing high specific thrust and volumetric specific impulse, and counting on decades of intense development, solid rocket propulsion is a good candidate for commercial access to space, even with common propellant formulations. Yet some drawbacks, such as low theoretical specific impulse, performance losses, and safety issues, call for more efficient propulsion systems, obtained by enhancing consolidated techniques. Focusing on delivered specific impulse, a considerable fraction of losses can be ascribed to the multiphase flow inside the nozzle which, in turn, is related to agglomeration; a reduction of agglomerate size is therefore desirable. The present paper proposes a model, based on heterogeneity characterization, capable of describing the agglomeration trend for a standard aluminized solid propellant formulation. The material microstructure is characterized through two statistical descriptors (the pair correlation function and near-contact particles) that capture the mean metal pocket size inside the bulk. Given the real formulation and density of a propellant, a packing code generates a representative sample of the material, which is then statistically analyzed. Agglomerate predictions are successfully compared with experimental data at 5 bar for four different formulations.
Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás
2015-01-01
Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. To validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical treatment of the data then enabled the design of specific artificial neural network-based mathematical models for each of the studied sugars and their combinations. The obtained prediction models are robust, reliable, and statistically valid (CCR% > 93.443%). These results establish this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for monitoring enzymatic hydrolysis. PMID:26378537
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng Guoyan
2010-04-15
Purpose: The aim of this article is to investigate the feasibility of using a statistical shape model (SSM)-based reconstruction technique to derive a scaled, patient-specific surface model of the pelvis from a single standard anteroposterior (AP) x-ray radiograph, and the feasibility of estimating the scale of the reconstructed surface model by performing a surface-based 3D/3D matching. Methods: Data sets of 14 pelvises (one plastic bone, 12 cadavers, and one patient) were used to validate the single-image-based reconstruction technique. This reconstruction technique is based on a hybrid 2D/3D deformable registration process combining a landmark-to-ray registration with an SSM-based 2D/3D reconstruction. The landmark-to-ray registration was used to find an initial scale and an initial rigid transformation between the x-ray image and the SSM. The estimated scale and rigid transformation were used to initialize the SSM-based 2D/3D reconstruction. The optimal reconstruction was then achieved in three stages by iteratively matching the projections of the apparent contours extracted from a 3D model derived from the SSM to the image contours extracted from the x-ray radiograph: iterative affine registration, statistical instantiation, and iterative regularized shape deformation. The image contours are first detected by using a semiautomatic segmentation tool based on the Livewire algorithm and then approximated by a set of sparse dominant points that are adaptively sampled from the detected contours. The unknown scales of the reconstructed models were estimated by performing a surface-based 3D/3D matching between the reconstructed models and the associated ground-truth models that were derived from a CT-based reconstruction method. Such a matching also allowed for computing the errors between the reconstructed models and the associated ground-truth models. Results: The technique could reconstruct the surface models of all 14 pelvises directly from the landmark-based initialization. Depending on the surface-based matching technique, the reconstruction errors differed slightly. When a surface-based iterative affine registration was used, an average reconstruction error of 1.6 mm was observed. This error increased to 1.9 mm when a surface-based iterative scaled rigid registration was used. Conclusions: It is feasible to reconstruct a scaled, patient-specific surface model of the pelvis from a single standard AP x-ray radiograph using the present approach. The unknown scale of the reconstructed model can be estimated by performing a surface-based 3D/3D matching.
Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.
Annis, S-L; Zeng, G; Wu, X; Macpherson, M
2012-07-01
A program has been developed in MATLAB for use in quality assurance of radiation therapy treatment planning. It analyzes patient DVH files and compiles dose-volume data for review, trending, comparison, and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (prostate, prostate bed, lung, and upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient's DVH file and reports the user-specified dose-volume data for organs and targets. These data can be compiled into an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select clinical sites, functions, and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically, and has generated dose-volume statistics for different groups of patients associated with a technique or time range. This program allows reporting and statistical analysis of DVH files and is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
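As an illustration of the kind of dose-volume query such a tool performs, here is a minimal Python sketch that reads user-specified metrics off a cumulative DVH given as dose and volume arrays. The function names and the toy DVH are assumptions for illustration, not the authors' MATLAB code.

import numpy as np

def volume_at_dose(dose, volume, d):
    """V(d): percent volume receiving at least dose d, from a cumulative DVH."""
    return float(np.interp(d, dose, volume))

def dose_at_volume(dose, volume, v):
    """D(v): minimum dose received by the hottest v% of the structure volume."""
    # cumulative DVH volume decreases with dose, so reverse both for np.interp
    return float(np.interp(v, volume[::-1], dose[::-1]))

dose = np.linspace(0.0, 70.0, 71)        # Gy
volume = 100.0 * np.exp(-dose / 40.0)    # toy cumulative DVH, % volume
print(volume_at_dose(dose, volume, 20.0))   # V20, ~60.7%
print(dose_at_volume(dose, volume, 95.0))   # D95, ~2.1 Gy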
Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.
Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan
2015-01-01
The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis of the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis of the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long run with certain typical research paradigms and methodologies. This finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
Protocol Design Challenges in the Detection of Awareness in Aware Subjects Using EEG Signals.
Henriques, J; Gabriel, D; Grigoryeva, L; Haffen, E; Moulin, T; Aubry, R; Pazart, L; Ortega, J-P
2016-10-01
Recent studies have documented serious difficulties in detecting covert awareness with electroencephalography-based techniques, both in unresponsive patients and in healthy control subjects. This work reproduces the protocol design of two recent mental imagery studies with a larger group comprising 20 healthy volunteers. The main goal is to assess whether modifications in the signal extraction techniques, training-testing/cross-validation routines, and hypotheses evoked in the statistical analysis can provide solutions to the serious difficulties documented in the literature. The lack of robustness in the results advises further search for alternative protocols more suitable for machine learning classification and for better-performing signal treatment techniques. Specific recommendations are made using the findings of this work. © EEG and Clinical Neuroscience Society (ECNS) 2014.
Comparison Of The Performance Of Hybrid Coders Under Different Configurations
NASA Astrophysics Data System (ADS)
Gunasekaran, S.; Raina J., P.
1983-10-01
Picture bandwidth reduction employing DPCM and Orthogonal Transform (OT) coding for TV transmission has been widely discussed in the literature; both techniques have their own advantages and limitations in terms of compression ratio, implementation, sensitivity to picture statistics, and sensitivity to channel noise. Hybrid coding, introduced by Habibi as a cascade of the two techniques, offers excellent performance and proves attractive in retaining the special advantages of both. Interest has recently shifted to hybrid coding, and in the absence of a report on the relative performance specifications of hybrid coders in different configurations, an attempt has been made here to collate this information. Fourier, Hadamard, Slant, Sine, Cosine, and Haar transforms have been considered in the present work.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
NASA Astrophysics Data System (ADS)
Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald
2017-04-01
Seasonal precipitation forecasts are a crucial source of information for early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts with a 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations, with the potential to preserve the signal from the model. The technique also has the advantage that it can easily be implemented at national weather services with low capacities. It is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for early warning of large-scale droughts and floods in West Africa. The evaluation of the statistical technique is done using CFS hindcasts (1982 to 2009) in a cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events depicted over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e., GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy, and skill. In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the possibilities of this technique to complement PRESAO. We also highlight the performance of this technique for extremes such as the Sahel drought of the 1980s, in comparison to the various reference data sets (e.g., CFS2, PRESAO, observational data sets) used in this study.
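A minimal Python sketch of the empirical quantile-quantile transformation described above: each new forecast value is mapped to the observed value sharing its empirical non-exceedance probability in the training period. The 101-point quantile grid and the toy gamma-distributed data are our illustrative choices; a real implementation would treat dry days and distribution tails with more care.

import numpy as np

def quantile_map(train_fcst, train_obs, new_fcst):
    """Empirical quantile-quantile transformation for bias correction."""
    q = np.linspace(0.0, 1.0, 101)
    fcst_q = np.quantile(train_fcst, q)   # forecast climatology quantiles
    obs_q = np.quantile(train_obs, q)     # observed climatology quantiles
    p = np.interp(new_fcst, fcst_q, q)    # non-exceedance prob. of each value
    return np.interp(p, q, obs_q)         # read off the observed quantile

rng = np.random.default_rng(1)
train_fcst = rng.gamma(2.0, 40.0, 500)    # biased model precipitation (mm)
train_obs = rng.gamma(2.0, 30.0, 500)     # station observations (mm)
print(quantile_map(train_fcst, train_obs, np.array([50.0, 150.0])))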
Wang, Guorong; Guo, Ling; Jiang, Bin; Huang, Min; Zhang, Jian; Qin, Ying
2015-01-01
Amplitude changes in the P-wave of intracavitary electrocardiography have been used to assess the tip placement of central venous catheters. This study assessed the sensitivity and specificity of this sign in comparison with standard radiographic techniques for tip location, focusing on factors influencing its clinical utility. Both intracavitary electrocardiography-guided tip location and X-ray positioning were used to verify catheter tip locations in patients undergoing central venous catheter insertion. Intracavitary electrocardiograms from 1119 patients (of a total of 1160 subjects) showed specific amplitude changes in the P-wave. Compared with X-ray positioning, the sensitivity of electrocardiography-guided tip location was 97.3%, with a false negative rate of 2.7%; the specificity was 1, with a false positive rate of zero. Univariate analyses indicated that features including age, gender, height, body weight, and heart rate have no statistically significant influence on P-wave amplitude changes (P>0.05). Multivariate logistic regression revealed that catheter insertion routes (OR = 2.280, P = 0.003) and basal P-wave amplitude (OR = 0.553, P = 0.003) have statistically significant impacts on P-wave amplitude changes. As a reliable indicator of tip location, amplitude change in the P-wave proved to have good sensitivity and excellent specificity, and the zero false positive rate supports the clinical utility of this technique in early recognition of malpositioned tips. Better sensitivity was achieved in the placement of centrally inserted central catheters (CICCs) than of peripherally inserted central catheters (PICCs). In clinical practice, a combination of intracavitary electrocardiography, ultrasonic inspection, and anthropometric measurement would further improve accuracy. PMID:25915758
An ANOVA approach for statistical comparisons of brain networks.
Fraiman, Daniel; Fraiman, Ricardo
2018-03-16
The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool to resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep in the days before the scan is a relevant variable that must be controlled for. Finally, we discuss the potential bias in neuroimaging findings that is generated by certain behavioural and brain structure variables. Our method can also be applied to other kinds of networks, such as protein interaction networks, gene networks, or social networks.
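The flavour of such nonparametric network comparisons can be conveyed with a short permutation-test sketch in Python. The statistic below (Frobenius distance between group-mean adjacency matrices) is a simple stand-in, not the authors' ANOVA statistic; all names and defaults are ours.

import numpy as np

def network_perm_test(nets_a, nets_b, n_perm=2000, seed=0):
    """Permutation test for a group difference between networks.
    nets_a, nets_b: sequences of (n, n) adjacency matrices.
    Statistic: Frobenius distance between the group-mean networks."""
    rng = np.random.default_rng(seed)
    pooled = np.array(list(nets_a) + list(nets_b))
    k = len(nets_a)

    def stat(sample):
        return np.linalg.norm(sample[:k].mean(axis=0) - sample[k:].mean(axis=0))

    observed = stat(pooled)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)               # permute group labels
        if stat(pooled) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)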
SIRU utilization. Volume 1: Theory, development and test evaluation
NASA Technical Reports Server (NTRS)
Musoff, H.
1974-01-01
The theory, development, and test evaluations of the Strapdown Inertial Reference Unit (SIRU) are discussed. The statistical failure detection and isolation, single-position calibration, and self-alignment techniques are emphasized. Circuit diagrams of the system components are provided. Mathematical models are developed to show the performance characteristics of the subsystems. Specific areas of the utilization program are identified as: (1) error source propagation characteristics and (2) local-level navigation performance demonstrations.
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
Patient-specific bronchoscopy visualization through BRDF estimation and disocclusion correction.
Chung, Adrian J; Deligianni, Fani; Shah, Pallav; Wells, Athol; Yang, Guang-Zhong
2006-04-01
This paper presents an image-based method for virtual bronchoscopy with photo-realistic rendering. The technique is based on recovering bidirectional reflectance distribution function (BRDF) parameters in an environment where the choice of viewing positions, directions, and illumination conditions is restricted. Video images of bronchoscopy examinations are combined with patient-specific three-dimensional (3-D) computed tomography data through two-dimensional (2-D)/3-D registration, and shading model parameters are then recovered by exploiting the restricted lighting configurations imposed by the bronchoscope. With the proposed technique, the recovered BRDF is used to predict the expected shading intensity, allowing a texture map independent of lighting conditions to be extracted from each video frame. To correct for disocclusion artefacts, statistical texture synthesis was used to recreate the missing areas. New views not present in the original bronchoscopy video are rendered by evaluating the BRDF with different viewing and illumination parameters. This allows free navigation of the acquired 3-D model with enhanced photo-realism. To assess the practical value of the proposed technique, a detailed visual scoring that involves both real and rendered bronchoscope images is conducted.
Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja
2017-01-01
Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
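To illustrate the significance-evaluation idea on a toy scale, the Python sketch below counts occurrences of one fixed spatio-temporal pattern and compares the count against spike-dithered surrogates, which destroy precise timing while roughly preserving firing rates. This is a deliberately simplified stand-in for SPADE's mining-plus-significance pipeline; the names, tolerance, and dithering scheme are our assumptions.

import numpy as np

def count_pattern(trains, pattern, tol=0.002):
    """Count occurrences of a spatio-temporal pattern in spike trains.
    trains: list of spike-time arrays (s); pattern: [(neuron, lag), ...]
    with the first entry acting as the zero-lag anchor."""
    anchor, _ = pattern[0]
    count = 0
    for t0 in trains[anchor]:
        if all(np.any(np.abs(trains[n] - (t0 + lag)) <= tol)
               for n, lag in pattern[1:]):
            count += 1
    return count

def pattern_pvalue(trains, pattern, n_surr=500, dither=0.02, seed=0):
    """Compare the observed count with counts in spike-dithered surrogates."""
    rng = np.random.default_rng(seed)
    obs = count_pattern(trains, pattern)
    exceed = sum(
        count_pattern([t + rng.uniform(-dither, dither, t.size) for t in trains],
                      pattern) >= obs
        for _ in range(n_surr))
    return obs, (exceed + 1) / (n_surr + 1)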
Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M
2015-10-01
Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli, including both those that match and those that do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples, respectively, as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than to the other films. However, within subjects, lesbian women showed significantly different arousal responses, suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.
Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates
Malone, Brian J.
2017-01-01
Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
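A minimal Python sketch of the two-step cluster-mass thresholding idea, assuming the STRF is a 2-D array and using 4-connected components; the thresholds and the separate handling of excitatory and inhibitory subfields are our illustrative choices, not the authors' exact procedure.

import numpy as np
from scipy import ndimage

def cluster_mass_threshold(strf, gain_thresh, mass_thresh):
    """Two-step STRF cleanup: keep pixels exceeding a gain threshold only if
    they belong to a contiguous cluster whose summed |gain| exceeds the
    cluster-mass threshold. Excitatory (positive) and inhibitory (negative)
    subfields are processed separately."""
    cleaned = np.zeros_like(strf, dtype=float)
    for sign in (1.0, -1.0):
        labels, n = ndimage.label(sign * strf > gain_thresh)
        for k in range(1, n + 1):
            cluster = labels == k
            if np.abs(strf[cluster]).sum() >= mass_thresh:
                cleaned[cluster] = strf[cluster]
    return cleaned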
NASA Astrophysics Data System (ADS)
Liu, Bilan; Qiu, Xing; Zhu, Tong; Tian, Wei; Hu, Rui; Ekholm, Sven; Schifitto, Giovanni; Zhong, Jianhui
2016-03-01
Subject-specific longitudinal DTI study is vital for the investigation of pathological changes in lesions and disease evolution. Spatial Regression Analysis of Diffusion tensor imaging (SPREAD) is a non-parametric permutation-based statistical framework that combines spatial regression and resampling techniques to achieve effective detection of localized longitudinal diffusion changes within the whole brain at the individual level without a priori hypotheses. However, boundary blurring and dislocation limit its sensitivity, especially towards detecting lesions of irregular shapes. In the present study, we propose an improved SPREAD method (dubbed iSPREAD) by incorporating a three-dimensional (3D) nonlinear anisotropic diffusion filtering method, which provides edge-preserving image smoothing through a nonlinear scale space approach. The statistical inference based on iSPREAD was evaluated and compared with the original SPREAD method using both simulated and in vivo human brain data. Results demonstrated that the sensitivity and accuracy of the SPREAD method are improved substantially by adopting nonlinear anisotropic filtering. iSPREAD identifies subject-specific longitudinal changes in the brain with improved sensitivity, accuracy, and enhanced statistical power, especially when the spatial correlation is heterogeneous among neighboring image pixels in DTI.
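The edge-preserving smoothing at the heart of iSPREAD can be illustrated with the classic Perona-Malik nonlinear diffusion scheme, shown here in 2D for brevity (the paper applies a 3D variant). The parameter values and the periodic border handling via np.roll are our simplifications.

import numpy as np

def perona_malik(img, n_iter=30, kappa=0.1, dt=0.15):
    """Edge-preserving (Perona-Malik) nonlinear diffusion in 2D. The
    conductance g = exp(-(grad/kappa)^2) suppresses smoothing across strong
    edges, so boundaries stay sharp while noise inside regions is averaged
    out. Borders are treated as periodic for brevity."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dN = np.roll(u, -1, axis=0) - u     # differences to the 4 neighbours
        dS = np.roll(u, 1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        u = u + dt * sum(np.exp(-(d / kappa) ** 2) * d
                         for d in (dN, dS, dE, dW))
    return u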
Adaptive correction of ensemble forecasts
NASA Astrophysics Data System (ADS)
Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane
2017-04-01
Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; the parameters of the regression equations are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
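A minimal Python sketch of the simplest member of this family: a scalar Kalman filter that sequentially tracks the bias of the ensemble mean under a random-walk state model. This corresponds to the running bias-correction benchmark rather than the full ensemble-aware method, and the noise variances q and r are illustrative.

import numpy as np

def kalman_bias_correction(forecast_means, observations, q=0.01, r=1.0):
    """Sequentially estimate the forecast bias b_t with
        b_t = b_{t-1} + w,  w ~ N(0, q)   (random-walk state)
        y_t = b_t + v,      v ~ N(0, r)   (y_t = forecast - observation)
    Returns the bias estimates; the corrected forecast is forecast - b."""
    b, p = 0.0, 1.0
    biases = []
    for f, o in zip(forecast_means, observations):
        p += q                      # predict step: state uncertainty grows
        y = f - o                   # innovation observed via the error
        k = p / (p + r)             # Kalman gain
        b += k * (y - b)            # update bias estimate
        p *= (1 - k)
        biases.append(b)
    return np.array(biases)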
NASA Astrophysics Data System (ADS)
Guillen, George; Rainey, Gail; Morin, Michelle
2004-04-01
Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated by other hydrological models. OSRAM and many other models produce output matrices of average, maximum, and minimum contact probabilities from oil spills at specific points (rows) to specific landfall or target segments (columns). Analysts and managers are often interested in identifying geographic areas or groups of facilities that would pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, given the potentially large matrices generated by many spill models, this question is difficult to answer without data reduction and visualization methods. In our study we used a multivariate statistical method called cluster analysis to group areas of similar risk based on the distribution of landfall target trajectory probabilities. We also used ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers and statistical and GIS software programmers to collaborate closely to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
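A minimal Python sketch of the post-processing idea: treat each launch point's row of contact probabilities as its risk profile and cluster the rows hierarchically. The matrix below is random stand-in data, and Ward linkage cut into four clusters is an arbitrary illustrative choice, not the study's configuration.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
P = rng.random((30, 12))                  # 30 launch points x 12 landfall targets
P /= P.sum(axis=1, keepdims=True)         # rows as contact-probability profiles

Z = linkage(pdist(P), method='ward')      # hierarchical clustering of profiles
groups = fcluster(Z, t=4, criterion='maxclust')   # cut into four risk groups
print(groups)                             # group label per launch point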
Effect of influenza vaccination on oxidative stress products in breath.
Phillips, Michael; Cataneo, Renee N; Chaturvedi, Anirudh; Danaher, Patrick J; Devadiga, Anantrai; Legendre, David A; Nail, Kim L; Schmitt, Peter; Wai, James
2010-06-01
Viral infections cause increased oxidative stress, so a breath test for oxidative stress biomarkers (alkanes and alkane derivatives) might provide a new tool for early diagnosis. We studied 33 normal healthy human subjects receiving scheduled treatment with live attenuated influenza vaccine (LAIV). Each subject was his or her own control, since they were studied on day 0 prior to vaccination, and then on days 2, 7, and 14 following vaccination. Breath volatile organic compounds (VOCs) were collected with a breath collection apparatus, then analyzed by automated thermal desorption with gas chromatography and mass spectrometry. A Monte Carlo simulation technique identified non-random VOC biomarkers of infection based on their C-statistic values (area under the receiver operating characteristic curve). Treatment with LAIV was followed by non-random changes in the abundance of breath VOCs. 2,8-Dimethylundecane and other alkane derivatives were observed on all days. Conservative multivariate models identified vaccinated subjects on day 2 (C-statistic = 0.82, sensitivity = 63.6%, specificity = 88.5%); day 7 (C-statistic = 0.94, sensitivity = 88.5%, specificity = 92.3%); and day 14 (C-statistic = 0.95, sensitivity = 92.3%, specificity = 92.3%). The altered breath VOCs were not detected in the live attenuated influenza vaccine itself, excluding artifactual contamination. LAIV vaccination in healthy humans elicited a prompt and sustained increase in breath biomarkers of oxidative stress. A breath test for these VOCs could potentially identify humans who are acutely infected with influenza but who have not yet developed clinical symptoms or signs of disease.
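The C-statistic reported above is the area under the ROC curve, which can be computed directly from the scores of infected and non-infected samples. A minimal Python sketch; the function names and toy numbers are ours.

import numpy as np

def c_statistic(scores_pos, scores_neg):
    """C-statistic (area under the ROC curve): probability that a random
    positive case scores above a random negative case (ties count 1/2)."""
    pos = np.asarray(scores_pos)[:, None]
    neg = np.asarray(scores_neg)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

def sens_spec(scores_pos, scores_neg, threshold):
    """Sensitivity and specificity at a given decision threshold."""
    sens = float(np.mean(np.asarray(scores_pos) >= threshold))
    spec = float(np.mean(np.asarray(scores_neg) < threshold))
    return sens, spec

print(c_statistic([0.9, 0.8, 0.7], [0.75, 0.6, 0.2]))  # ~0.889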
Normalization, bias correction, and peak calling for ChIP-seq
Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.
2012-01-01
Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706
Analysis strategies for high-resolution UHF-fMRI data.
Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce
2018-03-01
Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.
Statistical mechanics of neocortical interactions: Path-integral evolution of short-term memory
NASA Astrophysics Data System (ADS)
Ingber, Lester
1994-05-01
Previous papers in this series on the statistical mechanics of neocortical interactions (SMNI) have detailed a development from the relatively microscopic scales of neurons up to the macroscopic scales recorded by electroencephalography (EEG), requiring an intermediate mesocolumnar scale to be developed at the scale of minicolumns (~10^2 neurons) and macrocolumns (~10^5 neurons). Opportunity was taken to view SMNI as sets of statistical constraints, not necessarily describing specific synaptic or neuronal mechanisms, on neuronal interactions and on some aspects of short-term memory (STM), e.g., its capacity, stability, and duration. A recently developed C-language code, PATHINT, provides a non-Monte Carlo technique for calculating the dynamic evolution of arbitrary-dimension (subject to computer resources) nonlinear Lagrangians, such as that derived for the two-variable SMNI problem. Here, PATHINT is used to explicitly detail the evolution of the SMNI constraints on STM.
Artificial neural network study on organ-targeting peptides
NASA Astrophysics Data System (ADS)
Jung, Eunkyoung; Kim, Junhyoung; Choi, Seung-Hoon; Kim, Minkyoung; Rhee, Hokyoung; Shin, Jae-Min; Choi, Kihang; Kang, Sang-Kee; Lee, Nam Kyung; Choi, Yun-Jaie; Jung, Dong Hyun
2010-01-01
We report a new approach to studying the organ targeting of peptides on the basis of peptide sequence information. The positive control data sets consist of organ-targeting peptide sequences identified by the peroral phage-display technique for four organs, and the negative control data are prepared from random sequences. The capacity of our models to make appropriate predictions is validated by statistical indicators including sensitivity, specificity, enrichment curve, and the area under the receiver operating characteristic (ROC) curve (the ROC score). The VHSE descriptor produces statistically significant training models, and models with simple neural network architectures show slightly greater predictive power than those with complex ones. The training and test set statistics indicate that our models can discriminate between organ-targeting and random sequences. We anticipate that our models will be applicable to the selection of organ-targeting peptides for generating peptide drugs or peptidomimetics.
Camera-Model Identification Using Markovian Transition Probability Matrix
NASA Astrophysics Data System (ADS)
Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei
Detecting the brands and models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG-compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of the Y and Cb components of JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by the image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
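A minimal Python sketch of the feature construction this abstract describes, for one direction (horizontal) and one colour component: form the difference array, clip it to [-T, T], and estimate the first-order transition probability matrix whose entries become classifier features. The threshold T = 4 and the helper name are our illustrative choices.

import numpy as np

def markov_features(channel, T=4):
    """Thresholded first-order Markov transition probability matrix of the
    horizontal difference array; its (2T+1)^2 entries serve as features."""
    diff = channel[:, :-1].astype(int) - channel[:, 1:].astype(int)
    diff = np.clip(diff, -T, T)                  # thresholding step
    cur, nxt = diff[:, :-1].ravel() + T, diff[:, 1:].ravel() + T
    counts = np.zeros((2 * T + 1, 2 * T + 1))
    np.add.at(counts, (cur, nxt), 1)             # count pair transitions
    row_sums = counts.sum(axis=1, keepdims=True)
    probs = np.divide(counts, row_sums,
                      out=np.zeros_like(counts), where=row_sums > 0)
    return probs.ravel()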
Using ontology network structure in text mining.
Berndt, Donald J; McCart, James A; Luther, Stephen L
2010-11-13
Statistical text mining treats documents as bags of words, with a focus on term frequencies within documents and across document collections. Unlike natural language processing (NLP) techniques that rely on an engineered vocabulary or a full-featured ontology, statistical approaches do not make use of domain-specific knowledge. The freedom from biases can be an advantage, but at the cost of ignoring potentially valuable knowledge. The approach proposed here investigates a hybrid strategy based on computing graph measures of term importance over an entire ontology and injecting the measures into the statistical text mining process. As a starting point, we adapt existing search engine algorithms such as PageRank and HITS to determine term importance within an ontology graph. The graph-theoretic approach is evaluated using a smoking data set from the i2b2 National Center for Biomedical Computing, cast as a simple binary classification task for categorizing smoking-related documents, demonstrating consistent improvements in accuracy.
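As a sketch of adapting PageRank to an ontology graph, the following Python power iteration computes term-importance scores from an adjacency matrix. The damping factor and the uniform treatment of dangling nodes are standard defaults, not details taken from the paper.

import numpy as np

def pagerank(adj, damping=0.85, tol=1e-9, max_iter=200):
    """Power-iteration PageRank over a term graph given as an adjacency
    matrix with adj[i, j] = 1 for an edge i -> j. Dangling nodes are
    treated as linking uniformly to every node."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True).astype(float)
    M = np.divide(adj, out, out=np.full((n, n), 1.0 / n), where=out > 0)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - damping) / n + damping * (M.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r_new

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [0, 1, 0]])
print(pagerank(A))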
The choice of statistical methods for comparisons of dosimetric data in radiotherapy.
Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques
2014-09-18
Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the safety, and the clinical outcome of treatments. These changes raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method calculates the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the new methods calculate the dose with tissue density corrections in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively, and then non-parametric statistical tests were performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's rank and Kendall's rank tests. Friedman's test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p < 0.001). The density correction methods yielded lower doses than PBC, on average (-5 ± 4.4 SD) for MB and (-4.7 ± 5 SD) for ETAR. Post-hoc Wilcoxon signed-rank tests of paired comparisons indicated that the delivered dose was significantly reduced using density-corrected methods as compared to the reference method. Spearman's and Kendall's rank tests indicated a positive correlation between the doses calculated with the different methods. This paper illustrates and justifies the use of statistical tests and graphical representations for dosimetric comparisons in radiotherapy. The statistical analysis shows the significance of dose differences resulting from two or more techniques in radiotherapy.
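The test battery described above maps directly onto standard library calls. A minimal Python sketch with simulated stand-in doses (the real study used 62 clinical fields; the distributions below are our assumptions for illustration only):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pbc = rng.normal(200.0, 20.0, 62)          # hypothetical monitor units per field
mb = pbc + rng.normal(-5.0, 4.4, 62)       # density-corrected variant (MB)
etar = pbc + rng.normal(-4.7, 5.0, 62)     # density-corrected variant (ETAR)

print(stats.shapiro(pbc))                        # normality
print(stats.levene(pbc, mb, etar))               # homogeneity of variance
print(stats.friedmanchisquare(pbc, mb, etar))    # global paired comparison
print(stats.wilcoxon(pbc, mb), stats.wilcoxon(pbc, etar))   # post-hoc pairs
print(stats.spearmanr(pbc, mb), stats.kendalltau(pbc, mb))  # correlations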
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not differ statistically significantly from the conventional technique.
Mollart, L
2003-11-01
This single-blind randomised controlled trial explored the differential effects of two different foot reflexology techniques and a period of rest on oedema relief and symptom relief in healthy pregnant women with foot oedema. Fifty-five women in the third trimester were randomly assigned to one of three groups: a period of rest, 'relaxing' reflexology techniques, or a specific 'lymphatic' reflexology technique for 15 min, with pre- and post-therapy ankle and foot circumference measurements and a participant questionnaire. There was no statistically significant difference in the circumference measurements between the three groups; however, the mean circumference measurements in the lymphatic technique reflexology group all decreased. A significant reduction in the women's mean symptom measurements in all groups (p<0.0001) was apparent. A 'perceived wellbeing' score revealed that the lymphatic technique group (p<0.0001) increased their wellbeing the most, followed closely by the relaxing techniques group (p<0.001) and then the control rest group (p<0.03). Lymphatic reflexology techniques, relaxing reflexology techniques, and a period of rest each had a non-significant oedema-relieving effect. From the women's viewpoint, lymphatic reflexology was the preferred therapy, with a significant increase in symptom relief.
Query-oriented evidence extraction to support evidence-based medicine practice.
Sarker, Abeed; Mollá, Diego; Paris, Cecile
2016-02-01
Evidence-based medicine practice requires medical practitioners to rely on the best available evidence, in addition to their expertise, when making clinical decisions. The medical domain boasts a large amount of published medical research data, indexed in various medical databases such as MEDLINE. As the size of this data grows, practitioners increasingly face the problem of information overload, and past research has established the time-associated obstacles faced by evidence-based medicine practitioners. In this paper, we focus on the problem of automatic text summarisation to help practitioners quickly find query-focused information from relevant documents. We utilise an annotated corpus that is specialised for the task of evidence-based summarisation of text. In contrast to past summarisation approaches, which mostly rely on surface level features to identify salient pieces of texts that form the summaries, our approach focuses on the use of corpus-based statistics, and domain-specific lexical knowledge for the identification of summary contents. We also apply a target-sentence-specific summarisation technique that reduces the problem of underfitting that persists in generic summarisation models. In automatic evaluations run over a large number of annotated summaries, our extractive summarisation technique statistically outperforms various baseline and benchmark summarisation models with a percentile rank of 96.8%. A manual evaluation shows that our extractive summarisation approach is capable of selecting content with high recall and precision, and may thus be used to generate bottom-line answers to practitioners' queries. Our research shows that the incorporation of specialised data and domain-specific knowledge can significantly improve text summarisation performance in the medical domain. Due to the vast amounts of medical text available, and the high growth of this form of data, we suspect that such summarisation techniques will address the time-related obstacles associated with evidence-based medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
Predicting the binding preference of transcription factors to individual DNA k-mers.
Alleyne, Trevis M; Peña-Castillo, Lourdes; Badis, Gwenael; Talukder, Shaheynoor; Berger, Michael F; Gehrke, Andrew R; Philippakis, Anthony A; Bulyk, Martha L; Morris, Quaid D; Hughes, Timothy R
2009-04-15
Recognition of specific DNA sequences is a central mechanism by which transcription factors (TFs) control gene expression. Many TF-binding preferences, however, are unknown or poorly characterized, in part due to the difficulty associated with determining their specificity experimentally, and in part due to an incomplete understanding of the mechanisms governing sequence specificity. New techniques that estimate the affinity of TFs to all possible k-mers provide a new opportunity to study DNA-protein interaction mechanisms, and may facilitate inference of binding preferences for members of a given TF family when such information is available for other family members. We employed a new dataset consisting of the relative preferences of mouse homeodomains for all eight-base DNA sequences in order to ask how well we can predict the binding profiles of homeodomains when only their protein sequences are given. We evaluated a panel of standard statistical inference techniques, as well as variations of the protein features considered. Nearest neighbour among functionally important residues emerged as one of the most effective methods. Our results underscore the complexity of TF-DNA recognition and suggest a rational approach for future analyses of TF families.
Immediate Feedback Assessment Technique in a Chemistry Classroom
NASA Astrophysics Data System (ADS)
Taylor, Kate R.
The Immediate Feedback Assessment Technique, or IFAT, is a testing system that turns traditional multiple-choice testing into a chance for hands-on learning and provides teachers with an opportunity to obtain more information about a student's knowledge during testing. In the current study we wanted to know whether students given the second chance afforded by the IFAT system are guessing or using prior knowledge when making their second-chance choice. Additionally, while there has been some adoption of this testing system in non-science disciplines, we wanted to study whether the IFAT system would be well received among faculty in the sciences, specifically chemistry faculty. By comparing the students' rate of success on the second chance afforded by the IFAT system with the statistical likelihood of guessing correctly, statistical analysis was used to determine whether enough students earned the second-chance points to reject the hypothesis that students were randomly guessing. Our data analysis revealed that it is statistically highly unlikely that students were only guessing when the IFAT system was utilized. (It is important to note that while we can show that students answer correctly at a much higher rate than random guessing would produce, we can never truly know whether every individual student is reasoning rather than guessing.)
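The comparison against random guessing can be framed as a one-sided binomial test: with four answer options and the first choice eliminated, a pure guesser succeeds on the second try with probability 1/3. A minimal Python sketch with hypothetical counts (the study's actual counts are not reproduced here):

from scipy import stats

n, k = 240, 142            # hypothetical second-chance attempts and successes
p_chance = 1 / 3           # four options, first eliminated -> three remain
result = stats.binomtest(k, n, p_chance, alternative='greater')
print(result.pvalue)       # very small -> reject "students merely guess"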
Design of surface-water data networks for regional information
Moss, Marshall E.; Gilroy, E.J.; Tasker, Gary D.; Karlinger, M.R.
1982-01-01
This report describes a technique, Network Analysis of Regional Information (NARI), and the existing computer procedures that have been developed for the specification of the regional information-cost relation for several statistical parameters of streamflow. The measure of information used is the true standard error of estimate of a regional logarithmic regression. The cost is a function of the number of stations at which hydrologic data are collected and the number of years for which the data are collected. The technique can be used to obtain either (1) a minimum cost network that will attain a prespecified accuracy and reliability or (2) a network that maximizes information given a set of budgetary and time constraints.
Specification of the ISS Plasma Environment Variability
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Neergaard, Linda F.; Bui, Them H.; Mikatarian, Ronald R.; Barsamian, H.; Koontz, Steven L.
2002-01-01
Quantifying the spacecraft charging risks and corresponding hazards for the International Space Station (ISS) requires a plasma environment specification describing the natural variability of ionospheric temperature (Te) and density (Ne). Empirical ionospheric specification and forecast models such as the International Reference Ionosphere (IRI) model typically provide only estimates of long-term (seasonal) mean Te and Ne values for the low Earth orbit environment. Knowledge of the Te and Ne variability, as well as the likelihood of extreme deviations from the mean values, is required to estimate both the magnitude and frequency of occurrence of potentially hazardous spacecraft charging environments for a given ISS construction stage and flight configuration. This paper describes the statistical analysis of historical low Earth orbit ionospheric plasma measurements used to estimate Ne and Te variability in the ISS flight environment. The statistical variability analysis of Ne and Te enables calculation of the expected frequency of occurrence of any particular values of Ne and Te, especially those that correspond to possibly hazardous spacecraft charging environments. The database used in the original analysis included measurements from the AE-C, AE-D, and DE-2 satellites. Recent work has added additional satellites to the database, as well as ground-based incoherent scatter radar observations. Deviations of the data values from the IRI-estimated Ne and Te parameters for each data point provide a statistical basis for modeling the deviations of the plasma environment from the IRI model output. This technique, while developed specifically for the Space Station analysis, can also be generalized to provide ionospheric plasma environment risk specification models for low Earth orbit over an altitude range of 200 km through approximately 1000 km.
Bayesian methods in reliability
NASA Astrophysics Data System (ADS)
Sander, P.; Badoux, R.
1991-11-01
The present proceedings of a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G
2007-08-17
This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.
Applications of optical coherence tomography in the non-contact assessment of automotive paints
NASA Astrophysics Data System (ADS)
Lawman, Samuel; Zhang, Jinke; Williams, Bryan M.; Zheng, Yalin; Shen, Yao-Chun
2017-06-01
The multiple-layer paint systems on modern cars serve two end purposes: they protect against corrosion and they give the desired visual appearance. To ensure consistent corrosion protection and appearance, suitable Quality Assurance (QA) measures on the final product are required. Various parameters (layer thickness and consistency, layer composition, flake statistics, surface profile, and layer dryness) are of importance, each with specific techniques that can measure one or some of them, but no technique can measure all or most of them. Optical Coherence Tomography (OCT) is a 3D imaging technique with micrometre resolution. Since 2016, OCT measurements of layer thickness and consistency, layer composition fingerprints, and flake statistics have been reported. In this paper we demonstrate two more novel applications of OCT to automotive paints. Firstly, we use OCT to quantify unwanted surface texture, which leads to an "orange peel" visual defect. This was done by measuring the surface profiles of automotive paints, with an unoptimised precision of 37 nm over a lateral range of 7 mm, to quantify texture of less than 500 nm. Secondly, we demonstrate that OCT can measure how dry a coating layer is by measuring how fast it is still shrinking quasi-instantaneously, using Fourier phase sensitivity.
Lo Presti, Rossella; Barca, Emanuele; Passarella, Giuseppe
2010-01-01
Environmental time series are often affected by the "presence" of missing data, but when dealing statistically with data, the need to fill in the gaps by estimating the missing values must be considered. At present, a large number of statistical techniques are available to achieve this objective; they range from very simple methods, such as using the sample mean, to very sophisticated ones, such as multiple imputation. A brand new methodology for missing data estimation is proposed, which tries to merge the obvious advantages of the simplest techniques (e.g. their ease of implementation) with the strength of the newest techniques. The proposed method consists of two consecutive stages: once it has been ascertained that a specific monitoring station is affected by missing data, the "most similar" monitoring stations are identified among neighbouring stations on the basis of a suitable similarity coefficient; in the second stage, a regressive method is applied in order to estimate the missing data. In this paper, four different regressive methods are applied and compared, in order to determine which is the most reliable for filling in the gaps, using rainfall data series measured in the Candelaro River Basin located in southern Italy.
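A minimal sketch of the two-stage idea, assuming Pearson correlation as the similarity coefficient and simple linear regression as the regressive method (the paper compares four regressive methods, none of which is necessarily this one):

    import numpy as np
    import pandas as pd

    # rain: DataFrame of rainfall series, one column per monitoring station;
    # the target column contains the NaN gaps to be filled.
    def fill_gaps(rain: pd.DataFrame, target: str) -> pd.Series:
        sims = rain.drop(columns=[target]).corrwith(rain[target])  # stage 1: similarity
        donor = sims.idxmax()                                      # "most similar" station
        pair = rain[[donor, target]].dropna()
        slope, intercept = np.polyfit(pair[donor], pair[target], 1)  # stage 2: regression
        return rain[target].fillna(intercept + slope * rain[donor])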
Bertoldo, Carlos; Lima, Debora; Fragoso, Larissa; Ambrosano, Glaucia; Aguiar, Flavio; Lovadino, Jose
2014-01-01
The enamel microabrasion technique consists of selectively abrading discolored areas or causing superficial structural changes. In the microabrasion technique, abrasive products associated with acids are used, and evaluation of enamel roughness after this treatment, as well as surface polishing, is necessary. This in-vitro study evaluated enamel roughness after microabrasion, followed by different polishing techniques. Roughness analyses were performed before microabrasion (L1), after microabrasion (L2), and after polishing (L3). Sixty bovine incisor teeth were selected and divided into two groups (n=30): G1 - 37% phosphoric acid (Dentsply) and pumice; G2 - 6.6% hydrochloric acid associated with silicon carbide (Opalustre - Ultradent). Thereafter, the groups were divided into three sub-groups (n=10), according to the polishing system: A - fine and superfine grit aluminum oxide discs (Sof-Lex - 3M); B - diamond paste (FGM) associated with felt discs (FGM); C - silicone tips (Enhance - Dentsply). A PROC MIXED procedure was applied after exploratory data analysis, as well as the Tukey-Kramer test (5%). No statistical differences were found between the G1 and G2 groups. L2 differed statistically from L1, showing greater roughness. Post-polishing roughness differed for specific groups (1A, 2B, and 1C), which showed less roughness at L3 and differed statistically from L2 according to the polishing system. All products increased enamel roughness, and the effectiveness of the polishing systems depended on the abrasive used.
Investigation of Error Patterns in Geographical Databases
NASA Technical Reports Server (NTRS)
Dryer, David; Jacobs, Derya A.; Karayaz, Gamze; Gronbech, Chris; Jones, Denise R. (Technical Monitor)
2002-01-01
The objective of the research conducted in this project is to develop a methodology to investigate the accuracy of Airport Safety Modeling Data (ASMD) using statistical, visualization, and Artificial Neural Network (ANN) techniques. Such a methodology can contribute to answering the following research questions: Over a representative sampling of ASMD databases, can statistical error analysis techniques be accurately learned and replicated by ANN modeling techniques? This representative ASMD sample should include numerous airports and a variety of terrain characterizations. Is it possible to identify and automate the recognition of patterns of error related to geographical features? Do such patterns of error relate to specific geographical features, such as elevation or terrain slope? Is it possible to combine the errors in small regions into an error prediction for a larger region? What are the data density reduction implications of this work? ASMD may be used as the source of terrain data for a synthetic visual system to be used in the cockpit of aircraft when visual reference to ground features is not possible during conditions of marginal weather or reduced visibility. In this research, United States Geological Survey (USGS) digital elevation model (DEM) data has been selected as the benchmark. Artificial Neural Networks (ANNs) have been used and tested as alternate methods in place of the statistical methods in similar problems. They often perform better in pattern recognition, prediction, classification, and categorization problems. Many studies show that when the data is complex and noisy, the accuracy of ANN models is generally higher than that of comparable traditional methods.
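As a toy illustration of the idea of replacing a statistical error model with an ANN (the terrain features, the synthetic error, and the linear baseline are all invented, not the ASMD benchmark):

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    slope = rng.uniform(0, 30, 500)                     # terrain slope, degrees
    elev = rng.uniform(0, 3000, 500)                    # elevation, metres
    error = 0.5 * slope + 0.002 * elev + rng.normal(0, 1, 500)  # synthetic DEM error
    X = np.column_stack([slope, elev])
    Xtr, Xte, ytr, yte = train_test_split(X, error, random_state=0)

    linear = LinearRegression().fit(Xtr, ytr)
    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000,
                                     random_state=0)).fit(Xtr, ytr)
    print("linear R^2:", linear.score(Xte, yte), "| ANN R^2:", ann.score(Xte, yte))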
Computer Training for Entrepreneurial Meteorologists.
NASA Astrophysics Data System (ADS)
Koval, Joseph P.; Young, George S.
2001-05-01
Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques, has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. The successes, failures, and lessons learned from this experiment are discussed and conclusions drawn concerning undergraduate instructional methods for computer applications in meteorology.
Shinozaki, Kazuma; Zack, Jason W.; Pylypenko, Svitlana; ...
2015-09-17
Platinum electrocatalysts supported on high surface area and Vulcan carbon blacks (Pt/HSC, Pt/V) were characterized in rotating disk electrode (RDE) setups for electrochemical area (ECA) and oxygen reduction reaction (ORR) area specific activity (SA) and mass specific activity (MA) at 0.9 V. Films fabricated using several ink formulations and film-drying techniques were characterized for a statistically significant number of independent samples. The highest quality Pt/HSC films exhibited MA 870 ± 91 mA/mgPt and SA 864 ± 56 μA/cm2Pt, while Pt/V had MA 706 ± 42 mA/mgPt and SA 1120 ± 70 μA/cm2Pt, when measured in 0.1 M HClO4, 20 mV/s, 100 kPa O2 and 23 ± 2 °C. An enhancement factor of 2.8 in the measured SA was observable on eliminating Nafion ionomer and employing extremely thin, uniform films (~4.5 μg/cm2Pt) of Pt/HSC. The ECAs for Pt/HSC (99 ± 7 m2/gPt) and Pt/V (65 ± 5 m2/gPt) were statistically invariant and insensitive to film uniformity/thickness/fabrication technique; accordingly, enhancements in MA are wholly attributable to increases in SA. Impedance measurements coupled with scanning electron microscopy were used to de-convolute the losses within the catalyst layer, ascribed to the catalyst layer resistance, oxygen diffusion, and sulfonate anion adsorption/blocking. The ramifications of these results for proton exchange membrane fuel cells have also been examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shinozaki, Kazuma; Zack, Jason W.; Pylypenko, Svitlana
Platinum electrocatalysts supported on high surface area and Vulcan carbon blacks (Pt/HSC, Pt/V) were characterized in rotating disk electrode (RDE) setups for electrochemical area (ECA) and oxygen reduction reaction (ORR) area specific activity (SA) and mass specific activity (MA) at 0.9 V. Films fabricated using several ink formulations and film-drying techniques were characterized for a statistically significant number of independent samples. The highest quality Pt/HSC films exhibited MA 870 ± 91 mA/mgPt and SA 864 ± 56 μA/cm2Pt, while Pt/V had MA 706 ± 42 mA/mgPt and SA 1120 ± 70 μA/cm2Pt, when measured in 0.1 M HClO4, 20 mV/s, 100 kPa O2 and 23 ± 2 °C. An enhancement factor of 2.8 in the measured SA was observable on eliminating Nafion ionomer and employing extremely thin, uniform films (~4.5 μg/cm2Pt) of Pt/HSC. The ECAs for Pt/HSC (99 ± 7 m2/gPt) and Pt/V (65 ± 5 m2/gPt) were statistically invariant and insensitive to film uniformity/thickness/fabrication technique; accordingly, enhancements in MA are wholly attributable to increases in SA. Impedance measurements coupled with scanning electron microscopy were used to de-convolute the losses within the catalyst layer, ascribed to the catalyst layer resistance, oxygen diffusion, and sulfonate anion adsorption/blocking. The ramifications of these results for proton exchange membrane fuel cells have also been examined.
Kim, Hyeyoung; Lee, Youngsun; Shin, Insik; Kim, Kitae; Moon, Jeheon
2014-01-01
[Purpose] For maximum efficiency and to prevent injury during javelin throwing, it is critical to maintain muscle balance and coordination of the rotator cuff and the glenohumeral joint. In this study, we investigated the change in the rotator cuff muscle strength, throw distance and technique of javelin throwers after they had performed a specific physical training that combined elements of weight training, function movement screen training, and core training. [Subjects] Ten javelin throwers participated in this study: six university athletes in the experimental group and four national-level athletes in the control group. [Methods] The experimental group performed 8 weeks of the specific physical training. To evaluate the effects of the training, measurements were performed before and after the training for the experimental group. Measurements comprised anthropometry, isokinetic muscle strength measurements, the function movement screen test, and movement analysis. [Results] After the specific physical training, the function movement screen score and external and internal rotator muscle strength showed statistically significant increases. Among kinematic factors, only pull distance showed improvement after training. [Conclusion] Eight weeks of specific physical training for dynamic stabilizer muscles enhanced the rotator cuff muscle strength, core stability, throw distance, and flexibility of javelin throwers. These results suggest that specific physical training can be useful for preventing shoulder injuries and improving the performance for javelin throwers. PMID:25364111
Kim, Hyeyoung; Lee, Youngsun; Shin, Insik; Kim, Kitae; Moon, Jeheon
2014-10-01
[Purpose] For maximum efficiency and to prevent injury during javelin throwing, it is critical to maintain muscle balance and coordination of the rotator cuff and the glenohumeral joint. In this study, we investigated the change in the rotator cuff muscle strength, throw distance and technique of javelin throwers after they had performed a specific physical training that combined elements of weight training, function movement screen training, and core training. [Subjects] Ten javelin throwers participated in this study: six university athletes in the experimental group and four national-level athletes in the control group. [Methods] The experimental group performed 8 weeks of the specific physical training. To evaluate the effects of the training, measurements were performed before and after the training for the experimental group. Measurements comprised anthropometry, isokinetic muscle strength measurements, the function movement screen test, and movement analysis. [Results] After the specific physical training, the function movement screen score and external and internal rotator muscle strength showed statistically significant increases. Among kinematic factors, only pull distance showed improvement after training. [Conclusion] Eight weeks of specific physical training for dynamic stabilizer muscles enhanced the rotator cuff muscle strength, core stability, throw distance, and flexibility of javelin throwers. These results suggest that specific physical training can be useful for preventing shoulder injuries and improving the performance for javelin throwers.
Degeling, Koen; Schivo, Stefano; Mehra, Niven; Koffijberg, Hendrik; Langerak, Rom; de Bono, Johann S; IJzerman, Maarten J
2017-12-01
With the advent of personalized medicine, the field of health economic modeling is being challenged, and the use of patient-level dynamic modeling techniques might be required. The aim of this study was to illustrate the usability of two such techniques, timed automata (TA) and discrete event simulation (DES), for modeling personalized treatment decisions. An early health technology assessment on the use of circulating tumor cells, compared with prostate-specific antigen and bone scintigraphy, to inform treatment decisions in metastatic castration-resistant prostate cancer was performed. Both modeling techniques were assessed quantitatively, in terms of intermediate outcomes (e.g., overtreatment) and health economic outcomes (e.g., net monetary benefit). Qualitatively, model structure, agent interactions, data management (i.e., importing and exporting data), and model transparency, among other aspects, were assessed. Both models yielded realistic and similar intermediate and health economic outcomes. Overtreatment was reduced by 6.99 and 7.02 weeks by applying circulating tumor cells as a response marker, at a net monetary benefit of -€1033 and -€1104 for the TA model and the DES model, respectively. Software-specific differences were observed regarding data management features and the support for statistical distributions, which were considered better for the DES software. Regarding method-specific differences, interactions were modeled more straightforwardly using TA, benefiting from its compositional model structure. Both techniques prove suitable for modeling personalized treatment decisions, although DES would be preferred given the current software-specific limitations of TA. When these limitations are resolved, TA would be an interesting modeling alternative if interactions are key or its compositional structure is useful to manage multi-agent complex problems. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing
NASA Astrophysics Data System (ADS)
Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.
Field testing of aboveground storage tanks (ASTs) to monitor corrosion of the bottom plate is presented in this chapter. AE testing data from ten ASTs with different sizes, materials, and products were employed to monitor bottom plate condition. AE sensors of 30 and 150 kHz were used to monitor corrosion activity on up to 24 channels, including guard sensors. Acoustic emission (AE) parameters were analyzed to explore the AE parameter patterns of occurring corrosion compared to the laboratory results. Amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activities related to the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activities. Finally, a basic statistical grading technique was used to evaluate the bottom plate condition of the AST.
NASA Astrophysics Data System (ADS)
Luo, Shuwen; Chen, Changshui; Mao, Hua; Jin, Shaoqin
2013-06-01
The feasibility of early detection of gastric cancer using near-infrared (NIR) Raman spectroscopy (RS) by distinguishing premalignant lesions (adenomatous polyp, n=27) and cancer tissues (adenocarcinoma, n=33) from normal gastric tissues (n=45) is evaluated. Significant differences in Raman spectra are observed among the normal, adenomatous polyp, and adenocarcinoma gastric tissues at 936, 1003, 1032, 1174, 1208, 1323, 1335, 1450, and 1655 cm-1. Diverse statistical methods are employed to develop effective diagnostic algorithms for classifying the Raman spectra of different types of ex vivo gastric tissues, including principal component analysis (PCA), linear discriminant analysis (LDA), and naive Bayesian classifier (NBC) techniques. Compared with PCA-LDA algorithms, PCA-NBC techniques together with the leave-one-out cross-validation method provide better discriminative results for normal, adenomatous polyp, and adenocarcinoma gastric tissues, yielding superior sensitivities of 96.3%, 96.9%, and 96.9%, and specificities of 93%, 100%, and 95.2%, respectively. Therefore, NIR RS associated with multivariate statistical algorithms has the potential for early diagnosis of gastric premalignant lesions and cancer tissues at the molecular level.
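The two diagnostic pipelines compared above map directly onto standard components; a hedged sketch with placeholder spectra (not the authors' Raman data, and the PCA component count is an assumption):

    import numpy as np
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(2)
    spectra = rng.normal(size=(105, 900))    # 105 tissue spectra, 900 Raman channels
    labels = rng.integers(0, 3, size=105)    # normal / polyp / adenocarcinoma

    for clf in (LinearDiscriminantAnalysis(), GaussianNB()):
        pipe = make_pipeline(PCA(n_components=10), clf)   # PCA-LDA and PCA-NBC
        acc = cross_val_score(pipe, spectra, labels, cv=LeaveOneOut()).mean()
        print(type(clf).__name__, f"LOOCV accuracy: {acc:.2f}")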
Transversus abdominis plane block in renal allotransplant recipients: A retrospective chart review.
Gopwani, S R; Rosenblatt, M A
2016-01-01
The efficacy of the transversus abdominis plane (TAP) block appears to vary considerably, depending on the surgical procedure and block technique. This study aims to add to the existing literature and provide a clearer understanding of the TAP block's role as a postoperative analgesic technique, specifically in renal allotransplant recipients. A retrospective chart review was conducted by querying the intraoperative electronic medical record system of a 1200-bed tertiary academic hospital over a 5-month period, and reviewing anesthetic techniques as well as postoperative morphine equivalent consumption. Fifty renal allotransplant recipients were identified, 13 of whom received TAP blocks while 37 received no regional analgesic technique. All blocks were performed under ultrasound guidance, with 20 mL of 0.25% bupivacaine injected in the transversus abdominis fascial plane under direct visualization. The primary outcome was postoperative morphine equivalent consumption. Morphine consumption was compared with the two-tailed Mann-Whitney U-test. Continuous variables of patient baseline characteristics were analyzed with the unpaired t-test and categorical variables with the Fisher exact test. P < 0.05 was considered statistically significant. A statistically significant decrease in cumulative morphine consumption was found in the group that received the TAP block at 6 h (2.46 mg vs. 7.27 mg, P = 0.0010), 12 h (3.88 mg vs. 10.20 mg, P = 0.0005), 24 h (6.96 mg vs. 14.75 mg, P = 0.0013), and 48 h (11 mg vs. 20.13 mg, P = 0.0092). The TAP block is a beneficial postoperative analgesic, opiate-sparing technique in renal allotransplant recipients.
NASA Astrophysics Data System (ADS)
Sengupta, A.; Kletzing, C.; Howk, R.; Kurth, W. S.
2017-12-01
An important goal of the Van Allen Probes mission is to understand wave particle interactions that can energize relativistic electrons in the Earth's Van Allen radiation belts. The EMFISIS instrumentation suite provides measurements of the wave electric and magnetic fields of features such as chorus that participate in these interactions. Geometric signal processing discovers structural relationships, e.g. connectivity across ridge-like features in chorus elements, to reveal properties such as the dominant angles of an element (frequency sweep rate) and the integrated power along a given chorus element. These techniques disambiguate wave features against background hiss-like chorus. This enables autonomous discovery of chorus elements across the large volumes of EMFISIS data. At the scale of individual or overlapping chorus elements, topological pattern recognition techniques enable interpretation of chorus microstructure by discovering connectivity and other geometric features within the wave signature of a single chorus element or between overlapping chorus elements. Chorus wave features can thus be quantified and studied at multiple scales of spectral geometry using geometric signal processing techniques. We present recently developed computational techniques that exploit the spectral geometry of chorus elements and whistlers to enable large-scale automated discovery, detection and statistical analysis of these events over EMFISIS data. Specifically, we present case studies across a diverse portfolio of chorus elements and discuss the performance of our algorithms regarding precision of detection as well as interpretation of chorus microstructure. We also provide large-scale statistical analysis of the distribution of dominant sweep rates and other properties of the detected chorus elements.
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
ERIC Educational Resources Information Center
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
Planetary mass function and planetary systems
NASA Astrophysics Data System (ADS)
Dominik, M.
2011-02-01
With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function; rather, a proper formalism needs to take care of the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be accounted for by describing planet populations by means of a differential planetary mass-radius-orbit function, which together with the fraction of stars with given properties that are orbited by planets and the stellar mass function allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns, which all have their very specific selection procedures and detection efficiencies. Moreover, recent results both from gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment, however, comes with the challenge of excluding the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined, fully deterministic target selection, monitoring and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.
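One illustrative way to write the formalism down (the notation is ours, not the paper's): the expected planet yield of a campaign follows from folding the stellar mass function, the planet-hosting fraction, and the differential planetary mass-radius-orbit function with the campaign's detection efficiency,

    N_{\mathrm{exp}} = \int \Phi_\ast(m_\ast)\, F(m_\ast)\, \psi(M_p, R_p, a \mid m_\ast)\, \varepsilon(M_p, R_p, a)\, \mathrm{d}M_p\, \mathrm{d}R_p\, \mathrm{d}a\, \mathrm{d}m_\ast

where \Phi_\ast is the stellar mass function, F(m_\ast) the fraction of stars of mass m_\ast hosting planets, \psi the differential planetary mass-radius-orbit function, and \varepsilon the technique-specific detection efficiency.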
Real-time forecasts of tomorrow's earthquakes in California: a new mapping tool
Gerstenberger, Matt; Wiemer, Stefan; Jones, Lucy
2004-01-01
We have derived a multi-model approach to calculate time-dependent earthquake hazard resulting from earthquake clustering. This open-file report explains the theoretical background behind the approach, the specific details that are used in applying the method to California, and the statistical testing used to validate the technique. We have implemented our algorithm as a real-time tool that has automatically generated short-term hazard maps for California since May 2002, at http://step.wr.usgs.gov
A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume
NASA Astrophysics Data System (ADS)
Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration
2017-11-01
An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.
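The virtual interrogation window amounts to block-averaging the measured field at a sequence of filter widths and tracking how the statistics change with window size. A bare-bones numpy sketch on a synthetic field (the 0.2-0.3 D* thresholds come from the experiment, not from this toy):

    import numpy as np

    rng = np.random.default_rng(3)
    field = rng.normal(size=(1024, 1024))    # stand-in for a high-resolution PLIF field

    def box_filter(f, L):
        # average over non-overlapping L x L interrogation windows
        n = f.shape[0] // L
        return f[:n * L, :n * L].reshape(n, L, n, L).mean(axis=(1, 3))

    for L in (4, 16, 64, 256):               # window size in pixels
        g = box_filter(field, L)
        print(f"L={L:4d}  resolved mean={g.mean():+.4f}  resolved var={g.var():.4f}")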
Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions
ERIC Educational Resources Information Center
Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.
2006-01-01
In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…
Change Detection in Rough Time Series
2014-09-01
Rough time series exhibit a distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem, the proposed method applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the…
Enhancing Students' Ability to Use Statistical Reasoning with Everyday Problems
ERIC Educational Resources Information Center
Lawson, Timothy J.; Schwiers, Michael; Doellman, Maureen; Grady, Greg; Kelnhofer, Robert
2003-01-01
We discuss a technique for teaching students everyday applications of statistical concepts. We used this technique with students (n = 50) enrolled in several sections of an introductory statistics course; students (n = 45) in other sections served as a comparison group. A class of introductory psychology students (n = 24) served as a second…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, W.T.; Siebers, J.V.
Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm's ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing significant variations in OAR doses including mean dose reductions >5 Gy. Clinical implementation will facilitate patient-specific decision making based on achievable dosimetry as opposed to accept/reject models based on population derived objectives.
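A minimal sketch of the bookkeeping behind "M(N+1) alternative treatment plans per patient": one OAR-emphasizing weight vector per OAR plus a balanced one, for each delivery technique (the weight values and the optimizer hand-off are placeholders, not the authors' implementation):

    N_OARS = 4                      # each lung, heart, esophagus
    TECHNIQUES = ["4-field", "9-field IMRT", "27-field IMRT", "non-coplanar arc"]

    plans = []
    for tech in TECHNIQUES:
        for emphasized in range(N_OARS + 1):      # N OAR-sparing variants + 1 balanced
            if emphasized < N_OARS:
                weights = [10.0 if k == emphasized else 1.0 for k in range(N_OARS)]
            else:
                weights = [1.0] * N_OARS          # balanced trade-off plan
            plans.append((tech, weights))         # each tuple goes to the plan optimizer
    print(len(plans), "candidate plans")          # 4 * (4 + 1) = 20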
Statistics, Adjusted Statistics, and Maladjusted Statistics.
Kaufman, Jay S
2017-05-01
Statistical adjustment is a ubiquitous practice in all quantitative fields that is meant to correct for improprieties or limitations in observed data, to remove the influence of nuisance variables or to turn observed correlations into causal inferences. These adjustments proceed by reporting not what was observed in the real world, but instead modeling what would have been observed in an imaginary world in which specific nuisances and improprieties are absent. These techniques are powerful and useful inferential tools, but their application can be hazardous or deleterious if consumers of the adjusted results mistake the imaginary world of models for the real world of data. Adjustments require decisions about which factors are of primary interest and which are imagined away, and yet many adjusted results are presented without any explanation or justification for these decisions. Adjustments can be harmful if poorly motivated, and are frequently misinterpreted in the media's reporting of scientific studies. Adjustment procedures have become so routinized that many scientists and readers lose the habit of relating the reported findings back to the real world in which we live.
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil
2014-08-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil
2015-01-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922
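Setting the survival specifics aside, the "combined" cross-validation design can be sketched with any estimator: held-out predictions are pooled across the folds and a single statistic is computed on the combined test sample (logistic regression and accuracy below are stand-ins for the peeling model and log-rank statistic):

    import numpy as np
    from sklearn.model_selection import KFold
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 8))
    y = (X[:, 0] + rng.normal(0, 1, 200) > 0).astype(int)

    pooled = np.empty(len(y))
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model = LogisticRegression().fit(X[train], y[train])
        pooled[test] = model.predict_proba(X[test])[:, 1]   # pool held-out predictions

    # one statistic over the combined test samples, not one per fold
    print("combined-CV accuracy:", ((pooled > 0.5) == y).mean())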
Johnston, Douglas R; Gaslin, Michael; Boon, Maurits; Pribitkin, Edmund; Rosen, David
2010-07-01
This study was performed to determine whether teens have different rates of posttonsillectomy hemorrhage, admission for dehydration, or recurrent tonsillitis compared to adults. Specifically, these parameters were compared within two groups: patients who underwent powered intracapsular tonsillectomy (PIT) and those who underwent monopolar electrocautery tonsillectomy (MET). In a retrospective review of 579 patients at least 12 years of age from January 2000 to July 2006 in a tertiary referral center, outcome measures of reoperation for hemorrhage, readmission or emergency room visit for dehydration, and postoperative tonsillitis were compared for 200 patients 12 to 19 years of age and 379 patients more than 19 years of age. These outcome measures in teens were compared to those in adults who had tonsillectomy by the same technique (101 teens who underwent PIT compared to 117 adults who underwent PIT, and 99 teens who underwent MET compared to 262 adults who underwent MET). Outcome measures were also compared within the PIT and MET groups based on the indication for surgery (chronic tonsillitis, tonsillar hypertrophy, or both). In comparing teens to adults who underwent the same technique (PIT versus PIT, or MET versus MET), no statistically significant differences existed in the incidence of hemorrhage, dehydration, or postoperative tonsillitis. Greater hemorrhage rates for adults who underwent MET compared to teens, however, almost met statistical significance (p = 0.053). Analyzing complication rates by indication within the PIT and MET groups exclusively revealed higher rates of hemorrhage in adults who underwent the MET technique for the indication of chronic tonsillitis. Within the PIT comparison, no significant differences were found on the basis of indication for surgery. Teenage patients who undergo tonsillectomy should be considered unique as far as complication rates are concerned. Comparison of technique-specific complication rates between adults and teens showed no significant differences in either the PIT or MET groups, although adults who underwent MET had greater hemorrhage rates that almost met significance (p = 0.053). Adults who were undergoing tonsillectomy for chronic tonsillitis were more likely than teens to encounter postoperative hemorrhage if they underwent the MET technique.
Pattern recognition of satellite cloud imagery for improved weather prediction
NASA Technical Reports Server (NTRS)
Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.
1986-01-01
The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
Hybrid Disease Diagnosis Using Multiobjective Optimization with Evolutionary Parameter Optimization
Nalluri, MadhuSudana Rao; K., Kannan; M., Manisha
2017-01-01
With the widespread adoption of e-Healthcare and telemedicine applications, accurate, intelligent disease diagnosis systems have been profoundly coveted. In recent years, numerous individual machine learning-based classifiers have been proposed and tested, and the fact that a single classifier cannot effectively classify and diagnose all diseases is now widely accepted. This has prompted a number of recent research attempts to arrive at a consensus using ensemble classification techniques. In this paper, a hybrid system is proposed to diagnose ailments by optimizing individual classifier parameters for two classifier techniques, namely, the support vector machine (SVM) and the multilayer perceptron (MLP). We employ three recent evolutionary algorithms to optimize the parameters of the classifiers above, leading to six alternative hybrid disease diagnosis systems, also referred to as hybrid intelligent systems (HISs). Multiple objectives, namely, prediction accuracy, sensitivity, and specificity, have been considered to assess the efficacy of the proposed hybrid systems against existing ones. The proposed model is evaluated on 11 benchmark datasets, and the obtained results demonstrate that our proposed hybrid diagnosis systems perform better in terms of disease prediction accuracy, sensitivity, and specificity. Pertinent statistical tests were carried out to substantiate the efficacy of the obtained results. PMID:29065626
High-spatial-resolution passive microwave sounding systems
NASA Technical Reports Server (NTRS)
Staelin, D. H.; Rosenkranz, P. W.
1994-01-01
The principal contributions of this combined theoretical and experimental effort were to advance and demonstrate new and more accurate techniques for sounding atmospheric temperature, humidity, and precipitation profiles at millimeter wavelengths, and to improve the scientific basis for such soundings. Some of these techniques are being incorporated in both research and operational systems. Specific results include: (1) development of the MIT Microwave Temperature Sounder (MTS), a 118-GHz eight-channel imaging spectrometer plus a switched-frequency spectrometer near 53 GHz, for use on the NASA ER-2 high-altitude aircraft, (2) conduct of ER-2 MTS missions in multiple seasons and locations in combination with other instruments, mapping with unprecedented approximately 2-km lateral resolution atmospheric temperature and precipitation profiles, atmospheric transmittances (at both zenith and nadir), frontal systems, and hurricanes, (3) ground based 118-GHz 3-D spectral images of wavelike structure within clouds passing overhead, (4) development and analysis of approaches to ground- and space-based 5-mm wavelength sounding of the upper stratosphere and mesosphere, which supported the planning of improvements to operational weather satellites, (5) development of improved multidimensional and adaptive retrieval methods for atmospheric temperature and humidity profiles, (6) development of combined nonlinear and statistical retrieval techniques for 183-GHz humidity profile retrievals, (7) development of nonlinear statistical retrieval techniques for precipitation cell-top altitudes, and (8) numerical analyses of the impact of remote sensing data on the accuracy of numerical weather predictions; a 68-km gridded model was used to study the spectral properties of error growth.
Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A
2017-12-01
In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations, extending the scope of 3D vascular modeling to a large portion of cerebral arterial trees. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between parametric meshes and raw vascular images using a receiver operating characteristic curve. Geometric accuracy evaluation showed agreement between the constructed mesh and raw MRA data sets, with an area under the curve value of 0.87. Parametric meshing yielded, on average, 36.6% and 21.7% improvements in orthogonal and equiangular skew quality over unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?
Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie
2012-01-01
A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regards to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
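For concreteness, a minimal example of what checking assumptions before a t-procedure can look like (scipy-based; note the article observes that checking by statistical test, as the participants mostly did, is itself debatable compared with graphical checks):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    a, b = rng.normal(0.0, 1.0, 40), rng.normal(0.5, 1.0, 40)

    norm_p = min(stats.shapiro(a).pvalue, stats.shapiro(b).pvalue)  # normality
    var_p = stats.levene(a, b).pvalue                               # equal variances

    if norm_p < 0.05:
        print("non-normal -> Mann-Whitney p:", stats.mannwhitneyu(a, b).pvalue)
    else:
        print("t-test p:", stats.ttest_ind(a, b, equal_var=var_p >= 0.05).pvalue)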
Nonparametric Methods in Astronomy: Think, Regress, Observe—Pick Any Three
NASA Astrophysics Data System (ADS)
Steinhardt, Charles L.; Jermyn, Adam S.
2018-02-01
Telescopes are much more expensive than astronomers, so it is essential to minimize required sample sizes by using the most data-efficient statistical methods possible. However, the most commonly used model-independent techniques for finding the relationship between two variables in astronomy are flawed. In the worst case they can lead without warning to subtly yet catastrophically wrong results, and even in the best case they require more data than necessary. Unfortunately, there is no single best technique for nonparametric regression. Instead, we provide a guide for how astronomers can choose the best method for their specific problem and provide a python library with both wrappers for the most useful existing algorithms and implementations of two new algorithms developed here.
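The paper ships its own python library; as a generic stand-in for model-independent regression, the sketch below uses LOWESS from statsmodels on synthetic data:

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(6)
    x = np.sort(rng.uniform(0, 10, 200))
    y = np.sin(x) + rng.normal(0, 0.3, 200)

    fit = lowess(y, x, frac=0.3)   # returns columns (sorted x, smoothed y)
    print(fit[:5])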
Development of Supersonic Combustion Experiments for CFD Modeling
NASA Technical Reports Server (NTRS)
Baurle, Robert; Bivolaru, Daniel; Tedder, Sarah; Danehy, Paul M.; Cutler, Andrew D.; Magnotti, Gaetano
2007-01-01
This paper describes the development of an experiment to acquire data for developing and validating computational fluid dynamics (CFD) models for turbulence in supersonic combusting flows. The intent is that the flow field would be simple yet relevant to flows within hypersonic air-breathing engine combustors undergoing testing in vitiated-air ground-testing facilities. Specifically, it describes development of laboratory-scale hardware to produce a supersonic combusting coaxial jet, discusses design calculations, operability and types of flames observed. These flames are studied using the dual-pump coherent anti-Stokes Raman spectroscopy (CARS) - interferometric Rayleigh scattering (IRS) technique. This technique simultaneously and instantaneously measures temperature, composition, and velocity in the flow, from which many of the important turbulence statistics can be found. Some preliminary CARS data are presented.
The magnifying glass - A feature space local expansion for visual analysis. [and image enhancement
NASA Technical Reports Server (NTRS)
Juday, R. D.
1981-01-01
The Magnifying Glass Transformation (MGT) technique is proposed as a multichannel spectral operation yielding visual imagery that is enhanced in a specified spectral vicinity, guided by the statistics of training samples. For example, the discrimination among spectral neighbors within an interactive display may be increased without altering the appearance of spectrally distant objects or the overall interpretation. A direct histogram specification technique is applied to the channels within the multispectral image so that a subset of the spectral domain occupies an increased fraction of the domain. The transformation is carried out by obtaining the training information, establishing the condition of the covariance matrix, determining the influenced solid, and initializing the lookup table. Finally, the image is transformed.
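A single-channel sketch of direct histogram specification (the uniform target distribution below is an arbitrary stand-in for the MGT's locally expanded target):

    import numpy as np

    def match_histogram(channel, target_samples):
        # map each pixel through the source CDF, then the inverse target CDF
        src_sorted = np.sort(channel.ravel())
        q = np.searchsorted(src_sorted, channel.ravel()) / channel.size
        return np.quantile(target_samples, q).reshape(channel.shape)

    rng = np.random.default_rng(7)
    img = rng.normal(100, 5, (64, 64))                  # crowded spectral vicinity
    out = match_histogram(img, rng.uniform(0, 255, 10000))
    print(f"std before {img.std():.1f} -> after {out.std():.1f}")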
Martian Chronology: Goals for Investigations from a Recent Multidisciplinary Workshop
NASA Technical Reports Server (NTRS)
Nyquist, L.; Doran, P. T.; Cerling, T. E.; Clifford, S. M.; Forman, S. L.; Papanastassiou, D. A.; Stewart, B. W.; Sturchio, N. C.; Swindle, T. D.
2000-01-01
The absolute chronology of Martian rocks and events is based mainly on crater statistics and remains highly uncertain. Martian chronology will be critical to building a time scale comparable to Earth's to address questions about the early evolution of the planets and their ecosystems. In order to address issues and strategies specific to Martian chronology, a workshop was held, 4-7 June 2000, with invited participants from the planetary, geochronology, geochemistry, and astrobiology communities. The workshop focused on identifying: a) key scientific questions of Martian chronology; b) chronological techniques applicable to Mars; c) unique processes on Mars that could be exploited to obtain rates, fluxes, ages; and d) sampling issues for these techniques. This is an overview of the workshop findings and recommendations.
Noble Metal Immersion Spectroscopy of Silica Alcogels and Aerogels
NASA Technical Reports Server (NTRS)
Smith, David D.; Sibille, Laurent; Cronise, Raymond J.; Noever, David A.
1998-01-01
We have fabricated aerogels containing gold and silver nanoparticles for gas catalysis applications. By applying the concept of an average or effective dielectric constant to the heterogeneous interlayer surrounding each particle, we extend the technique of immersion spectroscopy to porous or heterogeneous media. Specifically, we apply the predominant effective medium theories for the determination of the average fractional composition of each component in this inhomogeneous layer. Hence, the surface area of metal available for catalytic gas reaction is determined. The technique is satisfactory for statistically random metal particle distributions but needs further modification for aggregated or surfactant modified systems. Additionally, the kinetics suggest that collective particle interactions in coagulated clusters are perturbed during silica gelation resulting in a change in the aggregate geometry.
Surface Plasmon Resonance Evaluation of Colloidal Metal Aerogel Filters
NASA Technical Reports Server (NTRS)
Smith, David D.; Sibille, Laurent; Cronise, Raymond J.; Noever, David A.
1997-01-01
We have fabricated aerogels containing gold, silver, and platinum nanoparticles for gas catalysis applications. By applying the concept of an average or effective dielectric constant to the heterogeneous interlayer surrounding each particle, we extend the technique of immersion spectroscopy to porous or heterogeneous media. Specifically, we apply the predominant effective medium theories for the determination of the average fractional composition of each component in this inhomogeneous layer. Hence, the surface area of metal available for catalytic gas reaction is determined. The technique is satisfactory for statistically random metal particle distributions but needs further modification for aggregated or surfactant modified systems. Additionally, the kinetics suggest that collective particle interactions in coagulated clusters are perturbed during silica gelation resulting in a change in the aggregate geometry.
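Of the predominant effective medium theories both abstracts refer to, the Maxwell Garnett mixing rule is the classic example (shown here as an illustration; the abstracts do not say which theories were used). For spherical inclusions of dielectric constant \varepsilon_i at volume fraction f in a matrix \varepsilon_m, the effective dielectric constant satisfies

    \frac{\varepsilon_{\mathrm{eff}} - \varepsilon_m}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_m} = f\, \frac{\varepsilon_i - \varepsilon_m}{\varepsilon_i + 2\varepsilon_m}

so inverting a measured \varepsilon_{\mathrm{eff}} for f yields the average fractional composition of the heterogeneous interlayer that immersion spectroscopy seeks.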
Adaption from LWIR to visible wavebands of methods to describe the population of GEO belt debris
NASA Astrophysics Data System (ADS)
Meng, Kevin; Murray-Krezan, Jeremy; Seitzer, Patrick
2018-05-01
Prior efforts to characterize the number of GEO belt debris objects by statistically analyzing the distribution of debris as a function of size have relied on techniques unique to infrared measurements of the debris. Specifically, the infrared measurement techniques permitted inference of the characteristic size of the debris. This report describes a method to adapt the previous techniques and measurements to visible wavebands. Results will be presented using data from a NASA optical, visible-band survey of objects near the geosynchronous orbit (GEO) belt. This survey used the University of Michigan's 0.6-m Curtis-Schmidt telescope, the Michigan Orbital DEbris Survey Telescope (MODEST), located at Cerro Tololo Inter-American Observatory in Chile. The system is equipped with a scanning CCD with a field of view of 1.6° × 1.6° and can detect objects smaller than 20 cm diameter at GEO.
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-01-01
Aims and Objectives: The objective of the present study is to compare the effectiveness of three different processing techniques and to determine their accuracy through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, divided into three subgroups. Three processing techniques (compression molding, and injection molding using prepolymerized or unpolymerized resin) were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Results: Data obtained from the three groups were subjected to a one-way ANOVA test; results with significant variations were then subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was greater in both centric and eccentric positions compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was greater with the compression molding technique compared to the injection molding techniques, which is statistically significant (P < 0.001). Conclusions: Within the limitations of this study, injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported between the two injection molding systems. PMID:28713763
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures.
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-06-01
The objective of the present study is to compare the effectiveness of three different processing techniques and to determine their accuracy through the number of occlusal interferences and the increase in vertical dimension after denture processing. A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, divided into three subgroups. Three processing techniques (compression molding, and injection molding using prepolymerized or unpolymerized resin) were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Data obtained from the three groups were subjected to a one-way ANOVA test; results with significant variations were then subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was greater in both centric and eccentric positions compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was greater with the compression molding technique compared to the injection molding techniques, which is statistically significant (P < 0.001). Within the limitations of this study, injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported between the two injection molding systems.
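The reported chain of analysis (one-way ANOVA followed by a post hoc comparison) is straightforward to reproduce; the roughness-style values below are invented placeholders, not the study's measurements:

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(8)
    compression = rng.normal(0.90, 0.10, 10)   # compression molding
    inj_prepoly = rng.normal(0.60, 0.10, 10)   # injection molding, prepolymerized resin
    inj_unpoly = rng.normal(0.62, 0.10, 10)    # injection molding, unpolymerized resin

    print("ANOVA p:", stats.f_oneway(compression, inj_prepoly, inj_unpoly).pvalue)

    values = np.concatenate([compression, inj_prepoly, inj_unpoly])
    groups = ["comp"] * 10 + ["inj_pre"] * 10 + ["inj_un"] * 10
    print(pairwise_tukeyhsd(values, groups))   # pairwise post hoc comparisons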
Jacobson, Magdalena; Wallgren, Per; Nordengrahn, Ann; Merza, Malik; Emanuelson, Ulf
2011-04-01
Lawsonia intracellularis is a common cause of chronic diarrhoea and poor performance in young growing pigs. Diagnosis of this obligate intracellular bacterium is based on the demonstration of the microbe or microbial DNA in tissue specimens or faecal samples, or the demonstration of L. intracellularis-specific antibodies in sera. The aim of the present study was to evaluate a blocking ELISA for the detection of serum antibodies to L. intracellularis, by comparison to the previously widely used immunofluorescent antibody test (IFAT). Sera were collected from 176 pigs aged 8-12 weeks originating from 24 herds with or without problems with diarrhoea and poor performance in young growing pigs. Sera were analyzed by the blocking ELISA and by IFAT. Bayesian modelling techniques were used to account for the absence of a gold standard test, and the results of the blocking ELISA were modelled against the IFAT with a "2 dependent tests, 2 populations, no gold standard" model. At the finally selected cut-off value of 35 percent inhibition (PI), the diagnostic sensitivity of the blocking ELISA was 72% and the diagnostic specificity was 93%. The positive predictive value was 0.82 and the negative predictive value was 0.89, at the observed prevalence of 33.5%. The sensitivity and specificity as evaluated by Bayesian statistical techniques differed from those previously reported. Properties of diagnostic tests may well vary between countries, laboratories and among populations of animals. In the absence of a true gold standard, the importance of validating new methods by appropriate statistical methods and with respect to the target population must be emphasized.
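The reported predictive values follow directly from sensitivity, specificity and prevalence; a quick check of the figures (small discrepancies are expected because the published sensitivity and specificity are rounded):

    se, sp, prev = 0.72, 0.93, 0.335
    ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
    npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
    print(f"PPV ~ {ppv:.2f}, NPV ~ {npv:.2f}")   # reported: 0.82 and 0.89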
In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However, there is little guidance for choosing among techniques, and the extent to which log-transfor...
Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science
ERIC Educational Resources Information Center
Ju, Boryung; Jin, Tao
2013-01-01
Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees
ERIC Educational Resources Information Center
Harraway, John A.; Barker, Richard J.
2005-01-01
A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…
Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H
2017-03-13
Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that, to date, generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality based on three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre, and the sample of patients used to generate the control limits was limited to 100. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
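A minimal sketch of an individuals/moving-range control chart of the kind described above, with invented chi pass-rates; the constants are the textbook I-MR chart values, not parameters taken from the paper:

    import numpy as np

    # Invented chi pass-rates (%) from successive treatment deliveries
    rates = np.array([92.1, 88.4, 95.0, 90.2, 86.7, 93.5, 91.0, 89.8])

    mr = np.abs(np.diff(rates))      # moving ranges of successive points
    sigma_hat = mr.mean() / 1.128    # d2 constant for subgroups of size 2
    center = rates.mean()
    lcl = center - 3 * sigma_hat     # only the lower limit matters for a pass-rate
    print(f"centre = {center:.1f}%, LCL = {lcl:.1f}%")

    # One-sided capability index against a lower specification limit
    lsl = 62.5                       # e.g. the IMRT control limit reported above
    cpk = (center - lsl) / (3 * sigma_hat)
    print(f"Cpk = {cpk:.2f}  (values < 1 flag room for improvement)")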
Accounting for standard errors of vision-specific latent trait in regression models.
Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L
2014-07-11
To demonstrate the effectiveness of a Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for the SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analyses performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and the multiple linear regression model for the assessment of association effects related to vision-specific latent traits. This novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously; its effectiveness was compared in our simulation study with the frequently used two-stage "separate-analysis" approach (Rasch analysis followed by traditional statistical analyses without adjustment for the SE of the latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration the SEs of the estimated Rasch-scaled scores. The two models differed on real data in effect size estimates and in the identification of "independent risk factors." Simulation results showed that the proposed HB one-stage "joint-analysis" approach produces greater accuracy (on average a 5-fold decrease in bias) with comparable power and precision in the estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure, despite accounting for greater uncertainty due to the latent trait. Patient-reported data analyses using Rasch techniques do not take into account the SE of the latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimates and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process greatly relies on identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on method of manufacture, sophistication of the synthesis operation, starting material source, and final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
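PLS-DA amounts to regressing class-membership dummies on the feature matrix and assigning each sample to the class with the largest predicted score. A sketch on purely synthetic data (real CAS measurements would be needed for any meaningful discrimination):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 240))        # 240 monitored signals per crude sample
    routes = rng.integers(0, 6, size=60)  # six synthesis routes
    Y = np.eye(6)[routes]                 # one-hot dummy response matrix

    # PLS-DA: PLS regression onto class dummies, then argmax over predictions
    pls = PLSRegression(n_components=5).fit(X, Y)
    predicted_route = pls.predict(X).argmax(axis=1)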
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged through post-selection classification accuracy, computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, is proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide for selecting statistical techniques to identify informative genes in high dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
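A hedged sketch of greedy maximum-relevance minimum-redundancy selection of the general kind described, using an ANOVA F-score for relevance and mean absolute correlation for redundancy; this illustrates the composite-measure idea only and is not the BootMRMR package's algorithm:

    import numpy as np
    from sklearn.feature_selection import f_classif

    def mrmr_greedy(X, y, k):
        # relevance: F-score of each gene against the class labels
        relevance, _ = f_classif(X, y)
        # redundancy: mean absolute correlation with already-selected genes
        corr = np.abs(np.corrcoef(X, rowvar=False))
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            rest = [j for j in range(X.shape[1]) if j not in selected]
            scores = [relevance[j] - corr[j, selected].mean() for j in rest]
            selected.append(rest[int(np.argmax(scores))])
        return selected

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 50))    # 100 subjects x 50 genes, synthetic
    y = rng.integers(0, 2, size=100)  # two phenotype classes
    print(mrmr_greedy(X, y, k=10))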
Spatial Modeling of Agricultural Land-Use Change at Global Scale
NASA Astrophysics Data System (ADS)
Meiyappan, Prasanth; Dalton, Michael; O'Neill, Brian C.; Jain, Atul K.
2013-12-01
Land use is both a source and a consequence of climate change. Long-term modeling of land use is central to global scale assessments using Integrated Assessment Models (IAMs) to explore policy alternatives, especially because adaptation and mitigation of climate change require long-term commitment. We present a land-use change modeling framework that can reproduce the past 100 years of evolution of global cropland and pastureland patterns to reasonable accuracy. The novelty of our approach lies in integrating knowledge from both the observed behavior and the economic rationale behind land-use decisions, thereby making up for the intrinsic deficits of both disciplines. The underlying economic rationale is profit maximization by individual landowners, which implicitly reflects the local-level decision-making process at a larger scale. Observed behavior, based on examining the relationships between contemporary land-use patterns and their socioeconomic and biophysical drivers, enters as an explicit factor into the economic framework. The land-use allocation is modified by autonomous developments and competition between land-use types. The framework accounts for spatial heterogeneity in the nature of driving factors across geographic regions. The model is currently configured to downscale continental-scale aggregate land-use information to region-specific changes in land-use patterns (0.5-deg spatial resolution); the temporal resolution is one year. The historical validation experiment is facilitated by synthesizing gridded maps of a wide range of potential biophysical and socioeconomic driving factors for the 20th century. To our knowledge, this is the first retrospective analysis that has been successful in reproducing the historical experience at a global scale. We apply the method to gain insights into two questions: (1) what are the dominant socioeconomic and biophysical driving factors of contemporary cropland and pastureland patterns across geographic regions, and (2) what were the impacts of various driving factors in shaping cropland and pastureland patterns over the 20th century. Specifically, we focus on the causes of changes in land-use patterns in certain key regions of the world, such as the abandonment of cropland in the eastern US and the subsequent expansion into the mid-west US. This presentation will focus on the scientific basis behind the developed framework and the motivations behind selecting specific statistical techniques to implement the scientific theory. Specifically, we will highlight the application of recently developed statistical techniques that are highly efficient in dealing with problems such as spatial autocorrelation and multicollinearity, which are common in land-change studies although these techniques have largely been confined to the medical literature. We will present the validation results and an example application of the developed framework within an IAM. The presented framework provides a benchmark for long-term spatial modeling of land use that will benefit the IAM, land use and Earth system modeling communities.
Sojoudi, Alireza; Goodyear, Bradley G
2016-12-01
Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach is proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model, composed of overlapping sliding windows of fMRI signals and incorporating the fact that overlapping windows are not independent, is described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
Privacy-Preserving Data Exploration in Genome-Wide Association Studies.
Johnson, Aaron; Shmatikov, Vitaly
2013-08-01
Genome-wide association studies (GWAS) have become a popular method for analyzing sets of DNA sequences in order to discover the genetic basis of disease. Unfortunately, statistics published as the result of GWAS can be used to identify individuals participating in the study. To prevent privacy breaches, even previously published results have been removed from public databases, impeding researchers' access to the data and hindering collaborative research. Existing techniques for privacy-preserving GWAS focus on answering specific questions, such as correlations between a given pair of SNPs (DNA sequence variations). This does not fit the typical GWAS process, where the analyst may not know in advance which SNPs to consider and which statistical tests to use, how many SNPs are significant for a given dataset, etc. We present a set of practical, privacy-preserving data mining algorithms for GWAS datasets. Our framework supports exploratory data analysis, where the analyst does not know a priori how many and which SNPs to consider. We develop privacy-preserving algorithms for computing the number and location of SNPs that are significantly associated with the disease, the significance of any statistical test between a given SNP and the disease, any measure of correlation between SNPs, and the block structure of correlations. We evaluate our algorithms on real-world datasets and demonstrate that they produce significantly more accurate results than prior techniques while guaranteeing differential privacy.
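The basic primitive behind such privacy-preserving releases is the Laplace mechanism; the paper's exploratory algorithms build on mechanisms of this kind. A minimal sketch with a hypothetical SNP count:

    import numpy as np

    def laplace_release(true_value, sensitivity, epsilon, seed=None):
        # Add Laplace(sensitivity / epsilon) noise so the released value
        # satisfies epsilon-differential privacy for a query with the
        # given sensitivity
        rng = np.random.default_rng(seed)
        return true_value + rng.laplace(scale=sensitivity / epsilon)

    # e.g. a privatised count of SNPs passing a significance threshold;
    # adding or removing one participant changes such a count by at most 1
    noisy_count = laplace_release(true_value=17, sensitivity=1.0, epsilon=0.5)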
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I.; Smith, A. (Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
NASA Astrophysics Data System (ADS)
Mullan, Donal; Chen, Jie; Zhang, Xunchang John
2016-02-01
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
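Wet-day frequency, the statistic singled out above, is one of the simplest validation checks to compute. A hedged sketch on synthetic gamma-distributed series standing in for the observed and downscaled records:

    import numpy as np

    def wet_day_frequency(daily_precip_mm, threshold=1.0):
        # fraction of days at or above the wet-day threshold
        return (np.asarray(daily_precip_mm) >= threshold).mean()

    rng = np.random.default_rng(3)
    observed = rng.gamma(shape=0.4, scale=5.0, size=3650)    # ten invented years
    downscaled = rng.gamma(shape=0.4, scale=4.5, size=3650)
    bias = wet_day_frequency(downscaled) - wet_day_frequency(observed)
    print(f"wet-day frequency bias: {bias:+.3f}")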
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
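The statistical core of such an approach can be illustrated by plain Monte Carlo sampling with a Beta-Bernoulli posterior over the probability of reaching a target event; the informed-sampling path pruning described above is deliberately omitted from this sketch:

    import random
    from scipy import stats

    def estimate_event_probability(program, n_samples=10_000, a=1.0, b=1.0):
        # Beta-Bernoulli estimate of the probability that a random input
        # reaches the target event (e.g. an assert violation)
        hits = sum(bool(program(random.random())) for _ in range(n_samples))
        posterior = stats.beta(a + hits, b + n_samples - hits)
        return posterior.mean(), posterior.interval(0.95)

    # toy "program": the event fires when the input lands in a small region
    mean, ci = estimate_event_probability(lambda x: x < 0.03)
    print(mean, ci)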
Das, D K; Maiti, A K; Chakraborty, C
2015-03-01
In this paper, we propose a comprehensive image characterization cum classification framework for malaria-infected stage detection using microscopic images of thin blood smears. The methodology mainly includes microscopic imaging of Leishman-stained blood slides, noise reduction and illumination correction, erythrocyte segmentation, and feature selection followed by machine classification. Amongst the three image segmentation algorithms evaluated (rule-based, Chan-Vese-based and marker-controlled watershed methods), the marker-controlled watershed technique provides better boundary detection of erythrocytes, especially in overlapping situations. Microscopic features at the intensity, texture and morphology levels are extracted to discriminate infected and noninfected erythrocytes. In order to achieve a subgroup of potential features, feature selection techniques, namely F-statistic and information gain criteria, are considered here for ranking. Finally, five different classifiers, namely Naive Bayes, multilayer perceptron neural network, logistic regression, classification and regression tree (CART), and RBF neural network, were trained and tested on 888 erythrocytes (infected and noninfected) for each feature subset. Performance evaluation of the proposed methodology shows that the multilayer perceptron network provides higher accuracy for malaria-infected erythrocyte recognition and infected stage classification. Results show that the top 90 features ranked by F-statistic (specificity: 98.64%, sensitivity: 100%, PPV: 99.73% and overall accuracy: 96.84%) and the top 60 features ranked by information gain (specificity: 97.29%, sensitivity: 100%, PPV: 99.46% and overall accuracy: 96.73%) provide the best results for malaria-infected stage classification. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
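A minimal sketch of the two ranking criteria named above (F-statistic and information gain) using scikit-learn on synthetic stand-in features:

    import numpy as np
    from sklearn.feature_selection import f_classif, mutual_info_classif

    rng = np.random.default_rng(2)
    X = rng.normal(size=(888, 120))   # 888 erythrocytes x 120 features, synthetic
    y = rng.integers(0, 2, size=888)  # infected vs noninfected labels

    f_scores, _ = f_classif(X, y)
    mi_scores = mutual_info_classif(X, y, random_state=2)
    top90_by_f = np.argsort(f_scores)[::-1][:90]    # top 90 by F-statistic
    top60_by_mi = np.argsort(mi_scores)[::-1][:60]  # top 60 by information gain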
Contrast enhanced ultrasound and magnetic resonance imaging in hepatocellular carcinoma diagnosis.
Dumitrescu, Cristiana I; Gheonea, Ioana A; Săndulescu, Larisa; Surlin, Valeriu; Săftoiu, Adrian; Dumitrescu, Daniela
2013-12-01
The new developments in imaging technology, including contrast enhanced ultrasound (CEUS), computed tomography (CT), and magnetic resonance imaging (MRI), allow a better diagnosis of both malignant and benign liver lesions. A retrospective trial of 126 patients was conducted in the Gastroenterology and Imaging Departments of the University of Medicine and Pharmacy Craiova, Romania. CEUS and MRI were the imaging techniques used for the diagnosis of focal liver lesions (FLL), especially hepatocellular carcinoma (HCC); histopathology was used in only 15 cases. For each method of investigation we calculated the sensitivity, specificity, positive and negative predictive values (PPV and NPV), positive and negative likelihood ratios (+LR, -LR), and accuracy, and we compared the ROC curves. Statistical analysis also included the Chi-square and Kappa tests. Seventy-six cases were diagnosed as HCC, with an average size of 5.2±3.3 cm in diameter. The sensitivity and specificity were 71.4% and 95.6% for CEUS, and 91.4% and 98.9%, respectively, for MRI. When comparing the ROC curves, we found a higher area under the curve for MRI (0.952) than for CEUS (0.835) (p=0.005), with a 95% confidence interval of 0.0343 to 0.199 for the difference. No statistically significant difference in the diagnosis of FLL was found between CEUS and MRI (p > 0.05) and the agreement between the two imaging techniques was good (k = 0.78). CEUS can be used as the first step in the diagnosis of liver lesions, but MRI remains the gold standard diagnostic method for liver tumors.
Statistical Model Selection for TID Hardness Assurance
NASA Technical Reports Server (NTRS)
Ladbury, R.; Gorelick, J. L.; McClure, S.
2010-01-01
Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just for qualification decisions, but also for quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic providing a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.
Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt
2015-01-01
One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
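A hedged sketch of a breaking-load analysis in the extreme-value spirit described above: a two-parameter Weibull fit to invented breaking stresses, from which a survival probability at a given applied stress is read off. The authors' actual threshold-stress procedure is more involved:

    import numpy as np
    from scipy import stats

    # Invented breaking stresses (MPa) after one exposure period
    strengths = np.array([310.0, 295.0, 288.0, 305.0, 276.0, 299.0, 283.0, 291.0])

    # Two-parameter Weibull fit (location fixed at zero), a common
    # extreme-value model for strength data
    shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
    applied = 260.0
    survival = stats.weibull_min.sf(applied, shape, loc=loc, scale=scale)
    print(f"estimated survival probability at {applied} MPa: {survival:.3f}")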
NASA Astrophysics Data System (ADS)
Aalto, J.; Karjalainen, O.; Hjort, J.; Luoto, M.
2018-05-01
Mean annual ground temperature (MAGT) and active layer thickness (ALT) are key to understanding the evolution of the ground thermal state across the Arctic under climate change. Here a statistical modeling approach is presented to forecast current and future circum-Arctic MAGT and ALT in relation to climatic and local environmental factors, at spatial scales unreachable with contemporary transient modeling. After deploying an ensemble of multiple statistical techniques, distance-blocked cross validation between observations and predictions suggested excellent and reasonable transferability of the MAGT and ALT models, respectively. The MAGT forecasts indicated currently suitable conditions for permafrost to prevail over an area of 15.1 ± 2.8 × 106 km2. This extent is likely to dramatically contract in the future, as the results showed consistent, but region-specific, changes in ground thermal regime due to climate change. The forecasts provide new opportunities to assess future Arctic changes in ground thermal state and biogeochemical feedback.
Practical protocols for fast histopathology by Fourier transform infrared spectroscopic imaging
NASA Astrophysics Data System (ADS)
Keith, Frances N.; Reddy, Rohith K.; Bhargava, Rohit
2008-02-01
Fourier transform infrared (FT-IR) spectroscopic imaging is an emerging technique that combines the molecular selectivity of spectroscopy with the spatial specificity of optical microscopy. We demonstrate a new concept in obtaining high fidelity data using commercial array detectors coupled to a microscope and Michelson interferometer. Next, we apply the developed technique to rapidly provide automated histopathologic information for breast cancer. Traditionally, disease diagnoses are based on optical examinations of stained tissue and involve a skilled recognition of morphological patterns of specific cell types (histopathology). Consequently, histopathologic determinations are a time consuming, subjective process with innate intra- and inter-operator variability. Utilizing endogenous molecular contrast inherent in vibrational spectra, specially designed tissue microarrays and pattern recognition of specific biochemical features, we report an integrated algorithm for automated classifications. The developed protocol is objective, statistically significant and, being compatible with current tissue processing procedures, holds potential for routine clinical diagnoses. We first demonstrate that the classification of tissue type (histology) can be accomplished in a manner that is robust and rigorous. Since data quality and classifier performance are linked, we quantify the relationship through our analysis model. Last, we demonstrate the application of the minimum noise fraction (MNF) transform to improve tissue segmentation.
Reagent-free bacterial identification using multivariate analysis of transmission spectra
NASA Astrophysics Data System (ADS)
Smith, Jennifer M.; Huffman, Debra E.; Acosta, Dayanis; Serebrennikova, Yulia; García-Rubio, Luis; Leparc, German F.
2012-10-01
The identification of bacterial pathogens from culture is critical to the proper administration of antibiotics and patient treatment. Many of the tests currently used in the clinical microbiology laboratory for bacterial identification today can be highly sensitive and specific; however, they have the additional burdens of complexity, cost, and the need for specialized reagents. We present an innovative, reagent-free method for the identification of pathogens from culture. A clinical study has been initiated to evaluate the sensitivity and specificity of this approach. Multiwavelength transmission spectra were generated from a set of clinical isolates including Escherichia coli, Klebsiella pneumoniae, Pseudomonas aeruginosa, and Staphylococcus aureus. Spectra of an initial training set of these target organisms were used to create identification models representing the spectral variability of each species using multivariate statistical techniques. Next, the spectra of blinded isolates of the targeted species were identified using the models, achieving >94% sensitivity and >98% specificity, with 100% accuracy for P. aeruginosa and S. aureus. The results from this on-going clinical study indicate this approach is a powerful and exciting technique for identification of pathogens. The menu of models is being expanded to include other bacterial genera and species of clinical significance.
Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott
2014-07-08
In healthcare facilities, conventional surveillance techniques using rule-based guidelines may result in under- or over-reporting of methicillin-resistant Staphylococcus aureus (MRSA) outbreaks, as these guidelines are generally unvalidated. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting MRSA clusters, validate clusters using molecular techniques and hospital records, and determine significant differences in the rate of MRSA cases using regression models. Patients admitted to a community hospital between August 2006 and February 2011, and identified with MRSA >48 hours following hospital admission, were included in this study. Between March 2010 and February 2011, MRSA specimens were obtained for spa typing. MRSA clusters were investigated using a retrospective temporal scan statistic. Tests were conducted on a monthly scale and significant clusters were compared to MRSA outbreaks identified by hospital personnel. Associations between the rate of MRSA cases and the variables year, month, and season were investigated using a negative binomial regression model. During the study period, 735 MRSA cases were identified and 167 MRSA isolates were spa typed. Nine different spa types were identified, with spa type 2/t002 (88.6%) being the most prevalent. The temporal scan statistic identified significant MRSA clusters at the hospital (n=2), service (n=16), and ward (n=10) levels (P ≤ 0.05). Seven clusters were concordant with nine MRSA outbreaks identified by hospital staff. For the remaining clusters, seven events may have been equivalent to true outbreaks and six clusters demonstrated possible transmission events. The regression analysis indicated that the years 2009-2011, compared to 2006, and the months of March and April, compared to January, were associated with an increase in the rate of MRSA cases (P ≤ 0.05). The application of the temporal scan statistic identified several MRSA clusters that were not detected by hospital personnel. The identification of specific years and months with increased MRSA rates may be attributable to several hospital level factors including the presence of other pathogens. Within hospitals, the incorporation of the temporal scan statistic into standard surveillance techniques is a valuable tool for healthcare workers to evaluate surveillance strategies and aid in the identification of MRSA clusters.
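A simplified retrospective scan over monthly counts, scoring each candidate window by its Poisson tail probability; Kulldorff-style scan statistics additionally use a likelihood ratio and Monte Carlo inference, which this invented-data sketch omits:

    import numpy as np
    from scipy import stats

    # Invented monthly MRSA counts for one ward
    cases = np.array([3, 2, 4, 9, 8, 3, 2, 1, 4, 3, 2, 5])
    baseline = cases.mean()  # expected cases per month under the null

    # score every window of 1-3 months by its Poisson tail probability
    candidates = [
        (start, width, stats.poisson.sf(cases[start:start + width].sum() - 1,
                                        baseline * width))
        for width in range(1, 4)
        for start in range(len(cases) - width + 1)
    ]
    start, width, p = min(candidates, key=lambda c: c[2])
    print(f"most anomalous window: months {start}-{start + width - 1}, p ~ {p:.3g}")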
Earth Observation System Flight Dynamics System Covariance Realism
NASA Technical Reports Server (NTRS)
Zaidi, Waqar H.; Tracewell, David
2016-01-01
This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of ... stiffness. Matrix methods. Key Words: finite element technique, statistical energy analysis, experimental techniques, framed structures, computer programs. In order to further understand the practical application of the statistical energy analysis, a two section plate-like frame structure is ...
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
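The distinction drawn above is easy to demonstrate: PCA merely decorrelates a linear mixture of independent sources, while ICA can recover the sources themselves. A minimal sketch on synthetic signals:

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * t)              # first source signal
    s2 = np.sign(np.cos(3 * t))     # second, statistically independent source
    X = np.c_[s1, s2] @ np.array([[1.0, 0.5],
                                  [0.4, 1.0]])  # observed linear mixture

    pca_components = PCA(n_components=2).fit_transform(X)  # decorrelated, still mixed
    ica_components = FastICA(n_components=2, random_state=0).fit_transform(X)  # ~ sources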
Weak-value amplification and optimal parameter estimation in the presence of correlated noise
NASA Astrophysics Data System (ADS)
Sinclair, Josiah; Hallaji, Matin; Steinberg, Aephraim M.; Tollaksen, Jeff; Jordan, Andrew N.
2017-11-01
We analytically and numerically investigate the performance of weak-value amplification (WVA) and related parameter estimation methods in the presence of temporally correlated noise. WVA is a special instance of a general measurement strategy that involves sorting data into separate subsets based on the outcome of a second "partitioning" measurement. Using a simplified correlated noise model that can be analyzed exactly together with optimal statistical estimators, we compare WVA to a conventional measurement method. We find that WVA indeed yields a much lower variance of the parameter of interest than the conventional technique does, optimized in the absence of any partitioning measurements. In contrast, a statistically optimal analysis that employs partitioning measurements, incorporating all partitioned results and their known correlations, is found to yield an improvement—typically slight—over the noise reduction achieved by WVA. This result occurs because the simple WVA technique is not tailored to any specific noise environment and therefore does not make use of correlations between the different partitions. We also compare WVA to traditional background subtraction, a familiar technique where measurement outcomes are partitioned to eliminate unknown offsets or errors in calibration. Surprisingly, for the cases we consider, background subtraction turns out to be a special case of the optimal partitioning approach, possessing a similar typically slight advantage over WVA. These results give deeper insight into the role of partitioning measurements (with or without postselection) in enhancing measurement precision, which some have found puzzling. They also resolve previously made conflicting claims about the usefulness of weak-value amplification to precision measurement in the presence of correlated noise. We finish by presenting numerical results to model a more realistic laboratory situation of time-decaying correlations, showing that our conclusions hold for a wide range of statistical models.
Detection of plum pox virus infection in selection plum trees using spectral imaging
NASA Astrophysics Data System (ADS)
Angelova, Liliya; Stoev, Antoniy; Borisova, Ekaterina; Avramov, Latchezar
2016-01-01
Plum pox virus (PPV) is among the most studied viral diseases of plants in the world. It is considered to be one of the most devastating diseases of stone fruits in terms of agronomic impact and economic importance. Noninvasive, fast and reliable techniques are required for evaluation of the pathology in selection trees with economic impact. Such advanced tools for PPV detection could be optical techniques such as light-induced fluorescence and diffuse reflectance spectroscopy. Specific regions in the electromagnetic spectrum have been found to provide information about physiological stress in plants, and consequently, diseased plants usually exhibit a different spectral signature than non-stressed healthy plants in those specific ranges. In this study spectral reflectance and chlorophyll fluorescence were used for the identification of biotic stress caused by the pox virus on plum trees. The spectral responses of healthy and infected leaves from cultivars that are widespread in Bulgaria were investigated. The two applied techniques revealed statistically significant differences between the spectral data of healthy plum leaves and those infected by PPV in the visible and near-infrared spectral ranges. Their application for biotic stress detection helps in monitoring diseases in plants using the different plant spectral properties in these spectral ranges. The strong relationship between the results indicates the applicability of diffuse reflectance and fluorescence techniques for conducting health condition assessments of vegetation and their importance for plant protection practices.
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Janani, Maryam; Mokhtari, Hadi; Bahari, Mahmood; Rabbani, Parastu
2015-01-01
The aim of the present study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic mixing) on the physical properties [working time (WT), setting time (ST), dimensional changes (DC) and film thickness (FT)] of calcium-enriched mixture (CEM) cement and mineral trioxide aggregate (MTA). The mentioned physical properties were determined using the ISO 6876:2001 specification. Six samples of each material were prepared for each of the three mixing techniques (36 samples in total). Data were analyzed using descriptive statistics, two-way ANOVA and post hoc Tukey tests. The level of significance was defined at 0.05. Irrespective of mixing technique, there was no significant difference between the WT and FT of the tested materials. Except for the DC of MTA and the FT of all the materials, the other properties were significantly affected by mixing technique (P<0.05). The ultrasonic technique decreased the ST of MTA and CEM cement and increased the WT of CEM cement (P<0.05). The mixing technique had no significant effect on the dimensional changes of MTA or the film thickness of either material.
Free-space optical communication through a forest canopy.
Edwards, Clinton L; Davis, Christopher C
2006-01-01
We model the effects of the leaves of mature broadleaf (deciduous) trees on air-to-ground free-space optical communication systems operating through the leaf canopy. The concept of leaf area index (LAI) is reviewed and related to a probabilistic model of foliage consisting of obscuring leaves randomly distributed throughout a treetop layer. Individual leaves are opaque. The expected fractional unobscured area statistic is derived as well as the variance around the expected value. Monte Carlo simulation results confirm the predictions of this probabilistic model. To verify the predictions of the statistical model experimentally, a passive optical technique has been used to make measurements of observed sky illumination in a mature broadleaf environment. The results of the measurements, as a function of zenith angle, provide strong evidence for the applicability of the model, and a single parameter fit to the data reinforces a natural connection to LAI. Specific simulations of signal-to-noise ratio degradation as a function of zenith angle in a specific ground-to-unmanned aerial vehicle communication situation have demonstrated the effect of obscuration on performance.
NASA Astrophysics Data System (ADS)
Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan
2016-04-01
Humankind is now predominantly urban-based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also its impact hot spots. Furthermore, climate change impacts are commonly managed at the city scale. Therefore, assessing climate change impacts on urban systems is a very relevant subject of research. Climate and its impacts at all levels (local, meso and global scale), as well as the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews cover downscaling procedures according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology which may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique, yet in the context of spatial interpolation procedures it is commonly classified as deterministic, while kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both could be seen as identical since they refer to methods handling input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution leads to the introduction of new terms like hybrid or semi-stochastic approaches, which makes efforts to systematically classify downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures that classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations; the methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces corresponding to different realizations of the same processes. Those simulations are essential because real observational data are limited in number, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and research on climate change impacts at the city scale.
Rahm, Stefan; Camenzind, Roland S; Hingsammer, Andreas; Lenz, Christopher; Bauer, David E; Farshad, Mazda; Fucentese, Sandro F
2017-06-21
There have been conflicting studies published regarding the ability of various total knee arthroplasty (TKA) techniques to correct preoperative deformity. The purpose of this study was to compare the postoperative radiographic alignment in patients with severe preoperative coronal deformity (≥10° varus/valgus) who underwent three different TKA techniques: manual instrumentation (MAN), computer-navigated instrumentation (NAV) and patient-specific instrumentation (PSI). Patients who received a TKA with a preoperative coronal deformity of ≥10° and had available radiographs were included in this retrospective study. The groups were MAN (n = 54), NAV (n = 52) and PSI (n = 53). The mechanical axis (varus/valgus) and the posterior tibial slope were measured and analysed using standing long-leg and lateral radiographs. The overall mean postoperative varus and valgus deformities were 2.8° (range, 0 to 9.9; SD 2.3) and 2.5° (range, 0 to 14.7; SD 2.3), respectively. Outliers (>3°) represented 30.2% (48/159) of cases and were distributed as follows: MAN group, 31.5%; NAV group, 34.6%; PSI group, 24.4%. No statistically significant differences were found between these groups. The distribution of severe outliers (>5°) was 14.8% in the MAN group, 23% in the NAV group and 5.6% in the PSI group. The PSI group had significantly (p = 0.0108) fewer severe outliers than the NAV group, while all other pairwise comparisons were not statistically significant. In severe varus/valgus deformity the three surgical techniques demonstrated similar postoperative radiographic alignment. However, in reducing severe outliers (>5°) and in achieving the planned posterior tibial slope, the PSI technique for TKA may be superior to computer navigation and the conventional technique. Further prospective studies are needed to determine which technique is best at reducing outliers in patients with severe preoperative coronal deformity.
Al-Moraissi, E A; Elmansi, Y A; Al-Sharaee, Y A; Alrmali, A E; Alkhutari, A S
2016-03-01
A systematic review and meta-analysis was conducted to answer the clinical question "Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments?" A systematic and electronic search of several databases with specific key words, a reference search, and a manual search were performed from respective dates of inception through November 2014. The inclusion criteria were clinical human studies, including randomized controlled trials (RCTs), controlled clinical trials (CCTs), and retrospective studies, with the aim of comparing the piezoelectric surgical osteotomy technique to the standard rotary instrument technique in lower third molar surgery. Postoperative sequelae (oedema, trismus, and pain), the total number of analgesics taken, and the duration of surgery were analyzed. A total of nine articles were included, six RCTs, two CCTs, and one retrospective study. Six studies had a low risk of bias and three had a moderate risk of bias. A statistically significant difference was found between piezoelectric surgery and conventional rotary instrument surgery for lower third molar extraction with regard to postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken (P=0.0001, P=0.0001, P<0.00001, and P<0.0001, respectively). However, a statistically significant increased surgery time was required in the piezoelectric osteotomy group (P<0.00001). The results of the meta-analysis showed that piezoelectric surgery significantly reduced the occurrence of postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken compared to the conventional rotary instrument technique in lower third molar surgery, but required a longer surgery time. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions that are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
Nahid, Abdullah-Al; Mehrabi, Mohamad Ali; Kong, Yinan
2018-01-01
Breast cancer is a serious threat and one of the largest causes of death among women throughout the world. The identification of cancer largely depends on analysis of digital biomedical photography, such as histopathological images, by doctors and physicians. Analyzing histopathological images is a nontrivial task, and decisions from the investigation of these kinds of images always require specialised knowledge. However, Computer Aided Diagnosis (CAD) techniques can help the doctor make more reliable decisions. The state-of-the-art Deep Neural Network (DNN) has recently been introduced for biomedical image analysis. Normally each image contains structural and statistical information. This paper classifies a set of biomedical breast cancer images (the BreakHis dataset) using novel DNN techniques guided by structural and statistical information derived from the images. Specifically, a Convolutional Neural Network (CNN), a Long Short-Term Memory (LSTM) network, and a combination of CNN and LSTM are proposed for breast cancer image classification. Softmax and Support Vector Machine (SVM) layers have been used for the decision-making stage after extracting features using the proposed novel DNN models. In this experiment the best Accuracy value of 91.00% is achieved on the 200x dataset, the best Precision value of 96.00% is achieved on the 40x dataset, and the best F-Measure value is achieved on both the 40x and 100x datasets.
NASA Astrophysics Data System (ADS)
Waubke, Holger; Kasess, Christian H.
2016-11-01
Devices that emit structure-borne sound are commonly decoupled by elastic components to shield the environment from acoustical noise and vibrations. The elastic elements often have a hysteretic behavior that is typically neglected. In order to take hysteretic behavior into account, Bouc developed a differential equation for such materials, especially joints made of rubber or equipped with dampers. In this work, the Bouc model is solved by means of the Gaussian closure technique based on the Kolmogorov equation. Kolmogorov developed a method to derive probability density functions for arbitrary explicit first-order vector differential equations under white noise excitation, using a partial differential equation of a multivariate conditional probability distribution. Up to now, no analytical solution of the Kolmogorov equation in conjunction with the Bouc model has existed; therefore a wide range of approximate solutions, notably statistical linearization, were developed. Using the Gaussian closure technique, an approximation to the Kolmogorov equation that assumes a multivariate Gaussian distribution, an analytic solution is derived in this paper for the Bouc model. For the stationary case the two methods yield equivalent results; however, in contrast to statistical linearization, the presented solution allows the transient behavior to be calculated explicitly. Further, the stationary case leads to an implicit set of equations that can be solved iteratively, with a small number of iterations and without instabilities for specific parameter sets.
NASA Astrophysics Data System (ADS)
Raiman, Laura B.
1992-12-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
Statistical technique for analysing functional connectivity of multiple spike trains.
Masud, Mohammad Shahed; Borisyuk, Roman
2011-03-15
A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity, an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Wolf, S. F.; Lipschutz, M. E.
1993-01-01
Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855-1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that by a totally different criterion, labile trace element contents - hence thermal histories - of 13 Cluster 1 meteorites are distinguishable from those of 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...
The Use of a Context-Based Information Retrieval Technique
2009-07-01
provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies... LSA, which is also known as latent semantic indexing (LSI), uses a statistical and... In contrast, natural language models apply algorithms that combine statistical information with semantic information.
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' opinion and of statistical programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential for characterising surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. One of the most important tasks in this area is therefore to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for analysing dynamic contact angle measurements (sessile drop) in detail, applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis is explained in detail for the first time. These approaches yield contact angle data and different routes to specific contact angles that are independent of user skill and operator subjectivity. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is measured dynamically by the sessile drop technique while inclining the sample plate. The triple points, inclination angles, downhill angles (advancing motion) and uphill angles (receding motion) obtained by high-precision drop shape analysis are statistically analysed both independently and dependently. Because of the small distance covered in the dependent analysis (<0.4mm) and the dominance of counted events with small velocity, the measurements are little influenced by motion dynamics, and the procedure can be called "slow moving" analysis. The presented procedures are especially sensitive to the range from static to "slow moving" dynamic contact angle determination and are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, specific motion relations for drops on inclined surfaces are presented, together with detailed observations on the reactivity of the freshly cleaned silicon wafer surface that results in acceleration behaviour (reactive de-wetting). Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zimnyakov, Dmitry A.; Tuchin, Valery V.; Yodh, Arjun G.; Mishin, Alexey A.; Peretochkin, Igor S.
1998-04-01
Relationships between decorrelation and depolarization of coherent light scattered by disordered media are examined by using the concept of photon path distribution functions. Analysis of the behavior of the autocorrelation functions of the scattered field fluctuations and of their polarization properties allows us to introduce a generalized parameter of scattering media, the specific correlation time. Determination of the specific correlation time has been carried out for phantom scattering media (water suspensions of polystyrene spheres). Results of statistical, correlation and polarization analysis of static and dynamic speckle patterns, carried out in experiments with human sclera with artificially controlled optical transmittance, are presented. Some possibilities of applications of such a polarization-correlation technique for monitoring and visualization of non-single-scattering tissue structures are discussed.
Detecting Disease Specific Pathway Substructures through an Integrated Systems Biology Approach
Alaimo, Salvatore; Marceca, Gioacchino Paolo; Ferro, Alfredo; Pulvirenti, Alfredo
2017-01-01
In the era of network medicine, pathway analysis methods play a central role in the prediction of phenotype from high throughput experiments. In this paper, we present a network-based systems biology approach capable of extracting disease-perturbed subpathways within pathway networks in connection with expression data taken from The Cancer Genome Atlas (TCGA). Our system extends pathways with missing regulatory elements, such as microRNAs, and their interactions with genes. The framework enables the extraction, visualization, and analysis of statistically significant disease-specific subpathways through an easy to use web interface. Our analysis shows that the methodology is able to fill the gap in current techniques, allowing a more comprehensive analysis of the phenomena underlying disease states. PMID:29657291
Targeted numerical simulations of binary black holes for GW170104
NASA Astrophysics Data System (ADS)
Healy, J.; Lange, J.; O'Shaughnessy, R.; Lousto, C. O.; Campanelli, M.; Williamson, A. R.; Zlochower, Y.; Calderón Bustillo, J.; Clark, J. A.; Evans, C.; Ferguson, D.; Ghonge, S.; Jani, K.; Khamesra, B.; Laguna, P.; Shoemaker, D. M.; Boyle, M.; García, A.; Hemberger, D. A.; Kidder, L. E.; Kumar, P.; Lovelace, G.; Pfeiffer, H. P.; Scheel, M. A.; Teukolsky, S. A.
2018-03-01
In response to LIGO's observation of GW170104, we performed a series of full numerical simulations of binary black holes, each designed to replicate likely realizations of its dynamics and radiation. These simulations have been performed at multiple resolutions and with two independent techniques to solve Einstein's equations. For the nonprecessing and precessing simulations, we demonstrate that the two techniques agree mode by mode, at a precision substantially in excess of the statistical uncertainties in LIGO's current observations. Conversely, we demonstrate that our full numerical solutions contain information which is not accurately captured by the approximate phenomenological models commonly used to infer compact binary parameters. To quantify the impact of these differences on parameter inference for GW170104 specifically, we compare the predictions of our simulations and of these approximate models to LIGO's observations of GW170104.
CD process control through machine learning
NASA Astrophysics Data System (ADS)
Utzny, Clemens
2016-10-01
For the specific requirements of the 14nm and 20nm site applications, a new CD map approach was developed at the AMTC. This approach relies on a well-established machine learning technique called recursive partitioning. Recursive partitioning is a powerful technique which creates a decision tree by successively testing whether the quantity of interest can be explained by one of the supplied covariates. The test performed is generally a statistical test with a pre-supplied significance level. Once the test indicates a significant association between the variable of interest and a covariate, a split is performed at a threshold value which minimizes the variation within the newly obtained groups. This partitioning is repeated recursively until either no significant association can be detected or the resulting subgroup size falls below a pre-supplied level.
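A minimal sketch of the recursive partitioning loop described above, assuming a single numeric covariate, Welch's t-test as the significance test, and illustrative defaults for the significance level and minimum group size (this is an illustration, not the AMTC implementation):

```python
# Recursive partitioning sketch: split on a covariate while a statistical
# test indicates association, choosing the threshold that minimizes the
# within-group variation, and recurse on both sides.
import numpy as np
from scipy import stats

def recursive_partition(x, y, alpha=0.05, min_size=10):
    """Return a list of (lo, hi) covariate intervals forming the leaves."""
    order = np.argsort(x)
    xs, ys = np.asarray(x)[order], np.asarray(y)[order]
    if len(ys) <= 2 * min_size:
        return [(xs[0], xs[-1])]
    # candidate splits: minimize total within-group variation
    costs = [np.var(ys[:i]) * i + np.var(ys[i:]) * (len(ys) - i)
             for i in range(min_size, len(ys) - min_size)]
    i_best = min_size + int(np.argmin(costs))
    # keep the split only if the two groups differ significantly
    _, p = stats.ttest_ind(ys[:i_best], ys[i_best:], equal_var=False)
    if p >= alpha:
        return [(xs[0], xs[-1])]
    return (recursive_partition(xs[:i_best], ys[:i_best], alpha, min_size)
            + recursive_partition(xs[i_best:], ys[i_best:], alpha, min_size))

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 300)
y = np.where(x < 4.0, 1.0, 2.0) + rng.normal(0, 0.2, 300)  # step at x = 4
print(recursive_partition(x, y))
```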
Forensic analysis of dyed textile fibers.
Goodpaster, John V; Liszewski, Elisa A
2009-08-01
Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.
2017-10-01
casualty care using descriptive statistical analysis and modeling techniques. Aim 2: Identify the ideal provider training and competency assessment... Methodologies, Course Type, Course Availability, Assessment Criteria, Requirements, Funding, Alignment with Clinical Practice Guidelines (CPGs)... Aim 1: Descriptive study of all available information for combat casualties in Afghanistan. Specific Tasks: 1) who - patients treated; clinician mix
Patient Populations, Clinical Associations, and System Efficiency in Healthcare Delivery System
NASA Astrophysics Data System (ADS)
Liu, Yazhuo
The efforts to improve health care delivery usually involve studies and analysis of patient populations and healthcare systems. In this dissertation, I present the research conducted in the following areas: identifying patient groups, improving treatments for specific conditions by using statistical as well as data mining techniques, and developing new operations research models to increase system efficiency from the health institutes' perspective. The results provide a better understanding of high-risk patient groups, more accuracy in detecting disease correlations, and practical scheduling tools that consider uncertain operation durations and real-life constraints.
Fluorescent-Antibody Measurement Of Cancer-Cell Urokinase
NASA Technical Reports Server (NTRS)
Morrison, Dennis R.
1993-01-01
Combination of laboratory techniques provides measurements of amounts of urokinase in and between normal and cancer cells. Includes use of fluorescent antibodies specific against different forms of urokinase-type plasminogen activator (uPA), fluorescence microscopy, quantitative analysis of images of sections of tumor tissue, and flow cytometry of different uPA's and deoxyribonucleic acid (DNA) found in suspended-tumor-cell preparations. Measurements provide a statistical method for indicating or predicting the metastatic potentials of some invasive tumors. Assessments of metastatic potentials based on such measurements are used in determining appropriate follow-up procedures after surgical removal of tumors.
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1991-01-01
The behavior of stochastic processes whose power spectra are described by power-law behavior was studied. The details of the analysis and the conclusions that were reached are presented. This analysis was extended to compare the detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises is also investigated.
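To illustrate the kind of process under study, the sketch below generates a series whose power spectrum follows a chosen power law by shaping white noise in the frequency domain; the exponent and series length are arbitrary example values, not parameters from the study:

```python
# Generate a time series whose power spectrum follows f^(-alpha) by
# shaping white Gaussian noise in the frequency domain.
import numpy as np

def power_law_noise(n, alpha, seed=None):
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                                  # avoid division by zero at DC
    spectrum = np.fft.rfft(white) * f ** (-alpha / 2.0)
    return np.fft.irfft(spectrum, n)

series = power_law_noise(4096, alpha=2.0)        # alpha=2: random-walk-like
```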
Meteorology Assessment of Historic Rainfall for Los Alamos During September 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruggeman, David Alan; Dewart, Jean Marie
2016-02-12
DOE Order 420.1, Facility Safety, requires that site natural phenomena hazards be evaluated every 10 years to support the design of nuclear facilities. The evaluation requires calculating return period rainfall to determine roof loading requirements and flooding potential based on our on-site rainfall measurements. The return period rainfall calculations are done based on statistical techniques and not site-specific meteorology. This and future studies analyze the meteorological factors that produce the significant rainfall events. These studies provide the meteorology context of the return period rainfall events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Piburn, Jesse O; McManamay, Ryan A
2017-01-01
Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.
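The general pattern described here, drawing uncertain population counts from an inferred distribution and propagating them through a Monte Carlo experiment, can be sketched as follows; the Poisson input and the per-capita demand model are placeholder assumptions, not the authors' dasymetric framework:

```python
# Monte Carlo sketch with an uncertain population count as input.
# The Poisson mean and per-capita model are invented placeholders.
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000
inferred_mean_population = 1250.0     # hypothetical dasymetric estimate
per_capita_demand = 0.8               # hypothetical model parameter

population = rng.poisson(inferred_mean_population, n_draws)
demand = population * per_capita_demand * rng.lognormal(0.0, 0.1, n_draws)

print(f"mean demand  : {demand.mean():.1f}")
print(f"95% interval : {np.percentile(demand, [2.5, 97.5])}")
```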
Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz
2013-01-01
The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with the new knotless single-row suture anchor technique, and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with the double-row suture anchor technique, were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded at a static velocity of 5 mm/min with a 2.5 kgN load cell in an Instron 8874 machine until repair failure. The 1 cm transosseous group was statistically superior to the 1 cm control group (p=0.021, p<0.05) and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and the traditional transosseous suture techniques.
Automatic brain tumor detection in MRI: methodology and statistical validation
NASA Astrophysics Data System (ADS)
Iftekharuddin, Khan M.; Islam, Mohammad A.; Shaik, Jahangheer; Parra, Carlos; Ogg, Robert
2005-04-01
Automated brain tumor segmentation and detection are immensely important in medical diagnostics because they provide information about anatomical structures as well as potentially abnormal tissue, necessary for appropriate surgical planning. In this work, we propose a novel automated brain tumor segmentation technique based on multiresolution texture information that combines fractional Brownian motion (fBm) and wavelet multiresolution analysis. Our wavelet-fractal technique combines the excellent multiresolution localization property of wavelets with the texture extraction of fractals. We prove the efficacy of our technique by successfully segmenting pediatric brain MR images (MRIs) from St. Jude Children's Research Hospital. We use a self-organizing map (SOM) as our clustering tool, wherein we exploit both pixel intensity and multiresolution texture features to obtain the segmented tumor. Our test results show that our technique successfully segments abnormal brain tissues in a set of T1 images. In the next step, we design a classifier using a Feed-Forward (FF) neural network to statistically validate the presence of tumor in MRI using both the multiresolution texture and the pixel intensity features. We estimate the corresponding receiver operating characteristic (ROC) curve from the true positive fractions and false positive fractions produced by our classifier at different threshold values. An ROC curve, which can be considered a gold standard for proving the competence of a classifier, is obtained to ascertain the sensitivity and specificity of our classifier. We observe that at a threshold of 0.4 we achieve a true positive value of 1.0 (100%), sacrificing only a 0.16 (16%) false positive value, for the set of 50 T1 MRIs analyzed in this experiment.
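The ROC construction used for validation above amounts to sweeping a decision threshold over classifier scores and recording true and false positive fractions; a self-contained sketch on synthetic scores (not the study's classifier outputs):

```python
# Build an ROC curve by sweeping a decision threshold over classifier
# scores; the scores and labels here are synthetic.
import numpy as np

def roc_curve(scores, labels):
    thresholds = np.unique(scores)[::-1]         # descending thresholds
    pos, neg = (labels == 1).sum(), (labels == 0).sum()
    tpf, fpf = [], []
    for t in thresholds:
        pred = scores >= t
        tpf.append((pred & (labels == 1)).sum() / pos)
        fpf.append((pred & (labels == 0)).sum() / neg)
    return np.array(fpf), np.array(tpf)

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 200)
scores = labels * 0.5 + rng.random(200) * 0.8    # noisy but informative
fpf, tpf = roc_curve(scores, labels)
auc = np.trapz(tpf, fpf)                         # area under the curve
print(f"AUC = {auc:.3f}")
```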
A critical review of the neuroimaging literature on synesthesia
Hupé, Jean-Michel; Dojat, Michel
2015-01-01
Synesthesia refers to additional sensations experienced by some people for specific stimulations, such as the systematic arbitrary association of colors to letters for the most studied type. Here, we review all the studies (based mostly on functional and structural magnetic resonance imaging) that have searched for the neural correlates of this subjective experience, as well as structural differences related to synesthesia. Most differences claimed for synesthetes are unsupported, due mainly to low statistical power, statistical errors, and methodological limitations. Our critical review therefore casts some doubts on whether any neural correlate of the synesthetic experience has been established yet. Rather than being a neurological condition (i.e., a structural or functional brain anomaly), synesthesia could be reconsidered as a special kind of childhood memory, whose signature in the brain may be out of reach with present brain imaging techniques. PMID:25873873
Model selection as a science driver for dark energy surveys
NASA Astrophysics Data System (ADS)
Mukherjee, Pia; Parkinson, David; Corasaniti, Pier Stefano; Liddle, Andrew R.; Kunz, Martin
2006-07-01
A key science goal of upcoming dark energy surveys is to seek time-evolution of the dark energy. This problem is one of model selection, where the aim is to differentiate between cosmological models with different numbers of parameters. However, the power of these surveys is traditionally assessed by estimating their ability to constrain parameters, which is a different statistical problem. In this paper, we use Bayesian model selection techniques, specifically forecasting of the Bayes factors, to compare the abilities of different proposed surveys in discovering dark energy evolution. We consider six experiments - supernova luminosity measurements by the Supernova Legacy Survey, SNAP, JEDI and ALPACA, and baryon acoustic oscillation measurements by WFMOS and JEDI - and use Bayes factor plots to compare their statistical constraining power. The concept of Bayes factor forecasting has much broader applicability than dark energy surveys.
SnapShot: Visualization to Propel Ice Hockey Analytics.
Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T
2012-12-01
Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data; given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
Reconstructing the behavior of walking fruit flies
NASA Astrophysics Data System (ADS)
Berman, Gordon; Bialek, William; Shaevitz, Joshua
2010-03-01
Over the past century, the fruit fly Drosophila melanogaster has become almost a lingua franca in the study of animal behavior, having been utilized to study questions in fields as diverse as sleep deprivation, aging, and drug abuse, amongst many others. Accordingly, much is known about how these organisms can be manipulated genetically, behaviorally, and physiologically. Most of the behavioral work on this system to this point has consisted of experiments in which the flies in question are given a choice between some discrete set of pre-defined behaviors. Our aim, however, is simply to spend some time with a cadre of flies, using techniques from nonlinear dynamics, statistical physics, and machine learning in an attempt to reconstruct and gain understanding of their behavior. More specifically, we use a multi-camera set-up combined with a motion tracking stage in order to obtain long time series of walking fruit flies moving about a glass plate. This experimental system serves as a test-bed for analytical, statistical, and computational techniques for studying animal behavior. In particular, we attempt to reconstruct the natural modes of behavior of a fruit fly through a data-driven approach, in a manner inspired by recent work in C. elegans and cockroaches.
Design of order statistics filters using feedforward neural networks
NASA Astrophysics Data System (ADS)
Maslennikova, Yu. S.; Bochkarev, V. V.
2016-08-01
In recent years significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics; the widely used median filter is the best known order-statistic filter. A generalized form of these filters can be constructed based on Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order-statistic filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order-statistic filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
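A minimal example of the best-known order-statistic filter mentioned above, the running median, whose robustness to impulsive noise is easy to verify; the window length and test signal are illustrative:

```python
# The median filter is the best-known order-statistic filter; it is
# robust to impulsive (salt-and-pepper) noise.
import numpy as np

def median_filter_1d(x, window=5):
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(x))])

signal = np.sin(np.linspace(0, 4 * np.pi, 200))
signal[::25] += 3.0                        # inject impulsive noise
clean = median_filter_1d(signal, window=5)
```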
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
1985-09-01
A Feasibility Study of the Collection of Unscheduled Maintenance Data Using Statistical Sampling Techniques. Thesis, Robert A. Heinlein, Captain, USAF, AFIT/GLM/LSM/85S-32. Approved for public release.
Phylogeography Takes a Relaxed Random Walk in Continuous Space and Time
Lemey, Philippe; Rambaut, Andrew; Welch, John J.; Suchard, Marc A.
2010-01-01
Research aimed at understanding the geographic context of evolutionary histories is burgeoning across biological disciplines. Recent endeavors attempt to interpret contemporaneous genetic variation in the light of increasingly detailed geographical and environmental observations. Such interest has promoted the development of phylogeographic inference techniques that explicitly aim to integrate such heterogeneous data. One promising development involves reconstructing phylogeographic history on a continuous landscape. Here, we present a Bayesian statistical approach to infer continuous phylogeographic diffusion using random walk models while simultaneously reconstructing the evolutionary history in time from molecular sequence data. Moreover, by accommodating branch-specific variation in dispersal rates, we relax the most restrictive assumption of the standard Brownian diffusion process and demonstrate increased statistical efficiency in spatial reconstructions of overdispersed random walks by analyzing both simulated and real viral genetic data. We further illustrate how drawing inference about summary statistics from a fully specified stochastic process over both sequence evolution and spatial movement reveals important characteristics of a rabies epidemic. Together with recent advances in discrete phylogeographic inference, the continuous model developments furnish a flexible statistical framework for biogeographical reconstructions that is easily expanded upon to accommodate various landscape genetic features. PMID:20203288
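The branch-specific Brownian diffusion at the heart of the relaxed random walk can be illustrated by simulation: each branch displaces the location by a Gaussian step whose variance scales with branch length times a branch-specific rate. The toy tree and rates below are invented, and the sketch omits the Bayesian inference machinery of the paper:

```python
# Simulate a relaxed random walk down a tree: Brownian displacement per
# branch, with variance = branch_length * branch_rate (rates vary across
# branches, i.e. the "relaxed" part). Toy tree, not the BEAST model.
import numpy as np

rng = np.random.default_rng(1)

def diffuse(location, branches):
    """branches: list of (length, rate, children) tuples; returns tip locations."""
    tips = []
    for length, rate, children in branches:
        step = rng.normal(0.0, np.sqrt(length * rate), size=2)
        child_loc = location + step           # 2-D geographic-like location
        if children:
            tips.extend(diffuse(child_loc, children))
        else:
            tips.append(child_loc)
    return tips

tree = [(1.0, 0.5, [(0.5, 2.0, []), (0.7, 0.3, [])]),
        (2.0, 1.0, [])]
tip_locations = diffuse(np.zeros(2), tree)
```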
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan Zhen; Zhang Qizhi; Sobel, Eric S.
Purpose: The aim of this study was to investigate the potential use of multimodality functional imaging techniques to identify the quantitative optical findings that can be used to distinguish between osteoarthritic and normal finger joints. Methods: Between 2006 and 2009, the distal interphalangeal finger joints from 40 female subjects including 22 patients and 18 healthy controls were examined clinically and scanned by a hybrid imaging system. This system integrated an x-ray tomosynthesis setup with a diffuse optical imaging system. Optical absorption and scattering images were recovered based on a regularization-based hybrid reconstruction algorithm. A receiver operating characteristic curve was used to calculate the statistical significance of specific optical features obtained from the osteoarthritic and healthy joint groups. Results: The three-dimensional optical and x-ray images captured made it possible to quantify the optical properties and joint space width of finger joints. Based on the recovered optical absorption and scattering parameters, the authors observed statistically significant differences between healthy and osteoarthritic finger joints. Conclusions: The statistical results revealed that sensitivity and specificity values up to 92% and 100%, respectively, can be achieved when optical properties of joint tissues are used as classifiers. This suggests that these optical imaging parameters are possible indicators for diagnosing osteoarthritis and monitoring its progression.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labby, Z.
Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and will answer questions such as: where would you typically apply the test/design, and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.
Mitigating randomness of consumer preferences under certain conditional choices
NASA Astrophysics Data System (ADS)
Bothos, John M. A.; Thanos, Konstantinos-Georgios; Papadopoulou, Eirini; Daveas, Stelios; Thomopoulos, Stelios C. A.
2017-05-01
Agent-based crowd behaviour constitutes a significant field of research that has drawn much attention in recent years. Agent-based crowd simulation techniques have been used extensively to forecast the behaviour of larger or smaller crowds under given conditions, influenced by specific cognition models and behavioural rules and norms imposed from the beginning. Our research employs conditional event algebra, statistical methodology and agent-based crowd simulation techniques in developing a behavioural econometric model of the selection of economic behaviour by a consumer who faces a spectrum of potential choices when moving and acting in a multiplex mall. More specifically, we analyse the influence of demographic, economic, social and cultural factors on the economic behaviour of a given individual, and then link this behaviour with the general behaviour of crowds of consumers in multiplex malls using agent-based crowd simulation techniques. We then run our model using Generalized Least Squares and Maximum Likelihood methods to arrive at the most probable forecast estimates of the agent's behaviour. Our model is indicative of the formation of consumers' spectra of choices in multiplex malls under the condition of predefined preferences and can be used as a guide for further research in this area.
Advancing solar energy forecasting through the underlying physics
NASA Astrophysics Data System (ADS)
Yang, H.; Ghonima, M. S.; Zhong, X.; Ozge, B.; Kurtz, B.; Wu, E.; Mejia, F. A.; Zamora, M.; Wang, G.; Clemesha, R.; Norris, J. R.; Heus, T.; Kleissl, J. P.
2017-12-01
As solar power comprises an increasingly large portion of the energy generation mix, the ability to accurately forecast solar photovoltaic generation becomes increasingly important. Due to the variability of solar power caused by cloud cover, knowledge of both the magnitude and timing of expected solar power production ahead of time facilitates the integration of solar power onto the electric grid by reducing electricity generation from traditional ancillary generators such as gas and oil power plants, as well as decreasing the ramping of all generators, reducing start and shutdown costs, and minimizing solar power curtailment, thereby providing annual economic value. The time scales involved in both the energy markets and solar variability range from intra-hour to several days ahead. This wide range of time horizons led to the development of a multitude of techniques, with each offering unique advantages in specific applications. For example, sky imagery provides site-specific forecasts on the minute-scale. Statistical techniques including machine learning algorithms are commonly used in the intra-day forecast horizon for regional applications, while numerical weather prediction models can provide mesoscale forecasts on both the intra-day and days-ahead time scale. This talk will provide an overview of the challenges unique to each technique and highlight the advances in their ongoing development which come alongside advances in the fundamental physics underneath.
Novel technique for ST-T interval characterization in patients with acute myocardial ischemia.
Correa, Raúl; Arini, Pedro David; Correa, Lorena Sabrina; Valentinuzzi, Max; Laciar, Eric
2014-07-01
Novel signal processing techniques have enabled and improved the use of vectorcardiography (VCG) to diagnose and characterize myocardial ischemia. Herein, we studied vectorcardiographic dynamic changes of ventricular repolarization in 80 patients before (control) and during Percutaneous Transluminal Coronary Angioplasty (PTCA). We propose four vectorcardiographic ST-T parameters: (a) ST Vector Magnitude Area (aSTVM); (b) T-wave Vector Magnitude Area (aTVM); (c) ST-T Vector Magnitude Difference (ST-TVD); and (d) T-wave Vector Magnitude Difference (TVD). For comparison, the conventional ST-Change Vector Magnitude (STCVM) and Spatial Ventricular Gradient (SVG) were also calculated. Our results indicate that several vectorcardiographic parameters show significant differences (p-value<0.05) before and during PTCA. A statistical minute-by-minute comparison of PTCA against the control situation showed that ischemic monitoring reached a sensitivity of 90.5% and a specificity of 92.6% at the 5th minute of PTCA when aSTVM and ST-TVD were used as classifiers. We conclude that the sensitivity and specificity of acute ischemia monitoring can be increased with the use of only two vectorcardiographic parameters. Hence, the proposed technique based on vectorcardiography could be used in addition to conventional ST-T analysis for better monitoring of ischemic patients. Copyright © 2014 Elsevier Ltd. All rights reserved.
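Under a plain reading of the vector-magnitude parameters above, each reduces to integrating the instantaneous magnitude of the three orthogonal leads over a time window; the sketch below assumes that reading, with an invented sampling rate, window, and test signal:

```python
# Sketch of a vector-magnitude area in the spirit of the ST-T parameters:
# integrate |(x, y, z)| over a time window. Rate, window, and signal are
# illustrative assumptions, not the paper's definitions or data.
import numpy as np

def vector_magnitude_area(x, y, z, fs, start_s, end_s):
    i0, i1 = int(start_s * fs), int(end_s * fs)
    magnitude = np.sqrt(x[i0:i1] ** 2 + y[i0:i1] ** 2 + z[i0:i1] ** 2)
    return np.trapz(magnitude, dx=1.0 / fs)   # area in amplitude * seconds

fs = 500                                       # Hz, illustrative
t = np.arange(0, 0.6, 1.0 / fs)
x, y, z = np.sin(8 * t), 0.5 * np.cos(8 * t), 0.2 * np.sin(4 * t)
area = vector_magnitude_area(x, y, z, fs, 0.0, 0.12)
```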
Statistical Modeling of Retinal Optical Coherence Tomography.
Amini, Zahra; Rabbani, Hossein
2016-06-01
In this paper, a new model for retinal Optical Coherence Tomography (OCT) images is proposed. This statistical model is based on introducing a nonlinear Gaussianization transform to convert the probability density function (pdf) of each OCT intra-retinal layer to a Gaussian distribution. The retina is a layered structure, and in OCT each of these layers has a specific pdf which is corrupted by speckle noise; therefore a mixture model for the statistical modeling of OCT images is proposed. A Normal-Laplace distribution, the convolution of a Laplace pdf with Gaussian noise, is proposed as the distribution of each component of this model. The reason for choosing the Laplace pdf is the monotonically decaying behavior of OCT intensities in each layer for healthy cases. After fitting the mixture model to the data, each component is Gaussianized and all of them are combined by the Averaged Maximum A Posteriori (AMAP) method. To demonstrate the ability of this method, a new contrast enhancement method based on this statistical model is proposed and tested on thirteen healthy 3D OCTs taken by the Topcon 3D OCT and five 3D OCTs from Age-related Macular Degeneration (AMD) patients taken by the Zeiss Cirrus HD-OCT. Comparison of the results with two contending techniques demonstrates the prominence of the proposed method both visually and numerically. Furthermore, to prove the efficacy of the proposed method for a more direct and specific purpose, an improvement in the segmentation of intra-retinal layers, using the proposed contrast enhancement method as a preprocessing step, is demonstrated.
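The Gaussianization step can be approximated nonparametrically by mapping empirical ranks through the inverse standard-normal CDF; this rank-based stand-in is only a sketch of the idea and replaces the paper's parametric Normal-Laplace fit:

```python
# Rank-based Gaussianization: map the empirical CDF of the data through
# the inverse standard-normal CDF. A nonparametric stand-in for the
# paper's parametric (Normal-Laplace) Gaussianization transform.
import numpy as np
from scipy import stats

def gaussianize(x):
    ranks = stats.rankdata(x)                  # ranks 1..n
    u = ranks / (len(x) + 1.0)                 # empirical CDF values in (0, 1)
    return stats.norm.ppf(u)                   # inverse normal CDF

layer_intensities = np.random.default_rng(3).exponential(1.0, 1000)
gaussianized = gaussianize(layer_intensities)  # approximately N(0, 1)
```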
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Machining status monitoring by multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion is introduced. An approach is then put forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the cause of machining quality fluctuation has been identified. The experimental results indicate that the approach is suitable for the status monitoring and analysis of machining processes.
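On the statistical process control side, the classical device for distinguishing abnormal from normal fluctuations is a Shewhart-type control chart; a minimal individuals chart with 3-sigma limits (sigma estimated from the average moving range), using invented measurements:

```python
# Minimal Shewhart individuals chart: flag points outside 3-sigma
# control limits, with sigma estimated from the average moving range.
import numpy as np

def control_limits(x):
    moving_range = np.abs(np.diff(x))
    sigma = moving_range.mean() / 1.128        # d2 constant for subgroup n=2
    center = x.mean()
    return center - 3 * sigma, center, center + 3 * sigma

measurements = np.random.default_rng(7).normal(10.0, 0.2, 50)
lcl, center, ucl = control_limits(measurements)
out_of_control = np.where((measurements < lcl) | (measurements > ucl))[0]
```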
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary –omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims to both simplify analysis for investigators new to metabolomics, as well as provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer’s workflow is specifically tailored to the unique characteristics and idiosyncrasies of postprocessed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy to understand statistically significant and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study in which samples were collected from mice exposed to gamma radiation was analyzed. MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a... manipulation technique whose validity does not require the acceptance of a particular economic, mathematical, or statistical theory, precept, or assumption. A change in quantification technique should not change...
Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.
Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J
2015-01-01
This work reviews the techniques, and the most important results obtained with them, for extracting from the electroencephalogram (EEG) different measures that can be clinically useful in the study of subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that analyze individual EEG channels (univariate measures) as well as those that study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) in diagnosing ADHD. Finally, we propose future research lines based on these results.
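Of the univariate measures listed, the Lempel-Ziv complexity is the easiest to state concretely: binarize the channel (here around its median, a common convention) and count the phrases in the LZ76 left-to-right parsing. A sketch on synthetic data:

```python
# Lempel-Ziv (LZ76) complexity of an EEG-like channel: binarize around
# the median, then count phrases in the left-to-right exhaustive parsing.
import numpy as np

def lempel_ziv_complexity(signal):
    s = "".join("1" if v > np.median(signal) else "0" for v in signal)
    phrases, i = 0, 0
    while i < len(s):
        length = 1
        # extend the phrase while it already occurs earlier in the string
        while i + length <= len(s) and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

eeg = np.random.default_rng(5).standard_normal(1000)
print(lempel_ziv_complexity(eeg))
```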
Quantitative Aspects of Single Molecule Microscopy
Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally
2015-01-01
Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
Machine learning techniques for medical diagnosis of diabetes using iris images.
Samant, Piyush; Agarwal, Ravinder
2018-04-01
Complementary and alternative medicine techniques have shown their potential for the treatment and diagnosis of chronic diseases like diabetes, arthritis, etc. At the same time, digital image processing for disease diagnosis is a reliable and rapidly growing field in biomedical engineering. The proposed model is an attempt to evaluate the diagnostic validity of an old complementary and alternative medicine technique, iridology, for the diagnosis of type-2 diabetes using soft computing methods. The investigation was performed on a closed group of 338 subjects in total (180 diabetic and 158 non-diabetic). Infra-red images of both eyes were captured simultaneously. The region of interest of the iris image was cropped as the zone corresponding to the position of the pancreas according to the iridology chart. Statistical, texture and discrete wavelet transform features were extracted from the region of interest. The results show a best classification accuracy of 89.63%, obtained with an RF classifier. Maximum specificity and sensitivity were observed as 0.9687 and 0.988, respectively. The results reveal the effectiveness and diagnostic significance of the proposed model for non-invasive and automatic diabetes diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bui-Thanh, T.; Girolami, M.
2014-11-01
We consider the Riemann manifold Hamiltonian Monte Carlo (RMHMC) method for solving statistical inverse problems governed by partial differential equations (PDEs). The Bayesian framework is employed to cast the inverse problem into the task of statistical inference whose solution is the posterior distribution in infinite dimensional parameter space conditional upon observation data and Gaussian prior measure. We discretize both the likelihood and the prior using the H1-conforming finite element method together with a matrix transfer technique. The power of the RMHMC method is that it exploits the geometric structure induced by the PDE constraints of the underlying inverse problem. Consequently, each RMHMC posterior sample is almost uncorrelated/independent from the others providing statistically efficient Markov chain simulation. However this statistical efficiency comes at a computational cost. This motivates us to consider computationally more efficient strategies for RMHMC. At the heart of our construction is the fact that for Gaussian error structures the Fisher information matrix coincides with the Gauss-Newton Hessian. We exploit this fact in considering a computationally simplified RMHMC method combining state-of-the-art adjoint techniques and the superiority of the RMHMC method. Specifically, we first form the Gauss-Newton Hessian at the maximum a posteriori point and then use it as a fixed constant metric tensor throughout RMHMC simulation. This eliminates the need for the computationally costly differential geometric Christoffel symbols, which in turn greatly reduces computational effort at a corresponding loss of sampling efficiency. We further reduce the cost of forming the Fisher information matrix by using a low rank approximation via a randomized singular value decomposition technique. This is efficient since a small number of Hessian-vector products are required. The Hessian-vector product in turn requires only two extra PDE solves using the adjoint technique. Various numerical results up to 1025 parameters are presented to demonstrate the ability of the RMHMC method in exploring the geometric structure of the problem to propose (almost) uncorrelated/independent samples that are far away from each other, and yet the acceptance rate is almost unity. The results also suggest that for the PDE models considered the proposed fixed metric RMHMC can attain almost as high a quality performance as the original RMHMC, i.e. generating (almost) uncorrelated/independent samples, while being two orders of magnitude less computationally expensive.
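The low-rank approximation step can be illustrated with a standard randomized SVD; the dense test matrix below stands in for the Fisher information/Gauss-Newton Hessian that, in the paper's setting, would be accessible only through Hessian-vector products:

```python
# Randomized SVD (Halko-type): sample the range of A with a Gaussian test
# matrix, orthonormalize, and solve a small SVD in the resulting subspace.
import numpy as np

def randomized_svd(A, rank, n_oversample=10, seed=None):
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((A.shape[1], rank + n_oversample))
    Q, _ = np.linalg.qr(A @ omega)             # orthonormal range basis
    B = Q.T @ A                                # small projected matrix
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_small)[:, :rank], s[:rank], Vt[:rank]

A = np.random.default_rng(0).standard_normal((500, 500))
A = A @ A.T                                    # symmetric PSD, Hessian-like
U, s, Vt = randomized_svd(A, rank=20)
```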
[Curricular design of health postgraduate programs: the case of Masters in epidemiology].
Bobadilla, J L; Lozano, R; Bobadilla, C
1991-01-01
This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by graduate programs in public health, due in part to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Based on the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed the need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method for reaching consensus are also discussed.
Rath, Hemamalini; Rath, Rachna; Mahapatra, Sandeep; Debta, Tribikram
2017-01-01
The age of an individual can be assessed by a plethora of widely available tooth-based techniques, among which radiological methods prevail. Demirjian's technique of age assessment based on tooth development stages has been extensively investigated in different populations of the world. The present study assesses the applicability of Demirjian's modified 8-teeth technique in age estimation of a population of East India (Odisha), utilizing Acharya's Indian-specific cubic functions. One hundred and six pretreatment orthodontic radiographs of patients aged 7-23 years, with representation from both genders, were assessed for the eight left mandibular teeth and scored as per Demirjian's 9-stage criteria for tooth development. Age was calculated on the basis of Acharya's Indian formula. Statistical analysis was performed to compare the estimated and actual ages. All data were analyzed using SPSS 20.0 (SPSS Inc., Chicago, Illinois, USA) and the MS Excel package. The results revealed that the mean absolute error (MAE) in age estimation for the entire sample was 1.3 years, with 50% of the cases having an error within ±1 year. The MAE in males and females (7-16 years) was 1.8 and 1.5 years, respectively. Likewise, the MAE in males and females (16.1-23 years) was 1.1 and 1.3 years, respectively. The low error rate in estimating age justifies the application of this modified technique and Acharya's Indian formulas in the present East Indian population.
Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H
2017-04-01
To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed <10 procedures during training. Ninety-eight percent of practices allowed all practitioners to perform ET; half did not follow a standardized ET technique. Multiple steps in the ET process were identified as "highly conserved;" others demonstrated discordance. ET technique is divided among [1] trial transfer followed immediately with ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Approximate Model Checking of PCTL Involving Unbounded Path Properties
NASA Astrophysics Data System (ADS)
Basu, Samik; Ghosh, Arka P.; He, Ru
We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as Probabilistic Computation Tree Logic (PCTL) formulas involving unbounded path properties.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to limitations in experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/ .
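For readers unfamiliar with the pooling step of such a workflow, here is a minimal sketch of inverse-variance fixed-effect meta-analysis of log odds ratios. The function name, the data layout, and the Woolf variance approximation are illustrative choices, not MetaGenyo's implementation.

```python
import numpy as np

def fixed_effect_or(cases, controls):
    """Pool per-study odds ratios by inverse-variance weighting (sketch).

    cases/controls: lists of (carriers, non-carriers) counts per study;
    assumes no zero cells for simplicity.
    """
    log_or, w = [], []
    for (a, b), (c, d) in zip(cases, controls):
        log_or.append(np.log((a * d) / (b * c)))
        w.append(1.0 / (1/a + 1/b + 1/c + 1/d))   # Woolf inverse-variance weight
    pooled = np.average(log_or, weights=w)
    se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# e.g., two hypothetical studies: OR and its 95% confidence interval
print(fixed_effect_or([(30, 70), (45, 55)], [(20, 80), (35, 65)]))
```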
New fluorescence techniques for high-throughput drug discovery.
Jäger, S; Brand, L; Eggeling, C
2003-12-01
The rapid growth of compound libraries, as well as the new targets emerging from the Human Genome Project, requires constant progress in pharmaceutical research. An important tool is High-Throughput Screening (HTS), which has evolved into an indispensable instrument in the pre-clinical target-to-IND (Investigational New Drug) discovery process. HTS requires machinery able to test more than 100,000 potential drug candidates per day with respect to a specific biological activity. This imposes demanding experimental requirements, especially with respect to sensitivity, speed, and statistical accuracy, which are met by fluorescence technology instrumentation. In particular, the recently developed family of fluorescence techniques, FIDA (Fluorescence Intensity Distribution Analysis), which is based on confocal single-molecule detection, has opened up a new field of HTS applications. This report describes the application of these new techniques, as well as of common fluorescence techniques such as confocal fluorescence lifetime and anisotropy, to HTS. It gives experimental examples and presents advantages and disadvantages of each method. In addition, the most common artifacts (auto-fluorescence or quenching by the drug candidates) arising from the fluorescence detection techniques are highlighted, and correction methods for confocal fluorescence read-outs that are able to circumvent this deficiency are presented.
Rapid Vision Correction by Special Operations Forces.
Reynolds, Mark E
This report describes a rapid method of vision correction used by Special Operations Medics in multiple operational engagements. Between 2011 and 2015, Special Operations Medics used an algorithm-driven refraction technique. A standard block of instruction was provided to the medics, along with a packaged kit. The technique was used in multiple operational engagements with host nation military and civilians. Data collected for program evaluation were later analyzed to assess the utility of the technique. Glasses were distributed to 230 patients with complaints of either decreased distance or near (reading) vision. Most patients (84%) with distance complaints achieved corrected binocular vision of 20/40 or better, and 97% of patients with near-vision complaints achieved corrected near-binocular vision of 20/40 or better. There was no statistically significant difference between the percentages of patients achieving 20/40 when medics used the technique under direct supervision versus independent use. A basic refraction technique using a designed kit allows for meaningful improvement in distance and/or near vision at austere locations. Special Operations Medics can leverage this approach after specific training with minimal time commitment. It can serve as a rapid, effective intervention with multiple applications in diverse operational environments.
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online... other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in... statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be...
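Since the snippet above is fragmentary, a minimal sketch of the tabular CUSUM chart it refers to may help; the parameter names, the alarm-reset policy, and the example data are common textbook conventions, not details from the report.

```python
import numpy as np

def cusum_alarms(x, target, k=0.5, h=5.0):
    """Tabular CUSUM chart (sketch): flag persistent drift in a health metric.

    k (slack) and h (decision interval) are in units of the process
    standard deviation, per the usual convention.
    """
    x = np.asarray(x, dtype=float)
    sigma = x.std(ddof=1)
    hi = lo = 0.0
    alarms = []
    for i, xi in enumerate(x):
        z = (xi - target) / sigma
        hi = max(0.0, hi + z - k)     # accumulates upward shifts
        lo = max(0.0, lo - z - k)     # accumulates downward shifts
        if hi > h or lo > h:
            alarms.append(i)
            hi = lo = 0.0             # restart after an alarm
    return alarms

# e.g., bearing-temperature readings drifting upward near the end
print(cusum_alarms([70, 71, 69, 70, 72, 71, 74, 76, 78, 81, 84], target=70.0))
```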
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Accordingly, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of the working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some different techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
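As an illustration of the Kendall's rank correlation approach mentioned above, the following sketch tests monthly problem-report counts (hypothetical numbers, not MSFC data) for a monotonic trend.

```python
import numpy as np
from scipy import stats

# Hypothetical monthly problem-report counts tested for a monotonic
# trend against time with Kendall's rank correlation coefficient.
reports = np.array([12, 9, 11, 8, 7, 9, 6, 5, 7, 4, 5, 3])
tau, p = stats.kendalltau(np.arange(len(reports)), reports)
print(f"Kendall tau = {tau:.2f}, p = {p:.4f}")  # tau < 0 suggests a downward trend
```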
Effects of band selection on endmember extraction for forestry applications
NASA Astrophysics Data System (ADS)
Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis
2014-10-01
In spectral unmixing theory, data reduction techniques play an important role, as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches to dimensionality reduction. Feature extraction techniques reduce the dimensionality of the hyperspectral data by applying transforms to the data. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset that mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality-reduction potential, they are rarely used in the unmixing process. The majority of the existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the whole wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations in specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc., can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc., are considered as the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information of the vegetation absorbance spectral zones. More precisely, it is explored whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. The experiments comprise the application of well-known signal subspace estimation and endmember extraction methods on hyperspectral imagery of a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using selected bands.
Statistics anxiety, state anxiety during an examination, and academic achievement.
Macher, Daniel; Paechter, Manuela; Papousek, Ilona; Ruggeri, Kai; Freudenthaler, H Harald; Arendasy, Martin
2013-12-01
A large proportion of students identify statistics courses as the most anxiety-inducing courses in their curriculum. Many students feel impaired by feelings of state anxiety in the examination and therefore probably show lower achievement. The study investigates how statistics anxiety, attitudes (e.g., interest, mathematical self-concept) and trait anxiety, as a general disposition to anxiety, influence experiences of anxiety as well as achievement in an examination. Participants were 284 undergraduate psychology students, 225 females and 59 males. Two weeks prior to the examination, participants completed a demographic questionnaire, the STARS, the STAI, and measures of self-concept in mathematics and interest in statistics. At the beginning of the statistics examination, students assessed their present state anxiety on the KUSTA scale. After 25 min, all examination participants gave another assessment of their anxiety at that moment. Students' examination scores were recorded. Structural equation modelling techniques were used to test relationships between the variables in a multivariate context. Statistics anxiety was the only variable related to state anxiety in the examination. Via state anxiety experienced before and during the examination, statistics anxiety had a negative influence on achievement. However, statistics anxiety also had a direct positive influence on achievement. This result may be explained by students' motivational goals in the specific educational setting. The results provide insight into the relationship between students' attitudes, dispositions, experiences of anxiety in the examination, and academic achievement, and give recommendations to instructors on how to support students prior to and in the examination. © 2012 The British Psychological Society.
Secondary Analysis of Qualitative Data.
ERIC Educational Resources Information Center
Turner, Paul D.
The reanalysis of data to answer the original research question with better statistical techniques, or to answer new questions with old data, is not uncommon in quantitative studies. Meta-analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
A computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
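A minimal sketch of the general idea, drawing component disturbances from assumed distributions and propagating them through a made-up system model; this is not the program described above, and all distributions and the combining formula are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Hypothetical component tolerances: each part's misalignment is drawn
# from its own distribution; the system error is their nonlinear combination.
tilt = rng.normal(0.0, 0.05, n)        # radians
offset = rng.normal(0.0, 0.2, n)       # millimetres
gain = rng.uniform(0.95, 1.05, n)
system_error = gain * (offset + 10.0 * np.tan(tilt))

print("mean:", system_error.mean())
print("99th percentile:", np.percentile(system_error, 99))  # worst-case statistic
```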
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
Polarization speckle imaging as a potential technique for in vivo skin cancer detection.
Tchvialeva, Lioudmila; Dhadwal, Gurbir; Lui, Harvey; Kalia, Sunil; Zeng, Haishan; McLean, David I; Lee, Tim K
2013-06-01
Skin cancer is the most common cancer in the Western world. In order to accurately detect the disease, especially malignant melanoma, the most fatal form of skin cancer, at an early stage when the prognosis is excellent, there is an urgent need to develop noninvasive early detection methods. We believe that polarization speckle patterns, defined as a spatial distribution of the depolarization ratio of traditional speckle patterns, can be an important tool for skin cancer detection. To demonstrate our technique, we conduct a large in vivo clinical study of 214 skin lesions, and show that statistical moments of the polarization speckle pattern can differentiate different types of skin lesions, including three common types of skin cancer (malignant melanoma, squamous cell carcinoma, and basal cell carcinoma) and two benign lesions (melanocytic nevus and seborrheic keratosis). In particular, the fourth-order moment achieves sensitivity and specificity better than or comparable to many well-known and accepted optical techniques used to differentiate melanoma and seborrheic keratosis.
Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J; Grau, Raúl; Barat, José M
2016-10-19
A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near-infrared system was developed for freshness prediction of gilthead sea bream on the first storage days (Days 0-6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimation. The SL system obtains a 3D reconstruction of the fish, and an automatic method locates the gilthead's pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is done. The best region is the pupil, with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with hyperspectral analysis offers plenty of potential and is a very promising technique to nondestructively predict gilthead freshness.
NASA Technical Reports Server (NTRS)
Rader, W. P.; Barrett, S.; Payne, K. R.
1975-01-01
Data measurement and interpretation techniques were defined for application to the first few space shuttle flights, so that the dynamic environment could be sufficiently well established to be used to reduce the cost of future payloads through more efficient design and environmental test techniques. It was concluded that: (1) initial payloads must be given comprehensive instrumentation coverage to obtain detailed definition of acoustics, vibration, and interface loads, (2) analytical models of selected initial payloads must be developed and verified by modal surveys and flight measurements, (3) acoustic tests should be performed on initial payloads to establish realistic test criteria for components and experiments in order to minimize unrealistic failures and retest requirements, (4) permanent data banks should be set up to establish statistical confidence in the data to be used, (5) a more unified design/test specification philosophy is needed, (6) additional work is needed to establish a practical testing technique for simulation of vehicle transients.
Kong, Jessica; Giridharagopal, Rajiv; Harrison, Jeffrey S; Ginger, David S
2018-05-31
Correlating nanoscale chemical specificity with operational physics is a long-standing goal of functional scanning probe microscopy (SPM). We employ a data analytic approach combining multiple microscopy modes, using compositional information in infrared vibrational excitation maps acquired via photoinduced force microscopy (PiFM) with electrical information from conductive atomic force microscopy. We study a model polymer blend comprising insulating poly(methyl methacrylate) (PMMA) and semiconducting poly(3-hexylthiophene) (P3HT). We show that PiFM spectra are different from FTIR spectra but can still be used to identify local composition. We use principal component analysis to extract statistically significant principal components and principal component regression to predict local current and identify local polymer composition. In doing so, we observe evidence of semiconducting P3HT within PMMA aggregates. These methods are generalizable to correlated SPM data and provide a meaningful technique for extracting complex compositional information that is impossible to measure with any one technique.
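The following sketch shows the PCA-plus-principal-component-regression step described above on synthetic stand-in data; the array names, shapes, number of components, and noise model are all assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Sketch: `spectra` is an (n_pixels, n_wavenumbers) array of PiFM-like
# spectra and `current` an (n_pixels,) array from conductive AFM.
rng = np.random.default_rng(0)
spectra = rng.normal(size=(500, 200))
current = spectra[:, :3] @ np.array([0.5, -0.2, 0.1]) + rng.normal(0, 0.1, 500)

pca = PCA(n_components=5).fit(spectra)
scores = pca.transform(spectra)                 # statistically dominant components
pcr = LinearRegression().fit(scores, current)   # principal component regression
print("R^2 on training data:", pcr.score(scores, current))
```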
Synthetic Minority Oversampling Technique and Fractal Dimension for Identifying Multiple Sclerosis
NASA Astrophysics Data System (ADS)
Zhang, Yu-Dong; Zhang, Yin; Phillips, Preetha; Dong, Zhengchao; Wang, Shuihua
Multiple sclerosis (MS) is a severe brain disease. Early detection can provide timely treatment. Fractal dimension can provide a statistical index of pattern changes with scale in a given brain image. In this study, our team used the susceptibility-weighted imaging technique to obtain 676 MS slices and 880 healthy slices. We used the synthetic minority oversampling technique to process the unbalanced dataset. Then, we used a Canny edge detector to extract distinguishing edges. The Minkowski-Bouligand dimension, a fractal dimension estimation method, was used to extract features from the edges. A single-hidden-layer neural network was used as the classifier. Finally, we proposed a three-segment representation biogeography-based optimization to train the classifier. Our method achieved a sensitivity of 97.78±1.29%, a specificity of 97.82±1.60% and an accuracy of 97.80±1.40%. The proposed method is superior to seven state-of-the-art methods in terms of sensitivity and accuracy.
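A minimal sketch of the oversampling step, assuming the imbalanced-learn package; the 676/880 class sizes follow the abstract, but the features are random stand-ins, not fractal dimensions from real slices.

```python
import numpy as np
from imblearn.over_sampling import SMOTE

# Sketch: X holds per-slice features, y the labels (1 = MS, 0 = healthy).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.4, 0.1, (676, 4)),   # minority class (MS)
               rng.normal(1.5, 0.1, (880, 4))])  # majority class (healthy)
y = np.array([1] * 676 + [0] * 880)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
print(np.bincount(y_bal))   # classes now balanced by synthetic minority samples
```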
Implementation of a new algorithm for Density Equalizing Map Projections (DEMP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Close, E.R.; Merrill, D.W.; Holmes, H.H.
The purpose of the PAREP (Populations at Risk to Environmental Pollution) Project at Lawrence Berkeley National Laboratory (LBNL), an ongoing Department of Energy (DOE) project since 1978, is to develop resources (data, computing techniques, and biostatistical methodology) applicable to DOE's needs. Specifically, the PAREP project has developed techniques for statistically analyzing disease distributions in the vicinity of supposed environmental hazards. Such techniques can be applied to assess the health risks in populations residing near DOE installations, provided adequate small-area health data are available. The FY 1994 task descriptions for the PAREP project were determined in discussions at LBNL on 11/2/93. The FY94 PAREP Work Authorization specified three major tasks: a prototype small-area study, a feasibility study for obtaining small-area data, and preservation of the PAREP data archive. The complete FY94 work plan, and the subtasks accomplished to date, were included in the Cumulative FY94 progress report.
NASA Astrophysics Data System (ADS)
Jacobson, Gloria; Rella, Chris; Farinas, Alejandro
2014-05-01
Technological advancement of instrumentation in atmospheric and other geoscience disciplines over the past decade has led to a shift from discrete sample analysis to continuous, in-situ monitoring. Standard error analysis used for discrete measurements is not sufficient to assess and compare the error contribution of noise and drift from continuous-measurement instruments, and a different statistical analysis approach should be applied. The Allan standard deviation analysis technique, developed for atomic clock stability assessment by David W. Allan [1], can be effectively and gainfully applied to continuous measurement instruments. As an example, P. Werle et al. have applied these techniques to signal averaging for atmospheric monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS) [2]. This presentation will build on and translate prior foundational publications to provide contextual definitions and guidelines for the practical application of this analysis technique to continuous scientific measurements. The specific example of a Picarro G2401 Cavity Ringdown Spectroscopy (CRDS) analyzer used for continuous atmospheric monitoring of CO2, CH4 and CO will be used to define the basic features of the Allan deviation, assess factors affecting the analysis, and explore the translation from time series to Allan deviation plot for different types of instrument noise (white noise, linear drift, and interpolated data). In addition, the application of the Allan deviation to optimize and predict the performance of different calibration schemes will be presented. Even though this presentation uses the specific example of the Picarro G2401 CRDS analyzer for atmospheric monitoring, the objective is to present the information such that it can be successfully applied to other instrument sets and disciplines. [1] D.W. Allan, "Statistics of Atomic Frequency Standards," Proc. IEEE, vol. 54, pp. 221-230, Feb. 1966. [2] P. Werle, R. Mücke, F. Slemr, "The Limits of Signal Averaging in Atmospheric Trace-Gas Monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS)," Applied Physics B, 57, pp. 131-139, April 1993.
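As a sketch of the analysis described above, the following computes a non-overlapping Allan deviation for a regularly sampled series; dedicated packages exist, but the core calculation is short. The function name and averaging times are illustrative.

```python
import numpy as np

def allan_deviation(y, dt, taus):
    """Non-overlapping Allan deviation of a regularly sampled series (sketch).

    y: readings at spacing dt (s); taus: averaging times (s) to evaluate.
    Returns (tau, sigma) pairs for each feasible averaging time.
    """
    y = np.asarray(y, dtype=float)
    out = []
    for tau in taus:
        m = int(tau / dt)                       # samples per averaging bin
        nbins = len(y) // m
        if nbins < 2:
            break
        means = y[:nbins * m].reshape(nbins, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)   # Allan variance
        out.append((m * dt, np.sqrt(avar)))
    return out

# white noise: sigma should fall roughly as 1/sqrt(tau)
y = np.random.default_rng(0).normal(400.0, 0.1, 100_000)   # e.g., ppm CO2
print(allan_deviation(y, dt=1.0, taus=[1, 10, 100, 1000]))
```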
Statistical reconstruction for cosmic ray muon tomography.
Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J
2007-08-01
Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm² per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we have described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictate differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and to events departing from the statistical model.
Technique for estimation of streamflow statistics in mineral areas of interest in Afghanistan
Olson, Scott A.; Mack, Thomas J.
2011-01-01
A technique for estimating streamflow statistics at ungaged stream sites in areas of mineral interest in Afghanistan using drainage-area-ratio relations of historical streamflow data was developed and is documented in this report. The technique can be used to estimate the following streamflow statistics at ungaged sites: (1) 7-day low flow with a 10-year recurrence interval, (2) 7-day low flow with a 2-year recurrence interval, (3) daily mean streamflow exceeded 90 percent of the time, (4) daily mean streamflow exceeded 80 percent of the time, (5) mean monthly streamflow for each month of the year, (6) mean annual streamflow, and (7) minimum monthly streamflow for each month of the year. Because they are based on limited historical data, the estimates of streamflow statistics at ungaged sites are considered preliminary.
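A minimal sketch of the drainage-area-ratio relation underlying the technique; the exponent and the example numbers are illustrative assumptions, not values from the report.

```python
def da_ratio_estimate(q_gaged, area_gaged, area_ungaged, exponent=1.0):
    """Transfer a streamflow statistic to an ungaged site (sketch).

    The exponent is often taken as 1.0 but may be calibrated regionally;
    all argument names here are illustrative.
    """
    return q_gaged * (area_ungaged / area_gaged) ** exponent

# e.g., transfer a 7-day, 10-year low flow (m^3/s) from a gage draining
# 520 km^2 to an ungaged mineral-area site draining 310 km^2
print(da_ratio_estimate(2.4, 520.0, 310.0))
```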
NASA Astrophysics Data System (ADS)
O'Shea, Bethany; Jankowski, Jerzy
2006-12-01
The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
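For readers unfamiliar with the workflow, a minimal sketch of hierarchical cluster analysis on standardized major-ion data follows; the data are random stand-ins, and the ion list and cluster count are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Sketch: rows are groundwater samples, columns major-ion concentrations
# (e.g., Na, Ca, Mg, Cl, HCO3, SO4); random stand-ins, not basin data.
ions = np.random.default_rng(0).lognormal(size=(30, 6))

z = zscore(ions)                          # standardize before clustering
tree = linkage(z, method="ward")          # hierarchical cluster analysis
water_type = fcluster(tree, t=3, criterion="maxclust")
print(water_type)                         # statistically defined water types
```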
ERIC Educational Resources Information Center
Henry, Gary T.; And Others
1992-01-01
A statistical technique is presented for developing performance standards based on benchmark groups. The benchmark groups are selected using a multivariate technique that relies on a squared Euclidean distance method. For each observation unit (a school district in the example), a unique comparison group is selected. (SLD)
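A minimal sketch of the comparison-group selection described above, assuming standardized district-level features; the function name and the group size k are illustrative.

```python
import numpy as np

def benchmark_group(features, target_idx, k=10):
    """Pick the k most similar units by squared Euclidean distance (sketch).

    features: (n_units, n_variables) array, e.g., one row per school district.
    """
    z = (features - features.mean(axis=0)) / features.std(axis=0)  # standardize
    d2 = ((z - z[target_idx]) ** 2).sum(axis=1)    # squared Euclidean distances
    d2[target_idx] = np.inf                        # exclude the unit itself
    return np.argsort(d2)[:k]                      # indices of the benchmark group

districts = np.random.default_rng(0).normal(size=(200, 5))
print(benchmark_group(districts, target_idx=17))
```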
Redmond, Tony; O'Leary, Neil; Hutchison, Donna M; Nicolela, Marcelo T; Artes, Paul H; Chauhan, Balwantray C
2013-12-01
A new analysis method called permutation of pointwise linear regression measures the significance of deterioration over time at each visual field location, combines the significance values into an overall statistic, and then determines the likelihood of change in the visual field. Because the outcome is a single P value, individualized to that specific visual field and independent of the scale of the original measurement, the method is well suited for comparing techniques with different stimuli and scales. To test the hypothesis that frequency-doubling matrix perimetry (FDT2) is more sensitive than standard automated perimetry (SAP) in identifying visual field progression in glaucoma. Patients with open-angle glaucoma and healthy controls were examined by FDT2 and SAP, both with the 24-2 test pattern, on the same day at 6-month intervals in a longitudinal prospective study conducted in a hospital-based setting. Only participants with at least 5 examinations were included. Data were analyzed with permutation of pointwise linear regression. Permutation of pointwise linear regression is individualized to each participant, in contrast to current analyses in which the statistical significance is inferred from population-based approaches. Analyses were performed with both total deviation and pattern deviation. Sixty-four patients and 36 controls were included in the study. The median age, SAP mean deviation, and follow-up period were 65 years, -2.6 dB, and 5.4 years, respectively, in patients and 62 years, +0.4 dB, and 5.2 years, respectively, in controls. Using total deviation analyses, statistically significant deterioration was identified in 17% of patients with FDT2, in 34% of patients with SAP, and in 14% of patients with both techniques; in controls these percentages were 8% with FDT2, 31% with SAP, and 8% with both. Using pattern deviation analyses, statistically significant deterioration was identified in 16% of patients with FDT2, in 17% of patients with SAP, and in 3% of patients with both techniques; in controls these values were 3% with FDT2 and none with SAP. No evidence was found that FDT2 is more sensitive than SAP in identifying visual field deterioration. In about one-third of healthy controls, age-related deterioration with SAP reached statistical significance.
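A rough sketch of the permutation-of-pointwise-linear-regression idea (not the authors' exact statistic): fit a regression on time at each location, combine one-sided p values Fisher-style, and compare the observed combination against permutations of visit order. The combining rule and permutation count are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def poplr(series, n_perm=1000, seed=0):
    """Single p value for whole-field deterioration (sketch).

    series: (n_visits, n_locations) array of sensitivities over time.
    """
    rng = np.random.default_rng(seed)
    n_visits = series.shape[0]
    t = np.arange(n_visits)

    def combined_stat(order):
        s = 0.0
        for loc in series[order].T:              # regress each location on time
            res = stats.linregress(t, loc)
            p = res.pvalue / 2 if res.slope < 0 else 1 - res.pvalue / 2
            s += -np.log(max(p, 1e-12))          # Fisher-style combination
        return s

    observed = combined_stat(np.arange(n_visits))
    null = [combined_stat(rng.permutation(n_visits)) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)

field = np.random.default_rng(1).normal(30, 2, (8, 54))  # stand-in series
print(poplr(field))
```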
Validation of a new technique to detect Cryptosporidium spp. oocysts in bovine feces.
Inácio, Sandra Valéria; Gomes, Jancarlo Ferreira; Oliveira, Bruno César Miranda; Falcão, Alexandre Xavier; Suzuki, Celso Tetsuo Nagase; Dos Santos, Bianca Martins; de Aquino, Monally Conceição Costa; de Paula Ribeiro, Rafaela Silva; de Assunção, Danilla Mendes; Casemiro, Pamella Almeida Freire; Meireles, Marcelo Vasconcelos; Bresciani, Katia Denise Saraiva
2016-11-01
Cryptosporidiosis arouses strong interest in the scientific community due to its important zoonotic potential, having initially been considered a rare and opportunistic disease. The parasitological diagnosis of the causative agent of this disease, the protozoan Cryptosporidium spp., requires the use of specific concentration and permanent staining techniques, which are laborious and costly and are difficult to use in routine laboratory testing. In view of the above, we conducted the feasibility assessment, development, evaluation and intralaboratory validation of a new parasitological technique for the optical-microscopy analysis of Cryptosporidium spp. oocysts, called TF-Test Coccidia, using fecal samples from calves from the city of Araçatuba, São Paulo. To confirm the aforementioned parasite and prove the diagnostic efficiency of the new technique, we used two methodologies established in the scientific literature: parasite concentration by centrifugal sedimentation and negative staining with malachite green (CSN-Malachite) and Nested-PCR. We observed good effectiveness of the TF-Test Coccidia technique, which was statistically equivalent to CSN-Malachite. Thus, we verified the effectiveness of the TF-Test Coccidia parasitological technique for the detection of Cryptosporidium spp. oocysts and observed good concentration and morphology of the parasite, with a low amount of debris in the fecal smear. Copyright © 2016 Elsevier B.V. All rights reserved.
Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L
2003-11-01
The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.
Curve fitting air sample filter decay curves to estimate transuranic content.
Hayes, Robert B; Chiou, Hung Cheng
2004-01-01
By testing industry-standard techniques for radon progeny evaluation on air sample filters, a new technique is developed to evaluate transuranic activity on air filters by curve fitting the decay curves. The industry method modified here is simply the use of filter activity measurements at different times to estimate the air concentrations of radon progeny. The primary modification was not to look for specific radon progeny values but rather for transuranic activity. By using a method that provides reasonably conservative estimates of the transuranic activity present on a filter, some credit can then be taken for the shape of the decay curve. By carrying out rigorous statistical analysis of the curve fits to over 65 samples having no transuranic activity, taken over a 10-month period, an optimization of the fitting function and quality tests for this purpose was attained.
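As a sketch of the curve-fitting idea, the following fits a filter's count-rate decay as a single effective exponential plus a constant floor attributed to long-lived transuranic activity; the effective half-life, count rates, and noise level are invented for illustration and simplify the actual multi-nuclide progeny decay.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, lam, c):
    """Short-lived progeny decay (a, lam) plus a long-lived floor c."""
    return a * np.exp(-lam * t) + c

t = np.linspace(0, 300, 25)                       # minutes after sampling
rng = np.random.default_rng(3)
counts = model(t, 500.0, np.log(2) / 40.0, 12.0) + rng.normal(0, 5, t.size)

popt, pcov = curve_fit(model, t, counts, p0=[400, 0.02, 5])
print("transuranic floor (cpm):", popt[2], "+/-", np.sqrt(pcov[2, 2]))
```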
Computer Aided Diagnostic Support System for Skin Cancer: A Review of Techniques and Algorithms
Masood, Ammara; Al-Jumaily, Adel Ali
2013-01-01
Image-based computer aided diagnosis systems have significant potential for screening and early detection of malignant melanoma. We review the state of the art in these systems and examine current practices, problems, and prospects of image acquisition, pre-processing, segmentation, feature extraction and selection, and classification of dermoscopic images. This paper reports statistics and results from the most important implementations reported to date. We compared the performance of several classifiers specifically developed for skin lesion diagnosis and discussed the corresponding findings. Whenever available, indication of various conditions that affect the technique's performance is reported. We suggest a framework for comparative assessment of skin cancer diagnostic models and review the results based on these models. The deficiencies in some of the existing studies are highlighted and suggestions for future research are provided. PMID:24575126
Small, J R
1993-01-01
This paper is a study of the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and that, under all conditions studied, the fitting method outperformed the graph method, even under conditions where the assumptions underlying the fitted function did not hold. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
Alexander, Elizabeth L; Gardete, Susana; Bar, Haim Y; Wells, Martin T; Tomasz, Alexander; Rhee, Kyu Y
2014-01-01
Intermediate (VISA-type) vancomycin resistance in Staphylococcus aureus has been associated with a range of physiologic and genetic alterations. Previous work described the emergence of VISA-type resistance in two clonally-distinct series of isolates. In both series (the first belonging to MRSA clone ST8-USA300, and the second to ST5-USA100), resistance was conferred by a single mutation in yvqF (a negative regulator of the vraSR two-component system associated with vancomycin resistance). In the USA300 series, resistance was reversed by a secondary mutation in vraSR. In this study, we combined systems-level metabolomic profiling with statistical modeling techniques to discover specific, reversible metabolic alterations associated with the VISA phenotype.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akcakaya, Murat; Nehorai, Arye; Sen, Satyabrata
Most existing radar algorithms are developed under the assumption that the environment (clutter) is stationary. However, in practice, the characteristics of the clutter can vary enormously depending on the radar-operational scenarios. If unaccounted for, these nonstationary variabilities may drastically hinder the radar performance. Therefore, to overcome such shortcomings, we develop a data-driven method for target detection in nonstationary environments. In this method, the radar dynamically detects changes in the environment and adapts to these changes by learning the new statistical characteristics of the environment and by intelligently updating its statistical detection algorithm. Specifically, we employ drift detection algorithms to detect changes in the environment, and incremental learning, particularly learning-under-concept-drift algorithms, to learn the new statistical characteristics of the environment from the new radar data that become available in batches over a period of time. The newly learned environment characteristics are then integrated into the detection algorithm. Furthermore, we use Monte Carlo simulations to demonstrate that the developed method provides a significant improvement in detection performance compared with detection techniques that are not aware of the environmental changes.
Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G
2001-12-01
The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a technique now universally accepted; however, the literature on the subject is lacking. Our study considers 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurement of voice intensity levels and in the evaluation of the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups at either high- or low-volume speech. On the contrary, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.
Dynamic causal modelling: a critical review of the biophysical and statistical foundations.
Daunizeau, J; David, O; Stephan, K E
2011-09-15
The goal of dynamic causal modelling (DCM) of neuroimaging data is to study experimentally induced changes in functional integration among brain regions. This requires (i) biophysically plausible and physiologically interpretable models of neuronal network dynamics that can predict distributed brain responses to experimental stimuli and (ii) efficient statistical methods for parameter estimation and model comparison. These two key components of DCM have been the focus of more than thirty methodological articles since the seminal work of Friston and colleagues published in 2003. In this paper, we provide a critical review of the current state-of-the-art of DCM. We inspect the properties of DCM in relation to the most common neuroimaging modalities (fMRI and EEG/MEG) and the specificity of inference on neural systems that can be made from these data. We then discuss both the plausibility of the underlying biophysical models and the robustness of the statistical inversion techniques. Finally, we discuss potential extensions of the current DCM framework, such as stochastic DCMs, plastic DCMs and field DCMs. Copyright © 2009 Elsevier Inc. All rights reserved.
2D Affine and Projective Shape Analysis.
Bryner, Darshan; Klassen, Eric; Huiling Le; Srivastava, Anuj
2014-05-01
Current techniques for shape analysis tend to seek invariance to similarity transformations (rotation, translation, and scale), but certain imaging situations require invariance to larger groups, such as affine or projective groups. Here we present a general Riemannian framework for shape analysis of planar objects where metrics and related quantities are invariant to affine and projective groups. Highlighting two possibilities for representing object boundaries, ordered points (or landmarks) and parameterized curves, we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and for classifying test shapes using such models learned from training data. In the case of parameterized curves, we also achieve the desired goal of invariance to re-parameterizations. The geodesics are constructed by particularizing the path-straightening algorithm to the geometries of the current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition.
Experimental Mathematics and Computational Statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Engaging with the Art & Science of Statistics
ERIC Educational Resources Information Center
Peters, Susan A.
2010-01-01
How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Boe, Debra Thingstad; Parsons, Helen
2009-01-01
Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
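A minimal sketch of one common statistical process control tool, the individuals chart, applied to hypothetical weekly waiting times; the paper does not specify which chart was used, so this is illustrative only.

```python
import numpy as np

def individuals_chart(x):
    """Individuals (I) control chart limits via the moving range (sketch)."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x)).mean()            # average moving range
    center = x.mean()
    ucl = center + 2.66 * mr                  # standard I-chart constant 3/d2
    lcl = center - 2.66 * mr
    signals = np.where((x > ucl) | (x < lcl))[0]   # special-cause points
    return center, lcl, ucl, signals

# e.g., weekly average waiting times (minutes) before and after the change
print(individuals_chart([34, 31, 36, 33, 35, 22, 20, 21, 19, 23]))
```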
Analysis of the sleep quality of elderly people using biomedical signals.
Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A
2015-01-01
This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlations and significant differences, a statistical analysis was performed, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
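As an illustration of the filtering step, a minimal sketch that low-passes a synthetic EOG-like trace with an IIR Butterworth filter; the sampling rate, cutoff frequency, and test signal are assumptions, not parameters from the paper.

```python
import numpy as np
from scipy import signal

fs = 100.0                                       # sampling rate (Hz), assumed
b, a = signal.butter(4, 10.0 / (fs / 2), btype="low")   # 4th-order IIR low-pass

t = np.arange(0, 10, 1 / fs)
raw = (np.sin(2 * np.pi * 0.3 * t)               # slow eye-movement component
       + 0.3 * np.random.default_rng(0).normal(size=t.size))  # sensor noise
clean = signal.filtfilt(b, a, raw)               # zero-phase filtering

print("std before/after filtering:", raw.std(), clean.std())
```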
Kinetics of bacterial fluorescence staining with 3,3'-diethylthiacyanine.
Thomas, Marlon S; Nuñez, Vicente; Upadhyayula, Srigokul; Zielins, Elizabeth R; Bao, Duoduo; Vasquez, Jacob M; Bahmani, Baharak; Vullev, Valentine I
2010-06-15
For more than a century, colorimetric and fluorescence staining have been the foundation of a broad range of key bioanalytical techniques. The dynamics of such staining processes, however, still remains largely unexplored. We investigated the kinetics of fluorescence staining of two gram-negative and two gram-positive species with 3,3'-diethylthiacyanine (THIA) iodide. An increase in the THIA fluorescence quantum yield, induced by the bacterial dye uptake, was the principal reason for the observed emission enhancement. The fluorescence quantum yield of THIA depended on the media viscosity and not on the media polarity, which suggested that the microenvironment of the dye molecules taken up by the cells was restrictive. The kinetics of fluorescence staining showed no statistically significant dependence on either the dye concentration or the cell count. In the presence of surfactant additives, however, the fluorescence-enhancement kinetic patterns manifested species specificity with statistically significant discernibility.
Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.
Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I
2018-06-26
The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaptation of established statistical techniques, the Kaplan-Meier estimator (K-M), robust regression on order statistics (ROS), and the maximum-likelihood estimator (MLE), to quantify pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data that have been used in the fields of medical and environmental sciences. This work presents a case study in which data on thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the MLE is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
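A minimal sketch of the MLE approach for left-censored residues: fit a lognormal by maximizing a likelihood that uses the density for detected values and the CDF at the detection limit for non-detects. The function and argument names are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy import optimize, stats

def lognormal_mle_censored(detects, n_nondetect, lod):
    """MLE of a lognormal fitted to left-censored residue data (sketch).

    detects: residues measured above the LOD; n_nondetect: count of
    samples reported below the limit of detection `lod`.
    """
    x = np.log(np.asarray(detects, dtype=float))

    def neg_loglik(theta):
        mu, sigma = theta
        if sigma <= 0:
            return np.inf
        ll = stats.norm.logpdf(x, mu, sigma).sum()                     # detects
        ll += n_nondetect * stats.norm.logcdf(np.log(lod), mu, sigma)  # censored
        return -ll

    res = optimize.minimize(neg_loglik, x0=[x.mean(), x.std() + 0.1],
                            method="Nelder-Mead")
    mu, sigma = res.x
    return np.exp(mu + sigma**2 / 2), np.exp(mu)   # lognormal mean, median

print(lognormal_mle_censored([0.02, 0.05, 0.03, 0.08], n_nondetect=20, lod=0.01))
```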
Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.
Schmitt, M; Grub, J; Heib, F
2015-06-01
Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed two/three approaches, by sigmoid fitting and by independent and dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data that are independent of the skills and subjectivity of the operator, which is also urgently needed for evaluating dynamic measurements of contact angles. We show in this contribution that the slightly modified procedures are also applicable for finding specific angles in experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small distance covered and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawal of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
Hudson-Shore, Michelle
2016-03-01
The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2014 reports a welcome decline in animal experimentation in the UK. However, caution has to be exercised when interpreting these most recent figures, due to the significant changes made to satisfy the requirements of Directive 2010/63/EU as to what information is reported and how it is reported. Comparison with the figures and trends reported in previous years is difficult, so this paper focuses on the specifics of the current report, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, fish and primates. There is a detailed discussion of the extent of the changes, commenting on the benefits and disadvantages of the new format in areas such as severity of procedures, legislation and techniques of special interest. It also considers the consequences of the changes for the effective monitoring of laboratory animal use, for openness and transparency regarding the impacts of animal use, and for the implementation of Three Rs initiatives. In addition, suggestions for further improvements to the new format are made to the Home Office.
Estimation of critical behavior from the density of states in classical statistical models
NASA Astrophysics Data System (ADS)
Malakis, A.; Peratzakis, A.; Fytas, N. G.
2004-12-01
We present a simple and efficient approximation scheme which greatly facilitates the extension of Wang-Landau sampling (or similar techniques) to large systems for the estimation of critical behavior. The method, presented in an algorithmic approach, is based on a very simple idea, familiar in statistical mechanics from the notion of thermodynamic equivalence of ensembles and the central limit theorem. It is illustrated that we can predict with high accuracy the critical part of the energy space, and that by using this restricted part we can extend our simulations to larger systems and improve the accuracy of critical parameters. It is proposed that the extensions of the finite-size critical part of the energy space, which determine the specific heat, satisfy a scaling law involving the thermal critical exponent. The method is applied successfully to the estimation of the scaling behavior of the specific heat of both square and simple cubic Ising lattices. The proposed scaling law is verified by estimating the thermal critical exponent from the finite-size behavior of the critical part of the energy space. The density of states of the zero-field Ising model on these lattices is obtained via multirange Wang-Landau sampling.
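For concreteness, a compact sketch of plain Wang-Landau sampling of the density of states of the 2D zero-field Ising model follows; the sweep length, flatness criterion, and stopping rule are common textbook choices, not the paper's specific restricted-energy-space scheme.

```python
import numpy as np

def wang_landau_ising(L=4, lnf_min=1e-6, flat=0.8, seed=0):
    """Wang-Landau estimate of ln g(E) for the 2D zero-field Ising model."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    # periodic-boundary energy: each bond counted once via the two rolls
    E = -int(np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1))))
    ln_g, hist, ln_f = {E: 0.0}, {E: 0}, 1.0
    while ln_f > lnf_min:
        for _ in range(10000):                    # steps between flatness checks
            i, j = rng.integers(L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            E_new = E + 2 * spins[i, j] * nb      # energy after flipping (i, j)
            # accept with min(1, g(E)/g(E_new)); unseen levels start at ln g = 0
            if np.log(rng.random()) < ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0):
                spins[i, j] *= -1
                E = E_new
            ln_g[E] = ln_g.get(E, 0.0) + ln_f
            hist[E] = hist.get(E, 0) + 1
        counts = np.array(list(hist.values()))
        if counts.min() > flat * counts.mean():   # histogram "flat enough"
            hist = {k: 0 for k in hist}           # reset histogram, refine f
            ln_f /= 2.0
    return ln_g

print(sorted(wang_landau_ising().items())[:3])   # ln g(E) near the ground state
```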
Curve fitting and modeling with splines using statistical variable selection techniques
NASA Technical Reports Server (NTRS)
Smith, P. L.
1982-01-01
The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
Fitting multidimensional splines using statistical variable selection techniques
NASA Technical Reports Server (NTRS)
Smith, P. L.
1982-01-01
This report demonstrates the successful application of statistical variable selection techniques to fit splines. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs using the B-spline basis were developed, and the one for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
Monte Carlo errors with less errors
NASA Astrophysics Data System (ADS)
Wolff, Ulli; Alpha Collaboration
2004-01-01
We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more certain error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective integrated autocorrelation time is computed which is suitable to benchmark efficiencies of simulation algorithms with regard to specific observables of interest. A Matlab code is offered for download that implements the method. It can also combine independent runs (replica) allowing to judge their consistency.
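A minimal sketch of the summed-autocorrelation idea described above, in Python rather than the Matlab code the authors offer; the fixed summation window w_max is a simplification of the automatic windowing of the published method, and the AR(1) chain is illustrative data.

```python
import numpy as np

def autocorr_error(x, w_max=100):
    """Mean, error of the mean, and integrated autocorrelation time
    for a correlated series x, via summed autocorrelations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    # autocovariance Gamma(t) for t = 0 .. w_max-1
    gamma = np.array([np.dot(xc[:n - t], xc[t:]) / (n - t)
                      for t in range(min(w_max, n - 1))])
    rho = gamma / gamma[0]
    tau_int = 0.5 + rho[1:].sum()          # naive fixed summation window
    err = np.sqrt(2.0 * tau_int * gamma[0] / n)
    return x.mean(), err, tau_int

# illustrative data: an AR(1) chain with coefficient 0.9, for which the
# exact integrated autocorrelation time is 0.5 + 0.9/0.1 = 9.5
rng = np.random.default_rng(0)
x = np.zeros(100_000)
for i in range(1, len(x)):
    x[i] = 0.9 * x[i - 1] + rng.normal()
print(autocorr_error(x))
```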
Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
Statistical Techniques for Efficient Indexing and Retrieval of Document Images
ERIC Educational Resources Information Center
Bhardwaj, Anurag
2010-01-01
We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…
A comparison of linear and nonlinear statistical techniques in performance attribution.
Chan, N H; Genovese, C R
2001-01-01
Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks, using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes have been developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
Computerized scoring algorithms for the Autobiographical Memory Test.
Takano, Keisuke; Gutenbrunner, Charlotte; Martens, Kris; Salmon, Karen; Raes, Filip
2018-02-01
Reduced specificity of autobiographical memories is a hallmark of depressive cognition. Autobiographical memory (AM) specificity is typically measured by the Autobiographical Memory Test (AMT), in which respondents are asked to describe personal memories in response to emotional cue words. Due to this free descriptive response format, the AMT relies on experts' hand scoring for subsequent statistical analyses. This manual coding potentially impedes research activities in big data analytics such as large epidemiological studies. Here, we propose computerized algorithms to automatically score AM specificity for the Dutch (adult participants) and English (youth participants) versions of the AMT by using natural language processing and machine learning techniques. The algorithms showed reliable performance in discriminating specific and nonspecific (e.g., overgeneralized) autobiographical memories in independent testing data sets (area under the receiver operating characteristic curve > .90). Furthermore, outcome values of the algorithms (i.e., decision values of support vector machines) showed a gradient across similar (e.g., specific and extended memories) and different (e.g., specific memory and semantic associates) categories of AMT responses, suggesting that, for both adults and youth, the algorithms capture well the extent to which a memory has features of specific memories.
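A hypothetical sketch of the kind of pipeline the abstract describes: bag-of-words features feeding a linear support vector machine whose decision values grade memory specificity. The tiny training set and feature choice are illustrative, not the published algorithms.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# toy training data: 1 = specific memory, 0 = nonspecific response
memories = ["the day I graduated from college last June",
            "whenever I visit my grandmother",
            "my tenth birthday party at the lake",
            "times when I feel lonely"]
labels = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(memories, labels)
# the decision value plays the role of the graded specificity score
print(clf.decision_function(["the morning my daughter was born"]))
```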
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
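A minimal sketch of the threshold-and-count idea, assuming the estimator targets a quantile of the noise distribution; the threshold grid and exponential noise model are illustrative stand-ins for the HRMS signal chain.

```python
import numpy as np

def threshold_count_estimate(samples, thresholds, quantile=0.5):
    """Single-pass estimator: count exceedances of each candidate
    threshold in parallel and return the threshold whose exceedance
    fraction is closest to the target (1 - quantile)."""
    counts = np.array([(samples > t).sum() for t in thresholds])
    frac = counts / len(samples)
    return thresholds[np.argmin(np.abs(frac - (1.0 - quantile)))]

rng = np.random.default_rng(1)
noise = rng.exponential(scale=2.0, size=10_000)   # power-like samples
thresholds = np.linspace(0.1, 10.0, 50)
# exponential(scale=2) has median 2*ln(2) ~ 1.39
print(threshold_count_estimate(noise, thresholds))
```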
Costantino, Cosimo; Romiti, Davide
2014-06-24
Chronic low back pain (CLBP) is a major cause of disability, for which clinical practice guidelines suggest exercise programs, such as Back School program (stretching and selective muscle reinforcement techniques) and Hydrotherapy technique, as an effective treatment to reduce pain intensity and disability. We enrolled 56 elderly individuals, affected by non-specific CLBP, whose pain had worsened in the last three months, which were randomly allocated to Back School (group A) or to Hydrotherapy program (group B). Each group underwent two one-hour-treatment sessions per week, over a 12-week period. Each patient was evaluated using the Roland Morris Disability Questionnaire (RMDQ) and the 36-Item Short Form Health Survey (SF-36) V2.0 at the beginning (T0), at the end of treatment (T1) and at the 3-month follow-up (T2). At T1 and T2 we observed a highly significant statistical difference in the values measured in both groups: at T1 in group A RMDQ improvement of 3.26±1.02 (p<0.001) and SF-36 of 13.30±1.44 (p<0.001); in group B RMDQ improvement of 4.96±0.71 (p<0.001) and SF-36 of 14.19±1.98 (p<0.001). We have also evaluated the difference in effectiveness of the two programs and no significant statistical differences were found between the two groups. Back School program and Hydrotherapy could be valid treatment options in the rehabilitation of non-specific CLBP in elderly people. Both therapies proved to be effective and can be used in association with other rehabilitation programs. We believe that Back School program should be favored for its simplicity and the small number of resources required.
Barratt, Dean C; Chan, Carolyn S K; Edwards, Philip J; Penney, Graeme P; Slomczykowski, Mike; Carter, Timothy J; Hawkes, David J
2008-06-01
Statistical shape modelling potentially provides a powerful tool for generating patient-specific, 3D representations of bony anatomy for computer-aided orthopaedic surgery (CAOS) without the need for a preoperative CT scan. Furthermore, freehand 3D ultrasound (US) provides a non-invasive method for digitising bone surfaces in the operating theatre that enables a much greater region to be sampled compared with conventional direct-contact (i.e., pointer-based) digitisation techniques. In this paper, we describe how these approaches can be combined to simultaneously generate and register a patient-specific model of the femur and pelvis to the patient during surgery. In our implementation, a statistical deformation model (SDM) was constructed for the femur and pelvis by performing a principal component analysis on the B-spline control points that parameterise the freeform deformations required to non-rigidly register a training set of CT scans to a carefully segmented template CT scan. The segmented template bone surface, represented by a triangulated surface mesh, is instantiated and registered to a cloud of US-derived surface points using an iterative scheme in which the weights corresponding to the first five principal modes of variation of the SDM are optimised in addition to the rigid-body parameters. The accuracy of the method was evaluated using clinically realistic data obtained on three intact human cadavers (three whole pelves and six femurs). For each bone, a high-resolution CT scan and rigid-body registration transformation, calculated using bone-implanted fiducial markers, served as the gold standard bone geometry and registration transformation, respectively. After aligning the final instantiated model and CT-derived surfaces using the iterative closest point (ICP) algorithm, the average root-mean-square distance between the surfaces was 3.5 mm over the whole bone and 3.7 mm in the region of surgical interest. The corresponding distances after aligning the surfaces using the marker-based registration transformation were 4.6 and 4.5 mm, respectively. We conclude that despite limitations on the regions of bone accessible using US imaging, this technique has potential as a cost-effective and non-invasive method to enable surgical navigation during CAOS procedures, without the additional radiation dose associated with performing a preoperative CT scan or intraoperative fluoroscopic imaging. However, further development is required to investigate errors using error measures relevant to specific surgical procedures.
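A minimal sketch of the instantiation step under strong simplifying assumptions: a PCA shape model built from training shape vectors, with the first five mode weights fit by least squares to already-corresponded target points. The published method additionally optimises rigid-body pose and uses ICP-style correspondence on US-derived surface points.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_pts = 20, 200
base = rng.normal(size=3 * n_pts)                  # a "template" shape
train = base + rng.normal(0, 0.1, (n_train, 3 * n_pts))

mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
modes = Vt[:5]                                     # first five PCA modes

# fit mode weights to noisy target points (correspondence assumed known)
target = base + rng.normal(0, 0.05, 3 * n_pts)
w, *_ = np.linalg.lstsq(modes.T, target - mean, rcond=None)
fitted = mean + modes.T @ w
print("rms fit error:", np.sqrt(np.mean((fitted - target) ** 2)))
```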
Cognitive Clusters in Specific Learning Disorder.
Poletti, Michele; Carretta, Elisa; Bonvicini, Laura; Giorgi-Rossi, Paolo
The heterogeneity among children with learning disabilities still represents a barrier and a challenge in their conceptualization. Although a dimensional approach has been gaining support, the categorical approach is still the most adopted, as in the recent fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. The introduction of the single overarching diagnostic category of specific learning disorder (SLD) could underemphasize interindividual clinical differences regarding intracategory cognitive functioning and learning proficiency, according to current models of multiple cognitive deficits at the basis of neurodevelopmental disorders. The characterization of specific cognitive profiles associated with an already manifest SLD could help identify possible early cognitive markers of SLD risk and distinct trajectories of atypical cognitive development leading to SLD. In this perspective, we applied a cluster analysis to identify groups of children with a Diagnostic and Statistical Manual-based diagnosis of SLD with similar cognitive profiles and to describe the association between clusters and SLD subtypes. A sample of 205 children with a diagnosis of SLD were enrolled. Cluster analyses (agglomerative hierarchical and nonhierarchical iterative clustering technique) were used successively on 10 core subtests of the Wechsler Intelligence Scale for Children-Fourth Edition. The 4-cluster solution was adopted, and external validation found differences in terms of SLD subtype frequencies and learning proficiency among clusters. Clinical implications of these findings are discussed, tracing directions for further studies.
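A minimal sketch of the two-stage clustering described (agglomerative hierarchical clustering seeding a nonhierarchical iterative stage), using simulated stand-ins for the ten WISC-IV subtest scores.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(10, 3, size=(205, 10))    # 205 children x 10 subtests

Z = linkage(X, method="ward")            # agglomerative hierarchical stage
init = fcluster(Z, t=4, criterion="maxclust")
centers = np.array([X[init == k].mean(axis=0) for k in range(1, 5)])

km = KMeans(n_clusters=4, init=centers, n_init=1).fit(X)  # iterative stage
print(np.bincount(km.labels_))           # cluster sizes
```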
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. college and university degree programs in systems engineering and the computing sciences (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. They set the significance level at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical Implications. Professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
Confidence Intervals from Realizations of Simulated Nuclear Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.; Ratkiewicz, A.; Ressler, J. J.
2017-09-28
Various statistical techniques are discussed that can be used to assign a level of confidence to the predictions of models that depend on input data with known uncertainties and correlations. The particular techniques reviewed in this paper are: 1) random realizations of the input data using Monte-Carlo methods, 2) the construction of confidence intervals to assess the reliability of model predictions, and 3) resampling techniques to impose statistical constraints on the input data based on additional information. These techniques are illustrated with a calculation of the k_eff value, based on the ²³⁵U(n,f) and ²³⁹Pu(n,f) cross sections.
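A minimal sketch of techniques (1) and (2): correlated Monte-Carlo realizations of input data propagated through a model, with a percentile confidence interval on the prediction. The linear model is a placeholder, not an actual k_eff calculation.

```python
import numpy as np

rng = np.random.default_rng(0)
mean = np.array([1.0, 1.5])              # nominal input data (arbitrary units)
cov = np.array([[0.010, 0.004],          # variances and a correlation term
                [0.004, 0.020]])

def model(xs):                           # placeholder for the real model
    return 0.6 * xs[..., 0] + 0.3 * xs[..., 1]

realizations = rng.multivariate_normal(mean, cov, size=10_000)
pred = model(realizations)
lo, hi = np.percentile(pred, [2.5, 97.5])
print(f"95% confidence interval: [{lo:.4f}, {hi:.4f}]")
```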
Creighton, Doug; Gruca, Mark; Marsh, Douglas; Murphy, Nancy
2014-11-01
Cervical mobilization and manipulation have been shown to improve cervical range of motion and pain. Rotatory thrust manipulation applied to the lower cervical segments is associated with controversy and the potential for eliciting adverse reactions (AR). The purpose of this clinical trial was to describe two translatory non-thrust mobilization techniques and evaluate their effect on cervical pain and motion restriction, and whether any adverse effects were reported when they were applied to the C7 segment. This trial included 30 participants with painful and restricted cervical rotation. Participants were randomly assigned to receive one of the two mobilization techniques. Active cervical rotation and pain intensity measurements were recorded pre- and post-intervention. Within-group comparisons were determined using the Wilcoxon signed-rank test, and between-group comparisons were analyzed using the Mann-Whitney U test. Significance was set at P = 0.05. Thirty participants were evaluated immediately after one of the two mobilization techniques was applied. There was a statistically significant difference (improvement) in active cervical rotation after application of the C7 facet distraction technique for both right (P = 0.022) and left (P = 0.022) rotation. Statistically significant improvement was also found for the C7 facet gliding technique for both right (P = 0.022) and left rotation (P = 0.020). Pain reduction was statistically significant for both right and left rotation after application of both techniques. Both mobilization techniques produced similar positive effects, and neither was statistically superior to the other. A single application of either C7 mobilization technique improved active cervical rotation, reduced perceived pain, and did not produce any AR in the 30 patients with neck pain and movement limitation. These two non-thrust techniques may offer clinicians an additional safe and effective manual intervention for patients with limited and painful cervical rotation. A more robust experimental design is recommended to further examine these and similar cervical translatory mobilization techniques.
Noninvasive imaging of oral premalignancy and malignancy
NASA Astrophysics Data System (ADS)
Wilder-Smith, Petra; Krasieva, T.; Jung, W.; You, J. S.; Chen, Z.; Osann, K.; Tromberg, B.
2005-04-01
Objectives: Early detection of cancer and its curable precursors remains the best way to ensure patient survival and quality of life. Despite significant advances in treatment, oral cancer still results in 10,000 U.S. deaths annually, mainly due to the late detection of most oral lesions. The specific aim was to use a combination of non-invasive optical in vivo technologies to test a multi-modality approach to non-invasive diagnostics of oral premalignancy and malignancy. Methods: In the hamster cheek pouch model (120 hamsters), in vivo optical coherence tomography (OCT) and optical Doppler tomography (ODT) mapped epithelial, subepithelial and vascular change throughout carcinogenesis in specific, marked sites. In vivo multi-wavelength multi-photon (MPM) and second harmonic generation (SHG) fluorescence techniques provided parallel data on surface and subsurface tissue structure, specifically collagen presence and structure, cellular presence, and vasculature. Images were diagnosed by two blinded, pre-standardized investigators using a standardized scale of 0-6 for all modalities. After sacrifice, histopathological sections were prepared and pathology evaluated on a scale of 0-6. ANOVA techniques compared imaging diagnostics with histopathology. 95% confidence limits of the sensitivity and specificity were established for the diagnostic capability of OCT/ODT + MPM/SHG using ROC curves and kappa statistics. Results: Imaging data were reproducibly obtained with good accuracy. Carcinogenesis-related structural and vascular changes were clearly visible to tissue depths of 2 mm. Sensitivity (OCT/ODT alone: 71-88%; OCT+MPM/SHG: 79-91%) and specificity (OCT alone: 62-83%; OCT+MPM/SHG: 67-90%) compared well with conventional techniques. Conclusions: OCT/ODT and MPM/SHG are promising non-invasive in vivo diagnostic modalities for oral dysplasia and malignancy. Supported by CRFA 30003, CCRP 00-01391V-20235, NIH (LAMMP) RR01192, DOE DE903-91ER 61227, NIH EB-00293 CA91717, NSF BES-86924, AFOSR FA 9550-04-1-0101.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
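A minimal sketch of a window-based statistical detector in the spirit described, with the Spark streaming plumbing omitted; the baseline parameters, window length and z-threshold are illustrative.

```python
import numpy as np

def detect_windows(stream, baseline, win=50, z=4.0):
    """Flag windows whose mean deviates from the training baseline
    (mu, sigma) by more than z standard errors."""
    mu, sigma = baseline
    hits = []
    for start in range(0, len(stream) - win + 1, win):
        w = stream[start:start + win]
        if abs(w.mean() - mu) > z * sigma / np.sqrt(win):
            hits.append(start)
    return hits

rng = np.random.default_rng(2)
data = rng.normal(1.0, 0.2, 5000)
data[1200:1250] += 1.5                   # inject a group anomaly
print(detect_windows(data, baseline=(1.0, 0.2)))   # expect [1200]
```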
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible to i) determine natural groups or clusters of control strategies with similar behaviour, ii) find and interpret hidden, complex and causal relationships in the data set, and iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
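A minimal sketch of the CA and PCA steps on a strategy-by-criterion evaluation matrix; the random matrix stands in for BSM2 simulation output.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
E = rng.normal(size=(30, 12))            # 30 strategies x 12 criteria

Es = StandardScaler().fit_transform(E)
scores = PCA(n_components=3).fit_transform(Es)     # latent structure
groups = fcluster(linkage(Es, "ward"), t=4, criterion="maxclust")
print(scores[:3])
print(groups)                            # cluster label per strategy
```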
NASA Astrophysics Data System (ADS)
Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.
2002-03-01
Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of aircraft crashes. Another survival analysis tool is competing risks, where we have more than one cause of failure acting simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of the models using root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
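A sketch of the Kalbfleisch-Prentice approach using the lifelines package (an assumed tool choice): one cause-specific Cox fit per failure type, with the competing cause treated as censoring. The Lunn-McNeil variants instead stack the data into a single augmented model, which is not shown here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
t1 = rng.exponential(1.0 / np.exp(0.5 * x))   # cause 1, true coefficient 0.5
t2 = rng.exponential(1.0, size=n)             # cause 2, no effect of x
time = np.minimum(t1, t2)
cause = np.where(t1 < t2, 1, 2)

for c in (1, 2):
    # treat the other failure type as censored
    df = pd.DataFrame({"T": time, "E": (cause == c).astype(int), "x": x})
    fit = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    print(f"cause {c} coefficient:", float(fit.params_["x"]))
```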
Correcting evaluation bias of relational classifiers with network cross validation
Neville, Jennifer; Gallagher, Brian; Eliassi-Rad, Tina; ...
2011-01-04
Recently, a number of modeling techniques have been developed for data mining and machine learning in relational and network domains where the instances are not independent and identically distributed (i.i.d.). These methods specifically exploit the statistical dependencies among instances in order to improve classification accuracy. However, there has been little focus on how these same dependencies affect our ability to draw accurate conclusions about the performance of the models. More specifically, the complex link structure and attribute dependencies in relational data violate the assumptions of many conventional statistical tests and make it difficult to use these tests to assess the models in an unbiased manner. In this work, we examine the task of within-network classification and the question of whether two algorithms will learn models that will result in significantly different levels of performance. We show that the commonly used form of evaluation (paired t-test on overlapping network samples) can result in an unacceptable level of Type I error. Furthermore, we show that Type I error increases as (1) the correlation among instances increases and (2) the size of the evaluation set increases (i.e., the proportion of labeled nodes in the network decreases). Lastly, we propose a method for network cross-validation that, combined with paired t-tests, produces more acceptable levels of Type I error while still providing reasonable levels of statistical power (i.e., 1 - Type II error).
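A minimal simulation of the Type I error inflation described: two equally good classifiers are compared across folds with heavily overlapping evaluation sets, so fold-level differences are correlated and a t-test rejects the true null far more often than the nominal 5%. The population size, overlap, and fold count are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rejections = 0
for _ in range(2000):
    # per-node score difference of two equally good classifiers: fixed
    # within a network realization, zero in expectation across networks
    delta = rng.normal(0.0, 1.0, size=200)
    folds = [rng.choice(200, size=150, replace=False) for _ in range(10)]
    d = np.array([delta[f].mean() for f in folds])   # overlapping samples
    if stats.ttest_1samp(d, 0.0).pvalue < 0.05:
        rejections += 1
print("empirical Type I error:", rejections / 2000)  # far above 0.05
```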
Noise-gating to Clean Astrophysical Image Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeForest, C. E.
I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.
An Efficient Algorithm for the Detection of Infrequent Rapid Bursts in Time Series Data
NASA Astrophysics Data System (ADS)
Giles, A. B.
1997-01-01
Searching through data for infrequent rapid bursts is a common requirement in many areas of scientific research. In this paper, we present a powerful and flexible analysis method that, in a single pass through the data, searches for statistically significant bursts on a set of specified short timescales. The input data are binned, if necessary, and then quantified in terms of probabilities rather than rates or ratios. Using a probability-based measure makes the method relatively count-rate independent. The method has been made computationally efficient by the use of lookup tables and cyclic buffers, and it is therefore particularly well suited to real-time applications. The technique has been developed specifically for use in an X-ray astronomy application to search for millisecond bursts from black hole candidates such as Cyg X-1. We briefly review the few observations of these types of features reported in the literature, as well as the variety of ways in which their statistical reliability was challenged. The developed technique, termed the burst expectation search (BES) method, is illustrated using some data simulations and archived data obtained during ground testing of the proportional counter array (PCA) experiment detectors on the Rossi X-Ray Timing Explorer (RXTE). A potential application for a real-time BES method on board RXTE is also examined.
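A minimal sketch of the core BES idea: each candidate window is scored by the Poisson tail probability of its summed counts, so the trigger is a probability threshold rather than a rate threshold. Direct sums replace the lookup tables and cyclic buffers of the real-time version, and the timescales and threshold are illustrative.

```python
import numpy as np
from scipy import stats

def burst_search(counts, timescales=(2, 4, 8), log_p_thresh=-10.0):
    """Return (start bin, timescale, log tail probability) for windows
    whose summed counts are improbably high under a Poisson baseline."""
    mean_rate = counts.mean()
    hits = []
    for w in timescales:
        sums = np.convolve(counts, np.ones(w, dtype=int), "valid")
        logp = stats.poisson.logsf(sums - 1, w * mean_rate)  # P(X >= sum)
        for i in np.flatnonzero(logp < log_p_thresh):
            hits.append((int(i), w, float(logp[i])))
    return hits

rng = np.random.default_rng(3)
counts = rng.poisson(5, size=10_000)
counts[4000:4004] += 30                  # inject a short burst
print(burst_search(counts)[:5])
```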
Habachi, A El; Conil, E; Hadjem, A; Vazquez, E; Wong, M F; Gati, A; Fleury, G; Wiart, J
2010-04-07
In this paper, we propose identification of the morphological factors that may impact the whole-body averaged specific absorption rate (WBSAR). This study is conducted for the case of exposure to a front plane wave at a 2100 MHz frequency carrier. This study is based on the development of different regression models for estimating the WBSAR as a function of morphological factors. For this purpose, a database of 12 anatomical human models (phantoms) has been considered. Also, 18 supplementary phantoms obtained using the morphing technique were generated to build the required relation. This paper presents three models based on external morphological factors such as the body surface area, the body mass index or the body mass. These models show good results in estimating the WBSAR (<10%) for families obtained by the morphing technique, but these are still less accurate (30%) when applied to different original phantoms. This study stresses the importance of the internal morphological factors such as muscle and fat proportions in characterization of the WBSAR. The regression models are then improved using internal morphological factors with an estimation error of approximately 10% on the WBSAR. Finally, this study is suitable for establishing the statistical distribution of the WBSAR for a given population characterized by its morphology.
Noise-gating to Clean Astrophysical Image Data
NASA Astrophysics Data System (ADS)
DeForest, C. E.
2017-04-01
I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.
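A minimal sketch of a local Fourier noise gate, assuming white additive noise of known sigma; the published algorithm adds apodized, overlapping neighborhoods, noise-model-specific thresholds, and the image-sequence (3-D) case.

```python
import numpy as np

def noise_gate(img, tile=16, k=3.0, sigma=1.0):
    """Zero Fourier components of each tile below an expected white-noise
    amplitude floor (k * sigma * tile), keeping coherent structure."""
    out = np.zeros_like(img, dtype=float)
    for i in range(0, img.shape[0] - tile + 1, tile):
        for j in range(0, img.shape[1] - tile + 1, tile):
            f = np.fft.fft2(img[i:i + tile, j:j + tile])
            f[np.abs(f) < k * sigma * tile] = 0.0
            out[i:i + tile, j:j + tile] = np.fft.ifft2(f).real
    return out

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 128)
truth = np.outer(np.sin(t), np.cos(t))   # smooth coherent structure
cleaned = noise_gate(truth + rng.normal(0, 1.0, truth.shape))
```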
NASA Astrophysics Data System (ADS)
El Habachi, A.; Conil, E.; Hadjem, A.; Vazquez, E.; Wong, M. F.; Gati, A.; Fleury, G.; Wiart, J.
2010-04-01
In this paper, we propose identification of the morphological factors that may impact the whole-body averaged specific absorption rate (WBSAR). This study is conducted for the case of exposure to a front plane wave at a 2100 MHz frequency carrier. This study is based on the development of different regression models for estimating the WBSAR as a function of morphological factors. For this purpose, a database of 12 anatomical human models (phantoms) has been considered. Also, 18 supplementary phantoms obtained using the morphing technique were generated to build the required relation. This paper presents three models based on external morphological factors such as the body surface area, the body mass index or the body mass. These models show good results in estimating the WBSAR (<10%) for families obtained by the morphing technique, but these are still less accurate (30%) when applied to different original phantoms. This study stresses the importance of the internal morphological factors such as muscle and fat proportions in characterization of the WBSAR. The regression models are then improved using internal morphological factors with an estimation error of approximately 10% on the WBSAR. Finally, this study is suitable for establishing the statistical distribution of the WBSAR for a given population characterized by its morphology.
Current genetic methodologies in the identification of disaster victims and in forensic analysis.
Ziętkiewicz, Ewa; Witt, Magdalena; Daca, Patrycja; Zebracka-Gala, Jadwiga; Goniewicz, Mariusz; Jarząb, Barbara; Witt, Michał
2012-02-01
This review presents the basic problems and currently available molecular techniques used for genetic profiling in disaster victim identification (DVI). The environmental conditions of a mass disaster often result in severe fragmentation, decomposition and intermixing of the remains of victims. In such cases, traditional identification based on the anthropological and physical characteristics of the victims is frequently inconclusive. This is the reason why DNA profiling became the gold standard for victim identification in mass-casualty incidents (MCIs) or any forensic cases where human remains are highly fragmented and/or degraded beyond recognition. The review provides general information about the sources of genetic material for DNA profiling, the genetic markers routinely used during genetic profiling (STR markers, mtDNA and single-nucleotide polymorphisms [SNP]) and the basic statistical approaches used in DNA-based disaster victim identification. Automated technological platforms that allow the simultaneous analysis of a multitude of genetic markers used in genetic identification (oligonucleotide microarray techniques and next-generation sequencing) are also presented. Forensic and population databases containing information on human variability, routinely used for statistical analyses, are discussed. The final part of this review is focused on recent developments, which offer particularly promising tools for forensic applications (mRNA analysis, transcriptome variation in individuals/populations and genetic profiling of specific cells separated from mixtures).
Koerner, Tess K; Zhang, Yang
2017-02-27
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining strengths between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers.
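A minimal sketch of the contrast described, using statsmodels: a pooled Pearson correlation that ignores the repeated-measures grouping versus a mixed-effects model with a random intercept per subject. The data and variable names are simulated stand-ins for the electrophysiology measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n_subj, n_cond = 20, 4
subj = np.repeat(np.arange(n_subj), n_cond)
neural = rng.normal(size=n_subj * n_cond) + rng.normal(size=n_subj)[subj]
behav = 0.5 * neural + rng.normal(0, 1, n_subj * n_cond)
df = pd.DataFrame({"subj": subj, "neural": neural, "behav": behav})

print(stats.pearsonr(df.neural, df.behav))        # ignores grouping
m = smf.mixedlm("behav ~ neural", df, groups=df["subj"]).fit()
print(m.params["neural"], m.pvalues["neural"])    # random intercept per subject
```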
NASA Astrophysics Data System (ADS)
Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.
2015-12-01
In today's highly competitive market, Total Quality Management (TQM) is a vital management tool in ensuring that a company can succeed in its business. In order to survive in the global market with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential in improving business performance. Previous studies report consistent results linking TQM and business performance. However, only a few previous studies have examined the mediator effect, namely statistical process control (SPC), between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC, using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. Findings on the significance of the mediator effect between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
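A minimal sketch of the mediation logic (SPC mediating the TQM-performance link) via two regressions on simulated data: the indirect effect is the product of path a (TQM to SPC) and path b (SPC to performance, controlling for TQM). The study itself used structural equation modelling on survey scales.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
tqm = rng.normal(size=300)
spc = 0.6 * tqm + rng.normal(0, 1, 300)                 # path a
perf = 0.4 * spc + 0.3 * tqm + rng.normal(0, 1, 300)    # paths b and c'

a = sm.OLS(spc, sm.add_constant(tqm)).fit().params[1]
b = sm.OLS(perf, sm.add_constant(np.column_stack([tqm, spc]))).fit().params[2]
print("indirect effect a*b =", a * b)                   # cf. IE = 0.25 above
```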
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick
2014-01-01
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076
Fusco, Diana; Barnum, Timothy J; Bruno, Andrew E; Luft, Joseph R; Snell, Edward H; Mukherjee, Sayan; Charbonneau, Patrick
2014-01-01
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis.
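A minimal sketch of the modeling idea with scikit-learn: a Gaussian-process classifier relating protein features to crystallization outcome. The two features and the labels are synthetic stand-ins for the NESG-derived properties and the two mechanisms described.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# synthetic features: mean side-chain entropy, surface charge density
X = rng.normal(size=(182, 2))
y = ((X[:, 0] < -0.2) | (X[:, 1] > 0.8)).astype(int)  # two "mechanisms"

gp = GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(X, y)
print(gp.predict_proba(np.array([[-1.0, 0.0], [0.0, 1.5]])))
```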
Gene coexpression measures in large heterogeneous samples using count statistics.
Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan
2014-11-18
With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.
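A minimal sketch of a count-based association statistic in this spirit: counting concordant local rank changes between two expression profiles over a time course. This is a simplification; the paper's statistics and their asymptotics differ.

```python
import numpy as np

def concordant_change_count(x, y):
    """Count time steps at which both profiles move in the same
    direction; a local, rank-like pattern statistic."""
    return int(np.sum(np.sign(np.diff(x)) == np.sign(np.diff(y))))

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 60)
g1 = np.sin(t) + rng.normal(0, 0.3, t.size)
g2 = np.sin(t + 0.2) + rng.normal(0, 0.3, t.size)   # locally coupled
print(concordant_change_count(g1, g2), "of", t.size - 1)
```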
Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D
2018-05-08
In this study, we apply a multidisciplinary approach to investigate falls in Parkinson's disease (PD) patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls, including gait speed, Hoehn and Yahr stage, and postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross-validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. postural instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine-learning-based techniques provide more reliable clinical outcome forecasting of falls in Parkinson's patients, for example with a classification accuracy of about 70-80%.
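A minimal sketch of the model comparison under 5-fold cross-validation with scikit-learn; the features and labels are simulated stand-ins for the clinical predictors named above, and XGBoost is omitted to keep the sketch self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))                     # stand-in clinical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 300) > 0).astype(int)

for name, model in [("logistic regression", LogisticRegression()),
                    ("random forest", RandomForestClassifier()),
                    ("support vector machine", SVC())]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```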
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable.
To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
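A minimal sketch of the log-transform regression practice described: a power-law (log-log) fit of constituent load against flow, with a residual normality check. The data are synthetic and the model form is illustrative.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
flow = rng.lognormal(mean=2.0, sigma=0.8, size=120)
load = 0.4 * flow ** 1.2 * rng.lognormal(0, 0.3, 120)  # multiplicative error

X = sm.add_constant(np.log(flow))
fit = sm.OLS(np.log(load), X).fit()
print(fit.params)                     # approx [log(0.4), 1.2]
print(stats.jarque_bera(fit.resid))   # diagnostic: normality of residuals
```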
Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual
1974-10-31
Report number DNA 3336F-1. [Only fragments of the abstract are recoverable: the diagrams cover a subsystem (SS2) and a facility (SS7); the other three diagrams break down the three critical subsystems; the median probability of survival of the...]
USDA-ARS?s Scientific Manuscript database
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
ERIC Educational Resources Information Center
Vivo, Juana-Maria; Franco, Manuel
2008-01-01
This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
Statistical Techniques Used in Published Articles: A Historical Review of Reviews
ERIC Educational Resources Information Center
Skidmore, Susan Troncoso; Thompson, Bruce
2010-01-01
The purpose of the present study is to provide a historical account and metasynthesis of which statistical techniques are most frequently used in the fields of education and psychology. Six articles reviewing the "American Educational Research Journal" from 1969 to 1997 and five articles reviewing the psychological literature from 1948 to 2001…
A Technique for Merging Areas in Timber Mart-South Data
Jeffrey P. Prestemon; John M. Pye
2000-01-01
For over 20 yr, TimberMart-South (TMS) has been distributing prices of various wood products from southern forests. At the beginning of 1988, the reporting frequency changed from monthly to quarterly, a change readily addressed through a variety of established statistical techniques. A more significant statistical challenge is Timber Mart-South's change in 1992 from...
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
NASA Astrophysics Data System (ADS)
Poulain, Pierre-Marie; Luther, Douglas S.; Patzert, William C.
1992-11-01
Two techniques have been developed for estimating statistics of inertial oscillations from satellite-tracked drifters. These techniques overcome the difficulties inherent in estimating such statistics from data dependent upon space coordinates that are a function of time. Application of these techniques to tropical surface drifter data collected during the NORPAX, EPOCS, and TOGA programs reveals a latitude-dependent, statistically significant "blue shift" of inertial wave frequency. The latitudinal dependence of the blue shift is similar to predictions based on "global" internal wave spectral models, with a superposition of frequency shifting due to modification of the effective local inertial frequency by the presence of strongly sheared zonal mean currents within 12° of the equator.
NASA Technical Reports Server (NTRS)
Djorgovski, Stanislav
1992-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper are intended to show the benefits of high-efficiency electric motors over standard-efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data were collected and then processed by means of formulas to show the cost effectiveness of energy-efficient motors in terms of three important parameters: annual energy saving, cost saving, and payback period. These data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
Empirical performance of interpolation techniques in risk-neutral density (RND) estimation
NASA Astrophysics Data System (ADS)
Bahaludin, H.; Abdullah, M. H.
2017-03-01
The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated by using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial, and smoothing spline. The LOOCV pricing errors show that fourth-order polynomial interpolation provides the best fit to option prices, yielding the lowest error.
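A minimal sketch of the LOOCV pricing-error idea described above: each observation is held out in turn, a polynomial is fitted to the remaining points, and the squared error at the held-out point is accumulated. The strike/price data below are hypothetical stand-ins for option quotes, and the RND extraction step itself is omitted.

```python
import numpy as np

def loocv_error(x, y, degree):
    """Leave-one-out cross-validation error for polynomial interpolation."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        coeffs = np.polyfit(x[mask], y[mask], degree)   # fit without point i
        errors.append((y[i] - np.polyval(coeffs, x[i])) ** 2)
    return float(np.mean(errors))

# Hypothetical strike/price curve standing in for observed option quotes.
strikes = np.linspace(80, 120, 15)
prices = np.maximum(100 - strikes, 0) + 5 * np.exp(-((strikes - 100) / 15) ** 2)

for deg in (2, 4):
    print(f"degree {deg}: LOOCV error = {loocv_error(strikes, prices, deg):.4f}")
```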
NASA Astrophysics Data System (ADS)
Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.
2018-05-01
This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at mesospheric altitudes as a specific example, with their daytime photochemical equilibrium as the evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % over the full range of altitudes, independent of season or latitude. We have developed a statistical Bayesian technique for evaluating simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation, taking the measurement error into account. The first results of applying the technique to MLS/Aura (Microwave Limb Sounder) data are presented in this Technical Note. It has been found that the satellite HO2 data regularly place this component's mesospheric maximum at lower altitudes. This has also been confirmed by model HO2 distributions and by comparison with offline retrievals of HO2 from daily zonal-mean MLS radiances.
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial based on a relatively small sample. In this paper, sample size estimates for comparing two parallel-design arms with continuous data are presented using a bootstrap procedure for various test types (inequality, non-inferiority, superiority, and equivalence). Sample size calculations by mathematical formulas (under the normal distribution assumption) for the same data are also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. We then compared the powers determined by the two methods on data that violate the normal distribution assumption. To accommodate this feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable to apply the bootstrap method for sample size calculation at the outset, and to employ the same statistical method as planned for the subsequent statistical analysis on each bootstrap sample during sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
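A minimal sketch of the bootstrap power loop the abstract describes, assuming hypothetical skewed pilot data and SciPy's Wilcoxon rank-sum test as the planned analysis method; the paper's exact resampling scheme may differ. To size a trial, one would increase n until the estimated power reaches the target (e.g., 0.8).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def bootstrap_power(pilot_a, pilot_b, n_per_group, n_boot=1000, alpha=0.05):
    """Estimate power at a candidate per-group sample size by resampling
    the pilot arms with replacement and applying the planned test."""
    rejections = 0
    for _ in range(n_boot):
        a = rng.choice(pilot_a, size=n_per_group, replace=True)
        b = rng.choice(pilot_b, size=n_per_group, replace=True)
        if stats.ranksums(a, b).pvalue < alpha:   # same test as final analysis
            rejections += 1
    return rejections / n_boot

# Hypothetical log-normal pilot data violating the normality assumption.
pilot_a = rng.lognormal(mean=0.0, sigma=1.0, size=40)
pilot_b = rng.lognormal(mean=0.5, sigma=1.0, size=40)

for n in (20, 40, 80):
    print(f"n = {n:3d}/group: estimated power = {bootstrap_power(pilot_a, pilot_b, n):.3f}")
```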
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, K.D.; Brown, M.L.; Dewanjee, M.K.
We prospectively compared sequential technetium-gallium imaging with indium-labeled-leukocyte imaging in fifty patients with suspected low-grade musculoskeletal sepsis. Adequate images and follow-up examinations were obtained for forty-two patients. The presence or absence of low-grade sepsis was confirmed by histological and bacteriological examinations of tissue specimens taken at surgery in thirty of the forty-two patients. In these thirty patients, the sensitivity of sequential Tc-Ga imaging was 48 per cent, the specificity was 86 per cent, and the accuracy was 57 per cent, whereas the sensitivity of the indium-labeled-leukocyte technique was 83 per cent, the specificity was 86 per cent, and the accuracy was 83 per cent. When the additional twelve patients for whom surgery was deemed unnecessary were considered, the sensitivity of sequential Tc-Ga imaging was 50 per cent, the specificity was 78 per cent, and the accuracy was 62 per cent, as compared with a sensitivity of 83 per cent, a specificity of 94 per cent, and an accuracy of 88 per cent with the indium-labeled-leukocyte method. In patients with a prosthesis the indium-labeled-leukocyte image was 94 per cent accurate, compared with 75 per cent accuracy for sequential Tc-Ga imaging. Statistical analysis of these data demonstrated that the indium-labeled-leukocyte technique was superior to sequential Tc-Ga imaging in detecting areas of low-grade musculoskeletal sepsis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zauls, A. Jason; Ashenafi, Michael S.; Onicescu, Georgiana
2011-11-15
Purpose: To report our dosimetric results using a novel push-button seed delivery system that constructs custom links of seeds intraoperatively. Methods and Materials: From 2005 to 2007, 43 patients underwent implantation using a gun applicator (GA), and from 2007 to 2008, 48 patients underwent implantation with a novel technique allowing creation of intraoperatively built custom links of seeds (IBCL). Specific endpoint analyses were prostate D90% (pD90%), rV100% > 1.3 cc, and overall time under anesthesia. Results: Final analyses included 91 patients, 43 GA and 48 IBCL. Absolute change in pD90% ({Delta}pD90%) between intraoperative and postoperative plans was evaluated. Using the GA method, the {Delta}pD90% was -8.1 Gy and -12.8 Gy for I-125 and Pd-103 implants, respectively. Similarly, the IBCL technique resulted in a {Delta}pD90% of -8.7 Gy and -9.8 Gy for I-125 and Pd-103 implants, respectively. No statistically significant difference in {Delta}pD90% was found comparing methods. The GA method had two intraoperative and 10 postoperative rV100% > 1.3 cc. For IBCL, five intraoperative and eight postoperative plans had rV100% > 1.3 cc. For GA, the mean time under anesthesia was 75 min and 87 min for Pd-103 and I-125 implants, respectively. For IBCL, the mean time was 86 and 98 min for Pd-103 and I-125. There was a statistical difference between the methods when comparing mean time under anesthesia. Conclusions: Dosimetrically relevant endpoints were equivalent between the two methods. Currently, time under anesthesia is longer using the IBCL technique but has decreased over time. IBCL is a straightforward brachytherapy technique that can be implemented into clinical practice as an alternative to gun applicators.
Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics
NASA Astrophysics Data System (ADS)
Ohzeki, Masayuki
2013-09-01
In this chapter, we show two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code for restoring the quantum state, the accuracy threshold, has a close connection with the phase-transition problem in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system whose complicated interactions change sign depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means statistical mechanics can be used to reveal one of the most interesting parts of quantum information processing. We show how to import this special tool of statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we show another interesting technique that employs quantum nature: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely an optimization problem. The most typical instance is the traveling salesman problem: finding the minimum-length tour that visits all the cities. In quantum annealing, we introduce quantum fluctuations to drive a system with an artificial Hamiltonian in which the ground state represents the optimal solution of the specific problem we desire to solve. Introducing quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations efficiently so as to reach the ground state. Such a generic framework is called quantum annealing. Its most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in the other chapter, unfortunately has a crucial bottleneck for some optimization problems. We introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of the interdisciplinary field between quantum mechanics and statistical mechanics.
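Simulating the quantum dynamics is beyond a short example, but the annealing idea itself, fluctuations that are gradually reduced while the system settles into a low-energy configuration, has a classical analogue in simulated annealing. A toy sketch for a small traveling-salesman instance follows; the thermal acceptance rule stands in for the quantum fluctuation schedule, and all values are hypothetical.

```python
import math
import random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(12)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(len(cities)))
best = tour_length(tour)
temperature = 1.0                      # plays the role of the fluctuation strength
while temperature > 1e-3:
    i, j = sorted(random.sample(range(len(cities)), 2))
    candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
    delta = tour_length(candidate) - tour_length(tour)
    # Downhill moves are always accepted; uphill hops survive with a
    # probability that shrinks as the schedule "cools".
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        tour = candidate
        best = min(best, tour_length(tour))
    temperature *= 0.999

print(f"best tour length found: {best:.3f}")
```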
NASA Astrophysics Data System (ADS)
Schultz, E.; Genuer, V.; Marcoux, P.; Gal, O.; Belafdil, C.; Decq, D.; Maurin, Max; Morales, S.
2018-02-01
Elastic Light Scattering (ELS) is an innovative technique to identify bacterial pathogens directly on culture plates. Compelling results have already been reported for agri-food applications. Here, we have developed ELS for clinical diagnosis, starting with Staphylococcus aureus early screening. Our goal is to bring a result (positive/negative) after only 6 h of growth to fight surgical-site infections. The method starts with the acquisition of the scattering pattern arising from the interaction between a laser beam and a single bacterial colony growing on a culture medium. The resulting image, considered as the bacterial species signature, is then analyzed using statistical learning techniques. We present a custom optical setup able to target bacterial colonies of various sizes (30-500 microns). This system was used to collect a reference dataset of 38 strains of S. aureus and other Staphylococcus species (5459 images) on ChromID SAID/MRSA bi-plates. A validation set from 20 patients was then acquired and clinically validated according to chromogenic enzymatic tests. The best correct-identification rate between S. aureus and S. non-aureus (94.7%) was obtained using a support vector machine classifier trained on a combination of Fourier-Bessel moments and Local-Binary-Pattern features. This statistical model applied to the validation set provided a sensitivity of 90.0% and a specificity of 56.9%, or alternatively, a positive predictive value of 47% and a negative predictive value of 93%. From a clinical point of view, the results head in the right direction and pave the way toward the WHO's requirements for rapid, low-cost, and automated diagnosis tools.
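A hedged sketch of the classification stage only: local-binary-pattern histograms feeding a support vector classifier, with synthetic textures standing in for the scattering images. The Fourier-Bessel moments and the actual optical data are omitted, and none of the settings below are the study's.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import local_binary_pattern
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def lbp_histogram(image, points=8, radius=1):
    """Texture descriptor: normalized histogram of uniform LBP codes."""
    lbp = local_binary_pattern(image, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

# Synthetic stand-ins: one "species" yields rough speckle, the other a
# smoother pattern; real inputs would be the recorded scattering images.
rough = [rng.normal(size=(64, 64)) for _ in range(30)]
smooth = [gaussian_filter(rng.normal(size=(64, 64)), sigma=2) for _ in range(30)]

X = np.array([lbp_histogram(img) for img in rough + smooth])
y = np.array([0] * 30 + [1] * 30)
print(f"cross-validated accuracy: {cross_val_score(SVC(), X, y, cv=5).mean():.2f}")
```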
Martinez-Lozano Sinues, Pablo; Landoni, Elena; Miceli, Rosalba; Dibari, Vincenza F; Dugo, Matteo; Agresti, Roberto; Tagliabue, Elda; Cristoni, Simone; Orlandi, Rosaria
2015-09-21
Breath analysis represents a new frontier in medical diagnosis and a powerful tool for cancer biomarker discovery due to the recent development of analytical platforms for the detection and identification of human exhaled volatile compounds. Statistical and bioinformatic tools may represent an effective complement to the technical and instrumental enhancements needed to fully exploit clinical applications of breath analysis. Our exploratory study in a cohort of 14 breast cancer patients and 11 healthy volunteers used secondary electrospray ionization-mass spectrometry (SESI-MS) to detect a cancer-related volatile profile. SESI-MS full-scan spectra were acquired in the range of 40-350 mass-to-charge ratio (m/z), converted to matrix data, and analyzed using a procedure integrating data pre-processing for quality control and a two-step class prediction based on machine-learning techniques, including robust feature selection and classifier development with internal validation. MS spectra from exhaled breath showed an individual-specific breath profile and high reciprocal homogeneity among samples, with strong agreement among technical replicates, suggesting a robust responsiveness of SESI-MS. Supervised analysis of the breath data identified a support vector machine (SVM) model including 8 features, corresponding to m/z 106, 126, 147, 78, 148, 52, 128, and 315, able to discriminate the exhaled breath of breast cancer patients from that of healthy individuals, with sensitivity and specificity above 0.9. Our data highlight the significance of SESI-MS as an analytical technique for clinical studies of breath analysis and provide evidence that our noninvasive strategy detects volatile signatures that may support existing technologies to diagnose breast cancer.
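A sketch of the two-step class-prediction structure on a simulated intensity matrix (not SESI-MS data): feature selection sits inside the cross-validation pipeline so each fold reselects features, keeping the internal validation unbiased. The dimensions and settings are assumptions, not the study's.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical matrix: 25 subjects x 300 m/z bins, a few informative features.
X = rng.normal(size=(25, 300))
y = np.array([1] * 14 + [0] * 11)       # 14 patients, 11 healthy volunteers
X[y == 1, :8] += 1.5                    # shift eight "discriminative" bins

model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=8)),   # reselected inside each fold
    ("svm", SVC(kernel="linear")),
])
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
print(f"cross-validated accuracy: {cross_val_score(model, X, y, cv=cv).mean():.2f}")
```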
Lewis, Gregory S; Conaway, William K; Wee, Hwabok; Kim, H Mike
2017-02-28
A novel technique of "anterior offsetting" of the humeral head component to address posterior instability in total shoulder arthroplasty has been proposed, and its biomechanical benefits have been previously demonstrated experimentally. The present study sought to characterize the changes in joint mechanics associated with anterior offsetting at various amounts of glenoid retroversion using cadaver specimen-specific 3-dimensional finite element models. Specimen-specific computational finite element models were developed by importing digitized locations of six musculotendinous units of the rotator cuff and deltoid muscles based on three cadaveric shoulder specimens implanted with total shoulder arthroplasty in either anatomic or anterior humeral head offset. Glenoid retroversion angles (0°, 10°, 20°, and 30°) other than each specimen's actual retroversion were also modeled. Contact area, contact force, peak pressure, center of pressure, and humeral head displacement were calculated at each offset and retroversion for statistical analysis. Anterior offsetting was associated with a significant anterior shift of the center of pressure and of humeral head displacement upon muscle loading (p < 0.05). Although statistically insignificant, anterior offsetting was associated with increased contact area and decreased peak pressure (p > 0.05). All study variables showed significant differences when compared between the four glenoid retroversion angles (p < 0.05), except for total force (p > 0.05). The study findings suggest that the anterior offsetting technique may contribute to joint stability in posteriorly unstable shoulder arthroplasty and may reduce eccentric loading on glenoid components, although the long-term clinical results are yet to be investigated. Copyright © 2017 Elsevier Ltd. All rights reserved.
Inverse Problems in Geodynamics Using Machine Learning Algorithms
NASA Astrophysics Data System (ADS)
Shahnas, M. H.; Yuen, D. A.; Pysklywec, R. N.
2018-01-01
During the past few decades, numerical studies have been widely employed to explore the style of circulation and mixing in the mantle of Earth and other planets. However, these geodynamical models involve many uncertain properties from mineral physics, geochemistry, and petrology. Machine learning, a computational statistics-related technique and a subfield of artificial intelligence, has recently emerged rapidly in many fields of science and engineering. We focus here on the application of supervised machine learning (SML) algorithms to predictions of mantle flow processes. Specifically, we emphasize estimating mantle properties by employing machine learning techniques to solve an inverse problem. Using snapshots of numerical convection models as training samples, we enable machine learning models to determine the magnitude of the spin transition-induced density anomalies that can cause flow stagnation at midmantle depths. Employing support vector machine algorithms, we show that SML techniques can successfully predict the magnitude of mantle density anomalies and can also be used to characterize mantle flow patterns. The technique can be extended to more complex geodynamic problems in mantle dynamics by employing deep learning algorithms to put constraints on properties such as viscosity, elastic parameters, and the nature of thermal and chemical anomalies.
Kate, Rohit J.; Swartz, Ann M.; Welch, Whitney A.; Strath, Scott J.
2016-01-01
Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from the accelerometers. Several methods have been proposed that use these data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. The choice of machine learning technique was also found to be important. However, on the energy cost estimation task, the choices of features and machine learning technique were less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
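A sketch of the windowed statistical-feature representation such methods build on; the sampling rate, window length, and simulated trace are all assumptions. Each row of the returned matrix would be one training example for the downstream classifier or energy-cost regressor.

```python
import numpy as np

def window_features(signal, fs=30, window_s=10):
    """Split a 1-D accelerometer stream into fixed windows and compute
    simple per-window statistics (mean, spread, IQR, mean absolute delta)."""
    n = fs * window_s
    windows = signal[: len(signal) // n * n].reshape(-1, n)
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        np.percentile(windows, 75, axis=1) - np.percentile(windows, 25, axis=1),
        np.mean(np.abs(np.diff(windows, axis=1)), axis=1),
    ])

# Hypothetical vertical-axis trace: 30 s of quiet sitting, then 30 s of walking.
rng = np.random.default_rng(0)
t = np.arange(900) / 30
trace = np.concatenate([rng.normal(1.0, 0.02, 900),
                        1.0 + 0.4 * np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.05, 900)])
print(window_features(trace).round(3))
```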
Autoregressive statistical pattern recognition algorithms for damage detection in civil structures
NASA Astrophysics Data System (ADS)
Yao, Ruigen; Pakzad, Shamim N.
2012-08-01
Statistical pattern recognition has recently emerged as a promising set of complementary methods to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics to define the boundaries of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of the statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi-degree-of-freedom system are generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of the proposed algorithms.
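A compact sketch of the AR-residual control-chart idea: fit an autoregressive model to baseline (undamaged) response data, set Shewhart-style limits from the baseline residuals, and flag test data whose one-step prediction residuals exceed them. The simulated signals and all parameters are hypothetical; the paper's spectral and autocorrelation features are not reproduced.

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares AR(p) fit: x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([x[order - 1 - k: len(x) - 1 - k] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coeffs

def ar_residuals(x, coeffs):
    order = len(coeffs)
    X = np.column_stack([x[order - 1 - k: len(x) - 1 - k] for k in range(order)])
    return x[order:] - X @ coeffs

rng = np.random.default_rng(0)

def simulate(stiffness, n=2000):
    """AR(2)-like oscillator; a stiffness change shifts the AR structure."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 1.8 * stiffness * x[t - 1] - 0.9 * x[t - 2] + rng.normal(0, 0.1)
    return x

baseline, damaged = simulate(1.0), simulate(0.9)
coeffs = fit_ar(baseline, order=2)
ucl = 3 * ar_residuals(baseline, coeffs).std()      # 3-sigma control limit
for name, x in (("baseline", baseline), ("damaged", damaged)):
    rate = np.mean(np.abs(ar_residuals(x, coeffs)) > ucl)
    print(f"{name}: out-of-control rate = {rate:.3f}")
```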
Sung, Sheng-Feng; Chen, Kuanchin; Wu, Darren Philbert; Hung, Ling-Chien; Su, Yu-Hsiang; Hu, Ya-Han
2018-04-01
We aimed to reduce errors in determining eligibility for intravenous thrombolytic therapy (IVT) in stroke patients through use of an enhanced task-specific electronic medical record (EMR) interface powered by natural language processing (NLP) techniques. The information processing algorithm utilized MetaMap to extract medical concepts from the IVT eligibility criteria and expanded the concepts using the Unified Medical Language System Metathesaurus. Concepts identified from clinical notes by MetaMap were compared to those from the IVT eligibility criteria. The task-specific EMR interface displays IVT-relevant information by highlighting phrases that contain matched concepts. Clinical usability was assessed with clinicians staffing the acute stroke team by comparing user performance while using the task-specific and the current EMR interfaces. The algorithm identified IVT-relevant concepts with micro-averaged precisions, recalls, and F1 measures of 0.998, 0.812, and 0.895 at the phrase level and of 1, 0.972, and 0.986 at the document level. Users of the task-specific interface achieved a higher accuracy score than those using the current interface (91% versus 80%, p = 0.016) in assessing the IVT eligibility criteria. The completion time between the interfaces was statistically similar (2.46 min versus 1.70 min, p = 0.754). Although the information processing algorithm had room for improvement, the task-specific EMR interface significantly reduced errors in assessing the IVT eligibility criteria. The study findings provide evidence to support an NLP-enhanced EMR system to facilitate IVT decision-making by presenting meaningful and timely information to clinicians, thereby offering a new avenue for improvements in acute stroke care. Copyright © 2018 Elsevier B.V. All rights reserved.
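The phrase- and document-level scores above are micro-averaged, i.e., true/false positives and false negatives are pooled across units before the ratios are computed. A sketch of that arithmetic over hypothetical UMLS concept sets (the CUIs below are illustrative only):

```python
def micro_prf(pairs):
    """Micro-averaged precision/recall/F1 over (extracted, gold) set pairs."""
    tp = sum(len(e & g) for e, g in pairs)
    fp = sum(len(e - g) for e, g in pairs)
    fn = sum(len(g - e) for e, g in pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical concept sets per note: algorithm output vs. reference standard.
pairs = [({"C0038454", "C0042063"}, {"C0038454"}),
         ({"C0027051"}, {"C0027051", "C0042063"})]
print(micro_prf(pairs))
```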
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
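The first acceleration named above, replacing linear searches with binary versions, is easy to illustrate. A sketch in Python with a hypothetical sorted table of bin boundaries standing in for the lookup tables in ITS; both routines return identical bins, only the cost differs (O(n) vs. O(log n)).

```python
import bisect
import random

def linear_bin(boundaries, value):
    """Linear scan, as in the original subroutines."""
    for i, b in enumerate(boundaries):
        if value < b:
            return i
    return len(boundaries)

def binary_bin(boundaries, value):
    """Binary version producing identical results on a sorted table."""
    return bisect.bisect_right(boundaries, value)

random.seed(0)
boundaries = sorted(random.random() for _ in range(200))
queries = [random.random() for _ in range(2000)]
assert all(linear_bin(boundaries, q) == binary_bin(boundaries, q) for q in queries)
print("identical binning for all queries")
```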
Forensic detection of noise addition in digital images
NASA Astrophysics Data System (ADS)
Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin
2014-03-01
We propose a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, blind detection of noise addition has become imperative as well as beneficial for authenticating image content and recovering the image processing history, which is the goal of general forensics techniques. Specifically, special image blocks, including constant and strip ones, are used to construct the features for identifying noise addition manipulation. The influence of noising on the blockwise pixel value distribution is formulated and analyzed formally. A methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noise detection. Extensive experimental results demonstrate the efficacy of our proposed noise detector.
Sci-Thur PM - Colourful Interactions: Highlights 08: ARC TBI using Single-Step Optimized VMAT Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, Alana; Gordon, Deborah; Moore, Roseanne
Purpose: This work outlines a new TBI delivery technique to replace a lateral POP full-bolus technique. The new technique uses VMAT arc delivery, without bolus, treating the patient prone and supine. The benefits of the arc technique include an improved patient experience and safety, better dose conformity, better organ-at-risk sparing, decreased therapist time, and a reduction of therapist injuries. Methods: In this work we build on a technique developed by Jahnke et al. We use standard arc fields with gantry speeds corrected for varying distance to the patient, followed by a single-step VMAT optimization on a patient CT to reduce dose inhomogeneity and to reduce dose to the lungs (vs. blocks). To compare the arc TBI technique to our full-bolus technique, we produced plans on patient CTs for both techniques and evaluated several dosimetric parameters using an ANOVA test. Results and Conclusions: The arc technique is able to reduce both the hot areas in the body (D2% reduced from 122.2% to 111.8%, p < 0.01) and the lungs (mean lung dose reduced from 107.5% to 99.1%, p < 0.01), both statistically significant, while maintaining coverage (D98% = 97.8% vs. 94.6%, p = 0.313, not statistically significant). We developed a more patient- and therapist-friendly TBI treatment technique that utilizes single-step optimized VMAT plans. This technique was found to be dosimetrically equivalent to our previous lateral technique in terms of coverage and statistically superior in terms of reduced lung dose.
Garg, Bhavuk; Gupta, Manish; Singh, Menaka; Kalyanasundaram, Dinesh
2018-05-03
Spinal deformities are very challenging to treat and carry a high risk of neurological complications due to hardware placement during corrective surgery. Various techniques have been introduced to ensure safe and accurate placement of pedicle screws. Patient-specific screw guides with pre-drawn, pre-validated trajectories seem to be an attractive option. We have focused on developing 3D printing techniques for complex spinal deformities in India. This study also aimed to compare pedicle screw placement using 3D printing versus the freehand technique. This is a retrospective comparative clinical study in an academic institutional setting. A total of 20 patients were enrolled: 10 were operated on with the help of 3D printing (group 1) and 10 with the freehand technique (group 2). Group 1 included 6 congenital deformities, 3 adolescent idiopathic scoliosis (AIS), and one post-tubercular kyphosis; group 2 included 5 congenital deformities, 4 AIS, and one post-tubercular kyphosis. The primary outcome was screw violation; secondary outcomes were surgical time, blood loss, radiation exposure (number of fluoroscopic shots required), and complications. MIMICS v18.0 software was used for 3D reconstruction from the CT scan images of all patients, and 3-Matic software was used to create the drill guides. A Stratasys Mojo 3D printer with ABS P430 model material cartridge (a thermoplastic material) was used for printing the vertebral models and jigs. A two-sample test of proportions was used to compare correctly and wrongly placed pedicle screws between the 3D printing and freehand techniques, and a t-test with equal variance was used for surgical time and blood loss. This work was carried out in collaboration between the Orthopaedics Department, All India Institute of Medical Sciences (AIIMS), New Delhi and the Biomedical Engineering Department, Indian Institute of Technology (IIT) Delhi. The project received a grant of USD 60,000 from the Department of Biotechnology (DBT), Government of India, under the DBT Innovative Young Biotechnologist Award. No study-specific conflict-of-interest-associated biases are declared by the authors. No superior or inferior screw violation was observed in any patient in either group. We found a significant difference (p = 0.03) between the two groups regarding perfect screw placement, in favour of 3D printing. There were 13 grade 2 medial perforations in the freehand group and 3 in the 3D printing group; there was no grade 3 medial perforation in either group. There were 6 grade 2 lateral perforations in the freehand group and 7 in the 3D printing group, and 3 grade 3 lateral perforations in the freehand group versus 2 in the 3D printing group. Analysis showed statistically significantly more medial violations in the freehand group (p = 0.005). Surgical time was significantly shorter (p = 0.03) in the 3D printing group than in the freehand group. Mean blood loss was higher in the freehand group, but the difference was not statistically significant (p = 0.3). Fewer fluoroscopic shots were required in the 3D printing group than in the freehand group. There was no neurological deficit in any patient in either group. In our study of spinal deformities, a statistically significantly higher rate of accurate screw positioning and a higher number of inserted screws were achieved with 3D printing, owing to enhanced safety, particularly at apical levels. Spinal deformities are difficult to treat worldwide.
In India, these deformities are often neglected and present at a very late and much more deformed state, when their treatment becomes even more challenging. Developing these patient-specific drill templates will enable an average spine surgeon to treat these patients with much greater ease and safety. Copyright © 2018. Published by Elsevier Inc.
Resonance Raman spectroscopy for human cancer detection of key molecules with clinical diagnosis
NASA Astrophysics Data System (ADS)
Zhou, Yan; Liu, Cheng-hui; Li, Jiyou; Zhou, Lixin; He, Jingsheng; Sun, Yi; Pu, Yang; Zhu, Ke; Liu, Yulong; Li, Qingbo; Cheng, Gangge; Alfano, Robert R.
2013-03-01
Resonance Raman (RR) spectroscopy has the potential to reveal the differences between cancerous and normal breast and brain tissues in vitro. These differences, caused by changes in specific biomolecules in the tissues, were displayed as resonance-enhanced vibrational fingerprints. It was observed that reduced collagen content and changes in the number of methyl groups may indicate under-methylation of DNA in cancer cells. Statistical models (Bayesian classification, principal component analysis (PCA), and support vector machines (SVM)) were used to distinguish cancer from normal tissue based on the RR spectral data of breast and meninges tissues, yielding diagnostic sensitivities of 80% and 90.9%, and specificities of 100% and 100%, respectively. The results demonstrated that the RR spectroscopic technique could be applied as a clinical optical pathology tool with high accuracy and reliability.
Abbas, Ahmar S; Moseley, Douglas; Kassam, Zahra; Kim, Sun Mo; Cho, Charles
2013-05-06
Recently, volumetric-modulated arc therapy (VMAT) has demonstrated the ability to deliver radiation dose precisely and accurately with a shorter delivery time than conventional fixed-field intensity-modulated treatment (IMRT). We tested the hypothesis that the VMAT technique for the treatment of thoracic esophageal carcinoma can achieve superior or equivalent conformal dose coverage of a large thoracic esophageal planning target volume (PTV), with superior or equivalent sparing of organs-at-risk (OAR) doses, while reducing delivery time and monitor units (MUs), in comparison with conventional fixed-field IMRT plans. We also analyzed and compared other important treatment planning and delivery metrics for both techniques, including: 1) the integral dose and the volume receiving intermediate dose levels; 2) the use of 4D CT to determine the internal motion margin; and 3) evaluation of the dosimetry of every plan through patient-specific QA. These factors may impact the overall treatment plan quality and outcomes of the individual planning technique used. We also examined the significance of using two arcs versus a single-arc VMAT technique with respect to PTV coverage, OAR doses, monitor units, and delivery time. Thirteen patients, stage T2-T3 N0-N1 (TNM AJCC 7th edn.), PTV volume median 395 cc (range 281-601 cc), median age 69 years (range 53 to 85), were treated from July 2010 to June 2011 with a four-field (n = 4) or five-field (n = 9) step-and-shoot IMRT technique using a 6 MV beam to a prescribed dose of 50 Gy in 20 to 25 fractions. These patients were retrospectively replanned using a single arc (VMATI, 91 control points) and two arcs (VMATII, 182 control points). All treatment plans of the 13 study cases were evaluated using various dose-volume metrics, including PTV D99, PTV D95, PTV V47.5Gy (95%), PTV mean dose, Dmax, PTV dose conformity (Van't Riet conformation number, CN), mean lung dose, lung V20 and V5, liver V30, and Dmax to the spinal canal PRV (3-mm margin). Also examined were the total plan monitor units (MUs) and the beam delivery time. Equivalent target coverage was observed with both single- and two-arc VMAT plans. The comparison of VMATI with fixed-field IMRT demonstrated equivalent target coverage; no statistically significant differences were found in PTV D99 (p = 0.47), PTV mean (p = 0.12), or PTV D95 and PTV V47.5Gy (95%) (p = 0.38). However, Dmax in VMATI plans was significantly lower than in IMRT (p = 0.02). The Van't Riet dose conformation number (CN) was also statistically in favor of VMATI plans (p = 0.04). VMATI achieved lower lung V20 (p = 0.05), whereas lung V5 (p = 0.35) and mean lung dose (p = 0.62) were not significantly different. The other OARs, including spinal canal, liver, heart, and kidneys, showed no statistically significant differences between the two techniques. Treatment delivery time for VMATI plans was reduced by up to 55% (p = 5.8E-10) and MUs by up to 16% (p = 0.001). Integral dose was not statistically different between the two planning techniques (p = 0.99). There were no statistically significant differences in dose distribution between the two VMAT techniques (VMATI vs. VMATII); dose statistics for both were: PTV D99 (p = 0.76), PTV D95 (p = 0.95), mean PTV dose (p = 0.78), conformation number (CN) (p = 0.26), and MUs (p = 0.1). However, the treatment delivery time for VMATII increased significantly, by two-fold (p = 3.0E-11), compared to VMATI.
VMAT-based treatment planning is safe and deliverable for patients with thoracic esophageal cancer with similar planning goals when compared to standard IMRT. The key benefits of VMATI were the reduction in treatment delivery time and MUs and the improvement in dose conformality. We found no significant advantage of VMATII over single-arc VMATI for PTV coverage or OAR doses; however, we observed a significant increase in delivery time for VMATII compared to VMATI.
Multimodal fiber-probe spectroscopy for the diagnostics and classification of bladder tumors
NASA Astrophysics Data System (ADS)
Anand, Suresh; Cicchi, Riccardo; Fantechi, Riccardo; Gacci, Mauro; Nesi, Gabriella; Carini, Marco; Pavone, Francesco S.
2017-02-01
The gold standard for the detection of bladder cancer is white light cystoscopy, followed by an invasive biopsy and pathological examination. Tissue pathology is time consuming and often prone to sampling errors. Recently, optical spectroscopy techniques have emerged as promising tools for the detection of neoplasia. The specific goal of this study is to evaluate the application of combined autofluorescence (excited using 378 nm and 445 nm wavelengths) and diffuse reflectance spectroscopy to discriminate normal bladder tissue from tumors of different grades. The fluorescence spectrum at both excitation wavelengths showed increased spectral intensity in tumors with respect to normal tissues. Reflectance data indicated increased reflectance in the wavelength range 610 nm - 700 nm for different grades of tumors, compared to normal tissues. The spectral data were further analyzed using principal component analysis to evaluate the sensitivity and specificity for diagnosing tumor. The spectral differences observed between various grades of tumors provide a strong basis for future evaluation in a larger patient population to achieve statistical significance. This study indicates that a combined spectroscopic strategy, incorporating fluorescence and reflectance spectroscopy, could improve the capability for diagnosing bladder tumors as well as for differentiating tumors of different grades.
Methodology and Applications of Disease Biomarker Identification in Human Serum
Sahab, Ziad J.; Semaan, Suzan M.; Sang, Qing-Xiang Amy
2007-01-01
Biomarkers are biomolecules that serve as indicators of biological and pathological processes, or of physiological and pharmacological responses to a drug treatment. Because of the high abundance of albumin and the heterogeneity of plasma lipoproteins and glycoproteins, biomarkers are difficult to identify in human serum. Due to their clinical significance, the identification of disease biomarkers in serum holds great promise for personalized medicine, especially for disease diagnosis and prognosis. This review summarizes some common and emerging proteomics techniques utilized in the separation of serum samples and the identification of disease signatures. The practical application of each protein separation or identification technique is analyzed using specific examples. Biomarkers of cancers of the prostate, breast, ovary, and lung in human serum have been reviewed, as well as those of heart disease, arthritis, asthma, and cystic fibrosis. Despite the advancement of technology, few biomarkers have been approved by the Food and Drug Administration for disease diagnosis and prognosis, due to the complexity of the structure and function of protein biomarkers and the lack of high sensitivity, specificity, and reproducibility for those putative biomarkers. The combination of different types of technologies and statistical analysis may provide more effective methods to identify and validate new disease biomarkers in blood. PMID:19662190
NASA Astrophysics Data System (ADS)
Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.
New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for the determination of vildagliptin (VLG) and metformin (MET) in a binary mixture. A zero-order spectrophotometric method was the first method, used for determination of MET in the range of 2-12 μg mL-1 by measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, utilized for determination of MET at 247.4 nm in the range of 1-12 μg mL-1. A derivative-ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg mL-1 at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg mL-1, were ratio-subtraction and mean-centering spectrophotometric methods, respectively. All the results were statistically compared with reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.
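A sketch of the derivative-spectrophotometry step for MET: numerically differentiate the absorbance spectrum and build a calibration line at the analytical wavelength. The spectra below are synthetic Gaussian bands with a broad interferent, not measured data; only the 247.4 nm wavelength and concentration range are taken from the text.

```python
import numpy as np

wavelengths = np.linspace(220, 280, 601)            # nm
concs = np.array([2, 4, 6, 8, 10, 12])              # ug/mL
rng = np.random.default_rng(0)

def spectrum(c):
    band = c * 0.08 * np.exp(-((wavelengths - 237.6) / 8) ** 2)    # analyte
    interferent = 0.15 * np.exp(-((wavelengths - 255.0) / 20) ** 2)
    return band + interferent + rng.normal(0, 1e-4, wavelengths.size)

spectra = np.array([spectrum(c) for c in concs])

# In the first-derivative spectra the broad interferent contributes only a
# near-constant offset, so the derivative amplitude tracks the concentration.
deriv = np.gradient(spectra, wavelengths, axis=1)
signal = deriv[:, np.argmin(np.abs(wavelengths - 247.4))]
slope, intercept = np.polyfit(concs, signal, 1)
print(f"calibration: dA/dlambda = {slope:.5f} * C + {intercept:.5f}")
```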
Quantitative multi-modal NDT data analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heideklang, René; Shokouhi, Parisa
2014-02-18
A single NDT technique is often not adequate to provide assessments of the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics, and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR, and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.
NASA Astrophysics Data System (ADS)
Gallwas, Julia; Jalilova, Aydan; Ladurner, Roland; Kolben, Theresa Maria; Kolben, Thomas; Ditsch, Nina; Homann, Christian; Lankenau, Eva; Dannecker, Christian
2017-01-01
Optical coherence tomography (OCT) is a noninvasive high-resolution imaging technique that permits the detection of cancerous and precancerous lesions of the uterine cervix. The purpose of this study was to evaluate a new system that integrates an OCT device into a microscope. OCT images were taken from loop electrosurgical excision procedure (LEEP) specimens under microscopic guidance. The images were blinded with respect to their origin within the microscopic image and analyzed independently by two investigators using initially defined criteria, and later compared to the corresponding histology. Sensitivity and specificity were calculated with respect to the correct identification of high-grade squamous intraepithelial lesions (HSIL). Interinvestigator agreement was assessed using Cohen's kappa statistics. A total of 160 OCT images were obtained from 20 LEEP specimens. Sixty randomly chosen images were used to define reproducible criteria for evaluation. Assessment of the remaining 100 images showed a sensitivity of 88% (second investigator 84%) and a specificity of 69% (65%) in detecting HSIL. Surgical microscopy-guided OCT appears to be a promising technique for immediate assessment of microanatomical changes. In the gynecological setting, the combination of OCT with a colposcope may improve the detection of high-grade squamous intraepithelial lesions.
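For the agreement and accuracy statistics used here, a sketch with scikit-learn; the per-image calls below are hypothetical arrays constructed to roughly match the first investigator's reported sensitivity and specificity, not the study data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

histology = np.array([1] * 40 + [0] * 60)                 # 1 = HSIL
rater_a = np.array([1] * 35 + [0] * 5 + [1] * 19 + [0] * 41)
rater_b = np.array([1] * 34 + [0] * 6 + [1] * 21 + [0] * 39)

kappa = cohen_kappa_score(rater_a, rater_b)               # interinvestigator
tn, fp, fn, tp = confusion_matrix(histology, rater_a).ravel()
print(f"kappa = {kappa:.2f}, sensitivity = {tp / (tp + fn):.2f}, "
      f"specificity = {tn / (tn + fp):.2f}")
```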
Use of Satellite Data Assimilation to Infer Land Surface Thermal Inertia
NASA Technical Reports Server (NTRS)
Lapenta, William; McNider, Richard T.; Biazar, Arastoo; Suggs, Ron; Jedlovec, Gary; Dembek, Scott
2002-01-01
There are two important but observationally uncertain parameters in the grid-averaged surface energy budgets of mesoscale models: surface moisture availability and thermal heat capacity. A technique has been successfully developed for assimilating Geostationary Operational Environmental Satellite (GOES) skin temperature tendencies during the mid-morning time frame to improve the specification of surface moisture. In a new application of the technique, the use of satellite skin temperature tendencies in the early evening is explored to improve the specification of the surface thermal heat capacity. Together, these two satellite assimilation constraints have been shown to significantly improve the characterization of the surface energy budget of a mesoscale model on fine spatial scales. The GOES assimilation without the adjusted heat capacity was run operationally during the International H2O Project on a 12-km grid. This paper presents the results obtained when using both the moisture availability and heat capacity retrievals in concert. Preliminary results indicate that retrieved moisture availability alone improved the verification statistics of 2-meter temperature and dew point forecasts. Results from the 1.5-month study period using the bulk heat capacity will be presented at the meeting.
Nitti, Mariangela; Ciavolino, Enrico; Salvatore, Sergio; Gennaro, Alessandro
2010-09-01
The authors propose a method for analyzing the psychotherapy process: discourse flow analysis (DFA). DFA is a technique that represents the verbal interaction between therapist and patient as a discourse network, aimed at measuring the ability of the therapist-patient discourse to generate new meanings through time. DFA assumes that the main function of psychotherapy is to produce semiotic novelty. DFA is applied to the verbatim transcript of the psychotherapy. It defines the main meanings active within the therapeutic discourse by means of the combined use of text analysis and statistical techniques. Subsequently, it represents the dynamic interconnections among these meanings in terms of a "discursive network." The dynamic and structural indexes of the discursive network have been shown to provide a valid representation of the patient-therapist communicative flow as well as an estimation of its clinical quality. Finally, a neural network is designed specifically to identify patterns of functioning of the discursive network and to verify the clinical validity of these patterns in terms of their association with specific phases of the psychotherapy process. An application of DFA to a case of psychotherapy is provided to illustrate the method and the kinds of results it produces.
Comparison between two surgical techniques for root coverage with an acellular dermal matrix graft.
Andrade, Patrícia F; Felipe, Maria Emília M C; Novaes, Arthur B; Souza, Sérgio L S; Taba, Mário; Palioto, Daniela B; Grisi, Márcio F M
2008-03-01
The aim of this randomized, controlled, clinical study was to compare two surgical techniques with the acellular dermal matrix graft (ADMG) to evaluate which technique could provide better root coverage. Fifteen patients with bilateral Miller Class I gingival recession areas were selected. In each patient, one recession area was randomly assigned to the control group, while the contra-lateral recession area was assigned to the test group. The ADMG was used in both groups. The control group was treated with a broader flap and vertical-releasing incisions, and the test group was treated with the proposed surgical technique, without releasing incisions. The clinical parameters evaluated before the surgeries and after 12 months were: gingival recession height, probing depth, relative clinical attachment level and the width and thickness of keratinized tissue. There were no statistically significant differences between the groups for all parameters at baseline. After 12 months, there was a statistically significant reduction in recession height in both groups, and there was no statistically significant difference between the techniques with regard to root coverage. Both surgical techniques provided significant reduction in gingival recession height after 12 months, and similar results in relation to root coverage.
Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard
2004-01-01
Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking, and when estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage-basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data from the 40 stations to generate flow statistics that could be compared with the SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error across the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well but tended to underestimate peak flows. The USGS equations performed within reported error bounds but will require updating in the future as peak-flow data sets grow larger. Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25-percentile-year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches and estimated 2-year/24-hour rainfall intensity less than 3 inches.
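The drainage-basin area-ratio transfer has a simple closed form, Q_u = Q_g (A_u / A_g)^b; a sketch with hypothetical numbers (the exponent b = 1 and all values are illustrative, not the SWRCB's calibrated ones):

```python
def area_ratio_transfer(q_gaged, area_gaged, area_ungaged, exponent=1.0):
    """Transfer a flow statistic from a gaged to an ungaged site; regional
    studies often calibrate the exponent below 1."""
    return q_gaged * (area_ungaged / area_gaged) ** exponent

# Hypothetical: 2-year peak flow of 1200 cfs at a 150 mi^2 gaged basin,
# transferred to a nearby 95 mi^2 ungaged basin.
print(f"{area_ratio_transfer(1200.0, 150.0, 95.0):.0f} cfs")
```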
ERIC Educational Resources Information Center
Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi
2006-01-01
This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…
Arroyo-Hernández, M; Mellado-Romero, M A; Páramo-Díaz, P; Martín-López, C M; Cano-Egea, J M; Vilá Y Rico, J
2015-01-01
The purpose of this study is to analyze whether there is any difference between arthroscopic repair of full-thickness supraspinatus tears with the single-row technique versus the suture-bridge technique. We performed a retrospective study of 123 patients with full-thickness supraspinatus tears treated between January 2009 and January 2013 in our hospital. There were 60 single-row repairs and 63 suture-bridge repairs. The mean age was 62.9 years in the single-row group and 63.3 years in the suture-bridge group. There were more women than men in both groups (67%). All patients were evaluated using the Constant test. The mean Constant score was 76.7 in the suture-bridge group and 72.4 in the single-row group. We also performed a statistical analysis of each Constant item. Strength was higher in the suture-bridge group, with a statistically significant difference (p = 0.04). The range of movement was also greater in the suture-bridge group, but the difference was not statistically significant. The suture-bridge technique gave better clinical results than single-row repair, but the overall difference was not statistically significant (p = 0.298).
Characterization of agricultural land using singular value decomposition
NASA Astrophysics Data System (ADS)
Herries, Graham M.; Danaher, Sean; Selige, Thomas
1995-11-01
A method is defined and tested for the characterization of agricultural land from multi-spectral imagery, based on singular value decomposition (SVD) and key vector analysis. The SVD technique, which bears a close resemblance to multivariate statistical techniques, has previously been successfully applied to problems of signal extraction for marine data and forestry species classification. In this study the SVD technique is used as a classifier for agricultural regions, using airborne Daedalus ATM data with 1 m resolution. The specific region chosen is an experimental research farm in Bavaria, Germany. This farm has a large number of crops within a very small region and hence is not amenable to existing techniques. There are a number of other significant factors that render existing techniques, such as the maximum likelihood algorithm, less suitable for this area. These include highly varied terrain and a tessellated pattern of soil differences, which together cause large variations in the growth characteristics of the crops. The SVD technique is applied to this data set using a multi-stage classification approach, removing unwanted land-cover classes one step at a time. Typical classification accuracies for SVD are of the order of 85-100%. Preliminary results indicate that it is a fast and efficient classifier with the ability to differentiate between crop types such as wheat, rye, potatoes, and clover. The results of characterizing three sub-classes of winter wheat are also shown.
Advanced techniques and technology for efficient data storage, access, and transfer
NASA Technical Reports Server (NTRS)
Rice, Robert F.; Miller, Warner
1991-01-01
Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near-optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard-form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled-data applications such as imaging and high-quality audio, but additionally, the second-stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
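The two-stage structure (predictive preprocessing into a 'standard form' non-negative-integer source, then adaptive entropy coding) can be sketched compactly. The version below assumes a first-difference predictor and a fixed Rice parameter k, whereas the flight coders adapt k per block and support more predictors; it illustrates the structure, not the NASA standard itself.

```python
def preprocess(samples):
    """First-difference prediction, then zigzag-map signed residuals to
    non-negative integers (the 'standard form' source)."""
    out, prev = [], samples[0]        # reference sample sent verbatim
    for s in samples[1:]:
        d, prev = s - prev, s
        out.append(2 * d if d >= 0 else -2 * d - 1)
    return out

def rice_encode(values, k):
    """Golomb-Rice coding: unary quotient + k-bit binary remainder."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.append("1" * q + "0" + format(r, f"0{k}b"))
    return "".join(bits)

# Smoothly varying hypothetical samples compress well after prediction.
samples = [100, 101, 103, 102, 104, 107, 107, 106]
code = rice_encode(preprocess(samples), k=2)
print(f"{len(code)} bits vs {8 * (len(samples) - 1)} raw "
      "(plus one verbatim reference sample)")
```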
Assessment of the beryllium lymphocyte proliferation test using statistical process control.
Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M
2006-10-01
Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that were beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
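The SPC machinery referred to here is standard; a minimal sketch of an individuals control chart on a hypothetical longitudinal laboratory measure (stand-ins for positive-control stimulation indices, with sigma estimated from the moving range):

```python
import numpy as np

def shewhart_limits(baseline):
    """Individuals-chart limits: sigma is estimated from the mean moving
    range (divided by the d2 constant 1.128 for subgroups of size 2)."""
    sigma = np.abs(np.diff(baseline)).mean() / 1.128
    center = baseline.mean()
    return center - 3 * sigma, center, center + 3 * sigma

rng = np.random.default_rng(0)
baseline = rng.normal(8.0, 1.0, 30)                               # in-control period
observed = np.concatenate([baseline, rng.normal(11.0, 1.0, 10)])  # systematic shift

lcl, center, ucl = shewhart_limits(baseline)
flags = (observed < lcl) | (observed > ucl)
print(f"LCL = {lcl:.2f}, UCL = {ucl:.2f}, flagged points: {np.where(flags)[0]}")
```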
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
Independent Component Analysis (ICA) is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. Unlike other rotation techniques (RT), this rotation uses no localization criterion; only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems to be able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
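A minimal simulation of this PCA-versus-ICA contrast can be written with scikit-learn. In the hypothetical sketch below, two independent sources are linearly mixed; PCA decorrelates the mixture but still blends the sources, while FastICA (which whitens the data before rotating toward independence) recovers them up to ordering and scale.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two independent signals
X = sources @ np.array([[1.0, 0.5],                      # linear "physical" mixing
                        [0.6, 1.0]])

pcs = PCA(n_components=2).fit_transform(X)               # decorrelated, still mixed
ics = FastICA(n_components=2, random_state=0).fit_transform(X)
# ics approximates the original sources; pcs generally does not
```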
Allen, R W; Harnsberger, H R; Shelton, C; King, B; Bell, D A; Miller, R; Parkin, J L; Apfelbaum, R I; Parker, D
1996-08-01
To determine whether unenhanced high-resolution T2-weighted fast spin-echo MR imaging provides an acceptable and less expensive alternative to contrast-enhanced conventional T1-weighted spin-echo MR techniques in the diagnosis of acoustic schwannoma. We reviewed in a blinded fashion the records of 25 patients with pathologically documented acoustic schwannoma and of 25 control subjects, all of whom had undergone both enhanced conventional spin-echo MR imaging and unenhanced fast spin-echo MR imaging of the cerebellopontine angle/internal auditory canal region. The patients were imaged with the use of a quadrature head receiver coil for the conventional spin-echo sequences and dual 3-inch phased-array receiver coils for the fast spin-echo sequences. The size of the acoustic schwannomas ranged from 2 to 40 mm in maximum dimension. The mean maximum diameter was 12 mm, and 12 neoplasms were less than 10 mm in diameter. Acoustic schwannoma was correctly diagnosed on 98% of the fast spin-echo images and on 100% of the enhanced conventional spin-echo images. Statistical analysis of the data using the kappa coefficient demonstrated agreement beyond chance between these two imaging techniques for the diagnosis of acoustic schwannoma. There is no statistically significant difference in the sensitivity and specificity of unenhanced high-resolution fast spin-echo imaging and enhanced T1-weighted conventional spin-echo imaging in the detection of acoustic schwannoma. We believe that the unenhanced high-resolution fast spin-echo technique provides a cost-effective method for the diagnosis of acoustic schwannoma.
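The kappa agreement used above can be reproduced for any pair of readings in a few lines of scikit-learn; the paired reads below are hypothetical stand-ins, not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# 1 = schwannoma called present, 0 = absent (hypothetical paired reads)
fse = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # unenhanced fast spin-echo
cse = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1]   # enhanced conventional spin-echo
print(cohen_kappa_score(fse, cse))     # kappa > 0: agreement beyond chance
```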
Robotic radical cystectomy and intracorporeal urinary diversion: The USC technique.
Abreu, Andre Luis de Castro; Chopra, Sameer; Azhar, Raed A; Berger, Andre K; Miranda, Gus; Cai, Jie; Gill, Inderbir S; Aron, Monish; Desai, Mihir M
2014-07-01
Radical cystectomy is the gold-standard treatment for muscle-invasive and refractory non-muscle-invasive bladder cancer. We describe our technique for robotic radical cystectomy (RRC) and intracorporeal urinary diversion (ICUD), which replicates open surgical principles, and present our preliminary results. Specific descriptions of preoperative planning, surgical technique, and postoperative care are provided. Demographic, perioperative, and 30-day complications data were collected prospectively and retrospectively analyzed. Learning curve trends were analyzed individually for ileal conduits (IC) and neobladders (NB). SAS(®) Software Version 9.3 was used for statistical analyses, with statistical significance set at P < 0.05. Between July 2010 and September 2013, RRC and lymph node dissection with ICUD were performed in 103 consecutive patients (orthotopic NB = 46, IC = 57). All procedures were completed robotically, replicating the open surgical principles. The learning curve trends showed a significant reduction in hospital stay for both IC (11 vs. 6 days, P < 0.01) and orthotopic NB (13 vs. 7.5 days, P < 0.01) when comparing the first third of the cohort with the rest of the group. Overall median (range) operative time and estimated blood loss were 7 h (4.8-13) and 200 mL (50-1200), respectively. Within 30 days postoperatively, complications occurred in 61 (59%) patients, the majority being low grade (n = 43), and no patient died. Median (range) node yield was 36 (0-106), and 4 (3.9%) specimens had positive surgical margins. Robotic radical cystectomy with totally ICUD is safe and feasible. It can be performed using the established open surgical principles with encouraging perioperative outcomes.
Mammalian Cardiovascular Patterning as Determined by Hemodynamic Forces and Blood Vessel Genetics
NASA Astrophysics Data System (ADS)
Anderson, Gregory Arthur
Cardiovascular development is a process that involves the timing of multiple molecular events and numerous subtle three-dimensional conformational changes. Traditional developmental biology techniques have provided large quantities of information as to how these complex organ systems develop. However, the major drawback of the majority of current developmental biological imaging is that it is two-dimensional in nature. It is now well recognized that circulation of blood is required for normal patterning and remodeling of blood vessels. Normal blood vessel formation is dependent upon a complex network of signaling pathways, and genetic mutations in these pathways lead to impaired vascular development, heart failure, and lethality. As such, it is not surprising that mutant mice with aberrant cardiovascular patterning are so common, since normal development requires proper coordination between three systems: the heart, the blood, and the vasculature. This thesis describes the implementation of a three-dimensional imaging technique, optical projection tomography (OPT), in conjunction with a computer-based registration algorithm to statistically analyze developmental differences in groups of wild-type mouse embryos. Embryos that differ by only a few hours' gestational time are shown to have discernible developmental differences in blood vessel formation and heart development progression. This thesis describes how we analyzed mouse models of cardiovascular perturbation by OPT to detect morphological differences in embryonic development in both qualitative and quantitative ways. Both a blood-vessel-specific mutation and a cardiac-specific mutation were analyzed, providing evidence that developmental defects of these types can be quantified. Finally, we describe the implementation of OPT imaging to identify statistically significant phenotypes from three different mouse models of cardiovascular perturbation across a range of developmental time points. Image registration methods, combined with intensity- and deformation-based analyses, are described and utilized to fully characterize myosin light chain 2a (Mlc2a), delta-like ligand 4 (Dll4), and Endoglin (Eng) mutant mouse embryos. We show that Eng mutant embryos are statistically similar to the Mlc2a phenotype, confirming that these mouse mutants suffer from a primary cardiac developmental defect. Thus, a loss of hemodynamic force caused by defective pumping of the heart is the primary developmental defect affecting these mice.
The Effect of Student-Driven Projects on the Development of Statistical Reasoning
ERIC Educational Resources Information Center
Sovak, Melissa M.
2010-01-01
Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…
ERIC Educational Resources Information Center
DeMark, Sarah F.; Behrens, John T.
2004-01-01
Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs, where the time to failure of a specific experimental unit might be censored: right, left, interval, or partly interval censored (PIC). In this paper, analysis of this model was conducted based on a parametric Cox model using PIC data. Moreover, several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median showed better results with respect to the estimation of the survival function.
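The imputation step is easy to make concrete. The sketch below is a simplified stand-in for the paper's procedure: it fills each censoring interval with an exact event time under the midpoint, left, right, or random rule, after which the imputed times can be fed to an ordinary parametric survival fit.

```python
import numpy as np

def impute_interval(left, right, method="midpoint", rng=None):
    """Replace each censoring interval [left, right] with one event time."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    if method == "midpoint":
        return (left + right) / 2.0
    if method == "left":
        return left
    if method == "right":
        return right
    if method == "random":
        rng = rng or np.random.default_rng()
        return rng.uniform(left, right)   # uniform draw inside each interval
    raise ValueError(f"unknown method: {method}")

print(impute_interval([2, 5, 1], [4, 9, 6]))   # midpoints: 3.0, 7.0, 3.5
```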
Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.
In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining the internal structure of the 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing the electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower-dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and gain insight into the nature of bonding. (2) Projecting to a lower-dimensional subspace (~4-5 dimensions) using PCA or kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to existing descriptors of the electronic structure of molecules. These results can also be used immediately to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to indicate that one needs 4 or 5 components to account for most of the variance in the data; hence this 5D dataset does not necessarily lie on a well-defined, low-dimensional manifold. In terms of specific clustering techniques, K-means was generally useful in exploring the dataset. The partition around medoids (pam) technique produced the most definitive results for our data, showing distinctive patterns for both a sample of the complete data and time series. The gap statistic with the Tibshirani criterion did not provide any distinction across the two datasets. The gap statistic with the DandF criterion, model-based clustering, and hierarchical modeling simply failed to run on our datasets. Thankfully, the vanilla PCA technique was successful in handling our entire dataset. PCA revealed some interesting patterns in the scalar value distribution. Kernel PCA techniques (vanilladot, RBF, polynomial) and MDS failed to run on the entire dataset, or even a significant fraction of it, so we resorted to creating an explicit feature map followed by conventional PCA. Clustering using K-means and pam in the new basis set seems to produce promising results. Understanding the new basis set in the scientific context of the problem is challenging, and we are currently working to further examine and interpret the results.
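The core PCA-then-cluster workflow described above is straightforward to sketch. In the hypothetical fragment below, random numbers stand in for the (n_steps, 6) trajectory of the two electrons' coordinates; the real analysis would substitute the QMC walker positions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 6))       # stand-in for two electrons' (x, y, z) tracks

pca = PCA(n_components=5)
Z = pca.fit_transform(X)             # keep the ~4-5 components carrying most variance
print(pca.explained_variance_ratio_)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
# compare label patterns between singlet and triplet runs to probe bonding
```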
Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J
2016-05-01
Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic originally developed for logistic GLMCCs (TG) so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J2) statistics can be applied directly. In a simulation study, TG, HL, and J2 were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J2 were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J2. © 2015 John Wiley & Sons Ltd/London School of Economics.
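For readers unfamiliar with the grouped GOF statistics being compared, here is a minimal sketch of the Hosmer-Lemeshow statistic, assuming the usual decile-of-risk grouping; the TG and J2 variants differ mainly in how the group variances are formed.

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, g=10):
    """Grouped GOF test: compare observed vs expected events by risk decile."""
    order = np.argsort(p)
    y, p = np.asarray(y, float)[order], np.asarray(p, float)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(y)), g):
        obs, exp, n = y[idx].sum(), p[idx].sum(), len(idx)
        stat += (obs - exp) ** 2 / (exp * (1.0 - exp / n))
    return stat, chi2.sf(stat, g - 2)   # g - 2 df for a fitted logistic model
```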
Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)
NASA Astrophysics Data System (ADS)
Kasibhatla, P.
2004-12-01
In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
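A toy version of such an inversion fits in a few lines. The sketch below assumes a hypothetical linear transport operator H mapping three source strengths to twenty observations and samples the posterior with a random-walk Metropolis algorithm; swapping the Gaussian log-likelihood for a heavier-tailed one would require no new analytical derivation, which is the flexibility the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.normal(size=(20, 3))            # hypothetical transport operator
s_true = np.array([2.0, -1.0, 0.5])
y = H @ s_true + rng.normal(scale=0.3, size=20)

def log_post(s, sigma=0.3, tau=5.0):
    # Gaussian likelihood + Gaussian prior; any other log-density could
    # be substituted here without changing the sampler below
    return (-0.5 * np.sum((y - H @ s) ** 2) / sigma**2
            - 0.5 * np.sum(s**2) / tau**2)

s, samples = np.zeros(3), []
for _ in range(20000):
    prop = s + rng.normal(scale=0.1, size=3)      # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(s):
        s = prop                                  # accept
    samples.append(s)
posterior = np.array(samples)[5000:]              # drop burn-in, take moments
print(posterior.mean(axis=0))                     # close to s_true
```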
Single cell adhesion assay using computer controlled micropipette.
Salánki, Rita; Hős, Csaba; Orgovan, Norbert; Péter, Beatrix; Sándor, Noémi; Bajtay, Zsuzsa; Erdei, Anna; Horvath, Robert; Szabó, Bálint
2014-01-01
Cell adhesion is a fundamental phenomenon vital for all multicellular organisms. Recognition of and adhesion to specific macromolecules is a crucial task of leukocytes to initiate the immune response. To gain statistically reliable information on cell adhesion, large numbers of cells should be measured. However, direct measurement of the adhesion force of single cells is still challenging, and today's techniques typically have an extremely low throughput (5-10 cells per day). Here, we introduce a computer controlled micropipette mounted onto a normal inverted microscope for probing single cell interactions with specific macromolecules. We calculated the estimated hydrodynamic lifting force acting on target cells by numerical simulation of the flow at the micropipette tip. The adhesion force of surface-attached cells could be accurately probed by repeating the pick-up process with increasing vacuum applied in the pipette positioned above the cell under investigation. Using the introduced methodology, hundreds of cells adhered to specific macromolecules were measured one by one in a relatively short period of time (~30 min). We blocked nonspecific cell adhesion with the protein-repellent polymer PLL-g-PEG. We found that human primary monocytes are less adherent to fibrinogen than their in vitro differentiated descendants: macrophages and dendritic cells, the latter producing the highest average adhesion force. Validation of the introduced method was achieved by the hydrostatic step-pressure micropipette manipulation technique. Additionally, the result was reinforced in standard microfluidic shear stress channels. Nevertheless, the automated micropipette gave higher sensitivity and fewer side effects than the shear stress channel. Using our technique, the probed single cells can be easily picked up and further investigated by other techniques, a definite advantage of the computer controlled micropipette. Our experiments revealed the existence of a sub-population of strongly fibrinogen-adherent cells appearing in macrophages and highly represented in dendritic cells, but not observed in monocytes.
Mathematical Modelling for Patient Selection in Proton Therapy.
Mee, T; Kirkby, N F; Kirkby, K J
2018-05-01
Proton beam therapy (PBT) is still relatively new in cancer treatment and the clinical evidence base is relatively sparse. Mathematical modelling offers assistance when selecting patients for PBT and predicting the demand for service. Discrete event simulation, normal tissue complication probability, quality-adjusted life-years and Markov Chain models are all mathematical and statistical modelling techniques currently used but none is dominant. As new evidence and outcome data become available from PBT, comprehensive models will emerge that are less dependent on the specific technologies of radiotherapy planning and delivery. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Multidimensional competences of supply chain managers: an empirical study
NASA Astrophysics Data System (ADS)
Shou, Yongyi; Wang, Weijiao
2017-01-01
Supply chain manager competences have attracted increasing attention from both practitioners and scholars in recent years. This paper reports an exploratory study conducted to understand the dimensionality of supply chain manager competences. Online job advertisements for supply chain managers were collected as secondary data, since these advertisements reflect employers' real job requirements. We adopted the multidimensional scaling (MDS) technique to process and analyse the data. Five dimensions of supply chain manager competences are identified: generic skills, functional skills, supply chain management (SCM) qualifications and leadership, SCM expertise, and industry-specific and senior management skills. Statistical tests indicate that supply chain manager competence saliences vary across industries and regions.
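A compact illustration of the MDS step: given a hypothetical binary ad-by-skill matrix, the embedding places ads with similar requirement profiles near one another, and the recovered configuration is then interpreted in terms of competence dimensions. The data and column meanings are assumptions for illustration.

```python
import numpy as np
from sklearn.manifold import MDS

# rows: hypothetical job ads; columns: 0/1 skill requirements mined from text
ads = np.array([[1, 1, 0, 0, 1],
                [1, 0, 1, 0, 1],
                [0, 1, 1, 1, 0],
                [0, 0, 1, 1, 0]], dtype=float)

coords = MDS(n_components=2, random_state=0).fit_transform(ads)
# nearby rows in `coords` are ads demanding similar competence profiles
```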
Texture analysis of pulmonary parenchyma in normal and emphysematous lung
NASA Astrophysics Data System (ADS)
Uppaluri, Renuka; Mitsa, Theophano; Hoffman, Eric A.; McLennan, Geoffrey; Sonka, Milan
1996-04-01
Tissue characterization using texture analysis is gaining increasing importance in medical imaging. We present a completely automated method for discriminating between normal and emphysematous regions in CT images. This method involves extracting seventeen features based on statistical, hybrid, and fractal texture models. The best subset of features is derived from the training set using the divergence technique. A minimum distance classifier is used to classify the samples into one of the two classes--normal and emphysema. Sensitivity, specificity, and accuracy values achieved were 80% or greater in most cases, showing that texture analysis holds great promise in identifying emphysema.
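The minimum distance classifier named above is simple enough to sketch directly: each class is summarized by its mean feature vector, and a sample is assigned to the nearest class mean. The two-feature toy data are placeholders, not the study's seventeen texture features.

```python
import numpy as np

def min_distance_classify(X_train, y_train, X_test):
    """Nearest-class-mean (minimum distance) classifier on texture features."""
    classes = np.unique(y_train)
    means = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(X_test[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

# toy 2-feature example: class 0 = normal, class 1 = emphysema
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.8]])
y = np.array([0, 0, 1, 1])
print(min_distance_classify(X, y, np.array([[0.15, 0.15], [0.95, 0.9]])))  # [0 1]
```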
Far-from-Equilibrium Route to Superthermal Light in Bimodal Nanolasers
NASA Astrophysics Data System (ADS)
Marconi, Mathias; Javaloyes, Julien; Hamel, Philippe; Raineri, Fabrice; Levenson, Ariel; Yacomotti, Alejandro M.
2018-02-01
Microscale and nanoscale lasers inherently exhibit rich photon statistics due to complex light-matter interaction in a strong spontaneous emission noise background. It is well known that they may display superthermal fluctuations—photon superbunching—in specific situations due to either gain competition, leading to mode-switching instabilities, or carrier-carrier coupling in superradiant microcavities. Here we show a generic route to superbunching in bimodal nanolasers by preparing the system far from equilibrium through a parameter quench. We demonstrate, both theoretically and experimentally, that transient dynamics after a short-pump-pulse-induced quench leads to heavy-tailed superthermal statistics when projected onto the weak mode. We implement a simple experimental technique to access the probability density functions that further enables quantifying the distance from thermal equilibrium via the thermodynamic entropy. The universality of this mechanism relies on the far-from-equilibrium dynamical scenario, which can be mapped to a fast cooling process of a suspension of Brownian particles in a liquid. Our results open up new avenues to mold photon statistics in multimode optical systems and may constitute a test bed to investigate out-of-equilibrium thermodynamics using micro or nanocavity arrays.
Texture analysis with statistical methods for wheat ear extraction
NASA Astrophysics Data System (ADS)
Bakhouche, M.; Cointault, F.; Gouton, P.
2007-01-01
In the agronomic domain, the simplification of crop counting, necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first proposed that we detect wheat ears in images by image processing before counting them, which will give the first component of the yield. In this paper we compare different texture image segmentation techniques based on feature extraction by first- and higher-order statistical methods, which have been applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. The K-means algorithm is implemented before the choice of a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequential transforms and specific filtering.
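A sketch of the feature-extraction step, assuming a recent scikit-image (where the co-occurrence helpers are named graycomatrix/graycoprops): first-order statistics plus a few second-order co-occurrence properties are computed per patch, ready for K-means clustering of pixels or patches. The patch here is random stand-in data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(patch):
    """First- and second-order texture statistics for one uint8 image patch."""
    first = [patch.mean(), patch.std()]
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    second = [float(graycoprops(glcm, prop).mean())
              for prop in ("contrast", "homogeneity", "energy")]
    return np.array(first + second)

patch = np.random.default_rng(0).integers(0, 256, (32, 32)).astype(np.uint8)
print(texture_features(patch))   # 5-dimensional feature vector for clustering
```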
A novel data-driven learning method for radar target detection in nonstationary environments
Akcakaya, Murat; Nehorai, Arye; Sen, Satyabrata
2016-04-12
Most existing radar algorithms are developed under the assumption that the environment (clutter) is stationary. However, in practice, the characteristics of the clutter can vary enormously depending on the radar-operational scenarios. If unaccounted for, these nonstationary variabilities may drastically hinder the radar performance. Therefore, to overcome such shortcomings, we develop a data-driven method for target detection in nonstationary environments. In this method, the radar dynamically detects changes in the environment and adapts to these changes by learning the new statistical characteristics of the environment and by intelligently updating its statistical detection algorithm. Specifically, we employ drift detection algorithms to detect changes in the environment, and incremental learning, particularly learning under concept drift algorithms, to learn the new statistical characteristics of the environment from the new radar data that become available in batches over a period of time. The newly learned environment characteristics are then integrated into the detection algorithm. Furthermore, we use Monte Carlo simulations to demonstrate that the developed method provides a significant improvement in detection performance compared with detection techniques that are not aware of the environmental changes.
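The paper's specific drift detectors are not named here, so as a stand-in the following sketch uses the Page-Hinkley test, one standard drift detection algorithm: it accumulates deviations of a monitored clutter statistic from its running mean and raises an alarm when the cumulative sum rises too far above its historical minimum.

```python
def page_hinkley(stream, delta=0.05, threshold=10.0):
    """Flag an upward shift in the mean of a monitored clutter statistic."""
    mean, cum, cum_min = 0.0, 0.0, 0.0
    for t, x in enumerate(stream, start=1):
        mean += (x - mean) / t              # running mean of the stream
        cum += x - mean - delta             # cumulative deviation (drift margin)
        cum_min = min(cum_min, cum)
        if cum - cum_min > threshold:
            return t                        # sample index where drift is declared
    return None

# hypothetical clutter power readings with a level shift halfway through
readings = [1.0] * 200 + [2.5] * 200
print(page_hinkley(readings))               # alarm shortly after sample 200
```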
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME), and the separation, identification, and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate, and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique, and it has been widely applied to water, sediment, soil, and aerosol samples. Whole cell fatty acid analysis, a related but not identical technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
ARL-TR-8272, US Army Research Laboratory, January 2018.
The purpose of this memorandum is to inform recipients of concerns regarding Army Corps of Engineers statistical techniques, provide a list of installations and FWS where SiteStat/GridStats (SS/GS) have been used, and to provide direction on communicating with the public on the use of these 'tools' by USACE.
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure.
The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.
Mather, L E; Austin, K L
1983-01-01
Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
Barreto, Rafael E; Narváez, Javier; Sepúlveda, Natalia A; Velásquez, Fabián C; Díaz, Sandra C; López, Myriam Consuelo; Reyes, Patricia; Moncada, Ligia I
2017-09-01
Public health programs for the control of soil-transmitted helminthiases require valid diagnostic tests for surveillance and evaluation of parasite control. However, there is currently no agreement about which test should be used as a gold standard for the diagnosis of hookworm infection. Still, in the presence of concurrent data for multiple tests it is possible to use statistical models to estimate measures of test performance and prevalence. The aim of this study was to estimate the diagnostic accuracy of five parallel tests (direct microscopic examination, Kato-Katz, Harada-Mori, modified Ritchie-Frick, and culture in agar plate) to detect hookworm infections in a sample of school-aged children from a rural area in Colombia. We used both a frequentist approach and Bayesian latent class models to estimate the sensitivity and specificity of five tests for hookworm detection, and to estimate the prevalence of hookworm infection in the absence of a gold standard. The Kato-Katz and agar plate methods had an overall agreement of 95% and a kappa coefficient of 0.76. Different models estimated a sensitivity between 76% and 92% for the agar plate technique, and 52% to 87% for the Kato-Katz technique. The other tests had lower sensitivity. All tests had specificity between 95% and 98%. The prevalence estimated by the Kato-Katz and agar plate methods for different subpopulations varied between 10% and 14%, and was consistent with the prevalence estimated from the combination of all tests. The Harada-Mori, Ritchie-Frick, and direct examination techniques resulted in lower and disparate prevalence estimates. Bayesian approaches assuming imperfect specificity resulted in lower prevalence estimates than the frequentist approach. Copyright © 2017 Elsevier B.V. All rights reserved.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
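As a concrete illustration of the metamodeling idea, the sketch below fits a kriging-style Gaussian process (scikit-learn's GaussianProcessRegressor) to a handful of runs of a stand-in 'expensive' analysis code; since the code is deterministic, no noise term is modeled and the metamodel interpolates the observed runs. The function and design are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_code(x):               # stand-in for a detailed analysis code
    return np.sin(3 * x) + 0.5 * x ** 2

X = np.linspace(0.0, 2.0, 12).reshape(-1, 1)   # small designed experiment
y = expensive_code(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X, y)                         # deterministic responses: interpolating fit

y_hat, sd = gp.predict(np.array([[1.3]]), return_std=True)
# y_hat approximates expensive_code(1.3) at negligible cost; sd gauges trust
```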
Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments
NASA Technical Reports Server (NTRS)
Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew
2011-01-01
The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and a statistically based approach developed in-house at MSFC. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: (1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes, and (2) characterization of the variability of the peak response's frequency over the test duration. This characterization process assists in evaluating the discreteness of a signal as well as the stability of the chamber response. Broadband stability was assessed using a running root-mean-square evaluation. These techniques were also employed, in a comparative analysis, on available Fastrac data, and those results are presented here.
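The broadband assessment reduces to a windowed RMS computation; a minimal version, assuming a uniform averaging window and synthetic pressure data, is sketched below.

```python
import numpy as np

def running_rms(p, fs, window_s=0.1):
    """Running RMS of a dynamic pressure trace p sampled at fs (Hz)."""
    n = int(window_s * fs)
    kernel = np.ones(n) / n                    # uniform averaging window
    return np.sqrt(np.convolve(p ** 2, kernel, mode="valid"))

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
p = np.sin(2 * np.pi * 400 * t) * (1 + (t > 0.5))   # amplitude step at t = 0.5 s
rms = running_rms(p, fs)                             # step appears as rising RMS
```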
Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan
2017-12-01
Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
Lightfoot, Emma; O’Connell, Tamsin C.
2016-01-01
Oxygen isotope analysis of archaeological skeletal remains is an increasingly popular tool to study past human migrations. It is based on the assumption that human body chemistry preserves the δ18O of precipitation in such a way as to be a useful technique for identifying migrants and, potentially, their homelands. In this study, the first such global survey, we draw on published human tooth enamel and bone bioapatite data to explore the validity of using oxygen isotope analyses to identify migrants in the archaeological record. We use human δ18O results to show that there are large variations in human oxygen isotope values within a population sample. This may relate to physiological factors influencing the preservation of the primary isotope signal, or to human activities (such as brewing, boiling, stewing, differential access to water sources, and so on) causing variation in ingested water and food isotope values. We compare the number of outliers identified using various statistical methods. We determine that the most appropriate method for identifying migrants is dependent on the data but is likely to be the IQR or median absolute deviation from the median under most archaeological circumstances. Finally, through a spatial assessment of the dataset, we show that the degree of overlap in human isotope values from different locations across Europe is such that identifying individuals' homelands on the basis of oxygen isotope analysis alone is not possible for the regions analysed to date. Oxygen isotope analysis is a valid method for identifying first-generation migrants from an archaeological site when used appropriately; however, it is difficult to identify migrants using statistical methods for a sample size of less than c. 25 individuals. In the absence of local previous analyses, each sample should be treated as an individual dataset and statistical techniques can be used to identify migrants, but in most cases pinpointing a specific homeland should not be attempted.
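The outlier rules the authors compare are easy to state in code. The sketch below flags values falling outside either the Tukey IQR fences or a median-absolute-deviation band; the thresholds (1.5 IQR, 3 scaled MADs) are conventional defaults assumed here, and the per-mil values are hypothetical.

```python
import numpy as np

def isotope_outliers(d18o, method="mad"):
    """Flag possible migrants as outliers in a site's tooth-enamel d18O values."""
    x = np.asarray(d18o, dtype=float)
    if method == "iqr":
        q1, q3 = np.percentile(x, [25, 75])
        lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    else:   # median absolute deviation from the median
        med = np.median(x)
        mad = 1.4826 * np.median(np.abs(x - med))   # scaled to sigma for normals
        lo, hi = med - 3 * mad, med + 3 * mad
    return (x < lo) | (x > hi)

site = [26.1, 26.4, 25.9, 26.2, 26.0, 27.9, 26.3]   # hypothetical per-mil values
print(isotope_outliers(site))        # the 27.9 individual is flagged
```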
The application of latent curve analysis to testing developmental theories in intervention research.
Curran, P J; Muthén, B O
1999-08-01
The effectiveness of a prevention or intervention program has traditionally been assessed using time-specific comparisons of mean levels between the treatment and the control groups. However, many times the behavior targeted by the intervention is naturally developing over time, and the goal of the treatment is to alter this natural or normative developmental trajectory. Examining time-specific mean levels can be both limiting and potentially misleading when the behavior of interest is developing systematically over time. It is argued here that there are both theoretical and statistical advantages associated with recasting intervention treatment effects in terms of normative and altered developmental trajectories. The recently developed technique of latent curve (LC) analysis is reviewed and extended to a true experimental design setting in which subjects are randomly assigned to a treatment intervention or a control condition. LC models are applied to both artificially generated and real intervention data sets to evaluate the efficacy of an intervention program. Not only do the LC models provide a more comprehensive understanding of the treatment and control group developmental processes compared to more traditional fixed-effects models, but LC models have greater statistical power to detect a given treatment effect. Finally, the LC models are modified to allow for the computation of specific power estimates under a variety of conditions and assumptions that can provide much needed information for the planning and design of more powerful but cost-efficient intervention programs for the future.
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.
Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-08-01
IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the common cancers in women and in the majority of cases is diagnosed at a later stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. Introduction of the IOTA rules has provided some consistency in defining morphological features of ovarian masses through a standardized examination technique. To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and to establish their use as a tool in early diagnosis of ovarian malignancy. A hospital-based prospective case-control study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. Collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, the 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). The sensitivity for the detection of malignancy in cases where the IOTA simple rules were applicable was 91.66% and the specificity was 84.84%. Accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80%, respectively. Agreement beyond chance was found between USG and histopathological diagnosis, with a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train and use.
NASA Astrophysics Data System (ADS)
Asal, F. F.
2012-07-01
Digital elevation data obtained from different engineering surveying techniques are utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. These data usually come in discrete point format, making it necessary to utilize an interpolation approach for the creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods and neglects visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers in order to examine its potential for assessing the effects of variation of the IDW power on the quality of the DEMs. Real elevation data were collected in the field using a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques for this analysis. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four does not produce noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, where the value of the standard deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75%, respectively. This reflects a decrease in DEM smoothing with increasing IDW power. The study also showed that visual methods supported by statistical analysis have good potential for DEM quality assessment.
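IDW itself is a one-formula interpolator, so the effect of the power parameter is easy to reproduce. In the sketch below, with hypothetical spot heights, larger powers weight the nearest points more heavily and give a less smoothed surface, which is the behavior the study observed up to a power of about four.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse distance weighted interpolation of spot heights onto query points."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)          # guard against zero distance at data points
    w = d ** -power                   # higher power: more local, less smoothing
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([10.0, 12.0, 11.0, 15.0])
grid = np.array([[0.5, 0.5], [0.9, 0.9]])
for p in (1, 2, 4, 10):
    print(p, idw(pts, z, grid, power=p))   # surface hugs the data as p grows
```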
Local indicators of geocoding accuracy (LIGA): theory and application
Jacquez, Geoffrey M; Rommel, Robert
2009-01-01
Background Although sources of positional error in geographic locations (e.g. geocoding error) used for describing and modeling spatial patterns are widely acknowledged, research on how such error impacts the statistical results has been limited. In this paper we explore techniques for quantifying the perturbability of spatial weights to different specifications of positional error. Results We find that a family of curves describes the relationship between perturbability and positional error, and use these curves to evaluate the sensitivity of alternative spatial weight specifications to positional error both globally (when all locations are considered simultaneously) and locally (to identify those locations that would benefit most from increased geocoding accuracy). We evaluate the approach in simulation studies, and demonstrate it using a case-control study of bladder cancer in south-eastern Michigan. Conclusion Three results are significant. First, the shape of the probability distributions of positional error (e.g. circular, elliptical, cross) has little impact on the perturbability of spatial weights, which instead depends on the mean positional error. Second, our methodology allows researchers to evaluate the sensitivity of spatial statistics to positional accuracy for specific geographies. This has substantial practical implications since it makes possible routine sensitivity analysis of spatial statistics to positional error arising in geocoded street addresses, global positioning systems, LIDAR and other geographic data. Third, those locations with high perturbability (most sensitive to positional error) and high leverage (that contribute the most to the spatial weight being considered) will benefit the most from increased positional accuracy. These are rapidly identified using a new visualization tool we call the LIGA scatterplot. Herein lies a paradox for spatial analysis: for a given level of positional error, increasing sample density to more accurately follow the underlying population distribution increases perturbability and introduces error into the spatial weights matrix. In some studies positional error may not impact the statistical results, and in others it might invalidate the results. We therefore must understand the relationships between positional accuracy and the perturbability of the spatial weights in order to have confidence in a study's results.
Chen, Chien-Chou; Teng, Yung-Chu; Lin, Bo-Cheng; Fan, I-Chun; Chan, Ta-Chien
2016-11-25
Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space-time scan statistics by SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and p value <0.05 was identified as a dengue-epidemic area. Assuming an ongoing transmission might continuously spread for two consecutive weeks, we estimated the sensitivity and specificity for detecting outbreaks by comparing the scan-based classification (dengue-epidemic vs. dengue-free village) with the true cumulative case numbers from the TCDC's surveillance statistics. Among the 1648 villages in Tainan and Kaohsiung, the overall sensitivity for detecting outbreaks increases as case numbers grow in a total of 92 weekly simulations. The specificity for detecting outbreaks behaves inversely, compared to the sensitivity. On average, the mean sensitivity and specificity of 2-week hot spot detection were 0.615 and 0.891 respectively (p value <0.001) for the covariate adjustment model, as the maximum spatial and temporal windows were specified as 50% of the total population at risk and 28 days. Dengue-epidemic villages were visualized and explored in an interactive map. We designed an online analytical tool for front-line public health workers to prospectively detect ongoing dengue fever transmission on a weekly basis at the village level by using the routine surveillance data.
Chemokine Prostate Cancer Biomarkers — EDRN Public Portal
STUDY DESIGN 1. The need for pre-validation studies. Preliminary data from our laboratory demonstrate a potential utility for CXCL5 and CXCL12 as biomarkers to distinguish between patients at high risk versus low risk for harboring prostate malignancies. However, this pilot and feasibility study utilized a very small sample size of 51 patients, which limited its ability to adequately assess certain technical aspects of the ELISA technique and statistical aspects of the markers' performance. We therefore propose studies designed to assess the robustness (Specific Aim 1) and predictive value (Specific Aim 2) of these markers in a larger study population. 2. ELISA Assays. Serum, plasma, or urine chemokine levels are assessed using 50 µl of frozen specimen per sandwich ELISA in duplicate, using the appropriate commercially available capture antibodies, detection antibodies, and standard ELISA reagents (R&D Systems), as we have described previously (15, 17, 18). Measures within each patient group are regarded as biological replicates and permit statistical comparisons between groups. For all ELISAs, a standard curve is generated with the provided standards and utilized to calculate the quantity of chemokine in the sample tested. These assays provide measures of protein concentration with excellent reproducibility, with replicate measures characterized by standard deviations from the mean on the order of <3%.
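The standard curve step can be sketched with a four-parameter logistic (4PL) fit, a shape commonly used for sandwich ELISA calibration; the concentrations and optical densities below are hypothetical, not kit values.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, a, d, c, b):
    """4PL curve: a = response at zero dose, d = response at infinite dose,
    c = inflection concentration (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (conc / c) ** b)

std_conc = np.array([31.25, 62.5, 125.0, 250.0, 500.0, 1000.0, 2000.0])  # pg/mL
std_od = np.array([0.08, 0.15, 0.29, 0.55, 1.01, 1.62, 2.10])            # hypothetical

params, _ = curve_fit(four_pl, std_conc, std_od,
                      p0=[0.05, 2.5, 400.0, 1.0], maxfev=10000)

def od_to_conc(od, p):
    """Invert the fitted curve to read a sample concentration off its OD."""
    a, d, c, b = p
    return c * ((a - d) / (od - d) - 1.0) ** (1.0 / b)

print(od_to_conc(0.80, params))   # interpolated chemokine concentration, pg/mL
```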
Srivastava, Shweta; Vatsalya, Vatsalya; Arora, Ashoo; Arora, Kashmiri L; Karch, Robert
2012-03-22
Diarrhoea is one of the leading causes of morbidity and mortality in developing countries in Africa and South Asia, such as India. The prevalence of diarrhoeal diseases in these countries is higher than in the developed Western world and has largely been associated with socio-economic and sanitary conditions. However, the available data have not been sufficiently evaluated to study the role of other factors, such as healthcare development, population density, sex and regional influences, on diarrhoeal prevalence patterns. This study was performed to understand the relationship of diarrhoeal prevalence with specific measures, namely healthcare services development, demographics, population density, socio-economic conditions, sex, and regional prevalence patterns in India. Data from annual national health reports and other epidemiological studies were included and statistically analyzed. Our results demonstrate significant correlations of the disease prevalence pattern with certain measures, such as healthcare centres, population growth rate, sex and region-specific morbidity. Available information on sanitation (water supply and toilet availability) and socio-economic conditions (poverty and literacy measures) could only be associated as trends approaching significance. This study can be valuable for devising appropriate strategies focused on important measures such as healthcare resources, population growth and regional differences, in order to evaluate prevalence patterns and improve management of diarrhoea locally and globally.
Review of a statistical specification for pugmill mixed material.
DOT National Transportation Integrated Search
1974-01-01
Since the spring of 1964, the Virginia Highway Research Council has been developing and implementing statistical specifications for highway operations. One of these specifications is used for the acceptance of pugmill mixed materials. The purpose of ...
Don, Elena; Farafonova, Olga; Pokhil, Suzanna; Barykina, Darya; Nikiforova, Marina; Shulga, Darya; Borshcheva, Alena; Tarasov, Sergey; Ermolaeva, Tatyana; Epstein, Oleg
2016-01-01
In preliminary ELISA studies in which released-active forms (RAF) of antibodies (Abs) to interferon-gamma (IFNg) were added to the antigen-antibody system, a statistically significant difference in absorbance signals was observed in their presence in comparison to placebo. A piezoelectric immunosensor assay was developed to support these data and investigate the effects of RAF Abs to IFNg on the specific interaction between Abs to IFNg and IFNg. The experimental conditions were designed, and the optimal electrode coating, detection conditions and suitable chaotropic agents for electrode regeneration were selected. The developed technique was found to provide high repeatability, intermediate precision and specificity. The difference between the analytical signals of RAF Ab samples and those of the placebo was up to 50.8%, whereas the difference between non-specific controls and the placebo was within 5%–6%. Thus, the piezoelectric immunosensor, like ELISA, has the potential to be used for detecting the effects of RAF Abs to IFNg on the antigen-antibody interaction, which might be the result of RAF's ability to modify the affinity of IFNg to specific/related Abs. PMID:26791304
Ali, Abid; Shakil-Ur-Rehman, Syed; Sibtain, Fozia
2014-07-01
To determine the efficacy of Sustained Natural Apophyseal Glides (SNAGs) with and without an Isometric Exercise Training Program (IETP) in Non-specific Neck Pain (NSNP). Methods: This randomized controlled trial of one year's duration was conducted at the out-patient department of Physiotherapy and Rehabilitation, Khyber Teaching Hospital (KTH), Peshawar, Pakistan, from July 2012 to June 2013. A sample of 102 patients with NSNP was selected through a simple random sampling technique and placed into two groups. The SNAGs manual physical therapy technique with IETP was applied to 51 patients in group A, and the SNAGs technique alone was applied to 51 patients in group B. The duration of intervention was 6 weeks, at 4 sessions per week. The Neck Disability Index (NDI) and Visual Analog Scale (VAS) for neck pain were the assessment tools used for all patients before intervention and at the completion of the 6-week program. The data of all 102 patients were analyzed with SPSS-20, and statistical tests were applied at a 95% level of significance to determine the efficacy of the two treatment interventions and to compare them with each other. The patients in group A, treated with SNAGs followed by IETP for 6 weeks, demonstrated more improvement in pain and physical activity as assessed by VAS (p=0.013) and NDI (p=0.003), compared to the patients treated with SNAGs alone (VAS p=0.047; NDI p=0.164). In group A the NDI score improved from 40 to 15 and VAS from 7 to 4, while in group B the NDI score improved from 42 to 30 and VAS from 7 to 4. In patients with non-specific neck pain, SNAGs manual physical therapy followed by IETP was more effective in reducing pain and enhancing function than SNAGs manual physical therapy alone.
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at minimum 2-year follow-up was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperative status, arthroscopic rotator cuff repair using SR, DR, or TOE techniques yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
NASA Astrophysics Data System (ADS)
Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.
2016-12-01
Data on the variants of comparative quality analysis of power installations (benchmarking) applied in the power industry are systematized. It is shown that the most efficient implementation of the benchmarking technique is the analysis of statistical distributions of the indicators within a composed homogeneous group of uniform power installations. Building on this approach, a benchmarking technique is developed that is aimed at revealing the available reserves for improving the reliability and heat-efficiency indicators of power installations at thermal power plants. The technique permits reliable comparison of the quality of power installations within their homogeneous group, limited as it is in size, and adoption of adequate decisions on improving particular technical characteristics of a given power installation. The technique structures the list of comparison indicators and the internal factors affecting them, represented according to the requirements of the sectoral standards and taking into account price-formation characteristics in the Russian power industry. This structuring ensures traceability of the reasons for deviation of the internal influencing factors from the specified values. The starting point for further detailed analysis of the lag of a given power installation's indicators behind best practice, expressed in specific monetary terms, is the positioning of this installation on the distribution of the key indicator, which is a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method from the actual distributions of the comparison indicators: specific lost profit due to short supply of electric energy and power, specific cost of losses due to non-optimal expenditures on repairs, and specific cost of excess fuel-equivalent consumption. Quality-loss indicators are developed to facilitate the analysis of the benchmarking results, representing the quality loss of a given power installation as the difference between the actual value of the key indicator (or a comparison indicator) and the best quartile of the existing distribution. The uncertainty of the obtained quality-loss values was evaluated by transforming the standard uncertainties of the input values into expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of T-250 extraction power-generating units and power installations of thermal power plants with a main steam pressure of 130 atm.
MANCOVA for one way classification with homogeneity of regression coefficient vectors
NASA Astrophysics Data System (ADS)
Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.
2017-11-01
The MANOVA and MANCOVA are extensions of the univariate ANOVA and ANCOVA techniques to multidimensional, or vector-valued, observations. The assumption of a Gaussian distribution is replaced with a multivariate Gaussian distribution for the observation vectors and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences between groups after adjusting the dependent variables for the covariates. When random assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting the dependent variables as if all subjects had scored the same on the covariates. In this research article, the MANCOVA technique is extended to a larger number of covariates, and the homogeneity of the regression coefficient vectors is also tested.
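A minimal MANCOVA sketch on synthetic data, using statsmodels (an assumed tool, not the article's implementation): two dependent variables, one three-level factor, and two covariates.

```python
# Synthetic MANCOVA sketch: two dependent variables, one three-level factor,
# two covariates. statsmodels is an assumed tool here, not the article's.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 90
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], n // 3),
    "cov1": rng.normal(size=n),
    "cov2": rng.normal(size=n),
})
shift = df["group"].map({"A": 0.0, "B": 0.5, "C": 1.0})
df["y1"] = 0.8 * df["cov1"] + shift + rng.normal(size=n)
df["y2"] = 0.5 * df["cov2"] + shift + rng.normal(size=n)

# Covariates on the right-hand side turn the MANOVA into a MANCOVA
fit = MANOVA.from_formula("y1 + y2 ~ group + cov1 + cov2", data=df)
print(fit.mv_test())  # Wilks' lambda, Pillai's trace, etc., per term
```

Homogeneity of the regression coefficient vectors could then be examined by adding group:cov1 and group:cov2 interaction terms to the same formula and testing whether they are jointly negligible.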
Deriving health utilities from the MacNew Heart Disease Quality of Life Questionnaire.
Chen, Gang; McKie, John; Khan, Munir A; Richardson, Jeff R
2015-10-01
Quality of life is included in the economic evaluation of health services by measuring the preference for health states, i.e. health state utilities. However, most intervention studies include a disease-specific, not a utility, instrument. Consequently, there has been increasing use of statistical mapping algorithms which permit utilities to be estimated from a disease-specific instrument. The present paper provides such algorithms between the MacNew Heart Disease Quality of Life Questionnaire (MacNew) instrument and six multi-attribute utility (MAU) instruments: the Euroqol (EQ-5D), the Short Form 6D (SF-6D), the Health Utilities Index (HUI) 3, the Quality of Wellbeing (QWB), the 15D (15 Dimension) and the Assessment of Quality of Life (AQoL-8D). Heart disease patients and members of the healthy public were recruited from six countries. Non-parametric rank tests were used to compare subgroup utilities and MacNew scores. Mapping algorithms were estimated using three separate statistical techniques. The mapping algorithms achieved a high degree of precision. Based on the mean absolute error and the intraclass correlation, the preferred mapping is MacNew into SF-6D or 15D. Using the R-squared statistic, the preferred mapping is MacNew into AQoL-8D. The algorithms reported in this paper enable MacNew data to be mapped into utilities predicted from any of six instruments. This permits studies which have included the MacNew to be used in cost utility analyses which, in turn, allows the comparison of services with interventions across the health system. © The European Society of Cardiology 2014.
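As an illustration of what such a mapping (crosswalk) algorithm can look like, a sketch under assumed variable names and synthetic data, not the paper's estimated equations: an ordinary least squares regression of utility on the disease-specific score and its square.

```python
# Sketch of a mapping (crosswalk) regression with synthetic data: predicting
# a utility score from a disease-specific score by ordinary least squares.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
macnew = rng.uniform(1, 7, size=300)                 # MacNew global score
utility = np.clip(0.2 + 0.11 * macnew + rng.normal(0, 0.08, 300), -0.1, 1.0)

X = sm.add_constant(np.column_stack([macnew, macnew ** 2]))
fit = sm.OLS(utility, X).fit()
pred = fit.predict(X)

mae = np.mean(np.abs(utility - pred))                # mean absolute error
print(fit.params, round(mae, 4), round(fit.rsquared, 3))
```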
OdorMapComparer: an application for quantitative analyses and comparisons of fMRI brain odor maps.
Liu, Nian; Xu, Fuqiang; Miller, Perry L; Shepherd, Gordon M
2007-01-01
Brain odor maps are reconstructed flat images that describe the spatial activity patterns in the glomerular layer of the olfactory bulbs in animals exposed to different odor stimuli. We have developed a software application, OdorMapComparer, to carry out quantitative analyses and comparisons of the fMRI odor maps. This application is an open-source Windows program that first loads the two odor map images being compared. It allows image transformations including scaling, flipping, rotating, and warping so that the two images can be appropriately aligned to each other. It performs simple subtraction, addition, and averaging of signals in the two images. It also provides comparative statistics including the normalized correlation (NC) and the spatial correlation coefficient. Experimental studies showed that the rodent fMRI odor maps for aliphatic aldehydes displayed spatial activity patterns that are similar in gross outlines but somewhat different in specific subregions. Analyses with OdorMapComparer indicate that the similarity between odor maps decreases with increasing difference in the length of carbon chains. For example, the map of butanal is more closely related to that of pentanal (NC = 0.617) than to that of octanal (NC = 0.082), which is consistent with animal behavioral studies. The study also indicates that fMRI odor maps are statistically odor-specific and repeatable across both intra- and intersubject trials. OdorMapComparer thus provides a tool for quantitative, statistical analyses and comparisons of fMRI odor maps in a fashion that is integrated with the overall odor mapping techniques.
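The abstract reports normalized correlation values without spelling out the formula; a plausible minimal implementation, assuming the usual mean-centred definition of NC for images, is:

```python
# Sketch of the normalized correlation (NC) between two odor maps, assuming
# the common mean-centred definition (the abstract does not give the formula).
import numpy as np

def normalized_correlation(map_a, map_b):
    a = map_a.astype(float).ravel() - map_a.mean()
    b = map_b.astype(float).ravel() - map_b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(3)
butanal = rng.random((64, 64))
pentanal = 0.8 * butanal + 0.2 * rng.random((64, 64))   # similar pattern
octanal = rng.random((64, 64))                           # unrelated pattern
print(normalized_correlation(butanal, pentanal))         # high
print(normalized_correlation(butanal, octanal))          # near zero
```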
NASA Astrophysics Data System (ADS)
Rodó, Xavier; Rodríguez-Arias, Miquel-Àngel
2006-10-01
The study of transitory signals and local variability structures in time and/or space, and their role as sources of climatic memory, is an important but often neglected topic in climate research, despite its obvious importance and extensive coverage in the literature. Transitory signals arise from non-linearities in the climate system, transitory atmosphere-ocean couplings, and other processes in the climate system that evolve after a critical threshold is crossed. These temporary interactions, which may be intense but short-lived, can be responsible for a large amount of unexplained variability, yet they are normally considered of limited relevance and often discarded. With most of the current techniques at hand, this typology of signatures is difficult to isolate because of the low signal-to-noise ratio in midlatitudes and the limited recurrence of transitory signals within a customary interval of data. There is also often a serious problem of smoothing of local or transitory processes when statistical techniques are applied over the full length of the available data rather than at the scale of the specific variability structure under investigation. Scale-dependent correlation (SDC) analysis is a new statistical method capable of highlighting the presence of transitory processes, understood here as temporary, significant lag-dependent autocovariance in a single series, or covariance structures between two series. This approach therefore complements other approaches, such as those in the families of wavelet analysis, singular-spectrum analysis and recurrence plots. The main features of SDC are its high performance for short time series and its ability to characterize phase relationships and thresholds in the bivariate domain. Ultimately, SDC helps track short-lagged relationships among processes that locally or temporarily couple and uncouple. The use of SDC is illustrated in the present paper by means of synthetic time-series examples of increasing complexity, and it is compared with wavelet analysis to provide a well-known reference for its capabilities. A comparison between SDC and companion techniques is also addressed, and results are exemplified for the specific case of some relevant El Niño-Southern Oscillation teleconnections.
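A toy sketch of the SDC idea under the simplifying assumptions noted in the comments (the published method adds significance testing and other refinements): correlate all pairs of short fragments so that brief couplings are not averaged away.

```python
# Toy SDC sketch: correlate all pairs of short fragments of size s, so that
# transitory couplings are not smoothed away. Significance testing is omitted.
import numpy as np

def sdc_matrix(x, y, s):
    """Correlation of x[i:i+s] with y[j:j+s] for all fragment pairs."""
    n = len(x) - s + 1
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = np.corrcoef(x[i:i + s], y[j:j + s])[0, 1]
    return out

rng = np.random.default_rng(4)
x = rng.normal(size=300)
y = rng.normal(size=300)
y[150:200] = x[140:190] + 0.3 * rng.normal(size=50)  # transitory lag-10 link

m = sdc_matrix(x, y, s=25)
i, j = np.unravel_index(np.argmax(m), m.shape)
print(m[i, j], i, j, j - i)   # peak near lag j - i = 10, inside the episode
```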
The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison
Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth
2006-01-01
Background Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497
Statistical context shapes stimulus-specific adaptation in human auditory cortex
Henry, Molly J.; Fromboluti, Elisa Kim; McAuley, J. Devin
2015-01-01
Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. PMID:25652920
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation caused by increases in harmful air pollutants in the ambient atmosphere. In this study, several statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area: principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN). The forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods provide limited accuracy; in particular, they are unable to predict the extreme points, so the pollution maximum and minimum cut-offs cannot be determined with these approaches. With advances in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g. correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over urban areas.
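A hedged sketch of the statistical baseline (PCA followed by MLR) on synthetic data; the predictors and pollutant values are assumptions, not CPCB records.

```python
# Synthetic PCA + MLR pipeline: decorrelate the predictors with principal
# component analysis, then regress the pollutant on the leading components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
latent = rng.normal(size=(500, 3))                  # hidden weather modes
met = latent @ rng.normal(size=(3, 6)) + 0.3 * rng.normal(size=(500, 6))
pollutant = 50 + latent @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.5, 500)

X_tr, X_te, y_tr, y_te = train_test_split(met, pollutant, random_state=0)
pca = PCA(n_components=3).fit(X_tr)
mlr = LinearRegression().fit(pca.transform(X_tr), y_tr)
pred = mlr.predict(pca.transform(X_te))

# Two of the evaluation measures named in the abstract
r = np.corrcoef(y_te, pred)[0, 1]
nmse = np.mean((y_te - pred) ** 2) / (np.mean(y_te) * np.mean(pred))
print(round(r, 3), round(nmse, 6))
```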
Mavrodi, Alexandra; Ohanyan, Ani; Kechagias, Nikos; Tsekos, Antonis; Vahtsevanos, Konstantinos
2015-09-01
Post-operative complications of various degrees of severity are commonly observed in third molar impaction surgery. For this reason, a surgical procedure that decreases the trauma to bone and soft tissues should be a priority for surgeons. In the present study, we compare the efficacy and the post-operative complications of two different surgical techniques applied for impacted lower third molar extraction. Patients in the first group underwent the classical bur technique, while patients in the second group underwent another technique, in which an elevator was placed on the buccal surface of the impacted molar in order to luxate it from the alveolar socket more easily. Comparing the two techniques, we observed a statistically significant decrease in the duration of the procedure and in the need for tooth sectioning when applying the second surgical technique, while the post-operative complications were similar in the two groups. We also found a significantly lower incidence of lingual nerve lesions in the second group, and only a slightly higher frequency of sharp mandibular bone irregularities, which was not statistically significant. The results of our study indicate that the surgical technique using an elevator on the buccal surface of the tooth is a reliable method to extract impacted third molars safely, easily, quickly and with minimum trauma to the surrounding tissues.
Liu, Siwei; Gates, Kathleen M; Blandon, Alysia Y
2018-06-01
Despite recent research indicating that interpersonal linkage in physiology is a common phenomenon during social interactions, and the well-established role of respiratory sinus arrhythmia (RSA) in socially facilitative physiological regulation, little research has directly examined interpersonal influences in RSA, perhaps due to methodological challenges in analyzing multivariate RSA data. In this article, we aim to bridge this methodological gap by introducing a new method for quantifying interpersonal RSA influences. Specifically, we show that a frequency-domain statistic, generalized partial directed coherence (gPDC), can be used to capture lagged relations in RSA between social partners without first estimating RSA for each person. We illustrate its utility by examining the relation between gPDC and marital conflict in a sample of married couples. Finally, we discuss how gPDC complements existing methods in the time domain and provide guidelines for choosing among these different statistical techniques. © 2018 Society for Psychophysiological Research.
Progress in Turbulence Detection via GNSS Occultation Data
NASA Technical Reports Server (NTRS)
Cornman, L. B.; Goodrich, R. K.; Axelrad, P.; Barlow, E.
2012-01-01
The increased availability of radio occultation (RO) data offers the ability to detect and study turbulence in the Earth's atmosphere. An analysis of how RO data can be used to determine the strength and location of turbulent regions is presented. This includes the derivation of a model for the power spectrum of the log-amplitude and phase fluctuations of the permittivity (or index of refraction) field. The bulk of the paper is then concerned with the estimation of the model parameters. Parameter estimators are introduced and some of their statistical properties are studied. These estimators are then applied to simulated log-amplitude RO signals. This includes the analysis of global statistics derived from a large number of realizations, as well as case studies that illustrate various specific aspects of the problem. Improvements to the basic estimation methods are discussed, and their beneficial properties are illustrated. The estimation techniques are then applied to real occultation data. Only two cases are presented, but they illustrate some of the salient features inherent in real data.
Perceptual basis of evolving Western musical styles
Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.
2013-01-01
The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669
Statistical data mining of streaming motion data for fall detection in assistive environments.
Tasoulis, S K; Doukas, C N; Maglogiannis, I; Plagianakos, V P
2011-01-01
The analysis of human motion data is interesting for the purpose of activity recognition or emergency event detection, especially in the case of elderly or disabled people living independently in their homes. Several techniques have been proposed for identifying such distress situations using motion, audio or video sensors, either on the monitored subject (wearable sensors) or in the surrounding environment. The output of such sensors consists of data streams that require real-time recognition, especially in emergency situations; thus traditional classification approaches may not be applicable for immediate alarm triggering or fall prevention. This paper presents a statistical mining methodology that may be used for the specific problem of real-time fall detection. Visual data, captured from the user's environment using overhead cameras, along with motion data collected from accelerometers on the subject's body, are fed to the fall detection system. The paper includes the details of the stream data mining methodology incorporated in the system, along with an initial evaluation of the achieved accuracy in detecting falls.
The expectancy-value muddle in the theory of planned behaviour - and some proposed solutions.
French, David P; Hankins, Matthew
2003-02-01
The authors of the Theories of Reasoned Action and Planned Behaviour recommended a method for statistically analysing the relationships between beliefs and the Attitude, Subjective Norm, and Perceived Behavioural Control constructs. This method has been used in the overwhelming majority of studies using these theories. However, there is a growing awareness that this method yields statistically uninterpretable results (Evans, 1991). Despite this, the use of this method is continuing, as is uninformed interpretation of this problematic research literature. This is probably due to the lack of a simple account of where the problem lies, and the large number of alternatives available. This paper therefore summarizes the problem as simply as possible, gives consideration to the conclusions that can be validly drawn from studies that contain this problem, and critically reviews the many alternatives that have been proposed to address this problem. Different techniques are identified as being suitable, according to the purpose of the specific research project.
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
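A minimal numerical sketch of the subset-bootstrap idea, under the assumption that the target quantile can be remapped into the retained top-k subset; the series is synthetic.

```python
# Synthetic demonstration: bootstrap CI for a high quantile from the full
# sample versus from only the k largest entries. The quantile remapping
# into the subset (q_sub) is the key assumption.
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
sample = rng.gumbel(size=n)            # stand-in for a long model time series
q = 0.999                              # tail quantile of interest

def boot_ci(values, q_local, b=500):
    """Percentile-bootstrap 95% CI for a quantile within `values`."""
    m = len(values)
    est = [np.quantile(rng.choice(values, m, replace=True), q_local)
           for _ in range(b)]
    return np.percentile(est, [2.5, 97.5])

ci_full = boot_ci(sample, q)           # expensive: resamples all n entries

k = 500
top = np.sort(sample)[-k:]             # keep only the k largest entries
q_sub = 1 - (1 - q) * n / k            # 0.999 in full sample -> 0.96 in top-k
ci_top = boot_ci(top, q_sub)           # cheap, nearly identical interval
print(ci_full, ci_top)
```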
Quantum Biometrics with Retinal Photon Counting
NASA Astrophysics Data System (ADS)
Loulakis, M.; Blatsios, G.; Vrettou, C. S.; Kominis, I. K.
2017-10-01
It is known that the eye's scotopic photodetectors, rhodopsin molecules, and their associated phototransduction mechanism leading to light perception, are efficient single-photon counters. We here use the photon-counting principles of human rod vision to propose a secure quantum biometric identification based on the quantum-statistical properties of retinal photon detection. The photon path along the human eye until its detection by rod cells is modeled as a filter having a specific transmission coefficient. Precisely determining its value from the photodetection statistics registered by the conscious observer is a quantum parameter estimation problem that leads to a quantum secure identification method. The probabilities for false-positive and false-negative identification of this biometric technique can readily approach 10^-10 and 10^-4, respectively. The security of the biometric method can be further quantified by the physics of quantum measurements. An impostor must be able to perform quantum thermometry and quantum magnetometry with energy resolution better than 10^-9 ℏ, in order to foil the device by noninvasively monitoring the biometric activity of a user.
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Kumar, Pramod; Rani, Raj; Turbelin, Grégory
2017-04-01
The study highlights a theoretical comparison and various interpretations of a recent inversion technique, called renormalization, developed for the reconstruction of unknown tracer emissions from their measured concentrations. The comparative interpretations are presented in relation to other inversion techniques based on the principles of regularization, Bayesian inference, minimum norm, maximum entropy on the mean, and model resolution optimization. It is shown that the renormalization technique can be interpreted in a similar manner to the other techniques, with a practical choice of a priori information and error statistics, while eliminating the need for additional constraints. The study shows that the proposed weight matrix and weighted Gram matrix offer suitable deterministic choices for the background error and measurement covariance matrices, respectively, in the absence of statistical knowledge about background and measurement errors. The technique is advantageous since it (i) utilizes weights representing a priori information apparent to the monitoring network, (ii) avoids dependence on background source estimates, (iii) improves on alternative choices for the error statistics, (iv) overcomes the colocalization problem in a natural manner, and (v) provides an optimally resolved source reconstruction. A comparative illustration of source retrieval is made by using real measurements from a continuous point release conducted in the Fusion Field Trials, Dugway Proving Ground, Utah.
Dentascan – Is the Investment Worth the Hype ???
Shah, Monali A; Shah, Sneha S; Dave, Deepak
2013-01-01
Background: Open Bone Measurement (OBM) and Bone Sounding (BS) are the most reliable but invasive clinical methods for Alveolar Bone Level (ABL) assessment, causing discomfort to the patient. Routinely, IOPAs & OPGs are the commonest radiographic techniques used, which tend to underestimate bone loss and obscure buccal/lingual defects. A novel technique like Dentascan (CBCT) eliminates this limitation by giving images in three planes: sagittal, coronal and axial. Aim: To compare and correlate the non-invasive 3D radiographic technique of Dentascan with BS and OBM, and with IOPA and OPG, in assessing the ABL. Settings and Design: Cross-sectional diagnostic study. Material and Methods: Two hundred and five sites were subjected to clinical and radiographic diagnostic techniques. The relative distance between the alveolar bone crest and a reference wire was measured. All the measurements were compared and tested against the OBM. Statistical Analysis: Student’s t-test, ANOVA, Pearson correlation coefficient. Results: There was a statistically significant difference between Dentascan and OBM; only BS showed agreement with OBM (p < 0.05). Dentascan correlated weakly with OBM and BS lingually. All the remaining techniques showed statistically significant differences between them (p = 0.00). Conclusion: Within the limitations of this study, only BS appears to be comparable with OBM, with no superior result of Dentascan over the conventional techniques except for lingual measurements. PMID:24551722
Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan
2016-01-01
Objective: The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. Methods: We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. Results: We found that the ASiR technique was able to reduce the volume CT dose index, dose–length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. Conclusion: The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. Advances in knowledge: The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions. PMID:26882825
Testing prediction methods: Earthquake clustering versus the Poisson model
Michael, A.J.
1997-01-01
Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
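A toy rendition of this testing logic (an assumption of how such a test could be coded, not the study's implementation): count alarm-window "hits" for an observed catalog and compare against catalogs simulated with and without clustering.

```python
# Toy significance test: compare alarm-window "hits" of an observed catalog
# with hits from catalogs simulated under Poisson and clustered null models.
import numpy as np

rng = np.random.default_rng(7)
T = 1000.0                                              # days
alarms = [(100 * i, 100 * i + 10) for i in range(10)]   # alarm windows

def hits(times):
    return sum(any(a <= t < b for a, b in alarms) for t in times)

def poisson_catalog(rate):
    return rng.uniform(0, T, rng.poisson(rate * T))

def clustered_catalog(rate, kids=3.0, spread=1.0):
    parents = rng.uniform(0, T, rng.poisson(rate * T / (1 + kids)))
    children = [p + rng.exponential(spread, rng.poisson(kids)) for p in parents]
    return np.concatenate([parents] + children) if len(parents) else parents

observed = clustered_catalog(0.05)                # stand-in "real" catalog
obs_hits = hits(observed)
for model in (poisson_catalog, clustered_catalog):
    sims = [hits(model(0.05)) for _ in range(500)]
    p = np.mean([s >= obs_hits for s in sims])
    print(model.__name__, round(p, 3))  # clustering widens the null, raising p
```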
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation has been introduced to solve problems in the form of systems. By using this technique, the following two kinds of problems can be overcome. First, a problem that has an analytical solution, but where the cost of running an experiment to solve it is high in terms of money or lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference, the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques, such as the bootstrap and permutation, to form pseudo-sampling distributions that lead to the solution of problems that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, being used to verify analytical solutions in inference. This paper also discusses resampling techniques as simulation techniques. Misunderstandings about these two techniques are examined, and successful usages of both techniques are explained.
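As a small illustration of the resampling idea, a permutation test of a two-group mean difference, with the null distribution formed empirically from synthetic data:

```python
# Permutation test of a difference in group means: the null distribution is
# built empirically by reshuffling group labels, not derived analytically.
import numpy as np

rng = np.random.default_rng(8)
a = rng.normal(0.5, 1.0, 30)
b = rng.normal(0.0, 1.0, 30)
observed = a.mean() - b.mean()

pooled = np.concatenate([a, b])
count = 0
n_perm = 10_000
for _ in range(n_perm):
    rng.shuffle(pooled)
    count += abs(pooled[:30].mean() - pooled[30:].mean()) >= abs(observed)
p_value = (count + 1) / (n_perm + 1)     # two-sided permutation p-value
print(round(observed, 3), round(p_value, 4))
```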
da Silva, R C V; de Sá, C C; Pascual-Vaca, Á O; de Souza Fontes, L H; Herbella Fernandes, F A M; Dib, R A; Blanco, C R; Queiroz, R A; Navarro-Rodriguez, T
2013-07-01
The treatment of gastroesophageal reflux disease may be clinical or surgical. The clinical treatment consists basically of the use of drugs; however, there are new techniques to complement this treatment, osteopathic intervention in the diaphragmatic muscle being one of these. The objective of the study is to compare pressure values of the lower esophageal sphincter (LES), obtained by esophageal manometry, before and immediately after osteopathic intervention in the diaphragm muscle. Thirty-eight patients with gastroesophageal reflux disease - 16 submitted to a sham technique and 22 submitted to the osteopathic technique - were randomly selected. The average respiratory pressure (ARP) and the maximum expiratory pressure (MEP) of the LES were measured by manometry before and after the osteopathic technique at the point of highest pressure. Statistical analysis was performed using Student's t-test and the Mann-Whitney test, and the magnitude of the effect of the proposed technique was measured using Cohen's index. A statistically significant difference favouring the osteopathic technique over the sham technique was found in three out of four measures of LES pressure, including the ARP (P = 0.027). The MEP showed no statistical difference (P = 0.146). The values of Cohen's d for the same measures were d = 0.80 for the ARP and d = 0.52 for the MEP. The osteopathic manipulative technique produces a positive increment in the LES region immediately after its performance. © 2012 Copyright the Authors. Journal compilation © 2012, Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete-coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. This study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment using conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and the marginal gap was documented at each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences. Marginal gaps were evaluated by t-test; analysis of variance and post-hoc analysis were used to compare the two groups as well as the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both conventional and accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique; the marginal fit of the two casting techniques showed no statistical difference.
Data mining and statistical inference in selective laser melting
Kamath, Chandrika
2016-01-11
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
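Two of the listed ingredients, data-driven surrogate modeling and identification of important variables, can be sketched as follows; the parameter names and the analytic stand-in "simulator" are assumptions, not the paper's physics models.

```python
# Sketch with assumed parameter names and a stand-in analytic "simulator":
# fit a data-driven surrogate, then rank process-parameter importance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(9)
# Columns: laser power (W), scan speed (m/s), beam size (um), hatch (um)
X = rng.uniform([150, 0.5, 50, 80], [400, 2.0, 120, 160], size=(400, 4))

def stand_in_simulation(x):
    power, speed, beam, hatch = x
    return power / (speed * beam) + 0.001 * hatch   # proxy for melt-pool depth

y = np.apply_along_axis(stand_in_simulation, 1, X)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["power", "speed", "beam", "hatch"],
                     surrogate.feature_importances_):
    print(f"{name}: {imp:.3f}")   # power, speed and beam dominate, as built in
```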
Vexler, Albert; Tanajian, Hovig; Hutson, Alan D
In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
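A minimal sketch of the classical Monte Carlo p-value computation (method 1); the test statistic below is a hypothetical placeholder, not the density-based empirical likelihood-ratio statistic implemented in vxdbel.

```python
# Sketch of the classical Monte Carlo p-value (method 1). `test_stat` is a
# placeholder, not the vxdbel empirical likelihood-ratio statistic.
import numpy as np

rng = np.random.default_rng(10)

def test_stat(groups):
    # Placeholder statistic: spread of the K group means
    return np.var([g.mean() for g in groups])

k, n = 3, 40
observed = [rng.normal(0.3 * i, 1.0, n) for i in range(k)]
t_obs = test_stat(observed)

b = 2000
null = [test_stat([rng.normal(0.0, 1.0, n) for _ in range(k)])
        for _ in range(b)]
p = (1 + sum(t >= t_obs for t in null)) / (b + 1)   # add-one MC p-value
print(round(t_obs, 4), round(p, 4))
```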
fMRI Brain-Computer Interface: A Tool for Neuroscientific Research and Treatment
Sitaram, Ranganatha; Caria, Andrea; Veit, Ralf; Gaber, Tilman; Rota, Giuseppina; Kuebler, Andrea; Birbaumer, Niels
2007-01-01
Brain-computer interfaces based on functional magnetic resonance imaging (fMRI-BCI) allow volitional control of anatomically specific regions of the brain. Technological advancement in higher field MRI scanners, fast data acquisition sequences, preprocessing algorithms, and robust statistical analysis are anticipated to make fMRI-BCI more widely available and applicable. This noninvasive technique could potentially complement the traditional neuroscientific experimental methods by varying the activity of the neural substrates of a region of interest as an independent variable to study its effects on behavior. If the neurobiological basis of a disorder (e.g., chronic pain, motor diseases, psychopathy, social phobia, depression) is known in terms of abnormal activity in certain regions of the brain, fMRI-BCI can be targeted to modify activity in those regions with high specificity for treatment. In this paper, we review recent results of the application of fMRI-BCI to neuroscientific research and psychophysiological treatment. PMID:18274615
Simulation of water-quality data at selected stream sites in the Missouri River Basin, Montana
Knapton, J.R.; Jacobson, M.A.
1980-01-01
Modification of sampling programs at some water-quality stations in the Missouri River basin in Montana has eliminated the means by which solute loads have been directly obtained in past years. To compensate for this loss, water-quality and streamflow data were statistically analyzed and solute loads were simulated using computer techniques. Functional relationships existing between specific conductance and solute concentration for monthly samples were used to develop linear regression models. The models were then used to simulate daily solute concentrations using daily specific conductance as the independent variable. Once simulated, the solute concentrations, in milligrams per liter, were transformed into daily solute loads, in tons, using mean daily streamflow records. Computer output was formatted into tables listing simulated mean monthly solute concentrations, in milligrams per liter, and the monthly and annual solute loads, in tons, for water years 1975-78.
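The simulation chain can be sketched as follows, with synthetic records and assumed units; the 0.0027 factor is the standard conversion from (mg/L x ft3/s) to tons/day.

```python
# Synthetic sketch of the simulation chain: regress concentration on
# specific conductance from monthly samples, apply the model to daily
# conductance, then convert to loads with mean daily streamflow.
import numpy as np

rng = np.random.default_rng(11)
cond_monthly = rng.uniform(200, 1200, 48)                    # uS/cm, samples
conc_monthly = 0.62 * cond_monthly + rng.normal(0, 20, 48)   # mg/L

slope, intercept = np.polyfit(cond_monthly, conc_monthly, 1)

cond_daily = rng.uniform(200, 1200, 365)     # daily specific conductance
flow_daily = rng.uniform(50, 500, 365)       # mean daily streamflow, ft3/s
conc_daily = slope * cond_daily + intercept  # simulated daily mg/L

load_daily = conc_daily * flow_daily * 0.0027   # tons/day
print(load_daily.sum())                         # simulated annual load, tons
```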
Big Data and Health Economics: Strengths, Weaknesses, Opportunities and Threats.
Collins, Brendan
2016-02-01
'Big data' is the collective name for the increasing capacity of information systems to collect and store large volumes of data, which are often unstructured and time stamped, and to analyse these data by using regression and other statistical techniques. This is a review of the potential applications of big data and health economics, using a SWOT (strengths, weaknesses, opportunities, threats) approach. In health economics, large pseudonymized databases, such as the planned care.data programme in the UK, have the potential to increase understanding of how drugs work in the real world, taking into account adherence, co-morbidities, interactions and side effects. This 'real-world evidence' has applications in individualized medicine. More routine and larger-scale cost and outcomes data collection will make health economic analyses more disease specific and population specific but may require new skill sets. There is potential for biomonitoring and lifestyle data to inform health economic analyses and public health policy.
[Specificity of the Adultrap for capturing females of Aedes aegypti (Diptera: Culicidae)].
Gomes, Almério de Castro; da Silva, Nilza Nunes; Bernal, Regina Tomie Ivata; Leandro, André de Souza; de Camargo, Natal Jataí; da Silva, Allan Martins; Ferreira, Adão Celestino; Ogura, Luis Carlos; de Oliveira, Sebastião José; de Moura, Silvestre Marques
2007-01-01
The Adultrap is a new trap built for capturing females of Aedes aegypti. Tests were carried out to evaluate the specificity of this trap in comparison with the technique of aspiration of specimens in artificial shelters. Adultraps were kept for 24 hours inside and outside 120 randomly selected homes in two districts of the city of Foz do Iguaçú, State of Paraná. The statistical analysis used Poisson's log-linear model. The result was 726 mosquitoes captured, of which 80 were Aedes aegypti. The Adultrap captured only females of this species, while the aspiration method captured both sexes of Aedes aegypti and another five species. The Adultrap captured Aedes aegypti inside and outside the homes, but the analysis indicated that, outside the homes, this trap captured significantly more females than aspiration did. The sensitivity of the Adultrap for detecting females of Aedes aegypti in low-frequency situations was also demonstrated.
NASA Astrophysics Data System (ADS)
Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.
2017-12-01
Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability of the streamflow forecasts produced with ensemble meteorological forcings.
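A minimal sketch of the 'source-specific' dressing step, assuming a zero-mean Gaussian hydrological-error model fitted offline from simulation-versus-observation errors (the paper's actual uncertainty model is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(42)

# Routed streamflow ensemble (m3/s) from the meteorological ensemble members
ensemble = np.array([102., 95., 110., 98., 105.])

# Hydrological-uncertainty model, assumed here to be a zero-mean Gaussian
# whose spread was estimated offline (an assumption for this sketch)
sigma_hydro = 6.0

# Dress each member with one realization of hydrological uncertainty
dressed = ensemble + rng.normal(0.0, sigma_hydro, size=ensemble.shape)

# The dressed ensemble now approximates the 'total uncertainty' distribution
print(np.percentile(dressed, [10, 50, 90]))
```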
Hicks, T; Taroni, F; Curran, J; Buckleton, J; Castella, V; Ribaux, O
2010-10-01
Familial searching consists of searching for a full profile left at a crime scene in a National DNA Database (NDNAD). In this paper we are interested in the circumstance where no full match is returned, but a partial match is found between a database member's profile and the crime stain. Because close relatives share more of their DNA than unrelated persons, this partial match may indicate that the crime stain was left by a close relative of the person with whom the partial match was found. This approach has successfully solved important crimes in the UK and the USA. In a previous paper, a model, which takes into account substructure and siblings, was used to simulate a NDNAD. In this paper, we have used this model to test the usefulness of familial searching and offer guidelines for pre-assessment of the cases based on the likelihood ratio. Siblings of "persons" present in the simulated Swiss NDNAD were created. These profiles (N=10,000) were used as traces and were then compared to the whole database (N=100,000). The statistical results obtained show that the technique has great potential, confirming the findings of previous studies. However, the effectiveness of the technique is only one part of the story. Familial searching has juridical and ethical aspects that should not be ignored. In Switzerland, for example, there are no specific guidelines on the legality or otherwise of familial searching. This article both presents statistical results and addresses the criminological and civil liberties aspects to be taken into account when weighing the risks and benefits of familial searching. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.
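For orientation on the likelihood ratios involved, here is a single-locus sketch using the standard full-sibling versus unrelated formulas (the textbook Evett-Weir style calculation, not necessarily the exact model of the paper):

```python
def sibling_lr(p_a, p_b=None):
    """LR for a single locus where stain and database genotypes match.

    Compares H1: donors are full siblings vs H0: unrelated, assuming
    Hardy-Weinberg proportions. For a homozygote pass only p_a; for a
    heterozygote pass both allele frequencies.
    """
    if p_b is None:  # homozygote aa
        return (1 + p_a) ** 2 / (4 * p_a ** 2)
    # heterozygote ab
    return (1 + p_a + p_b + 2 * p_a * p_b) / (8 * p_a * p_b)

# Example: a heterozygous genotype with allele frequencies 0.1 and 0.2
print(sibling_lr(0.1, 0.2))   # ~8.4, i.e. support for a sibling relationship
```

Multiplying such per-locus ratios across independent loci gives the profile-wide likelihood ratio used to rank candidate relatives.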
Reading biological processes from nucleotide sequences
NASA Astrophysics Data System (ADS)
Murugan, Anand
Cellular processes have traditionally been investigated by techniques of imaging and biochemical analysis of the molecules involved. The recent rapid progress in our ability to manipulate and read nucleic acid sequences gives us direct access to the genetic information that directs and constrains biological processes. While sequence data is being used widely to investigate genotype-phenotype relationships and population structure, here we use sequencing to understand biophysical mechanisms. We present work on two different systems. First, in chapter 2, we characterize the stochastic genetic editing mechanism that produces diverse T-cell receptors in the human immune system. We do this by inferring statistical distributions of the underlying biochemical events that generate T-cell receptor coding sequences from the statistics of the observed sequences. This inferred model quantitatively describes the potential repertoire of T-cell receptors that can be produced by an individual, providing insight into its potential diversity and the probability of generation of any specific T-cell receptor. Then in chapter 3, we present work on understanding the functioning of regulatory DNA sequences in both prokaryotes and eukaryotes. Here we use experiments that measure the transcriptional activity of large libraries of mutagenized promoters and enhancers and infer models of the sequence-function relationship from this data. For the bacterial promoter, we infer a physically motivated 'thermodynamic' model of the interaction of DNA-binding proteins and RNA polymerase determining the transcription rate of the downstream gene. For the eukaryotic enhancers, we infer heuristic models of the sequence-function relationship and use these models to find synthetic enhancer sequences that optimize inducibility of expression. Both projects demonstrate the utility of sequence information in conjunction with sophisticated statistical inference techniques for dissecting underlying biophysical mechanisms.
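As a hedged illustration of the 'thermodynamic' model class mentioned above, here is a minimal two-state occupancy calculation for RNA polymerase competing with a repressor; the functional form is the generic statistical-mechanics textbook version, and all parameter values, including the background-site count, are illustrative assumptions.

```python
import numpy as np

def rnap_occupancy(n_p, e_p, n_r, e_r, n_ns=4.6e6, beta=1.0):
    """Equilibrium probability that RNA polymerase occupies a promoter.

    Boltzmann weights for polymerase and a competing repressor, relative
    to non-specific genomic background sites. n_p, n_r: copy numbers;
    e_p, e_r: binding energies in k_B*T units; n_ns: background sites.
    """
    w_p = (n_p / n_ns) * np.exp(-beta * e_p)
    w_r = (n_r / n_ns) * np.exp(-beta * e_r)
    return w_p / (1.0 + w_p + w_r)

# Illustrative numbers only: weak promoter, moderately abundant repressor
print(rnap_occupancy(n_p=1000, e_p=-5.0, n_r=50, e_r=-12.0))
```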
Hardiman, S; Miller, K; Murphy, M
1993-01-01
Safety observations during the clinical development of Mentane (velnacrine maleate) have included the occurrence of generally asymptomatic liver enzyme elevations confined to patients with Alzheimer's disease (AD). The clinical presentation of this reversible hepatocellular injury is analogous to that reported for tetrahydroaminoacridine (THA). Direct liver injury, possibly associated with the production of a toxic metabolite, would be consistent with reports of aberrant xenobiotic metabolism in Alzheimer's disease patients. Since a patient-related aberration in drug metabolism was suspected, a biostatistical strategy was developed with the objective of predicting hepatotoxicity in individual patients prior to exposure to velnacrine maleate. The method used logistic regression techniques, with variable selection restricted to those items that could be routinely and inexpensively accessed at the screening evaluation of potential candidates for treatment. The model was to be predictive (a marker for eventual hepatotoxicity) rather than causative, and the evaluation employed goodness of fit, percentage correct, and positive and negative predictive values. On the basis of demographic and baseline laboratory data from 942 patients, the PROPP statistic (the Physician Reference Of Predicted Probabilities) was developed. Main-effect variables included age, gender, and nine hematological and serum chemistry variables. The sensitivity of the current model is approximately 49%, and its specificity approximately 88%. Using prior probability estimates in which the patient's likelihood of liver toxicity is presumed to be at least 30%, however, the positive predictive value ranged from 64% to 77%. Although the clinical utility of this statistic will require refinements and additional prospective confirmation, its potential existence speaks to the possibility of markers for idiosyncratic drug metabolism in patients with Alzheimer's disease.
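The reported predictive values follow directly from Bayes' theorem; the sketch below reproduces the 64% lower bound from the stated sensitivity, specificity, and a 30% prior probability of liver toxicity.

```python
def ppv(sens, spec, prior):
    """Positive predictive value via Bayes' theorem."""
    tp = sens * prior            # true-positive rate in the population
    fp = (1 - spec) * (1 - prior)  # false-positive rate in the population
    return tp / (tp + fp)

# Reported operating characteristics of the PROPP model
sens, spec = 0.49, 0.88

print(round(ppv(sens, spec, prior=0.30), 2))   # 0.64, the 64% lower bound
```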
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.
Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E
2018-01-01
The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, indicating interchangeability of the two techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with the established goniometric method shows that the proposed software agrees sufficiently to be used interchangeably.
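A minimal sketch of the Bland-Altman computation used for the agreement analysis; the paired angle values are invented for illustration.

```python
import numpy as np

# Hypothetical paired shoulder-flexion angles (degrees)
kinect = np.array([88., 92., 135., 150., 101., 76.])
goniometer = np.array([90., 95., 132., 148., 104., 78.])

diff = kinect - goniometer
bias = diff.mean()                 # systematic difference between methods
half_width = 1.96 * diff.std(ddof=1)  # 95% limits-of-agreement half-width

print(f"bias = {bias:.2f} deg")
print(f"limits of agreement: [{bias - half_width:.2f}, {bias + half_width:.2f}] deg")
# Interchangeability holds when differences inside the limits are clinically acceptable
```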
Kadkhodazadeh, Mahdi; Ebadian, Ahmad Reza; Gholami, Gholam Ali; Khosravi, Alireza; Tabari, Zahra Alizadeh
2013-05-01
The RANK/OPG/RANKL pathway plays a significant role in osteoclastogenesis, osteoclast activation, and regulation of bone resorption. The aim of this study was to investigate the association of RANKL gene polymorphisms (rs9533156 and rs2277438) with chronic periodontitis and peri-implantitis in an Iranian population. 77 patients with chronic periodontitis, 40 patients with peri-implantitis and 89 periodontally healthy patients were enrolled in this study. 5 cc of blood was obtained from the cephalic vein of the subjects' arms and transferred into tubes containing EDTA. Genomic DNA was extracted using Miller's salting-out technique. The DNA was transferred into 96-division plates, transported to the Kbioscience Institute in the United Kingdom, and analyzed using the Kbioscience Competitive Allele Specific PCR (KASP) technique. Differences in the frequencies of genotypes and alleles in the disease and control groups were analyzed using Chi-square and Fisher's exact statistical tests. Comparison of the frequency of alleles in SNP rs9533156 of the RANKL gene between the chronic periodontitis group and the control and peri-implantitis groups revealed statistically significant differences (P=0.024 and P=0.027, respectively). Comparison of genotype expression of SNP rs9533156 of the RANKL gene between the peri-implantitis group and the chronic periodontitis and control groups revealed statistically significant differences (P=0.001); the prevalence of the CT genotype was significantly higher amongst the chronic periodontitis group. Regarding SNP rs2277438 of the RANKL gene, comparison of the prevalence of genotypes and frequency of alleles did not reveal any significant differences (P=0.641 and P=0.537, respectively). The results of this study indicate that the CT genotype of the rs9533156 RANKL gene polymorphism was significantly associated with peri-implantitis and may be considered a genetic determinant for peri-implantitis. Copyright © 2012 Elsevier Ltd. All rights reserved.
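A hedged sketch of the genotype- and allele-frequency comparisons with SciPy; the contingency-table counts below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical genotype counts (CC, CT, TT) for two groups
table = np.array([
    [20, 45, 12],   # chronic periodontitis
    [35, 38, 16],   # healthy controls
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# Fisher's exact test on a 2x2 allele split (C vs T counts per group)
odds, p_exact = fisher_exact([[85, 69], [108, 70]])
print(f"Fisher exact p = {p_exact:.3f}")
```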
Regression: The Apple Does Not Fall Far From the Tree.
Vetter, Thomas R; Schober, Patrick
2018-05-15
Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
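To ground two of the regression types discussed, here is a short sketch fitting a simple linear and a logistic regression on synthetic data with statsmodels; coefficients and sample size are arbitrary illustrative choices.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)

# Simple linear regression: continuous outcome
y_linear = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=n)
ols = sm.OLS(y_linear, sm.add_constant(x)).fit()
print(ols.params)            # intercept ~2.0, slope ~1.5

# Logistic regression: binary outcome
p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))
y_binary = rng.binomial(1, p)
logit = sm.Logit(y_binary, sm.add_constant(x)).fit(disp=False)
print(logit.params)          # intercept ~0.5, slope ~1.2
```

As the tutorial stresses, such fits identify associations between predictor and outcome variables; they do not, by themselves, establish causation.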
Inferring epidemiological parameters from phylogenies using regression-ABC: A comparative study
Gascuel, Olivier
2017-01-01
Inferring epidemiological parameters such as the R0 from time-scaled phylogenies is a timely challenge. Most current approaches rely on likelihood functions, which raise specific issues that range from computing these functions to finding their maxima numerically. Here, we present a new regression-based Approximate Bayesian Computation (ABC) approach, which we base on a large variety of summary statistics intended to capture the information contained in the phylogeny and its corresponding lineage-through-time plot. The regression step involves the Least Absolute Shrinkage and Selection Operator (LASSO) method, which is a robust machine learning technique. It allows us to readily deal with the large number of summary statistics, while avoiding resorting to Markov Chain Monte Carlo (MCMC) techniques. To compare our approach to existing ones, we simulated target trees under a variety of epidemiological models and settings, and inferred parameters of interest using the same priors. We found that, for large phylogenies, the accuracy of our regression-ABC is comparable to that of likelihood-based approaches involving birth-death processes implemented in BEAST2. Our approach even outperformed these when inferring the host population size with a Susceptible-Infected-Removed epidemiological model. It also clearly outperformed a recent kernel-ABC approach when assuming a Susceptible-Infected epidemiological model with two host types. Lastly, by re-analyzing data from the early stages of the recent Ebola epidemic in Sierra Leone, we showed that regression-ABC provides more realistic estimates for the duration parameters (latency and infectiousness) than the likelihood-based method. Overall, ABC based on a large variety of summary statistics and a regression method able to perform variable selection and avoid overfitting is a promising approach to analyze large phylogenies. PMID:28263987
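A toy sketch of the regression-ABC loop with LASSO follows; a simple Poisson draw stands in for the phylogeny simulator and its summary statistics, so this illustrates only the structure of the method (simulate under the prior, summarize, regress, predict), not the paper's simulator or statistics.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)

def summaries(draws):
    """Summary-statistic vector (moments of a toy outbreak-size model)."""
    return np.array([draws.mean(), draws.std(), np.median(draws), draws.max()])

# 1) Simulate: draw R0 from the prior, generate data, compute summaries
n_sim = 2000
r0_prior = rng.uniform(1.0, 5.0, n_sim)
stats = np.array([summaries(rng.poisson(r0, size=50)) for r0 in r0_prior])

# 2) Regression step: LASSO learns the map from summaries to the parameter,
#    performing variable selection over many (possibly uninformative) statistics
reg = LassoCV(cv=5).fit(stats, r0_prior)

# 3) Apply the learned map to the 'observed' data for a point estimate of R0
observed = rng.poisson(3.0, size=50)
print(reg.predict(summaries(observed).reshape(1, -1)))   # ~3
```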
NASA Astrophysics Data System (ADS)
Whan, K. R.; Lindesay, J. A.; Timbal, B.; Raupach, M. R.; Williams, E.
2010-12-01
Australia’s natural environment is adapted to low rainfall availability and high variability, but human systems are less able to adapt to variability in the hydrological cycle. Understanding the mechanisms underlying drought persistence and severity is vital to contextualising future climate change. Multiple external forcings mean the mechanisms of drought occurrence in south-eastern Australia (SEA) are complex. The key influences on SEA climate are El Niño-Southern Oscillation (ENSO), the Indian Ocean Dipole (IOD), the Southern Annular Mode (SAM) and the sub-tropical ridge (STR); each of these large-scale climate modes (LSCM) has been studied widely. The need for research into the interactions among the modes has been noted [1], although to date this has received limited attention. Relationships between LSCM and hydrometeorological variability are nonlinear, making the linearity assumptions underlying usual statistical techniques (e.g. correlation, principal components analysis) questionable. In the current research a statistical technique that can deal with nonlinear interactions is applied to a new dataset, enabling a full examination of the Australian water balance. The Australian Water Availability Project (AWAP) dataset models the Australian water balance on a fine grid [2]. Hydrological parameters (e.g. soil moisture, evaporation, runoff) are modelled from meteorological data, allowing the complete Australian water balance (climate and hydrology) to be examined and the mechanisms of drought to be studied holistically. Classification and regression trees (CART) are a powerful regression-based technique that is capable of accounting for nonlinear effects. Although it has had limited previous application in climate research [3], this methodology is particularly informative in cases with multiple predictors and nonlinear relationships, such as climate variability. Statistical relationships between variables are the basis for the decision rules in CART that are used to split the data into increasingly homogeneous groups. CART is applied to the AWAP dataset to identify the hydroclimatic regimes associated with various combinations of LSCM and the importance of each mode in producing the regime. Analysis of the LSCM is conducted on a range of hydroclimatic variables to assess the relative and combined influences of these LSCM on the Australian water balance. This gives information about interactions between LSCM that are vital for specific hydroclimatic states (e.g. drought) and about which combinations of LSCM result in specific regimes. The dominant LSCM in different seasons and the relationships among the climate drivers have been identified. [1] Ummenhofer, C., et al., What causes southeast Australia's worst droughts? Geophysical Research Letters, 2009. 36: p. L04706. [2] Raupach, M., et al., Australian Water Availability Project (AWAP). CSIRO Marine and Atmospheric Research Component: Final Report for Phase 3. 2008. [3] Burrows, W., et al., CART Decision-Tree Statistical Analysis and Prediction of Summer Season Maximum Surface Ozone for the Vancouver, Montreal and Atlantic Regions of Canada. Journal of Applied Meteorology, 1995. 34: p. 1848-1862.
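A hedged sketch of CART applied to climate-mode indices, using scikit-learn's DecisionTreeRegressor as the CART implementation; the nonlinear response function and all data are invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(7)
n = 500

# Synthetic seasonal climate-mode indices (standardized)
enso, iod, sam = rng.normal(size=(3, n))

# Toy nonlinear, interacting response: rainfall anomaly
rain = np.where((enso < -0.5) & (iod < 0), 1.0, -0.3) + 0.2 * sam \
       + rng.normal(scale=0.3, size=n)

X = np.column_stack([enso, iod, sam])
tree = DecisionTreeRegressor(max_depth=3).fit(X, rain)

# The decision rules split the data into increasingly homogeneous regimes
print(export_text(tree, feature_names=["ENSO", "IOD", "SAM"]))
```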
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
A Review of Calibration Transfer Practices and Instrument Differences in Spectroscopy.
Workman, Jerome J
2018-03-01
Calibration transfer for use with spectroscopic instruments, particularly for near-infrared, infrared, and Raman analysis, has been the subject of multiple articles, research papers, book chapters, and technical reviews. Myriad approaches have been published, and claims made, for resolving the problems associated with transferring calibrations; however, the capability of attaining identical results over time from two or more instruments using an identical calibration still eludes technologists. Calibration transfer, in a precise definition, refers to a series of analytical approaches or chemometric techniques used to attempt to apply a single spectral database, and the calibration model developed using that database, to two or more instruments, with statistically retained accuracy and precision. Ideally, one would develop a single calibration for any particular application, move it indiscriminately across instruments, and achieve identical analysis or prediction results. There are many technical aspects involved in such precision calibration transfer, related to the measuring instrument's reproducibility and repeatability, the reference chemical values used for the calibration, the multivariate mathematics used for calibration, and sample presentation repeatability and reproducibility. Ideally, a multivariate model developed on a single instrument would provide a statistically identical analysis when used on other instruments following transfer. This paper reviews common calibration transfer techniques, mostly related to instrument differences, and the mathematics of the uncertainty between instruments when making spectroscopic measurements of identical samples. It does not specifically address calibration maintenance or reference laboratory differences.
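As one concrete member of the family of techniques reviewed, here is a minimal sketch of direct standardization (a common approach, not one specifically endorsed by this paper), which maps secondary-instrument spectra into the primary instrument's response space so the primary calibration model can be applied unchanged; all spectra are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Transfer set: the same standards measured on both instruments
n_samples, n_wavelengths = 30, 200
X_primary = rng.normal(size=(n_samples, n_wavelengths))
X_secondary = X_primary * 1.02 + 0.05 + rng.normal(
    scale=0.01, size=(n_samples, n_wavelengths))

# Direct standardization: least-squares transform F with X_secondary @ F ~ X_primary
# (underdetermined here; in practice piecewise DS or regularization is used)
F, *_ = np.linalg.lstsq(X_secondary, X_primary, rcond=None)

# Map new secondary-instrument spectra before applying the primary calibration
X_new = X_primary[:5] * 1.02 + 0.05
X_new_std = X_new @ F
print(np.abs(X_new_std - X_primary[:5]).max())   # small residual mismatch
```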
Wafer, Lucas; Kloczewiak, Marek; Luo, Yin
2016-07-01
Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
Re-calibration of coronary risk prediction: an example of the Seven Countries Study.
Puddu, Paolo Emilio; Piras, Paolo; Kromhout, Daan; Tolonen, Hanna; Kafatos, Anthony; Menotti, Alessandro
2017-12-14
We aimed at performing a calibration and re-calibration process using six standard risk factors from Northern (NE, N = 2360) or Southern European (SE, N = 2789) middle-aged men of the Seven Countries Study, whose parameters and data were fully known, to establish whether re-calibration gave the right answer. The Greenwood-Nam-D'Agostino technique as modified by Demler (GNDD) in 2015 produced chi-squared statistics using 10 deciles of observed/expected CHD mortality risk, corresponding to the Hosmer-Lemeshow chi-squared employed for multiple logistic equations where binary data are used. Instead of the number of events, the GNDD test uses survival probabilities of observed and predicted events. The exercise applied, in five different ways, the parameters of the NE-predictive model to SE (and vice-versa) and compared the outcome of the simulated re-calibration with the real data. Good re-calibration could be obtained only when the risk-factor coefficients were substituted, these being similar in magnitude and not significantly different between NE and SE. In all other ways, a good re-calibration could not be obtained. This is enough to call for an overall re-evaluation of most investigations that, without GNDD or another proper technique for statistically assessing the potential differences, concluded that re-calibration is a fair method and might therefore be used with no specific caution.
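For orientation, here is a simplified binary-outcome analogue of the decile-based statistic, a Hosmer-Lemeshow-style chi-squared; the GNDD test itself instead compares observed and predicted survival probabilities per decile, which this sketch does not implement.

```python
import numpy as np
from scipy.stats import chi2

def decile_calibration_chi2(pred_risk, events):
    """Hosmer-Lemeshow-style chi-squared over 10 deciles of predicted risk."""
    order = np.argsort(pred_risk)
    groups = np.array_split(order, 10)
    stat = 0.0
    for g in groups:
        exp = pred_risk[g].sum()     # expected events in the decile
        obs = events[g].sum()        # observed events in the decile
        n = len(g)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
    p = chi2.sf(stat, df=8)          # conventional g - 2 degrees of freedom
    return stat, p

rng = np.random.default_rng(5)
risk = rng.uniform(0.01, 0.4, 1000)
ev = rng.binomial(1, risk)           # well calibrated by construction
print(decile_calibration_chi2(risk, ev))
```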
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
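A hedged sketch of the framework's core idea: cluster per-agent behavior feature vectors into common system-wide patterns, then estimate the probability of agents transitioning between patterns across time steps. The feature data, feature choice, and cluster count are invented assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(11)

# Behavior feature vectors: (n_steps, n_agents, n_features), e.g. speed, heading
n_steps, n_agents, n_features = 50, 100, 3
features = rng.normal(size=(n_steps, n_agents, n_features))

# 1) Identify common behavior patterns by clustering all agent-step vectors
k = 4
labels = KMeans(n_clusters=k, n_init=10).fit_predict(
    features.reshape(-1, n_features)).reshape(n_steps, n_agents)

# 2) Estimate pattern-to-pattern transition probabilities across time steps
trans = np.zeros((k, k))
for t in range(n_steps - 1):
    for a in range(n_agents):
        trans[labels[t, a], labels[t + 1, a]] += 1
trans /= trans.sum(axis=1, keepdims=True)
print(trans.round(2))    # row i: P(next pattern | current pattern i)
```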
Koerner, Tess K.; Zhang, Yang
2017-01-01
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining the strength of association between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, because the neural responses across listening conditions would simply be treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages of, as well as the necessity to apply, mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers. PMID:28264422
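A minimal contrast of the two analyses on synthetic repeated-measures data, assuming only a random subject intercept (the published models include additional terms); all effect sizes are invented.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subj, n_cond = 20, 3

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_cond),
    "condition": np.tile(["quiet", "low_noise", "high_noise"], n_subj),
})
subj_base = rng.normal(size=n_subj)          # between-subject baselines
df["neural"] = subj_base[df["subject"]] + rng.normal(scale=0.5, size=len(df))
df["behavior"] = 0.8 * df["neural"] + rng.normal(scale=0.5, size=len(df))

# Naive: Pearson correlation treats the repeated measures as independent
print(pearsonr(df["neural"], df["behavior"]))

# LME: fixed effects for neural measure and condition, random subject intercept
lme = smf.mixedlm("behavior ~ neural + condition", df,
                  groups=df["subject"]).fit()
print(lme.summary())
```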