Sample records for advanced statistical techniques

  1. Statistical Tests of Reliability of NDE

    NASA Technical Reports Server (NTRS)

    Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.

    1987-01-01

    Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.

  2. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinions and statistical programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  3. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
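
    A minimal sketch of the kind of detection-performance quantification this record describes: an empirical probability-of-detection versus probability-of-false-alarm curve computed from a scalar detection statistic. The feature values, sample sizes, and threshold sweep below are illustrative assumptions, not the authors' actual vibration features.

    ```python
    import numpy as np

    def pd_vs_pfa(healthy_scores, faulty_scores, thresholds=None):
        """Empirical probability of detection (Pd) versus probability of
        false alarm (Pfa) for a scalar detection statistic.

        healthy_scores: statistic values from baseline (no-fault) runs
        faulty_scores:  statistic values from seeded-fault runs
        """
        healthy = np.asarray(healthy_scores, dtype=float)
        faulty = np.asarray(faulty_scores, dtype=float)
        if thresholds is None:
            thresholds = np.sort(np.concatenate([healthy, faulty]))
        pfa = np.array([(healthy > t).mean() for t in thresholds])  # false alarms on healthy data
        pd_ = np.array([(faulty > t).mean() for t in thresholds])   # detections on faulty data
        return pfa, pd_

    # Small-sample illustration with synthetic vibration features
    rng = np.random.default_rng(0)
    healthy = rng.normal(0.0, 1.0, size=30)   # e.g., an RMS-level feature, baseline machine
    faulty = rng.normal(1.5, 1.0, size=30)    # the same feature with a seeded fault
    pfa, pd_ = pd_vs_pfa(healthy, faulty)
    print(f"Pd at Pfa <= 0.1: {pd_[pfa <= 0.1].max():.2f}")
    ```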

  4. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    USDA-ARS's Scientific Manuscript database

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  5. Logo image clustering based on advanced statistics

    NASA Astrophysics Data System (ADS)

    Wei, Yi; Kamel, Mohamed; He, Yiwei

    2007-11-01

    In recent years, there has been a growing interest in research on image content description techniques. Among those, image clustering is one of the most frequently discussed topics. Similar to image recognition, image clustering is also a high-level representation technique. However, it focuses on coarse categorization rather than accurate recognition. Based on wavelet transform (WT) and advanced statistics, the authors propose a novel approach that divides variously shaped logo images into groups according to the external boundary of each logo image. Experimental results show that the presented method is accurate, fast and insensitive to defects.

  6. Using Statistical Natural Language Processing for Understanding Complex Responses to Free-Response Tasks

    ERIC Educational Resources Information Center

    DeMark, Sarah F.; Behrens, John T.

    2004-01-01

    Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…

  7. Resampling: A Marriage of Computers and Statistics. ERIC/TM Digest.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Shafer, Mary Morello

    Advances in computer technology are making it possible for educational researchers to use simpler statistical methods to address a wide range of questions with smaller data sets and fewer, and less restrictive, assumptions. This digest introduces computationally intensive statistics, collectively called resampling techniques. Resampling is a…
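
    As an illustration of the resampling techniques the digest introduces, here is a minimal percentile-bootstrap sketch; the data values and the choice of the mean as the statistic are hypothetical.

    ```python
    import numpy as np

    def bootstrap_ci(data, stat=np.mean, n_boot=10000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for a statistic of a small sample."""
        rng = np.random.default_rng(seed)
        data = np.asarray(data, dtype=float)
        boot = np.array([stat(rng.choice(data, size=data.size, replace=True))
                         for _ in range(n_boot)])            # resample with replacement
        lo, hi = np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return lo, hi

    scores = [72, 85, 91, 64, 78, 88, 69, 95, 81, 77]   # hypothetical small class sample
    print(bootstrap_ci(scores))                          # approximate 95% CI for the mean score
    ```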

  8. COLLABORATIVE RESEARCH: USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  9. Diffusion-weighted imaging and demyelinating diseases: new aspects of an old advanced sequence.

    PubMed

    Rueda-Lopes, Fernanda C; Hygino da Cruz, Luiz C; Doring, Thomas M; Gasparetto, Emerson L

    2014-01-01

    The purpose of this article is to discuss classic applications in diffusion-weighted imaging (DWI) in demyelinating disease and progression of DWI in the near future. DWI is an advanced technique used in the follow-up of demyelinating disease patients, focusing on the diagnosis of a new lesion before contrast enhancement. With technical advances, diffusion-tensor imaging; new postprocessing techniques, such as tract-based spatial statistics; new ways of calculating diffusion, such as kurtosis; and new applications for DWI and its spectrum are about to arise.

  10. Application of scanning acoustic microscopy to advanced structural ceramics

    NASA Technical Reports Server (NTRS)

    Vary, Alex; Klima, Stanley J.

    1987-01-01

    A review is presented of research investigations of several acoustic microscopy techniques for application to structural ceramics for advanced heat engines. Results obtained with scanning acoustic microscopy (SAM), scanning laser acoustic microscopy (SLAM), scanning electron acoustic microscopy (SEAM), and photoacoustic microscopy (PAM) are compared. The techniques were evaluated on research samples of green and sintered monolithic silicon nitrides and silicon carbides in the form of modulus-of-rupture bars containing deliberately introduced flaws. Strengths and limitations of the techniques are described with emphasis on statistics of detectability of flaws that constitute potential fracture origins.

  11. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  12. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  13. Process optimization using combinatorial design principles: parallel synthesis and design of experiment methods.

    PubMed

    Gooding, Owen W

    2004-06-01

    The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.

  14. Recent advances in lossless coding techniques

    NASA Astrophysics Data System (ADS)

    Yovanof, Gregory S.

    Current lossless techniques are reviewed with reference to both sequential data files and still images. Two major groups of sequential algorithms, dictionary and statistical techniques, are discussed. In particular, attention is given to Lempel-Ziv coding, Huffman coding, and arithmetic coding. The subject of lossless compression of imagery is briefly discussed. Finally, examples of practical implementations of lossless algorithms and some simulation results are given.
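
    Huffman coding, one of the statistical techniques named above, can be sketched in a few lines. The following is a minimal illustrative implementation, not code from the review.

    ```python
    import heapq
    from collections import Counter

    def huffman_code(text):
        """Build a prefix code from symbol frequencies (minimal Huffman sketch)."""
        freq = Counter(text)
        if len(freq) == 1:                              # degenerate single-symbol input
            return {next(iter(freq)): "0"}
        # Each heap entry: (weight, tie-breaker, {symbol: code-so-far})
        heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)             # two lowest-weight subtrees
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + code for s, code in c1.items()}
            merged.update({s: "1" + code for s, code in c2.items()})
            heapq.heappush(heap, (w1 + w2, tie, merged))
            tie += 1
        return heap[0][2]

    codes = huffman_code("abracadabra")
    encoded = "".join(codes[s] for s in "abracadabra")
    print(codes, len(encoded), "bits")
    ```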

  15. Large ensemble modeling of the last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert

    2016-05-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ˜ 20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
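
    A rough sketch of the simpler of the two analysis methods, averaging weighted by the aggregate misfit score, is given below; the inverse-misfit weighting and all numbers are illustrative assumptions, and the Bayesian emulation/MCMC analysis is not shown.

    ```python
    import numpy as np

    def score_weighted_envelope(esl_contributions, misfit_scores, quantiles=(0.05, 0.5, 0.95)):
        """Weighted percentiles of equivalent sea-level rise across ensemble runs.

        esl_contributions: per-run equivalent sea-level rise (m)
        misfit_scores:     per-run aggregate model-data misfit (lower = better fit)
        The inverse-misfit weighting here is a simple illustrative choice; the
        paper's actual scoring scheme may differ.
        """
        esl = np.asarray(esl_contributions, dtype=float)
        w = 1.0 / (np.asarray(misfit_scores, dtype=float) + 1e-12)
        w /= w.sum()
        order = np.argsort(esl)
        cdf = np.cumsum(w[order])                       # weighted empirical CDF
        return [esl[order][np.searchsorted(cdf, q)] for q in quantiles]

    # Toy ensemble: 625 runs with synthetic misfits and sea-level contributions
    rng = np.random.default_rng(1)
    esl = rng.normal(3.3, 0.8, size=625)
    misfit = rng.gamma(2.0, 1.0, size=625)
    print(score_weighted_envelope(esl, misfit))
    ```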

  16. ADP of multispectral scanner data for land use mapping

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1971-01-01

    The advantages and disadvantages of various remote sensing instrumentation and analysis techniques are reviewed. The use of multispectral scanner data and the automatic data processing techniques are considered. A computer-aided analysis system for remote sensor data is described with emphasis on the image display, statistics processor, wavelength band selection, classification processor, and results display. Advanced techniques in using spectral and temporal data are also considered.

  17. Approaching Big Survey Data One Byte at a Time

    ERIC Educational Resources Information Center

    Blaich, Charles; Wise, Kathleen

    2017-01-01

    This chapter asserts that data are more likely to improve learning when assessment focuses on sensemaking conversations among students, faculty, and student affairs administrators, rather than on advanced statistical techniques.

  18. A Randomized Comparative Study of Two Techniques to Optimize the Root Coverage Using a Porcine Collagen Matrix.

    PubMed

    Reino, Danilo Maeda; Maia, Luciana Prado; Fernandes, Patrícia Garani; Souza, Sergio Luis Scombatti de; Taba Junior, Mario; Palioto, Daniela Bazan; Grisi, Marcio Fermandes de Moraes; Novaes, Arthur Belém

    2015-10-01

    The aim of this randomized controlled clinical study was to compare the extended flap technique (EFT) with the coronally advanced flap technique (CAF) using a porcine collagen matrix (PCM) for root coverage. Twenty patients with two bilateral gingival recessions, Miller class I or II on non-molar teeth were treated with CAF+PCM (control group) or EFT+PCM (test group). Clinical measurements of probing pocket depth (PPD), clinical attachment level (CAL), recession height (RH), keratinized tissue height (KTH), keratinized mucosa thickness (KMT) were determined at baseline, 3 and 6 months post-surgery. At 6 months, the mean root coverage for the test group was 81.89%, and for the control group it was 62.80% (p<0.01). The change of recession depth from baseline was statistically significant between test and control groups, with a mean of 2.21 mm gained at the control sites and 2.84 mm gained at the test sites (p=0.02). There were no statistically significant differences for KTH, PPD or CAL comparing the two therapies. The extended flap technique presented better root coverage than the coronally advanced flap technique when PCM was used.

  19. Factors related to student performance in statistics courses in Lebanon

    NASA Astrophysics Data System (ADS)

    Naccache, Hiba Salim

    The purpose of the present study was to identify factors that may contribute to business students in Lebanese universities having difficulty in introductory and advanced statistics courses. Two statistics courses are required for business majors at Lebanese universities. Students are not obliged to be enrolled in any math courses prior to taking statistics courses. Drawing on recent educational research, this dissertation attempted to identify the relationship between (1) students’ scores on Lebanese university math admissions tests; (2) students’ scores on a test of very basic mathematical concepts; (3) students’ scores on the survey of attitude toward statistics (SATS); (4) course performance as measured by students’ final scores in the course; and (5) their scores on the final exam. Data were collected from 561 students enrolled in multiple sections of two courses: 307 students in the introductory statistics course and 260 in the advanced statistics course in seven campuses across Lebanon over one semester. The multiple regression results revealed four significant relationships at the introductory level: between students’ scores on the math quiz with their (1) final exam scores; (2) their final averages; (3) the Cognitive subscale of the SATS with their final exam scores; and (4) their final averages. These four significant relationships were also found at the advanced level. In addition, two more significant relationships were found between students’ final average and the two subscales of Effort (5) and Affect (6). No relationship was found between students’ scores on the admission math tests and both their final exam scores and their final averages in both the introductory and advanced level courses. Although these results were consistent across course formats and instructors, they may encourage Lebanese universities to assess the effectiveness of prerequisite math courses. Moreover, these findings may lead the Lebanese Ministry of Education to make changes to the admissions exams, course prerequisites, and course content. Finally, to enhance the attitude of students, new learning techniques, such as group work during class meetings, can be helpful, and future research should aim to test the effectiveness of these pedagogical techniques on students’ attitudes toward statistics.

  20. The Truth, the Whole Truth, and Nothing but the Ground-Truth: Methods to Advance Environmental Justice and Researcher-Community Partnerships

    ERIC Educational Resources Information Center

    Sadd, James; Morello-Frosch, Rachel; Pastor, Manuel; Matsuoka, Martha; Prichard, Michele; Carter, Vanessa

    2014-01-01

    Environmental justice advocates often argue that environmental hazards and their health effects vary by neighborhood, income, and race. To assess these patterns and advance preventive policy, their colleagues in the research world often use complex and methodologically sophisticated statistical and geospatial techniques. One way to bridge the gap…

  1. Hypothesis Testing, "p" Values, Confidence Intervals, Measures of Effect Size, and Bayesian Methods in Light of Modern Robust Techniques

    ERIC Educational Resources Information Center

    Wilcox, Rand R.; Serang, Sarfaraz

    2017-01-01

    The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…

  2. Statistical methods used in the public health literature and implications for training of public health professionals

    PubMed Central

    Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.

    2017-01-01

    Statistical literacy and knowledge is needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals. PMID:28591190

  3. Statistical methods used in the public health literature and implications for training of public health professionals.

    PubMed

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge is needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals.

  4. A comparative evaluation of subepithelial connective tissue graft (SCTG) versus platelet concentrate graft (PCG) in the treatment of gingival recession using coronally advanced flap technique: A 12-month study

    PubMed Central

    Kumar, G. Naveen Vital; Murthy, K. Raja Venkatesh

    2013-01-01

    Objective: The objective of this study was to clinically evaluate and compare the efficacy of platelet concentrate graft (PCG) with that of subepithelial connective tissue graft (SCTG) using a coronally advanced flap technique in the treatment of gingival recession. Materials and Methods: Twelve patients with a total of 24 gingival recession defects were selected and randomly assigned either to experimental site-A (SCTG) or experimental site-B (PCG). The clinical parameters were recorded at baseline up to 12 months post-operatively and compared. Results: The mean vertical recession depth (VRD) statistically significantly decreased from 2.50 ± 0.48 mm to 0.54 ± 0.50 mm with PCG and from 2.75 ± 0.58 mm to 0.54 ± 0.45 mm with SCTG at 12 months. No statistically significant differences between the treatments were found for VRD and clinical attachment level (CAL), while keratinized tissue width (KTW) gain was statistically significant. Conclusion: Both the SCTG and the PCG group resulted in a significant amount of root coverage. The PCG technique was less invasive and required minimal time and clinical maneuver. It resulted in superior aesthetic outcome and lower post-surgical discomfort at the 12 months follow-up. PMID:24554889

  5. Selected Bibliography on Optimizing Techniques in Statistics

    DTIC Science & Technology

    1981-08-01

    Many problems in business, industry and government are formulated as optimization problems, and topics in optimization constitute an essential area of study in statistics. The optimizing techniques considered include numerical, mathematical programming, and variational approaches. We provide pertinent references with statistical applications in the above areas in Part I.

  6. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. Especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
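
    As one concrete instance of the exploratory procedures this review covers, here is a minimal principal-component ordination of a samples-by-taxa abundance table; the log transform and the toy counts are illustrative assumptions, not examples from the review.

    ```python
    import numpy as np

    def pca_scores(abundance, n_components=2):
        """Principal component scores for a samples x taxa abundance matrix
        (a minimal exploratory ordination sketch)."""
        X = np.asarray(abundance, dtype=float)
        X = np.log1p(X)                       # simple variance-stabilizing transform
        X = X - X.mean(axis=0)                # center each taxon
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        scores = U[:, :n_components] * S[:n_components]
        explained = (S**2 / (S**2).sum())[:n_components]
        return scores, explained

    # Hypothetical counts: 6 samples (rows) x 4 taxa (columns)
    counts = np.array([[120,  3, 40,  0],
                       [ 90,  5, 55,  2],
                       [ 10, 80,  5, 60],
                       [  8, 95,  2, 70],
                       [ 60, 30, 25, 20],
                       [ 70, 25, 30, 15]])
    scores, expl = pca_scores(counts)
    print(scores.round(2), expl.round(2))
    ```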

  7. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately a reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.

  8. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  9. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    PubMed Central

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables between the left and right lower limbs; that is, discriminating small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs when compared with the traditional symmetry index method for gait. The proposed algorithm would become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672
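
    A hedged sketch of the general idea follows, assuming scikit-learn and synthetic left/right kinetic features rather than the authors' force-platform data: the cross-validated separability of the two limbs' feature distributions serves as the symmetry measure.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for kinetic gait features (e.g., peak vertical ground
    # reaction force and loading rate) extracted per step for each limb.
    rng = np.random.default_rng(0)
    left = rng.normal([700.0, 8.0], [30.0, 1.0], size=(50, 2))
    right = rng.normal([720.0, 8.5], [30.0, 1.0], size=(50, 2))   # slight asymmetry

    X = np.vstack([left, right])
    y = np.array([0] * 50 + [1] * 50)          # 0 = left limb, 1 = right limb

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    acc = cross_val_score(clf, X, y, cv=5).mean()

    # Accuracy near 0.5 suggests the limbs are statistically indistinguishable
    # (symmetric gait); accuracy well above 0.5 flags a detectable asymmetry.
    print(f"cross-validated left/right separability: {acc:.2f}")
    ```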

  10. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    PubMed

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables between the left and right lower limbs; that is, discriminating small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs when compared with the traditional symmetry index method for gait. The proposed algorithm would become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis.

  11. Improvements in approaches to forecasting and evaluation techniques

    NASA Astrophysics Data System (ADS)

    Weatherhead, Elizabeth

    2014-05-01

    The US is embarking on an experiment to make significant and sustained improvements in weather forecasting. The effort stems from a series of community conversations that recognized the rapid advancements in observations, modeling and computing techniques in the academic, governmental and private sectors. The new directions and initial efforts will be summarized, including information on possibilities for international collaboration. Most new projects are scheduled to start in the last half of 2014. Several advancements include ensemble forecasting with global models, and new sharing of computing resources. Newly developed techniques for evaluating weather forecast models will be presented in detail. The approaches use statistical techniques that incorporate pair-wise comparisons of forecasts with observations and account for daily auto-correlation to assess appropriate uncertainty in forecast changes. Some of the new projects allow for international collaboration, particularly on the research components of the projects.
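
    A minimal sketch of a pair-wise forecast comparison that inflates the uncertainty for daily autocorrelation is shown below, assuming an AR(1)-style effective-sample-size correction; the data and noise levels are invented for illustration, and the operational evaluation techniques may differ in detail.

    ```python
    import numpy as np

    def paired_forecast_comparison(err_a, err_b):
        """Mean difference in absolute forecast error between two model versions,
        with a standard error inflated for lag-1 autocorrelation of the daily
        paired differences (AR(1) effective-sample-size correction)."""
        d = np.abs(np.asarray(err_a, dtype=float)) - np.abs(np.asarray(err_b, dtype=float))
        n = d.size
        d0 = d - d.mean()
        rho1 = np.sum(d0[:-1] * d0[1:]) / np.sum(d0**2)   # lag-1 autocorrelation
        n_eff = n * (1 - rho1) / (1 + rho1)               # effective sample size
        se = d.std(ddof=1) / np.sqrt(max(n_eff, 2.0))
        return d.mean(), se

    # Toy example: a year of daily temperature forecast errors (K) from two model versions
    rng = np.random.default_rng(2)
    shared = rng.normal(0.0, 1.0, 365)                     # weather-driven, correlated part
    err_old = shared + rng.normal(0.0, 0.5, 365) + 0.3
    err_new = shared + rng.normal(0.0, 0.5, 365)
    mean_diff, se = paired_forecast_comparison(err_old, err_new)
    print(f"old minus new |error|: {mean_diff:.2f} +/- {2 * se:.2f}")
    ```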

  12. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    NASA Astrophysics Data System (ADS)

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.

    2018-01-01

    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  13. Large ensemble modeling of last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.

    2015-11-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.

  14. Advances in Machine Learning and Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.

    2012-03-01

    Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.

  15. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.

  16. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  17. Assessing statistical differences between parameters estimates in Partial Least Squares path modeling.

    PubMed

    Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten

    2018-01-01

    Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as parametric and non-parametric approaches in PLS multi-group analysis only allow one to assess differences between parameters that are estimated for different subpopulations, the study at hand introduces a technique that also allows one to assess whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer in particular to a reduced version of the well-established technology acceptance model.
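
    The record concerns PLS-SEM specifically; as a generic, hedged illustration of testing whether two parameters estimated from the same sample differ, the sketch below bootstraps the difference between two standardized regression coefficients over resampled cases. The data are synthetic and the setup is not the authors' PLS-SEM procedure.

    ```python
    import numpy as np

    def bootstrap_coef_difference(X, y, i, j, n_boot=2000, seed=0):
        """Percentile bootstrap CI for the difference between two standardized
        regression coefficients estimated from the SAME sample."""
        rng = np.random.default_rng(seed)
        X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize predictors
        y = (y - y.mean()) / y.std()                      # standardize outcome
        n = y.size
        diffs = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                   # resample cases with replacement
            beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
            diffs.append(beta[i] - beta[j])
        return np.percentile(diffs, [2.5, 97.5])

    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 2))
    y = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 1, 200)
    print(bootstrap_coef_difference(X, y, 0, 1))          # CI excluding 0 => estimates differ
    ```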

  18. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida.

    DOT National Transportation Integrated Search

    2014-03-01

    Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway : safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) : and SafetyAnalys...

  19. A Fishy Problem for Advanced Students

    ERIC Educational Resources Information Center

    Patterson, Richard A.

    1977-01-01

    While developing a research course for gifted high school students, improvements were made in a local pond. Students worked for a semester learning research techniques, statistical analysis, and limnology. At the end of the course, the three students produced a joint scientific paper detailing their study of the pond. (MA)

  20. Statistics of Crack Growth in Engine Materials. Volume 2. Spectrum Loading and advanced Techniques

    DTIC Science & Technology

    1984-02-01


  1. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  2. Privacy enhancing techniques - the key to secure communication and management of clinical and genomic data.

    PubMed

    De Moor, G J E; Claerhout, B; De Meyer, F

    2003-01-01

    To introduce some of the privacy protection problems related to genomics based medicine and to highlight the relevance of Trusted Third Parties (TTPs) and of Privacy Enhancing Techniques (PETs) in the restricted context of clinical research and statistics. Practical approaches based on two different pseudonymisation models, both for batch and interactive data collection and exchange, are described and analysed. The growing need of managing both clinical and genetic data raises important legal and ethical challenges. Protecting human rights in the realm of privacy, while optimising research potential and other statistical activities is a challenge that can easily be overcome with the assistance of a trust service provider offering advanced privacy enabling/enhancing solutions. As such, the use of pseudonymisation and other innovative Privacy Enhancing Techniques can unlock valuable data sources.
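
    A minimal sketch of one pseudonymisation building block, a keyed one-way hash of an identifier, is given below; the key name and identifier are hypothetical, and a real TTP-based deployment involves much more (key management, linkage policies, re-identification safeguards).

    ```python
    import hashlib
    import hmac

    def pseudonymise(identifier, secret_key):
        """Derive a stable pseudonym from a patient identifier with a keyed hash.
        The same identifier and key always yield the same pseudonym, so records
        can be linked without exposing the original identity."""
        return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

    key = b"held-by-the-trusted-third-party"   # hypothetical key, never stored with the data
    print(pseudonymise("patient-0042", key))
    print(pseudonymise("patient-0042", key))   # same input, same pseudonym: linkable records
    ```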

  3. Terminology, concepts, and models in genetic epidemiology.

    PubMed

    Teare, M Dawn; Koref, Mauro F Santibàñez

    2011-01-01

    Genetic epidemiology brings together approaches and techniques developed in mathematical genetics and statistics, medical genetics, quantitative genetics, and epidemiology. In the 1980s, the focus was on the mapping and identification of genes where defects had large effects at the individual level. More recently, statistical and experimental advances have made it possible to identify and characterise genes associated with small effects at the individual level. In this chapter, we provide a brief outline of the models, concepts, and terminology used in genetic epidemiology.

  4. From inverse problems to learning: a Statistical Mechanics approach

    NASA Astrophysics Data System (ADS)

    Baldassi, Carlo; Gerace, Federica; Saglietti, Luca; Zecchina, Riccardo

    2018-01-01

    We present a brief introduction to the statistical mechanics approaches for the study of inverse problems in data science. We then provide concrete new results on inferring couplings from sampled configurations in systems characterized by an extensive number of stable attractors in the low temperature regime. We also show how these results are connected to the problem of learning with realistic weak signals in computational neuroscience. Our techniques and algorithms rely on advanced mean-field methods developed in the context of disordered systems.

  5. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    NASA Astrophysics Data System (ADS)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars, by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  6. Diagnostic accuracy of three biopsy techniques in 117 dogs with intra-nasal neoplasia.

    PubMed

    Harris, B J; Lourenço, B N; Dobson, J M; Herrtage, M E

    2014-04-01

    To determine if nasal biopsies taken at rhinoscopy are more accurate for diagnosing neoplasia than biopsies taken blindly or using advanced imaging for guidance. A retrospective study of 117 dogs with nasal mass lesions that were divided into three groups according to the method of nasal biopsy collection; advanced imaging-guided, rhinoscopy-guided and blind biopsy. Signalment, imaging and rhinoscopic findings, and histopathological diagnosis were compared between groups. The proportion of first attempt biopsies confirming neoplasia were determined for each group. There were no statistically significant differences in the proportion of biopsies that confirmed neoplasia obtained via advanced imaging-guided, rhinoscopy-guided or blind biopsy techniques. In dogs with a high index of suspicion of nasal neoplasia, blind biopsy may be as diagnostic as rhinoscopy-guided biopsy. Repeated biopsies are frequently required for definitive diagnosis. © 2014 British Small Animal Veterinary Association.

  7. Guidelines for collecting and maintaining archives for genetic monitoring

    Treesearch

    Jennifer A. Jackson; Linda Laikre; C. Scott Baker; Katherine C. Kendall; F. W. Allendorf; M. K. Schwartz

    2011-01-01

    Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time...

  8. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Results of a joint NOAA/NASA sounder simulation study

    NASA Technical Reports Server (NTRS)

    Phillips, N.; Susskind, Joel; Mcmillin, L.

    1988-01-01

    This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.

  10. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for non-Gaussian problems with error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distribution as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.

  11. Tenon advancement and duplication technique to prevent postoperative Ahmed valve tube exposure in patients with refractory glaucoma.

    PubMed

    Tamcelik, Nevbahar; Ozkok, Ahmet; Sarıcı, Ahmet Murat; Atalay, Eray; Yetik, Huseyin; Gungor, Kivanc

    2013-07-01

    To present and compare the long-term results of Dr. Tamcelik's previously described technique of Tenon advancement and duplication with the conventional Ahmed glaucoma valve (AGV) implantation technique in patients with refractory glaucoma. This study was a multicenter, retrospective case series that included 303 eyes of 276 patients with refractory glaucoma who underwent glaucoma valve implantation surgery. The patients were divided into three groups according to the surgical technique applied and the outcomes compared. In group 1, 96 eyes of 86 patients underwent AGV implant surgery without patch graft; in group 2, 78 eyes of 72 patients underwent AGV implant surgery with donor scleral patch; in group 3, 129 eyes of 118 patients underwent Ahmed valve implant surgery with "combined short scleral tunnel with Tenon advancement and duplication technique". The endpoint assessed was tube exposure through the conjunctiva. In group 1, conjunctival tube exposure was seen in 11 eyes (12.9 %) after a mean 9.2 ± 3.7 years of follow-up. In group 2, conjunctival tube exposure was seen in six eyes (2.2 %) after a mean 8.9 ± 3.3 years of follow-up. In group 3, there was no conjunctival exposure after a mean 7.8 ± 2.8 years of follow-up. The difference between the groups was statistically significant (P = 0.0001, Chi-square test). This novel surgical technique combining a short scleral tunnel with Tenon advancement and duplication was found to be effective and safe in preventing conjunctival tube exposure after AGV implantation surgery in patients with refractory glaucoma.

  12. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation caused by the increase of harmful air pollutants in the ambient atmosphere. In this study, different statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such an approach, and they are an inefficient route to improved forecasts. With the advancement of technology and research, an alternative to these traditional methods has been proposed, namely coupling statistical techniques with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement with observations than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over an urban area.
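
    The verification measures named in this abstract have standard, though not fully universal, definitions; here is a small sketch assuming the common formula conventions and hypothetical PM10 values.

    ```python
    import numpy as np

    def verification_stats(obs, pred):
        """Common air-quality verification measures; formula conventions
        (e.g., the sign of FB) vary somewhat between papers."""
        o, p = np.asarray(obs, dtype=float), np.asarray(pred, dtype=float)
        r = np.corrcoef(o, p)[0, 1]                               # correlation coefficient
        nmse = np.mean((o - p) ** 2) / (o.mean() * p.mean())      # normalized mean square error
        fb = 2.0 * (o.mean() - p.mean()) / (o.mean() + p.mean())  # fractional bias
        ioa = 1.0 - np.sum((p - o) ** 2) / np.sum(
            (np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)   # index of agreement
        return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}

    # Hypothetical daily PM10 concentrations (ug/m3): observed vs forecast
    obs = [120, 140, 95, 180, 160, 130, 110]
    pred = [110, 150, 100, 165, 170, 125, 120]
    print({k: round(v, 3) for k, v in verification_stats(obs, pred).items()})
    ```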

  13. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    PubMed

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not clearly stated, and in these circumstances a judgment was made as to the sampling method employed, based on the indications given by the author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
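    A minimal sketch of the randomization-test alternative advocated above: instead of relying on random-sampling assumptions, the group labels are repeatedly re-randomized and the observed difference is compared with its permutation distribution (the scores below are invented for illustration).

```python
# Permutation (randomization) test for a difference in group means.
import numpy as np

rng = np.random.default_rng(1)
group_a = np.array([4.1, 5.0, 6.2, 5.5, 4.8])   # illustrative scores
group_b = np.array([5.9, 6.4, 7.1, 6.0, 6.8])

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

n_perm, count = 10_000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)                          # re-randomize the group labels
    diff = pooled[len(group_a):].mean() - pooled[:len(group_a)].mean()
    if abs(diff) >= abs(observed):
        count += 1

print("permutation p-value:", (count + 1) / (n_perm + 1))
```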

  14. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond the classical meteorological forecasts has increased in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters and state-of-the-art numerical analysis tools.
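    The bias-elimination role of the Kalman filter mentioned above can be illustrated with a minimal scalar filter that tracks the systematic error of a forecast as a random walk; the noise settings and data are assumptions for the sketch, not the configuration used in the study.

```python
# Scalar Kalman filter tracking the systematic bias of a forecast from its errors.
import numpy as np

def kalman_bias_filter(errors, q=0.01, r=1.0):
    """Track forecast bias x from observed errors (obs - model), random-walk state."""
    x, p = 0.0, 1.0                 # initial bias estimate and its variance
    estimates = []
    for e in errors:
        p = p + q                   # predict step: bias evolves as a random walk
        k = p / (p + r)             # Kalman gain
        x = x + k * (e - x)         # update with the latest forecast error
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(2)
true_bias = 1.5
errors = true_bias + rng.normal(scale=1.0, size=200)    # synthetic daily errors
bias_track = kalman_bias_filter(errors)
print("final bias estimate:", round(float(bias_track[-1]), 2))  # close to 1.5
```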

  15. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  16. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  17. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  18. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings regarding process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
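    As a hedged illustration of the Monte Carlo simulation and prediction idea described above, the sketch below propagates uncertainty in assumed input drivers through a hypothetical weighted score model to obtain a distribution of predicted index scores; the weights and distributions are invented and are not the ACSI formulation.

```python
# Monte Carlo propagation of input uncertainty through a hypothetical index model.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
quality      = rng.normal(80, 5, n)    # assumed driver distributions (illustrative)
expectations = rng.normal(75, 4, n)
value        = rng.normal(78, 6, n)

# Hypothetical weighted score model, not the actual ACSI formula.
score = 0.5 * quality + 0.2 * expectations + 0.3 * value

print("mean predicted score:", round(float(score.mean()), 1))
print("5th-95th percentile:", np.percentile(score, [5, 95]).round(1))
```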

  19. An exploratory investigation of weight estimation techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Cook, E. L.

    1981-01-01

    The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.

  20. Advances in molecular labeling, high throughput imaging and machine intelligence portend powerful functional cellular biochemistry tools.

    PubMed

    Price, Jeffrey H; Goodacre, Angela; Hahn, Klaus; Hodgson, Louis; Hunter, Edward A; Krajewski, Stanislaw; Murphy, Robert F; Rabinovich, Andrew; Reed, John C; Heynen, Susanne

    2002-01-01

    Cellular behavior is complex. Successfully understanding systems at ever-increasing complexity is fundamental to advances in modern science and unraveling the functional details of cellular behavior is no exception. We present a collection of perspectives to provide a glimpse of the techniques that will aid in collecting, managing and utilizing information on complex cellular processes via molecular imaging tools. These include: 1) visualizing intracellular protein activity with fluorescent markers, 2) high throughput (and automated) imaging of multilabeled cells in statistically significant numbers, and 3) machine intelligence to analyze subcellular image localization and pattern. Although not addressed here, the importance of combining cell-image-based information with detailed molecular structure and ligand-receptor binding models cannot be overlooked. Advanced molecular imaging techniques have the potential to impact cellular diagnostics for cancer screening, clinical correlations of tissue molecular patterns for cancer biology, and cellular molecular interactions for accelerating drug discovery. The goal of finally understanding all cellular components and behaviors will be achieved by advances in both instrumentation engineering (software and hardware) and molecular biochemistry. Copyright 2002 Wiley-Liss, Inc.

  1. Genetics and child psychiatry: I Advances in quantitative and molecular genetics.

    PubMed

    Rutter, M; Silberg, J; O'Connor, T; Simonoff, E

    1999-01-01

    Advances in quantitative psychiatric genetics as a whole are reviewed with respect to conceptual and methodological issues in relation to statistical model fitting, new genetic designs, twin and adoptee studies, definition of the phenotype, pervasiveness of genetic influences, pervasiveness of environmental influences, shared and nonshared environmental effects, and nature-nurture interplay. Advances in molecular genetics are discussed in relation to the shifts in research strategies to investigate multifactorial disorders (affected relative linkage designs, association strategies, and quantitative trait loci studies); new techniques and identified genetic mechanisms (expansion of trinucleotide repeats, genomic imprinting, mitochondrial DNA, fluorescent in-situ hybridisation, behavioural phenotypes, and animal models); and the successful localisation of genes.

  2. The use of adaptive statistical iterative reconstruction (ASiR) technique in evaluation of patients with cervical spine trauma: impact on radiation dose reduction and image quality

    PubMed Central

    Sheikh, Adnan

    2016-01-01

    Objective: The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. Methods: We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. Results: We found that the ASiR technique was able to reduce the volume CT dose index, dose–length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. Conclusion: The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. Advances in knowledge: The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions. PMID:26882825

  3. Pathogenesis-based treatments in primary Sjogren's syndrome using artificial intelligence and advanced machine learning techniques: a systematic literature review.

    PubMed

    Foulquier, Nathan; Redou, Pascal; Le Gal, Christophe; Rouvière, Bénédicte; Pers, Jacques-Olivier; Saraux, Alain

    2018-05-17

    Big data analysis has become a common way to extract information from complex and large datasets across most scientific domains. This approach is now used to study large cohorts of patients in medicine. This work is a review of publications that have used artificial intelligence and advanced machine learning techniques to study physiopathogenesis-based treatments in primary Sjogren's syndrome (pSS). A systematic literature review retrieved all articles reporting on the use of advanced statistical analysis applied to the study of systemic autoimmune diseases (SADs) over the last decade. An automatic bibliography screening method has been developed to perform this task. The program, called BIBOT, was designed to fetch and analyze articles from the PubMed database using a list of keywords and Natural Language Processing approaches. The evolution of trends in statistical approaches, sizes of cohorts and number of publications over this period were also computed in the process. In all, 44077 abstracts were screened and 1017 publications were analyzed. The mean number of selected articles was 101.0 (S.D. 19.16) per year, but it increased significantly over time (from 74 articles in 2008 to 138 in 2017). Among them, only 12 focused on pSS, but none of them emphasized pathogenesis-based treatments. To conclude, medicine is progressively entering the era of big data analysis and artificial intelligence, but these approaches are not yet used to describe pSS-specific pathogenesis-based treatment. Nevertheless, large multicentre studies are investigating this aspect with advanced algorithmic tools on large cohorts of SAD patients.

  4. Treatment of multiple adjacent Miller Class I and II gingival recessions with collagen matrix and the modified coronally advanced tunnel technique.

    PubMed

    Molnár, Bálint; Aroca, Sofia; Keglevich, Tibor; Gera, István; Windisch, Péter; Stavropoulos, Andreas; Sculean, Anton

    2013-01-01

    To clinically evaluate the treatment of Miller Class I and II multiple adjacent gingival recessions using the modified coronally advanced tunnel technique combined with a newly developed bioresorbable collagen matrix of porcine origin. Eight healthy patients exhibiting at least three multiple Miller Class I and II multiple adjacent gingival recessions (a total of 42 recessions) were consecutively treated by means of the modified coronally advanced tunnel technique and collagen matrix. The following clinical parameters were assessed at baseline and 12 months postoperatively: full mouth plaque score (FMPS), full mouth bleeding score (FMBS), probing depth (PD), recession depth (RD), recession width (RW), keratinized tissue thickness (KTT), and keratinized tissue width (KTW). The primary outcome variable was complete root coverage. Neither allergic reactions nor soft tissue irritations or matrix exfoliations occurred. Postoperative pain and discomfort were reported to be low, and patient acceptance was generally high. At 12 months, complete root coverage was obtained in 2 out of the 8 patients and 30 of the 42 recessions (71%). Within their limits, the present results indicate that treatment of Miller Class I and II multiple adjacent gingival recessions by means of the modified coronally advanced tunnel technique and collagen matrix may result in statistically and clinically significant complete root coverage. Further studies are warranted to evaluate the performance of collagen matrix compared with connective tissue grafts and other soft tissue grafts.

  5. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.

  6. Advanced Statistics for Exotic Animal Practitioners.

    PubMed

    Hodsoll, John; Hellier, Jennifer M; Ryan, Elizabeth G

    2017-09-01

    Correlation and regression assess the association between 2 or more variables. This article reviews the core knowledge needed to understand these analyses, moving from visual analysis in scatter plots through correlation, simple and multiple linear regression, and logistic regression. Correlation estimates the strength and direction of a relationship between 2 variables. Regression can be considered more general and quantifies the numerical relationships between an outcome and 1 or multiple variables in terms of a best-fit line, allowing predictions to be made. Each technique is discussed with examples and the statistical assumptions underlying their correct application. Copyright © 2017 Elsevier Inc. All rights reserved.
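    A small worked example of the concepts reviewed above (with synthetic data): Pearson correlation to quantify the strength and direction of an association, and a least-squares best-fit line used for prediction.

```python
# Correlation and simple linear regression on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 50)                  # predictor, e.g. body mass
y = 2.0 + 0.8 * x + rng.normal(0, 1, 50)    # outcome with noise

r = np.corrcoef(x, y)[0, 1]                 # strength/direction of association
slope, intercept = np.polyfit(x, y, 1)      # least-squares best-fit line

print(f"r = {r:.2f}, best-fit line: y = {intercept:.2f} + {slope:.2f}*x")
print("predicted y at x = 5:", round(intercept + slope * 5, 2))
```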

  7. Using 3D visualization and seismic attributes to improve structural and stratigraphic resolution of reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, J.; Jones, G.L.

    1996-01-01

    Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.

  8. Using 3D visualization and seismic attributes to improve structural and stratigraphic resolution of reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, J.; Jones, G.L.

    1996-12-31

    Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.

  9. The issue of multiple univariate comparisons in the context of neuroelectric brain mapping: an application in a neuromarketing experiment.

    PubMed

    Vecchiato, G; De Vico Fallani, F; Astolfi, L; Toppi, J; Cincotti, F; Mattia, D; Salinari, S; Babiloni, F

    2010-08-30

    This paper presents some considerations about the use of adequate statistical techniques in the framework of neuroelectromagnetic brain mapping. With the use of advanced EEG/MEG recording setups involving hundreds of sensors, the issue of protection against the type I errors that could occur during the execution of hundreds of univariate statistical tests has gained interest. In the present experiment, we investigated the EEG signals from a mannequin acting as an experimental subject. Data were collected while performing a neuromarketing experiment and analyzed with state-of-the-art computational tools adopted in the specialized literature. Results showed that electric data from the mannequin's head present statistically significant differences in power spectra during the visualization of a commercial advertisement when compared to the power spectra gathered during a documentary, when no adjustments were made to the alpha level of the multiple univariate tests performed. The use of the Bonferroni or Bonferroni-Holm adjustments correctly returned no differences between the signals gathered from the mannequin in the two experimental conditions. A partial sample of recently published literature in different neuroscience journals suggested that at least 30% of the papers do not use statistical protection against type I errors. While the occurrence of type I errors can be easily managed with appropriate statistical techniques, the use of such techniques is still not widely adopted in the literature. Copyright (c) 2010 Elsevier B.V. All rights reserved.
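    The two adjustments named above can be sketched in a few lines; the p-values below are invented, and the routines simply implement the textbook Bonferroni and Bonferroni-Holm rules.

```python
# Bonferroni and Bonferroni-Holm corrections for a set of univariate test p-values.
import numpy as np

def bonferroni(pvals, alpha=0.05):
    pvals = np.asarray(pvals)
    return pvals < alpha / len(pvals)              # True where the null is rejected

def holm(pvals, alpha=0.05):
    pvals = np.asarray(pvals)
    order = np.argsort(pvals)                      # step-down: smallest p first
    reject = np.zeros(len(pvals), dtype=bool)
    for rank, idx in enumerate(order):
        if pvals[idx] < alpha / (len(pvals) - rank):
            reject[idx] = True
        else:
            break                                  # stop at the first non-rejection
    return reject

p = [0.001, 0.012, 0.03, 0.04, 0.20, 0.55]         # illustrative p-values
print("Bonferroni:", bonferroni(p))
print("Holm:      ", holm(p))
```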

  10. Technical tips and advancements in pediatric minimally invasive surgical training on porcine based simulations.

    PubMed

    Narayanan, Sarath Kumar; Cohen, Ralph Clinton; Shun, Albert

    2014-06-01

    Minimal access techniques have transformed the way pediatric surgery is practiced. Due to various constraints, surgical residency programs have not been able to provide adequate skills training in the routine setting. The advent of new technology and methods in minimally invasive surgery (MIS) has similarly contributed to the need for systematic skills training in a safe, simulated environment. To enable the training of the proper technique among pediatric surgery trainees, we have advanced a porcine non-survival model for endoscopic surgery. The technical advancements over the past 3 years and a subjective validation of the porcine model from 114 participating trainees using a standard questionnaire and a 5-point Likert scale have been described here. Mean attitude scores and analysis of variance (ANOVA) were used for statistical analysis of the data. Almost all trainees agreed or strongly agreed that the animal-based model was appropriate (98.35%) and also acknowledged that such workshops provided adequate practical experience before attempting procedures on human subjects (96.6%). Mean attitude score for respondents was 19.08 (SD 3.4, range 4-20). Attitude scores showed no statistical association with years of experience or the level of seniority, indicating a positive attitude among all groups of respondents. Structured porcine-based MIS training should be an integral part of skill acquisition for pediatric surgery trainees, and the experience gained can be transferred into clinical practice. We advocate that laparoscopic training should begin in a controlled workshop setting before procedures are attempted on human patients.

  11. Methods for Evaluating Mammography Imaging Techniques

    DTIC Science & Technology

    1999-06-01

    This Department of Defense Breast Cancer Research Program Career Development Award is enabling Dr. Rutter to develop biostatistical methods for breast cancer research. Dr. Rutter is focusing on methods for evaluating the accuracy of breast cancer screening. This four year program includes advanced training in the epidemiology of breast cancer, training in

  12. Statistical strategy for anisotropic adventitia modelling in IVUS.

    PubMed

    Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia

    2006-06-01

    Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. The difficult definition of vessel border descriptors, as well as shades, artifacts, and the blurred signal response due to ultrasound physical properties, hampers automated adventitia segmentation. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.

  13. A Review of Recent Developments in X-Ray Diagnostics for Turbulent and Optically Dense Rocket Sprays

    NASA Technical Reports Server (NTRS)

    Radke, Christopher; Halls, Benjamin; Kastengren, Alan; Meyer, Terrence

    2017-01-01

    Highly efficient mixing and atomization of fuel and oxidizers is an important factor in many propulsion and power generation applications. To better quantify breakup and mixing in atomizing sprays, several diagnostic techniques have been developed to collect droplet information and spray statistics. Several optical techniques, such as Ballistic Imaging and SLIPI, have previously demonstrated qualitative measurements in optically dense sprays; however, these techniques have produced limited quantitative information in the near-injector region. To complement these advances, a recent wave of developments utilizing synchrotron-based x-rays has been successfully implemented, facilitating the collection of quantitative measurements in optically dense sprays.

  14. Quantification of hemoglobin and its derivatives in oral cancer diagnosis by diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Kaniyappan, Udayakumar; Gnanatheepam, Einstein; Aruna, Prakasarao; Dornadula, Koteeswaran; Ganesan, Singaravelu

    2017-02-01

    Cancer is one of the most common threats to human beings, and its incidence is increasing at an alarming rate around the globe. In recent years, due to advancements in opto-electronic technology, various optical spectroscopy techniques have emerged to assess the photophysicochemical and morphological conditions of normal and malignant tissues at the micro as well as the macroscopic scale. In this regard, diffuse reflectance spectroscopy is considered the simplest, most cost-effective and rapid technique for the diagnosis of cancerous tissues. In the present study, the hemoglobin concentration in normal and cancerous oral tissues was quantified and a subsequent statistical analysis was carried out to verify the diagnostic potential of the technique.

  15. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
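    Importance sampling, introduced above as a technique to accelerate Monte Carlo simulation of rare failure events, can be illustrated with a toy example: estimating a small tail probability of a standard normal by sampling from a shifted proposal and reweighting (the threshold and sample sizes are arbitrary).

```python
# Importance sampling vs naive Monte Carlo for a rare-event probability P(X > 4).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, threshold = 100_000, 4.0

# Naive Monte Carlo: almost no samples land in the rare region.
naive = (rng.standard_normal(n) > threshold).mean()

# Importance sampling: draw from N(threshold, 1) and weight by the likelihood ratio.
x = rng.normal(loc=threshold, scale=1.0, size=n)
weights = stats.norm.pdf(x) / stats.norm.pdf(x, loc=threshold)
is_estimate = np.mean((x > threshold) * weights)

print("naive:", naive, "importance sampling:", is_estimate,
      "exact:", stats.norm.sf(threshold))
```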

  16. Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar

    NASA Astrophysics Data System (ADS)

    Lottman, Brian Todd

    1998-09-01

    This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid state coherent Doppler lidar on a fixed ground based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators (novel) is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer simulated data. Wind field statistics are produced for actual data for a cloud deck, and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.

  17. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
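    The control-charting element described above can be sketched as a simple Shewhart chart: 3-sigma limits are estimated from an in-control baseline and later readings falling outside the limits are flagged; the temperatures and limits below are synthetic, not AGR values.

```python
# Shewhart-style control chart: flag readings outside 3-sigma limits from a baseline.
import numpy as np

rng = np.random.default_rng(6)
baseline = rng.normal(1000.0, 5.0, 200)              # in-control temperature readings
center, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma    # upper/lower control limits

new_readings = np.concatenate([rng.normal(1000.0, 5.0, 50),
                               rng.normal(970.0, 5.0, 5)])   # simulated sensor drift
out_of_control = np.where((new_readings > ucl) | (new_readings < lcl))[0]
print(f"limits: [{lcl:.1f}, {ucl:.1f}]  flagged reading indices: {out_of_control}")
```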

  18. Advanced imaging technologies increase detection of dysplasia and neoplasia in patients with Barrett's esophagus: a meta-analysis and systematic review.

    PubMed

    Qumseya, Bashar J; Wang, Haibo; Badie, Nicole; Uzomba, Rosemary N; Parasa, Sravanthi; White, Donna L; Wolfsen, Herbert; Sharma, Prateek; Wallace, Michael B

    2013-12-01

    US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE conventionally is monitored via white-light endoscopy (WLE) and a collection of random biopsy specimens. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase the detection of dysplasia and cancer. We investigated whether these imaging technologies can increase the diagnostic yield for the detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsy specimens. We performed a systematic review, using Medline and Embase, to identify relevant peer-review studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired-risk difference (RD), defined as the difference in yield of the detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired-RD and 95% confidence interval (CI) were obtained using random-effects models. Heterogeneity was assessed by means of the Q statistic and the I(2) statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or modifiers. Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%-56%; P < .0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased the diagnostic yield (RD, 0.34; 95% CI, 0.14-0.56; P < .0001). The RD for chromoendoscopy was 0.35 (95% CI, 0.13-0.56; P = .0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on Student t test analysis (P = .45). Based on a meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase the diagnostic yield for identification of dysplasia or cancer in patients with BE. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
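    A hedged sketch of the random-effects pooling used for the paired risk differences above, following the standard DerSimonian-Laird estimator; the per-study values are invented and do not reproduce the meta-analysis.

```python
# DerSimonian-Laird random-effects pooling of per-study risk differences.
import numpy as np

rd  = np.array([0.30, 0.42, 0.18, 0.35, 0.25])        # per-study risk differences (invented)
var = np.array([0.010, 0.020, 0.015, 0.008, 0.012])   # their variances (invented)

w = 1.0 / var                                         # fixed-effect weights
rd_fe = np.sum(w * rd) / np.sum(w)
q = np.sum(w * (rd - rd_fe) ** 2)                     # Cochran's Q
df = len(rd) - 1
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # between-study variance

w_re = 1.0 / (var + tau2)                             # random-effects weights
rd_re = np.sum(w_re * rd) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
i2 = max(0.0, (q - df) / q) * 100                     # heterogeneity

print(f"pooled RD = {rd_re:.2f} (95% CI {rd_re - 1.96*se_re:.2f} to "
      f"{rd_re + 1.96*se_re:.2f}), I^2 = {i2:.0f}%")
```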

  19. Advanced Bode Plot Techniques for Ultrasonic Transducers

    NASA Astrophysics Data System (ADS)

    DeAngelis, D. A.; Schulze, G. W.

    The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease-of-use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a-priori process assessment. These advanced techniques expand from the basic constant voltage versus frequency sweep to include constant current and constant velocity interrogated locally on transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects like jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used with welding transducers for semiconductor wire bonding. Several metrics are investigated such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry and coupled-field finite element analysis.

  20. Advancing solar energy forecasting through the underlying physics

    NASA Astrophysics Data System (ADS)

    Yang, H.; Ghonima, M. S.; Zhong, X.; Ozge, B.; Kurtz, B.; Wu, E.; Mejia, F. A.; Zamora, M.; Wang, G.; Clemesha, R.; Norris, J. R.; Heus, T.; Kleissl, J. P.

    2017-12-01

    As solar power comprises an increasingly large portion of the energy generation mix, the ability to accurately forecast solar photovoltaic generation becomes increasingly important. Due to the variability of solar power caused by cloud cover, knowledge of both the magnitude and timing of expected solar power production ahead of time facilitates the integration of solar power onto the electric grid by reducing electricity generation from traditional ancillary generators such as gas and oil power plants, as well as decreasing the ramping of all generators, reducing start and shutdown costs, and minimizing solar power curtailment, thereby providing annual economic value. The time scales involved in both the energy markets and solar variability range from intra-hour to several days ahead. This wide range of time horizons led to the development of a multitude of techniques, with each offering unique advantages in specific applications. For example, sky imagery provides site-specific forecasts on the minute-scale. Statistical techniques including machine learning algorithms are commonly used in the intra-day forecast horizon for regional applications, while numerical weather prediction models can provide mesoscale forecasts on both the intra-day and days-ahead time scale. This talk will provide an overview of the challenges unique to each technique and highlight the advances in their ongoing development which come alongside advances in the fundamental physics underneath.

  1. Combination of complementary data mining methods for geographical characterization of extra virgin olive oils based on mineral composition.

    PubMed

    Sayago, Ana; González-Domínguez, Raúl; Beltrán, Rafael; Fernández-Recamales, Ángeles

    2018-09-30

    This work explores the potential of multi-element fingerprinting in combination with advanced data mining strategies to assess the geographical origin of extra virgin olive oil samples. For this purpose, the concentrations of 55 elements were determined in 125 oil samples from multiple Spanish geographic areas. Several unsupervised and supervised multivariate statistical techniques were used to build classification models and investigate the relationship between mineral composition of olive oils and their provenance. Results showed that Spanish extra virgin olive oils exhibit characteristic element profiles, which can be differentiated on the basis of their origin in accordance with three geographical areas: Atlantic coast (Huelva province), Mediterranean coast and inland regions. Furthermore, statistical modelling yielded high sensitivity and specificity, principally when random forest and support vector machines were employed, thus demonstrating the utility of these techniques in food traceability and authenticity research. Copyright © 2018 Elsevier Ltd. All rights reserved.
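    A minimal sketch of the supervised classification step described above, using random forest and support vector machine models on synthetic stand-ins for the 55-element profiles; the class labels and data are illustrative only.

```python
# Random forest and SVM classification of synthetic multi-element profiles.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_per_class, n_elements = 40, 55
X = np.vstack([rng.normal(loc=mu, scale=1.0, size=(n_per_class, n_elements))
               for mu in (0.0, 0.5, 1.0)])                     # three "regions"
y = np.repeat(["Atlantic", "Mediterranean", "Inland"], n_per_class)

rf  = RandomForestClassifier(n_estimators=200, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

for name, clf in [("random forest", rf), ("SVM", svm)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()              # 5-fold cross-validation
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```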

  2. Introduction to Geostatistics

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.

  3. Emerging Techniques for Dose Optimization in Abdominal CT

    PubMed Central

    Platt, Joel F.; Goodsitt, Mitchell M.; Al-Hawary, Mahmoud M.; Maturen, Katherine E.; Wasnik, Ashish P.; Pandya, Amit

    2014-01-01

    Recent advances in computed tomographic (CT) scanning technique such as automated tube current modulation (ATCM), optimized x-ray tube voltage, and better use of iterative image reconstruction have allowed maintenance of good CT image quality with reduced radiation dose. ATCM varies the tube current during scanning to account for differences in patient attenuation, ensuring a more homogeneous image quality, although selection of the appropriate image quality parameter is essential for achieving optimal dose reduction. Reducing the x-ray tube voltage is best suited for evaluating iodinated structures, since the effective energy of the x-ray beam will be closer to the k-edge of iodine, resulting in a higher attenuation for the iodine. The optimal kilovoltage for a CT study should be chosen on the basis of imaging task and patient habitus. The aim of iterative image reconstruction is to identify factors that contribute to noise on CT images with use of statistical models of noise (statistical iterative reconstruction) and selective removal of noise to improve image quality. The degree of noise suppression achieved with statistical iterative reconstruction can be customized to minimize the effect of altered image quality on CT images. Unlike with statistical iterative reconstruction, model-based iterative reconstruction algorithms model both the statistical noise and the physical acquisition process, allowing CT to be performed with further reduction in radiation dose without an increase in image noise or loss of spatial resolution. Understanding these recently developed scanning techniques is essential for optimization of imaging protocols designed to achieve the desired image quality with a reduced dose. © RSNA, 2014 PMID:24428277

  4. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed provide a basis for those sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
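    A minimal sketch (not the original study's code) of a random coefficients regression of annoyance on noise level, with the intercept and slope allowed to vary across study areas; the variable names, data and model call are illustrative assumptions.

```python
# Random coefficients regression via a linear mixed model (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
rows = []
for area in range(20):                               # 20 compact study areas
    a = 1.0 + rng.normal(0, 0.5)                     # area-specific intercept
    b = 0.08 + rng.normal(0, 0.02)                   # area-specific slope
    noise = rng.uniform(50, 80, 30)                  # 30 residents per area
    annoy = a + b * noise + rng.normal(0, 0.5, 30)
    rows += [{"area": area, "noise": n, "annoyance": v} for n, v in zip(noise, annoy)]
df = pd.DataFrame(rows)

# Intercept and noise slope both vary randomly across areas.
model = smf.mixedlm("annoyance ~ noise", df, groups=df["area"], re_formula="~noise")
result = model.fit()
print(result.summary())
```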

  5. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach based on the use of coherence functions and statistical theory for sensor validation in a harsh environment. By the use of aligned and unaligned coherence functions and statistical theory, one can test for sensor degradation, total sensor failure or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor using spectrum analysis methods on aligned and unaligned time histories has verified the effectiveness of the proposed method. All the procedures produce good results, which demonstrates the robustness of the technique.
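    The coherence-based check described above can be illustrated with a toy magnitude-squared coherence comparison between two channels: coherence stays high while both sensors track a common signal and collapses when one channel degrades into noise. The signals and the 50 Hz band below are assumptions for the sketch, not the PW4098 data.

```python
# Magnitude-squared coherence as a sensor-health indicator (synthetic signals).
import numpy as np
from scipy.signal import coherence

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(9)
common = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)

healthy  = common + 0.2 * rng.standard_normal(t.size)   # good sensor tracks the signal
degraded = 2.0 * rng.standard_normal(t.size)            # failed sensor: pure noise

for label, y in [("healthy", healthy), ("degraded", degraded)]:
    f, cxy = coherence(common, y, fs=fs, nperseg=1024)
    band = (f > 45) & (f < 55)
    print(label, "mean coherence near 50 Hz:", round(float(cxy[band].mean()), 2))
```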

  6. Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)

    NASA Astrophysics Data System (ADS)

    De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.

    1993-01-01

    The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra when neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products formed when natural rubber is vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.

  7. Alzheimer's Disease Assessment: A Review and Illustrations Focusing on Item Response Theory Techniques.

    PubMed

    Balsis, Steve; Choudhury, Tabina K; Geraci, Lisa; Benge, Jared F; Patrick, Christopher J

    2018-04-01

    Alzheimer's disease (AD) affects neurological, cognitive, and behavioral processes. Thus, to accurately assess this disease, researchers and clinicians need to combine and incorporate data across these domains. This presents not only distinct methodological and statistical challenges but also unique opportunities for the development and advancement of psychometric techniques. In this article, we describe relatively recent research using item response theory (IRT) that has been used to make progress in assessing the disease across its various symptomatic and pathological manifestations. We focus on applications of IRT to improve scoring, test development (including cross-validation and adaptation), and linking and calibration. We conclude by describing potential future multidimensional applications of IRT techniques that may improve the precision with which AD is measured.
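    As a small worked example of an IRT building block referred to above, the two-parameter logistic item response function gives the probability of endorsing an item as a function of latent severity; the discrimination and difficulty values below are illustrative.

```python
# Two-parameter logistic (2PL) item response function.
import numpy as np

def irt_2pl(theta, a, b):
    """P(response = 1 | theta) with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                      # latent severity grid
for a, b in [(1.5, -1.0), (1.5, 1.0)]:             # an "easy" and a "hard" item
    print(f"a={a}, b={b}:", np.round(irt_2pl(theta, a, b), 2))
```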

  8. Soft errors in commercial off-the-shelf static random access memories

    NASA Astrophysics Data System (ADS)

    Dilillo, L.; Tsiligiannis, G.; Gupta, V.; Bosser, A.; Saigne, F.; Wrobel, F.

    2017-01-01

    This article reviews state-of-the-art techniques for the evaluation of the effect of radiation on static random access memory (SRAM). We detailed irradiation test techniques and results from irradiation experiments with several types of particles. Two commercial SRAMs, in 90 and 65 nm technology nodes, were considered as case studies. Besides the basic static and dynamic test modes, advanced stimuli for the irradiation tests were introduced, as well as statistical post-processing techniques allowing for deeper analysis of the correlations between bit-flip cross-sections and design/architectural characteristics of the memory device. Further insight is provided on the response of irradiated stacked layer devices and on the use of characterized SRAM devices as particle detectors.

  9. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    NASA Astrophysics Data System (ADS)

    Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe

    2017-12-01

    This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances between the different methane and acetylene sources. The results from these controlled experiments demonstrate that, when the targeted and tracer gases are not well collocated, this new approach provides a better estimate of the emission rates than the tracer release technique. As an example, the relative error between the estimated and actual emission rates is reduced from 32 % with the tracer release technique to 16 % with the combined approach in the case of a tracer located 60 m upwind of a single methane source. Further studies and more complex implementations with more advanced transport models and more advanced optimisations of their configuration will be required to generalise the applicability of the approach and strengthen its robustness.
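    A hedged sketch of the core inversion idea: a Gaussian plume forward model is linear in the emission rate Q, so Q can be estimated by least squares from concentrations measured across the plume. The dispersion parameters, geometry and noise level below are illustrative assumptions, not the configuration of the controlled releases.

```python
# Least-squares estimation of an emission rate from cross-plume concentrations
# simulated with a simple Gaussian plume model.
import numpy as np

def gaussian_plume(q, y, u=3.0, sigma_y=8.0, sigma_z=5.0, h=2.0):
    """Ground-level concentration at crosswind offset y for emission rate q.
    sigma_y and sigma_z are fixed here; in practice they grow with downwind distance."""
    return (q / (2 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y ** 2 / (2 * sigma_y ** 2))
            * 2 * np.exp(-h ** 2 / (2 * sigma_z ** 2)))   # ground-reflection term

rng = np.random.default_rng(10)
true_q = 0.5                                        # g/s, to be recovered
y_obs = np.linspace(-30, 30, 25)                    # cross-plume transect
c_obs = gaussian_plume(true_q, y_obs) * (1 + 0.1 * rng.standard_normal(25))

g = gaussian_plume(1.0, y_obs)                      # model response per unit emission rate
q_hat = np.sum(g * c_obs) / np.sum(g * g)           # least-squares estimate (model linear in q)
print(f"estimated emission rate: {q_hat:.2f} g/s (true value {true_q})")
```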

  10. Statistics for wildlifers: how much and what kind?

    USGS Publications Warehouse

    Johnson, D.H.; Shaffer, T.L.; Newton, W.E.

    2001-01-01

    Quantitative methods are playing increasingly important roles in wildlife ecology and, ultimately, management. This change poses a challenge for wildlife practitioners and students who are not well-educated in mathematics and statistics. Here we give our opinions on what wildlife biologists should know about statistics, while recognizing that not everyone is inclined mathematically. For those who are, we recommend that they take mathematics coursework at least through calculus and linear algebra. They should take statistics courses that are focused conceptually, stressing the Why rather than the How of doing statistics. For less mathematically oriented wildlifers, introductory classes in statistical techniques will furnish some useful background in basic methods but may provide little appreciation of when the methods are appropriate. These wildlifers will have to rely much more on advice from statisticians. Far more important than knowing how to analyze data is an understanding of how to obtain and recognize good data. Regardless of the statistical education they receive, all wildlife biologists should appreciate the importance of controls, replication, and randomization in studies they conduct. Understanding these concepts requires little mathematical sophistication, but is critical to advancing the science of wildlife ecology.

  11. Impact of advanced laparoscopy courses on present surgical practice.

    PubMed

    Houck, Jared; Kopietz, Courtni M; Shah, Bhavin C; Goede, Matthew R; McBride, Corrigan L; Oleynikov, Dmitry

    2013-01-01

    The introduction of new surgical techniques has made training in laparoscopic procedures a necessity for the practicing surgeon, but acquisition of new surgical skills is a formidable task. This study was conducted to assess the impact of advanced laparoscopic workshops on caseload patterns of practicing surgeons. After we obtained institutional review board approval, a survey of practicing surgeons who participated in advanced laparoscopic courses was distributed; the results were analyzed for statistical significance. The courses were held at the University of Nebraska Medical Center between January 2002 and December 2010. Questionnaires were mailed, faxed, and e-mailed to surgeons. Of the 109 surgeons who participated in the advanced laparoscopy courses, 79 received surveys and 30 were excluded from the survey because of their affiliation with the University of Nebraska Medical Center. A total of 47 responses (59%) were received from 41 male and 6 female surgeons. The median response time from completion of the course to completion of the survey was 13.2 months (range, 6.8-19.1 months). The mean age of participating surgeons was 39.2 years (range, 29-51 years). The mean time since residency was 8.4 years (range, 0.8-21 years). Eleven surgeons had completed a minimal number of laparoscopic cases in residency (<50), 17 surgeons had completed a moderate number of laparoscopic procedures in residency (50-200), and 21 surgeons had completed a significant number of cases during residency (>200). Of the surgeons who responded, 94% were in private practice. Fifty-seven percent of the participating surgeons who responded reported a change in laparoscopic practice patterns after the courses. Of these surgeons, 24% had a limited residency laparoscopy exposure of <50 cases. Surgeons who were exposed to ≥50 laparoscopic cases during their residency showed a statistically significant increase in the number of laparoscopic procedures performed after their class compared with surgeons who did not receive ≥50 laparoscopic cases in residency (P = .03). In addition, regardless of the procedures learned in a specific class, surgeons with ≥50 laparoscopic cases in residency had a statistically significant increase in their laparoscopic colectomy and laparoscopic hernia procedure caseload (P < .01). However, there was no statistically significant difference in laparoscopic caseload between surgeons who had completed 50 to 200 laparoscopic residency cases and those who had completed greater than 200 laparoscopic residency cases (P = .31). Furthermore, the participant's age (P = .23), practice type (P = .61), and years in practice (P = .22) had no statistical significance with regard to the adoption of laparoscopic procedures after courses taken. This finding is congruent with the findings of other researchers. Future interest in advanced laparoscopy courses was noted in 70% of surgeons and was more pronounced in surgeons with ≥50 cases in residency. Advanced laparoscopic workshops provide an efficacious instrument in educating surgeons on minimally invasive surgical techniques. Participating surgeons significantly increased the number of course-specific procedures that they performed but also increased the number of other laparoscopic surgeries, suggesting that a certain proficiency in laparoscopic skills is translated to multiple surgical procedures. 
Laparoscopy experience of ≥50 cases during residency is a strong predictor of an increase in the number of advanced laparoscopic cases after attending courses.

  12. Clinical Evaluation of Papilla Reconstruction Using Subepithelial Connective Tissue Graft

    PubMed Central

    Kaushik, Alka; PK, Pal; Chopra, Deepak; Chaurasia, Vishwajit Rampratap; Masamatti, Vinaykumar S; DK, Suresh; Babaji, Prashant

    2014-01-01

    Objective: The aesthetics of the patient can be improved by surgical reconstruction of the interdental papilla by using an advanced papillary flap interposed with subepithelial connective tissue graft. Materials and Methods: A total of fifteen sites from ten patients having black triangles/papilla recession in the maxillary anterior region were selected and subjected to presurgical evaluation. The sites were treated with interposed subepithelial connective tissue graft placed under a coronally advanced flap. The integrity of the papilla was maintained by moving the whole gingivopapillary unit coronally. The various parameters were analysed at different intervals. Results: There was a mean decrease in the papilla presence index score and the distance from contact point to gingival margin, but the change was not statistically significant. There was also an increase in the width of the keratinized gingiva, which was statistically highly significant. Conclusion: An advanced papillary flap with interposed subepithelial connective tissue graft can offer predictable results for the reconstruction of the interdental papilla. If papilla loss occurs solely due to soft-tissue damage, reconstructive techniques can completely restore it; but if it is due to periodontal disease involving bone loss, reconstruction is generally incomplete and multiple surgical procedures may be required. PMID:25386529

  13. Application of GPS radio occultation to the assessment of temperature profile retrievals from microwave and infrared sounders

    NASA Astrophysics Data System (ADS)

    Feltz, M.; Knuteson, R.; Ackerman, S.; Revercomb, H.

    2014-05-01

    Comparisons of satellite temperature profile products from GPS radio occultation (RO) and hyperspectral infrared (IR)/microwave (MW) sounders are made using a previously developed matchup technique. The profile matchup technique matches GPS RO and IR/MW sounder profiles temporally, within 1 h, and spatially, taking into account the unique RO profile geometry and theoretical spatial resolution by calculating a ray-path averaged sounder profile. The comparisons use the GPS RO dry temperature product. Sounder minus GPS RO differences are computed and used to calculate bias and RMS profile statistics, which are created for global and 30° latitude zones for selected time periods. These statistics are created from various combinations of temperature profile data from the Constellation Observing System for Meteorology, Ionosphere & Climate (COSMIC) network, Global Navigation Satellite System Receiver for Atmospheric Sounding (GRAS) instrument, and the Atmospheric Infrared Sounder (AIRS)/Advanced Microwave Sounding Unit (AMSU), Infrared Atmospheric Sounding Interferometer (IASI)/AMSU, and Crosstrack Infrared Sounder (CrIS)/Advanced Technology Microwave Sounder (ATMS) sounding systems. By overlaying combinations of these matchup statistics for similar time and space domains, comparisons of different sounders' products, sounder product versions, and GPS RO products can be made. The COSMIC GPS RO network has the spatial coverage, time continuity, and stability to provide a common reference for comparison of the sounder profile products. The results of this study demonstrate that GPS RO has potential to act as a common temperature reference and can help facilitate inter-comparison of sounding retrieval methods and also highlight differences among sensor product versions.
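
    A hedged illustration of the matchup statistics described above: given sounder and collocated GPS RO temperature profiles interpolated to common pressure levels, the bias and RMS difference profiles (sounder minus RO) are simple level-by-level means. The levels, profile shapes and error magnitudes below are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      levels_hpa = np.array([850, 700, 500, 400, 300, 250, 200, 150, 100])
      n_matchups = 500                                    # hypothetical number of collocations

      ro = 220 + 60 * np.exp(-levels_hpa / 400.0)         # synthetic reference profile (K)
      ro = np.tile(ro, (n_matchups, 1)) + rng.normal(0, 1.0, (n_matchups, levels_hpa.size))
      sounder = ro + 0.3 + rng.normal(0, 1.5, ro.shape)   # sounder with an assumed +0.3 K bias

      diff = sounder - ro
      bias_profile = diff.mean(axis=0)                    # K, per pressure level
      rms_profile = np.sqrt((diff**2).mean(axis=0))       # K, per pressure level
      for p, b, r in zip(levels_hpa, bias_profile, rms_profile):
          print(f"{int(p):4d} hPa   bias {b:+5.2f} K   rms {r:4.2f} K")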

  14. Application of GPS radio occultation to the assessment of temperature profile retrievals from microwave and infrared sounders

    NASA Astrophysics Data System (ADS)

    Feltz, M.; Knuteson, R.; Ackerman, S.; Revercomb, H.

    2014-11-01

    Comparisons of satellite temperature profile products from GPS radio occultation (RO) and hyperspectral infrared (IR)/microwave (MW) sounders are made using a previously developed matchup technique. The profile matchup technique matches GPS RO and IR/MW sounder profiles temporally, within 1 h, and spatially, taking into account the unique RO profile geometry and theoretical spatial resolution by calculating a ray-path averaged sounder profile. The comparisons use the GPS RO dry temperature product. Sounder minus GPS RO differences are computed and used to calculate bias and rms profile statistics, which are created for global and 30° latitude zones for selected time periods. These statistics are created from various combinations of temperature profile data from the Constellation Observing System for Meteorology, Ionosphere & Climate (COSMIC) network, Global Navigation Satellite System Receiver for Atmospheric Sounding (GRAS) instrument, and the Atmospheric Infrared Sounder (AIRS)/Advanced Microwave Sounding Unit (AMSU), Infrared Atmospheric Sounding Interferometer (IASI)/AMSU, and Crosstrack Infrared Sounder (CrIS)/Advanced Technology Microwave Sounder (ATMS) sounding systems. By overlaying combinations of these matchup statistics for similar time and space domains, comparisons of different sounders' products, sounder product versions, and GPS RO products can be made. The COSMIC GPS RO network has the spatial coverage, time continuity, and stability to provide a common reference for comparison of the sounder profile products. The results of this study demonstrate that GPS RO has potential to act as a common temperature reference and can help facilitate inter-comparison of sounding retrieval methods and also highlight differences among sensor product versions.

  15. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality ensures that these techniques and their recent high-speed implementations are equally applicable outside of NASA.
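
    A toy sketch of the two-stage idea described above (not the NASA/JPL implementation): a predictive preprocessor turns samples into small non-negative integers, and a simple Rice (Golomb power-of-two) coder then represents them compactly. The sample values and the fixed coding parameter k are invented; real coders adapt k to the data.

      def preprocess(samples):
          """Delta-predict and zigzag-map residuals to non-negative integers.
          (In practice the first sample would be transmitted separately.)"""
          out, prev = [], samples[0]
          for s in samples:
              d = s - prev
              prev = s
              out.append(2 * d if d >= 0 else -2 * d - 1)   # zigzag mapping
          return out

      def rice_encode(values, k):
          """Return a bit string: unary quotient + k-bit remainder per value."""
          bits = []
          for v in values:
              q, r = v >> k, v & ((1 << k) - 1)
              bits.append("1" * q + "0" + format(r, f"0{k}b"))
          return "".join(bits)

      samples = [100, 102, 101, 105, 104, 104, 103, 107]    # hypothetical smooth sensor data
      mapped = preprocess(samples)
      encoded = rice_encode(mapped, k=2)
      print(mapped, len(encoded), "coded bits vs", 8 * len(samples), "raw bits")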

  16. PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, J. Berian, E-mail: berian@berkeley.edu

    2012-05-20

    We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.

  17. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.

  18. A Review of the Study Designs and Statistical Methods Used in the Determination of Predictors of All-Cause Mortality in HIV-Infected Cohorts: 2002–2011

    PubMed Central

    Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias

    2014-01-01

    Background Research in the predictors of all-cause mortality in HIV-infected people has been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives We present a review on study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011), of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were more commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used the advanced survival analysis method of Cox regression with frailty, of which 6 (3%) were in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts, while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates more training in the use of advanced time-to-event methods. PMID:24498313
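
    For concreteness, a bare-bones Kaplan-Meier (product-limit) estimator, the time-to-event technique the review identifies as most common; the follow-up times and event indicators below are made up for illustration.

      import numpy as np

      def kaplan_meier(time, event):
          """Return (event time, survival probability) pairs for right-censored data."""
          time, event = np.asarray(time, float), np.asarray(event, bool)
          order = np.argsort(time)
          time, event = time[order], event[order]
          surv, s = [], 1.0
          for t in np.unique(time[event]):
              at_risk = np.sum(time >= t)           # subjects still under observation at t
              deaths = np.sum((time == t) & event)  # events occurring exactly at t
              s *= 1.0 - deaths / at_risk
              surv.append((t, s))
          return surv

      # months of follow-up; 1 = death observed, 0 = censored (hypothetical cohort)
      follow_up = [2, 5, 5, 8, 12, 12, 15, 20, 24, 24]
      died      = [1, 1, 0, 1,  0,  1,  1,  0,  0,  1]
      for t, s in kaplan_meier(follow_up, died):
          print(f"t = {t:4.0f} months   S(t) = {s:.3f}")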

  19. Incipient fault detection study for advanced spacecraft systems

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.

    1986-01-01

    A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) eight different analysis techniques including signature analysis, high frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition are compared; and (6) small sample statistical analysis is used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
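
    A small sketch of the detection-performance quantification mentioned above: sweep a threshold over a scalar vibration feature computed for repeated baseline and faulted runs, and tabulate probability of detection (PD) against probability of false alarm (PFA). The feature distributions and sample sizes are synthetic stand-ins for the small-sample experiments described.

      import numpy as np

      rng = np.random.default_rng(2)
      baseline_feature = rng.normal(1.0, 0.15, 20)   # e.g. band-limited RMS, healthy machine
      faulted_feature = rng.normal(1.5, 0.25, 20)    # same feature with a seeded fault

      thresholds = np.linspace(0.5, 2.5, 41)
      for th in thresholds[::8]:
          pfa = np.mean(baseline_feature > th)       # false alarms on healthy runs
          pd_ = np.mean(faulted_feature > th)        # detections on faulted runs
          print(f"threshold {th:4.2f}:  PFA = {pfa:4.2f}   PD = {pd_:4.2f}")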

  20. Lipid membranes and single ion channel recording for the advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Klapper, Yvonne; Nienhaus, Karin; Röcker, Carlheinz; Ulrich Nienhaus, G.

    2014-05-01

    We present an easy-to-handle, low-cost, and reliable setup to study various physical phenomena on a nanometer-thin lipid bilayer using the so-called black lipid membrane technique. The apparatus allows us to precisely measure optical and electrical properties of free-standing lipid membranes, to study the formation of single ion channels, and to gain detailed information on the ion conduction properties of these channels using statistical physics and autocorrelation analysis. The experiments are well suited as part of an advanced physics or biophysics laboratory course; they interconnect physics, chemistry, and biology and will be appealing to students of the natural sciences who are interested in quantitative experimentation.
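
    A hedged illustration of the autocorrelation analysis mentioned above: the autocorrelation of a simulated two-state (open/closed) channel current decays roughly exponentially with a time constant set by the opening and closing rates. The rates, sampling step and current amplitude are invented.

      import numpy as np

      rng = np.random.default_rng(3)
      dt, n = 1e-4, 100_000                    # 0.1 ms sampling, 10 s record (assumed)
      k_open, k_close = 50.0, 200.0            # hypothetical rate constants, s^-1

      # simulate a two-state telegraph process (0 = closed, 1 = open)
      state = np.empty(n, dtype=int)
      state[0] = 0
      p_oc, p_co = k_open * dt, k_close * dt
      for i in range(1, n):
          r = rng.random()
          state[i] = (1 if r < p_oc else 0) if state[i - 1] == 0 else (0 if r < p_co else 1)

      current = state * 2.0 + rng.normal(0, 0.2, n)    # 2 pA open-channel current + noise
      x = current - current.mean()

      max_lag = 300
      acf = np.array([np.mean(x[:n - k] * x[k:]) if k else np.mean(x * x) for k in range(max_lag)])
      acf /= acf[0]
      tau_est = dt * np.argmax(acf < np.exp(-1))       # crude 1/e decay time
      print(f"estimated correlation time ~ {tau_est*1e3:.2f} ms "
            f"(theory 1/(k_open+k_close) = {1e3/(k_open+k_close):.2f} ms)")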

  1. Statistics and Informatics in Space Astrophysics

    NASA Astrophysics Data System (ADS)

    Feigelson, E.

    2017-12-01

    The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, the discovery of exoplanets, and the classification of multiwavelength surveys are too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has grown significantly in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.

  2. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    PubMed

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed two/three approaches, by sigmoid fitting, by independent and by dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data which are independent of "user skills" and the subjectivity of the operator, which is also urgently needed for the evaluation of dynamic measurements of contact angles. We will show in this contribution that the slightly modified procedures are also applicable for finding specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop obtained by high-precision drop shape analysis are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
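
    Not the authors' exact procedure, but a sketch of the flavour of the sigmoid-fitting approach: fit a logistic curve to contact angle versus dosing step while the drop grows and read the upper plateau off as the specific advancing angle. All numbers are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(x, lower, upper, x0, width):
          return lower + (upper - lower) / (1.0 + np.exp(-(x - x0) / width))

      rng = np.random.default_rng(4)
      step = np.arange(0, 80)                                    # dosing steps while the drop grows
      angle = sigmoid(step, 42.0, 63.0, 30.0, 6.0) + rng.normal(0, 0.7, step.size)

      popt, _ = curve_fit(sigmoid, step, angle, p0=[40, 60, 25, 5])
      print(f"specific advancing angle ~ {popt[1]:.1f} deg (upper plateau of the fitted sigmoid)")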

  3. Mediation analysis in nursing research: a methodological review.

    PubMed

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
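
    A minimal worked example of simple mediation (X -> M -> Y) using ordinary least squares and the product-of-coefficients estimate a*b; the variable names and effect sizes are fabricated, and in practice the indirect effect would be tested with bootstrapped confidence intervals.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 400
      x = rng.normal(size=n)                       # e.g. hours of a nursing intervention
      m = 0.5 * x + rng.normal(size=n)             # mediator, e.g. self-efficacy
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome, e.g. adherence score

      a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                 # path a: X -> M
      fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
      c_prime, b = fit_y.params[1], fit_y.params[2]                     # direct effect, path b
      print(f"indirect effect a*b = {a*b:.3f}, direct effect c' = {c_prime:.3f}")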

  4. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
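
    A sketch of one of the techniques named above: Shannon entropy of a monitoring time series, computed in sliding windows after binning the amplitudes. The signal, bin count and window length are arbitrary; a real analysis would be tuned to the instrument and sampling rate.

      import numpy as np

      def shannon_entropy(values, n_bins=16):
          counts, _ = np.histogram(values, bins=n_bins)
          p = counts[counts > 0] / counts.sum()
          return -np.sum(p * np.log2(p))

      rng = np.random.default_rng(6)
      quiet = np.sin(np.linspace(0, 60, 3000)) + rng.normal(0, 0.1, 3000)   # regular, tremor-like
      noisy = rng.normal(0, 1.0, 3000)                                      # disordered, eruptive-like
      signal = np.concatenate([quiet, noisy])

      window = 500
      for start in range(0, signal.size - window + 1, window):
          h = shannon_entropy(signal[start:start + window])
          print(f"window starting at sample {start:5d}: entropy = {h:.2f} bits")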

  5. Cross-cultural examination of measurement invariance of the Beck Depression Inventory-II.

    PubMed

    Dere, Jessica; Watters, Carolyn A; Yu, Stephanie Chee-Min; Bagby, R Michael; Ryder, Andrew G; Harkness, Kate L

    2015-03-01

    Given substantial rates of major depressive disorder among college and university students, as well as the growing cultural diversity on many campuses, establishing the cross-cultural validity of relevant assessment tools is important. In the current investigation, we examined the Beck Depression Inventory-Second Edition (BDI-II; Beck, Steer, & Brown, 1996) among Chinese-heritage (n = 933) and European-heritage (n = 933) undergraduates in North America. The investigation integrated 3 distinct lines of inquiry: (a) the literature on cultural variation in depressive symptom reporting between people of Chinese and Western heritage; (b) recent developments regarding the factor structure of the BDI-II; and (c) the application of advanced statistical techniques to the issue of cross-cultural measurement invariance. A bifactor model was found to represent the optimal factor structure of the BDI-II. Multigroup confirmatory factor analysis showed that the BDI-II had strong measurement invariance across both culture and gender. In group comparisons with latent and observed variables, Chinese-heritage students scored higher than European-heritage students on cognitive symptoms of depression. This finding deviates from the commonly held view that those of Chinese heritage somatize depression. These findings hold implications for the study and use of the BDI-II, highlight the value of advanced statistical techniques such as multigroup confirmatory factor analysis, and offer methodological lessons for cross-cultural psychopathology research more broadly. 2015 APA, all rights reserved

  6. Classifiers utilized to enhance acoustic based sensors to identify round types of artillery/mortar

    NASA Astrophysics Data System (ADS)

    Grasing, David; Desai, Sachi; Morcos, Amir

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period and the level of ambient pressure excitation facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via acoustic signals produced during the launch events. Acoustic sensors are used to exploit the sound waveform generated by the blast, identifying mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise within the different mortar/artillery variants because varying HE mortar payloads and related charges emphasize varying size events at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques will help develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination will be achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies, which emphasize acoustic sensor systems, to provide such situational awareness.

  7. Artillery/mortar type classification based on detected acoustic transients

    NASA Astrophysics Data System (ADS)

    Morcos, Amir; Grasing, David; Desai, Sachi

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period and the level of ambient pressure excitation facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via acoustic signals produced during the launch events. Acoustic sensors are used to exploit the sound waveform generated by the blast, identifying mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise within the different mortar/artillery variants because varying HE mortar payloads and related charges emphasize varying size events at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques will help develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination will be achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies, which emphasize acoustic sensor systems, to provide such situational awareness.

  8. Artillery/mortar round type classification to increase system situational awareness

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Grasing, David; Morcos, Amir; Hohil, Myron

    2008-04-01

    Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period and the level of ambient pressure excitation facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via acoustic signals produced during the launch events. Acoustic sensors are used to exploit the sound waveform generated by the blast, identifying mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise within the different mortar/artillery variants because varying HE mortar payloads and related charges emphasize varying size events at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques will help develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination will be achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies, which emphasize acoustic sensor systems, to provide such situational awareness.
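
    Illustrative feature extraction in the spirit of the three abstracts above: skewness of the pressure waveform plus relative energy in a few frequency bands, assembled into a vector that a classifier could be trained on. The synthetic "launch transient", sampling rate and band edges are all invented.

      import numpy as np
      from scipy.stats import skew

      fs = 8000                                            # Hz, assumed sampling rate
      t = np.arange(0, 1.0, 1 / fs)
      rng = np.random.default_rng(7)
      blast = np.exp(-t / 0.05) * np.sin(2 * np.pi * 40 * t) + 0.05 * rng.normal(size=t.size)

      def features(x, fs, bands=((0, 50), (50, 200), (200, 1000))):
          spec = np.abs(np.fft.rfft(x))**2
          freqs = np.fft.rfftfreq(x.size, 1 / fs)
          band_energy = [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
          band_energy = np.array(band_energy) / spec.sum()    # relative energies (range-robust)
          return np.concatenate([[skew(x)], band_energy])

      print(features(blast, fs))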

  9. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
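
    A compact sketch of the first person-oriented technique discussed (multilevel modeling): repeated ambulatory reports nested within persons, with a random intercept per person. The variable names and data-generating values are hypothetical.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(8)
      n_people, n_obs = 40, 30
      person = np.repeat(np.arange(n_people), n_obs)
      person_intercept = rng.normal(0, 1.0, n_people)[person]   # between-person differences
      stress = rng.normal(0, 1, person.size)                    # momentary stress rating
      craving = 2.0 + person_intercept + 0.6 * stress + rng.normal(0, 1, person.size)

      data = pd.DataFrame({"person": person, "stress": stress, "craving": craving})
      model = smf.mixedlm("craving ~ stress", data, groups=data["person"]).fit()
      print(model.summary())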

  10. Identification of the contribution of contact and aerial biomechanical parameters in acrobatic performance

    PubMed Central

    Haering, Diane; Huchez, Aurore; Barbier, Franck; Holvoët, Patrice; Begon, Mickaël

    2017-01-01

    Introduction Teaching acrobatic skills with a minimal amount of repetition is a major challenge for coaches. Biomechanical, statistical or computer simulation tools can help them identify the most determinant factors of performance. Release parameters, change in moment of inertia and segmental momentum transfers were identified in the prediction of acrobatics success. The purpose of the present study was to evaluate the relative contribution of these parameters in performance throughout expertise or optimisation based improvements. The counter movement forward in flight (CMFIF) was chosen for its intrinsic dichotomy between the accessibility of its attempt and complexity of its mastery. Methods Three repetitions of the CMFIF performed by eight novice and eight advanced female gymnasts were recorded using a motion capture system. Optimal aerial techniques that maximise rotation potential at regrasp were also computed. A 14-segment-multibody-model defined through the Rigid Body Dynamics Library was used to compute recorded and optimal kinematics, and biomechanical parameters. A stepwise multiple linear regression was used to determine the relative contribution of these parameters in novice recorded, novice optimised, advanced recorded and advanced optimised trials. Finally, fixed effects of expertise and optimisation were tested through a mixed-effects analysis. Results and discussion Variation in release state only contributed to performances in novice recorded trials. Moment of inertia contribution to performance increased from novice recorded, to novice optimised, advanced recorded, and advanced optimised trials. Contribution to performance of momentum transfer to the trunk during the flight prevailed in all recorded trials. Although optimisation decreased transfer contribution, momentum transfer to the arms appeared. Conclusion Findings suggest that novices should be coached on both contact and aerial technique. Inversely, mainly improved aerial technique helped advanced gymnasts increase their performance. For both, reduction of the moment of inertia should be focused on. The method proposed in this article could be generalized to any aerial skill learning investigation. PMID:28422954

  11. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
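
    A toy version of the distributed-testing idea: scatter single test points across many conditions, fit a quadratic response-surface model by least squares, and predict anywhere in the envelope. The "true" response and noise level are invented for illustration; the study itself uses formal DOE/ANOVA/RSM machinery rather than this bare fit.

      import numpy as np

      rng = np.random.default_rng(9)
      def true_response(x1, x2):
          return 5 + 2 * x1 - 1.5 * x2 + 0.8 * x1 * x2 - 0.5 * x1**2

      # distributed plan: one noisy observation at each of 25 different conditions
      x1 = rng.uniform(-1, 1, 25)
      x2 = rng.uniform(-1, 1, 25)
      y = true_response(x1, x2) + rng.normal(0, 0.2, 25)

      # quadratic response-surface basis: 1, x1, x2, x1*x2, x1^2, x2^2
      X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("fitted RSM coefficients:", np.round(coef, 2))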

  12. General advancing front packing algorithm for the discrete element method

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos A. Recarey; Pérez Morales, Irvin Pablo; de Farias, Márcio Muniz; de Navarra, Eugenio Oñate Ibañez; Valera, Roberto Roselló; Casañas, Harold Díaz-Guzmán

    2018-01-01

    A generic formulation of a new method for packing particles is presented. It is based on a constructive advancing front method, and uses Monte Carlo techniques for the generation of particle dimensions. The method can be used to obtain virtual dense packings of particles with several geometrical shapes. It employs continuous, discrete, and empirical statistical distributions in order to generate the dimensions of particles. The packing algorithm is very flexible and allows alternatives for: 1—the direction of the advancing front (inwards or outwards), 2—the selection of the local advancing front, 3—the method for placing a mobile particle in contact with others, and 4—the overlap checks. The algorithm also allows obtaining highly porous media when it is slightly modified. The use of the algorithm to generate real particle packings from grain size distribution curves, in order to carry out engineering applications, is illustrated. Finally, basic applications of the algorithm, which prove its effectiveness in the generation of a large number of particles, are carried out.

  13. Vibroacoustic Response of the NASA ACTS Spacecraft Antenna to Launch Acoustic Excitation

    NASA Technical Reports Server (NTRS)

    Larko, Jeffrey M.; Cotoni, Vincent

    2008-01-01

    The Advanced Communications Technology Satellite was an experimental NASA satellite launched from the Space Shuttle Discovery. As part of the ground test program, the satellite's large, parabolic reflector antennas were exposed to a reverberant acoustic loading to simulate the launch acoustics in the Shuttle payload bay. This paper describes the modelling and analysis of the dynamic response of these large, composite spacecraft antenna structures subjected to a diffuse acoustic field excitation. Due to the broad frequency range of the excitation, different models were created to make predictions in the various frequency regimes of interest: a statistical energy analysis (SEA) model to capture the high-frequency response and a hybrid finite element-statistical energy (hybrid FE-SEA) model for the low- to mid-frequency responses. The strengths and limitations of each of the analytical techniques are discussed. The predictions are then compared to the measured acoustic test data and to a boundary element (BEM) model to evaluate the performance of the hybrid techniques.

  14. UVPROM dosimetry, microdosimetry and applications to SEU and extreme value theory

    NASA Astrophysics Data System (ADS)

    Scheick, Leif Zebediah

    A new method is described for characterizing a device in terms of the statistical distribution of first failures. The method is based on the erasure of a commercial Ultra-Violet erasable Programmable Read Only Memory (UVPROM). The method of readout would be used on a spacecraft or in other restrictive radiation environments. The measurement of the charge remaining on the floating gate is used to determine absorbed dose. The method of determining dose does not require the detector to be destroyed or erased, nor does it affect the ability to take further measurements. This is compared to extreme value theory applied to the statistical distributions that apply to this device. This technique predicts the threshold of Single Event Effects (SEE), like anomalous changes in erasure time in programmable devices due to high microdose energy-deposition events. This technique also allows for advanced, non-destructive screening of single microelectronic devices for predictable response in stressful (i.e., radiation) environments.

  15. Basis function models for animal movement

    USGS Publications Warehouse

    Hooten, Mevin B.; Johnson, Devin S.

    2017-01-01

    Advances in satellite-based data collection techniques have served as a catalyst for new statistical methodology to analyze these data. In wildlife ecological studies, satellite-based data and methodology have provided a wealth of information about animal space use and the investigation of individual-based animal–environment relationships. With the technology for data collection improving dramatically over time, we are left with massive archives of historical animal telemetry data of varying quality. While many contemporary statistical approaches for inferring movement behavior are specified in discrete time, we develop a flexible continuous-time stochastic integral equation framework that is amenable to reduced-rank second-order covariance parameterizations. We demonstrate how the associated first-order basis functions can be constructed to mimic behavioral characteristics in realistic trajectory processes using telemetry data from mule deer and mountain lion individuals in western North America. Our approach is parallelizable and provides inference for heterogeneous trajectories using nonstationary spatial modeling techniques that are feasible for large telemetry datasets. Supplementary materials for this article are available online.

  16. Detecting dark matter in the Milky Way with cosmic and gamma radiation

    NASA Astrophysics Data System (ADS)

    Carlson, Eric C.

    Over the last decade, experiments in high-energy astroparticle physics have reached unprecedented precision and sensitivity, spanning the electromagnetic and cosmic-ray spectra. These advances have opened a new window onto the universe for which little was previously known. Such dramatic increases in sensitivity lead naturally to claims of excess emission, which call for either revised astrophysical models or the existence of exotic new sources such as particle dark matter. Here we stand firmly with Occam, sharpening his razor by (i) developing new techniques for discriminating astrophysical signatures from those of dark matter, and (ii) developing detailed foreground models which can explain excess signals and shed light on the underlying astrophysical processes at hand. We concentrate most directly on observations of Galactic gamma and cosmic rays, factoring the discussion into three related parts which each contain significant advancements from our cumulative works. In Part I we introduce concepts which are fundamental to the Indirect Detection of particle dark matter, including motivations, targets, experiments, production of Standard Model particles, and a variety of statistical techniques. In Part II we introduce basic and advanced modelling techniques for the propagation of cosmic rays through the Galaxy, describe astrophysical gamma-ray production, and present state-of-the-art propagation models of the Milky Way. Finally, in Part III, we employ these models and techniques in order to study several indirect detection signals, including the Fermi GeV excess at the Galactic center, the Fermi 135 GeV line, the 3.5 keV line, and the WMAP-Planck haze.

  17. Collaborative Research: Using ARM Observations to Evaluate GCM Cloud Statistics for Development of Stochastic Cloud-Radiation Parameterizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Samuel S. P.

    2013-09-01

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been an interdisciplinary collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen). The motivation and long-term goal underlying this work is the utilization of stochastic radiative transfer theory (Lane-Veron and Somerville, 2004; Lane et al., 2002) to develop a new class of parametric representations of cloud-radiation interactions and closely related processes for atmospheric models. The theoretical advantage of the stochastic approach is that it can accurately calculate the radiative heating rates through a broken cloud layer without requiring an exact description of the cloud geometry.

  18. Investigation of an intelligent system for fiber optic-based epidural anesthesia.

    PubMed

    Gong, Cihun-Siyong Alex; Ting, Chien-Kun

    2014-01-01

    Even though there have been many approaches to assist anesthesiologists in performing regional anesthesia, none of the prior art can be considered an unrestricted technique. The lack of a design with sufficient sensitivity to the targets of interest and automatic indication of needle placement has prevented an objective, all-round implementation for field use. In addition, a lightweight, easy-to-use realization is the key to portability. This paper reports on an intelligent system for epidural space identification using an optical technique, with particular emphasis on efficiency-enhancing aspects. Statistical algorithms, implemented in a dedicated field-programmable hardware platform along with an on-platform application-specific integrated chip and used to advance real-time, autonomous decision making during needle advancement, are discussed together with the feedback results. The clinicians' viewpoint on improving the correct-identification rate of our technique is explained in detail. Our study demonstrates not only that the improved system is able to behave as if it were a skillful anesthesiologist, but also that it has the potential to bring promising assistance into clinical use under varied conditions and with small sample sizes, provided that several concerns are addressed.

  19. An Examination of Application of Artificial Neural Network in Cognitive Radios

    NASA Astrophysics Data System (ADS)

    Bello Salau, H.; Onwuka, E. N.; Aibinu, A. M.

    2013-12-01

    Recent advancements in software radio technology have led to the development of a smart device known as the cognitive radio. This type of radio fuses powerful techniques taken from artificial intelligence, game theory, wideband/multiple antenna techniques, information theory and statistical signal processing to create an outstanding dynamic behavior. The cognitive radio is utilized in achieving a diverse set of applications such as spectrum sensing, radio parameter adaptation and signal classification. This paper contributes by reviewing different cognitive radio implementations that use artificial intelligence, such as hidden Markov models, metaheuristic algorithms and artificial neural networks (ANNs). Furthermore, different areas of application of ANNs and performance-metric-based approaches to their evaluation are also examined.

  20. Single Molecule Approaches in RNA-Protein Interactions.

    PubMed

    Serebrov, Victor; Moore, Melissa J

    RNA-protein interactions govern every aspect of RNA metabolism, and aberrant RNA-binding proteins are the cause of hundreds of genetic diseases. Quantitative measurements of these interactions are necessary in order to understand mechanisms leading to diseases and to develop efficient therapies. Existing methods of RNA-protein interactome capture can afford a comprehensive snapshot of RNA-protein interaction networks but lack the ability to characterize the dynamics of these interactions. As all ensemble methods, their resolution is also limited by statistical averaging. Here we discuss recent advances in single molecule techniques that have the potential to tackle these challenges. We also provide a thorough overview of single molecule colocalization microscopy and the essential protein and RNA tagging and detection techniques.

  1. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extension of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius University (where the conference took place), the Academy of Romanian Scientists and the Romanian National Authority for Scientific Research. This conference proceedings volume brings together some of the invited and contributed talks of the conference. The hope of the editors is that they will constitute reference material for applying many-body techniques to problems in mesoscopic and nuclear physics. We thank all the participants for their contribution to the success of this conference. D V Anghel and D S Delion IFIN-HH, Bucharest, Romania G S Paraoanu Aalto University, Finland Conference photograph

  2. A comparison of skeletal stability after mandibular advancement and use of two rigid internal fixation techniques.

    PubMed

    Blomqvist, J E; Ahlborg, G; Isaksson, S; Svartz, K

    1997-06-01

    Two different methods of rigid fixation were compared for postoperative stability 6 months after mandibular advancement for treatment of Class II malocclusion. Sixty (30 + 30) patients from two different oral and maxillofacial units, treated for a Class II malocclusion by bilateral sagittal split osteotomy (BSSO) and two different methods of internal rigid fixation, were prospectively investigated. Two groups (S1, n = 15; S2, n = 15) had bicortical noncompressive screws inserted in the gonial area through a transcutaneous approach, and the other two groups (P1, n = 15; P2, n = 15) had the bone segments fixed with unicortical screws and miniplates on the lateral surface of the mandibular body. Cephalograms were taken preoperatively, 2 days postoperatively and 6 months after the operation. A computer program was used to superimpose the three cephalograms and to register the mandibular advancement and postoperative change both sagittally and vertically. There were minor differences in the advancement and postoperative changes between the four groups, but no statistically significant difference was shown in either the sagittal or vertical direction. However, statistically verified differences showed that increasing age was associated with a smaller amount of postsurgical relapse. Low-angle cases (ML/NSL < 25 degrees) had a greater amount of surgical (P = .0008) and postsurgical (P = .0195) movement compared with the patients in the high-angle group (ML/NSL > 38 degrees). Using a multiple regression test, a positive correlation was also shown between the amount of surgical advancement and the amount of postsurgical instability (P = .018). This prospective dual-center study indicates that the two different methods of internal rigid fixation after surgical advancement of the mandible by BSSO did not significantly differ from each other, and it is up to the individual operator to choose the method for internal rigid fixation.

  3. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.

  4. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    NASA Astrophysics Data System (ADS)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution for reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that assembles research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at a single wind farm, at regional or national level, and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support the increase of wind integration at two levels: at an operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk for wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
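
    A hedged sketch of one statistical building block named above (power curve representation): fit a logistic-shaped curve mapping forecast wind speed to normalized farm output. The data and parameters are fabricated; operational ANEMOS models are far richer than this single curve.

      import numpy as np
      from scipy.optimize import curve_fit

      def power_curve(v, v_half, steepness):
          return 1.0 / (1.0 + np.exp(-(v - v_half) / steepness))   # fraction of rated power

      rng = np.random.default_rng(10)
      wind = rng.uniform(0, 20, 300)                                # m/s, forecast wind speeds
      power = power_curve(wind, 9.0, 1.8) + rng.normal(0, 0.05, wind.size)
      power = np.clip(power, 0, 1)                                  # normalized observed output

      popt, _ = curve_fit(power_curve, wind, power, p0=[8, 2])
      print(f"fitted transition: v_half = {popt[0]:.1f} m/s, steepness = {popt[1]:.1f} m/s")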

  5. Treatment of localized gingival recession using the free rotated papilla autograft combined with coronally advanced flap by conventional (macrosurgery) and surgery under magnification (microsurgical) technique: A comparative clinical study

    PubMed Central

    Pandey, Suraj; Mehta, D. S.

    2013-01-01

    Background: The aim of the present study was to evaluate and compare the conventional (macro-surgical) and microsurgical approaches in performing the free rotated papilla autograft combined with coronally advanced flap surgery in the treatment of localized gingival recession. Materials and Methods: A total of 20 sites from 10 systemically healthy patients were selected for the study. The selected sites were randomly divided into experimental site A and experimental site B using a split-mouth design. The conventional (macro-surgical) approach was applied at site A and microsurgery at site B in performing the free rotated papilla autograft combined with coronally advanced flap. Recession depth (RD), recession width (RW), clinical attachment level (CAL) and width of keratinized tissue (WKT) were recorded at baseline, 3 months and 6 months post-operatively. Results: Both groups (macro- and microsurgery) showed significant clinical improvement in all the parameters (RD, RW, CAL and WKT). However, differences between the two groups did not reach statistical significance for any parameter. Conclusion: Both surgical procedures were equally effective in the treatment of localized gingival recession by the free rotated papilla autograft technique combined with coronally advanced flap. However, surgery under magnification (microsurgery) may be clinically better than conventional surgery in terms of less post-operative pain and discomfort experienced by patients at the microsurgical site. PMID:24554888

  6. New insights into the endophenotypic status of cognition in bipolar disorder: genetic modelling study of twins and siblings.

    PubMed

    Georgiades, Anna; Rijsdijk, Fruhling; Kane, Fergus; Rebollo-Mesa, Irene; Kalidindi, Sridevi; Schulze, Katja K; Stahl, Daniel; Walshe, Muriel; Sahakian, Barbara J; McDonald, Colm; Hall, Mei-Hua; Murray, Robin M; Kravariti, Eugenia

    2016-06-01

    Twin studies have lacked statistical power to apply advanced genetic modelling techniques to the search for cognitive endophenotypes for bipolar disorder. We aimed to quantify the shared genetic variability between bipolar disorder and cognitive measures. Structural equation modelling was performed on cognitive data collected from 331 twins/siblings of varying genetic relatedness, disease status and concordance for bipolar disorder. Using a parsimonious AE model, verbal episodic and spatial working memory showed statistically significant genetic correlations with bipolar disorder (rg = |0.23|-|0.27|), which lost statistical significance after covarying for affective symptoms. Using an ACE model, IQ and visual-spatial learning showed statistically significant genetic correlations with bipolar disorder (rg = |0.51|-|1.00|), which remained significant after covarying for affective symptoms. Verbal episodic and spatial working memory capture a modest fraction of the bipolar diathesis. IQ and visual-spatial learning may tap into genetic substrates of non-affective symptomatology in bipolar disorder. © The Royal College of Psychiatrists 2016.

  7. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
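
    The record describes the technique only in prose; as a rough illustration, a median (0.5-quantile) regression tolerating left-censored values can be fitted with statsmodels. The data frame, the readout `ige`, the predictor `dose`, and the detection limit below are hypothetical stand-ins, not values from the cited trial.

```python
# Minimal sketch of quantile (median) regression for data with non-detects.
# All variable names and values are hypothetical illustrations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"dose": np.repeat([0, 1, 2, 3], 25)})
df["ige"] = np.exp(0.5 * df["dose"] + rng.normal(0, 1, len(df)))

# Values below the detection limit are left-censored; for median regression
# they can simply be set to the limit, since the median is unaffected as long
# as fewer than 50% of observations per group are non-detects.
detection_limit = 1.0
df["ige_obs"] = df["ige"].clip(lower=detection_limit)

median_fit = smf.quantreg("ige_obs ~ dose", data=df).fit(q=0.5)
print(median_fit.summary())                               # median trend
p75_fit = smf.quantreg("ige_obs ~ dose", data=df).fit(q=0.75)
print(p75_fit.params)                                     # 75th-percentile trend
```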

  8. Motion-base simulator results of advanced supersonic transport handling qualities with active controls

    NASA Technical Reports Server (NTRS)

    Feather, J. B.; Joshi, D. S.

    1981-01-01

    Handling qualities of the unaugmented advanced supersonic transport (AST) are deficient in the low-speed, landing approach regime. Consequently, improvement in handling with active control augmentation systems has been achieved using implicit model-following techniques. Extensive fixed-base simulator evaluations were used to validate these systems prior to tests with full motion and visual capabilities on a six-axis motion-base simulator (MBS). These tests compared the handling qualities of the unaugmented AST with several augmented configurations to ascertain the effectiveness of these systems. Cooper-Harper ratings, tracking errors, and control activity data from the MBS tests have been analyzed statistically. The results show that the fully augmented AST's handling qualities have been improved to an acceptable level.

  9. Why interdisciplinary research enriches the study of crime. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Donnay, Karsten

    2015-03-01

    The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].

  10. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  11. NASA Marshall Space Flight Center Controls Systems Design and Analysis Branch

    NASA Technical Reports Server (NTRS)

    Gilligan, Eric

    2014-01-01

    Marshall Space Flight Center maintains a critical national capability in the analysis of launch vehicle flight dynamics and flight certification of GN&C algorithms. MSFC analysts are domain experts in the areas of flexible-body dynamics and control-structure interaction, thrust vector control, sloshing propellant dynamics, and advanced statistical methods. Marshall's modeling and simulation expertise has supported manned spaceflight for over 50 years. Marshall's unparalleled capability in launch vehicle guidance, navigation, and control technology stems from its rich heritage in developing, integrating, and testing launch vehicle GN&C systems dating to the early Mercury-Redstone and Saturn vehicles. The Marshall team is continuously developing novel methods for design, including advanced techniques for large-scale optimization and analysis.

  12. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical database of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
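
    As a hedged illustration of the kind of parametric relationship described (cost driven primarily by weight), the sketch below fits a power-law cost-estimating relationship by least squares in log-log space. The weights and costs are invented numbers, not the historical database referred to in the abstract.

```python
# Illustrative weight-driven cost-estimating relationship (CER) of the form
# cost = a * weight^b, fitted by ordinary least squares in log-log space.
import numpy as np

weight = np.array([150., 300., 520., 900., 1400., 2300.])   # kg (hypothetical)
cost   = np.array([12.,  21.,  33.,  52.,  75.,  110.])     # $M (hypothetical)

b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)       # slope, intercept
a = np.exp(log_a)
print(f"CER: cost ~= {a:.2f} * weight^{b:.2f}")

# Estimate the cost of a new 1000 kg concept and a rough residual-based error
predicted = a * weight ** b
rel_error = np.std(np.log(cost) - np.log(predicted))
print(f"1000 kg concept: {a * 1000 ** b:.1f} $M (+/- {100 * rel_error:.0f}% in log space)")
```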

  13. STATISTICS OF THE VELOCITY GRADIENT TENSOR IN SPACE PLASMA TURBULENT FLOWS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Consolini, Giuseppe; Marcucci, Maria Federica; Pallocchia, Giuseppe

    2015-10-10

    In the last decade, significant advances have been presented for the theoretical characterization and experimental techniques used to measure and model all of the components of the velocity gradient tensor in the framework of fluid turbulence. Here, we attempt the evaluation of the small-scale velocity gradient tensor for a case study of space plasma turbulence, observed in the Earth's magnetosheath region by the CLUSTER mission. In detail, we investigate the joint statistics P(R, Q) of the velocity gradient geometric invariants R and Q, and find that this P(R, Q) is similar to that of the low end of the inertial range for fluid turbulence, with a pronounced increase in the statistics along the so-called Vieillefosse tail. In the context of hydrodynamics, this result is referred to as the dissipation/dissipation-production due to vortex stretching.

  14. Single-case research design in pediatric psychology: considerations regarding data analysis.

    PubMed

    Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E

    2014-03-01

    Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.

  15. Mediation analysis in nursing research: a methodological review

    PubMed Central

    Liu, Jianghong; Ulrich, Connie

    2017-01-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask – and answer – more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science. PMID:26176804
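
    A minimal sketch of a single-mediator analysis of the sort described, using two OLS regressions and the product-of-coefficients estimate with a Sobel test; the variables (stress, coping, outcome) and data are hypothetical illustrations only.

```python
# Simple single-mediator analysis: path a (predictor -> mediator), path b
# (mediator -> outcome, adjusting for the predictor), indirect effect a*b.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 200
stress  = rng.normal(size=n)
coping  = 0.5 * stress + rng.normal(size=n)            # mediator
outcome = 0.4 * coping + 0.1 * stress + rng.normal(size=n)

m_a = sm.OLS(coping, sm.add_constant(stress)).fit()    # path a
X = sm.add_constant(np.column_stack([coping, stress]))
m_b = sm.OLS(outcome, X).fit()                         # path b and direct effect

a, se_a = m_a.params[1], m_a.bse[1]
b, se_b = m_b.params[1], m_b.bse[1]
indirect = a * b
sobel_se = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
z = indirect / sobel_se
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"indirect effect = {indirect:.3f}, Sobel z = {z:.2f}, p = {p:.4f}")
```

    In applied work, bootstrap confidence intervals for the indirect effect are generally preferred over the Sobel test, which assumes normality of the product of coefficients.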

  16. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  17. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  18. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  19. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    PubMed

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Behavior guidance techniques in Pediatric Dentistry: attitudes of parents of children with disabilities and without disabilities.

    PubMed

    de Castro, Alessandra Maia; de Oliveira, Fabiana Sodré; de Paiva Novaes, Myrian Stella; Araújo Ferreira, Danielly Cunha

    2013-01-01

    This study compared parental acceptance of pediatric behavior guidance techniques (BGT). Forty parents of children without disabilities (Group A) and another 40 parents of children with disabilities (Group B) were selected. Each BGT was explained by a single examiner and presented together with a photograph album. Parents then rated their acceptance as totally unacceptable, somewhat acceptable, acceptable, or totally acceptable. Results indicated that in Group A, the BGT based on communicative guidance was accepted by most participants. In Group B, just one mother considered the voice control method totally unacceptable, and two others rated tell-show-do as such. For both groups, general anesthesia was the least accepted BGT. There was a statistically significant difference in acceptance of protective stabilization with a restrictive device in Group B. Parents of children with and without disabilities accepted behavior guidance techniques, but basic techniques showed higher rates of acceptance than advanced techniques. ©2013 Special Care Dentistry Association and Wiley Periodicals, Inc.

  1. Deformable Medical Image Registration: A Survey

    PubMed Central

    Sotiras, Aristeidis; Davatzikos, Christos; Paragios, Nikos

    2013-01-01

    Deformable image registration is a fundamental task in medical image processing. Among its most important applications, one may cite: i) multi-modality fusion, where information acquired by different imaging devices or protocols is fused to facilitate diagnosis and treatment planning; ii) longitudinal studies, where temporal structural or anatomical changes are investigated; and iii) population modeling and statistical atlases used to study normal anatomical variability. In this paper, we attempt to give an overview of deformable registration methods, putting emphasis on the most recent advances in the domain. Additional emphasis has been given to techniques applied to medical images. In order to study image registration methods in depth, their main components are identified and studied independently. The most recent techniques are presented in a systematic fashion. The contribution of this paper is to provide an extensive account of registration techniques in a systematic manner. PMID:23739795

  2. University of Washington's eScience Institute Promotes New Training and Career Pathways in Data Science

    NASA Astrophysics Data System (ADS)

    Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.

    2015-12-01

    Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.

  3. Advanced approach for intraoperative MRI guidance and potential benefit for neurosurgical applications.

    PubMed

    Busse, Harald; Schmitgen, Arno; Trantakis, Christos; Schober, Ralf; Kahn, Thomas; Moche, Michael

    2006-07-01

    To present an advanced approach for intraoperative image guidance in an open 0.5 T MRI and to evaluate its effectiveness for neurosurgical interventions by comparison with a dynamic scan-guided localization technique. The built-in scan guidance mode relied on successive interactive MRI scans. The additional advanced mode provided real-time navigation based on reformatted high-quality, intraoperatively acquired MR reference data, allowed multimodal image fusion, and used the successive scans of the built-in mode for quick verification of the position only. Analysis involved tumor resections and biopsies in either scan guidance (N = 36) or advanced mode (N = 59) by the same three neurosurgeons. Technical, surgical, and workflow aspects were compared. The image quality and hand-eye coordination of the advanced approach were improved. While the average extent of resection, neurologic outcome after functional MRI (fMRI) integration, and diagnostic yield appeared to be slightly better under advanced guidance, particularly for the main surgeon, statistical analysis revealed no significant differences. Resection times were comparable, while biopsies took around 30 minutes longer. The presented approach is safe and provides more detailed images and higher navigation speed at the expense of actuality. The surgical outcome achieved with advanced guidance is (at least) as good as that obtained with dynamic scan guidance. (c) 2006 Wiley-Liss, Inc.

  4. Measurements of Turbulent Flow Field in Separate Flow Nozzles with Enhanced Mixing Devices - Test Report

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2002-01-01

    As part of the Advanced Subsonic Technology Program, a series of experiments was conducted at NASA Glenn Research Center on the effect of mixing enhancement devices on the aeroacoustic performance of separate flow nozzles. Initial acoustic evaluations of the devices showed that they reduced jet noise significantly, while creating very little thrust loss. The explanation for the improvement required that turbulence measurements, namely single point mean and RMS statistics and two-point spatial correlations, be made to determine the change in the turbulence caused by the mixing enhancement devices that lead to the noise reduction. These measurements were made in the summer of 2000 in a test program called Separate Nozzle Flow Test 2000 (SFNT2K) supported by the Aeropropulsion Research Program at NASA Glenn Research Center. Given the hot high-speed flows representative of a contemporary bypass ratio 5 turbofan engine, unsteady flow field measurements required the use of an optical measurement method. To achieve the spatial correlations, the Particle Image Velocimetry technique was employed, acquiring high-density velocity maps of the flows from which the required statistics could be derived. This was the first successful use of this technique for such flows, and shows the utility of this technique for future experimental programs. The extensive statistics obtained were likewise unique and give great insight into the turbulence which produces noise and how the turbulence can be modified to reduce jet noise.

  5. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  6. An Investigation of the Characterization of Cloud Contamination in Hyperspectral Radiances

    NASA Technical Reports Server (NTRS)

    McCarty, William; Jedlovec, Gary J.; LeMarshall, John

    2007-01-01

    In regions lacking direct observations, the assimilation of radiances from infrared and microwave sounders is the primary method for characterizing the atmosphere in the analysis process. In recent years, technological advances have led to the launching of more advanced sounders, particularly in the thermal infrared spectrum. With the advent of these hyperspectral sounders, the amount of data available for the analysis process has been, and will continue to be, dramatically increased. However, the utilization of infrared radiances in variational assimilation can be problematic in the presence of clouds; specifically the assessment of the presence of clouds in an instantaneous field of view (IFOV) and the contamination in the individual channels within the IFOV. Various techniques have been developed to determine if a channel is contaminated by clouds. The work presented in this paper and subsequent presentation will investigate traditional techniques and compare them to a new technique, the CO2 sorting technique, which utilizes the high spectral resolution of the Atmospheric Infrared Sounder (AIRS) within the framework of the Gridpoint Statistical Interpolation (GSI) 3DVAR system. Ultimately, this work is done in preparation for the assessment of short-term forecast impacts with the regional assimilation of AIRS radiances within the analysis fields of the Weather Research and Forecast Nonhydrostatic Mesoscale Model (WRF-NMM) at the NASA Short-term Prediction Research and Transition (SPoRT) Center.

  7. Clinical skills assessment of procedural and advanced communication skills: performance expectations of residency program directors.

    PubMed

    Langenau, Erik E; Zhang, Xiuyuan; Roberts, William L; DeChamplain, Andre F; Boulet, John R

    2012-01-01

    High stakes medical licensing programs are planning to augment and adapt current examinations to be relevant for a two-decision point model for licensure: entry into supervised practice and entry into unsupervised practice. Therefore, identifying which skills should be assessed at each decision point is critical for informing examination development, and gathering input from residency program directors is important. Using data from previously developed surveys and expert panels, a web-delivered survey was distributed to 3,443 residency program directors. For each of the 28 procedural and 18 advanced communication skills, program directors were asked which clinical skills should be assessed, by whom, when, and how. Descriptive statistics were collected, and Intraclass Correlations (ICC) were conducted to determine consistency across different specialties. Among 347 respondents, program directors reported that all advanced communication and some procedural tasks are important to assess. The following procedures were considered 'important' or 'extremely important' to assess: sterile technique (93.8%), advanced cardiovascular life support (ACLS) (91.1%), basic life support (BLS) (90.0%), interpretation of electrocardiogram (89.4%) and blood gas (88.7%). Program directors reported that most clinical skills should be assessed at the end of the first year of residency (or later) and not before graduation from medical school. A minority were considered important to assess prior to the start of residency training: demonstration of respectfulness (64%), sterile technique (67.2%), BLS (68.9%), ACLS (65.9%) and phlebotomy (63.5%). Results from this study support that assessing procedural skills such as cardiac resuscitation, sterile technique, and phlebotomy would be amenable to assessment at the end of medical school, but most procedural and advanced communications skills would be amenable to assessment at the end of the first year of residency training or later. Gathering data from residency program directors provides support for developing new assessment tools in high-stakes licensing examinations.

  8. Clinical skills assessment of procedural and advanced communication skills: performance expectations of residency program directors

    PubMed Central

    Langenau, Erik E.; Zhang, Xiuyuan; Roberts, William L.; DeChamplain, Andre F.; Boulet, John R.

    2012-01-01

    Background High stakes medical licensing programs are planning to augment and adapt current examinations to be relevant for a two-decision point model for licensure: entry into supervised practice and entry into unsupervised practice. Therefore, identifying which skills should be assessed at each decision point is critical for informing examination development, and gathering input from residency program directors is important. Methods Using data from previously developed surveys and expert panels, a web-delivered survey was distributed to 3,443 residency program directors. For each of the 28 procedural and 18 advanced communication skills, program directors were asked which clinical skills should be assessed, by whom, when, and how. Descriptive statistics were collected, and Intraclass Correlations (ICC) were conducted to determine consistency across different specialties. Results Among 347 respondents, program directors reported that all advanced communication and some procedural tasks are important to assess. The following procedures were considered ‘important’ or ‘extremely important’ to assess: sterile technique (93.8%), advanced cardiovascular life support (ACLS) (91.1%), basic life support (BLS) (90.0%), interpretation of electrocardiogram (89.4%) and blood gas (88.7%). Program directors reported that most clinical skills should be assessed at the end of the first year of residency (or later) and not before graduation from medical school. A minority were considered important to assess prior to the start of residency training: demonstration of respectfulness (64%), sterile technique (67.2%), BLS (68.9%), ACLS (65.9%) and phlebotomy (63.5%). Discussion Results from this study support that assessing procedural skills such as cardiac resuscitation, sterile technique, and phlebotomy would be amenable to assessment at the end of medical school, but most procedural and advanced communications skills would be amenable to assessment at the end of the first year of residency training or later. Conclusions Gathering data from residency program directors provides support for developing new assessment tools in high-stakes licensing examinations. PMID:22833698

  9. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    PubMed

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

    Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, which are correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore, one of the most important goals in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail, applicable for axisymmetric and non-axisymmetric drops. The recently presented sigmoid-function fit and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), as well as the dependent analysis, are first explained in detail. These approaches yield contact angle data and different routes to specific contact angles that are independent of user skills and of the subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are independently and dependently statistically analysed. Due to the small distance covered in the dependent analysis (<0.4 mm) and the dominance of counted events with small velocity, the measurements are less influenced by motion dynamics and the procedure can be called "slow moving" analysis. The presented procedures are especially sensitive to the range extending from static to "slow moving" dynamic contact angle determination and are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for drops on inclined surfaces and detailed relations concerning the reactivity of the freshly cleaned silicon wafer surface, which results in acceleration behaviour (reactive de-wetting), are presented. Copyright © 2014 Elsevier Inc. All rights reserved.
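
    As a rough sketch of the sigmoid-function fit mentioned in the abstract, the advancing (downhill) contact angle as a function of plate inclination can be fitted with a logistic curve via scipy; the functional form, parameter names and synthetic data below are assumptions for illustration, not the authors' implementation.

```python
# Logistic (sigmoid) fit of contact angle versus plate inclination.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, theta_lo, theta_hi, x0, k):
    """Logistic transition between a lower and an upper contact angle."""
    return theta_lo + (theta_hi - theta_lo) / (1.0 + np.exp(-k * (x - x0)))

incl = np.linspace(0, 40, 60)                            # inclination / deg
rng = np.random.default_rng(2)
theta = sigmoid(incl, 45, 62, 18, 0.4) + rng.normal(0, 0.5, incl.size)

popt, pcov = curve_fit(sigmoid, incl, theta, p0=[40, 65, 20, 0.3])
perr = np.sqrt(np.diag(pcov))
print("theta_lo, theta_hi, x0, k =", np.round(popt, 2))
print("1-sigma parameter errors  =", np.round(perr, 2))
```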

  10. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  11. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
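
    The following sketch illustrates, on a synthetic image, two measures of the kind discussed: color variety as Shannon entropy over a quantized palette, and a simple brightness-roughness estimate from the scaling of increment variance. It is not the authors' code; the quantization depth and lag choices are arbitrary assumptions.

```python
# Illustrative image statistics: color variety and brightness roughness.
import numpy as np

rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)  # stand-in for a digitized painting

# Color variety: Shannon entropy of the histogram of colors quantized to 4 bits/channel
quantized = (img >> 4).reshape(-1, 3).astype(int)
codes = quantized[:, 0] * 256 + quantized[:, 1] * 16 + quantized[:, 2]
counts = np.bincount(codes, minlength=4096).astype(float)
p = counts[counts > 0] / counts.sum()
color_entropy = -np.sum(p * np.log2(p))

# Brightness roughness: scaling of increment variance with horizontal lag
gray = img.mean(axis=2)
lags = np.array([1, 2, 4, 8, 16])
var_inc = [np.var(gray[:, lag:] - gray[:, :-lag]) for lag in lags]
roughness_slope = np.polyfit(np.log(lags), np.log(var_inc), 1)[0]  # ~ 2H

print(f"color entropy = {color_entropy:.2f} bits, roughness slope = {roughness_slope:.2f}")
```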

  12. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical models is available in open-source software (one of them is R). However, these advanced modelling tools are not very friendly to novice R users, since they rely on programming scripts or a command-line interface. Our research aims to develop a web interface (based on R and shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them into a data analysis workflow, including models using computer-intensive statistics (bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.
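
    As a minimal illustration of one of the model families listed (a Poisson GLM), the sketch below uses statsmodels in Python rather than the R/shiny stack described in the record; the data and variable names are simulated, not from the authors' laboratory.

```python
# Minimal Poisson GLM fit on simulated count data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({"x": rng.uniform(0, 2, 300)})
df["y"] = rng.poisson(np.exp(0.3 + 0.8 * df["x"]))       # true log-linear rate

glm_fit = smf.glm("y ~ x", data=df, family=sm.families.Poisson()).fit()
print(glm_fit.summary())
```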

  13. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation, including finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the Propeller Aircraft Interior Noise computer program). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used to compare with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to warrant caution and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.

  14. Tracing Interstellar Magnetic Field Using Velocity Gradient Technique: Application to Atomic Hydrogen Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuen, Ka Ho; Lazarian, A., E-mail: kyuen2@wisc.edu, E-mail: lazarian@astro.wisc.edu

    The advancement of our understanding of MHD turbulence opens ways to develop new techniques to probe magnetic fields. In MHD turbulence, the velocity gradients are expected to be perpendicular to magnetic fields and this fact was used by González-Casanova and Lazarian to introduce a new technique to trace magnetic fields using velocity centroid gradients (VCGs). The latter can be obtained from spectroscopic observations. We apply the technique to GALFA-H i survey data and then compare the directions of magnetic fields obtained with our technique to the direction of magnetic fields obtained using PLANCK polarization. We find an excellent correspondence between the two ways of magnetic field tracing, which is obvious via the visual comparison and through the measuring of the statistics of magnetic field fluctuations obtained with the polarization data and our technique. This suggests that the VCGs have a potential for measuring of the foreground magnetic field fluctuations, and thus provide a new way of separating foreground and CMB polarization signals.

  15. Improving cerebellar segmentation with statistical fusion

    NASA Astrophysics Data System (ADS)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system well known to be involved in motor coordination, with increasingly recognized roles in cognition and planning. Recent work in multiatlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar anatomy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open-source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.

  16. Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop

    NASA Technical Reports Server (NTRS)

    McConnell, M.; Weidman, S.

    2009-01-01

    Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.

  17. Frontiers of Two-Dimensional Correlation Spectroscopy. Part 1. New concepts and noteworthy developments

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    A comprehensive survey review of new and noteworthy developments, which have advanced the frontiers of the field of 2D correlation spectroscopy during the last four years, is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and the sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of reference spectrum, normalization and scaling of data, derivative spectra and deconvolution techniques, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, and display schemes, such as color-coded format, slice and power spectra, tabulation, and other schemes.
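
    For readers unfamiliar with the formalism, the sketch below computes synchronous and asynchronous generalized 2D correlation spectra (Noda's formalism, via the Hilbert-Noda transformation matrix) from a perturbation-ordered set of spectra. The input spectra are synthetic and the implementation is a bare-bones illustration, not software from the reviewed literature.

```python
# Generalized 2D correlation analysis: synchronous and asynchronous spectra.
import numpy as np

def noda_2d(spectra):
    """spectra: (m perturbations, n spectral points). Returns (sync, async)."""
    m, n = spectra.shape
    dyn = spectra - spectra.mean(axis=0)                 # dynamic spectra
    sync = dyn.T @ dyn / (m - 1)
    # Hilbert-Noda transformation matrix: 0 on the diagonal, 1/(pi*(k-j)) off it
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    asyn = dyn.T @ N @ dyn / (m - 1)
    return sync, asyn

# Synthetic example: two overlapping bands responding at different rates
x = np.linspace(0, 1, 200)
t = np.linspace(0, 1, 15)[:, None]
spectra = t * np.exp(-((x - 0.3) ** 2) / 0.005) + t ** 2 * np.exp(-((x - 0.6) ** 2) / 0.005)
sync, asyn = noda_2d(spectra)
print(sync.shape, asyn.shape)                            # both (200, 200)
```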

  18. The Complexity of Solar and Geomagnetic Indices

    NASA Astrophysics Data System (ADS)

    Pesnell, W. Dean

    2017-08-01

    How far in advance can the sunspot number be predicted with any degree of confidence? Solar cycle predictions are needed to plan long-term space missions. Fleets of satellites circle the Earth collecting science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Statistical and time-series analyses of the sunspot number are often used to predict solar activity. These methods have not been completely successful, as the solar dynamo changes over time and one cycle's sunspots are not a faithful predictor of the next cycle's activity. In some ways, using these techniques is similar to asking whether the stock market can be predicted. It has been shown that the Dow Jones Industrial Average (DJIA) can be more accurately predicted during periods when it obeys certain statistical properties than at other times. The Hurst exponent is one such way to partition the data. Another measure of the complexity of a time series is the fractal dimension. We can use these measures of complexity to compare the sunspot number with other solar and geomagnetic indices. Our focus is on how trends are removed by the various techniques, either internally or externally. Comparisons of the statistical properties of the various solar indices may guide us in understanding how the dynamo manifests in the various indices and the Sun.
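
    As an illustration of one common way to estimate the Hurst exponent mentioned above, the sketch below implements a basic rescaled-range (R/S) estimator and applies it to a synthetic series; it is not the author's analysis pipeline, and a sunspot or geomagnetic index series would simply replace the synthetic input.

```python
# Basic rescaled-range (R/S) estimate of the Hurst exponent.
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for chunk in x[: (n // size) * size].reshape(-1, size):
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()                 # range of cumulative deviations
            s = chunk.std()
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)  # log R/S vs log window size
    return slope

rng = np.random.default_rng(5)
increments = rng.normal(size=4096)                    # white noise: expect H ~ 0.5
print(f"H(increments) ~= {hurst_rs(increments):.2f}")
```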

  19. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
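
    The sketch below illustrates, on simulated heavy-tailed data, two procedures of the kind the authors advocate: comparing 20% trimmed means with a percentile-bootstrap confidence interval, and a rank-based (Spearman) association measure as a robust alternative to Pearson correlation. It is a simplified stand-in, not the methods used in the cited trial analysis.

```python
# Robust group comparison and robust association on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
g1 = rng.normal(0.0, 1, 60) + rng.standard_t(2, 60) * 0.3   # heavy-tailed group A
g2 = rng.normal(0.4, 1, 60) + rng.standard_t(2, 60) * 0.3   # heavy-tailed group B

def trimmed_diff(a, b):
    return stats.trim_mean(a, 0.2) - stats.trim_mean(b, 0.2)

obs = trimmed_diff(g1, g2)
boot = np.array([trimmed_diff(rng.choice(g1, g1.size), rng.choice(g2, g2.size))
                 for _ in range(2000)])                      # percentile bootstrap
ci = np.percentile(boot, [2.5, 97.5])
print(f"20% trimmed-mean difference = {obs:.2f}, 95% bootstrap CI = {np.round(ci, 2)}")

# Rank-based association as a robust alternative to Pearson correlation
x = rng.normal(size=80)
y = 0.5 * x + rng.standard_t(2, 80) * 0.5
print("Spearman:", stats.spearmanr(x, y))
```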

  20. 4D printing smart biomedical scaffolds with novel soybean oil epoxidized acrylate

    PubMed Central

    Miao, Shida; Zhu, Wei; Castro, Nathan J.; Nowicki, Margaret; Zhou, Xuan; Cui, Haitao; Fisher, John P.; Zhang, Lijie Grace

    2016-01-01

    Photocurable, biocompatible liquid resins are highly desired for 3D stereolithography-based bioprinting. Here we solidified a novel renewable soybean oil epoxidized acrylate, using a 3D laser printing technique, into smart and highly biocompatible scaffolds capable of supporting growth of multipotent human bone marrow mesenchymal stem cells (hMSCs). Porous scaffolds were readily fabricated by simply adjusting the printer infill density; superficial structures of the polymerized soybean oil epoxidized acrylate were significantly affected by laser frequency and printing speed. Shape memory tests confirmed that the scaffold fixed a temporary shape at −18 °C and fully recovered its original shape at human body temperature (37 °C), indicating great potential for 4D printing applications. Cytotoxicity analysis showed that the printed scaffolds had significantly higher hMSC adhesion and proliferation than traditional polyethylene glycol diacrylate (PEGDA), and no statistically significant difference from polylactic acid (PLA) and polycaprolactone (PCL). This research is believed to significantly advance the development of biomedical scaffolds with renewable plant oils and advanced 3D fabrication techniques. PMID:27251982

  1. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.

  2. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  3. A guide to missing data for the pediatric nephrologist.

    PubMed

    Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando

    2018-03-13

    Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under guidance of a biostatistician. As clinicians reading papers, we need to continue to update our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret literature.
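
    As a hedged illustration of the multiple-imputation strategy described, the sketch below draws several completed datasets with scikit-learn's IterativeImputer (a chained-equations style imputer) and pools a simple estimate across them; the clinical variable names and missingness mechanism are hypothetical.

```python
# Multiple imputation sketch: several stochastic imputations, pooled estimate.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "egfr": rng.normal(90, 15, 200),
    "age": rng.uniform(2, 17, 200),
})
df["protein"] = 0.02 * df["age"] - 0.01 * df["egfr"] + rng.normal(0, 0.2, 200)
df.loc[rng.random(200) < 0.25, "protein"] = np.nan        # ~25% missing

estimates = []
for seed in range(5):                                     # several imputed datasets
    imp = IterativeImputer(sample_posterior=True, random_state=seed)
    completed = pd.DataFrame(imp.fit_transform(df), columns=df.columns)
    estimates.append(completed["protein"].mean())
print(f"pooled mean = {np.mean(estimates):.3f} "
      f"(between-imputation SD = {np.std(estimates):.3f})")
```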

  4. Statistical methods in personality assessment research.

    PubMed

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development, in this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality, Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  5. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
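
    The sketch below simply runs the most frequently reported tests from the review (Student's t-test, one-way ANOVA, Mann-Whitney U, chi-squared, Fisher's exact) with scipy.stats on simulated data, as a reminder of how each is invoked; the numbers are invented, not from the surveyed articles.

```python
# Common inferential tests from the review, applied to simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
a = rng.normal(35, 10, 40)                     # hypothetical outcome, group A
b = rng.normal(40, 10, 40)                     # group B
c = rng.normal(38, 10, 40)                     # group C

print(stats.ttest_ind(a, b))                   # Student's t-test (two groups)
print(stats.f_oneway(a, b, c))                 # one-way ANOVA (three groups)
print(stats.mannwhitneyu(a, b))                # Mann-Whitney U (non-parametric)

table = np.array([[12, 28], [20, 20]])         # 2x2 table of categorical outcomes
print(stats.chi2_contingency(table))           # chi-squared test of independence
print(stats.fisher_exact(table))               # Fisher's exact test
```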

  6. Structural equation modeling in pediatric psychology: overview and review of applications.

    PubMed

    Nelson, Timothy D; Aylward, Brandon S; Steele, Ric G

    2008-08-01

    To describe the use of structural equation modeling (SEM) in the Journal of Pediatric Psychology (JPP) and to discuss the usefulness of SEM applications in pediatric psychology research. The use of SEM in JPP between 1997 and 2006 was examined and compared to leading journals in clinical psychology, clinical child psychology, and child development. SEM techniques were used in <4% of the empirical articles appearing in JPP between 1997 and 2006. SEM was used less frequently in JPP than in other clinically relevant journals over the past 10 years. However, results indicated a recent increase in JPP studies employing SEM techniques. SEM is an under-utilized class of techniques within pediatric psychology research, although investigations employing these methods are becoming more prevalent. Despite its infrequent use to date, SEM is a potentially useful tool for advancing pediatric psychology research with a number of advantages over traditional statistical methods.

  7. Integrating instance selection, instance weighting, and feature weighting for nearest neighbor classifiers by coevolutionary algorithms.

    PubMed

    Derrac, Joaquín; Triguero, Isaac; Garcia, Salvador; Herrera, Francisco

    2012-10-01

    Cooperative coevolution is a successful trend of evolutionary computation which allows us to define partitions of the domain of a given problem, or to integrate several related techniques into one, by the use of evolutionary algorithms. It is possible to apply it to the development of advanced classification methods, which integrate several machine learning techniques into a single proposal. A novel approach integrating instance selection, instance weighting, and feature weighting into the framework of a coevolutionary model is presented in this paper. We compare it with a wide range of evolutionary and nonevolutionary related methods, in order to show the benefits of employing coevolution to apply the considered techniques simultaneously. The results obtained, contrasted through nonparametric statistical tests, show that our proposal outperforms other methods in the comparison, thus becoming a suitable tool in the task of enhancing the nearest neighbor classifier.
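
    The sketch below is not the coevolutionary algorithm of the paper; it only illustrates how feature weighting enters a nearest neighbor classifier, with fixed, hypothetical weights on a standard dataset standing in for weights that the evolutionary search would normally learn.

        # Sketch: feature weighting for a nearest neighbor classifier (weights fixed here;
        # in the paper they are evolved jointly with instance selection and weighting).
        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        X, y = load_iris(return_X_y=True)
        weights = np.array([0.1, 0.1, 1.0, 1.0])     # hypothetical feature weights
        X_weighted = X * weights                     # scaling features reweights the Euclidean distance

        knn = KNeighborsClassifier(n_neighbors=3)
        print("unweighted:", cross_val_score(knn, X, y, cv=5).mean())
        print("weighted  :", cross_val_score(knn, X_weighted, y, cv=5).mean())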

  8. Statistical analysis of texture in trunk images for biometric identification of tree species.

    PubMed

    Bressane, Adriano; Roveda, José A F; Martins, Antônio C G

    2015-04-01

    The identification of tree species is a key step for sustainable management plans of forest resources, as well as for several other applications that are based on such surveys. However, the presently available techniques depend on the presence of tree structures, such as flowers, fruits, and leaves, limiting the identification process to certain periods of the year. Therefore, this article introduces a study on the application of statistical parameters for texture classification of tree trunk images. For that, 540 samples from five Brazilian native deciduous species were acquired, and measures of entropy, uniformity, smoothness, asymmetry (third moment), mean, and standard deviation were obtained from the presented textures. Using a decision tree, a biometric species identification system was constructed and yielded an average precision of 0.84 for species classification, with 0.83 accuracy and 0.79 agreement. Thus, it can be considered that the use of texture in trunk images can represent an important advance in tree identification, since the limitations of the current techniques can be overcome.
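
    A minimal sketch of the six first-order texture statistics named above (mean, standard deviation, third moment, uniformity, entropy, smoothness), computed from the grayscale histogram of a synthetic image patch; in the study these descriptors feed a decision tree classifier.

        # Sketch: first-order texture statistics from a grayscale histogram
        # (synthetic patch standing in for a trunk image).
        import numpy as np

        rng = np.random.default_rng(2)
        img = rng.integers(0, 256, size=(128, 128))           # stand-in for a trunk image patch

        levels = np.arange(256)
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()                                 # normalized histogram

        mean = np.sum(levels * p)
        var = np.sum((levels - mean) ** 2 * p)
        std = np.sqrt(var)
        third_moment = np.sum((levels - mean) ** 3 * p)       # asymmetry
        uniformity = np.sum(p ** 2)
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        smoothness = 1 - 1 / (1 + var / 255 ** 2)             # one common normalization of variance

        features = [mean, std, third_moment, uniformity, entropy, smoothness]
        # A feature matrix of such descriptors could then be passed to, e.g.,
        # sklearn.tree.DecisionTreeClassifier().fit(feature_matrix, species_labels)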

  9. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of the resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, which are valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third variable selection strategy combines resampling techniques such as bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
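
    A hedged sketch of the first strategy: run a selection method (here the lasso, as an example) on each imputed dataset and keep variables selected in a majority of imputations; the data, the missingness rate and the majority-vote threshold are all illustrative assumptions.

        # Sketch of strategy 1: lasso selection per imputed dataset, combined by majority vote.
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(3)
        n, p = 150, 10
        X = rng.normal(size=(n, p))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)
        X[rng.random((n, p)) < 0.1] = np.nan                  # 10% missing, assumed MAR

        m = 5
        selected = np.zeros(p)
        for i in range(m):
            X_imp = IterativeImputer(sample_posterior=True, random_state=i).fit_transform(X)
            lasso = LassoCV(cv=5, random_state=i).fit(X_imp, y)
            selected += (np.abs(lasso.coef_) > 1e-8)

        keep = np.where(selected >= m / 2)[0]                 # majority vote across imputations
        print("selected variables:", keep)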

  10. Reducing radiation dose without compromising image quality in preoperative perforator flap imaging with CTA using ASIR technology.

    PubMed

    Niumsawatt, Vachara; Debrotwir, Andrew N; Rozen, Warren Matthew

    2014-01-01

    Computed tomographic angiography (CTA) has become a mainstay in preoperative perforator flap planning in the modern era of reconstructive surgery. However, the increased use of CTA does raise the concern of radiation exposure to patients. Several techniques have been developed to decrease radiation dosage without compromising image quality, with varying results. The most recent advance is in the improvement of image reconstruction using an adaptive statistical iterative reconstruction (ASIR) algorithm. We sought to evaluate the image quality of ASIR in preoperative deep inferior epigastric perforator (DIEP) flap surgery, through a direct comparison with conventional filtered back projection (FBP) images. A prospective review of 60 consecutive ASIR and 60 consecutive FBP CTA images using similar protocol (except for radiation dosage) was undertaken, analyzed by 2 independent reviewers. In both groups, we were able to accurately identify axial arteries and their perforators. Subjective analysis of image quality demonstrated no statistically significant difference between techniques. ASIR can thus be used for preoperative imaging with similar image quality to FBP, but with a 60% reduction in radiation delivery to patients.

  11. Attitude determination using an adaptive multiple model filtering Scheme

    NASA Technical Reports Server (NTRS)

    Lam, Quang; Ray, Surendra N.

    1995-01-01

    Attitude determination has long been a topic of active research and remains of lasting interest to spacecraft system designers. Its role is to provide a reference for controls such as pointing the directional antennas or solar panels, stabilizing the spacecraft, or maneuvering the spacecraft to a new orbit. The Least Squares Estimation (LSE) technique was used to provide attitude determination for Nimbus 6 and Nimbus G. Despite its poor performance in terms of estimation accuracy, LSE was considered an effective and practical approach to meet the urgent needs and requirements of the 1970s. One reason for the poor performance of the LSE scheme is the lack of dynamic filtering or 'compensation'. In other words, the scheme is based entirely on the measurements and no attempts were made to model the dynamic equations of motion of the spacecraft. We propose an adaptive filtering approach which employs a bank of Kalman filters to perform robust attitude estimation. The proposed approach, whose architecture is depicted, is essentially based on recent results on the interacting multiple model design framework, which handles unknown system noise characteristics or statistics. The concept fundamentally employs a bank of Kalman filters or submodels; instead of using fixed values for the system noise statistics of each submodel (per operating condition), as the traditional multiple model approach does, we use an on-line dynamic system noise identifier to 'identify' the system noise level (statistics) and update the filter noise statistics using 'live' information from the sensor model. The advanced noise identifier, whose architecture is also shown, is implemented using an advanced system identifier. To ensure robust performance, the proposed advanced system identifier is further reinforced by a learning system, implemented in the outer loop using neural networks, which identifies other unknown quantities such as spacecraft dynamics parameters, gyro biases, dynamic disturbances, or environmental variations.

  12. Attitude determination using an adaptive multiple model filtering Scheme

    NASA Astrophysics Data System (ADS)

    Lam, Quang; Ray, Surendra N.

    1995-05-01

    Attitude determination has long been a topic of active research and remains of lasting interest to spacecraft system designers. Its role is to provide a reference for controls such as pointing the directional antennas or solar panels, stabilizing the spacecraft, or maneuvering the spacecraft to a new orbit. The Least Squares Estimation (LSE) technique was used to provide attitude determination for Nimbus 6 and Nimbus G. Despite its poor performance in terms of estimation accuracy, LSE was considered an effective and practical approach to meet the urgent needs and requirements of the 1970s. One reason for the poor performance of the LSE scheme is the lack of dynamic filtering or 'compensation'. In other words, the scheme is based entirely on the measurements and no attempts were made to model the dynamic equations of motion of the spacecraft. We propose an adaptive filtering approach which employs a bank of Kalman filters to perform robust attitude estimation. The proposed approach, whose architecture is depicted, is essentially based on recent results on the interacting multiple model design framework, which handles unknown system noise characteristics or statistics. The concept fundamentally employs a bank of Kalman filters or submodels; instead of using fixed values for the system noise statistics of each submodel (per operating condition), as the traditional multiple model approach does, we use an on-line dynamic system noise identifier to 'identify' the system noise level (statistics) and update the filter noise statistics using 'live' information from the sensor model. The advanced noise identifier, whose architecture is also shown, is implemented using an advanced system identifier. To ensure robust performance, the proposed advanced system identifier is further reinforced by a learning system, implemented in the outer loop using neural networks, which identifies other unknown quantities such as spacecraft dynamics parameters, gyro biases, dynamic disturbances, or environmental variations.

  13. Metabolomics and malaria biology

    PubMed Central

    Lakshmanan, Viswanathan; Rhee, Kyu Y.; Daily, Johanna P.

    2010-01-01

    Metabolomics has ushered in a novel and multi-disciplinary realm in biological research. It has provided researchers with a platform to combine powerful biochemical, statistical, computational, and bioinformatics techniques to delve into the mysteries of biology and disease. The application of metabolomics to study malaria parasites represents a major advance in our approach towards gaining a more comprehensive perspective on parasite biology and disease etiology. This review attempts to highlight some of the important aspects of the field of metabolomics, and its ongoing and potential future applications to malaria research. PMID:20970461

  14. Development, implementation and evaluation of satellite-aided agricultural monitoring systems

    NASA Technical Reports Server (NTRS)

    Cicone, R. C.; Crist, E. P.; Metzler, M.; Nuesch, D.

    1982-01-01

    Research activities in support of AgRISTARS Inventory Technology Development Project in the use of aerospace remote sensing for agricultural inventory described include: (1) corn and soybean crop spectral temporal signature characterization; (2) efficient area estimation techniques development; and (3) advanced satellite and sensor system definition. Studies include a statistical evaluation of the impact of cultural and environmental factors on crop spectral profiles, the development and evaluation of an automatic crop area estimation procedure, and the joint use of SEASAT-SAR and LANDSAT MSS for crop inventory.

  15. International experience on the use of artificial neural networks in gastroenterology.

    PubMed

    Grossi, E; Mancini, A; Buscema, M

    2007-03-01

    In this paper, we reconsider the scientific background for the use of artificial intelligence tools in medicine. A review of some recent significant papers shows that artificial neural networks, among the most advanced and effective artificial intelligence techniques, can improve the classification accuracy and survival prediction of a number of gastrointestinal diseases. We discuss the 'added value' that artificial neural network-based tools can bring to the field of gastroenterology, at both the research and clinical application levels, when compared with traditional statistical or clinical-pathological methods.

  16. Current Developments in Machine Learning Techniques in Biological Data Mining.

    PubMed

    Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail

    2017-01-01

    This supplement is intended to focus on the use of machine learning techniques to generate meaningful information on biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts in this field. Advances in the field of biology have generated massive opportunities to allow the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. Thus, they are broadly used to investigate the underlying mechanisms leading to specific diseases, as well as in the biomarker discovery process. With a growth in this specific area of science comes the need to access up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers in the various applications of machine learning techniques in mining biological data.

  17. Understanding photon sideband statistics and correlation for determining phonon coherence

    NASA Astrophysics Data System (ADS)

    Ding, Ding; Yin, Xiaobo; Li, Baowen

    2018-01-01

    Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts in generating and observing coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have been lagging compared to what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and found that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to a correlation measurement with nontrivial response functions at the quantum level and can potentially bridge the gap of experimentally determining phonon coherence to be on par with that of photons.

  18. RankProd 2.0: a refactored bioconductor package for detecting differentially expressed features in molecular profiling datasets.

    PubMed

    Del Carratore, Francesco; Jankevics, Andris; Eisinga, Rob; Heskes, Tom; Hong, Fangxin; Breitling, Rainer

    2017-09-01

    The Rank Product (RP) is a statistical technique widely used to detect differentially expressed features in molecular profiling experiments such as transcriptomics, metabolomics and proteomics studies. An implementation of the RP and the closely related Rank Sum (RS) statistics has been available in the RankProd Bioconductor package for several years. However, several recent advances in the understanding of the statistical foundations of the method have made a complete refactoring of the existing package desirable. We implemented a completely refactored version of the RankProd package, which provides a more principled implementation of the statistics for unpaired datasets. Moreover, the permutation-based P-value estimation methods have been replaced by exact methods, providing faster and more accurate results. RankProd 2.0 is available at Bioconductor (https://www.bioconductor.org/packages/devel/bioc/html/RankProd.html) and as part of the mzMatch pipeline (http://www.mzmatch.sourceforge.net). rainer.breitling@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
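
    A rough sketch of the Rank Product statistic itself, computed on synthetic per-replicate fold changes, with a simple permutation reference distribution for orientation; RankProd 2.0 replaces such permutation P-values with exact methods, so this is only a conceptual illustration, not the package's algorithm.

        # Sketch: the Rank Product statistic for a two-class comparison (synthetic data).
        import numpy as np
        from scipy.stats import rankdata

        rng = np.random.default_rng(4)
        n_features, n_reps = 500, 4
        fold_changes = rng.normal(size=(n_features, n_reps))    # per-replicate log fold changes
        fold_changes[:10] += 2.0                                 # 10 truly up-regulated features

        ranks = np.apply_along_axis(rankdata, 0, -fold_changes)  # rank 1 = most up-regulated
        rp = np.exp(np.mean(np.log(ranks), axis=1))              # geometric mean rank per feature

        # crude permutation reference distribution (RankProd 2.0 uses exact methods instead)
        n_perm = 200
        null_rp = np.empty((n_perm, n_features))
        for b in range(n_perm):
            perm = np.apply_along_axis(rng.permutation, 0, ranks)
            null_rp[b] = np.exp(np.mean(np.log(perm), axis=1))

        null_flat = np.sort(null_rp.ravel())
        p_values = np.searchsorted(null_flat, rp, side="right") / null_flat.size
        print("features with smallest rank products:", np.argsort(rp)[:10])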

  19. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Advances in statistics

    Treesearch

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  1. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    NASA Astrophysics Data System (ADS)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicates improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
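
    A minimal sketch, in PyMC rather than the authors' SPARROW code, of the hierarchical idea described above: a regional regression coefficient is partially pooled toward a population-level mean; the predictor, the region count and the priors are hypothetical.

        # Sketch: partial pooling of a regional regression coefficient (PyMC; not the SPARROW model).
        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(5)
        n_regions, n_per = 18, 40
        region = np.repeat(np.arange(n_regions), n_per)
        precip = rng.normal(size=n_regions * n_per)                # hypothetical predictor
        true_beta = rng.normal(0.8, 0.2, size=n_regions)           # region-specific effects
        log_flow = true_beta[region] * precip + rng.normal(scale=0.3, size=region.size)

        with pm.Model():
            mu_beta = pm.Normal("mu_beta", 0.0, 1.0)               # population-level mean effect
            sigma_beta = pm.HalfNormal("sigma_beta", 1.0)          # between-region spread
            beta = pm.Normal("beta", mu_beta, sigma_beta, shape=n_regions)
            sigma = pm.HalfNormal("sigma", 1.0)
            pm.Normal("obs", beta[region] * precip, sigma, observed=log_flow)
            idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)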

  2. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    An advanced materials system refers to a new material that is composed of multiple traditional constituents but has complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.

  3. Resolving anthropogenic aerosol pollution types - deconvolution and exploratory classification of pollution events

    NASA Astrophysics Data System (ADS)

    Äijälä, Mikko; Heikkinen, Liine; Fröhlich, Roman; Canonaco, Francesco; Prévôt, André S. H.; Junninen, Heikki; Petäjä, Tuukka; Kulmala, Markku; Worsnop, Douglas; Ehn, Mikael

    2017-03-01

    Mass spectrometric measurements commonly yield data on hundreds of variables over thousands of points in time. Refining and synthesizing this raw data into chemical information necessitates the use of advanced, statistics-based data analytical techniques. In the field of analytical aerosol chemistry, statistical, dimensionality reductive methods have become widespread in the last decade, yet comparable advanced chemometric techniques for data classification and identification remain marginal. Here we present an example of combining data dimensionality reduction (factorization) with exploratory classification (clustering), and show that the results can not only reproduce and corroborate earlier findings, but also complement and broaden our current perspectives on aerosol chemical classification. We find that applying positive matrix factorization to extract spectral characteristics of the organic component of air pollution plumes, together with an unsupervised clustering algorithm, k-means++, for classification, reproduces classical organic aerosol speciation schemes. Applying appropriately chosen metrics for spectral dissimilarity along with optimized data weighting, the source-specific pollution characteristics can be statistically resolved even for spectrally very similar aerosol types, such as different combustion-related anthropogenic aerosol species and atmospheric aerosols with a similar degree of oxidation. In addition to the typical oxidation level and source-driven aerosol classification, we were also able to classify and characterize outlier groups that would likely be disregarded in a more conventional analysis. Evaluating solution quality for the classification also provides means to assess the performance of mass spectral similarity metrics and optimize weighting for mass spectral variables. This facilitates algorithm-based evaluation of aerosol spectra, which may prove invaluable for future development of automatic methods for spectra identification and classification. Robust, statistics-based results and data visualizations also provide important clues to a human analyst on the existence and chemical interpretation of data structures. Applying these methods to a test set of data, aerosol mass spectrometric data of organic aerosol from a boreal forest site, yielded five to seven different recurring pollution types from various sources, including traffic, cooking, biomass burning and nearby sawmills. Additionally, three distinct, minor pollution types were discovered and identified as amine-dominated aerosols.
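
    A hedged sketch of the factorization-plus-clustering idea on synthetic spectra: scikit-learn's non-negative matrix factorization stands in for positive matrix factorization, and k-means with k-means++ initialization clusters the factor contributions; none of the settings reflect the study's actual configuration.

        # Sketch: factorization of spectra followed by k-means++ clustering
        # (sklearn NMF stands in for PMF; data are synthetic, not AMS measurements).
        import numpy as np
        from sklearn.decomposition import NMF
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import normalize

        rng = np.random.default_rng(6)
        n_times, n_mz = 300, 120
        spectra = rng.random((n_times, n_mz))                  # stand-in time series of spectra

        nmf = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
        contributions = nmf.fit_transform(spectra)             # factor time series
        profiles = nmf.components_                             # factor mass spectral profiles

        # cluster time points by their normalized factor contributions, k-means++ initialization
        features = normalize(contributions)                    # reduces the effect of total loading
        labels = KMeans(n_clusters=6, init="k-means++", n_init=10, random_state=0).fit_predict(features)
        print(np.bincount(labels))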

  4. Fat suppression in magnetic resonance imaging of the head and neck region: is the two-point DIXON technique superior to spectral fat suppression?

    PubMed

    Wendl, Christina M; Eiglsperger, Johannes; Dendl, Lena-Marie; Brodoefel, Harald; Schebesch, Karl-Michael; Stroszczynski, Christian; Fellner, Claudia

    2018-05-01

    The aim of our study was to systematically compare two-point Dixon fat suppression (FS) and spectral FS techniques in contrast-enhanced imaging of the head and neck region. Three independent readers analysed coronal T1-weighted images recorded after contrast medium injection with Dixon and spectral FS techniques with regard to FS homogeneity, motion artefacts, lesion contrast, image sharpness and overall image quality. In total, 85 patients were prospectively enrolled in the study. Images generated with the Dixon-FS technique were of higher overall image quality and had a more homogeneous FS over the whole field of view compared with the standard spectral fat-suppressed images (p < 0.001). Concerning motion artefacts, flow artefacts, lesion contrast and image sharpness, no statistically significant difference was observed. The Dixon-FS technique is superior to the spectral technique due to improved homogeneity of FS and overall image quality while maintaining lesion contrast. Advances in knowledge: T1-weighted imaging with the Dixon FS technique offers, compared with spectral FS, significantly improved FS homogeneity and overall image quality in imaging of the head and neck region.

  5. Systematic Biases in Parameter Estimation of Binary Black-Hole Mergers

    NASA Technical Reports Server (NTRS)

    Littenberg, Tyson B.; Baker, John G.; Buonanno, Alessandra; Kelly, Bernard J.

    2012-01-01

    Parameter estimation of binary-black-hole merger events in gravitational-wave data relies on matched filtering techniques, which, in turn, depend on accurate model waveforms. Here we characterize the systematic biases introduced in measuring astrophysical parameters of binary black holes by applying the currently most accurate effective-one-body templates to simulated data containing non-spinning numerical-relativity waveforms. For advanced ground-based detectors, we find that the systematic biases are well within the statistical error for realistic signal-to-noise ratios (SNR). These biases grow to be comparable to the statistical errors at high signal-to-noise ratios for ground-based instruments (SNR approximately 50) but never dominate the error budget. At the much larger signal-to-noise ratios expected for space-based detectors, these biases will become large compared to the statistical errors but are small enough (at most a few percent in the black-hole masses) that we expect they should not affect broad astrophysical conclusions that may be drawn from the data.

  6. A study of environmental characterization of conventional and advanced aluminum alloys for selection and design. Phase 2: The breaking load test method

    NASA Technical Reports Server (NTRS)

    Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.

    1984-01-01

    A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
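
    A small sketch of the extreme-value treatment of breaking-load data: fitting a two-parameter Weibull distribution to illustrative breaking strengths and reading off a survival probability and a threshold stress; the numbers and the choice of the Weibull form are assumptions, not the report's data.

        # Sketch: extreme-value analysis of breaking-load data (illustrative strengths, MPa).
        import numpy as np
        from scipy import stats

        strengths = np.array([412, 398, 405, 371, 389, 402, 365, 380, 395, 408], float)

        # fit a two-parameter Weibull (location fixed at zero), a common strength model
        shape, loc, scale = stats.weibull_min.fit(strengths, floc=0.0)

        stress = 350.0                                          # candidate service stress
        survival = stats.weibull_min.sf(stress, shape, loc=loc, scale=scale)
        threshold_90 = stats.weibull_min.ppf(0.10, shape, loc=loc, scale=scale)
        print(f"P(survive {stress} MPa) = {survival:.3f}; 90% survival threshold = {threshold_90:.1f} MPa")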

  7. Soil mapping and process modeling for sustainable land use management: a brief historical review

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Pereira, Paulo; Muñoz-Rojas, Miriam; Miller, Bradley A.; Cerdà, Artemi; Parras-Alcántara, Luis; Lozano-García, Beatriz

    2017-04-01

    Basic soil management goes back to the earliest days of agricultural practices, approximately 9,000 BCE. Through time humans developed soil management techniques of ever increasing complexity, including plows, contour tillage, terracing, and irrigation. Spatial soil patterns were being recognized as early as 3,000 BCE, but the first soil maps didn't appear until the 1700s and the first soil models finally arrived in the 1880s (Brevik et al., in press). The beginning of the 20th century saw an increase in standardization in many soil science methods and wide-spread soil mapping in many parts of the world, particularly in developed countries. However, the classification systems used, mapping scale, and national coverage varied considerably from country to country. Major advances were made in pedologic modeling starting in the 1940s, and in erosion modeling starting in the 1950s. In the 1970s and 1980s advances in computing power, remote and proximal sensing, geographic information systems (GIS), global positioning systems (GPS), and statistics and spatial statistics among other numerical techniques significantly enhanced our ability to map and model soils (Brevik et al., 2016). These types of advances positioned soil science to make meaningful contributions to sustainable land use management as we moved into the 21st century. References Brevik, E., Pereira, P., Muñoz-Rojas, M., Miller, B., Cerda, A., Parras-Alcantara, L., Lozano-Garcia, B. Historical perspectives on soil mapping and process modelling for sustainable land use management. In: Pereira, P., Brevik, E., Muñoz-Rojas, M., Miller, B. (eds) Soil mapping and process modelling for sustainable land use management (In press). Brevik, E., Calzolari, C., Miller, B., Pereira, P., Kabala, C., Baumgarten, A., Jordán, A. 2016. Historical perspectives and future needs in soil mapping, classification and pedological modelling, Geoderma, 264, Part B, 256-274.

  8. Dynamic rain fade compensation techniques for the advanced communications technology satellite

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1992-01-01

    The dynamic and composite nature of propagation impairments that are incurred on earth-space communications links at frequencies in and above the 30/20 GHz Ka band necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) project by the implementation of optimal processing schemes derived through the use of the ACTS Rain Attenuation Prediction Model and nonlinear Markov filtering theory. The ACTS Rain Attenuation Prediction Model discerns climatological variations on the order of 0.5 deg in latitude and longitude in the continental U.S. The time-dependent portion of the model gives precise availability predictions for the 'spot beam' links of ACTS. However, the structure of the dynamic portion of the model, which yields performance parameters such as fade duration probabilities, is isomorphic to the state-variable approach of stochastic control theory and is amenable to the design of such statistical fade processing schemes which can be made specific to the particular climatological location at which they are employed.

  9. EEG analysis of the brain activity during the observation of commercial, political, or public service announcements.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio

    2010-01-01

    The use of modern brain imaging techniques could be useful to understand what brain areas are involved in the observation of video clips related to commercial advertising, as well as for the support of political campaigns, and also the areas of Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and "supporter" voters. Results suggested that the cortical activity during the observation of commercial spots could vary considerably across the spot. This fact suggests the possibility of removing the parts of the spot that are not particularly attractive by using these cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporters group when compared to the swing voters. In this case, it is possible to conclude that the communication proposed has failed to raise attention or interest in swing voters. In conclusion, high-resolution EEG statistical techniques have proved able to generate useful insights about the fruition of TV messages, in both the commercial and political fields.
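
    A minimal sketch of the general idea of comparing spectral power between groups, using Welch band power and a t-test on synthetic signals; the authors' high-resolution EEG pipeline is considerably more elaborate, so this is only orientation.

        # Sketch: band power per subject plus a between-group test (synthetic signals).
        import numpy as np
        from scipy import signal, stats

        fs = 256                                               # sampling rate, Hz
        t = np.arange(0, 30, 1 / fs)
        rng = np.random.default_rng(7)

        def alpha_power(trace):
            f, pxx = signal.welch(trace, fs=fs, nperseg=fs * 2)
            band = (f >= 8) & (f <= 12)                        # alpha band
            return pxx[band].sum() * (f[1] - f[0])             # integrated band power

        group_a = [alpha_power(np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)) for _ in range(13)]
        group_b = [alpha_power(0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)) for _ in range(13)]
        print(stats.ttest_ind(group_a, group_b))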

  10. Three-dimensional virtual planning in orthognathic surgery enhances the accuracy of soft tissue prediction.

    PubMed

    Van Hemelen, Geert; Van Genechten, Maarten; Renier, Lieven; Desmedt, Maria; Verbruggen, Elric; Nadjmi, Nasser

    2015-07-01

    Throughout the history of computing, shortening the gap between the physical and digital world behind the screen has always been strived for. Recent advances in three-dimensional (3D) virtual surgery programs have reduced this gap significantly. Although 3D assisted surgery is now widely available for orthognathic surgery, one might still argue whether a 3D virtual planning approach is a better alternative to a conventional two-dimensional (2D) planning technique. The purpose of this study was to compare the accuracy of a traditional 2D technique and a 3D computer-aided prediction method. A double blind randomised prospective study was performed to compare the prediction accuracy of a traditional 2D planning technique versus a 3D computer-aided planning approach. The accuracy of the hard and soft tissue profile predictions using both planning methods was investigated. There was a statistically significant difference between 2D and 3D soft tissue planning (p < 0.05). The statistically significant difference found between 2D and 3D planning and the actual soft tissue outcome was not confirmed by a statistically significant difference between methods. The 3D planning approach provides more accurate soft tissue planning. However, the 2D orthognathic planning is comparable to 3D planning when it comes to hard tissue planning. This study provides relevant results for choosing between 3D and 2D planning in clinical practice. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  11. EEG Analysis of the Brain Activity during the Observation of Commercial, Political, or Public Service Announcements

    PubMed Central

    Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio

    2010-01-01

    The use of modern brain imaging techniques could be useful to understand what brain areas are involved in the observation of video clips related to commercial advertising, as well as for the support of political campaigns, and also the areas of Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and “supporter” voters. Results suggested that the cortical activity during the observation of commercial spots could vary considerably across the spot. This fact suggests the possibility of removing the parts of the spot that are not particularly attractive by using these cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporters group when compared to the swing voters. In this case, it is possible to conclude that the communication proposed has failed to raise attention or interest in swing voters. In conclusion, high-resolution EEG statistical techniques have proved able to generate useful insights about the fruition of TV messages, in both the commercial and political fields. PMID:20069055

  12. Targeting the untargeted in molecular phenomics with structurally-selective ion mobility-mass spectrometry.

    PubMed

    May, Jody Christopher; Gant-Branum, Randi Lee; McLean, John Allen

    2016-06-01

    Systems-wide molecular phenomics is rapidly expanding through technological advances in instrumentation and bioinformatics. Strategies such as structural mass spectrometry, which utilizes size and shape measurements with molecular weight, serve to characterize the sum of molecular expression in biological contexts, where broad-scale measurements are made that are interpreted through big data statistical techniques to reveal underlying patterns corresponding to phenotype. The data density, data dimensionality, data projection, and data interrogation are all critical aspects of these approaches to turn data into salient information. Untargeted molecular phenomics is already having a dramatic impact in discovery science from drug discovery to synthetic biology. It is evident that these emerging techniques will integrate closely in broad efforts aimed at precision medicine. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877

  14. Development of advanced techniques for rotorcraft state estimation and parameter identification

    NASA Technical Reports Server (NTRS)

    Hall, W. E., Jr.; Bohn, J. G.; Vincent, J. H.

    1980-01-01

    An integrated methodology for rotorcraft system identification consists of rotorcraft mathematical modeling, three distinct data processing steps, and a technique for designing inputs to improve the identifiability of the data. These elements are as follows: (1) a Kalman filter smoother algorithm which estimates states and sensor errors from error-corrupted data. Gust time histories and statistics may also be estimated; (2) a model structure estimation algorithm for isolating a model which adequately explains the data; (3) a maximum likelihood algorithm for estimating the parameters, together with estimates of the variances of these estimates; and (4) an input design algorithm, based on a maximum likelihood approach, which provides inputs to improve the accuracy of parameter estimates. Each step is discussed with examples from both flight and simulated data cases.

  15. Characterizing reliability in a product/process design-assurance program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerscher, W.J. III; Booker, J.M.; Bement, T.R.

    1997-10-01

    Over the years many advanced techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to their reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
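
    A toy sketch of the Bayesian updating idea with a conjugate Beta-Binomial reliability model: a diffuse prior is updated as test results are folded in; the prior and the test campaigns are hypothetical, and the paper's actual framework folds in richer sources of information.

        # Sketch: Bayesian updating of a reliability estimate with a Beta-Binomial model
        # (prior parameters and test campaigns are hypothetical, not from the paper).
        from scipy import stats

        # diffuse prior reflecting large initial uncertainty about reliability R
        a, b = 1.0, 1.0

        # fold in successive test results: (number of units tested, number of failures)
        test_campaigns = [(10, 1), (20, 1), (40, 0)]
        for n, failures in test_campaigns:
            a += n - failures          # successes update a
            b += failures              # failures update b
            post = stats.beta(a, b)
            lower_90 = post.ppf(0.10)  # one-sided 90% lower bound on reliability
            print(f"after {n} tests: mean R = {post.mean():.3f}, 90% lower bound = {lower_90:.3f}")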

  16. Advanced statistics: linear regression, part I: simple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
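
    A minimal worked example of simple linear regression by the method of least squares, using statsmodels on made-up clinical-style data; the variable names and values are purely illustrative.

        # Sketch: simple linear regression by ordinary least squares (made-up clinical-style data).
        import numpy as np
        import statsmodels.api as sm

        age = np.array([22, 35, 47, 51, 60, 64, 70, 78], float)          # predictor
        sbp = np.array([118, 121, 130, 135, 142, 141, 150, 155], float)  # outcome (systolic BP)

        X = sm.add_constant(age)            # adds the intercept column
        model = sm.OLS(sbp, X).fit()        # the method of least squares
        print(model.params)                 # intercept and slope of the regression line
        print(model.conf_int())             # inference on the coefficients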

  17. Confocal Imaging of porous media

    NASA Astrophysics Data System (ADS)

    Shah, S.; Crawshaw, D.; Boek, D.

    2012-12-01

    Carbonate rocks, which hold approximately 50% of the world's oil and gas reserves, have a very complicated and heterogeneous structure in comparison with sandstone reservoir rock. We present advances with different techniques to image, reconstruct, and characterize statistically the micro-geometry of carbonate pores. The main goal here is to develop a technique to obtain two-dimensional and three-dimensional images using Confocal Laser Scanning Microscopy (CLSM). CLSM is used in epi-fluorescent imaging mode, allowing very high optical resolution of features well below 1 μm in size. Images of pore structures were captured using CLSM imaging where spaces in the carbonate samples were impregnated with a fluorescent, dyed epoxy-resin, and scanned in the x-y plane by a laser probe. We discuss in detail the sample preparation for confocal imaging to obtain sub-micron resolution images of heterogeneous carbonate rocks. We also discuss the technical and practical aspects of this imaging technique, including its advantages and limitations. We present several examples of this application, including studying pore geometry in carbonates and characterizing sub-resolution porosity in two-dimensional images. We then describe approaches to extract statistical information about porosity using image processing and spatial correlation functions. With the current capabilities and limitations of the CLSM technique, we have obtained only limited depth information along the z-axis (~50 μm) for developing three-dimensional images of carbonate rocks. Hence, we have planned a novel technique to obtain greater depth information and produce three-dimensional images with the sub-micron resolution possible in the lateral and axial planes.

  18. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    ERIC Educational Resources Information Center

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…

  19. An analytic treatment of gravitational microlensing for sources of finite size at large optical depths

    NASA Technical Reports Server (NTRS)

    Deguchi, Shuji; Watson, William D.

    1988-01-01

    Statistical methods are developed for gravitational lensing in order to obtain analytic expressions for the average surface brightness that include the effects of microlensing by stellar (or other compact) masses within the lensing galaxy. The primary advance here is in utilizing a Markoff technique to obtain expressions that are valid for sources of finite size when the surface density of mass in the lensing galaxy is large. The finite size of the source is probably the key consideration for the occurrence of microlensing by individual stars. For the intensity from a particular location, the parameter which governs the importance of microlensing is determined. Statistical methods are also formulated to assess the time variation of the surface brightness due to the random motion of the masses that cause the microlensing.

  20. A Statistical Description of Neural Ensemble Dynamics

    PubMed Central

    Long, John D.; Carmena, Jose M.

    2011-01-01

    The growing use of multi-channel neural recording techniques in behaving animals has produced rich datasets that hold immense potential for advancing our understanding of how the brain mediates behavior. One limitation of these techniques is they do not provide important information about the underlying anatomical connections among the recorded neurons within an ensemble. Inferring these connections is often intractable because the set of possible interactions grows exponentially with ensemble size. This is a fundamental challenge one confronts when interpreting these data. Unfortunately, the combination of expert knowledge and ensemble data is often insufficient for selecting a unique model of these interactions. Our approach shifts away from modeling the network diagram of the ensemble toward analyzing changes in the dynamics of the ensemble as they relate to behavior. Our contribution consists of adapting techniques from signal processing and Bayesian statistics to track the dynamics of ensemble data on time-scales comparable with behavior. We employ a Bayesian estimator to weigh prior information against the available ensemble data, and use an adaptive quantization technique to aggregate poorly estimated regions of the ensemble data space. Importantly, our method is capable of detecting changes in both the magnitude and structure of correlations among neurons missed by firing rate metrics. We show that this method is scalable across a wide range of time-scales and ensemble sizes. Lastly, the performance of this method on both simulated and real ensemble data is used to demonstrate its utility. PMID:22319486

  1. High-spatial-resolution passive microwave sounding systems

    NASA Technical Reports Server (NTRS)

    Staelin, D. H.; Rosenkranz, P. W.

    1994-01-01

    The principal contributions of this combined theoretical and experimental effort were to advance and demonstrate new and more accurate techniques for sounding atmospheric temperature, humidity, and precipitation profiles at millimeter wavelengths, and to improve the scientific basis for such soundings. Some of these techniques are being incorporated in both research and operational systems. Specific results include: (1) development of the MIT Microwave Temperature Sounder (MTS), a 118-GHz eight-channel imaging spectrometer plus a switched-frequency spectrometer near 53 GHz, for use on the NASA ER-2 high-altitude aircraft, (2) conduct of ER-2 MTS missions in multiple seasons and locations in combination with other instruments, mapping with unprecedented approximately 2-km lateral resolution atmospheric temperature and precipitation profiles, atmospheric transmittances (at both zenith and nadir), frontal systems, and hurricanes, (3) ground based 118-GHz 3-D spectral images of wavelike structure within clouds passing overhead, (4) development and analysis of approaches to ground- and space-based 5-mm wavelength sounding of the upper stratosphere and mesosphere, which supported the planning of improvements to operational weather satellites, (5) development of improved multidimensional and adaptive retrieval methods for atmospheric temperature and humidity profiles, (6) development of combined nonlinear and statistical retrieval techniques for 183-GHz humidity profile retrievals, (7) development of nonlinear statistical retrieval techniques for precipitation cell-top altitudes, and (8) numerical analyses of the impact of remote sensing data on the accuracy of numerical weather predictions; a 68-km gridded model was used to study the spectral properties of error growth.

  2. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and with matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression, density estimation and smoothing, general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.

  3. Privacy-Preserving Data Exploration in Genome-Wide Association Studies.

    PubMed

    Johnson, Aaron; Shmatikov, Vitaly

    2013-08-01

    Genome-wide association studies (GWAS) have become a popular method for analyzing sets of DNA sequences in order to discover the genetic basis of disease. Unfortunately, statistics published as the result of GWAS can be used to identify individuals participating in the study. To prevent privacy breaches, even previously published results have been removed from public databases, impeding researchers' access to the data and hindering collaborative research. Existing techniques for privacy-preserving GWAS focus on answering specific questions, such as correlations between a given pair of SNPs (DNA sequence variations). This does not fit the typical GWAS process, where the analyst may not know in advance which SNPs to consider and which statistical tests to use, how many SNPs are significant for a given dataset, etc. We present a set of practical, privacy-preserving data mining algorithms for GWAS datasets. Our framework supports exploratory data analysis, where the analyst does not know a priori how many and which SNPs to consider. We develop privacy-preserving algorithms for computing the number and location of SNPs that are significantly associated with the disease, the significance of any statistical test between a given SNP and the disease, any measure of correlation between SNPs, and the block structure of correlations. We evaluate our algorithms on real-world datasets and demonstrate that they produce significantly more accurate results than prior techniques while guaranteeing differential privacy.
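
    As a minimal, hedged illustration of the differential-privacy machinery that underlies such algorithms (not the authors' specific method), the sketch below perturbs a hypothetical per-SNP allele count with Laplace noise calibrated to the count's sensitivity; the counts, sensitivity, and privacy budget epsilon are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Return a differentially private estimate of a count statistic."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(0.0, scale)

rng = np.random.default_rng(0)
# Hypothetical data: minor-allele counts for one SNP in cases and controls.
case_count, control_count = 412, 355
# Adding or removing one participant changes each count by at most 2
# (a diploid genotype contributes 0, 1, or 2 minor alleles).
sensitivity = 2.0
epsilon = 1.0  # privacy budget spent on this single release

noisy_case = laplace_mechanism(case_count, sensitivity, epsilon, rng)
noisy_control = laplace_mechanism(control_count, sensitivity, epsilon, rng)
print(f"private case count: {noisy_case:.1f}, private control count: {noisy_control:.1f}")
```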

  4. A test to evaluate the earthquake prediction algorithm, M8

    USGS Publications Warehouse

    Healy, John H.; Kossobokov, Vladimir G.; Dewey, James W.

    1992-01-01

    A test of the algorithm M8 is described. The test is constructed to meet four rules, which we propose to be applicable to the test of any method for earthquake prediction:  1. An earthquake prediction technique should be presented as a well documented, logical algorithm that can be used by  investigators without restrictions. 2. The algorithm should be coded in a common programming language and implementable on widely available computer systems. 3. A test of the earthquake prediction technique should involve future predictions with a black box version of the algorithm in which potentially adjustable parameters are fixed in advance. The source of the input data must be defined and ambiguities in these data must be resolved automatically by the algorithm. 4. At least one reasonable null hypothesis should be stated in advance of testing the earthquake prediction method, and it should be stated how this null hypothesis will be used to estimate the statistical significance of the earthquake predictions. The M8 algorithm has successfully predicted several destructive earthquakes, in the sense that the earthquakes occurred inside regions with linear dimensions from 384 to 854 km that the algorithm had identified as being in times of increased probability for strong earthquakes. In addition, M8 has successfully "post predicted" high percentages of strong earthquakes in regions to which it has been applied in retroactive studies. The statistical significance of previous predictions has not been established, however, and post-prediction studies in general are notoriously subject to success-enhancement through hindsight. Nor has it been determined how much more precise an M8 prediction might be than forecasts and probability-of-occurrence estimates made by other techniques. We view our test of M8 both as a means to better determine the effectiveness of M8 and as an experimental structure within which to make observations that might lead to improvements in the algorithm or conceivably lead to a radically different approach to earthquake prediction.
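
    Rule 4 can be illustrated with a simple null model in which each target earthquake falls inside an alarm purely by chance, in proportion to the fraction of space-time occupied by alarms. The sketch below computes a one-sided binomial p-value for hypothetical hit counts and alarm coverage; the numbers are invented and are not results of the M8 test.

```python
from scipy.stats import binom

# Hypothetical test outcome: 10 target earthquakes occurred, 7 fell inside
# declared times/regions of increased probability, and the alarms covered
# 40% of the monitored space-time volume.
n_events = 10
n_hits = 7
alarm_fraction = 0.40  # null model: each event hits an alarm with this probability

# One-sided p-value: probability of at least n_hits successes by chance alone.
p_value = binom.sf(n_hits - 1, n_events, alarm_fraction)
print(f"P(>= {n_hits} hits out of {n_events} under the null) = {p_value:.3f}")
```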

  5. Attitudes toward Advanced and Multivariate Statistics When Using Computers.

    ERIC Educational Resources Information Center

    Kennedy, Robert L.; McCallister, Corliss Jean

    This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…

  6. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    ERIC Educational Resources Information Center

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  7. Detection of Failure in Asynchronous Motor Using Soft Computing Method

    NASA Astrophysics Data System (ADS)

    Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.

    2018-04-01

    This paper investigates stator winding short-circuit failures in asynchronous motors and their effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, can help detect asynchronous motor failures. Fuzzy logic resembles human reasoning in that it enables linguistic inference from vague or uncertain data. A dynamic model of the asynchronous motor is developed with a fuzzy logic classifier to investigate stator inter-turn failures as well as open-phase failures. A hardware implementation was carried out with LabVIEW for online monitoring of faults.
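
    A minimal sketch of this kind of fuzzy-logic classification, assuming a single hypothetical negative-sequence-current feature and illustrative triangular membership functions and thresholds (not the authors' model):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def classify_stator_condition(neg_seq_current):
    """Fuzzy rules mapping a (hypothetical) negative-sequence current feature,
    in per-unit, to linguistic machine-health categories."""
    grades = {
        "healthy": tri(neg_seq_current, -0.02, 0.00, 0.04),
        "incipient fault": tri(neg_seq_current, 0.02, 0.06, 0.10),
        "faulty": tri(neg_seq_current, 0.08, 0.15, 0.30),
    }
    return max(grades, key=grades.get), grades

label, grades = classify_stator_condition(0.07)
print(label, {k: round(float(v), 2) for k, v in grades.items()})
```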

  8. Preface: Special Topic on Single-Molecule Biophysics

    NASA Astrophysics Data System (ADS)

    Makarov, Dmitrii E.; Schuler, Benjamin

    2018-03-01

    Single-molecule measurements are now almost routinely used to study biological systems and processes. The scope of this special topic emphasizes the physics side of single-molecule observations, with the goal of highlighting new developments in physical techniques as well as conceptual insights that single-molecule measurements bring to biophysics. This issue also comprises recent advances in theoretical physical models of single-molecule phenomena, interpretation of single-molecule signals, and fundamental areas of statistical mechanics that are related to single-molecule observations. A particular goal is to illustrate the increasing synergy between theory, simulation, and experiment in single-molecule biophysics.

  9. Subgrid or Reynolds stress-modeling for three-dimensional turbulence computations

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.

    1975-01-01

    A review is given of recent advances in two distinct computational methods for evaluating turbulence fields, namely, statistical Reynolds stress modeling and turbulence simulation, where large eddies are followed in time. It is shown that evaluation of the mean Reynolds stresses, rather than use of a scalar eddy viscosity, permits an explanation of streamline curvature effects found in several experiments. Turbulence simulation, with a new volume averaging technique and third-order accurate finite-difference computing is shown to predict the decay of isotropic turbulence in incompressible flow with rather modest computer storage requirements, even at Reynolds numbers of aerodynamic interest.

  10. GIS-based analysis and modelling with empirical and remotely-sensed data on coastline advance and retreat

    NASA Astrophysics Data System (ADS)

    Ahmad, Sajid Rashid

    With the understanding that far more research remains to be done on the development and use of innovative and functional geospatial techniques and procedures to investigate coastline changes this thesis focussed on the integration of remote sensing, geographical information systems (GIS) and modelling techniques to provide meaningful insights on the spatial and temporal dynamics of coastline changes. One of the unique strengths of this research was the parameterization of the GIS with long-term empirical and remote sensing data. Annual empirical data from 1941--2007 were analyzed by the GIS, and then modelled with statistical techniques. Data were also extracted from Landsat TM and ETM+ images. The band ratio method was used to extract the coastlines. Topographic maps were also used to extract digital map data. All data incorporated into ArcGIS 9.2 were analyzed with various modules, including Spatial Analyst, 3D Analyst, and Triangulated Irregular Networks. The Digital Shoreline Analysis System was used to analyze and predict rates of coastline change. GIS results showed the spatial locations along the coast that will either advance or retreat over time. The linear regression results highlighted temporal changes which are likely to occur along the coastline. Box-Jenkins modelling procedures were utilized to determine statistical models which best described the time series (1941--2007) of coastline change data. After several iterations and goodness-of-fit tests, second-order spatial cyclic autoregressive models, first-order autoregressive models and autoregressive moving average models were identified as being appropriate for describing the deterministic and random processes operating in Guyana's coastal system. The models highlighted not only cyclical patterns in advance and retreat of the coastline, but also the existence of short and long-term memory processes. Long-term memory processes could be associated with mudshoal propagation and stabilization while short-term memory processes were indicative of transitory hydrodynamic and other processes. An innovative framework for a spatio-temporal information-based system (STIBS) was developed. STIBS incorporated diverse datasets within a GIS, dynamic computer-based simulation models, and a spatial information query and graphical subsystem. Tests of the STIBS proved that it could be used to simulate and visualize temporal variability in shifting morphological states of the coastline.
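
    A brief sketch of Box-Jenkins style model fitting of the kind described above, assuming the statsmodels library and a synthetic stand-in for the annual coastline-change series; the ARMA order and data are illustrative, not those identified in the thesis.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Synthetic annual coastline-change series (m/yr), a stand-in for 1941-2007 data.
years = np.arange(1941, 2008)
noise = rng.normal(0, 5, size=years.size)
series = np.empty(years.size)
series[0] = noise[0]
for t in range(1, years.size):
    series[t] = 0.6 * series[t - 1] + noise[t]   # AR(1)-like advance/retreat signal

# Candidate ARMA(1, 1) fit; in practice competing orders would be compared via AIC
# and residual diagnostics, as in the Box-Jenkins procedure.
fit = ARIMA(series, order=(1, 0, 1)).fit()
print(fit.params)
print("AIC:", fit.aic)
```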

  11. Earthquake prediction evaluation standards applied to the VAN Method

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.

  12. Advanced statistics: linear regression, part II: multiple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
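
    For illustration, a multiple linear regression with two hypothetical predictors can be fitted as below, yielding the partial regression coefficients and the exact confidence intervals discussed in the article; the data and variable names are invented for the example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
# Hypothetical predictors: age (years) and systolic blood pressure (mmHg).
age = rng.normal(55, 10, n)
sbp = rng.normal(130, 15, n)
outcome = 2.0 + 0.05 * age + 0.03 * sbp + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([age, sbp]))
model = sm.OLS(outcome, X).fit()
print(model.params)        # intercept and partial regression coefficients
print(model.conf_int())    # exact 95% confidence intervals for each coefficient
```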

  13. Optimization of Premix Powders for Tableting Use.

    PubMed

    Todo, Hiroaki; Sato, Kazuki; Takayama, Kozo; Sugibayashi, Kenji

    2018-05-08

    Direct compression is a popular choice because it provides the simplest way to prepare a tablet. It can be easily adopted when the active pharmaceutical ingredient (API) is unstable in water or under thermal drying. An optimal formulation of preliminarily mixed powders (premix powders) is beneficial if prepared in advance for tableting use. The aim of this study was to find the optimal formulation of premix powders composed of lactose (LAC), cornstarch (CS), and microcrystalline cellulose (MCC) by using statistical techniques. Based on the "Quality by Design" concept, a (3,3)-simplex lattice design consisting of the three components LAC, CS, and MCC was employed to prepare the model premix powders. A response surface method incorporating thin-plate spline interpolation (RSM-S) was applied to estimate the optimal premix powder for tableting use. The effect of tablet shape, characterized by its surface curvature, on the optimization was investigated. The optimal premix powder was effective when applied to a small quantity of API, although its benefit was limited for formulations containing a large amount of API. Statistical techniques are valuable for exploiting new functions of well-known materials such as LAC, CS, and MCC.
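
    A hedged sketch of an RSM-S style response surface over a three-component mixture design, using a thin-plate spline radial basis function from SciPy on hypothetical lattice points and responses (not the study's data); because the fractions sum to one, two components suffice as inputs.

```python
import numpy as np
from scipy.interpolate import Rbf

# Hypothetical (3,3)-simplex lattice points: fractions of LAC, CS, MCC summing to 1,
# with an invented measured response (e.g., tablet hardness) at each point.
lattice = np.array([
    [1.00, 0.00, 0.00], [0.00, 1.00, 0.00], [0.00, 0.00, 1.00],
    [0.67, 0.33, 0.00], [0.67, 0.00, 0.33], [0.33, 0.67, 0.00],
    [0.00, 0.67, 0.33], [0.33, 0.00, 0.67], [0.00, 0.33, 0.67],
    [0.33, 0.33, 0.34],
])
response = np.array([55, 40, 80, 50, 68, 45, 58, 75, 66, 62], dtype=float)

# Thin-plate spline response surface over the two free mixture coordinates.
surface = Rbf(lattice[:, 0], lattice[:, 1], response, function="thin_plate")

candidate = (0.40, 0.20)  # 40% LAC, 20% CS, remainder MCC
print(f"predicted response at LAC=0.40, CS=0.20, MCC=0.40: "
      f"{float(surface(*candidate)):.1f}")
```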

  14. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine chosen for its history of cracking was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components are verified. Three dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.

  15. Hard, harder, hardest: principal stratification, statistical identifiability, and the inherent difficulty of finding surrogate endpoints.

    PubMed

    Wolfson, Julian; Henn, Lisa

    2014-01-01

    In many areas of clinical investigation there is great interest in identifying and validating surrogate endpoints, biomarkers that can be measured a relatively short time after a treatment has been administered and that can reliably predict the effect of treatment on the clinical outcome of interest. However, despite dramatic advances in the ability to measure biomarkers, the recent history of clinical research is littered with failed surrogates. In this paper, we present a statistical perspective on why identifying surrogate endpoints is so difficult. We view the problem from the framework of causal inference, with a particular focus on the technique of principal stratification (PS), an approach which is appealing because the resulting estimands are not biased by unmeasured confounding. In many settings, PS estimands are not statistically identifiable and their degree of non-identifiability can be thought of as representing the statistical difficulty of assessing the surrogate value of a biomarker. In this work, we examine the identifiability issue and present key simplifying assumptions and enhanced study designs that enable the partial or full identification of PS estimands. We also present example situations where these assumptions and designs may or may not be feasible, providing insight into the problem characteristics which make the statistical evaluation of surrogate endpoints so challenging.

  16. Hard, harder, hardest: principal stratification, statistical identifiability, and the inherent difficulty of finding surrogate endpoints

    PubMed Central

    2014-01-01

    In many areas of clinical investigation there is great interest in identifying and validating surrogate endpoints, biomarkers that can be measured a relatively short time after a treatment has been administered and that can reliably predict the effect of treatment on the clinical outcome of interest. However, despite dramatic advances in the ability to measure biomarkers, the recent history of clinical research is littered with failed surrogates. In this paper, we present a statistical perspective on why identifying surrogate endpoints is so difficult. We view the problem from the framework of causal inference, with a particular focus on the technique of principal stratification (PS), an approach which is appealing because the resulting estimands are not biased by unmeasured confounding. In many settings, PS estimands are not statistically identifiable and their degree of non-identifiability can be thought of as representing the statistical difficulty of assessing the surrogate value of a biomarker. In this work, we examine the identifiability issue and present key simplifying assumptions and enhanced study designs that enable the partial or full identification of PS estimands. We also present example situations where these assumptions and designs may or may not be feasible, providing insight into the problem characteristics which make the statistical evaluation of surrogate endpoints so challenging. PMID:25342953

  17. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
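
    As an illustration of the general secret-sharing idea only (a deliberately simplified stand-in for the authors' protocols), the sketch below uses additive secret sharing so that three hypothetical sites can pool minor-allele counts and compute a minor allele frequency without revealing their local counts.

```python
import random

PRIME = 2_147_483_647  # field size for additive secret sharing (illustrative)

def share(value, n_parties, rng):
    """Split an integer into n additive shares modulo PRIME."""
    shares = [rng.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

rng = random.Random(0)
# Hypothetical minor-allele counts and allele totals held by three study sites.
site_counts = [412, 355, 268]
site_totals = [2 * 500, 2 * 450, 2 * 400]   # two alleles per genotyped participant

# Each site secret-shares its local numbers; only the pooled sums are reconstructed.
count_shares = [share(c, 3, rng) for c in site_counts]
total_shares = [share(t, 3, rng) for t in site_totals]
pooled_count = reconstruct([sum(col) % PRIME for col in zip(*count_shares)])
pooled_total = reconstruct([sum(col) % PRIME for col in zip(*total_shares)])
print("pooled minor allele frequency:", pooled_count / pooled_total)
```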

  18. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  19. Advances in the microrheology of complex fluids

    NASA Astrophysics Data System (ADS)

    Waigh, Thomas Andrew

    2016-07-01

    New developments in the microrheology of complex fluids are considered. Firstly the requirements for a simple modern particle tracking microrheology experiment are introduced, the error analysis methods associated with it and the mathematical techniques required to calculate the linear viscoelasticity. Progress in microrheology instrumentation is then described with respect to detectors, light sources, colloidal probes, magnetic tweezers, optical tweezers, diffusing wave spectroscopy, optical coherence tomography, fluorescence correlation spectroscopy, elastic- and quasi-elastic scattering techniques, 3D tracking, single molecule methods, modern microscopy methods and microfluidics. New theoretical techniques are also reviewed such as Bayesian analysis, oversampling, inversion techniques, alternative statistical tools for tracks (angular correlations, first passage probabilities, the kurtosis, motor protein step segmentation etc), issues in micro/macro rheological agreement and two particle methodologies. Applications where microrheology has begun to make some impact are also considered including semi-flexible polymers, gels, microorganism biofilms, intracellular methods, high frequency viscoelasticity, comb polymers, active motile fluids, blood clots, colloids, granular materials, polymers, liquid crystals and foods. Two large emergent areas of microrheology, non-linear microrheology and surface microrheology are also discussed.
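
    A minimal particle-tracking microrheology sketch, assuming a simulated 2-D Brownian track: it computes the time-averaged mean-squared displacement and its log-log slope, the basic quantities from which linear viscoelasticity is inferred; the frame interval and step size are invented for illustration.

```python
import numpy as np

def mean_squared_displacement(track, max_lag):
    """Time-averaged MSD of a 2-D track (shape: n_frames x 2)."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = track[lag:] - track[:-lag]
        msd[lag - 1] = np.mean(np.sum(disp**2, axis=1))
    return msd

rng = np.random.default_rng(3)
dt = 0.01                                      # s, hypothetical frame interval
steps = rng.normal(0, 0.05, size=(2000, 2))    # um, free-diffusion steps
track = np.cumsum(steps, axis=0)

lags = np.arange(1, 51) * dt
msd = mean_squared_displacement(track, 50)
# A log-log slope near 1 indicates a viscous (diffusive) response; a slope
# below 1 suggests viscoelastic subdiffusion in the generalized Stokes-Einstein picture.
slope = np.polyfit(np.log(lags), np.log(msd), 1)[0]
print(f"MSD power-law exponent: {slope:.2f}")
```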

  20. Quantitative Aspects of Single Molecule Microscopy

    PubMed Central

    Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally

    2015-01-01

    Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102

  1. Current application of chemometrics in traditional Chinese herbal medicine research.

    PubMed

    Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke

    2016-07-15

    Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometrics provides highly useful tools for the quality control of TCHMs, harnessing mathematics, statistics and other methods to extract the maximum amount of information from data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. In this work, the application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced. Recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Door Security using Face Detection and Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Bhutra, Venkatesh; Kumar, Harshav; Jangid, Santosh; Solanki, L.

    2018-03-01

    With the world moving towards advanced technologies, security forms a crucial part of daily life. Among the many techniques used for this purpose, face recognition stands as an effective means of authentication and security. This paper deals with the use of principal component analysis (PCA) for security. PCA is a statistical approach used to simplify a data set. The minimum Euclidean distance found with the PCA technique is used to recognize the face. A Raspberry Pi, a low-cost ARM-based computer on a small circuit board, controls the servo motor and other sensors. The servo motor is in turn attached to the door of the home and opens it when the face is recognized. The proposed work has been done using a self-made training database of students from B.K. Birla Institute of Engineering and Technology, Pilani, Rajasthan, India.
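
    A hedged sketch of the PCA-plus-minimum-Euclidean-distance recognition step described here, using scikit-learn on random stand-in images; the database size, component count, and acceptance threshold are illustrative assumptions, not the paper's values.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# Stand-in for a small training database: 20 flattened 32x32 face images.
train_faces = rng.random((20, 32 * 32))
train_labels = np.repeat(np.arange(5), 4)        # 5 students, 4 images each

pca = PCA(n_components=10)                       # principal components ("eigenfaces")
train_proj = pca.fit_transform(train_faces)

def recognize(face, threshold=5.0):
    """Project a probe image and return the label of the closest training face,
    or None if the minimum Euclidean distance exceeds the (hypothetical) threshold."""
    proj = pca.transform(face.reshape(1, -1))
    dists = np.linalg.norm(train_proj - proj, axis=1)
    best = int(np.argmin(dists))
    label = train_labels[best] if dists[best] < threshold else None
    return label, float(dists[best])

label, dist = recognize(train_faces[7] + rng.normal(0, 0.01, 32 * 32))
print(label, round(dist, 2))   # on a match, a door-opening command would be issued
```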

  3. Utilization of ERTS-1 for appraising changes in continental migratory bird habitat

    NASA Technical Reports Server (NTRS)

    Gilmer, D. S. (Principal Investigator); Work, E. A., Jr.; Klett, A. T.

    1974-01-01

    The author has identified the following significant results. Information on numbers, distribution, and quality of wetlands in the breeding range of migratory waterfowl is important for the management of this wildlife resource. Using computer processing of data gathered by the ERTS-1 multispectral scanner, techniques for obtaining indices of annual waterfowl recruitment, and habitat quality are examined. As a primary task, thematic maps and statistics relating to open surface water were produced. Discrimination of water was based upon water's low apparent radiance in a single, near-infrared waveband. An advanced technique using multispectral information for discerning open water at a level of detail finer than the virtual resolution of the data was also successfully tested. In another related task, vegetation indicators were used for detecting conditions of latent or occluded water and upland habitat characteristics.

  4. Imaging flow cytometry for phytoplankton analysis.

    PubMed

    Dashkova, Veronika; Malashenkov, Dmitry; Poulton, Nicole; Vorobjev, Ivan; Barteneva, Natasha S

    2017-01-01

    This review highlights the concepts and instrumentation of imaging flow cytometry technology and in particular its use for phytoplankton analysis. Imaging flow cytometry, a hybrid technology combining the speed and statistical capabilities of flow cytometry with the imaging features of microscopy, is rapidly advancing as a cell imaging platform that overcomes many of the limitations of current techniques and has contributed significantly to the advancement of phytoplankton analysis in recent years. This review presents the various instrumentation relevant to the field and currently used for assessment of the composition and abundance of complex phytoplankton communities, size structure determination, biovolume estimation, detection of harmful algal bloom species, evaluation of viability and metabolic activity, and other applications. We also present our data on viability and metabolic assessment of Aphanizomenon sp. cyanobacteria using the Imagestream X Mark II imaging cytometer. Herein, we highlight the immense potential of imaging flow cytometry for microalgal research, but also discuss limitations and future developments. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Recent advances in parametric neuroreceptor mapping with dynamic PET: basic concepts and graphical analyses.

    PubMed

    Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung

    2014-10-01

    Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping that produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent on any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly-used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.

  6. Phylogeography Takes a Relaxed Random Walk in Continuous Space and Time

    PubMed Central

    Lemey, Philippe; Rambaut, Andrew; Welch, John J.; Suchard, Marc A.

    2010-01-01

    Research aimed at understanding the geographic context of evolutionary histories is burgeoning across biological disciplines. Recent endeavors attempt to interpret contemporaneous genetic variation in the light of increasingly detailed geographical and environmental observations. Such interest has promoted the development of phylogeographic inference techniques that explicitly aim to integrate such heterogeneous data. One promising development involves reconstructing phylogeographic history on a continuous landscape. Here, we present a Bayesian statistical approach to infer continuous phylogeographic diffusion using random walk models while simultaneously reconstructing the evolutionary history in time from molecular sequence data. Moreover, by accommodating branch-specific variation in dispersal rates, we relax the most restrictive assumption of the standard Brownian diffusion process and demonstrate increased statistical efficiency in spatial reconstructions of overdispersed random walks by analyzing both simulated and real viral genetic data. We further illustrate how drawing inference about summary statistics from a fully specified stochastic process over both sequence evolution and spatial movement reveals important characteristics of a rabies epidemic. Together with recent advances in discrete phylogeographic inference, the continuous model developments furnish a flexible statistical framework for biogeographical reconstructions that is easily expanded upon to accommodate various landscape genetic features. PMID:20203288

  7. Robust hypothesis tests for detecting statistical evidence of two-dimensional and three-dimensional interactions in single-molecule measurements

    NASA Astrophysics Data System (ADS)

    Calderon, Christopher P.; Weiss, Lucien E.; Moerner, W. E.

    2014-05-01

    Experimental advances have improved the two- (2D) and three-dimensional (3D) spatial resolution that can be extracted from in vivo single-molecule measurements. This enables researchers to quantitatively infer the magnitude and directionality of forces experienced by biomolecules in their native environment. Situations where such force information is relevant range from mitosis to directed transport of protein cargo along cytoskeletal structures. Models commonly applied to quantify single-molecule dynamics assume that effective forces and velocity in the x ,y (or x ,y,z) directions are statistically independent, but this assumption is physically unrealistic in many situations. We present a hypothesis testing approach capable of determining if there is evidence of statistical dependence between positional coordinates in experimentally measured trajectories; if the hypothesis of independence between spatial coordinates is rejected, then a new model accounting for 2D (3D) interactions can and should be considered. Our hypothesis testing technique is robust, meaning it can detect interactions, even if the noise statistics are not well captured by the model. The approach is demonstrated on control simulations and on experimental data (directed transport of intraflagellar transport protein 88 homolog in the primary cilium).

  8. Pulsating stars and the distance scale

    NASA Astrophysics Data System (ADS)

    Macri, Lucas

    2017-09-01

    I present an overview of the latest results from the SH0ES project, which obtained homogeneous Hubble Space Telescope (HST) photometry in the optical and near-infrared for ˜ 3500 and ˜ 2300 Cepheids, respectively, across 19 supernova hosts and 4 calibrators to determine the value of H0 with a total uncertainty of 2.4%. I discuss the current 3.4σ "tension" between this local measurement and predictions of H0 based on observations of the CMB and the assumption of "standard" ΛCDM. I review ongoing efforts to reach σ(H0) = 1%, including recent advances on the absolute calibration of Milky Way Cepheid period-luminosity relations (PLRs) using a novel astrometric technique with HST. Lastly, I highlight recent results from another collaboration on the development of new statistical techniques to detect, classify and phase extragalactic Miras using noisy and sparsely-sampled observations. I present preliminary Mira PLRs at various wavelengths based on the application of these techniques to a survey of M33.

  9. Design and statistical problems in prevention.

    PubMed

    Gullberg, B

    1996-01-01

    Clinical and epidemiological research in osteoporosis can benefit from using the methods and techniques established in the area of chronic disease epidemiology. However, attention has to be given to special characteristics such as the multifactorial nature of the disease and the fact that the subjects are usually of high age. In order to evaluate prevention it is of course first necessary to detect and confirm reversible risk factors. The advantages and disadvantages of different designs (cross-sectional, cohort and case-control) are well known. The effects of avoidable biases, e.g. selection, observation and confounding, have to be balanced against practical conveniences like time, expenses, recruitment, etc. The translation of relative risks into population attributable risks (etiologic fractions, prevented fractions) is complex and is usually performed under unrealistic, simplified assumptions. The consequences of interactions (synergy) between risk factors are often neglected. The multifactorial structure requires application of more advanced multi-level statistical techniques. The common strategy in prevention of targeting a cluster of risk factors in order to circumvent the multifactorial nature implies that in the end it is impossible to separate each unique factor. Experimental designs for evaluating prevention, like clinical trials and interventions, have to take into account the distinction between explanatory and pragmatic studies. An explanatory approach is similar to an idealized laboratory trial, while the pragmatic design is more realistic, practical and has a general public health perspective. The statistical techniques to be used in osteoporosis research are implemented in readily available computer packages like SAS, SPSS, BMDP and GLIM. In addition to traditional logistic regression, methods like Cox analysis and Poisson regression, as well as analysis of repeated measurements and cluster analysis, are also relevant.
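
    As a one-formula illustration of translating a relative risk into a population attributable fraction under the simplified single-exposure assumptions mentioned above (Levin's formula, with invented numbers):

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: fraction of cases in the population attributable
    to an exposure, given its prevalence and relative risk (assumes no
    confounding and a single, binary exposure)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical example: 30% of an elderly population carries a risk factor
# with a relative risk of 2.5 for fracture.
paf = population_attributable_fraction(0.30, 2.5)
print(f"Population attributable fraction: {paf:.1%}")
```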

  10. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  11. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students’ overall understanding and suggests better long-term knowledge retention. PMID:19750185

  12. Challenges of assessing critical thinking and clinical judgment in nurse practitioner students.

    PubMed

    Gorton, Karen L; Hayes, Janice

    2014-03-01

    The purpose of this study was to determine whether there was a relationship between critical thinking skills and clinical judgment in nurse practitioner students. The study used a convenience, nonprobability sampling technique, engaging participants from across the United States. Correlational analysis demonstrated no statistically significant relationship between critical thinking skills and examination-style questions, critical thinking skills and scores on the evaluation and reevaluation of consequences subscale of the Clinical Decision Making in Nursing Scale, and critical thinking skills and the preceptor evaluation tool. The study found no statistically significant relationships between critical thinking skills and clinical judgment. Educators and practitioners could consider further research in these areas to gain insight into how critical thinking is and could be measured, to gain insight into the clinical decision making skills of nurse practitioner students, and to gain insight into the development and measurement of critical thinking skills in advanced practice educational programs. Copyright 2014, SLACK Incorporated.

  13. Regression modeling of particle size distributions in urban storm water: advancements through improved sample collection methods

    USGS Publications Warehouse

    Fienen, Michael N.; Selbig, William R.

    2012-01-01

    A new sample collection system was developed to improve the representation of sediment entrained in urban storm water by integrating water quality samples from the entire water column. The depth-integrated sampler arm (DISA) was able to mitigate sediment stratification bias in storm water, thereby improving the characterization of suspended-sediment concentration and particle size distribution at three independent study locations. Use of the DISA decreased variability, which improved statistical regression to predict particle size distribution using surrogate environmental parameters, such as precipitation depth and intensity. The performance of this statistical modeling technique was compared to results using traditional fixed-point sampling methods and was found to perform better. When environmental parameters can be used to predict particle size distributions, environmental managers have more options when characterizing concentrations, loads, and particle size distributions in urban runoff.

  14. The Physics of Semiconductors

    NASA Astrophysics Data System (ADS)

    Brennan, Kevin F.

    1999-02-01

    Modern fabrication techniques have made it possible to produce semiconductor devices whose dimensions are so small that quantum mechanical effects dominate their behavior. This book describes the key elements of quantum mechanics, statistical mechanics, and solid-state physics that are necessary in understanding these modern semiconductor devices. The author begins with a review of elementary quantum mechanics, and then describes more advanced topics, such as multiple quantum wells. He then discusses equilibrium and nonequilibrium statistical mechanics. Following this introduction, he provides a thorough treatment of solid-state physics, covering electron motion in periodic potentials, electron-phonon interaction, and recombination processes. The final four chapters deal exclusively with real devices, such as semiconductor lasers, photodiodes, flat panel displays, and MOSFETs. The book contains many homework exercises and is suitable as a textbook for electrical engineering, materials science, or physics students taking courses in solid-state device physics. It will also be a valuable reference for practicing engineers in optoelectronics and related areas.

  15. Simulating the complex output of rainfall and hydrological processes using the information contained in large data sets: the Direct Sampling approach.

    NASA Astrophysics Data System (ADS)

    Oriani, Fabio

    2017-04-01

    The unpredictable nature of rainfall makes its estimation as difficult as it is essential to hydrological applications. Stochastic simulation is often considered a convenient approach to assess the uncertainty of rainfall processes, but preserving their irregular behavior and variability at multiple scales is a challenge even for the most advanced techniques. In this presentation, an overview of the Direct Sampling technique [1] and its recent application to rainfall and hydrological data simulation [2, 3] is given. The algorithm, having its roots in multiple-point statistics, makes use of a training data set to simulate the outcome of a process without inferring any explicit probability measure: the data are simulated in time or space by sampling the training data set where a sufficiently similar group of neighbor data exists. This approach allows preserving complex statistical dependencies at different scales with a good approximation, while reducing the parameterization to the minimum. The strengths and weaknesses of the Direct Sampling approach are shown through a series of applications to rainfall and hydrological data: from time-series simulation to spatial rainfall fields conditioned by elevation or a climate scenario. In the era of vast databases, is this data-driven approach a valid alternative to parametric simulation techniques? [1] Mariethoz G., Renard P., and Straubhaar J. (2010), The Direct Sampling method to perform multiple-point geostatistical simulations, Water Resour. Res., 46(11), http://dx.doi.org/10.1029/2008WR007621 [2] Oriani F., Straubhaar J., Renard P., and Mariethoz G. (2014), Simulation of rainfall time series from different climatic regions using the direct sampling technique, Hydrol. Earth Syst. Sci., 18, 3015-3031, http://dx.doi.org/10.5194/hess-18-3015-2014 [3] Oriani F., Borghi A., Straubhaar J., Mariethoz G., Renard P. (2016), Missing data simulation inside flow rate time-series using multiple-point statistics, Environ. Model. Softw., vol. 86, pp. 264 - 276, http://dx.doi.org/10.1016/j.envsoft.2016.10.002
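
    A minimal, hedged sketch of the Direct Sampling idea for a one-dimensional series: each new value is simulated by scanning a training series for a position whose recent pattern resembles the pattern just simulated. The neighborhood size, distance threshold, scan fraction, and synthetic "rainfall" data are illustrative assumptions, not the published algorithm's parameters.

```python
import numpy as np

def direct_sampling_series(train, n_sim, n_neigh=5, threshold=0.1, max_scan=0.3, seed=0):
    """Simulate a 1-D series by copying training values whose preceding n_neigh-value
    pattern is close (mean absolute difference) to the last n_neigh simulated values."""
    rng = np.random.default_rng(seed)
    sim = list(train[:n_neigh])                     # seed the simulation
    positions = np.arange(n_neigh, len(train))
    n_scan = max(1, int(max_scan * len(positions)))
    for _ in range(n_sim - n_neigh):
        pattern = np.array(sim[-n_neigh:])
        best_pos, best_dist = None, np.inf
        for pos in rng.choice(positions, size=n_scan, replace=False):
            dist = np.mean(np.abs(train[pos - n_neigh:pos] - pattern))
            if dist < best_dist:
                best_pos, best_dist = pos, dist
            if dist <= threshold:                   # accept the first good match
                break
        sim.append(train[best_pos])
    return np.array(sim)

# Hypothetical training data: a noisy seasonal "rainfall" signal.
t = np.arange(2000)
train = np.maximum(0, np.sin(2 * np.pi * t / 365)
                   + np.random.default_rng(1).normal(0, 0.3, t.size))
sim = direct_sampling_series(train, n_sim=500)
print(sim[:10])
```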

  16. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

    To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable to detect changes in a structure via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses the shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
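
    A simplified sketch of the damage-sensitive-feature step described above: autocorrelation coefficients are extracted from synthetic acceleration records and projected onto a single direction. Here the leading principal axis of the healthy-state data serves as a crude stand-in for the evolutionary-strategy-optimised projection vector of the paper; all signals and dimensions are invented.

```python
import numpy as np

def autocorr_features(signal, n_lags=20):
    """Autocorrelation function coefficients used as damage-sensitive features."""
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf[1:n_lags + 1] / acf[0]

rng = np.random.default_rng(5)
# Hypothetical acceleration records: a healthy training set and one test record.
healthy = np.array([autocorr_features(rng.normal(0, 1, 4096)) for _ in range(50)])
test = autocorr_features(rng.normal(0, 1.2, 4096))

mu = healthy.mean(axis=0)
# One-dimensional projection: direction of largest healthy-state variance.
w = np.linalg.svd(healthy - mu, full_matrices=False)[2][0]
scores_healthy = (healthy - mu) @ w
score_test = (test - mu) @ w
print("healthy score spread:", scores_healthy.std(), "test score:", score_test)
```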

  17. Analyzing quality of colorectal cancer care through registry statistics: a small community hospital example.

    PubMed

    Hopewood, Ian

    2011-01-01

    As the number of elderly Americans requiring oncologic care grows, and as cancer treatment and medicine become more advanced, assessing quality of cancer care becomes a necessary and advantageous practice for any facility. Such analysis is especially practical in small community hospitals, which may not have the resources of their larger academic counterparts to ensure that the care being provided is current and competitive in terms of both technique and outcome. This study is a comparison of the colorectal cancer care at one such center, Falmouth Community Hospital (FCH)--located in Falmouth, Massachusetts, about an hour and a half away from the nearest metropolitan center--to the care provided at a major nearby Boston Tertiary Center (BTC) and at teaching and research facilities across New England and the United States. The metrics used to measure performance encompass both outcome (survival rate data) and technique, including quality of surgery (number of lymph nodes removed) and the administration of adjuvant treatments, chemotherapy, and radiation therapy, as per national guidelines. All data for comparison between FCH and BTC were culled from those hospitals' tumor registries. Data for the comparison between FCH and national tertiary/referral centers were taken from the American College of Surgeons' Commission on Cancer, namely National Cancer Data Base (NCDB) statistics, Hospital Benchmark Reports and Practice Profile Reports. The results showed that, while patients at FCH were diagnosed at both a higher age and a more advanced stage of colorectal cancer than their BTC counterparts, FCH stands up favorably to BTC and other large centers in terms of the metrics referenced above. Quality assessment such as the analysis conducted here can be used at other community facilities to spotlight, and ultimately eliminate, deficiencies in cancer programs.

  18. Unsupervised nonlinear dimensionality reduction machine learning methods applied to multiparametric MRI in cerebral ischemia: preliminary results

    NASA Astrophysics Data System (ADS)

    Parekh, Vishwa S.; Jacobs, Jeremy R.; Jacobs, Michael A.

    2014-03-01

    The evaluation and treatment of acute cerebral ischemia requires a technique that can determine the total area of tissue at risk for infarction using diagnostic magnetic resonance imaging (MRI) sequences. Typical MRI data sets consist of T1- and T2-weighted imaging (T1WI, T2WI) along with advanced MRI parameters of diffusion-weighted imaging (DWI) and perfusion weighted imaging (PWI) methods. Each of these parameters has distinct radiological-pathological meaning. For example, DWI interrogates the movement of water in the tissue and PWI gives an estimate of the blood flow; both are critical measures during the evolution of stroke. In order to integrate these data and give an estimate of the tissue at risk or damaged, we have developed advanced machine learning methods based on unsupervised non-linear dimensionality reduction (NLDR) techniques. NLDR methods are a class of algorithms that use mathematically defined manifolds for statistical sampling of multidimensional classes to generate a discrimination rule of guaranteed statistical accuracy, and they can generate a two- or three-dimensional map, which represents the prominent structures of the data and provides an embedded image of meaningful low-dimensional structures hidden in their high-dimensional observations. In this manuscript, we develop NLDR methods on high dimensional MRI data sets of preclinical animals and clinical patients with stroke. On analyzing the performance of these methods, we observed a high degree of similarity between the multiparametric embedded images from NLDR methods and the ADC and perfusion maps. It was also observed that the embedded scattergram of abnormal (infarcted or at-risk) tissue can be visualized and provides a mechanism for automatic methods to delineate potential stroke volumes and early tissue at risk.
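
    A hedged sketch of unsupervised NLDR applied to multiparametric voxel vectors, using Isomap from scikit-learn on synthetic two-class data; the parameter values and tissue contrasts are invented, and the study's own choice of NLDR algorithm may differ.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(6)
# Hypothetical multiparametric MRI feature vectors per voxel:
# columns ~ [T1WI, T2WI, ADC from DWI, perfusion], two simulated tissue classes.
normal = rng.normal([1.0, 1.0, 1.0, 1.0], 0.05, size=(500, 4))
ischemic = rng.normal([1.1, 1.4, 0.6, 0.5], 0.05, size=(200, 4))
voxels = np.vstack([normal, ischemic])

# Unsupervised nonlinear dimensionality reduction to a 2-D embedding.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(voxels)
print(embedding.shape)   # (700, 2): an embedded "scattergram" of the voxels
```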

  19. Recent developments of axial flow compressors under transonic flow conditions

    NASA Astrophysics Data System (ADS)

    Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.

    2017-05-01

    The objective of this paper is to give a holistic view of the most advanced technologies and procedures practiced in the field of turbomachinery design. In CFD, the compressor flow solver relies on a turbulence model to solve viscous problems. Popular techniques like Jameson's rotated difference scheme were used to solve the potential flow equation under transonic conditions for two-dimensional aerofoils and later for three-dimensional wings. Gradient-based methods are also popular, especially for compressor blade shape optimization. Other available optimization techniques include evolutionary algorithms (EAs) and response surface methodology (RSM). It is observed that, in order to improve the compressor flow solver and obtain reliable results, careful attention needs to be paid in CFD to viscous relations, grid resolution, turbulence modeling and artificial viscosity. Advanced techniques like Jameson's rotated difference scheme have had the most substantial impact on aerofoil and wing design. For compressor blade shape optimization, evolutionary algorithms are simpler than gradient-based techniques because they can handle the design parameters simultaneously by searching from multiple points in the given design space. Response surface methodology (RSM) is used to build empirical models of observed responses and to study experimental data systematically. This methodology analyses the relationship between expected responses (outputs) and design variables (inputs), treating the function through a series of mathematical and statistical processes. RSM has recently been implemented successfully for turbomachinery blade optimization. Well-designed, high-performance axial flow compressors find application in air-breathing jet engines.

  20. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of great importance; it could prevent large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides, thus avoiding considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means clustering, were applied to the spectra after manipulation. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
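
    The PCA-LDA step reported above can be sketched as follows, assuming the preprocessed absorbance spectra and labels are available as arrays; the data here are random placeholders, so the printed score is illustrative only.

        # PCA followed by LDA for classifying FTIR-ATR spectra, evaluated by
        # cross-validation.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(90, 500))   # (samples, wavenumbers), e.g. the 800-1775 cm-1 region
        labels = np.repeat(["Colletotrichum", "Fusarium", "Verticillium"], 30)

        clf = make_pipeline(PCA(n_components=9), LinearDiscriminantAnalysis())
        scores = cross_val_score(clf, spectra, labels, cv=5)
        print("Cross-validated identification rate: %.1f%%" % (100 * scores.mean()))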

  1. Application of Artificial Neural Networks to the Development of Improved Multi-Sensor Retrievals of Near-Surface Air Temperature and Humidity Over Ocean

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Clayson, Carol Anne

    2012-01-01

    Improved estimates of near-surface air temperature and air humidity are critical to the development of more accurate turbulent surface heat fluxes over the ocean. Recent progress in retrieving these parameters has been made through the application of artificial neural networks (ANN) and the use of multi-sensor passive microwave observations. Details are provided on the development of an improved retrieval algorithm that applies the nonlinear statistical ANN methodology to a set of observations from the Advanced Microwave Scanning Radiometer (AMSR-E) and the Advanced Microwave Sounding Unit (AMSU-A) that are currently available from the NASA AQUA satellite platform. Statistical inversion techniques require an adequate training dataset to properly capture embedded physical relationships. The development of multiple training datasets containing only in-situ observations, only synthetic observations produced using the Community Radiative Transfer Model (CRTM), or a mixture of each is discussed. An intercomparison of results using each training dataset is provided to highlight the relative advantages and disadvantages of each methodology. Particular emphasis will be placed on the development of retrievals in cloudy versus clear-sky conditions. Near-surface air temperature and humidity retrievals using the multi-sensor ANN algorithms are compared to previous linear and non-linear retrieval schemes.
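
    A much-simplified sketch of the retrieval idea follows; the channel count, network size and synthetic relationship are assumptions for illustration, not the AMSR-E/AMSU-A configuration used in the study.

        # Neural-network (ANN) retrieval of near-surface air temperature from
        # multi-sensor brightness temperatures, trained on synthetic data.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        brightness_temps = rng.normal(250.0, 15.0, size=(5000, 8))   # placeholder channels (K)
        air_temperature = 0.3 * brightness_temps.mean(axis=1) + rng.normal(0.0, 1.5, 5000)

        X_train, X_test, y_train, y_test = train_test_split(
            brightness_temps, air_temperature, test_size=0.2, random_state=0)

        ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
        ann.fit(X_train, y_train)
        rmse = np.sqrt(np.mean((ann.predict(X_test) - y_test) ** 2))
        print("Retrieval RMSE on held-out samples: %.2f K" % rmse)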

  2. Advanced signal processing based on support vector regression for lidar applications

    NASA Astrophysics Data System (ADS)

    Gelfusa, M.; Murari, A.; Malizia, A.; Lungaroni, M.; Peluso, E.; Parracino, S.; Talebzadeh, S.; Vega, J.; Gaudio, P.

    2015-10-01

    The LIDAR technique has recently found many applications in atmospheric physics and remote sensing. One of the main issues, in the deployment of systems based on LIDAR, is the filtering of the backscattered signal to alleviate the problems generated by noise. Improvement in the signal to noise ratio is typically achieved by averaging a quite large number (of the order of hundreds) of successive laser pulses. This approach can be effective but presents significant limitations. First of all, it implies a great stress on the laser source, particularly in the case of systems for automatic monitoring of large areas for long periods. Secondly, this solution can become difficult to implement in applications characterised by rapid variations of the atmosphere, for example in the case of pollutant emissions, or by abrupt changes in the noise. In this contribution, a new method for the software filtering and denoising of LIDAR signals is presented. The technique is based on support vector regression. The proposed new method is insensitive to the statistics of the noise and is therefore fully general and quite robust. The developed numerical tool has been systematically compared with the most powerful techniques available, using both synthetic and experimental data. Its performances have been tested for various statistical distributions of the noise and also for other disturbances of the acquired signal such as outliers. The competitive advantages of the proposed method are fully documented. The potential of the proposed approach to widen the capability of the LIDAR technique, particularly in the detection of widespread smoke, is discussed in detail.
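
    As a hedged illustration of SVR-based filtering of a single LIDAR return, the sketch below uses a synthetic backscatter profile and generic SVR hyperparameters; neither reflects the authors' actual signals or tuning.

        # Support vector regression used as a software denoising filter for a
        # noisy 1D LIDAR return.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(3)
        range_km = np.linspace(0.1, 5.0, 800)                # assumed range gates (km)
        clean = np.exp(-range_km / 2.0)                      # idealised backscatter decay
        noisy = clean + rng.normal(0.0, 0.02, clean.size)    # additive noise

        svr = SVR(kernel="rbf", C=10.0, epsilon=0.01, gamma=5.0)
        svr.fit(range_km.reshape(-1, 1), noisy)
        denoised = svr.predict(range_km.reshape(-1, 1))
        print("Residual RMS after SVR filtering: %.4f" % np.std(denoised - clean))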

  3. Understanding disparities among diagnostic technologies in glaucoma.

    PubMed

    De Moraes, Carlos Gustavo V; Liebmann, Jeffrey M; Ritch, Robert; Hood, Donald C

    2012-07-01

    To investigate causes of disagreement among 3 glaucoma diagnostic techniques: standard automated achromatic perimetry (SAP), the multifocal visual evoked potential technique (mfVEP), and optical coherence tomography (OCT). In a prospective cross-sectional study, 138 eyes of 69 patients with glaucomatous optic neuropathy were tested using SAP, the mfVEP, and OCT. Eyes with the worse and better mean deviations (MDs) were analyzed separately. If the results of 2 tests were consistent for the presence of an abnormality in the same topographic site, that abnormality was considered a true glaucoma defect. If a third test missed that abnormality (false-negative result), the reasons for the disparity were investigated. Eyes with worse MD (mean [SD], -6.8 [8.0] dB) had better agreement among tests than did eyes with better MD (-2.5 [3.5] dB, P<.01). For the 94 of 138 hemifields with abnormalities of the more advanced eyes, the 3 tests were consistent in showing the same hemifield abnormality in 50 hemifields (53%), and at least 2 tests were abnormal in 65 of the 94 hemifields (69%). The potential explanations for the false-negative results fell into 2 general categories: inherent limitations of each technique in detecting distinct features of glaucoma, and individual variability in the distribution of normative values used to define statistically significant abnormalities. All the cases of disparity could be explained by known limitations of each technique and interindividual variability, suggesting that the agreement among diagnostic tests may be better than summary statistics suggest and that disagreements between tests do not indicate discordance in the structure-function relationship.

  4. Increasing Complexity of Clinical Research in Gastroenterology: Implications for Training Clinician-Scientists

    PubMed Central

    Scott, Frank I.; McConnell, Ryan A.; Lewis, Matthew E.; Lewis, James D.

    2014-01-01

    Background: Significant advances have been made in clinical and epidemiologic research methods over the past 30 years. We sought to demonstrate the impact of these advances on published research in gastroenterology from 1980 to 2010. Methods: Three journals (Gastroenterology, Gut, and American Journal of Gastroenterology) were selected for evaluation given their continuous publication during the study period. Twenty original clinical articles were randomly selected from each journal from 1980, 1990, 2000, and 2010. Each article was assessed for topic studied, whether the outcome was clinical or physiologic, study design, sample size, number of authors and centers collaborating, and reporting of statistical methods such as sample size calculations, p-values, confidence intervals, and advanced techniques such as bioinformatics or multivariate modeling. Research support with external funding was also recorded. Results: A total of 240 articles were included in the study. From 1980 to 2010, there was a significant increase in analytic studies (p<0.001), clinical outcomes (p=0.003), median number of authors per article (p<0.001), multicenter collaboration (p<0.001), sample size (p<0.001), and external funding (p<0.001). There was significantly increased reporting of p-values (p=0.01), confidence intervals (p<0.001), and power calculations (p<0.001). There was also increased utilization of large multicenter databases (p=0.001), multivariate analyses (p<0.001), and bioinformatics techniques (p=0.001). Conclusions: There has been a dramatic increase in the complexity of clinical research related to gastroenterology and hepatology over the last three decades. This increase highlights the need for advanced training of clinical investigators to conduct future research. PMID:22475957

  5. Micro-magnetic Structures for Biological Applications

    NASA Astrophysics Data System (ADS)

    Howdyshell, Marci L.

    Developments in single-molecule and single-cell experiments over the past century have provided researchers with many tools to probe the responses of cells to stresses such as physical force or to the injection of foreign genes. Often these techniques target the cell membrane, although many are now advancing to probe within the cell. As these techniques are improved upon and the investigations advance toward clinical applications, it has become more critical to achieve high-throughput outcomes which in turn lead to statistically significant results. The technologies developed in this thesis are targeted at transfecting large populations of cells with controlled doses of specific exogenic material without adversely affecting cell viability. Underlying this effort is a platform of lithographically patterned ferromagnetic thin films capable of remotely manipulating and localizing magnetic microbeads attached to biological entities. A novel feature of this approach, as demonstrated here with both DNA and cells, is the opportunity for multiplexed operations on targeted biological specimens. This thesis includes two main thrusts: (1) the advancement of the trapping platforms through experimental verification of mathematical models providing the energy landscapes associated with the traps and (2) implementation of the platform as a basis for rapid and effective high-throughput microchannel and nanochannel cell electroporation devices. The electroporation devices have, in our studies, not only been demonstrated to sustain cell viability with extremely low cell mortality rates, but are also found to be effective for various types of cells. The advances over current electroporation technologies that are achieved in these efforts demonstrate the potential for detection of mRNA expression in heterogeneous cell populations and probing intracellular responses to the introduction of foreign genes into cells.

  6. A Descriptive Analysis of the Educational Perceptions, Professional Identity, and Professional Practices of Dual-Trained Music Therapists as Counselors.

    PubMed

    Sevcik, Emily E; Jones, Jennifer D; Myers, Charles E

    2017-11-01

    Given the rise in music therapy master's programs that offer dual degrees in music therapy and counseling or programs that satisfy state mental health counseling licensure laws, the professional counseling field is playing an increased role in the advanced education and professional practices of music therapists. To identify factors that lead music therapists to pursue advanced education with an emphasis in professional counseling, perceptions about benefits and drawbacks for three advanced degree options (i.e., music therapy, counseling, and music therapy/counseling dual degree), and describe the professional practices and identity of dual-trained music therapists as counselors. A convenience sample of music therapists (n = 123) who held board certification, and held a master's degree or higher that emphasized professional counseling, completed an online survey. We used descriptive statistics to analyze categorical and numeric survey data. Eligibility for licensure as a professional counselor was the most important decisional factor in selecting a specific master's degree program. Respondents also reported favorable perceptions of the dual degree in music therapy and counseling. With regard to professional practice and identity, respondents reported high use of verbal processing techniques alongside music therapy interventions, and dual-trained music therapists retained their professional identity as a music therapist. The reported view of licensure in a related field as beneficial and frequent use of verbal processing techniques warrants future study into the role of counseling in the advanced training of music therapists. Given contradictory findings across studies, we recommend investigators also explore how a degree in a related field affects career longevity of music therapists.

  7. The role of alternative (advanced) conscious sedation techniques in dentistry for adult patients: a series of cases.

    PubMed

    Robb, N

    2014-03-01

    The basic techniques of conscious sedation have been found to be safe and effective for the management of anxiety in adult dental patients requiring sedation to allow them to undergo dental treatment. There remains great debate within the profession as to the role of the so-called advanced sedation techniques. This paper presents a series of nine patients who were managed with advanced sedation techniques where the basic techniques were either inappropriate or had previously failed to provide adequate relief of anxiety. In these cases, had there not been the availability of advanced sedation techniques, the most likely recourse would have been general anaesthesia, a treatment modality that current guidance indicates should not be used where there is an appropriate alternative. The sedation techniques used have provided that appropriate alternative management strategy.

  8. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of this type of study has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering Hardy-Weinberg testing, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
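
    The core pooling step that such a workflow automates can be sketched as follows; the 2x2 tables are invented for illustration, and the fixed-effect inverse-variance model shown is only one of the models a GAS meta-analysis would consider.

        # Inverse-variance (fixed-effect) pooling of per-study log odds ratios.
        import numpy as np

        # Hypothetical counts per study:
        # [cases_exposed, cases_unexposed, controls_exposed, controls_unexposed]
        studies = np.array([
            [60, 40, 50, 50],
            [35, 30, 25, 40],
            [80, 60, 70, 90],
        ], dtype=float)

        log_or = np.log((studies[:, 0] * studies[:, 3]) / (studies[:, 1] * studies[:, 2]))
        var = (1.0 / studies).sum(axis=1)        # variance of each log odds ratio
        weights = 1.0 / var
        pooled = (weights * log_or).sum() / weights.sum()
        se = np.sqrt(1.0 / weights.sum())
        print("Pooled OR %.2f (95%% CI %.2f-%.2f)"
              % (np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)))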

  9. Informatics for Metabolomics.

    PubMed

    Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote

    2016-01-01

    Metabolome profiling of biological systems has the powerful ability to provide biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large amounts of metabolomics data have thus accumulated since the pre-metabolomics era, driven directly by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases advanced for the metabolomics community provide useful metabolomics information, e.g., chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we aim to introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, starting from preprocessing toward functional interpretation. The framework of informatics techniques provided can be further used as a scaffold for translational biomedical research, which can thus lead to new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.

  10. Preliminary compressor design study for an advanced multistage axial flow compressor

    NASA Technical Reports Server (NTRS)

    Marman, H. V.; Marchant, R. D.

    1976-01-01

    An optimum, axial flow, high pressure ratio compressor for a turbofan engine was defined for commercial subsonic transport service starting in the late 1980's. Projected 1985 technologies were used and applied to compressors with an 18:1 pressure ratio having 6 to 12 stages. A matrix of 49 compressors was developed by statistical techniques. The compressors were evaluated by means of computer programs in terms of various airline economic figures of merit such as return on investment and direct-operating cost. The optimum configuration was determined to be a high speed, 8-stage compressor with an average blading aspect ratio of 1.15.

  11. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extract extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fitted to the observed sample. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour extreme precipitation event (or vice versa) can be twice as great as what would be expected when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
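
    The marginal peaks-over-threshold step can be sketched as below; the rainfall series, threshold choice and return period are placeholders assumed for illustration.

        # Extract a partial duration series above a high threshold and fit a
        # Generalized Pareto distribution to the excesses.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        rainfall = rng.exponential(scale=2.0, size=20_000)     # synthetic hourly depths (mm)

        threshold = np.quantile(rainfall, 0.99)                # peaks-over-threshold level
        excesses = rainfall[rainfall > threshold] - threshold

        shape, _, scale = stats.genpareto.fit(excesses, floc=0.0)
        # Depth exceeded on average once per 100 threshold exceedances:
        level = threshold + stats.genpareto.ppf(1 - 1 / 100, shape, loc=0.0, scale=scale)
        print("GPD shape=%.3f scale=%.3f, 1-in-100-exceedance level=%.1f mm"
              % (shape, scale, level))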

  12. Bioinformatics tools in predictive ecology: applications to fisheries

    PubMed Central

    Tucker, Allan; Duplisea, Daniel

    2012-01-01

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their ‘crossover potential’ with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse. PMID:22144390

  13. Mechanical characterization of TiO{sub 2} nanofibers produced by different electrospinning techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vahtrus, Mikk; Šutka, Andris; Institute of Silicate Materials, Riga Technical University, P. Valdena 3/7, Riga LV-1048

    2015-02-15

    In this work TiO{sub 2} nanofibers produced by needle and needleless electrospinning processes from the same precursor were characterized and compared using Raman spectroscopy, transmission electron microscopy (TEM), scanning electron microscopy (SEM) and in situ SEM nanomechanical testing. Phase composition, morphology, Young's modulus and bending strength values were found. Weibull statistics was used to evaluate and compare uniformity of mechanical properties of nanofibers produced by two different methods. It is shown that both methods yield nanofibers with very similar properties. - Highlights: • TiO{sub 2} nanofibers were produced by needle and needleless electrospinning processes. • Structure was studied by Raman spectroscopy and electron microscopy methods. • Mechanical properties were measured using advanced in situ SEM cantilevered beam bending technique. • Both methods yield nanofibers with very similar properties.
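
    A sketch of the Weibull comparison follows; the strength values and sample sizes are synthetic placeholders, not the measured data.

        # Fit two-parameter Weibull distributions to bending strengths of two
        # hypothetical fiber batches and compare their Weibull moduli.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        needle = rng.weibull(5.0, 30) * 300.0        # MPa, placeholder needle-spun batch
        needleless = rng.weibull(4.5, 30) * 310.0    # MPa, placeholder needleless batch

        for name, strengths in (("needle", needle), ("needleless", needleless)):
            # floc=0 gives the conventional two-parameter Weibull fit.
            modulus, _, scale = stats.weibull_min.fit(strengths, floc=0.0)
            print("%-10s Weibull modulus m=%.1f, characteristic strength=%.0f MPa"
                  % (name, modulus, scale))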

  14. Bioinformatics tools in predictive ecology: applications to fisheries.

    PubMed

    Tucker, Allan; Duplisea, Daniel

    2012-01-19

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their 'crossover potential' with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse.

  15. Advances in segmentation modeling for health communication and social marketing campaigns.

    PubMed

    Albrecht, T L; Bryant, C

    1996-01-01

    Large-scale communication campaigns for health promotion and disease prevention involve analysis of audience demographic and psychographic factors for effective message targeting. A variety of segmentation modeling techniques, including tree-based methods such as Chi-squared Automatic Interaction Detection and logistic regression, are used to identify meaningful target groups within a large sample or population (N = 750-1,000+). Such groups are based on statistically significant combinations of factors (e.g., gender, marital status, and personality predispositions). The identification of groups or clusters facilitates message design in order to address the particular needs, attention patterns, and concerns of audience members within each group. We review current segmentation techniques, their contributions to conceptual development, and cost-effective decision making. Examples from a major study in which these strategies were used are provided from the Texas Women, Infants and Children Program's Comprehensive Social Marketing Program.

  16. Visualizing water

    NASA Astrophysics Data System (ADS)

    Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.

    2016-12-01

    A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and the arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, the focus is on making water look as realistic as possible. This focus on realistic perception (versus the focus on physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the design of emerging types of water visualizations. The examples that we analyze range from dynamically animated forecasts, interactive paintings, infographics and modern cartography to web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from inter-disciplinary collaborations.

  17. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  18. NAUSEA and the Principle of Supplementarity of Damping and Isolation in Noise Control.

    DTIC Science & Technology

    1980-02-01

    New approaches and uses of the statistical energy analysis (NAUSEA) have been considered and developed in recent months. The advances were made possible in that the requirement, in the old statistical energy analysis, that the dynamic systems be highly reverberant and the couplings between the...analytical consideration in terms of the statistical energy analysis (SEA). A brief discussion and simple examples that relate to these recent advances

  19. Self-calibration of photometric redshift scatter in weak-lensing surveys

    DOE PAGES

    Zhang, Pengjie; Pen, Ue -Li; Bernstein, Gary

    2010-06-11

    Photo-z errors, especially catastrophic errors, are a major uncertainty for precision weak lensing cosmology. We find that the shear-(galaxy number) density and density-density cross correlation measurements between photo-z bins, available from the same lensing surveys, contain valuable information for self-calibration of the scattering probabilities between the true-z and photo-z bins. The self-calibration technique we propose does not rely on cosmological priors nor parameterization of the photo-z probability distribution function, and preserves all of the cosmological information available from shear-shear measurement. We estimate the calibration accuracy through the Fisher matrix formalism. We find that, for advanced lensing surveys such as the planned stage IV surveys, the rate of photo-z outliers can be determined with statistical uncertainties of 0.01-1% for z < 2 galaxies. Among the several sources of calibration error that we identify and investigate, the galaxy distribution bias is likely the most dominant systematic error, whereby photo-z outliers have different redshift distributions and/or bias than non-outliers from the same bin. This bias affects all photo-z calibration techniques based on correlation measurements. As a result, galaxy bias variations of O(0.1) produce biases in photo-z outlier rates similar to the statistical errors of our method, so this galaxy distribution bias may bias the reconstructed scatters at several-σ level, but is unlikely to completely invalidate the self-calibration technique.

  20. Scalable Dry Printing Manufacturing to Enable Long-Life and High Energy Lithium-Ion Batteries

    DOE PAGES

    Liu, Jin; Ludwig, Brandon; Liu, Yangtao; ...

    2017-08-22

    The slurry casting method dominates electrode manufacture for lithium-ion batteries. The procedure is similar to newspaper printing: the cast materials are premixed homogeneously into solvents, and the slurry mixture is continuously transferred onto the current collector and dried. With the market approaching US $80 billion by 2024, optimization of the manufacturing process is crucial and attractive. However, the organic solvent remains irreplaceable in the wet method for making slurries, even though it is capital-intensive and toxic. In this paper, an advanced powder printing technique is demonstrated that is completely solvent-free and dry. By removing the solvent and related procedures, this method is estimated to save 20% of the cost at a remarkably shortened production cycle (from hours to minutes). The dry printed electrodes outperform commercial slurry cast ones over 650 cycles (80% capacity retention in 500 cycles), and thick electrodes are successfully fabricated to increase the energy density. Furthermore, microscopy techniques are utilized to characterize the difference in electrode microstructure between the dry and wet methods and to distinguish dry printing's advantages in controlling the microstructure. Finally, this study demonstrates a practical fabrication method for lithium-ion electrodes with lowered cost and favorable performance, and potentially allows more advanced electrode designs.

  1. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs

    PubMed Central

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2015-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012. PMID:26347393
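
    The ROC-based comparison the authors propose can be sketched as follows; the detection scores here are synthetic stand-ins, not outputs of LOCI or a Hotelling observer.

        # Compare two detection statistics on the same labelled data set using
        # ROC curves and their areas under the curve.
        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(6)
        truth = np.concatenate([np.ones(200), np.zeros(2000)])           # planet present / absent
        detector_a = np.concatenate([rng.normal(1.0, 1, 200), rng.normal(0, 1, 2000)])
        detector_b = np.concatenate([rng.normal(1.5, 1, 200), rng.normal(0, 1, 2000)])

        for name, score in (("detector A", detector_a), ("detector B", detector_b)):
            fpr, tpr, _ = roc_curve(truth, score)
            tpr_at_1pct = tpr[np.searchsorted(fpr, 0.01)]
            print("%-10s AUC=%.3f, TPR at 1%% FPR=%.2f"
                  % (name, roc_auc_score(truth, score), tpr_at_1pct))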

  2. Scalable Dry Printing Manufacturing to Enable Long-Life and High Energy Lithium-Ion Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jin; Ludwig, Brandon; Liu, Yangtao

    The slurry casting method dominates electrode manufacture for lithium-ion batteries. The procedure is similar to newspaper printing: the cast materials are premixed homogeneously into solvents, and the slurry mixture is continuously transferred onto the current collector and dried. With the market approaching US $80 billion by 2024, optimization of the manufacturing process is crucial and attractive. However, the organic solvent remains irreplaceable in the wet method for making slurries, even though it is capital-intensive and toxic. In this paper, an advanced powder printing technique is demonstrated that is completely solvent-free and dry. By removing the solvent and related procedures, this method is estimated to save 20% of the cost at a remarkably shortened production cycle (from hours to minutes). The dry printed electrodes outperform commercial slurry cast ones over 650 cycles (80% capacity retention in 500 cycles), and thick electrodes are successfully fabricated to increase the energy density. Furthermore, microscopy techniques are utilized to characterize the difference in electrode microstructure between the dry and wet methods and to distinguish dry printing's advantages in controlling the microstructure. Finally, this study demonstrates a practical fabrication method for lithium-ion electrodes with lowered cost and favorable performance, and potentially allows more advanced electrode designs.

  3. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome; hide

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  4. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs.

    PubMed

    Lawson, Peter R; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  5. On advanced estimation techniques for exoplanet detection and characterization using ground-based coronagraphs

    NASA Astrophysics Data System (ADS)

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  6. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467

  7. Assessment of Remote Sensing Technologies for Location of Hydrogen and Helium Leaks

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Sohn, Yongho; Mathur, Varun; Reardon, Peter

    2001-01-01

    In Phase 1 of this project, a hierarchy of techniques for H2 and He leak location was developed. A total of twelve specific remote sensing techniques were evaluated; the results are summarized. A basic diffusion model was also developed to predict the concentration and distribution of H2 or He resulting from a leak. The objectives of Phase 2 of the project consisted of the following four tasks: Advance Rayleigh Doppler technique from TRL 1 to TRL 2; Plan to advance Rayleigh Doppler technique from TRL 2 to TRL 3; Advance researchers and resources for further advancement; Extend diffusion model.

  8. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.

  9. Validating LES for Jet Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2011-01-01

    Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and the interpretation of the massive datasets that result in having dreams come true. This paper primarily addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. It also addresses the latter problem in discussing what are relevant measures critical for aeroacoustics that should be used in validating LES codes. These new diagnostic techniques deliver measurements and flow statistics of increasing sophistication and capability, but what of their accuracy? And what are the measures to be used in validation? This paper argues that the issue of accuracy be addressed by cross-facility and cross-disciplinary examination of modern datasets along with increased reporting of internal quality checks in PIV analysis. Further, it is argued that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound.

  10. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space. While this approach helps address conceptual and parametric uncertainties, the ensemble datasets produced by this technique present a special challenge to visualization researchers as the ensemble dataset records a distribution of possible values for each location in the domain. Contemporary visualization approaches that rely solely on summary statistics (e.g., mean and variance) cannot convey the detailed information encoded in ensemble distributions that are paramount to ensemble analysis; summary statistics provide no information about modality classification and modality persistence. To address this problem, we propose a novel technique that classifies high-variance locations based on the modality of the distribution of ensemble predictions. Additionally, we develop a set of confidence metrics to inform the end-user of the quality of fit between the distribution at a given location and its assigned class. We apply a similar method to time-varying ensembles to illustrate the relationship between peak variance and bimodal or multimodal behavior. These classification schemes enable a deeper understanding of the behavior of the ensemble members by distinguishing between distributions that can be described by a single tendency and distributions which reflect divergent trends in the ensemble.
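
    One simple way to classify modality at a grid location, offered here as an assumption-laden stand-in rather than the paper's actual classifier, is to compare Gaussian mixtures with different component counts by BIC:

        # Estimate the modality of an ensemble's distribution at one location by
        # selecting the Gaussian-mixture component count with the lowest BIC.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        def classify_modality(ensemble_values, max_components=3):
            x = np.asarray(ensemble_values, dtype=float).reshape(-1, 1)
            bics = [GaussianMixture(n_components=k, random_state=0).fit(x).bic(x)
                    for k in range(1, max_components + 1)]
            return int(np.argmin(bics)) + 1          # estimated number of modes

        rng = np.random.default_rng(7)
        bimodal = np.concatenate([rng.normal(-2, 0.5, 40), rng.normal(2, 0.5, 40)])
        print("Estimated modality:", classify_modality(bimodal))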

  11. The study of brain activity during the observation of commercial advertising by using high resolution EEG techniques.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Salinari, Serenella; Cincotti, Febo; Aloise, Fabio; Mattia, Donatella; Marciani, Maria Grazia; Bianchi, Luigi; Soranzo, Ramon; Babiloni, Fabio

    2009-01-01

    In this paper we illustrate the capability of tracking brain activity during the observation of commercial TV spots by using advanced high resolution EEG statistical techniques in the time and frequency domains. In particular, we analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech by the prime minister of Italy was analyzed in two groups of swing and "supporter" voters. Results suggested that the cortical activity during the observation of commercial spots could vary considerably across the spot. This fact suggests the possibility of removing the parts of the spot that are not particularly attractive by using these cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporter group than in the swing voters. In this case, it is possible to conclude that the communication proposed failed to raise attention or interest among swing voters. In conclusion, high resolution EEG has proved able to generate useful insights about the particular fruition of TV messages, related to both commercial and political fields.
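
    A minimal sketch of the frequency-domain step, assuming a single-channel EEG trace and conventional band limits (the paper's montage and bands are not reproduced here):

        # Estimate spectral power in standard EEG frequency bands with Welch's method.
        import numpy as np
        from scipy.signal import welch

        fs = 256.0                                   # assumed sampling rate (Hz)
        rng = np.random.default_rng(8)
        eeg = rng.normal(size=int(60 * fs))          # one minute of synthetic EEG

        freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
        bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
        for name, (lo, hi) in bands.items():
            mask = (freqs >= lo) & (freqs < hi)
            print("%-5s band power: %.3e" % (name, np.trapz(psd[mask], freqs[mask])))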

  12. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications. B) Quantify the impact of these methods on software reliability. C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level. D) Quantify and justify the reliability estimate for systems developed using various methods.

  13. All-Atom Four-Body Knowledge-Based Statistical Potentials to Distinguish Native Protein Structures from Nonnative Folds

    PubMed Central

    2017-01-01

    Recent advances in understanding protein folding have benefitted from coarse-grained representations of protein structures. Empirical energy functions derived from these techniques occasionally succeed in distinguishing native structures from their corresponding ensembles of nonnative folds or decoys which display varying degrees of structural dissimilarity to the native proteins. Here we utilized atomic coordinates of single protein chains, comprising a large diverse training set, to develop and evaluate twelve all-atom four-body statistical potentials obtained by exploring alternative values for a pair of inherent parameters. Delaunay tessellation was performed on the atomic coordinates of each protein to objectively identify all quadruplets of interacting atoms, and atomic potentials were generated via statistical analysis of the data and implementation of the inverted Boltzmann principle. Our potentials were evaluated using benchmarking datasets from Decoys-‘R'-Us, and comparisons were made with twelve other physics- and knowledge-based potentials. Ranking 3rd, our best potential tied CHARMM19 and surpassed AMBER force field potentials. We illustrate how a generalized version of our potential can be used to empirically calculate binding energies for target-ligand complexes, using HIV-1 protease-inhibitor complexes for a practical application. The combined results suggest an accurate and efficient atomic four-body statistical potential for protein structure prediction and assessment. PMID:29119109
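
    A toy sketch of the tessellation-and-inversion idea follows; the coordinates, atom typing, and the shuffled-label reference state are illustrative assumptions, not the parameterization evaluated in the paper.

        # Enumerate atom quadruplets by Delaunay tessellation and derive a crude
        # knowledge-based score via the inverted Boltzmann principle.
        import numpy as np
        from collections import Counter
        from scipy.spatial import Delaunay

        rng = np.random.default_rng(9)
        coords = rng.uniform(0, 30, size=(500, 3))              # placeholder atomic coordinates
        elements = rng.choice(["C", "N", "O", "S"], size=500)   # placeholder atom types

        quads = Delaunay(coords).simplices                      # (n_tetrahedra, 4) vertex indices
        observed = Counter(tuple(sorted(elements[q])) for q in quads)

        # Reference counts from shuffling atom types over the same tessellation.
        shuffled = rng.permutation(elements)
        reference = Counter(tuple(sorted(shuffled[q])) for q in quads)

        # Inverted Boltzmann: score ~ -log(observed / reference) per quadruplet type.
        potential = {k: -np.log(observed[k] / reference[k])
                     for k in observed if reference.get(k, 0) > 0}
        print(sorted(potential.items(), key=lambda kv: kv[1])[:3])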

  14. ADAPTATION OF THE ADVANCED STATISTICAL TRAJECTORY REGIONAL AIR POLLUTION (ASTRAP) MODEL TO THE EPA VAX COMPUTER - MODIFICATIONS AND TESTING

    EPA Science Inventory

    The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorous...

  15. Advanced Categorical Statistics: Issues and Applications in Communication Research.

    ERIC Educational Resources Information Center

    Denham, Bryan E.

    2002-01-01

    Discusses not only the procedures, assumptions, and applications of advanced categorical statistics, but also covers some common misapplications, from which a great deal can be learned. Addresses the use and limitations of cross-tabulation and chi-square analysis, as well as issues such as observation independence and artificial inflation of a…

  16. Three-dimensional mapping of soil chemical characteristics at micrometric scale: Statistical prediction by combining 2D SEM-EDX data and 3D X-ray computed micro-tomographic images

    NASA Astrophysics Data System (ADS)

    Hapca, Simona

    2015-04-01

    Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that traditionally have been developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, existing methods for the distribution of chemicals in soil, based on scanning electron microscopy (SEM) and energy dispersive X-ray detection (EDX), allow characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking; sequential sectioning of soils, followed by 2D mapping of chemical elements and interpolation to 3D, is an alternative explored in this study. Specifically, we develop an integrated experimental and theoretical framework that combines the 3D X-ray CT imaging technique with 2D SEM-EDX and uses spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: (1) scanning a resin-impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX; (2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube; and (3) development of spatial statistics methods to predict the chemical composition of the 3D soil based on the observed 2D chemical and 3D physical data. Specifically, three statistical models, consisting of a regression tree, a regression-tree kriging model, and a cokriging model, were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Due to the spatial correlation between the physical and chemical data, the regression-tree model showed great potential in predicting chemical composition, in particular for iron, which is generally sparsely distributed in soil. For carbon, silicon and oxygen, which are more densely distributed, the additional kriging of the regression-tree residuals improved the prediction significantly, whereas prediction based on cokriging was less consistent across replicates, underperforming regression-tree kriging. The present study shows great potential in integrating geostatistical methods with imaging techniques to unveil the 3D chemical structure of soil at very fine scales, the framework being suitable for further application to other types of imaging data, such as images of biological thin sections for characterization of microbial distribution. Key words: X-ray CT, SEM-EDX, segmentation techniques, spatial correlation, 3D soil images, 2D chemical maps.
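
    A minimal sketch of the regression-tree kriging idea, assuming voxel coordinates, CT grayscale values and SEM-EDX measurements are available as arrays; a Gaussian-process regressor is used here as a simple stand-in for ordinary kriging of the residuals, and all values are synthetic:

        # Regression-tree trend from CT grayscale plus spatial interpolation of
        # the tree residuals, combined to predict chemistry at unobserved voxels.
        import numpy as np
        from sklearn.tree import DecisionTreeRegressor
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(10)
        xyz = rng.uniform(0, 1, size=(400, 3))            # voxel coordinates on the 2D slices
        grayscale = rng.uniform(0, 255, size=(400, 1))    # X-ray CT intensities at those voxels
        carbon = 0.1 * grayscale.ravel() + 5 * np.sin(4 * xyz[:, 2]) + rng.normal(0, 1, 400)

        tree = DecisionTreeRegressor(max_depth=5).fit(grayscale, carbon)
        residuals = carbon - tree.predict(grayscale)
        gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1.0)).fit(xyz, residuals)

        # Prediction at new voxels = tree trend + spatially interpolated residual.
        new_xyz = rng.uniform(0, 1, size=(5, 3))
        new_gray = rng.uniform(0, 255, size=(5, 1))
        print(tree.predict(new_gray) + gp.predict(new_xyz))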

  17. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the outset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability. (4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.

  18. Advanced Neuroimaging in Traumatic Brain Injury

    PubMed Central

    Edlow, Brian L.; Wu, Ona

    2013-01-01

    Advances in structural and functional neuroimaging have occurred at a rapid pace over the past two decades. Novel techniques for measuring cerebral blood flow, metabolism, white matter connectivity, and neural network activation have great potential to improve the accuracy of diagnosis and prognosis for patients with traumatic brain injury (TBI), while also providing biomarkers to guide the development of new therapies. Several of these advanced imaging modalities are currently being implemented into clinical practice, whereas others require further development and validation. Ultimately, for advanced neuroimaging techniques to reach their full potential and improve clinical care for the many civilians and military personnel affected by TBI, it is critical for clinicians to understand the applications and methodological limitations of each technique. In this review, we examine recent advances in structural and functional neuroimaging and the potential applications of these techniques to the clinical care of patients with TBI. We also discuss pitfalls and confounders that should be considered when interpreting data from each technique. Finally, given the vast amounts of advanced imaging data that will soon be available to clinicians, we discuss strategies for optimizing data integration, visualization and interpretation. PMID:23361483

  19. Comparative Evaluation of Two Venous Sampling Techniques for the Assessment of Pancreatic Insulin and Zinc Release upon Glucose Challenge.

    PubMed

    Pillai, Anil Kumar; Silvers, William; Christensen, Preston; Riegel, Matthew; Adams-Huet, Beverley; Lingvay, Ildiko; Sun, Xiankai; Öz, Orhan K

    2015-01-01

    Advances in noninvasive imaging modalities have provided opportunities to study β cell function through imaging zinc release from insulin secreting β cells. Understanding the temporal secretory pattern of insulin and zinc corelease after a glucose challenge is essential for proper timing of administration of zinc sensing probes. Portal venous sampling is an essential part of pharmacological and nutritional studies in animal models. The purpose of this study was to compare two different percutaneous image-guided techniques: transhepatic ultrasound guided portal vein access and transsplenic fluoroscopy guided splenic vein access for ease of access, safety, and evaluation of temporal kinetics of insulin and zinc release into the venous effluent from the pancreas. Both techniques were safe, reproducible, and easy to perform. The mean time required to obtain desired catheter position for venous sampling was 15 minutes shorter using the transsplenic technique. A clear biphasic insulin release profile was observed in both techniques. Statistically higher insulin concentration but similar zinc release after a glucose challenge was observed from splenic vein samples, as compared to the ones from the portal vein. To our knowledge, this is the first report of percutaneous methods to assess zinc release kinetics from the porcine pancreas.

  20. Comparative Evaluation of Two Venous Sampling Techniques for the Assessment of Pancreatic Insulin and Zinc Release upon Glucose Challenge

    PubMed Central

    Pillai, Anil Kumar; Silvers, William; Christensen, Preston; Riegel, Matthew; Adams-Huet, Beverley; Lingvay, Ildiko; Sun, Xiankai; Öz, Orhan K.

    2015-01-01

    Advances in noninvasive imaging modalities have provided opportunities to study β cell function through imaging zinc release from insulin secreting β cells. Understanding the temporal secretory pattern of insulin and zinc corelease after a glucose challenge is essential for proper timing of administration of zinc sensing probes. Portal venous sampling is an essential part of pharmacological and nutritional studies in animal models. The purpose of this study was to compare two different percutaneous image-guided techniques: transhepatic ultrasound guided portal vein access and transsplenic fluoroscopy guided splenic vein access for ease of access, safety, and evaluation of temporal kinetics of insulin and zinc release into the venous effluent from the pancreas. Both techniques were safe, reproducible, and easy to perform. The mean time required to obtain desired catheter position for venous sampling was 15 minutes shorter using the transsplenic technique. A clear biphasic insulin release profile was observed in both techniques. Statistically higher insulin concentration but similar zinc release after a glucose challenge was observed from splenic vein samples, as compared to the ones from the portal vein. To our knowledge, this is the first report of percutaneous methods to assess zinc release kinetics from the porcine pancreas. PMID:26273676

  1. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike train (MST) data, defines their functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish common input from a direct functional connection between two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, the visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections, such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying functional connectivity of multiple spike trains. This method can accurately identify all the direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
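
    A minimal sketch of the pairwise CCF computation underlying the method: two synthetic spike trains are binned, the cross-correlation function is evaluated over a window of lags, and the peak lag is reported. The surrogate-based significance test, the connection classification and the grid visualisation of the ACG are omitted, and every number and name below is an illustrative assumption.

    ```python
    # Bin two spike trains and compute the pairwise cross-correlation function.
    import numpy as np

    rng = np.random.default_rng(1)
    duration, bin_size = 10.0, 0.001                       # seconds
    t_a = np.sort(rng.uniform(0, duration, 200))
    t_b = np.clip(t_a + 0.005 + rng.normal(0, 0.001, t_a.size), 0, duration)  # B fires ~5 ms after A

    edges = np.arange(0, duration + bin_size, bin_size)
    a, _ = np.histogram(t_a, edges)
    b, _ = np.histogram(t_b, edges)
    a0, b0 = a - a.mean(), b - b.mean()

    def ccf_at(k):
        """Correlate a0(t) with b0(t + k) for a single lag k (in bins)."""
        if k >= 0:
            return np.dot(a0[:a0.size - k], b0[k:])
        return np.dot(a0[-k:], b0[:b0.size + k])

    lags = np.arange(-50, 51)                              # +/- 50 ms window
    ccf = np.array([ccf_at(k) for k in lags]) / (a0.std() * b0.std() * a0.size)

    peak_lag = lags[np.argmax(ccf)]
    print(f"CCF peak at {peak_lag} bins ({peak_lag * bin_size * 1e3:.0f} ms), value {ccf.max():.3f}")
    ```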

  2. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow the extraction of information from the large volume of Sun-Earth observational data that can be assimilated into first-principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from the atmospheric and oceanic sciences that have proved useful for space physics applications, such as a computationally efficient implementation of the Kalman filter in radiation belt modeling, solar wind gap-filling by Singular Spectrum Analysis, and a low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will also be demonstrated.
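
    As a toy illustration of the assimilation idea mentioned above, the sketch below runs a scalar Kalman filter with a random-walk model over a series containing a gap; it is not the radiation-belt or LFM implementation, and every number in it is an assumption chosen for illustration.

    ```python
    # Scalar Kalman filter: update where observations exist, propagate across gaps.
    import numpy as np

    rng = np.random.default_rng(2)
    n, q, r = 100, 0.05, 0.5                 # steps, process variance, observation variance
    truth = np.cumsum(rng.normal(0, np.sqrt(q), n))
    obs = truth + rng.normal(0, np.sqrt(r), n)
    obs[40:60] = np.nan                      # a data gap to be bridged

    x, p = 0.0, 1.0                          # state estimate and its variance
    estimate = np.empty(n)
    for k in range(n):
        p += q                               # predict (random-walk model)
        if not np.isnan(obs[k]):             # update only where data exist
            gain = p / (p + r)
            x += gain * (obs[k] - x)
            p *= (1 - gain)
        estimate[k] = x

    print("RMSE inside the gap:", np.sqrt(np.mean((estimate[40:60] - truth[40:60]) ** 2)))
    ```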

  3. Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.

    PubMed

    Rubert, Josep; Zachariasova, Milena; Hajslova, Jana

    2015-01-01

    Food authenticity has become a necessity for global food policies, since food placed on the market must be authentic. It has always been a challenge: in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods allow food fingerprints to be acquired. At the same time, they have been combined with chemometrics, which uses statistical methods to verify food and to extract maximum information from chemical data. These sophisticated methods, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has experienced significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and thereby a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of the HRMS analytical platforms combined with chemometrics.

  4. Review of advanced imaging techniques

    PubMed Central

    Chen, Yu; Liang, Chia-Pin; Liu, Yang; Fischer, Andrew H.; Parwani, Anil V.; Pantanowitz, Liron

    2012-01-01

    Pathology informatics encompasses digital imaging and related applications. Several specialized microscopy techniques have emerged which permit the acquisition of digital images (“optical biopsies”) at high resolution. Coupled with fiber-optic and micro-optic components, some of these imaging techniques (e.g., optical coherence tomography) are now integrated with a wide range of imaging devices such as endoscopes, laparoscopes, catheters, and needles that enable imaging inside the body. These advanced imaging modalities have exciting diagnostic potential and introduce new opportunities in pathology. Therefore, it is important that pathology informaticists understand these advanced imaging techniques and the impact they have on pathology. This paper reviews several recently developed microscopic techniques, including diffraction-limited methods (e.g., confocal microscopy, 2-photon microscopy, 4Pi microscopy, and spatially modulated illumination microscopy) and subdiffraction techniques (e.g., photoactivated localization microscopy, stochastic optical reconstruction microscopy, and stimulated emission depletion microscopy). This article serves as a primer for pathology informaticists, highlighting the fundamentals and applications of advanced optical imaging techniques. PMID:22754737

  5. Updates to Post-Flash Calibration for the Advanced Camera for Surveys Wide Field Channel

    NASA Astrophysics Data System (ADS)

    Miles, Nathan

    2018-03-01

    This report presents a new technique for generating the post-flash calibration reference file for the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC). The new method substantially reduces, if not eliminates altogether, the dark-current artifacts arising from improper dark subtraction, while simultaneously preserving flat-field artifacts. The stability of the post-flash calibration reference file over time is measured using data taken yearly since 2012, and no statistically significant deviations are found. An analysis of all short-flashed darks taken every two days since January 2015 reveals a periodic modulation of the LED intensity on timescales of about one year. This effect is most readily explained by changes to the local temperature in the area surrounding the LED. However, a slight offset between the periods of the temperature and LED modulations leaves open the possibility that the effect is a chance observation of the two sinusoids at an unfortunate point in their beat cycle.

  6. Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions

    PubMed Central

    Cassidy, Rachel N; Raiff, Bethany R

    2013-01-01

    Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668
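
    As a hedged illustration of the quantitative supplements to visual analysis mentioned above (effect sizes for single-case data), the sketch below computes a standardized mean difference and the nonoverlap of all pairs for a synthetic A-B design; the phase data and the choice of the baseline SD as the scaling term are assumptions for illustration only.

    ```python
    # Two simple effect-size summaries for an n-of-1 (A-B) time series.
    import numpy as np

    baseline = np.array([4, 5, 3, 6, 4, 5])        # phase A observations
    treatment = np.array([8, 9, 7, 10, 9, 8])      # phase B observations

    # Standardized mean difference using the baseline SD (one common choice)
    smd = (treatment.mean() - baseline.mean()) / baseline.std(ddof=1)

    # Nonoverlap of all pairs: share of (A, B) pairs where B exceeds A (ties count half)
    pairs = treatment[:, None] - baseline[None, :]
    nap = (np.sum(pairs > 0) + 0.5 * np.sum(pairs == 0)) / pairs.size

    print(f"SMD = {smd:.2f}, NAP = {nap:.2f}")
    ```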

  7. Redesign of LAOBP to bind novel l-amino acid ligands.

    PubMed

    Banda-Vázquez, Jesús; Shanmugaratnam, Sooruban; Rodríguez-Sotres, Rogelio; Torres-Larios, Alfredo; Höcker, Birte; Sosa-Peinado, Alejandro

    2018-05-01

    Computational protein design is still a challenge for advancing structure-function relationships. While recent advances in this field are promising, more information for genuine predictions is needed. Here, we discuss different approaches applied to install novel glutamine (Gln) binding into the Lysine/Arginine/Ornithine binding protein (LAOBP) from Salmonella typhimurium. We studied the ligand binding behavior of two mutants: a binding pocket grafting design based on a structural superposition of LAOBP to the Gln binding protein QBP from Escherichia coli, and a design based on statistically coupled positions. The latter showed the ability to bind Gln even though the protein was not very stable. Comparison of both approaches highlighted a nonconservative shared point mutation between LAOBP_graft and LAOBP_sca. This context-dependent L117K mutation in LAOBP turned out to be sufficient for introducing Gln binding, as confirmed by different experimental techniques. Moreover, the crystal structure of LAOBP_L117K in complex with its ligand is reported. © 2018 The Protein Society.

  8. Combining medical informatics and bioinformatics toward tools for personalized medicine.

    PubMed

    Sarachan, B D; Simmons, M K; Subramanian, P; Temkin, J M

    2003-01-01

    Key bioinformatics and medical informatics research areas need to be identified to advance knowledge and understanding of disease risk factors and molecular disease pathology in the 21st century toward new diagnoses, prognoses, and treatments. Three high-impact informatics areas are identified: predictive medicine (to identify significant correlations within clinical data using statistical and artificial intelligence methods), along with pathway informatics and cellular simulations (that combine biological knowledge with advanced informatics to elucidate molecular disease pathology). Initial predictive models have been developed for a pilot study in Huntington's disease. An initial bioinformatics platform has been developed for the reconstruction and analysis of pathways, and work has begun on pathway simulation. A bioinformatics research program has been established at GE Global Research Center as an important technology toward next generation medical diagnostics. We anticipate that 21st century medical research will be a combination of informatics tools with traditional biology wet lab research, and that this will translate to increased use of informatics techniques in the clinic.

  9. The logical foundations of forensic science: towards reliable knowledge

    PubMed Central

    Evett, Ian

    2015-01-01

    The generation of observations is a technical process and the advances that have been made in forensic science techniques over the last 50 years have been staggering. But science is about reasoning—about making sense from observations. For the forensic scientist, this is the challenge of interpreting a pattern of observations within the context of a legal trial. Here too, there have been major advances over recent years and there is a broad consensus among serious thinkers, both scientific and legal, that the logical framework is furnished by Bayesian inference (Aitken et al. Fundamentals of Probability and Statistical Evidence in Criminal Proceedings). This paper shows how the paradigm has matured, centred on the notion of the balanced scientist. Progress through the courts has not been always smooth and difficulties arising from recent judgments are discussed. Nevertheless, the future holds exciting prospects, in particular the opportunities for managing and calibrating the knowledge of the forensic scientists who assign the probabilities that are at the foundation of logical inference in the courtroom. PMID:26101288
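
    A minimal numerical sketch of the Bayesian updating at the heart of this framework: the scientist's likelihood ratio for the evidence under the competing propositions multiplies the court's prior odds to give posterior odds. The prior probability and the likelihood ratio below are purely illustrative assumptions.

    ```python
    # Posterior odds = prior odds x likelihood ratio (illustrative numbers only).
    prior_prob = 0.01                       # prior for the prosecution proposition
    likelihood_ratio = 1000.0               # P(evidence | Hp) / P(evidence | Hd)

    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    posterior_prob = posterior_odds / (1 + posterior_odds)

    print(f"posterior probability = {posterior_prob:.3f}")   # about 0.91
    ```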

  10. Advanced dynamic statistical parametric mapping with MEG in localizing epileptogenicity of the bottom of sulcus dysplasia.

    PubMed

    Nakajima, Midori; Wong, Simeon; Widjaja, Elysa; Baba, Shiro; Okanishi, Tohru; Takada, Lynne; Sato, Yosuke; Iwata, Hiroki; Sogabe, Maya; Morooka, Hikaru; Whitney, Robyn; Ueda, Yuki; Ito, Tomoshiro; Yagyu, Kazuyori; Ochi, Ayako; Carter Snead, O; Rutka, James T; Drake, James M; Doesburg, Sam; Takeuchi, Fumiya; Shiraishi, Hideaki; Otsubo, Hiroshi

    2018-06-01

    To investigate whether advanced dynamic statistical parametric mapping (AdSPM) using magnetoencephalography (MEG) can better localize focal cortical dysplasia at the bottom of a sulcus (FCDB). We analyzed, using MEG, 15 children with a diagnosis of FCDB based on the surgical specimen and 3 T MRI. Using AdSPM, we analyzed a ±50 ms epoch relative to each single moving dipole (SMD) and applied a summation technique to estimate the source activity. The most active area in AdSPM was defined as the location of the AdSPM spike source. We compared spatial congruence between MRI-visible FCDB and (1) the dipole cluster in the SMD method and (2) the AdSPM spike source. AdSPM localized FCDB in 12 (80%) of 15 children, whereas the dipole cluster localized six (40%). The AdSPM spike source was concordant with the seizure onset zone in nine (82%) of 11 children with intracranial video EEG. Eleven children with resective surgery achieved seizure freedom, with a follow-up period of 1.9 ± 1.5 years. Ten (91%) of them had an AdSPM spike source in the resection area. AdSPM can noninvasively and neurophysiologically localize epileptogenic FCDB, whether or not it overlaps with the dipole cluster. This is the first study to localize epileptogenic FCDB using MEG. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  11. Application of advanced techniques for the assessment of bio-stability of biowaste-derived residues: A minireview.

    PubMed

    Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing

    2018-01-01

    Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescence, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analyses, are emphasized because of their application to bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, their viability as potential indicators of bio-stability ultimately lies in establishing their relationship with the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substitution of zero or the detection limit value for censored data.
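
    The sketch below illustrates two of the treatments compared in the study, one-half detection-limit substitution and a simplified regression on order statistics, on a synthetic lognormal data set; the Kaplan-Meier variant and the paper's evaluation against known true values are not reproduced, and the plotting-position formula and detection limit are assumptions chosen for illustration.

    ```python
    # Summary statistics for left-censored data: half-DL substitution and simple ROS.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    true = rng.lognormal(mean=0.0, sigma=1.0, size=100)
    dl = 0.5                                          # detection limit
    detected = true[true >= dl]
    n_cens = np.sum(true < dl)

    # Substitution: censored values set to one-half the detection limit
    sub = np.concatenate([detected, np.full(n_cens, dl / 2)])

    # ROS: regress log(detected) on normal scores, then impute the censored tail
    pp = (np.arange(1, true.size + 1) - 0.375) / (true.size + 0.25)   # plotting positions
    z = stats.norm.ppf(pp)
    slope, intercept, *_ = stats.linregress(z[n_cens:], np.log(np.sort(detected)))
    imputed = np.exp(intercept + slope * z[:n_cens])
    ros = np.concatenate([detected, imputed])

    print(f"true mean {true.mean():.2f}  substitution {sub.mean():.2f}  ROS {ros.mean():.2f}")
    ```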

  13. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  14. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  15. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  16. Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.

    PubMed

    Willis, Brian H; Riley, Richard D

    2017-09-20

    An important question for clinicians appraising a meta-analysis is whether the findings are likely to be valid in their own practice: does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity, where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
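
    A hedged sketch of the leave-one-out idea on which the approach rests: each study is withheld in turn, the remainder are pooled with a DerSimonian-Laird random-effects model, and the withheld estimate is compared with its prediction. The paper's actual validation statistic Vn and its derived distribution are not reproduced; the pooling routine, data and standardization below are illustrative assumptions.

    ```python
    # Leave-one-out cross-validation around a random-effects meta-analysis.
    import numpy as np

    def dl_pool(y, v):
        """DerSimonian-Laird random-effects pooled estimate, its variance, and tau^2."""
        w = 1.0 / v
        yw = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - yw) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)
        w_star = 1.0 / (v + tau2)
        mu = np.sum(w_star * y) / np.sum(w_star)
        return mu, 1.0 / np.sum(w_star), tau2

    rng = np.random.default_rng(4)
    k = 12
    v = rng.uniform(0.02, 0.1, k)                               # within-study variances
    y = rng.normal(0.3, 0.15, k) + rng.normal(0, np.sqrt(v))    # synthetic study estimates

    for i in range(k):
        keep = np.arange(k) != i
        mu, var_mu, tau2 = dl_pool(y[keep], v[keep])
        z = (y[i] - mu) / np.sqrt(v[i] + tau2 + var_mu)         # standardized prediction error
        print(f"study {i:2d}: z = {z:+.2f}")
    ```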

  17. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or non-significant on the basis of a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, the two approaches address the same objective but reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research has initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
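
    As a concrete illustration of the power calculations the author discusses, the sketch below uses statsmodels to find the per-group sample size for a two-sample t test; the effect size, alpha and power are illustrative choices, not values from the article.

    ```python
    # Prospective power calculation: sample size per group for a two-sample t test.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                       ratio=1.0, alternative='two-sided')
    print(f"required n per group ~ {n_per_group:.1f}")   # about 64
    ```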

  18. Online Detection and Modeling of Safety Boundaries for Aerospace Application Using Bayesian Statistics

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporating domain expert knowledge, the locations and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.

  19. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds.

    PubMed

    de Jong, Maarten; Chen, Wei; Notestine, Randy; Persson, Kristin; Ceder, Gerbrand; Jain, Anubhav; Asta, Mark; Gamst, Anthony

    2016-10-03

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials.
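
    A minimal sketch of the two ingredients named in the abstract, power (Hölder) mean descriptors and gradient boosting, using entirely synthetic compositions and properties; the paper's descriptor set, its database of calculated moduli, and its multivariate local regression within boosting are not reproduced.

    ```python
    # Power (Hölder) mean descriptors over per-element properties, fed to gradient boosting.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    def holder_mean(x, p):
        x = np.asarray(x, dtype=float)
        if p == 0:                                   # geometric mean as the p -> 0 limit
            return np.exp(np.mean(np.log(x)))
        return (np.mean(x ** p)) ** (1.0 / p)

    rng = np.random.default_rng(5)
    descriptors, targets = [], []
    for _ in range(200):
        k = rng.integers(2, 5)                       # binary to quaternary compounds
        elem_modulus = rng.uniform(10, 400, k)       # synthetic per-element property
        elem_volume = rng.uniform(8, 30, k)
        feats = [holder_mean(elem_modulus, p) for p in (-1, 0, 1, 2)] + \
                [holder_mean(elem_volume, p) for p in (-1, 0, 1, 2)]
        descriptors.append(feats)
        targets.append(0.4 * feats[1] - 1.5 * feats[5] + rng.normal(0, 5))  # toy bulk modulus

    model = GradientBoostingRegressor(random_state=0).fit(descriptors, targets)
    print("training R^2:", round(model.score(descriptors, targets), 3))
    ```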

  20. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds

    PubMed Central

    de Jong, Maarten; Chen, Wei; Notestine, Randy; Persson, Kristin; Ceder, Gerbrand; Jain, Anubhav; Asta, Mark; Gamst, Anthony

    2016-01-01

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials. PMID:27694824

  1. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds

    DOE PAGES

    de Jong, Maarten; Chen, Wei; Notestine, Randy; ...

    2016-10-03

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials.

  2. Irreversible electroporation of locally advanced pancreatic neck/body adenocarcinoma

    PubMed Central

    2015-01-01

    Objective: Irreversible electroporation (IRE) of locally advanced pancreatic adenocarcinoma of the neck has been used to palliate appropriate stage 3 pancreatic cancers in patients without evidence of metastasis who have undergone appropriate induction therapy. Currently, no standardized technique has been reported for pancreatic mid-body tumors regarding patient selection and intra-operative technique. Patients: Subjects are patients with locally advanced pancreatic adenocarcinoma of the body/neck who have undergone appropriate induction chemotherapy for a reasonable duration. Main outcome measures: The technique of open IRE of locally advanced pancreatic adenocarcinoma of the neck/body is described, with the emphasis on intra-operative ultrasound and intra-operative electroporation management. Results: The technique of open IRE of the pancreatic neck/body with bracketing of the celiac axis and superior mesenteric artery under continuous intraoperative ultrasound imaging, with consideration of an intraoperative navigational system, is described. Conclusions: IRE of locally advanced pancreatic adenocarcinoma of the body/neck is feasible for appropriate patients with locally advanced unresectable pancreatic cancer. PMID:26029461

  3. Xenogeneic Collagen Matrix Versus Connective Tissue Graft: Case Series of Various Gingival Recession Treatments.

    PubMed

    Chevalier, Grégoire; Cherkaoui, Selma; Kruk, Hanna; Bensaïd, Xavier; Danan, Marc

    A xenogeneic collagen matrix has recently been suggested as an alternative to connective tissue graft for the treatment of gingival recession. The matrix avoids the second surgical site and, as a consequence, could decrease surgical morbidity. This new matrix was used in various clinical situations and compared to connective tissue graft (CTG) in a split-mouth design case series. A total of 17 recessions were treated with a coronally advanced flap, 9 with CTG and 8 with the matrix. Mean recession reduction was 2.00 mm with the CTG and 2.00 mm with the matrix. No statistically significant differences between the techniques were observed in this case report.

  4. Xenogeneic Collagen Matrix Versus Connective Tissue Graft: Case Series of Various Gingival Recession Treatments.

    PubMed

    Chevalier, Grégoire; Cherkaoui, Selma; Kruk, Hanna; Bensaïd, Xavier; Danan, Marc

    2016-08-24

    A xenogeneic collagen matrix has recently been suggested as an alternative to connective tissue graft for the treatment of gingival recession. The matrix avoids the second surgical site and, as a consequence, could decrease surgical morbidity. This new matrix was used in various clinical situations and compared to connective tissue graft (CTG) in a split-mouth design case series. A total of 17 recessions were treated with a coronally advanced flap, 9 with CTG and 8 with the matrix. Mean recession reduction was 2.00 mm with the CTG and 2.00 mm with the matrix. No statistically significant differences between the techniques were observed in this case report.

  5. Automation in high-content flow cytometry screening.

    PubMed

    Naumann, U; Wand, M P

    2009-09-01

    High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.

  6. Challenges in the automated classification of variable stars in large databases

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Drake, Andrew; Djorgovski, S. G.; Mahabal, Ashish; Donalek, Ciro

    2017-09-01

    With ever-increasing numbers of astrophysical transient surveys, new facilities and archives of astronomical time series, time domain astronomy is emerging as a mainstream discipline. However, the sheer volume of data alone - hundreds of observations for hundreds of millions of sources - necessitates advanced statistical and machine learning methodologies for scientific discovery: characterization, categorization, and classification. Whilst these techniques are slowly entering the astronomer's toolkit, their application to astronomical problems is not without its issues. In this paper, we will review some of the challenges posed by trying to identify variable stars in large data collections, including appropriate feature representations, dealing with uncertainties, establishing ground truths, and simple discrete classes.

  7. Gaze inspired subtitle position evaluation for MOOCs videos

    NASA Astrophysics Data System (ADS)

    Chen, Hongli; Yan, Mengzhen; Liu, Sijiang; Jiang, Bo

    2017-06-01

    Online educational resources, such as MOOCs, are becoming increasingly popular, especially in higher education. One of the most important media types for MOOCs is course video. Besides the traditional bottom-position subtitles that accompany videos, researchers have in recent years tried to develop more advanced algorithms to generate speaker-following subtitles. However, the effectiveness of such subtitles is still unclear. In this paper, we investigate the relationship between subtitle position and the learning effect after watching the video on tablet devices. Inspired by image-based eye-tracking techniques, this work combines objective gaze-estimation statistics with a subjective user study to reach a convincing conclusion: speaker-following subtitles are more suitable for online educational videos.

  8. Neutron Resonance Spin Determination Using Multi-Segmented Detector DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baramsai, B.; Mitchell, G. E.; Chyzh, A.

    2011-06-01

    A sensitive method to determine the spin of neutron resonances is introduced, based on a statistical pattern-recognition technique. The new method was used to assign the spins of s-wave resonances in 155Gd. The experimental neutron capture data for this nucleus were measured with the DANCE (Detector for Advanced Neutron Capture Experiment) calorimeter at the Los Alamos Neutron Science Center. The highly segmented calorimeter provided detailed multiplicity distributions of the capture γ-rays. Using this information, the spins of the neutron capture resonances were determined. With these new spin assignments, level spacings are determined separately for s-wave resonances with Jπ = 1− and 2−.

  9. A Bernoulli Formulation of the Land-Use Portfolio Model

    USGS Publications Warehouse

    Champion, Richard A.

    2008-01-01

    Decision making for natural-hazards mitigation can be sketched as knowledge available in advance (a priori), knowledge available later (a posteriori), and how consequences of the mitigation decision might be viewed once future outcomes are known. Two outcomes - mitigating for a hazard event that will occur, and not mitigating for a hazard event that will not occur - can be considered narrowly correct. Two alternative outcomes - mitigating for a hazard event that will not occur, and not mitigating for a hazard event that will occur - can be considered narrowly incorrect. The dilemma facing the decision maker is that mitigation choices must be made before the event, and often must be made with imperfect statistical techniques and imperfect data.
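
    A toy expected-loss version of the dilemma sketched above: with an uncertain (Bernoulli) hazard event, the expected cost of mitigating is compared with the expected loss of doing nothing. The probabilities and costs are illustrative assumptions, not part of the Land-Use Portfolio Model itself.

    ```python
    # Compare expected costs of the two a priori choices under an uncertain event.
    p_event = 0.15            # estimated probability the hazard event occurs
    mitigation_cost = 2.0     # cost of mitigating now (e.g., $ millions)
    loss_if_unmitigated = 20.0
    residual_loss = 3.0       # loss remaining if the event occurs despite mitigation

    expected_cost_mitigate = mitigation_cost + p_event * residual_loss
    expected_cost_do_nothing = p_event * loss_if_unmitigated

    decision = "mitigate" if expected_cost_mitigate < expected_cost_do_nothing else "do not mitigate"
    print(f"{decision} ({expected_cost_mitigate:.2f} vs {expected_cost_do_nothing:.2f})")
    ```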

  10. A novel alkaloid isolated from Crotalaria paulina and identified by NMR and DFT calculations

    NASA Astrophysics Data System (ADS)

    Oliveira, Ramon Prata; Demuner, Antonio Jacinto; Alvarenga, Elson Santiago; Barbosa, Luiz Claudio Almeida; de Melo Silva, Thiago

    2018-01-01

    Pyrrolizidine alkaloids (PAs) are secondary metabolites found in Crotalaria genus and are known to have several biological activities. A novel macrocycle bislactone alkaloid, coined ethylcrotaline, was isolated and purified from the aerial parts of Crotalaria paulina. The novel macrocycle was identified with the aid of high resolution mass spectrometry and advanced nuclear magnetic resonance techniques. The relative stereochemistry of the alkaloid was defined by comparing the calculated quantum mechanical hydrogen and carbon chemical shifts of eight candidate structures with the experimental NMR data. The best fit between the eight candidate structures and the experimental NMR chemical shifts was defined by the DP4 statistical analyses and the Mean Absolute Error (MAE) calculations.

  11. CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; Albert, Susan L.

    A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…

  12. Labour analgesia: Recent advances

    PubMed Central

    Pandya, Sunil T

    2010-01-01

    Advances in the field of labour analgesia have come a long way from the days of ether and chloroform in 1847 to the present-day practice of a comprehensive programme of labour pain management using evidence-based medicine. Newer advances include the introduction of newer techniques such as combined spinal-epidurals and low-dose epidurals facilitating ambulation; pharmacological advances such as the introduction of remifentanil for patient-controlled intravenous analgesia and of newer local anaesthetics and adjuvants like ropivacaine, levobupivacaine, sufentanil, clonidine and neostigmine; and the use of inhalational agents like sevoflurane for patient-controlled inhalational analgesia using special vaporizers, all of which have revolutionized the practice of pain management in labouring parturients. Technological advances, such as the use of ultrasound to localize the epidural space in difficult cases, minimize failed epidurals, and the introduction of novel drug delivery modalities like patient-controlled epidural analgesia (PCEA) pumps and computer-integrated drug delivery pumps has improved the overall maternal satisfaction rate and enabled a suitable analgesic regimen to be customized for each parturient. Recent randomized controlled trials and Cochrane studies have concluded that the association of epidurals with increased caesarean section and long-term backache remains only a myth. Studies have also shown that the newer, low-dose regimes do not have a statistically significant impact on the duration of labour and breast feeding, and that they reduce instrumental delivery rates, thus improving maternal and foetal safety. Advances in medical technology, like the use of ultrasound for localizing the epidural space, have helped clinicians to minimize failure rates, and many novel drug delivery modalities like PCEA and computer-integrated PCEA have contributed to overall maternal satisfaction and safety. PMID:21189877

  13. Recent advances in quantitative high throughput and high content data analysis.

    PubMed

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
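
    As one small, hedged example of the kind of plate-level analysis discussed above, the sketch below computes per-plate robust z-scores (median/MAD) to flag candidate hits in a synthetic 384-well readout; the layout, threshold and scaling constant are conventional choices, not taken from the article.

    ```python
    # Robust z-score (median/MAD) normalization of a single screening plate.
    import numpy as np

    rng = np.random.default_rng(6)
    plate = rng.normal(100, 10, size=(16, 24))         # 384-well plate readout
    plate[3, 7] = 190                                  # an active well

    med = np.median(plate)
    mad = np.median(np.abs(plate - med))
    robust_z = (plate - med) / (1.4826 * mad)          # 1.4826 scales MAD to an SD

    hits = np.argwhere(np.abs(robust_z) > 3)
    print("candidate hits (row, col):", hits.tolist())
    ```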

  14. Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques

    PubMed Central

    Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun

    2017-01-01

    Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is to therefore provide updates and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156

  15. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
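
    As a hedged illustration of one item in this list, the sketch below applies principal component analysis to synthetic multiband pixel values and reports the variance captured by the leading components; the band structure and noise levels are assumptions for illustration.

    ```python
    # Principal component analysis of multiband pixel values.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    n_pixels, n_bands = 10_000, 6
    base = rng.normal(size=(n_pixels, 1))                                   # shared scene signal
    bands = base @ rng.uniform(0.5, 1.5, size=(1, n_bands)) \
            + rng.normal(0, 0.2, (n_pixels, n_bands))                       # correlated bands + noise

    pca = PCA(n_components=3).fit(bands)
    print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
    ```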

  16. 24 CFR 266.420 - Closing and endorsement by the Commissioner.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... (a) Closing. Before disbursement of loan advances in periodic advances cases, and in all cases after... market occupancy percentages, value/replacement cost, interest rate, and similar statistical information... certification for periodic advances cases, if submitted for final endorsement, that advances were made...

  17. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, M; Petrick, N; Obuchowski, N

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  18. Lightning Initiation Forecasting: An Operational Dual-Polarimetric Radar Technique

    NASA Technical Reports Server (NTRS)

    Woodard, Crystal J.; Carey, L. D.; Petersen, W. A.; Roeder, W. P.

    2011-01-01

    The objective of this NASA MSFC and NOAA CSTAR funded study is to develop and test operational forecast algorithms for the prediction of lightning initiation utilizing the C-band dual-polarimetric radar, UAHuntsville's Advanced Radar for Meteorological and Operational Research (ARMOR). Although there is a rich research history of radar signatures associated with lightning initiation, few studies have utilized dual-polarimetric radar signatures (e.g., Zdr columns) and capabilities (e.g., fuzzy-logic particle identification [PID] of precipitation ice) in an operational algorithm for first flash forecasting. The specific goal of this study is to develop and test polarimetric techniques that enhance the performance of current operational radar reflectivity based first flash algorithms. Improving lightning watch and warning performance will positively impact personnel safety in both work and leisure environments. Advanced warnings can provide space shuttle launch managers time to respond appropriately to secure equipment and personnel, while they can also provide appropriate warnings for spectators and players of leisure sporting events to seek safe shelter. Through the analysis of eight case dates, consisting of 35 pulse-type thunderstorms and 20 non-thunderstorm case studies, lightning initiation forecast techniques were developed and tested. The hypothesis is that the additional dual-polarimetric information could potentially reduce false alarms while maintaining high probability of detection and increasing lead-time for the prediction of the first lightning flash relative to reflectivity-only based techniques. To test the hypothesis, various physically-based techniques using polarimetric variables and/or PID categories, which are strongly correlated to initial storm electrification (e.g., large precipitation ice production via drop freezing), were benchmarked against the operational reflectivity-only based approaches to find the best compromise between forecast skill and lead-time. Forecast skill is determined by statistical analysis of probability of detection (POD), false alarm ratio (FAR), Operational Utility Index (OUI), and critical success index (CSI).
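
    For reference, the verification statistics named at the end of the abstract can be computed from a 2x2 contingency table of forecasts versus observed first flashes, as in the sketch below; the counts are illustrative, and the Operational Utility Index, whose weighting is study-specific, is omitted.

    ```python
    # Skill scores from a forecast/observation contingency table (illustrative counts).
    hits, misses, false_alarms = 30, 5, 10

    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index

    print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
    ```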

  19. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  20. Reliability of CBCT as an assessment tool for mandibular molars furcation defects

    NASA Astrophysics Data System (ADS)

    Marinescu, Adrian George; Boariu, Marius; Rusu, Darian; Stratul, Stefan-Ioan; Ogodescu, Alexandru

    2014-01-01

    Introduction. In numerous clinical situations it is not possible to obtain an exact clinical evaluation of furcation defects. Recently, the use of CBCT in periodontology has led to increased diagnostic precision. Aim. To determine the accuracy of CBCT as a diagnostic tool for furcation defects. Material and method. 19 patients with generalised advanced chronic periodontitis were included in this study, presenting a total of 25 lower molars with different degrees of furcation defects. Clinical and digital measurements (in mm) were performed on all the molars involved. The data obtained were compared and statistically analysed. Results. The analysis of primary data demonstrated that all grade II and III furcation defects were revealed using the CBCT technique. Regarding the incipient defects (grade I Hamp < 3 mm), the dimensions measured on CBCT images were slightly larger. The results showed that 84% of the defects detected by CBCT were confirmed by clinical measurements. These data are similar to those revealed by other studies. Conclusions. The use of the CBCT technique in the evaluation and diagnosis of human mandibular furcation defects can provide important information regarding the size and aspect of the interradicular defect, efficiently and noninvasively. The CBCT technique is more effective in detecting advanced furcation defects than incipient ones. However, the CBCT examination cannot replace, at least at this stage of development, clinical measurements, especially intraoperative ones, which are considered the "gold standard" in this domain.

  1. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.

  2. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies at the global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and in recent years a number of efforts from the United States Geological Survey (USGS) and National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at the regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from USGS that reports a total of 1500 storms that either triggered or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, maximum intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
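
    A minimal sketch of the kind of random-forest classifier described above, using scikit-learn on a hypothetical table of storm and land-surface predictors; the feature values, labels, and cross-validation setup are assumptions for illustration only, not the authors' actual dataset or model.

    ```python
    # Illustrative random-forest model for post-fire debris-flow occurrence.
    # X holds hypothetical predictors (storm duration, accumulation, peak
    # intensity, soil erodibility, local slope); y is 1 if a debris flow occurred.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.random((1500, 5))                   # placeholder for the storm database
    y = (X[:, 2] + X[:, 4] > 1.1).astype(int)   # synthetic occurrence labels

    model = RandomForestClassifier(n_estimators=500, random_state=0)
    pred = cross_val_predict(model, X, y, cv=5)

    hits = np.sum((pred == 1) & (y == 1))
    misses = np.sum((pred == 0) & (y == 1))
    false_alarms = np.sum((pred == 1) & (y == 0))
    threat_score = hits / (hits + misses + false_alarms)  # a.k.a. CSI
    print(f"threat score = {threat_score:.2f}")
    ```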

  3. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  4. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  5. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  6. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  7. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  8. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  9. Single-tooth anesthesia: pressure-sensing technology provides innovative advancement in the field of dental local anesthesia.

    PubMed

    Hochman, Mark N

    2007-04-01

    This article will review standard techniques for intraligamentary injection and describe the technology and technique behind a new single-tooth anesthesia system. This system and technique represent a technological advancement and a greater understanding of intraligamentary anesthesia.

  10. Developing Statistical Literacy with Year 9 Students: A Collaborative Research Project

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2013-01-01

    Advances in technology and communication have increased the amount of statistical information delivered through everyday media. The importance of statistics in everyday life has led to calls for increased attention to statistical literacy in the mathematics curriculum (Watson 2006). Gal (2004) sees statistical literacy as the need for students to…

  11. Teaching Statistics Online: A Decade's Review of the Literature about What Works

    ERIC Educational Resources Information Center

    Mills, Jamie D.; Raju, Dheeraj

    2011-01-01

    A statistics course can be a very challenging subject to teach. To enhance learning, today's modern course in statistics might incorporate many different aspects of technology. Due to advances in technology, teaching statistics online has also become a popular course option. Although researchers are studying how to deliver statistics courses in…

  12. Four-corner fusion: comparison of patient satisfaction and functional outcome of conventional K-wire technique vs. a new locking plate.

    PubMed

    Hernekamp, J F; Reinecke, A; Neubrech, F; Bickert, B; Kneser, U; Kremer, T

    2016-04-01

    Four-corner fusion is a standard procedure for advanced carpal collapse. Several operative techniques and numerous implants for osseous fixation have been described. Recently, a specially designed locking plate (Aptus©, Medartis, Basel, Switzerland) was introduced. The purpose of this study was to compare functional results after four-corner fusion with osseous fixation using K-wires (standard of care, SOC) versus locking plate fixation. 21 patients who underwent four-corner fusion in our institution between 2008 and 2013 were included in a retrospective analysis. In 11 patients, osseous fixation was performed using locking plates, whereas ten patients underwent bone fixation with conventional K-wires. Outcome parameters were functional outcome, osseous consolidation, patient satisfaction (DASH and Krimmer scores), pain, perioperative morbidity, and the time until patients returned to daily work. Patients were divided into two groups and paired t-tests were performed for statistical analysis. No implant-related complications were observed. Osseous consolidation was achieved in all cases. Differences between groups were not significant regarding active range of motion (AROM), pain, and function. Overall patient satisfaction was acceptable in all cases; differences in the DASH questionnaire and the Krimmer questionnaire were not significant. One patient of the plate group required conversion to total wrist arthrodesis without implant-related complications. Both techniques for four-corner fusion have similar healing rates. Using the more expensive locking implant avoids a second operation for K-wire removal, but no statistically significant differences in functional outcome or patient satisfaction were detected when compared with SOC.

  13. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    PubMed Central

    Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie

    2015-01-01

    Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed, a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
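
    A minimal sketch of one of the regression approaches named above, modelling food-outlet counts per neighbourhood as a function of a deprivation score with a negative binomial GLM in statsmodels. The variables and data are invented for illustration, and the sketch deliberately ignores the spatial autocorrelation issue raised in the review.

    ```python
    # Illustrative negative binomial regression of outlet counts on neighbourhood
    # socio-economic status (SES); data are synthetic placeholders.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 400
    deprivation = rng.normal(0, 1, n)                    # neighbourhood SES score
    mu = np.exp(1.2 + 0.3 * deprivation)                 # more outlets in deprived areas
    counts = rng.poisson(mu * rng.gamma(2.0, 0.5, n))    # overdispersed counts

    X = sm.add_constant(deprivation)
    model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    print(model.summary().tables[1])                     # coefficient on deprivation
    ```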

  14. RF Testing Of Microwave Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Romanofsky, R. R.; Ponchak, G. E.; Shalkhauser, K. A.; Bhasin, K. B.

    1988-01-01

    Fixtures and techniques are undergoing development. Four test fixtures and two advanced techniques developed in continuing efforts to improve RF characterization of MMIC's. Finline/waveguide test fixture developed to test submodules of 30-GHz monolithic receiver. Universal commercially-manufactured coaxial test fixture modified to enable characterization of various microwave solid-state devices in frequency range of 26.5 to 40 GHz. Probe/waveguide fixture is compact, simple, and designed for nondestructive testing of large number of MMIC's. Nondestructive-testing fixture includes cosine-tapered ridge to match impedance from waveguide to microstrip. Advanced technique is microwave-wafer probing. Second advanced technique is electro-optical sampling.

  15. Endoscopic therapy for early gastric cancer: Standard techniques and recent advances in ESD

    PubMed Central

    Kume, Keiichiro

    2014-01-01

    The technique of endoscopic submucosal dissection (ESD) is now a well-known endoscopic therapy for early gastric cancer. ESD was introduced to resect large specimens of early gastric cancer in a single piece. ESD can provide precision of histologic diagnosis and can also reduce the recurrence rate. However, the drawback of ESD is its technical difficulty, and, consequently, it is associated with a high rate of complications, the need for advanced endoscopic techniques, and a lengthy procedure time. Various advances in the devices and techniques used for ESD have contributed to overcoming these drawbacks. PMID:24914364

  16. Advanced Diffusion-Weighted Magnetic Resonance Imaging Techniques of the Human Spinal Cord

    PubMed Central

    Andre, Jalal B.; Bammer, Roland

    2012-01-01

    Unlike those of the brain, advances in diffusion-weighted imaging (DWI) of the human spinal cord have been challenged by the more complicated and inhomogeneous anatomy of the spine, the differences in magnetic susceptibility between adjacent air and fluid-filled structures and the surrounding soft tissues, and the inherent limitations of the initially used echo-planar imaging techniques used to image the spine. Interval advances in DWI techniques for imaging the human spinal cord, with the specific aims of improving the diagnostic quality of the images, and the simultaneous reduction in unwanted artifacts have resulted in higher-quality images that are now able to more accurately portray the complicated underlying anatomy and depict pathologic abnormality with improved sensitivity and specificity. Diffusion tensor imaging (DTI) has benefited from the advances in DWI techniques, as DWI images form the foundation for all tractography and DTI. This review provides a synopsis of the many recent advances in DWI of the human spinal cord, as well as some of the more common clinical uses for these techniques, including DTI and tractography. PMID:22158130

  17. Image interpolation and denoising for division of focal plane sensors using Gaussian processes.

    PubMed

    Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor

    2014-06-16

    Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imaging sensors employ four different pixelated polarization filters, commonly referred to as division of focal plane polarization sensors. The sensors capture only partial information of the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform a statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N^3)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which is most pronounced in cases of low signal-to-noise ratio (SNR). We provide the comprehensive mathematical model as well as experimental results of the GP interpolation performance for division of focal plane polarimeters.
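
    A minimal sketch of Gaussian-process interpolation of missing samples with an explicit sensor-noise term, using scikit-learn's generic GP regressor on a synthetic 1D signal. This only illustrates the statistical idea; it does not reproduce the paper's fast O(N^(3/2)) grid-structured solver, and all data and kernel settings are assumptions.

    ```python
    # Illustrative GP interpolation: predict missing samples with uncertainty,
    # treating observed pixels as noisy measurements of a smooth intensity field.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)
    x_all = np.linspace(0, 10, 200)[:, None]
    truth = np.sin(x_all).ravel()
    observed = rng.choice(200, size=60, replace=False)         # pixels actually measured
    y_obs = truth[observed] + 0.05 * rng.standard_normal(60)   # sensor noise

    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05**2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(x_all[observed], y_obs)

    mean, std = gp.predict(x_all, return_std=True)             # interpolation + uncertainty
    missing = np.setdiff1d(np.arange(200), observed)
    rmse = np.sqrt(np.mean((mean[missing] - truth[missing]) ** 2))
    print("RMSE at unobserved pixels:", round(float(rmse), 4))
    ```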

  18. Evaluation of periosteum eversion and coronally advanced flap techniques in the treatment of isolated Miller's Class I/II gingival recession: A comparative clinical study

    PubMed Central

    Debnath, Koel; Chatterjee, Anirban

    2018-01-01

    Aim: The present investigation aimed to evaluate root coverage (RC) with the periosteum eversion technique (PET) using periosteum as a graft and the coronally advanced flap (CAF) with platelet-rich fibrin (PRF) membrane as a graft in the treatment of isolated Miller's Class I and II gingival recession defects. Materials and Methods: Thirty sites in 15 participants with Miller's Class I or II gingival recession were randomly treated either with PET using periosteum as graft or with CAF + PRF as graft. In a split-mouth design, the parameters recession depth, recession width at the cementoenamel junction, probing depth, periodontal attachment level (PAL), and keratinized gingival width were assessed at baseline, 3 months, and 6 months postoperative follow-up with a William's graduated probe and Vernier caliper. Results: Both treatment modalities yielded desirable treatment outcomes, with statistically nonsignificant differences between them at both postoperative time points for all parameters. The mean RC with the probe method and Vernier method in CAF + PRF was 75.01% and 86.86%, respectively, and PET showed a mean RC of 61.112% and 83.971%, respectively, at the 6-month interval, a statistically nonsignificant difference. Conclusion: Both treatment modalities, i.e., CAF + PRF and PET, are equally effective in the treatment of Miller's Class I or II gingival recession defects. PMID:29769769

  19. Measuring the statistical validity of summary meta‐analysis and meta‐regression results for use in clinical practice

    PubMed Central

    Riley, Richard D.

    2017-01-01

    An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945
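
    The following sketch illustrates the general leave-one-out idea described above: for each study, pool the remaining studies and measure how far the left-out study falls from that pooled estimate. It uses simple inverse-variance (fixed-effect) pooling and invented numbers, and is not the authors' Vn statistic or its distribution.

    ```python
    # Illustrative leave-one-out check of a meta-analysis summary estimate.
    # For each study i, pool the remaining studies (inverse-variance, fixed
    # effect for simplicity) and form a standardized difference between the
    # left-out study and that pooled estimate. Values are hypothetical.
    import numpy as np

    effects = np.array([0.32, 0.41, 0.18, 0.55, 0.29, 0.37])    # study effect sizes
    variances = np.array([0.02, 0.05, 0.03, 0.08, 0.04, 0.06])  # within-study variances

    z_scores = []
    for i in range(len(effects)):
        keep = np.arange(len(effects)) != i
        w = 1.0 / variances[keep]
        pooled = np.sum(w * effects[keep]) / np.sum(w)
        pooled_var = 1.0 / np.sum(w)
        # standardized discrepancy between left-out study and pooled-from-rest
        z = (effects[i] - pooled) / np.sqrt(variances[i] + pooled_var)
        z_scores.append(z)

    print("leave-one-out z-scores:", np.round(z_scores, 2))
    print("mean squared discrepancy:", round(float(np.mean(np.square(z_scores))), 3))
    ```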

  20. Research Education in Undergraduate Occupational Therapy Programs.

    ERIC Educational Resources Information Center

    Petersen, Paul; And Others

    1992-01-01

    Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)

  1. The Statistical Segment Length of DNA: Opportunities for Biomechanical Modeling in Polymer Physics and Next-Generation Genomics.

    PubMed

    Dorfman, Kevin D

    2018-02-01

    The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.

  2. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advanced imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  3. Advanced techniques to prepare seed to sow

    Treesearch

    Robert P. Karrfalt

    2013-01-01

    This paper reviews research on improving the basic technique of cold stratification for tree and shrub seeds. Advanced stratification techniques include long stratification, stratification re-dry, or multiple cycles of warm-cold stratification. Research demonstrates that careful regulation of moisture levels and lengthening the stratification period have produced a...

  4. Fourth NASA Inter-Center Control Systems Conference

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Space vehicle control applications are discussed, along with aircraft guidance, control, and handling qualities. System simulation and identification, engine control, advanced propulsion techniques, and advanced control techniques are also included.

  5. What can comparative effectiveness research, propensity score and registry study bring to Chinese medicine?

    PubMed

    Liao, Xing; Xie, Yan-ming

    2014-10-01

    The impact of evidence-based medicine and clinical epidemiology on clinical research has contributed to the development of Chinese medicine in modern times over the past two decades. Many concepts and methods of modern science and technology are emerging in Chinese medicine research, resulting in constant progress. Systematic reviews, randomized controlled trials and other advanced mathematical approaches and statistical analysis methods have brought reform to Chinese medicine. In this new era, Chinese medicine researchers have many opportunities and challenges. On the one hand, Chinese medicine researchers need to dedicate themselves to providing enough evidence to the world through rigorous studies, whilst on the other hand, they also need to keep up with the speed of modern medical research. For example, real-world studies, comparative effectiveness research, propensity score techniques and registry studies have recently emerged. This article aims to inspire Chinese medicine researchers to explore new areas by introducing these new ideas and new techniques.

  6. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean

  7. Understanding Statistics and Statistics Education: A Chinese Perspective

    ERIC Educational Resources Information Center

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  8. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of morphology and evolution of the microstructure during processing and their relation to properties requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and the matrix of a layered structure or a functionally gradient material and their variation are among parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials, including the stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  9. Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.

    PubMed

    Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo

    2015-11-01

    The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Pain variable means were higher for the conventional technique as compared with the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The average runtimes of the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, showing statistically significant differences (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show statistically significant differences when contrasted with the conventional technique.

  10. Infant Statistical-Learning Ability Is Related to Real-Time Language Processing

    ERIC Educational Resources Information Center

    Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf

    2018-01-01

    Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…

  11. [Technical advancement improves survival in patients with locally advanced non-small cell lung cancer (LA-NSCLC) receiving definitive radiotherapy].

    PubMed

    Wang, J B; Jiang, W; Ji, Z; Cao, J Z; Liu, L P; Men, Y; Xu, C; Wang, X Z; Hui, Z G; Liang, J; Lyu, J M; Zhou, Z M; Xiao, Z F; Feng, Q F; Chen, D F; Zhang, H X; Yin, W B; Wang, L H

    2016-08-01

    This study aimed to evaluate the impact of technical advancement of radiation therapy in patients with LA-NSCLC receiving definitive radiotherapy (RT). Patients treated with definitive RT (≥50 Gy) between 2000 and 2010 were retrospectively reviewed. Overall survival (OS), cancer specific survival (CSS), locoregional progression-free survival (LRPFS), distant metastasis-free survival (DMFS) and progression-free survival (PFS) were calculated and compared among patients irradiated with different techniques. Radiation-induced lung injury (RILI) and esophageal injury (RIEI) were assessed according to the National Cancer Institute Common Terminology Criteria for Adverse Events 3.0 (NCI-CTCAE 3.0). A total of 946 patients were eligible for analysis, including 288 treated with two-dimensional radiotherapy (2D-RT), 209 with three-dimensional conformal radiation therapy (3D-CRT) and 449 with intensity-modulated radiation therapy (IMRT). The median follow-up time for the whole population was 84.1 months. The median OS of the 2D-RT, 3D-CRT and IMRT groups was 15.8, 19.7 and 23.3 months, respectively, with corresponding 5-year survival rates of 8.7%, 13.0% and 18.8%, respectively (P<0.001). The univariate analysis demonstrated significantly inferior OS, LRPFS, DMFS and PFS for 2D-RT than those provided by 3D-CRT or IMRT. The univariate analysis also revealed that the IMRT group had significantly longer LRPFS and a trend toward better OS and DMFS compared with 3D-CRT. Multivariate analysis showed that TNM stage, RT technique and KPS were independent factors correlated with all survival indexes. Compared with 2D-RT, the utilization of IMRT was associated with significantly improved OS, LRPFS, DMFS as well as PFS. Compared with 3D-CRT, IMRT provided superior DMFS (P=0.035), a trend approaching significance with regard to LRPFS (P=0.073), but no statistically significant improvement in OS, CSS and PFS in multivariate analysis. The incidence rates of RILI were significantly decreased in the IMRT group (29.3% vs. 26.6% vs. 14.0%, P<0.001), whereas RIEI rates were similar (34.7% vs. 29.7% vs. 35.3%, P=0.342) among the three groups. Radiation therapy technique is a factor affecting the prognosis of LA-NSCLC patients. Advanced radiation therapy technique is associated with improved tumor control and survival, and decreased radiation-induced lung toxicity.

  12. Validation and Calibration of Nuclear Thermal Hydraulics Multiscale Multiphysics Models - Subcooled Flow Boiling Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anh Bui; Nam Dinh; Brian Williams

    In addition to validation data plan, development of advanced techniques for calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes are a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not conservation-laws based, but empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which, all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were simultaneously used in this work’s calibration. In a departure from traditional (or common-sense) practice of tuning/calibrating complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process. This work presents a step forward in the development and realization of the “CIPS Validation Data Plan” at the Consortium for Advanced Simulation of LWRs to enable quantitative assessment of the CASL modeling of Crud-Induced Power Shift (CIPS) phenomenon, in particular, and the CASL advanced predictive capabilities, in general. This report is prepared for the Department of Energy’s Consortium for Advanced Simulation of LWRs program’s VUQ Focus Area.
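
    The following toy sketch illustrates the general statistical machinery named above (Bayesian calibration of multiple sub-model parameters against datasets of different types) using a simple random-walk Metropolis sampler. The model, data, and parameters are invented for illustration; this is not the CASL subcooled-boiling model or its calibration workflow.

    ```python
    # Toy Bayesian calibration: two sub-model parameters (a, b) are inferred
    # jointly from two synthetic datasets of different types, loosely analogous
    # to using both void-fraction profiles and wall-evaporation measurements.
    import numpy as np

    rng = np.random.default_rng(2)
    x1, x2 = np.linspace(0, 1, 20), np.linspace(0, 1, 15)
    a_true, b_true = 1.5, 0.4
    data1 = a_true * x1 + rng.normal(0, 0.05, x1.size)          # "large-scale" data
    data2 = b_true * np.exp(x2) + rng.normal(0, 0.05, x2.size)  # "small-scale" data

    def log_post(theta):
        a, b = theta
        if not (0 < a < 5 and 0 < b < 5):        # flat prior on a box
            return -np.inf
        r1 = data1 - a * x1
        r2 = data2 - b * np.exp(x2)
        return -0.5 * (np.sum(r1**2) + np.sum(r2**2)) / 0.05**2

    theta, lp, samples = np.array([1.0, 1.0]), None, []
    lp = log_post(theta)
    for _ in range(20000):                        # random-walk Metropolis
        prop = theta + rng.normal(0, 0.05, 2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())

    post = np.array(samples[5000:])               # discard burn-in
    print("posterior means:", post.mean(axis=0), "(true:", a_true, b_true, ")")
    ```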

  13. Advanced Welding Concepts

    NASA Technical Reports Server (NTRS)

    Ding, Robert J.

    2010-01-01

    Four advanced welding techniques and their use in NASA are briefly reviewed in this poster presentation. The welding techniques reviewed are: Solid State Welding, Friction Stir Welding (FSW), Thermal Stir Welding (TSW) and Ultrasonic Stir Welding.

  14. Use of Statistical Analyses in the Ophthalmic Literature

    PubMed Central

    Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.

    2014-01-01

    Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977

  15. Sensor failure detection system. [for the F100 turbofan engine

    NASA Technical Reports Server (NTRS)

    Beattie, E. C.; Laprad, R. F.; Mcglone, M. E.; Rock, S. M.; Akhter, M. M.

    1981-01-01

    Advanced concepts for detecting, isolating, and accommodating sensor failures were studied to determine their applicability to the gas turbine control problem. Five concepts were formulated based upon such techniques as Kalman filters and a screening process led to the selection of one advanced concept for further evaluation. The selected advanced concept uses a Kalman filter to generate residuals, a weighted sum square residuals technique to detect soft failures, likelihood ratio testing of a bank of Kalman filters for isolation, and reconfiguring of the normal mode Kalman filter by eliminating the failed input to accommodate the failure. The advanced concept was compared to a baseline parameter synthesis technique. The advanced concept was shown to be a viable concept for detecting, isolating, and accommodating sensor failures for the gas turbine applications.

  16. Targeted Muscle Reinnervation for Transradial Amputation: Description of Operative Technique.

    PubMed

    Morgan, Emily N; Kyle Potter, Benjamin; Souza, Jason M; Tintle, Scott M; Nanos, George P

    2016-12-01

    Targeted muscle reinnervation (TMR) is a revolutionary surgical technique that, together with advances in upper extremity prostheses and advanced neuromuscular pattern recognition, allows intuitive and coordinated control in multiple planes of motion for shoulder disarticulation and transhumeral amputees. TMR also may provide improvement in neuroma-related pain and may represent an opportunity for sensory reinnervation as advances in prostheses and haptic feedback progress. Although most commonly utilized following shoulder disarticulation and transhumeral amputations, TMR techniques also represent an exciting opportunity for improvement in integrated prosthesis control and neuroma-related pain improvement in patients with transradial amputations. As there are no detailed descriptions of this technique in the literature to date, we provide our surgical technique for TMR in transradial amputations.

  17. Joint Data Assimilation and Parameter Calibration in on-line groundwater modelling using Sequential Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Ramgraber, M.; Schirmer, M.

    2017-12-01

    As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013, Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km to the East of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. Kantas, N., Doucet, A., Singh, S.S., Maciejowski, J., and Chopin, N. (2015): On Particle Methods for Parameter Estimation in State-Space Models. Statistical Science, 30 (3), p. 328.-351.
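
    A minimal sketch of the kind of Sequential Monte Carlo scheme discussed above: a bootstrap particle filter that augments the state with a static parameter and jitters it slightly at each step, a common pragmatic compromise rather than a rigorous algorithm such as SMC2. The linear reservoir model, noise levels, and numbers are assumptions for illustration only.

    ```python
    # Toy joint state-parameter estimation with a bootstrap particle filter.
    # State h (e.g., groundwater head) evolves as h_t = k*h_{t-1} + u_t + noise,
    # with unknown recession parameter k; noisy observations of h arrive online.
    import numpy as np

    rng = np.random.default_rng(3)
    T, N = 100, 2000
    k_true = 0.85
    u = 0.5 * np.ones(T)                      # forcing (e.g., recharge), assumed known
    h_true = np.zeros(T)
    for t in range(1, T):
        h_true[t] = k_true * h_true[t-1] + u[t] + rng.normal(0, 0.05)
    obs = h_true + rng.normal(0, 0.2, T)      # sensor measurements

    h = rng.normal(0, 1, N)                   # state particles
    k = rng.uniform(0.5, 1.0, N)              # parameter particles
    for t in range(1, T):
        k = np.clip(k + rng.normal(0, 0.005, N), 0.0, 1.0)   # parameter jitter
        h = k * h + u[t] + rng.normal(0, 0.05, N)            # propagate state
        w = np.exp(-0.5 * ((obs[t] - h) / 0.2) ** 2)         # observation likelihood
        w /= w.sum()
        idx = rng.choice(N, size=N, p=w)                     # resample
        h, k = h[idx], k[idx]

    print("posterior mean of k:", round(float(k.mean()), 3), "(true:", k_true, ")")
    ```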

  18. Advanced magnetic resonance imaging of neurodegenerative diseases.

    PubMed

    Agosta, Federica; Galantucci, Sebastiano; Filippi, Massimo

    2017-01-01

    Magnetic resonance imaging (MRI) is playing an increasingly important role in the study of neurodegenerative diseases, delineating the structural and functional alterations determined by these conditions. Advanced MRI techniques are of special interest for their potential to characterize the signature of each neurodegenerative condition and aid both the diagnostic process and the monitoring of disease progression. This aspect will become crucial when disease-modifying (personalized) therapies will be established. MRI techniques are very diverse and go from the visual inspection of MRI scans to more complex approaches, such as manual and automatic volume measurements, diffusion tensor MRI, and functional MRI. All these techniques allow us to investigate the different features of neurodegeneration. In this review, we summarize the most recent advances concerning the use of MRI in some of the most important neurodegenerative conditions, putting an emphasis on the advanced techniques.

  19. Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures

    DTIC Science & Technology

    1980-04-01

    AFWAL-TR-80.3019, AD A 090553. Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures, Final Report, Ian Holehouse, Rohr Industries. Contents include: general sonic fatigue theory; composite laminate analysis; preliminary sonic fatigue... overall sonic fatigue design guides. These existing design methods have been developed for metal structures. However, recent advanced composite...

  20. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  1. Advances in top-down and bottom-up surface nanofabrication: techniques, applications & future prospects.

    PubMed

    Biswas, Abhijit; Bayer, Ilker S; Biris, Alexandru S; Wang, Tao; Dervishi, Enkeleda; Faupel, Franz

    2012-01-15

    This review highlights the most significant advances of the nanofabrication techniques reported over the past decade with a particular focus on the approaches tailored towards the fabrication of functional nano-devices. The review is divided into two sections: top-down and bottom-up nanofabrication. Under the classification of top-down, special attention is given to technical reports that demonstrate multi-directional patterning capabilities less than or equal to 100 nm. These include recent advances in lithographic techniques, such as optical, electron beam, soft, nanoimprint, scanning probe, and block copolymer lithography. Bottom-up nanofabrication techniques--such as atomic layer deposition, sol-gel nanofabrication, molecular self-assembly, vapor-phase deposition and DNA-scaffolding for nanoelectronics--are also discussed. Specifically, we describe advances in the fabrication of functional nanocomposites and graphene using chemical and physical vapor deposition. Our aim is to provide a comprehensive platform for prominent nanofabrication tools and techniques in order to facilitate the development of new or hybrid nanofabrication techniques leading to novel and efficient functional nanostructured devices. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. A New Scoring System to Predict the Risk for High-risk Adenoma and Comparison of Existing Risk Calculators.

    PubMed

    Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J

    2017-04-01

    Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk, 35% of Americans are not up to date with screening, and there is growing incidence of CRC in younger patients. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, including an expanded definition of high-risk polyps (≥3 nonadvanced adenomas), thereby identifying patients at higher than average risk. We also compared results with previously created calculators. Patients aged 40 to 59 years undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of the concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas, and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistic results, although one performed similarly. Our model compares favorably to previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. It also reports absolute predictive probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
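
    A minimal sketch of how a c-statistic (equivalently, the area under the ROC curve) might be computed for a logistic prediction model of this kind; the predictors, coefficients, and data below are placeholders, not the study's cohort or fitted model.

    ```python
    # Illustrative c-statistic for a logistic model predicting high-risk adenomas.
    # Continuous age and BMI plus binary sex/smoking are hypothetical predictors.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n = 5000
    age = rng.uniform(40, 59, n)
    bmi = rng.normal(28, 5, n)
    male = rng.integers(0, 2, n)
    smoker = rng.integers(0, 2, n)
    logit = -9 + 0.08 * age + 0.05 * bmi + 0.4 * male + 0.5 * smoker
    y = rng.random(n) < 1 / (1 + np.exp(-logit))      # synthetic outcome

    X = np.column_stack([age, bmi, male, smoker])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    c_stat = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"c-statistic = {c_stat:.3f}")
    ```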

  3. Descriptive Statistical Techniques for Librarians. 2nd Edition.

    ERIC Educational Resources Information Center

    Hafner, Arthur W.

    A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…

  4. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques are building blocks to comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.

  5. A newly developed spinal simulator.

    PubMed

    Chester, R; Watson, M J

    2000-11-01

    A number of studies indicate poor intra-therapist and inter-therapist reliability in the performance of graded, passive oscillatory movements to the lumbar spine. However, it has been suggested that therapists can be trained to be more consistent in their performance of these techniques if given reliable quantitative feedback. The intention of this study was to develop equipment, analogous to the lumbar spine that could be used for both teaching and research purposes. Equipment has been updated and connected to a personal IBM compatible computer. Custom designed software allows concurrent and accurate feedback to students on their performance and in a form suitable for advanced data analysis using statistical packages. The uses and implications of this equipment are discussed. Copyright 2000 Harcourt Publishers Ltd.

  6. Aging and Family Life: A Decade Review

    PubMed Central

    Silverstein, Merril; Giarrusso, Roseann

    2010-01-01

    In this review, we summarize and critically evaluate the major empirical, conceptual, and theoretical directions that studies of aging families have taken during the first decade of the 21st century. The field has benefited from an expanded perspective based on four overarching themes: (a) complexity in emotional relations, (b) diversity in family structures and households, (c) interdependence of family roles and functions, and (d) patterns and outcomes of caregiving. Although research on aging families has advanced theory and applied innovative statistical techniques, the literature has fallen short in fully representing diverse populations and in applying the broadest set of methodological tools available. We discuss these and other frontier areas of scholarship in light of the aging of baby boomers and their families. PMID:22930600

  7. Factors influencing the consumption of alcohol and tobacco: the use and abuse of economic models.

    PubMed

    Godfrey, C

    1989-10-01

    This paper is concerned with the use of economic models in the debate about the role that tax increases and restrictions on advertising should play in reducing the health problems that arise from the consumption of alcohol and tobacco. It is argued that properly specified demand models that take account of all the important factors that influence consumption are required, otherwise inadequate modelling may lead to misleading estimates of the effects of policy changes. The ability of economics to deal with goods such as alcohol and tobacco that have addictive characteristics receives special attention. Recent advances in economic theory, estimation techniques and statistical testing are discussed, as is the problem of identifying policy recommendations from empirical results.

  8. Mathematics and the surgeon.

    PubMed Central

    Crank, J.

    1976-01-01

    The surgeon uses elementary mathematics just as much as any other educated layman. In his professional life, however, much of the knowledge and skill on which he relies has had a mathematical strand in its development, possibly woven into the supporting disciplines such as physics, chemistry, biology, and bioengineering. The values and limitations of mathematical models are examined briefly in the general medical field and particularly in relation to the surgeon. Arithmetic and statistics are usually regarded as the most immediately useful parts of mathematics. Examples are cited, however, of medical postgraduate work which uses other highly advanced mathematical techniques. The place of mathematics in postgraduate and postexperience teaching courses is touched on. The role of a mathematical consultant in the medical team is discussed. PMID:942167

  9. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  10. InSAR Tropospheric Correction Methods: A Statistical Comparison over Different Regions

    NASA Astrophysics Data System (ADS)

    Bekaert, D. P.; Walters, R. J.; Wright, T. J.; Hooper, A. J.; Parker, D. J.

    2015-12-01

    Observing small magnitude surface displacements through InSAR is highly challenging, and requires advanced correction techniques to reduce noise. In fact, one of the largest obstacles facing the InSAR community is related to tropospheric noise correction. Spatial and temporal variations in temperature, pressure, and relative humidity result in a spatially-variable InSAR tropospheric signal, which masks smaller surface displacements due to tectonic or volcanic deformation. Correction methods applied today include those relying on weather model data, GNSS and/or spectrometer data. Unfortunately, these methods are often limited by the spatial and temporal resolution of the auxiliary data. Alternatively, a correction can be estimated from the high-resolution interferometric phase by assuming a linear or a power-law relationship between the phase and topography. For these methods, the challenge lies in separating deformation from tropospheric signals. We will present results of a statistical comparison of the state-of-the-art tropospheric corrections estimated from spectrometer products (MERIS and MODIS), a low and high spatial-resolution weather model (ERA-I and WRF), and both the conventional linear and power-law empirical methods. We evaluate the correction capability over Southern Mexico, Italy, and El Hierro, and investigate the impact of increasing cloud cover on the accuracy of the tropospheric delay estimation. We find that each method has its strengths and weaknesses, and suggest that further developments should aim to combine different correction methods. All the presented methods are included in our new open-source software package called TRAIN - Toolbox for Reducing Atmospheric InSAR Noise (Bekaert et al., in review), which is available to the community. Bekaert, D., R. Walters, T. Wright, A. Hooper, and D. Parker (in review), Statistical comparison of InSAR tropospheric correction techniques, Remote Sensing of Environment.
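
    The conventional linear phase-topography correction that the abstract contrasts with the other methods can be illustrated in a few lines. This is a minimal sketch on assumed synthetic data, not part of TRAIN; the array names and coefficient values are invented for illustration.

```python
# Minimal sketch (assumed synthetic data): the conventional "linear" empirical
# correction -- regress unwrapped interferometric phase against topography and
# subtract the fitted, topography-correlated tropospheric component.
import numpy as np

rng = np.random.default_rng(7)
elevation = rng.uniform(0.0, 2500.0, 10_000)               # DEM heights (m), synthetic
phase = 0.002 * elevation + rng.normal(0.0, 0.5, 10_000)   # unwrapped phase with a topography-correlated delay

k, c = np.polyfit(elevation, phase, deg=1)                 # linear phase-topography coefficient
corrected = phase - (k * elevation + c)                    # remove the estimated tropospheric signal
print(f"estimated K = {k:.5f} rad/m, residual std = {corrected.std():.3f} rad")
```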

  11. Discrimination of soft tissues using laser-induced breakdown spectroscopy in combination with k nearest neighbors (kNN) and support vector machine (SVM) classifiers

    NASA Astrophysics Data System (ADS)

    Li, Xiaohui; Yang, Sibo; Fan, Rongwei; Yu, Xin; Chen, Deying

    2018-06-01

    In this paper, discrimination of soft tissues using laser-induced breakdown spectroscopy (LIBS) in combination with multivariate statistical methods is presented. Fresh pork fat, skin, ham, loin and tenderloin muscle tissues are manually cut into slices and ablated using a 1064 nm pulsed Nd:YAG laser. Discrimination analyses between fat, skin and muscle tissues, and further between highly similar ham, loin and tenderloin muscle tissues, are performed based on the LIBS spectra in combination with multivariate statistical methods, including principal component analysis (PCA), k nearest neighbors (kNN) classification, and support vector machine (SVM) classification. Performances of the discrimination models, including accuracy, sensitivity and specificity, are evaluated using 10-fold cross validation. The classification models are optimized to achieve best discrimination performances. The fat, skin and muscle tissues can be definitely discriminated using both kNN and SVM classifiers, with accuracy of over 99.83%, sensitivity of over 0.995 and specificity of over 0.998. The highly similar ham, loin and tenderloin muscle tissues can also be discriminated with acceptable performances. The best performances are achieved with SVM classifier using Gaussian kernel function, with accuracy of 76.84%, sensitivity of over 0.742 and specificity of over 0.869. The results show that the LIBS technique assisted with multivariate statistical methods could be a powerful tool for online discrimination of soft tissues, even for tissues of high similarity, such as muscles from different parts of the animal body. This technique could be used for discrimination of tissues suffering minor clinical changes, thus may advance the diagnosis of early lesions and abnormalities.
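
    A minimal sketch of the kind of multivariate pipeline the abstract describes (PCA followed by kNN and Gaussian-kernel SVM classification, scored by 10-fold cross-validation), assuming scikit-learn and synthetic spectra in place of real LIBS data:

```python
# Hedged sketch: tissue-class discrimination from spectra with kNN and an RBF-kernel
# SVM, evaluated by 10-fold cross-validation. The spectra and labels are synthetic
# placeholders, not the LIBS measurements from the study.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 500))          # 150 spectra x 500 wavelength channels (synthetic)
y = np.repeat([0, 1, 2], 50)             # three tissue classes, e.g. fat / skin / muscle

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM (RBF)", SVC(kernel="rbf", gamma="scale"))]:
    model = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    scores = cross_val_score(model, X, y, cv=10)   # 10-fold CV accuracy
    print(f"{name}: accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```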

  12. Learning Outcomes in a Laboratory Environment vs. Classroom for Statistics Instruction: An Alternative Approach Using Statistical Software

    ERIC Educational Resources Information Center

    McCulloch, Ryan Sterling

    2017-01-01

    The role of any statistics course is to increase the understanding and comprehension of statistical concepts and those goals can be achieved via both theoretical instruction and statistical software training. However, many introductory courses either forego advanced software usage, or leave its use to the student as a peripheral activity. The…

  13. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    ERIC Educational Resources Information Center

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  14. [Value of laparoscopic virtual reality simulator in laparoscopic suture ability training of catechumen].

    PubMed

    Cai, Jian-liang; Zhang, Yi; Sun, Guo-feng; Li, Ning-chen; Zhang, Xiang-hua; Na, Yan-qun

    2012-12-01

    To investigate the value of the laparoscopic virtual reality simulator in laparoscopic suture ability training of catechumen. After finishing the virtual reality training of basic laparoscopic skills, 26 catechumen were divided randomly into 2 groups: one group undertook advanced laparoscopic skill (suture technique) training with the laparoscopic virtual reality simulator (virtual group), while the other used a laparoscopic box trainer (box group). Using our homemade simulations, before grouping and after training, every trainee performed nephropyeloureterostomy under laparoscopy, and the running time, anastomosis quality and proficiency were recorded and assessed. For the virtual group, the running time, anastomosis quality and proficiency scores before grouping were (98 ± 11) minutes, 3.20 ± 0.41 and 3.47 ± 0.64, respectively, and after training were (53 ± 8) minutes, 6.87 ± 0.74 and 6.33 ± 0.82, respectively; all the differences were statistically significant (all P < 0.01). In the box group, the scores before grouping were (98 ± 10) minutes, 3.17 ± 0.39 and 3.42 ± 0.67, respectively, and after training were (52 ± 9) minutes, 6.08 ± 0.90 and 6.33 ± 0.78, respectively; all the differences were also statistically significant (all P < 0.01). After training, the running time and proficiency scores of the virtual group were similar to those of the box group (all P > 0.05); however, anastomosis quality scores in the virtual group were higher than in the box group (P = 0.02). The laparoscopic virtual reality simulator is better than the traditional box trainer for advanced laparoscopic suture ability training of catechumen.

  15. Advanced Manufacturing Processes in the Motor Vehicle Industry

    DOT National Transportation Integrated Search

    1983-05-01

    Advanced manufacturing processes, which include a range of automation and management techniques, are aiding U.S. motor vehicle manufacturers to reduce vehicle costs. This report discusses these techniques in general and their specific applications in...

  16. Advanced wiring technique and hardware application: Airplane and space vehicle

    NASA Technical Reports Server (NTRS)

    Ernst, H. L.; Eichman, C. D.

    1972-01-01

    An advanced wiring system is described which achieves the safety/reliability required for present and future airplane and space vehicle applications. Also, present wiring installation techniques and hardware are analyzed to establish existing problem areas. An advanced wiring system employing matrix interconnecting unit, plug to plug trunk bundles (FCC or ribbon cable) is outlined, and an installation study presented. A planned program to develop, lab test and flight test key features of these techniques and hardware as a part of the SST technology follow-on activities is discussed.

  17. P-Value Club: Teaching Significance Level on the Dance Floor

    ERIC Educational Resources Information Center

    Gray, Jennifer

    2010-01-01

    Courses: Beginning research methods and statistics courses, as well as advanced communication courses that require reading research articles and completing research projects involving statistics. Objective: Students will understand the difference between significant and nonsignificant statistical results based on p-value.

  18. ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)

    EPA Science Inventory

    The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...

  19. The environmental control and life support system advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its use.

  20. Application of advanced cytometric and molecular technologies to minimal residual disease monitoring

    NASA Astrophysics Data System (ADS)

    Leary, James F.; He, Feng; Reece, Lisa M.

    2000-04-01

    Minimal residual disease monitoring presents a number of theoretical and practical challenges. Recently it has been possible to meet some of these challenges by combining a number of new advanced biotechnologies. To monitor the number of residual tumor cells requires complex cocktails of molecular probes that collectively provide sensitivities of detection on the order of one residual tumor cell per million total cells. Ultra-high-speed, multiparameter flow cytometry is capable of analyzing cells at rates in excess of 100,000 cells/sec. Residual tumor selection marker cocktails can be optimized by use of receiver operating characteristic analysis. New data minimizing techniques when combined with multivariate statistical or neural network classifications of tumor cells can more accurately predict residual tumor cell frequencies. The combination of these techniques can, under at least some circumstances, detect frequencies of tumor cells as low as one cell in a million with an accuracy of over 98 percent correct classification. Detection of mutations in tumor suppressor genes requires isolation of these rare tumor cells and single-cell DNA sequencing. Rare residual tumor cells can be isolated at the single-cell level by high-resolution single-cell sorting. Molecular characterization of tumor suppressor gene mutations can be accomplished using a combination of single-cell polymerase chain reaction amplification of specific gene sequences followed by TA cloning techniques and DNA sequencing. Mutations as small as a single base pair in a tumor suppressor gene of a single sorted tumor cell have been detected using these methods. Using new amplification procedures and DNA microarrays it should be possible to extend the capabilities shown in this paper to screening of multiple DNA mutations in tumor suppressor and other genes on small numbers of sorted metastatic tumor cells.

  1. Chi-squared and C statistic minimization for low count per bin data

    NASA Astrophysics Data System (ADS)

    Nousek, John A.; Shue, David R.

    1989-07-01

    Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.

  2. Chi-squared and C statistic minimization for low count per bin data. [sampling in X ray astronomy

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Shue, David R.

    1989-01-01

    Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
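
    A toy illustration of the comparison described in the two records above: fitting a one-parameter model to low-count binned data by minimizing either the chi-squared statistic or Cash's C statistic with Powell's method. The Poisson data and constant-rate model below are invented for illustration and are not from the simulations in the papers.

```python
# Hedged sketch: minimizing the Cash C statistic vs. Pearson chi-squared with
# Powell's method on low-count binned data. Single-parameter constant-rate model
# and synthetic counts are assumptions for illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
counts = rng.poisson(lam=2.0, size=50)           # low counts per bin (synthetic)

def model(rate):
    return np.full(counts.shape, rate)            # expected counts per bin

def cash_c(params):
    m = np.clip(model(params[0]), 1e-12, None)
    return 2.0 * np.sum(m - counts * np.log(m))   # Cash (1979) statistic, up to a data-only constant

def chi2(params):
    m = np.clip(model(params[0]), 1e-12, None)
    return np.sum((counts - m) ** 2 / m)          # Pearson chi-squared

for name, stat in [("C statistic", cash_c), ("chi-squared", chi2)]:
    fit = minimize(stat, x0=[1.0], method="Powell")
    print(f"{name}: best-fit rate = {fit.x[0]:.3f}")
```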

  3. 75 FR 81643 - In the Matter of Certain Semiconductor Products Made by Advanced Lithography Techniques and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... Semiconductor Products Made by Advanced Lithography Techniques and Products Containing Same; Notice of... Mexico) (``STC''), alleging a violation of section 337 in the importation, sale for ...

  4. Adaptive and Optimal Control of Stochastic Dynamical Systems

    DTIC Science & Technology

    2015-09-14

    Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. [4] T. E. Duncan and B. Pasik-Duncan, A...S. N. Cohen, T. K. Siu and H. Yang) Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. 4. T. E...games with general noise processes, Models and Methods in Economics and Management Science: Essays in Honor of Charles S. Tapiero, (eds. F. El

  5. Support Vector Feature Selection for Early Detection of Anastomosis Leakage From Bag-of-Words in Electronic Health Records.

    PubMed

    Soguero-Ruiz, Cristina; Hindberg, Kristian; Rojo-Alvarez, Jose Luis; Skrovseth, Stein Olav; Godtliebsen, Fred; Mortensen, Kim; Revhaug, Arthur; Lindsetmo, Rolv-Ole; Augestad, Knut Magne; Jenssen, Robert

    2016-09-01

    The free text in electronic health records (EHRs) conveys a huge amount of clinical information about health state and patient history. Despite a rapidly growing literature on the use of machine learning techniques for extracting this information, little effort has been invested toward feature selection and the features' corresponding medical interpretation. In this study, we focus on the task of early detection of anastomosis leakage (AL), a severe complication after elective surgery for colorectal cancer (CRC), using free text extracted from EHRs. We use a bag-of-words model to investigate the potential for feature selection strategies. The purpose is earlier detection of AL and prediction of AL with data generated in the EHR before the actual complication occurs. Due to the high dimensionality of the data, we derive feature selection strategies using the robust support vector machine linear maximum margin classifier, by investigating: 1) a simple statistical criterion (leave-one-out-based test); 2) an intensive-computation statistical criterion (Bootstrap resampling); and 3) an advanced statistical criterion (kernel entropy). Results reveal a discriminatory power for early detection of complications after CRC surgery (sensitivity 100%; specificity 72%). These results can be used to develop prediction models, based on EHR data, that can support surgeons and patients in the preoperative decision making phase.
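
    One simple stand-in for the linear maximum-margin feature selection the abstract investigates is to rank bag-of-words features by the magnitude of the weights of a linear SVM. The sketch below assumes scikit-learn; the tiny corpus and labels are synthetic placeholders, not EHR data, and the ranking criterion is a simplification of the three criteria compared in the study.

```python
# Hedged sketch: rank bag-of-words features by |weight| of a linear maximum-margin SVM.
# Corpus and labels are synthetic placeholders.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

notes = ["fever abdominal pain drain output cloudy",
         "routine mobilisation no complaints",
         "tachycardia rising crp abdominal tenderness",
         "discharged in good condition"]
labels = np.array([1, 0, 1, 0])                  # 1 = later anastomosis leakage (synthetic)

vec = CountVectorizer()
X = vec.fit_transform(notes)                     # bag-of-words matrix
svm = LinearSVC(C=1.0).fit(X, labels)

# features with the largest absolute weights contribute most to the margin
order = np.argsort(-np.abs(svm.coef_[0]))
for idx in order[:5]:
    print(vec.get_feature_names_out()[idx], svm.coef_[0][idx])
```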

  6. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4 π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify contributions in the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate different (n, γ) reaction yields on different isotopes that are present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modifications for the DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
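
    The core idea, separating statistically independent additive components from mixtures of spectra, can be sketched with FastICA on synthetic data. The two Gaussian "isotope" components and the mixing matrix below are invented; this is not DANCE data or the collaboration's modified algorithm.

```python
# Hedged sketch: blind separation of two additive spectral components with ICA.
# Synthetic "Esum" spectra stand in for real detector data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
e = np.linspace(0, 10, 400)                        # Esum axis (arbitrary units)
s1 = np.exp(-0.5 * ((e - 6.5) / 0.3) ** 2)         # component peaked near one Q-value
s2 = np.exp(-0.5 * ((e - 7.8) / 0.3) ** 2)         # component peaked near another Q-value
S = np.vstack([s1, s2])

A = rng.uniform(0.2, 1.0, size=(20, 2))            # 20 observed spectra, mixed proportions
X = A @ S + 0.01 * rng.normal(size=(20, 400))      # observations with noise

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X.T).T               # rows approximate the source spectra
print(recovered.shape)                             # (2, 400)
```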

  7. Lifetime Prediction for Degradation of Solar Mirrors using Step-Stress Accelerated Testing (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, J.; Elmore, R.; Kennedy, C.

    This research is to illustrate the use of statistical inference techniques in order to quantify the uncertainty surrounding reliability estimates in a step-stress accelerated degradation testing (SSADT) scenario. SSADT can be used when a researcher is faced with a resource-constrained environment, e.g., limits on chamber time or on the number of units to test. We apply the SSADT methodology to a degradation experiment involving concentrated solar power (CSP) mirrors and compare the results to a more traditional multiple accelerated testing paradigm. Specifically, our work includes: (1) designing a durability testing plan for solar mirrors (3M's new improved silvered acrylic "Solar Reflector Film (SFM) 1100") through the ultra-accelerated weathering system (UAWS), (2) defining degradation paths of optical performance based on the SSADT model which is accelerated by high UV-radiant exposure, and (3) developing service lifetime prediction models for solar mirrors using advanced statistical inference. We use the method of least squares to estimate the model parameters and this serves as the basis for the statistical inference in SSADT. Several quantities of interest can be estimated from this procedure, e.g., mean-time-to-failure (MTTF) and warranty time. The methods allow for the estimation of quantities that may be of interest to the domain scientists.
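
    A toy version of the least-squares degradation fit and lifetime extrapolation the abstract outlines is shown below. The linear loss model, exposure values and failure threshold are assumptions for illustration only, not the SSADT model or 3M test data.

```python
# Hedged sketch: least-squares fit of a simple degradation path and extrapolation
# to a failure threshold. All numbers are synthetic placeholders.
import numpy as np

exposure = np.array([0, 100, 200, 300, 400.0])         # cumulative UV dose (arbitrary units)
loss = np.array([0.0, 0.8, 1.7, 2.3, 3.2])             # % reflectance loss (synthetic)

# linear degradation model: loss = b0 + b1 * exposure, fitted by least squares
b1, b0 = np.polyfit(exposure, loss, deg=1)
threshold = 5.0                                         # % loss defining "failure" (assumed)
dose_to_failure = (threshold - b0) / b1
print(f"slope = {b1:.4f} %/unit dose, projected dose to failure = {dose_to_failure:.0f}")
```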

  8. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    PubMed

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources.

  9. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education

    PubMed Central

    Christou, Nicolas; Dinov, Ivo D.

    2011-01-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources. PMID:21603097

  10. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    NASA Technical Reports Server (NTRS)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI) require increasingly dense high power electronics. To enable these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  11. Effects of advanced treatment of municipal wastewater on the White River near Indianapolis, Indiana; trends in water quality, 1978-86

    USGS Publications Warehouse

    Crawford, Charles G.; Wangsness, David J.

    1993-01-01

    The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-1980) and post-advanced- (1983--86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
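
    The rank-sum comparison underlying the trend analysis can be sketched as follows. This assumes scipy and synthetic ammonia concentrations; the published procedure is a modified Wilcoxon-Mann-Whitney test that also accounts for factors such as flow and season, which this toy example omits.

```python
# Hedged sketch: rank-sum comparison of pre- vs post-upgrade water-quality samples.
# The concentration values are synthetic, not USGS data.
import numpy as np
from scipy.stats import mannwhitneyu

pre = np.array([2.4, 1.9, 3.1, 2.8, 2.2, 2.6])     # total ammonia, 1978-80 (mg/L, synthetic)
post = np.array([0.6, 0.9, 0.4, 0.7, 0.5, 0.8])    # total ammonia, 1983-86 (mg/L, synthetic)

stat, p = mannwhitneyu(pre, post, alternative="greater")
print(f"U = {stat:.1f}, one-sided p = {p:.4f}")    # small p suggests a decreasing trend
```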

  12. The science of teams in the military: Contributions from over 60 years of research.

    PubMed

    Goodwin, Gerald F; Blacksmith, Nikki; Coats, Meredith R

    2018-01-01

    Teams are the foundational building blocks of the military, which uses a hierarchical structure built on and around teams to form larger units. Consequently, team effectiveness has been a substantial focus of research within the military for decades to ensure military teams have the human capabilities to complete their missions and address future challenges successfully. This research has contributed greatly to broader team theory and informed the development of evidence-based interventions. Team-focused research supported or executed by the military has yielded major insights into the nature of team performance, advanced the methods for measuring and improving team performance, and broken new ground in understanding the assembly of effective teams. Furthermore, military research has made major contributions to advancing methodological and statistical techniques for studying teams. We highlight the military contributions to the broader team literature and conclude with a discussion of critical areas of future research on teams and enduring challenges for both the military and team science as a whole. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  14. The logical foundations of forensic science: towards reliable knowledge.

    PubMed

    Evett, Ian

    2015-08-05

    The generation of observations is a technical process and the advances that have been made in forensic science techniques over the last 50 years have been staggering. But science is about reasoning-about making sense from observations. For the forensic scientist, this is the challenge of interpreting a pattern of observations within the context of a legal trial. Here too, there have been major advances over recent years and there is a broad consensus among serious thinkers, both scientific and legal, that the logical framework is furnished by Bayesian inference (Aitken et al. Fundamentals of Probability and Statistical Evidence in Criminal Proceedings). This paper shows how the paradigm has matured, centred on the notion of the balanced scientist. Progress through the courts has not been always smooth and difficulties arising from recent judgments are discussed. Nevertheless, the future holds exciting prospects, in particular the opportunities for managing and calibrating the knowledge of the forensic scientists who assign the probabilities that are at the foundation of logical inference in the courtroom. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  15. Advances in primate stable isotope ecology-Achievements and future prospects.

    PubMed

    Crowley, Brooke E; Reitsema, Laurie J; Oelze, Vicky M; Sponheimer, Matt

    2016-10-01

    Stable isotope biogeochemistry has been used to investigate foraging ecology in non-human primates for nearly 30 years. Whereas early studies focused on diet, more recently, isotopic analysis has been used to address a diversity of ecological questions ranging from niche partitioning to nutritional status to variability in life history traits. With this increasing array of applications, stable isotope analysis stands to make major contributions to our understanding of primate behavior and biology. Most notably, isotopic data provide novel insights into primate feeding behaviors that may not otherwise be detectable. This special issue brings together some of the recent advances in this relatively new field. In this introduction to the special issue, we review the state of isotopic applications in primatology and its origins and describe some developing methodological issues, including techniques for analyzing different tissue types, statistical approaches, and isotopic baselines. We then discuss the future directions we envision for the field of primate isotope ecology. Am. J. Primatol. 78:995-1003, 2016. © 2015 Wiley Periodicals, Inc.

  16. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    ERIC Educational Resources Information Center

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  17. Directed assembly of hybrid nanostructures using optically resonant nanotweezers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, David

    This represents the final report for this project. Over the course of the project we have made significant progress in photonically driven nano-assembly including: (1) demonstrating the first direct optical tweezer based manipulation of proteins, (2) the ability to apply optical angular torques to microtubules and other rod-shaped microparticles, (3) direct assembly of hybrid nanostructures comprising polymeric nanoparticles and carbon nanotubes, and (4) the ability to drive biological reactions (specifically protein aggregation) that are thermodynamically unfavorable by applying localized optical work. These advancements are described in the list of papers provided in section 2.0 below. Summary details are provided in prior year annual reports. We have two additional papers which will be submitted shortly based on the work done under this award. An updated publication list will be provided to the program manager when those are accepted. In this report, we describe a new advancement made in the final project year, which uses the nanotweezer technology to perform direct measurements of particle-surface interactions. Briefly, these measurements are important for characterizing the stability and behavior of colloidal and nanoparticle suspensions, and current techniques are limited in their ability to measure piconewton scale interaction forces on sub-micrometer particles due to signal detection limits and thermal noise. In this project year we developed a new technique called "Nanophotonic Force Microscopy" which uses the localized region of exponentially decaying near-field light to confine small particles close to a surface. From the statistical distribution of the light intensity scattered by the particle, the technique maps out the potential well of the trap and directly quantifies the repulsive force between the nanoparticle and the surface. The major advantage of the technique is that it can measure forces and energy wells below the thermal noise limit, resolving interaction forces smaller than 1 pN on dielectric particles as small as 100 nm in diameter.

  18. Satisfaction, function and repair integrity after arthroscopic versus mini-open rotator cuff repair.

    PubMed

    Barnes, L A Fink; Kim, H M; Caldwell, J-M; Buza, J; Ahmad, C S; Bigliani, L U; Levine, W N

    2017-02-01

    Advances in arthroscopic techniques for rotator cuff repair have made the mini-open approach less popular. However, the mini-open approach remains an important repair technique for many surgeons. The aims of this study were to compare the integrity of the repair, the function of the shoulder and satisfaction post-operatively using these two techniques in patients aged > 50 years. We identified 22 patients treated with mini-open and 128 patients treated with arthroscopic rotator cuff repair between July 2007 and June 2011. The mean follow-up was two years (1 to 5). Outcome was assessed using the American Shoulder and Elbow Surgeons (ASES) and Simple Shoulder Test (SST) scores, and satisfaction. The integrity of the repair was assessed using ultrasonography. A power analysis ensured sufficient enrolment. There was no statistically significant difference between the age, function, satisfaction, or pain scores (p > 0.05) of the two groups. The integrity of the repair and the mean SST scores were significantly better in the mini-open group (91% of mini-open repairs were intact versus 60% of arthroscopic repairs, p = 0.023; mean SST score 10.9 (standard deviation (sd) 1.3) in the mini-open group; 8.9 (sd 3.5) in arthroscopic group; p = 0.003). The ASES scores were also higher in the mini-open group (mean ASES score 91.0 (sd 10.5) in mini-open group; mean 82.70 (sd 19.8) in the arthroscopic group; p = 0.048). The integrity of the repair and function of the shoulder were better after a mini-open repair than after arthroscopic repair of a rotator cuff tear in these patients. The functional difference did not translate into a difference in satisfaction. Mini-open rotator cuff repair remains a useful technique despite advances in arthroscopy. Cite this article: Bone Joint J 2017;99-B:245-9. ©2017 The British Editorial Society of Bone & Joint Surgery.

  19. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R.; Kaplan, A.

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm. PSD advances can be improved by using conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
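
    Reporting a ROC curve together with performance at a chosen gamma rejection rate might look like the sketch below. It assumes scikit-learn and synthetic PSD scores, and it treats the GRR loosely as the tolerated gamma misidentification rate; none of this is from the report itself.

```python
# Hedged sketch: ROC curve for a PSD score and the neutron acceptance at an
# assumed gamma rejection rate. Scores and labels are synthetic placeholders,
# and "GRR" is interpreted here as the tolerated gamma misidentification rate.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
labels = np.concatenate([np.zeros(5000), np.ones(5000)])        # 0 = gamma, 1 = neutron
scores = np.concatenate([rng.normal(0.0, 1.0, 5000),            # PSD score for gammas
                         rng.normal(1.5, 1.0, 5000)])           # PSD score for neutrons

fpr, tpr, thr = roc_curve(labels, scores)
grr_target = 1e-3                                               # gamma misidentification rate of interest
i = np.searchsorted(fpr, grr_target)
print(f"neutron acceptance at GRR {grr_target}: {tpr[i]:.3f} (threshold {thr[i]:.2f})")
```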

  1. Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions

    ERIC Educational Resources Information Center

    Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.

    2006-01-01

    In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…

  2. Change Detection in Rough Time Series

    DTIC Science & Technology

    2014-09-01

    Business Statistics: An Inferential Approach, Dellen: San Francisco. [18] Winston, W. (1997) Operations Research Applications and Algorithms, Duxbury...distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem the proposed method...applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the

  3. Enhancing Students' Ability to Use Statistical Reasoning with Everyday Problems

    ERIC Educational Resources Information Center

    Lawson, Timothy J.; Schwiers, Michael; Doellman, Maureen; Grady, Greg; Kelnhofer, Robert

    2003-01-01

    We discuss a technique for teaching students everyday applications of statistical concepts. We used this technique with students (n = 50) enrolled in several sections of an introductory statistics course; students (n = 45) in other sections served as a comparison group. A class of introductory psychology students (n = 24) served as a second…

  4. Nuclear magnetic resonance (NMR)-based metabolomics for cancer research.

    PubMed

    Ranjan, Renuka; Sinha, Neeraj

    2018-05-07

    Nuclear magnetic resonance (NMR) has emerged as an effective tool in various spheres of biomedical research, amongst which metabolomics is an important method for the study of various types of disease. Metabolomics has established a strong foothold in cancer research through the development of different NMR methods over time for the study of metabolites, thus identifying key players in the aetiology of cancer. A plethora of one-dimensional and two-dimensional NMR experiments (in solids, semi-solids and solution phases) are utilized to obtain metabolic profiles of biofluids, cell extracts and tissue biopsy samples, which can further be subjected to statistical analysis. Any alteration in the assigned metabolite peaks gives an indication of changes in metabolic pathways. These defined changes demonstrate the utility of NMR in the early diagnosis of cancer and provide further measures to combat malignancy and its progression. This review provides a snapshot of the trending NMR techniques and the statistical analysis involved in the metabolomics of diseases, with emphasis on advances in NMR methodology developed for cancer research. Copyright © 2018 John Wiley & Sons, Ltd.

  5. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
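
    One plausible reading of the statistical interpretation step, clustering the grid of spot analyses so that each cluster yields a phase composition (cluster mean) and abundance (fraction of spots), can be sketched as follows. The oxide values, cluster count and use of scikit-learn are assumptions, not the authors' exact procedure.

```python
# Hedged sketch: k-means clustering of microprobe spot analyses to recover phase
# composition and abundance simultaneously. The grid data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
# synthetic grid of 300 spot analyses x 4 oxides (wt%), drawn around two "phases"
phase_a = rng.normal([65, 22, 5, 8], 1.0, size=(200, 4))
phase_b = rng.normal([55, 25, 12, 8], 1.0, size=(100, 4))
spots = np.vstack([phase_a, phase_b])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spots)
for k in range(2):
    members = spots[km.labels_ == k]
    abundance = len(members) / len(spots)                # fraction of spots assigned to this phase
    print(f"phase {k}: abundance = {abundance:.2f}, mean composition = {members.mean(axis=0).round(1)}")
```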

  6. Technical Note: The Initial Stages of Statistical Data Analysis

    PubMed Central

    Tandy, Richard D.

    1998-01-01

    Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
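
    The note's pairing of measurement level with test family can be condensed into a small lookup, shown below as a teaching simplification rather than a complete decision rule; the specific tests named are common choices, not an exhaustive mapping.

```python
# Hedged sketch: a toy lookup pairing each level of measurement with a commonly
# used two-group test, following the nonparametric vs parametric split described
# in the note. This is a teaching simplification, not a complete decision rule.
TEST_BY_LEVEL = {
    "nominal":  "chi-square test of independence (nonparametric)",
    "ordinal":  "Mann-Whitney U rank-sum test (nonparametric)",
    "interval": "independent-samples t test (parametric)",
    "ratio":    "independent-samples t test (parametric)",
}

for level, test in TEST_BY_LEVEL.items():
    print(f"{level:>8}: {test}")
```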

  7. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  8. Advanced Welding Applications

    NASA Technical Reports Server (NTRS)

    Ding, Robert J.

    2010-01-01

    Some of the applications of advanced welding techniques are shown in this poster presentation. Included are brief explanations of the use on the Ares I and Ares V launch vehicle and on the Space Shuttle Launch vehicle. Also included are microstructural views from four advanced welding techniques: Variable Polarity Plasma Arc (VPPA) weld (fusion), self-reacting friction stir welding (SR-FSW), conventional FSW, and Tube Socket Weld (TSW) on aluminum.

  9. Getting There: Despite Their Prevalence in Advancement, Women Still Trail Men in Pay and Titles

    ERIC Educational Resources Information Center

    Scully, Maura King

    2011-01-01

    Advancement is a women-dominated profession. The numbers say so: Approximately two-thirds of Council for Advancement and Support of Education (CASE) members are women, and one-third are men. What does this mean for women and the advancement profession as a whole? As anyone who has ever analyzed statistics can tell, it depends. The numbers…

  10. Anterior Urethral Advancement as a Single-Stage Technique for Repair of Anterior Hypospadias: Our Experience.

    PubMed

    Gite, Venkat A; Nikose, Jayant V; Bote, Sachin M; Patil, Saurabh R

    2017-07-02

    Many techniques have been described to correct anterior hypospadias, with variable results. Anterior urethral advancement as a single-stage technique was first described by Ti Chang Shing in 1984. It has also been used for the repair of strictures and urethrocutaneous fistulae involving the distal urethra. We report our experience of using this technique, with some modification, for the repair of anterior hypospadias. Between 2013 and 2015, 20 cases of anterior hypospadias, including 2 cases of glanular, 3 cases of coronal, 12 cases of subcoronal and 3 cases of distal penile hypospadias, were treated with the anterior urethral advancement technique. Patients' ages ranged from 18 months to 10 years. Postoperatively, patients passed urine from the tip of the neomeatus with a satisfactory stream during a follow-up period of 6 months to 2 years. There were no major complications in any of our patients except one, who developed meatal stenosis that was treated by periodic dilatation. Threefold urethral mobilization was sufficient in all cases. The anterior urethral advancement technique is a single-stage procedure with good cosmetic results and minimal complications for anterior hypospadias repair in properly selected cases.

  11. Contributions to advances in blend pellet products (BPP) research on molecular structure and molecular nutrition interaction by advanced synchrotron and globar molecular (Micro)spectroscopy.

    PubMed

    Guevara-Oquendo, Víctor H; Zhang, Huihua; Yu, Peiqiang

    2018-04-13

    To date, advanced synchrotron-based and globar-sourced techniques are almost unknown to food and feed scientists, and there has been little application of these advanced techniques to the study of blend pellet products at a molecular level. This article reviews recent research on how advanced synchrotron and globar vibrational molecular (micro)spectroscopy has contributed to blend pellet product research on molecular structure and molecular nutrition interaction, and on how processing-induced changes in molecular structure relate to nutrient availability and utilization of the blend pellet products. The review covers utilization of co-product components for blend pellet products in North America; utilization and benefits of including pulse screenings; utilization of additives in blend pellet products; application of pellet processing in blend pellet products; and conventional evaluation techniques and methods for blend pellet products. It focuses on recent applications of cutting-edge vibrational molecular spectroscopy to molecular structure and its association with nutrient utilization in blend pellet products. The information described in this article gives better insight into how advanced molecular (micro)spectroscopy has contributed to advances in blend pellet product research on molecular structure and molecular nutrition interaction.

  12. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble, which is a case-matching scheme. The presentation will provide (1) an overview of each method and the experimental design, (2) performance comparisons based on standard metrics such as bias, MAE and RMSE, (3) a summary of the performance characteristics of each approach and (4) a preview of further experiments to be conducted.
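
    A cut-down version of such a MOS-method comparison, using three of the six listed approaches (multiple linear regression, random forests, gradient boosted regression) on synthetic predictors and observations, might look like the sketch below. It assumes scikit-learn and is not the experiment's actual code or data.

```python
# Hedged sketch: comparing a multiple-linear-regression baseline against two
# decision-tree ensemble regressors for MOS-style correction of raw forecasts.
# Predictors and observed generation are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 5))                    # NWP predictors (e.g. wind speed, direction, ...)
y = 0.8 * X[:, 0] + 0.2 * np.sin(X[:, 1]) + 0.1 * rng.normal(size=2000)  # observed generation (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
models = {"linear regression": LinearRegression(),
          "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
          "gradient boosting": GradientBoostingRegressor(random_state=0)}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: MAE = {mean_absolute_error(y_te, pred):.3f}")
```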

  13. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its related computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
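
    The simplest of the listed uncertainty quantification techniques, Monte Carlo sampling, can be illustrated with an energy-cone-style runout relation. The input distributions and exceedance threshold below are invented for illustration and are not calibrated to any volcano or to the studies' setups.

```python
# Hedged sketch: Monte Carlo estimate of the probability that a PDC runout exceeds
# a given distance, using the energy-cone relation runout = collapse_height / tan(phi).
# All distributions and the threshold are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
height = rng.uniform(500.0, 2000.0, n)             # collapse height above the vent (m), assumed
phi = np.radians(rng.uniform(8.0, 20.0, n))        # energy-cone friction angle (deg -> rad), assumed

runout = height / np.tan(phi)                       # horizontal runout (m)
threshold = 8_000.0                                 # distance of the exposed area (m), assumed
p_exceed = np.mean(runout > threshold)
print(f"P(runout > {threshold/1000:.0f} km) approx {p_exceed:.3f}")
```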

  14. Outcomes of Thoracic Endovascular Aortic Repair and Subclavian Revascularization Techniques

    PubMed Central

    Zamor, Kimberly C; Eskandari, Mark K; Rodriguez, Heron E; Ho, Karen J; Morasch, Mark D; Hoel, Andrew W

    2015-01-01

    Background: Practice guidelines regarding management of the left subclavian artery (LSA) during thoracic endovascular aortic repair (TEVAR) are based on low quality evidence and there is limited literature that addresses optimal revascularization techniques. The purpose of this study is to compare outcomes of LSA coverage during TEVAR and revascularization techniques. Study Design: We performed a single-center retrospective cohort study from 2001–2013. Patients were categorized by LSA revascularization and by revascularization technique, carotid-subclavian bypass (CSB) or subclavian-carotid transposition (SCT). Thirty-day and mid-term stroke, spinal cord ischemia, vocal cord paralysis, upper extremity ischemia, primary patency of revascularization, and mortality were compared. Results: Eighty patients underwent TEVAR with LSA coverage; 25% (n=20) were unrevascularized and the remaining patients underwent CSB (n=22, 27.5%) or SCT (n=38, 47.5%). Mean follow-up time was 24.9 months. Comparisons between unrevascularized and revascularized patients were significant for a higher rate of 30-day stroke (25% vs. 2%, p=0.003) and upper extremity ischemia (15% vs. 0%, p=0.014). However, there was no difference in 30-day or mid-term rates of spinal cord ischemia, vocal cord paralysis, or mortality. There were no statistically significant differences in 30-day or midterm outcomes for CSB vs. SCT. Primary patency of revascularizations was 100%. Survival analysis comparing unrevascularized vs. revascularized LSA was statistically significant for freedom from stroke and upper extremity ischemia, p=0.02 and p=0.003, respectively. After adjustment for advanced age, urgency and coronary artery disease, LSA revascularization was associated with lower rates of peri-operative adverse events (OR 0.23, p=0.034). Conclusions: During TEVAR, LSA coverage without revascularization is associated with an increased risk of stroke and upper extremity ischemia. When LSA coverage is required during TEVAR, CSB and SCT are equally acceptable options. PMID:25872688

  15. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis likewise identified only two different metabolic patterns - cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole-spectrum-based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
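
    A minimal sketch of the normalize-then-PCA idea described above is shown below, using scikit-learn on synthetic spectra; it is not the authors' ms-alone/multiMS-toolbox pipeline, and the intensity patterns and total-ion-count normalization choice are invented stand-ins.

```python
"""Hedged sketch: PCA on whole-spectrum, TIC-normalized mass spectra.

Synthetic data stand in for MALDI-TOF spectra; this is the generic
normalize-then-PCA idea, not the authors' multiMS-toolbox workflow.
"""
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_spectra, n_channels = 60, 2000

# Two simulated "metabolic patterns": same peaks, different relative intensities.
base = rng.random(n_channels)
group = np.repeat([0, 1], n_spectra // 2)
spectra = base + 0.2 * rng.random((n_spectra, n_channels))
spectra[group == 1, :200] *= 1.8          # pattern-specific intensity shift

# Whole-spectrum normalization (total ion count), one common choice.
tic = spectra.sum(axis=1, keepdims=True)
normalized = spectra / tic

scores = PCA(n_components=2).fit_transform(normalized)
print("PC1 group means:", scores[group == 0, 0].mean(), scores[group == 1, 0].mean())
```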

  16. Development of Large Area Emulsion Chamber Methods with a Super Conducting Magnet for Observation of Cosmic Ray Nuclei from 1 GeV to 1,000 TeV (Emulsion Techniques)

    NASA Technical Reports Server (NTRS)

    Takahashi, Yoshiyuki; Gregory, John C.; Tominaga, Taka; Dong, Bei Lei

    1997-01-01

    The research developed the fundamental techniques of the emulsion chamber methods that permit measurements of the composition and energy spectra of cosmic rays at energies ranging from 1 GeV/n to over 1,000 TeV/n. The research program consisted of exploring new principles and techniques for measuring very high energy cosmic nuclei with large-area emulsion chambers for high-statistics experiments. These tasks were accomplished, and their use was essential in the successful analysis of the balloon-borne emulsion chamber experiments up to 10(exp 14) eV. The program also provided the fundamental technologies for designing large-area detectors aimed at measuring the composition in the region above 10(exp 15) eV. The latter is now partially succeeded by a NASA Mission Concept, Advanced Cosmic Composition Experiments on the Space Station (ACCESS). The cosmic ray group at the University of Alabama in Huntsville performed technological R & D as well as contributing to the Japanese-American-Emulsion-Chamber-Experiments (JACEE) Collaboration with the regular data analysis. While primary research support for other institutions' efforts in the JACEE experiments came from NSF and DOE, primary support for the University of Alabama in Huntsville was this contract. Supplemental tasks to standardize the database and to upgrade hardware (an automated microscope) also had this institution's cooperation. Investigation of new techniques in this program consisted of development of fast calorimetry, magnetic/scattering selection of high-momentum tracks for a pairmeter, and high-statistics momentum measurements for low energy nuclei (E < 1 TeV/n). The highest-energy calorimetry and a pairmeter have been considered as strawman instruments by the GOAL (Galactic Origin and Acceleration Limit) proposal of the NASA Cosmic Ray Working Group for long-duration balloon flights. We accomplished the objectives of the GOAL program with three circumpolar, Antarctic JACEE balloon flights during 1992 - 1994.

  17. Interview with Dennis Pearl

    ERIC Educational Resources Information Center

    Rossman, Allan; Pearl, Dennis

    2017-01-01

    Dennis Pearl is Professor of Statistics at Pennsylvania State University and Director of the Consortium for the Advancement of Undergraduate Statistics Education (CAUSE). He is a Fellow of the American Statistical Association. This interview took place via email on November 18-29, 2016, and provides Dennis Pearl's background story, which describes…

  18. 5 CFR 297.401 - Conditions of disclosure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with advance adequate written assurance that the record will be used solely as a statistical research... records; and (ii) Certification that the records will be used only for statistical purposes. (2) These... information from records released for statistical purposes, the system manager will reasonably ensure that the...

  19. Write-Skewed: Writing in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Delcham, Hendrick; Sezer, Renan

    2010-01-01

    Statistics is used in almost every facet of our daily lives: crime reports, election results, environmental/climate change, advances in business, financial planning, and progress in multifarious research. Although understanding statistics is essential for efficient functioning in the modern world (Cerrito 1996), students often do not grasp…

  20. 28 CFR 0.154 - Advance and evacuation payments and special allowances.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Advance and evacuation payments and... Advance and evacuation payments and special allowances. The Director of the Federal Bureau of... Marshals Service, and the Director of the Office of Justice Assistance, Research and Statistics, as to...

  1. SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution

    Science.gov Websites

    Statistical summary of the U.S. distribution systems; world-class, high spatial/temporal resolution of solar ... SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios (Grid Modernization | NREL).

  2. Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?

    PubMed Central

    Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie

    2012-01-01

    A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often-used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regard to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
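
    The kind of assumption checking the study found to be rare can be sketched as below: normality and equal-variance checks before a two-sample t-test, with a non-parametric fallback. The simulated data and the 0.05 cutoffs are illustrative choices, not a prescription from the article.

```python
"""Sketch of pre-test assumption checks: Shapiro-Wilk for normality and
Levene for equal variances, falling back to Mann-Whitney when they fail."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(10.0, 2.0, 30)
b = rng.exponential(10.0, 30)        # deliberately non-normal group

normal_a = stats.shapiro(a).pvalue > 0.05
normal_b = stats.shapiro(b).pvalue > 0.05
equal_var = stats.levene(a, b).pvalue > 0.05

if normal_a and normal_b:
    result = stats.ttest_ind(a, b, equal_var=equal_var)
else:
    result = stats.mannwhitneyu(a, b)   # non-parametric alternative
print(result)
```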

  3. Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures

    PubMed Central

    Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha

    2017-01-01

    Aims and Objectives: The objective of the present study is to compare the effectiveness of three different processing techniques and to find out the accuracy of the processing techniques through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, who were divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Results: Data obtained from the three groups were subjected to the one-way ANOVA test. After the ANOVA test, results with significant variations were subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be higher in both centric and eccentric positions as compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be higher in the compression molding technique as compared to the injection molding techniques, which is statistically significant (P < 0.001). Conclusions: Within the limitations of this study, injection molding techniques exhibited fewer processing errors as compared to the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported within the two injection molding systems. PMID:28713763
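
    The analysis pattern described above (one-way ANOVA followed by a post hoc test) can be sketched as follows; the vertical-pin-rise values are simulated, and SciPy's Tukey HSD (available in SciPy 1.8+) stands in for the SPSS post hoc procedure actually used.

```python
"""Hedged sketch: one-way ANOVA across three processing techniques,
followed by a Tukey HSD post hoc test. Values are made up for illustration."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
compression = rng.normal(0.52, 0.10, 6)      # mm, hypothetical vertical pin rise
injection_pre = rng.normal(0.30, 0.08, 6)
injection_unpoly = rng.normal(0.28, 0.08, 6)

f_stat, p_value = stats.f_oneway(compression, injection_pre, injection_unpoly)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:
    # Post hoc pairwise comparisons with family-wise error control.
    posthoc = stats.tukey_hsd(compression, injection_pre, injection_unpoly)
    print(posthoc)
```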

  4. Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures.

    PubMed

    Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha

    2017-06-01

    The objective of the present study is to compare the effectiveness of three different processing techniques and to find out the accuracy of the processing techniques through the number of occlusal interferences and the increase in vertical dimension after denture processing. A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, who were divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Data obtained from the three groups were subjected to the one-way ANOVA test. After the ANOVA test, results with significant variations were subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be higher in both centric and eccentric positions as compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be higher in the compression molding technique as compared to the injection molding techniques, which is statistically significant (P < 0.001). Within the limitations of this study, injection molding techniques exhibited fewer processing errors as compared to the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported within the two injection molding systems.

  5. Overall Survival of Patients with Locally Advanced or Metastatic Esophageal Squamous Cell Carcinoma Treated with Nimotuzumab in the Real World.

    PubMed

    Saumell, Yaimarelis; Sanchez, Lizet; González, Sandra; Ortiz, Ramón; Medina, Edadny; Galán, Yaima; Lage, Agustin

    2017-12-01

    Despite improvements in surgical techniques and treatments introduced into clinical practice, the overall survival of patients with esophageal squamous cell carcinoma remains low. Several epidermal growth factor receptor inhibitors are being evaluated in the context of clinical trials, but there is little evidence of effectiveness in real-world conditions. This study aimed at assessing the effectiveness of nimotuzumab combined with onco-specific treatment in Cuban real-life patients with locally advanced or metastatic esophageal squamous cell carcinoma. A comparative and retrospective effectiveness study was performed. The 93 patients treated with nimotuzumab were matched, with use of propensity score matching, with patients who received a diagnosis of locally advanced or metastatic squamous cell carcinoma of the esophagus in three Cuban provinces reported between 2011 and 2015 to the National Cancer Registry. The Kaplan-Meier method was used to estimate event-time distributions. Log-rank statistics were used for comparisons of overall survival between groups. A two-component mixture model assuming a Weibull distribution was fitted to assess the effect of nimotuzumab on short-term and long-term survival populations. There was an increase in median overall survival in patients treated with nimotuzumab (11.9 months versus 6.5 months without treatment) and an increase in the 1-year survival rate (54.0% versus 21.9% without treatment). The 2-year survival rates were 21.1% for patients treated with nimotuzumab and 0% in the untreated cohort. There were statistically significant differences in survival between groups treated and not treated with nimotuzumab, both in the short-term survival population (6.0 months vs 4.0 months, p = 0.009) and in the long-term survival population (18.0 months vs 11.0 months, p = 0.001). Our study shows that nimotuzumab treatment concurrent with chemoradiotherapy increases the survival of real-world patients with locally advanced or metastatic esophageal squamous cell carcinoma. Further prospective studies are required to confirm the therapeutic effectiveness of nimotuzumab in esophageal cancer.
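
    A hedged sketch of the survival comparison described above is given below using the third-party lifelines library: Kaplan-Meier estimation and a log-rank test on simulated, right-censored survival times. The data are not the Cuban registry data, and the two-component Weibull mixture step is omitted.

```python
"""Kaplan-Meier and log-rank sketch on simulated overall-survival data."""
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_treated = rng.exponential(12.0, 93)      # months, hypothetical
t_control = rng.exponential(6.5, 93)
e_treated = rng.random(93) < 0.85          # True = death observed, False = censored
e_control = rng.random(93) < 0.90

kmf = KaplanMeierFitter()
kmf.fit(t_treated, event_observed=e_treated, label="nimotuzumab")
print("Median OS (treated):", kmf.median_survival_time_)

result = logrank_test(t_treated, t_control,
                      event_observed_A=e_treated, event_observed_B=e_control)
print("Log-rank p-value:", result.p_value)
```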

  6. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. This Statistical Tutorial (ST) is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and the Chi-squared distribution.
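
    For orientation, the snippet below runs toy versions of a few of the listed procedures (one-sample t-test, simple linear regression, chi-squared goodness of fit) with SciPy on simulated data; it is not part of the tutorial materials.

```python
"""Tiny worked examples of common tests on simulated data."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(scale=0.5, size=50)

print(stats.ttest_1samp(x, popmean=0.0))                          # one-sample t-test
print(stats.linregress(x, y))                                     # simple linear regression
print(stats.chisquare([18, 22, 20, 40], f_exp=[25, 25, 25, 25]))  # chi-squared goodness of fit
```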

  7. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  8. Advanced chronic kidney disease in non-valvular atrial fibrillation: extending the utility of R2CHADS2 to patients with advanced renal failure.

    PubMed

    Bautista, Josef; Bella, Archie; Chaudhari, Ashok; Pekler, Gerald; Sapra, Katherine J; Carbajal, Roger; Baumstein, Donald

    2015-04-01

    The R2CHADS2 is a new prediction rule for stroke risk in atrial fibrillation (AF) patients wherein R stands for renal risk. However, it was created from a cohort that excluded patients with advanced renal failure (defined as a glomerular filtration rate of <30 mL/min). Our study extends the use of R2CHADS2 to patients with advanced renal failure and aims to compare its predictive power against the currently used CHADS2 and CHA2DS2-VASc. This retrospective cohort study analyzed the 1-year risk for stroke of the 524 patients with AF at Metropolitan Hospital Center. AUC and C-statistics were calculated for three groups: (i) the entire cohort including patients with advanced renal failure, (ii) a cohort excluding patients with advanced renal failure and (iii) all patients with GFR < 30 mL/min only. R2CHADS2, as a predictor of stroke risk, consistently performs better than CHADS2 and CHA2DS2-VASc in groups 1 and 2. The C-statistic was highest for R2CHADS2 compared with CHADS2 or CHA2DS2-VASc in group 1 (0.718 versus 0.605 versus 0.602) and in group 2 (0.724 versus 0.584 versus 0.579). However, there was no statistically significant difference in group 3 (0.631 versus 0.629 versus 0.623). Our study supports the utility of R2CHADS2 as a clinical prediction rule for stroke risk in patients with advanced renal failure.
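
    The C-statistic comparison reported above amounts to computing the ROC AUC of each score against observed stroke events. A hedged sketch with scikit-learn on simulated scores is given below; the score values and event rate are placeholders, not the Metropolitan Hospital cohort.

```python
"""Illustrative C-statistic (ROC AUC) comparison of two risk scores."""
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
n = 524
stroke = rng.random(n) < 0.08                       # simulated 1-year stroke events

# Hypothetical integer risk scores; higher = higher predicted risk.
r2chads2 = rng.integers(0, 8, n) + 2 * stroke       # constructed to discriminate better
chads2 = rng.integers(0, 7, n) + 1 * stroke

print("C-statistic R2CHADS2:", round(roc_auc_score(stroke, r2chads2), 3))
print("C-statistic CHADS2:  ", round(roc_auc_score(stroke, chads2), 3))
```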

  9. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...

  10. Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science

    ERIC Educational Resources Information Center

    Ju, Boryung; Jin, Tao

    2013-01-01

    Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…

  11. Advances in borehole geophysics for hydrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, P.H.

    1982-01-01

    Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.

  12. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…

  13. Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees

    ERIC Educational Resources Information Center

    Harraway, John A.; Barker, Richard J.

    2005-01-01

    A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…

  14. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes that are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide for selecting statistical techniques to identify informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
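
    A minimal sketch of the relevance-minus-redundancy idea underlying mRMR-style selection is shown below; it uses absolute correlations as stand-ins for the relevance and redundancy measures and is not the authors' Boot-MRMR algorithm or the BootMRMR R package.

```python
"""Greedy mRMR-style gene selection sketch on simulated expression data.
Relevance: |correlation| with the class; redundancy: mean |correlation|
with genes already selected."""
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_genes = 40, 200
X = rng.normal(size=(n_samples, n_genes))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n_samples)) > 0  # class labels

relevance = np.array([abs(np.corrcoef(X[:, g], y)[0, 1]) for g in range(n_genes)])

selected = [int(np.argmax(relevance))]
while len(selected) < 5:
    best_gene, best_score = None, -np.inf
    for g in range(n_genes):
        if g in selected:
            continue
        redundancy = np.mean([abs(np.corrcoef(X[:, g], X[:, s])[0, 1]) for s in selected])
        score = relevance[g] - redundancy        # maximize relevance, minimize redundancy
        if score > best_score:
            best_gene, best_score = g, score
    selected.append(best_gene)

print("Selected genes (indices):", selected)
```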

  15. Recent advances in vacuum sciences and applications

    NASA Astrophysics Data System (ADS)

    Mozetič, M.; Ostrikov, K.; Ruzic, D. N.; Curreli, D.; Cvelbar, U.; Vesel, A.; Primc, G.; Leisch, M.; Jousten, K.; Malyshev, O. B.; Hendricks, J. H.; Kövér, L.; Tagliaferro, A.; Conde, O.; Silvestre, A. J.; Giapintzakis, J.; Buljan, M.; Radić, N.; Dražić, G.; Bernstorff, S.; Biederman, H.; Kylián, O.; Hanuš, J.; Miloševič, S.; Galtayries, A.; Dietrich, P.; Unger, W.; Lehocky, M.; Sedlarik, V.; Stana-Kleinschek, K.; Drmota-Petrič, A.; Pireaux, J. J.; Rogers, J. W.; Anderle, M.

    2014-04-01

    Recent advances in vacuum sciences and applications are reviewed. Novel optical interferometer cavity devices enable pressure measurements with ppm accuracy. The innovative dynamic vacuum standard allows for pressure measurements with temporal resolution of 2 ms. Vacuum issues in the construction of huge ultra-high vacuum devices worldwide are reviewed. Recent advances in surface science and thin films include new phenomena observed in electron transport near solid surfaces as well as novel results on the properties of carbon nanomaterials. Precise techniques for surface and thin-film characterization have been applied in the conservation technology of cultural heritage objects and recent advances in the characterization of biointerfaces are presented. The combination of various vacuum and atmospheric-pressure techniques enables an insight into the complex phenomena of protein and other biomolecule conformations on solid surfaces. Studying these phenomena at solid-liquid interfaces is regarded as the main issue in the development of alternative techniques for drug delivery, tissue engineering and thus the development of innovative techniques for curing cancer and cardiovascular diseases. A review on recent advances in plasma medicine is presented as well as novel hypotheses on cell apoptosis upon treatment with gaseous plasma. Finally, recent advances in plasma nanoscience are illustrated with several examples and a roadmap for future activities is presented.

  16. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  17. Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

    ERIC Educational Resources Information Center

    Gwet, Kilem L.

    2016-01-01

    This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…

  18. Transportation informatics : advanced image processing techniques automated pavement distress evaluation.

    DOT National Transportation Integrated Search

    2010-01-01

    The current project, funded by MIOH-UTC for the period 1/1/2009-4/30/2010, is concerned with the development of the framework for a transportation facility inspection system using advanced image processing techniques. The focus of this study is ...

  19. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  20. Advancement of Techniques for Modeling the Effects of Atmospheric Gravity-Wave-Induced Inhomogeneities on Infrasound Propagation

    DTIC Science & Technology

    2010-09-01

    Robert G...number of infrasound observations indicate that fine-scale atmospheric inhomogeneities contribute to infrasonic arrivals that are not predicted by...standard modeling techniques. In particular, gravity waves, or buoyancy waves, are believed to contribute to the multipath nature of infrasound

  1. Laparoscopic versus open-component separation: a comparative analysis in a porcine model.

    PubMed

    Rosen, Michael J; Williams, Christina; Jin, Judy; McGee, Michael F; Schomisch, Steve; Marks, Jeffrey; Ponsky, Jeffrey

    2007-09-01

    The ideal surgical treatment for complicated ventral hernias remains elusive. Traditional component separation provides local advancement of native tissue for tension-free closure without prosthetic materials. This technique requires an extensive subcutaneous dissection with division of perforating vessels, predisposing to skin-flap necrosis and complicated wound infections. A minimally invasive component separation may decrease wound complication rates; however, the adequacy of the myofascial advancement has not been studied. Five 25-kg pigs underwent bilateral laparoscopic component separation. A 10-mm incision was made lateral to the rectus abdominis muscle. The external oblique fascia was incised, and a dissecting balloon was inflated between the internal and external oblique muscles. Two additional ports were placed in the intermuscular space. The external oblique was incised from the costal margin to the inguinal ligament. The maximal abdominal wall advancement was recorded. A formal open-component separation was performed and maximal advancement 5 cm superior and 5 cm inferior to the umbilicus was recorded for comparison. Groups were compared using standard statistical analysis. The laparoscopic component separation was completed successfully in all animals, with a mean of 22 min/side. Laparoscopic component separation yielded 3.9 cm (SD 1.1) of fascial advancement above the umbilicus, whereas 4.4 cm (1.2) was obtained after open release (P = .24). Below the umbilicus, laparoscopic release achieved 5.0 cm (1.0) of advancement, whereas 5.8 cm (1.2) was gained after open release (P = .13). The minimally invasive component separation achieved an average of 86% of the myofascial advancement compared with a formal open release. The laparoscopic approach does not require extensive subcutaneous dissection and might theoretically result in a decreased incidence or decreased complexity of postoperative wound infections or skin-flap necrosis. Based on our preliminary data in this porcine model, further comparative studies of laparoscopic versus open component separation in complex ventral hernia repair are warranted to evaluate postoperative morbidity and long-term hernia recurrence rates.

  2. Multisensor fusion for the detection of mines and minelike targets

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1995-06-01

    The US Army's Communications and Electronics Command, through the auspices of its Night Vision and Electronics Sensors Directorate (CECOM-NVESD), is actively applying multisensor techniques to the detection of mine targets. This multisensor research results from the 'detection activity' with its broad range of operational conditions and targets. Multisensor operation justifies significant attention by yielding high target detection and low false alarm statistics. Furthermore, recent advances in sensor and computing technologies make its practical application realistic and affordable. The mine detection field has, since its beginnings in World War I, investigated the known spectra for applicable mine observation phenomena. Countless sensors, algorithms, processors, networks, and other techniques have been investigated to determine candidacy for mine detection. CECOM-NVESD efforts have addressed a wide range of sensors spanning the spectrum from gravity field perturbations, magnetic field disturbances, seismic sounding, electromagnetic fields, and earth-penetrating radar imagery to infrared/visible/ultraviolet surface imaging technologies. Supplementary analysis has considered sensor candidate applicability by testing under field conditions (versus laboratory) to determine fieldability. As these field conditions directly affect the probability of detection and false alarms, sensor employment and design must be considered. Consequently, as a given sensor's performance is influenced directly by the operational conditions, tradeoffs are necessary. At present, mass-produced and fielded mine detection techniques are limited to those incorporating a single sensor/processor methodology, such as pulse induction and magnetometry, as found in hand-held detectors. The most sensitive fielded systems can detect minute metal components in small mine targets but result in very high false alarm rates, reducing the speed of advance in operational environments. Furthermore, the actual speed of advance for the entire mission (convoy, movement to engagement, etc.) is determined by the level of difficulty presented in clearance or avoidance activities required in response to the potential 'targets' marked throughout a detection activity. Therefore, the application of fielded hand-held systems to convoy operations is clearly impractical. CECOM-NVESD efforts are presently seeking to overcome these operational limitations by substantially increasing the speed of detection while reducing the false alarm rate through the application of multisensor techniques. The CECOM-NVESD application of multisensor techniques through integration/fusion methods will be defined in this paper.

  3. Techniques in teaching statistics : linking research production and research use.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez-Moyano, I.; Smith, A.; Univ. of Massachusetts at Boston

    In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.

  4. Advanced imaging in COPD: insights into pulmonary pathophysiology

    PubMed Central

    Milne, Stephen

    2014-01-01

    Chronic obstructive pulmonary disease (COPD) involves a complex interaction of structural and functional abnormalities. The two have long been studied in isolation. However, advanced imaging techniques allow us to simultaneously assess pathological processes and their physiological consequences. This review gives a comprehensive account of the various advanced imaging modalities used to study COPD, including computed tomography (CT), magnetic resonance imaging (MRI), and the nuclear medicine techniques positron emission tomography (PET) and single-photon emission computed tomography (SPECT). Some more recent developments in imaging technology, including micro-CT, synchrotron imaging, optical coherence tomography (OCT) and electrical impedance tomography (EIT), are also described. The authors identify the pathophysiological insights gained from these techniques, and speculate on the future role of advanced imaging in both clinical and research settings. PMID:25478198

  5. A review of ocean color remote sensing methods and statistical techniques for the detection, mapping and analysis of phytoplankton blooms in coastal and open oceans

    NASA Astrophysics Data System (ADS)

    Blondeau-Patissier, David; Gower, James F. R.; Dekker, Arnold G.; Phinn, Stuart R.; Brando, Vittorio E.

    2014-04-01

    The need for more effective environmental monitoring of the open and coastal ocean has recently led to notable advances in satellite ocean color technology and algorithm research. Satellite ocean color sensors' data are widely used for the detection, mapping and monitoring of phytoplankton blooms because earth observation provides a synoptic view of the ocean, both spatially and temporally. Algal blooms are indicators of marine ecosystem health; thus, their monitoring is a key component of effective management of coastal and oceanic resources. Since the late 1970s, a wide variety of operational ocean color satellite sensors and algorithms have been developed. The comprehensive review presented in this article captures the details of the progress and discusses the advantages and limitations of the algorithms used with the multi-spectral ocean color sensors CZCS, SeaWiFS, MODIS and MERIS. Present challenges include overcoming the severe limitation of these algorithms in coastal waters and refining detection limits in various oceanic and coastal environments. To understand the spatio-temporal patterns of algal blooms and their triggering factors, it is essential to consider the possible effects of environmental parameters, such as water temperature, turbidity, solar radiation and bathymetry. Hence, this review will also discuss the use of statistical techniques and additional datasets derived from ecosystem models or other satellite sensors to characterize further the factors triggering or limiting the development of algal blooms in coastal and open ocean waters.

  6. Techniques for estimating health care costs with censored data: an overview for the health services researcher

    PubMed Central

    Wijeysundera, Harindra C; Wang, Xuesong; Tomlinson, George; Ko, Dennis T; Krahn, Murray D

    2012-01-01

    Objective The aim of this study was to review statistical techniques for estimating the mean population cost using health care cost data that, because of the inability to achieve complete follow-up until death, are right censored. The target audience is health service researchers without an advanced statistical background. Methods Data were sourced from longitudinal heart failure costs from Ontario, Canada, and administrative databases were used for estimating costs. The dataset consisted of 43,888 patients, with follow-up periods ranging from 1 to 1538 days (mean 576 days). The study was designed so that mean health care costs over 1080 days of follow-up were calculated using naïve estimators such as full-sample and uncensored case estimators. Reweighted estimators – specifically, the inverse probability weighted estimator – were calculated, as was phase-based costing. Costs were adjusted to 2008 Canadian dollars using the Bank of Canada consumer price index (http://www.bankofcanada.ca/en/cpi.html). Results Over the restricted follow-up of 1080 days, 32% of patients were censored. The full-sample estimator was found to underestimate mean cost ($30,420) compared with the reweighted estimators ($36,490). The phase-based costing estimate of $37,237 was similar to that of the simple reweighted estimator. Conclusion The authors recommend against the use of full-sample or uncensored case estimators when censored data are present. In the presence of heavy censoring, phase-based costing is an attractive alternative approach. PMID:22719214
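
    The contrast between a naive full-sample mean and a reweighted estimator can be sketched on simulated censored cost data as below; the inverse-probability weights are estimated directly from the simulated censoring times for simplicity, whereas in practice a Kaplan-Meier estimate of the censoring distribution would be used. All quantities are synthetic, not the Ontario heart failure data.

```python
"""Naive vs. inverse-probability-weighted (IPW) mean cost on censored data."""
import numpy as np

rng = np.random.default_rng(8)
n = 5000
true_cost = rng.gamma(shape=2.0, scale=15_000.0, size=n)      # lifetime cost, $
death_time = rng.exponential(600.0, n)                         # days
censor_time = rng.uniform(1.0, 1538.0, n)

observed_time = np.minimum(death_time, censor_time)
uncensored = death_time <= censor_time
observed_cost = np.where(uncensored, true_cost,
                         true_cost * observed_time / death_time)  # partial accrual

# Naive full-sample estimator: biased low because censored costs are incomplete.
naive = observed_cost.mean()

# IPW estimator: weight each uncensored subject by 1 / P(still uncensored at its
# death time); here the censoring distribution is estimated empirically.
p_uncensored = np.array([np.mean(censor_time >= t) for t in observed_time])
ipw = np.sum(uncensored * observed_cost / p_uncensored) / n

print(f"True mean cost:  {true_cost.mean():,.0f}")
print(f"Naive estimate:  {naive:,.0f}")
print(f"IPW estimate:    {ipw:,.0f}")
```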

  7. Statistical Symbolic Execution with Informed Sampling

    NASA Technical Reports Server (NTRS)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
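
    Abstracting each sampled symbolic path to a Bernoulli trial ("target event reached or not"), the Bayesian estimation step can be sketched as a Beta-posterior update, as below; this toy example is not the Symbolic PathFinder implementation and omits informed sampling and the partial exact analysis.

```python
"""Monte Carlo sampling of an event probability with a Beta-posterior estimate."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
true_p = 0.03                     # unknown probability of reaching the assert violation
samples = rng.random(2000) < true_p   # each sample stands in for one sampled path

hits = int(samples.sum())
n = samples.size

# Beta(1, 1) prior updated with the sampled paths.
posterior = stats.beta(1 + hits, 1 + n - hits)
lo, hi = posterior.ppf([0.025, 0.975])
print(f"Posterior mean = {posterior.mean():.4f}, 95% credible interval = ({lo:.4f}, {hi:.4f})")
```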

  8. Machine learning based cloud mask algorithm driven by radiative transfer modeling

    NASA Astrophysics Data System (ADS)

    Chen, N.; Li, W.; Tanikawa, T.; Hori, M.; Shimada, R.; Stamnes, K. H.

    2017-12-01

    Cloud detection is a critically important first step required to derive many satellite data products. Traditional threshold based cloud mask algorithms require a complicated design process and fine tuning for each sensor, and have difficulty over snow/ice covered areas. With the advance of computational power and machine learning techniques, we have developed a new algorithm based on a neural network classifier driven by extensive radiative transfer modeling. Statistical validation results obtained by using collocated CALIOP and MODIS data show that its performance is consistent over different ecosystems and significantly better than the MODIS Cloud Mask (MOD35 C6) during the winter seasons over mid-latitude snow covered areas. Simulations using a reduced number of satellite channels also show satisfactory results, indicating its flexibility to be configured for different sensors.
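
    The core idea, training a neural-network cloud/clear classifier on labeled samples that in the real algorithm come from radiative transfer simulations, can be sketched with scikit-learn as below; the "simulated radiances", channel count and labeling rule are invented stand-ins, not the authors' model or the MODIS/CALIOP data.

```python
"""Toy neural-network cloud mask trained on stand-in simulated radiances."""
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(9)
n, n_channels = 5000, 7                      # e.g., 7 satellite channels
radiances = rng.normal(size=(n, n_channels))
cloudy = (radiances[:, 0] + 0.5 * radiances[:, 3] + rng.normal(scale=0.3, size=n)) > 0

X_train, X_test, y_train, y_test = train_test_split(
    radiances, cloudy, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("Hold-out accuracy:", round(clf.score(X_test, y_test), 3))
```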

  9. Effect of bending on the room-temperature tensile strengths of structural ceramics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, M.G.

    1992-01-01

    Results for nearly fifty room-temperature tensile tests conducted on two advanced, monolithic silicon nitride ceramics are evaluated for the effects of bending and the application of various Weibull statistical analyses. Two specimen gripping systems (straight collet and tapered collet) were evaluated for both success in producing gage section failures and the tendency to minimize bending at failure. Specimen fabrication and grinding technique considerations are briefly reviewed and related to their effects on successful tensile tests. Ultimate tensile strengths are related to the bending measured at specimen failure, and the effects of the gripping system on bending are discussed. Finally, comparisons are made between the use of censored and uncensored data sample sets for determining the maximum likelihood estimates of the Weibull parameters from the tensile strength distributions.
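
    A two-parameter Weibull maximum-likelihood fit of the kind used for such strength distributions can be sketched with SciPy as below; the strength values are simulated and, unlike the censored analyses discussed above, this minimal version handles uncensored data only.

```python
"""Weibull MLE fit to simulated ceramic tensile strengths (uncensored case)."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
strengths = stats.weibull_min.rvs(c=10.0, scale=800.0, size=50, random_state=rng)  # MPa

# Two-parameter form: location fixed at zero, fit shape (Weibull modulus) and scale.
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus (shape) m = {shape:.1f}, characteristic strength = {scale:.0f} MPa")
```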

  10. Effect of bending on the room-temperature tensile strengths of structural ceramics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, M.G.

    1992-07-01

    Results for nearly fifty room-temperature tensile tests conducted on two advanced, monolithic silicon nitride ceramics are evaluated for the effects of bending and the application of various Weibull statistical analyses. Two specimen gripping systems (straight collet and tapered collet) were evaluated for both success in producing gage section failures and the tendency to minimize bending at failure. Specimen fabrication and grinding technique considerations are briefly reviewed and related to their effects on successful tensile tests. Ultimate tensile strengths are related to the bending measured at specimen failure, and the effects of the gripping system on bending are discussed. Finally, comparisons are made between the use of censored and uncensored data sample sets for determining the maximum likelihood estimates of the Weibull parameters from the tensile strength distributions.

  11. Work-family Conflict and Alcohol Use: Examination of a Moderated Mediation Model

    PubMed Central

    Wolff, Jennifer M.; Rospenda, Kathleen M.; Richman, Judith A.; Liu, Li; Milner, Lauren A.

    2013-01-01

    Research consistently documents the negative effects of work-family conflict; however, little research focuses on alcohol use. This study embraces a tension-reduction theory of drinking, wherein alcohol use is thought to reduce the negative effects of stress. The purpose of the present study was to test a moderated mediation model of the relationship between work-family conflict and alcohol use in a Chicagoland community sample of 998 caregivers. Structural equation models showed that distress mediated the relationship between work-family conflict and alcohol use. Furthermore, tension-reduction expectancies of alcohol exacerbated the relationship between distress and alcohol use. The results advance the study of work-family conflict and alcohol use, helping explain this complicated relationship using sophisticated statistical techniques. Implications for theory and practice are discussed. PMID:23480251

  12. Work-family conflict and alcohol use: examination of a moderated mediation model.

    PubMed

    Wolff, Jennifer M; Rospenda, Kathleen M; Richman, Judith A; Liu, Li; Milner, Lauren A

    2013-01-01

    Research consistently documents the negative effects of work-family conflict; however, little research focuses on alcohol use. This study embraces a tension reduction theory of drinking, wherein alcohol use is thought to reduce the negative effects of stress. The purpose of the study was to test a moderated mediation model of the relationship between work-family conflict and alcohol use in a Chicagoland community sample of 998 caregivers. Structural equation models showed that distress mediated the relationship between work-family conflict and alcohol use. Furthermore, tension reduction expectancies of alcohol exacerbated the relationship between distress and alcohol use. The results advance the study of work-family conflict and alcohol use, helping explain this complicated relationship using sophisticated statistical techniques. Implications for theory and practice are discussed.

  13. Fast and accurate automated cell boundary determination for fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.

  14. Predicting, examining, and evaluating FAC in US power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohn, M.J.; Garud, Y.S.; Raad, J. de

    1999-11-01

    There have been many pipe failures in fossil and nuclear power plant piping systems caused by flow-accelerated corrosion (FAC). In some piping systems, this failure mechanism may be the most important type of damage to mitigate because FAC damage has led to catastrophic failures and fatalities. Detecting the damage and mitigating the problem can significantly reduce future forced outages and increase personnel safety. This article discusses the implementation of recent developments to select FAC inspection locations, perform cost-effective examinations, evaluate results, and mitigate FAC failures. These advances include implementing the combination of software to assist in selecting examination locations and an improved pulsed eddy current technique to scan for wall thinning without removing insulation. The use of statistical evaluation methodology and possible mitigation strategies also are discussed.

  15. Numerical consideration on trapping and guiding of nanoparticles in a flow using scattering field of laser light

    NASA Astrophysics Data System (ADS)

    Yokoi, Naomichi; Aizu, Yoshihisa

    2017-04-01

    Optical manipulation techniques proposed so far mostly depend on carefully fabricated setups and samples. Similar conditions can be fixed in laboratories; however, it is still challenging to manipulate nanoparticles when the environment is not well controlled and is unknown in advance. Nonetheless, coherent light scattered by a rough object generates speckles, which are random interference patterns with well-defined statistical properties. In the present study, we numerically investigate the motion of a particle in a flow under the illumination of a speckle pattern that is at rest or in motion. The trajectory of the particle is simulated in relation to the flow velocity and the speckle contrast to confirm the feasibility of the present method for performing optical manipulation tasks such as trapping and guiding.

  16. Numerical considerations on control of motion of nanoparticles using scattering field of laser light

    NASA Astrophysics Data System (ADS)

    Yokoi, Naomichi; Aizu, Yoshihisa

    2017-05-01

    Most optical manipulation techniques proposed so far depend on carefully fabricated setups and samples. Similar conditions can be fixed in laboratories; however, it is still challenging to manipulate nanoparticles when the environment is not well controlled and is unknown in advance. Nonetheless, coherent light scattered by a rough object generates a speckle pattern which consists of random interference speckle grains with well-defined statistical properties. In the present study, we numerically investigate the motion of a Brownian particle suspended in water under the illumination of a speckle pattern. The particle-captured time and the size of the particle-captured area are quantitatively estimated in relation to the optical force and the speckle diameter to confirm the feasibility of the present method for performing optical manipulation tasks such as trapping and guiding.

  17. Detection of plum pox virus infection in selection plum trees using spectral imaging

    NASA Astrophysics Data System (ADS)

    Angelova, Liliya; Stoev, Antoniy; Borisova, Ekaterina; Avramov, Latchezar

    2016-01-01

    Plum pox virus (PPV) is among the most studied viral diseases of plants in the world. It is considered to be one of the most devastating diseases of stone fruits in terms of agronomic impact and economic importance. Noninvasive, fast and reliable techniques are required for evaluation of the pathology in selection trees with economic impact. Advanced optical tools for PPV detection include light-induced fluorescence and diffuse reflectance spectroscopies. Specific regions in the electromagnetic spectrum have been found to provide information about physiological stress in plants, and consequently, diseased plants usually exhibit a different spectral signature than non-stressed healthy plants in those specific ranges. In this study spectral reflectance and chlorophyll fluorescence were used for the identification of biotic stress caused by the pox virus on plum trees. The spectral responses of healthy and infected leaves from cultivars that are widespread in Bulgaria were investigated. The two applied techniques revealed statistically significant differences between the spectral data of healthy plum leaves and those infected by PPV in the visible and near-infrared spectral ranges. Their application for biotic stress detection helps in monitoring diseases in plants using the different plant spectral properties in these spectral ranges. The strong relationship between the results indicates the applicability of diffuse reflectance and fluorescence techniques for conducting health condition assessments of vegetation and their importance for plant protection practices.

  18. 75 FR 78063 - Passenger Weight and Inspected Vessel Stability Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ... Health Statistics NEPA--National Environmental Policy Act of 1969 NHANES--National Health and Nutrition..., Advance Data From Vital Health Statistics Mean Body Weight, Height, and Body Mass Index, United States...

  19. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, W.T.; Siebers, J.V.

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing significant variations in OAR doses including mean dose reductions >5 Gy. Clinical implementation will facilitate patient-specific decision making based on achievable dosimetry as opposed to accept/reject models based on population derived objectives.

  20. Advances in high-resolution imaging--techniques for three-dimensional imaging of cellular structures.

    PubMed

    Lidke, Diane S; Lidke, Keith A

    2012-06-01

    A fundamental goal in biology is to determine how cellular organization is coupled to function. To achieve this goal, a better understanding of organelle composition and structure is needed. Although visualization of cellular organelles using fluorescence or electron microscopy (EM) has become a common tool for the cell biologist, recent advances are providing a clearer picture of the cell than ever before. In particular, advanced light-microscopy techniques are achieving resolutions below the diffraction limit and EM tomography provides high-resolution three-dimensional (3D) images of cellular structures. The ability to perform both fluorescence and electron microscopy on the same sample (correlative light and electron microscopy, CLEM) makes it possible to identify where a fluorescently labeled protein is located with respect to organelle structures visualized by EM. Here, we review the current state of the art in 3D biological imaging techniques with a focus on recent advances in electron microscopy and fluorescence super-resolution techniques.

  1. SU-E-T-275: Radiobiological Evaluation of Intensity Modulated Radiotherapy Treatment for Locally Advanced Head and Neck Squamous Cell Carcinomas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rekha Reddy, B.; Ravikumar, M.; Tanvir Pasha, C.R

    2014-06-01

    Purpose: To evaluate the radiobiological outcome of Intensity Modulated Radiotherapy Treatment (IMRT) for locally advanced head and neck squamous cell carcinomas using the HART (Histogram Analysis in Radiation Therapy; J Appl Clin Med Phys 11(1): 137–157, 2010) program and compare it with the clinical outcomes. Methods: We treated 20 patients with stage III and IV HNSCC of the oropharynx and hypopharynx with an accelerated IMRT technique and concurrent chemotherapy. Delineation of tumor and normal tissues was done using Danish Head and Neck Cancer Group (DAHANCA) contouring guidelines, and radiotherapy was delivered to a dose of 70 Gy in 35 fractions to the primary and involved lymph nodes, 63 Gy to intermediate-risk areas, and 56 Gy to lower-risk areas, Monday to Saturday, 6 days/week, using 6 MV photons with an expected overall treatment time of 6 weeks. The TCP and NTCPs were calculated from the dose-volume histogram (DVH) statistics using the Poisson statistics (PS) and JT Lyman models, respectively, and the result was correlated with the clinical outcomes of the patients with a mean follow-up of 24 months. Results: Using the HART program, the TCP (0.89±0.01) of the primary tumor and the NTCPs for the parotids (0.20±0.12), spinal cord (0.05±0.01), esophagus (0.30±0.2), mandible (0.35±0.21), oral cavity (0.37±0.18), and larynx (0.30±0.15) were estimated and correlated with the clinical outcome of the patients. Conclusion: Accelerated IMRT with chemotherapy is a clinically feasible option in the treatment of locally advanced HNSCC, with an encouraging initial tumour response and acceptable acute toxicities. The correlation between the clinical outcomes and the radiobiological model-estimated parameters using the HART program is found to be satisfactory.
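
    A rough sketch of the two dose-response models named above, with assumed parameter values (D50, gamma50, TD50, m, n) and a toy two-bin DVH; the HART program's actual implementation and clinical parameters are not reproduced here.

```python
# Minimal sketch: Poisson-model TCP and Lyman-Kutcher-Burman NTCP computed
# from a differential DVH given as (dose_Gy, fractional_volume) pairs.
# All parameter values below are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def poisson_tcp(dvh_dose, dvh_vol, d50=50.0, gamma50=2.0):
    """Voxel-wise Poisson TCP with normalized dose-response slope gamma50 at D50."""
    dose, vol = np.asarray(dvh_dose, float), np.asarray(dvh_vol, float)
    tcp_voxel = 0.5 ** np.exp(2.0 * gamma50 / np.log(2.0) * (1.0 - dose / d50))
    return float(np.prod(tcp_voxel ** vol))

def lkb_ntcp(dvh_dose, dvh_vol, td50=46.0, m=0.2, n=0.7):
    """LKB NTCP via the generalized equivalent uniform dose (gEUD)."""
    dose, vol = np.asarray(dvh_dose, float), np.asarray(dvh_vol, float)
    geud = np.sum(vol * dose ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return float(norm.cdf(t))

# Hypothetical DVHs: an organ with 60% of its volume at 30 Gy and 40% at 55 Gy,
# and a target with 30% at 68 Gy and 70% at 70 Gy.
print(lkb_ntcp([30.0, 55.0], [0.6, 0.4]))
print(poisson_tcp([68.0, 70.0], [0.3, 0.7]))
```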

  2. 14 CFR 151.111 - Advance planning proposals: General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Engineering Proposals § 151.111 Advance planning proposals: General. (a) Each advance planning and engineering... application, under §§ 151.21(c) and 151.27, or both. (c) Each proposal must relate to planning and engineering... “Airport Activity Statistics of Certificated Route Air Carriers” (published jointly by FAA and the Civil...

  3. Earth Observation System Flight Dynamics System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Tracewell, David

    2016-01-01

    This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: collection and calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
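
    As a sketch of the second part, one common covariance-realism test statistic is the squared Mahalanobis distance of the prediction error, which for a realistic covariance should follow a chi-square distribution with degrees of freedom equal to the state dimension. The states and covariance below are hypothetical, and this is an assumption about the general workflow rather than the EOS Flight Dynamics System implementation.

```python
# Minimal sketch: covariance-realism test statistic at one propagation point.
import numpy as np
from scipy.stats import chi2

def realism_statistic(predicted_state, definitive_state, covariance):
    """Squared Mahalanobis distance of the prediction error under the propagated covariance."""
    err = np.asarray(definitive_state, float) - np.asarray(predicted_state, float)
    return float(err @ np.linalg.solve(covariance, err))

# Hypothetical 3-D position example (km); compare against chi-square with 3 dof.
stat = realism_statistic([7000.0, 0.1, 0.2], [7000.3, 0.0, 0.1],
                         np.diag([0.09, 0.04, 0.04]))
print(stat, chi2.sf(stat, df=3))   # statistic and its p-value
```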

  4. The Shock and Vibration Digest. Volume 16, Number 1

    DTIC Science & Technology

    1984-01-01

    investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of...stiffness. Matrix methods Key Words: Finite element technique. Statistical energy analysis. Experimental techniques. Framed structures, Computer...programs In order to further understand the practical application of the statistical energy analysis, a two section plate-like frame structure is

  5. Knowledge Discovery and Data Mining in Iran's Climatic Researches

    NASA Astrophysics Data System (ADS)

    Karimi, Mostafa

    2013-04-01

    Advances in measurement technology and data collection mean that databases keep growing larger, and large databases require powerful tools for data analysis. The iterative process of acquiring knowledge from information obtained through data processing is carried out, in various forms, in all scientific fields. When data volumes are large, however, traditional methods cannot cope with many of the resulting problems. In recent years the use of databases has expanded in many scientific fields, and especially the use of atmospheric databases in climatology. In addition, the increasing amount of data generated by climate models poses a challenge for analyses aimed at extracting hidden patterns and knowledge. The approach taken to this problem in recent years uses the knowledge discovery process and data mining techniques, drawing on concepts from machine learning, artificial intelligence, and expert (professional) systems. Data mining is an analytical process for mining massive volumes of data; its ultimate goal is to obtain information and, finally, knowledge. Climatology is a science that uses varied and massive data, and the goal of climate data mining is to obtain information from varied and massive atmospheric and non-atmospheric data. Knowledge discovery performs these activities in a logical, predetermined, and almost automatic process. The goal of this research is to study the use of knowledge discovery and data mining techniques in Iranian climate research. To achieve this goal, a content (descriptive) analysis was carried out, classifying studies by method and by topic. The results show that in Iranian climate research, clustering methods such as k-means and Ward's method are applied most often, and that, in terms of topics, precipitation and atmospheric circulation patterns are addressed most often. Although several studies of geographic and climatic issues have used statistical techniques such as clustering and pattern extraction, given the distinct natures of statistics and data mining it cannot be said that Iranian climate studies have truly used data mining and knowledge discovery techniques. Nevertheless, it is necessary to apply the KDD approach and data mining techniques in climate studies, in particular for interpreting climate modeling results.

  6. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 provides background information on the PSPFP technique, including the measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of the absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques for coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
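
    For readers unfamiliar with phase-shifting projected fringe profilometry, the standard four-step phase-recovery relation can be sketched as follows; the synthetic fringe images are illustrative only, and this is the textbook algorithm rather than the dissertation's code.

```python
# Minimal sketch: wrapped phase recovery from four projected-fringe images
# with phase shifts of 0, 90, 180, and 270 degrees.
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Return the wrapped phase map (radians) from four phase-shifted images."""
    return np.arctan2(np.asarray(i4, float) - np.asarray(i2, float),
                      np.asarray(i1, float) - np.asarray(i3, float))

# Hypothetical synthetic fringes over a tilted surface.
x = np.linspace(0.0, 4.0 * np.pi, 256)
true_phase = np.tile(x, (256, 1))
imgs = [0.5 + 0.5 * np.cos(true_phase + k * np.pi / 2.0) for k in range(4)]
wrapped = four_step_phase(*imgs)
print(wrapped.shape)   # (256, 256); phase unwrapping would follow in practice
```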

  7. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires statistical independence of the components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in rotational techniques.
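
    The contrast drawn above between second-order decorrelation (PCA) and higher-order independence (ICA) can be demonstrated on synthetic data; scikit-learn's FastICA is used here as one ICA implementation, and the mixed signals are invented for illustration.

```python
# Minimal sketch: PCA mixes two independent source signals that were linearly
# summed, while ICA recovers them.
import numpy as np
from sklearn.decomposition import PCA, FastICA

t = np.linspace(0.0, 8.0, 2000)
sources = np.c_[np.sin(2.0 * t), np.sign(np.cos(3.0 * t))]   # two independent signals
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.0]])
observed = sources @ mixing.T                                 # linear sum of the sources

pca_components = PCA(n_components=2).fit_transform(observed)
ica_components = FastICA(n_components=2, random_state=0).fit_transform(observed)

def max_abs_corr(est, src):
    # Best absolute correlation of each estimated component with either source.
    c = np.corrcoef(est.T, src.T)[:2, 2:]
    return np.abs(c).max(axis=1)

print("PCA component/source correlations:", max_abs_corr(pca_components, sources))
print("ICA component/source correlations:", max_abs_corr(ica_components, sources))
```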

  8. Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research (VISRIDER) Program Task 6: Point Cloud Visualization Techniques for Desktop and Web Platforms

    DTIC Science & Technology

    2017-04-01

    ADVANCED VISUALIZATION AND INTERACTIVE DISPLAY RAPID INNOVATION AND DISCOVERY EVALUATION RESEARCH (VISRIDER) PROGRAM TASK 6: POINT CLOUD VISUALIZATION TECHNIQUES FOR DESKTOP AND WEB PLATFORMS (report period OCT 2013 to SEP 2014). The task investigated various point cloud visualization techniques for viewing large scale LiDAR datasets and evaluated their potential use for thick client desktop platforms.

  9. Advancement of the anterior maxilla by distraction (case report).

    PubMed

    Karakasis, Dimitri; Hadjipetrou, Loucia

    2004-06-01

    Several techniques of distraction osteogenesis have been applied for the correction of compromised midface in patients with clefts of the lip, alveolus and palate. This article presents a technique of callus distraction applied in a specific case of hypoplasia of a cleft maxilla with the sagittal advancement of the maxilla thus not affecting velopharyngeal function. The decision to apply distraction osteogenesis for advancement of the anterior maxillary segment in cleft patients offers many advantages.

  10. Wafer hot spot identification through advanced photomask characterization techniques

    NASA Astrophysics Data System (ADS)

    Choi, Yohan; Green, Michael; McMurran, Jeff; Ham, Young; Lin, Howard; Lan, Andy; Yang, Richer; Lung, Mike

    2016-10-01

    As device manufacturers progress through advanced technology nodes, limitations in standard 1-dimensional (1D) mask Critical Dimension (CD) metrics are becoming apparent. Historically, 1D metrics such as Mean to Target (MTT) and CD Uniformity (CDU) have been adequate for end users to evaluate and predict the mask impact on the wafer process. However, the wafer lithographer's process margin is shrinking at advanced nodes to a point that the classical mask CD metrics are no longer adequate to gauge the mask contribution to wafer process error. For example, wafer CDU error at advanced nodes is impacted by mask factors such as 3-dimensional (3D) effects and mask pattern fidelity on subresolution assist features (SRAFs) used in Optical Proximity Correction (OPC) models of ever-increasing complexity. These items are not quantifiable with the 1D metrology techniques of today. Likewise, the mask maker needs advanced characterization methods in order to optimize the mask process to meet the wafer lithographer's needs. These advanced characterization metrics are what is needed to harmonize mask and wafer processes for enhanced wafer hot spot analysis. In this paper, we study advanced mask pattern characterization techniques and their correlation with modeled wafer performance.

  11. Flow Control Research at NASA Langley in Support of High-Lift Augmentation

    NASA Technical Reports Server (NTRS)

    Sellers, William L., III; Jones, Gregory S.; Moore, Mark D.

    2002-01-01

    The paper describes the efforts at NASA Langley to apply active and passive flow control techniques for improved high-lift systems, and advanced vehicle concepts utilizing powered high-lift techniques. The development of simplified high-lift systems utilizing active flow control is shown to provide significant weight and drag reduction benefits based on system studies. Active flow control that focuses on separation, and the development of advanced circulation control wings (CCW) utilizing unsteady excitation techniques will be discussed. The advanced CCW airfoils can provide multifunctional controls throughout the flight envelope. Computational and experimental data are shown to illustrate the benefits and issues with implementation of the technology.

  12. Imaging evidence and recommendations for traumatic brain injury: advanced neuro- and neurovascular imaging techniques.

    PubMed

    Wintermark, M; Sanelli, P C; Anzai, Y; Tsiouris, A J; Whitlow, C T

    2015-02-01

    Neuroimaging plays a critical role in the evaluation of patients with traumatic brain injury, with NCCT as the first-line of imaging for patients with traumatic brain injury and MR imaging being recommended in specific settings. Advanced neuroimaging techniques, including MR imaging DTI, blood oxygen level-dependent fMRI, MR spectroscopy, perfusion imaging, PET/SPECT, and magnetoencephalography, are of particular interest in identifying further injury in patients with traumatic brain injury when conventional NCCT and MR imaging findings are normal, as well as for prognostication in patients with persistent symptoms. These advanced neuroimaging techniques are currently under investigation in an attempt to optimize them and substantiate their clinical relevance in individual patients. However, the data currently available confine their use to the research arena for group comparisons, and there remains insufficient evidence at the time of this writing to conclude that these advanced techniques can be used for routine clinical use at the individual patient level. TBI imaging is a rapidly evolving field, and a number of the recommendations presented will be updated in the future to reflect the advances in medical knowledge. © 2015 by American Journal of Neuroradiology.

  13. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    PubMed

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
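
    Two of the approaches compared in the study, principal component analysis and hierarchical cluster analysis, can be sketched on synthetic symptom data; the simulated "emotional" and "fatigue-pain" structure below is a stand-in for the EORTC QLQ-C30 items, not the actual patient dataset.

```python
# Minimal sketch: derive symptom clusters from the same correlation structure
# with hierarchical clustering and with PCA, two of the methods named above.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n = 500
# Four correlated "emotional" items and two correlated "somatic" items.
emotional = rng.normal(size=(n, 1)) + rng.normal(scale=0.5, size=(n, 4))
somatic = rng.normal(size=(n, 1)) + rng.normal(scale=0.5, size=(n, 2))
symptoms = np.hstack([emotional, somatic])

# Hierarchical clustering on 1 - |correlation| distances between symptoms.
corr = np.corrcoef(symptoms, rowvar=False)
dist = squareform(1.0 - np.abs(corr), checks=False)
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print("Cluster label per symptom:", labels)

# PCA loadings on the same data for comparison.
print("PCA loadings:\n", PCA(n_components=2).fit(symptoms).components_.round(2))
```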

  14. Numerical simulation of coupled electrochemical and transport processes in battery systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, B.Y.; Gu, W.B.; Wang, C.Y.

    1997-12-31

    Advanced numerical modeling to simulate dynamic battery performance characteristics for several types of advanced batteries is being conducted using computational fluid dynamics (CFD) techniques. The CFD techniques provide efficient algorithms to solve a large set of highly nonlinear partial differential equations that represent the complex battery behavior governed by coupled electrochemical reactions and transport processes. The authors have recently successfully applied such techniques to model advanced lead-acid, Ni-Cd and Ni-MH cells. In this paper, the authors briefly discuss how the governing equations were numerically implemented, show some preliminary modeling results, and compare them with other modeling or experimental data reported in the literature. The authors describe the advantages and implications of using the CFD techniques and their capabilities in future battery applications.

  15. Mesenchymal stem cells in osteotomy repair after tibial tuberosity advancement in dogs with cranial cruciate ligament injury.

    PubMed

    Rocha Dos Santos, Clarissa; da Rocha Filgueiras, Richard; Furtado Malard, Patrícia; Rodrigues da Cunha Barreto-Vianna, Andre; Nogueira, Kaique; da Silva Leite, Carolina; Maurício Mendes de Lima, Eduardo

    2018-06-14

    Cranial cruciate ligament rupture (CCLR) is the most commonly encountered orthopedic condition in dogs. Among the various techniques to treat this condition, tibial tuberosity advancement (TTA) has been used to obtain rapid recovery of the affected knee. The objective of this study was to evaluate the viability of mesenchymal stem cells (MSCs) implanted in the osteotomy site created by TTA in nine dogs diagnosed with CCLR. The MSCs were isolated from the adipose tissue of the dogs and cultured for eight days, and the animals were divided into two groups. Animals in the treated group (GT) received cell transport medium containing about 1.5 million MSCs, and animals in the control group (GC) received only the cell transport medium. The study was performed in a double-blind manner using radiographs acquired on days 15, 30, 60 and 120 after the procedure. Evaluations of the density of the trabecular bone were performed using image analysis software. The results were subjected to descriptive statistical analysis, followed by the normality test, Chi-square test, Mann-Whitney test and Tukey's multiple comparison test at p ≤ 0.05. Thirty days after the procedure, the animals of the GT presented a mean ossification 36.45% greater (p ≤ 0.033) than that of the GC, and there were no statistical differences for the other periods. Despite total bone ossification within the expected period, there was no reduction of the estimated recovery time with the application of MSCs, and inflammatory factors should be considered when reassessing the timing of the therapeutic intervention.

  16. Automated Training Evaluation (ATE). Final Report.

    ERIC Educational Resources Information Center

    Charles, John P.; Johnson, Robert M.

    The automation of weapons system training presents the potential for significant savings in training costs in terms of manpower, time, and money. The demonstration of the technical feasibility of automated training through the application of advanced digital computer techniques and advanced training techniques is essential before the application…

  17. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective to detect intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are some of the kinds of representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this knowledge to real-world problems successfully, it is important to determine the structures and weights of these call sequences. However, finding the appropriate structures requires very long time periods because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than when using conventional approaches. This is because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.

  18. Development of quantitative structure-activity relationships and its application in rational drug design.

    PubMed

    Yang, Guang-Fu; Huang, Xiaoqin

    2006-01-01

    Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods have been developed. Currently, the combination of classical QSAR with other computational techniques at the three-dimensional level is of greatest interest and is generally used in the process of modern drug discovery and design. During the last several decades, a number of different methodologies incorporating a range of molecular descriptors and different statistical regression methods have been proposed and successfully applied in the development of new drugs; thus the QSAR method has proven indispensable not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review the recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques.

  19. Solar thematic maps for space weather operations

    USGS Publications Warehouse

    Rigler, E. Joshua; Hill, Steven M.; Reinard, Alysha A.; Steenburgh, Robert A.

    2012-01-01

    Thematic maps are arrays of labels, or "themes", associated with discrete locations in space and time. Borrowing heavily from the terrestrial remote sensing discipline, a numerical technique based on Bayes' theorem captures operational expertise in the form of trained theme statistics, then uses this to automatically assign labels to solar image pixels. Ultimately, regular thematic maps of the solar corona will be generated from high-cadence, high-resolution SUVI images, the solar ultraviolet imager slated to fly on NOAA's next-generation GOES-R series of satellites starting ~2016. These thematic maps will not only provide quicker, more consistent synoptic views of the sun for space weather forecasters, but will also yield the digital thematic pixel masks (e.g., coronal hole, active region, flare, etc.) necessary for a new generation of operational solar data products. This paper presents the mathematical underpinnings of our thematic mapper, as well as some practical algorithmic considerations. Then, using images from the Solar Dynamics Observatory (SDO) Atmospheric Imaging Assembly (AIA) as test data, it presents results from validation experiments designed to ascertain the robustness of the technique with respect to differing expert opinions and changing solar conditions.
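
    A minimal sketch of the Bayesian labeling idea: per-theme Gaussian statistics are trained from expert-labeled pixels, and each image pixel is then assigned its most probable theme. The two-channel intensities, theme means, and the GaussianNB classifier are illustrative assumptions, not the SUVI/AIA pipeline.

```python
# Minimal sketch: Bayes-rule pixel labeling with Gaussian per-theme statistics.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# Hypothetical training pixels: intensity in two EUV channels for three themes.
coronal_hole = rng.normal([50.0, 30.0], 10.0, (500, 2))
quiet_sun = rng.normal([120.0, 90.0], 15.0, (500, 2))
active_region = rng.normal([300.0, 260.0], 40.0, (500, 2))
X = np.vstack([coronal_hole, quiet_sun, active_region])
y = np.array([0] * 500 + [1] * 500 + [2] * 500)   # theme labels

model = GaussianNB().fit(X, y)

# Classify a toy 64x64 two-channel image into a thematic map.
image = rng.normal([120.0, 90.0], 60.0, (64, 64, 2))
thematic_map = model.predict(image.reshape(-1, 2)).reshape(64, 64)
print(np.bincount(thematic_map.ravel()))   # pixel counts per theme
```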

  20. Computationally efficient stochastic optimization using multiple realizations

    NASA Astrophysics Data System (ADS)

    Bayer, P.; Bürger, C. M.; Finkel, M.

    2008-02-01

    The presented study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. Using a problem typical of designing a water supply well field, several variants of this "stack ordering" approach are tested. The results are statistically assessed in terms of optimality and nominal reliability. This study demonstrates that simply ordering a given set of 500 realizations while applying an evolutionary search algorithm can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire set of realizations were considered. The findings herein are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
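
    One way to read the "stack ordering" idea is sketched below: realizations are evaluated in their current order, the evaluation stops as soon as the required nominal reliability becomes unattainable, and the failing (critical) realizations are promoted to the top of the stack for the next candidate design. The implementation details here are assumptions based on the abstract, not the authors' code.

```python
# Minimal sketch of stack ordering for reliability-based evaluation of a
# candidate design against multiple equally probable realizations.
def evaluate_with_stack_ordering(candidate, stack, simulate_failure, min_reliability):
    """Return (feasible, runs_used); `stack` is reordered in place."""
    max_failures = int(len(stack) * (1.0 - min_reliability))
    failures, runs = [], 0
    for realization in list(stack):
        runs += 1
        if simulate_failure(candidate, realization):   # one expensive model run
            failures.append(realization)
            if len(failures) > max_failures:
                break                                  # reliability already unattainable
    # Critical (failing) realizations float to the top for subsequent candidates.
    for r in reversed(failures):
        stack.remove(r)
        stack.insert(0, r)
    return len(failures) <= max_failures, runs

# Hypothetical use: realizations would be, e.g., hydraulic-conductivity fields,
# and simulate_failure would run the groundwater model once. A toy stand-in:
stack = list(range(20))                                # 20 equally probable realizations
toy_failure = lambda cand, real: (real % 7 == 0) and cand < 0.5
print(evaluate_with_stack_ordering(0.3, stack, toy_failure, min_reliability=0.9))
print(stack[:5])                                       # failing realizations now lead the stack
```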

  1. Big Data of the Cosmic Web

    NASA Astrophysics Data System (ADS)

    Kitaura, Francisco-Shu

    2016-10-01

    One of the main goals in cosmology is to understand how the Universe evolves, how it forms structures, why it expands, and what the nature of dark matter and dark energy is. In the next decade, large and expensive observational projects will bring information on the structure and distribution of many millions of galaxies at different redshifts, enabling us to make great progress in answering these questions. However, these data require a very special and complex set of analysis tools to extract the maximum valuable information. Statistical inference techniques are being developed, bridging the gaps between theory, simulations, and observations. In particular, we discuss the efforts to address the question: what is the underlying nonlinear matter distribution and dynamics at any cosmic time corresponding to a set of observed galaxies in redshift space? An accurate reconstruction of the initial conditions encodes the full phase-space information at any later cosmic time (given a particular structure formation model and a set of cosmological parameters). We present advances to solve this problem in a self-consistent way with Big Data techniques of the Cosmic Web.

  2. Rear-End Crashes: Problem Size Assessment And Statistical Description

    DOT National Transportation Integrated Search

    1993-05-01

    KEYWORDS : RESEARCH AND DEVELOPMENT OR R&D, ADVANCED VEHICLE CONTROL & SAFETY SYSTEMS OR AVCSS, INTELLIGENT VEHICLE INITIATIVE OR IVI : THIS DOCUMENT PRESENTS PROBLEM SIZE ASSESSMENTS AND STATISTICAL CRASH DESCRIPTION FOR REAR-END CRASHES, INC...

  3. 75 FR 55333 - Board of Scientific Counselors, National Center for Health Statistics, (BSC, NCHS)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-10

    ... Scientific Counselors, National Center for Health Statistics, (BSC, NCHS) In accordance with section 10(a)(2... Prevention (CDC), National Center for Health Statistics (NCHS) announces the following meeting of [email protected] or Virginia Cain, [email protected] at least 10 days in advance for requirements). All visitors...

  4. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  5. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  6. A Selective Overview of Variable Selection in High Dimensional Feature Space

    PubMed Central

    Fan, Jianqing

    2010-01-01

    High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what the role of penalty functions is, and what their statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976
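
    As one concrete instance of penalized-likelihood variable selection in a p >> n setting, a lasso fit with a cross-validated penalty is sketched below on synthetic data; the non-concave penalties (e.g., SCAD) emphasized in the article are not available in scikit-learn, so the lasso stands in here.

```python
# Minimal sketch: penalized regression recovering a handful of active variables
# out of 1000 candidates from only 100 samples.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 100, 1000                        # far more candidate variables than samples
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, -1.0, 2.5]  # only 5 truly active variables
y = X @ beta + rng.normal(scale=1.0, size=n)

model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("Selected variables:", selected)
```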

  7. Advanced ultrasonic measurement methodology for non-invasive interrogation and identification of fluids in sealed containers

    NASA Astrophysics Data System (ADS)

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-03-01

    Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a prototype portable, hand-held acoustic inspection device for hazardous materials that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques was identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.
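
    The pulse-compression idea mentioned above can be sketched with a synthetic chirp: cross-correlating the received signal with the transmitted waveform concentrates the signal energy into a narrow peak whose lag gives the time of flight. The sampling rate, chirp band, attenuation, and noise level are invented for illustration and do not describe the PNNL prototype.

```python
# Minimal sketch: pulse compression by cross-correlation to estimate TOF
# from a weak, noisy echo.
import numpy as np
from scipy.signal import chirp, correlate

fs = 10e6                                    # 10 MHz sampling rate
t = np.arange(0, 200e-6, 1.0 / fs)
tx = chirp(t, f0=0.5e6, f1=2.0e6, t1=t[-1])  # transmitted linear chirp

true_tof = 60e-6                             # hypothetical 60 microsecond transit
delay = int(round(true_tof * fs))
rx = np.zeros_like(tx)
rx[delay:] = 0.05 * tx[:tx.size - delay]     # strongly attenuated echo
rx += np.random.default_rng(0).normal(scale=0.05, size=rx.size)   # additive noise

corr = correlate(rx, tx, mode="full")
lag = np.argmax(corr) - (tx.size - 1)        # lag of the correlation peak
print(f"Estimated TOF: {lag / fs * 1e6:.1f} microseconds")
```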

  8. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  9. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  10. Replication of long-bone length QTL in the F9-F10 LG,SM advanced intercross.

    PubMed

    Norgard, Elizabeth A; Jarvis, Joseph P; Roseman, Charles C; Maxwell, Taylor J; Kenney-Hunt, Jane P; Samocha, Kaitlin E; Pletscher, L Susan; Wang, Bing; Fawcett, Gloria L; Leatherwood, Christopher J; Wolf, Jason B; Cheverud, James M

    2009-04-01

    Quantitative trait locus (QTL) mapping techniques are frequently used to identify genomic regions associated with variation in phenotypes of interest. However, the F2 intercross and congenic strain populations usually employed have limited genetic resolution, resulting in relatively large confidence intervals that greatly inhibit functional confirmation of statistical results. Here we use the increased resolution of the combined F9 and F10 generations (n = 1455) of the LG,SM advanced intercross to fine-map previously identified QTL associated with the lengths of the humerus, ulna, femur, and tibia. We detected 81 QTL affecting long-bone lengths. Of these, 49 were previously identified in the combined F2-F3 population of this intercross, while 32 represent novel contributors to trait variance. Pleiotropy analysis suggests that most QTL affect three to four long bones or serially homologous limb segments. We also identified 72 epistatic interactions involving 38 QTL and 88 novel regions. This analysis shows that using later generations of an advanced intercross greatly facilitates fine-mapping of confidence intervals, resolving three F2-F3 QTL into multiple linked loci and narrowing confidence intervals of other loci, as well as allowing identification of additional QTL. Further characterization of the biological bases of these QTL will help provide a better understanding of the genetics of small variations in long-bone length.

  11. The side effects and complications of percutaneous iodine-125 seeds implantation under CT-guide for patients with advanced pancreatic cancer.

    PubMed

    Lv, Wei-Fu; Lu, Dong; Xiao, Jing-Kun; Mukhiya, Gauri; Tan, Zhong-Xiao; Cheng, De-Lei; Zhou, Chun-Ze; Zhang, Xing-Min; Zhang, Zheng-Feng; Hou, Chang-Long

    2017-12-01

    The present study investigates the side effects and complications of computed tomography (CT)-guided percutaneous iodine-125 (I-125) seed implantation for advanced pancreatic cancer. The clinical data were retrospectively analyzed for patients treated with implantation of I-125 seeds under CT guidance in our hospital from May 2010 to April 2015. The side effects and complications were collected and their possible causes were analyzed. A total of 78 patients were enrolled. The side effects were categorized as fever in 29 cases (37.18%), abdominal pain in 26 cases (33.33%), nausea and vomiting in 9 cases (11.54%), diarrhea in 5 cases (6.41%), and constipation in 4 cases (5.13%). Complications comprised pancreatitis in 9 cases (11.54%), infection in 5 cases (6.41%), seed migration in 2 cases (2.56%), intestinal perforation in 1 case (1.28%), and intestinal obstruction in 1 case. The incidence of complications was 23.08% (18/78). The difference in the incidence of complications was statistically significant between patients implanted with ≤27 seeds and those with >27 seeds (P = .032). Side effects and complications frequently occur with implantation of I-125 seeds in patients with advanced pancreatic cancer, and more attention should be given to patients treated with this technique. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.

  12. Multivariate statistical analysis: Principles and applications to coorbital streams of meteorite falls

    NASA Technical Reports Server (NTRS)

    Wolf, S. F.; Lipschutz, M. E.

    1993-01-01

    Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855 to 1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that a totally different criterion, labile trace element contents - hence thermal histories - distinguishes 13 Cluster 1 meteorites from 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
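
    A small sketch of one of the two methods named above, linear discriminant analysis, applied to simulated trace-element compositions; the group sizes (13 vs. 45) mirror the abstract's counts, but the group means, spreads, and element values are invented.

```python
# Minimal sketch: linear discriminant analysis separating two meteorite groups
# on several (synthetic) labile trace-element concentrations.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
cluster1 = rng.normal([1.0, 2.0, 0.5, 3.0], 0.3, (13, 4))   # 13 Cluster 1 falls
others = rng.normal([1.4, 1.6, 0.8, 2.5], 0.3, (45, 4))     # 45 other H chondrites
X = np.vstack([cluster1, others])
y = np.array([1] * 13 + [0] * 45)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("Resubstitution accuracy:", lda.score(X, y))
print("Discriminant weights:", lda.coef_.round(2))
```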

  13. 40 CFR Appendix K to Part 50 - Interpretation of the National Ambient Air Quality Standards for Particulate Matter

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...

  14. Emerging nondestructive inspection methods for aging aircraft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beattie, A; Dahlke, L; Gieske, J

    This report identifies and describes emerging nondestructive inspection (NDI) methods that can potentially be used to inspect commercial transport and commuter aircraft for structural damage. The nine categories of emerging NDI techniques are: acoustic emission, x-ray computed tomography, backscatter radiation, reverse geometry x-ray, advanced electromagnetics, including magnetooptic imaging and advanced eddy current techniques, coherent optics, advanced ultrasonics, advanced visual, and infrared thermography. The physical principles, generalized performance characteristics, and typical applications associated with each method are described. In addition, aircraft inspection applications are discussed along with the associated technical considerations. Finally, the status of each technique is presented, with a discussion on when it may be available for use in actual aircraft maintenance programs. It should be noted that this is a companion document to DOT/FAA/CT-91/5, Current Nondestructive Inspection Methods for Aging Aircraft.

  15. The Use of a Context-Based Information Retrieval Technique

    DTIC Science & Technology

    2009-07-01

    provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies...WAIS). 10 DSTO-TR-2322 1.4.4 Latent Semantic Analysis LSA, which is also known as latent semantic indexing (LSI), uses a statistical and...1.4.6 Language Models In contrast, natural language models apply algorithms that combine statistical information with semantic information. Semantic

  16. Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts

    ERIC Educational Resources Information Center

    Nunnery, Christopher Edward

    2011-01-01

    Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…

  17. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  18. 75 FR 44015 - Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-729] Certain Semiconductor Products Made by... the sale within the United States after importation of certain semiconductor products made by advanced lithography techniques and products containing same by reason of infringement of certain claims of U.S. Patent...

  19. Advanced liner-cooling techniques for gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Riddlebaugh, S. M.

    1985-01-01

    Component research for advanced small gas turbine engines is currently underway at the NASA Lewis Research Center. As part of this program, a basic reverse-flow combustor geometry was maintained while different advanced liner wall cooling techniques were investigated. Performance and liner cooling effectiveness of the experimental combustor configuration featuring counter-flow film-cooled panels are presented and compared with two previously reported combustors featuring: splash film-cooled liner walls; and transpiration cooled liner walls (Lamilloy).

  20. Wallerian Degeneration Beyond the Corticospinal Tracts: Conventional and Advanced MRI Findings.

    PubMed

    Chen, Yin Jie; Nabavizadeh, Seyed Ali; Vossough, Arastoo; Kumar, Sunil; Loevner, Laurie A; Mohan, Suyash

    2017-05-01

    Wallerian degeneration (WD) is defined as progressive anterograde disintegration of axons and accompanying demyelination after an injury to the proximal axon or cell body. Since the 1980s and 1990s, conventional magnetic resonance imaging (MRI) sequences have been shown to be sensitive to changes of WD in the subacute to chronic phases. More recently, advanced MRI techniques, such as diffusion-weighted imaging (DWI) and diffusion tensor imaging (DTI), have demonstrated some of earliest changes attributed to acute WD, typically on the order of days. In addition, there is increasing evidence on the value of advanced MRI techniques in providing important prognostic information related to WD. This article reviews the utility of conventional and advanced MRI techniques for assessing WD, by focusing not only on the corticospinal tract but also other neural tracts less commonly thought of, including corticopontocerebellar tract, dentate-rubro-olivary pathway, posterior column of the spinal cord, corpus callosum, limbic circuit, and optic pathway. The basic anatomy of these neural pathways will be discussed, followed by a comprehensive review of existing literature supported by instructive clinical examples. The goal of this review is for readers to become more familiar with both conventional and advanced MRI findings of WD involving important neural pathways, as well as to illustrate increasing utility of advanced MRI techniques in providing important prognostic information for various pathologies. Copyright © 2016 by the American Society of Neuroimaging.
