Sample records for advanced statistical tools

  1. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    ERIC Educational Resources Information Center

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…

  2. GAPIT version 2: an enhanced integrated tool for genomic association and prediction

    USDA-ARS's Scientific Manuscript database

    Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...

  3. ADAPTATION OF THE ADVANCED STATISTICAL TRAJECTORY REGIONAL AIR POLLUTION (ASTRAP) MODEL TO THE EPA VAX COMPUTER - MODIFICATIONS AND TESTING

    EPA Science Inventory

    The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorou...

  4. OASIS 2: online application for survival analysis 2 with features for the analysis of maximal lifespan and healthspan in aging research.

    PubMed

    Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk

    2016-08-30

    Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
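
    The abstract does not specify which test OASIS 2 uses for maximal lifespan comparison, so the following Python sketch is only a generic stand-in: it compares the mean lifespan of the longest-lived 10% of two synthetic cohorts using a bootstrap confidence interval.

      # Illustrative sketch (not OASIS 2 itself): compare "maximal lifespan",
      # here operationalised as the mean lifespan of the longest-lived 10%,
      # between two synthetic cohorts using a bootstrap confidence interval.
      import numpy as np

      rng = np.random.default_rng(0)
      control   = rng.gamma(shape=8.0, scale=2.5, size=200)   # hypothetical lifespans (days)
      treatment = rng.gamma(shape=8.0, scale=3.0, size=200)

      def max_lifespan(x, top_fraction=0.1):
          k = max(1, int(len(x) * top_fraction))
          return np.sort(x)[-k:].mean()

      def bootstrap_diff(a, b, n_boot=5000):
          diffs = np.empty(n_boot)
          for i in range(n_boot):
              diffs[i] = (max_lifespan(rng.choice(b, size=len(b), replace=True))
                          - max_lifespan(rng.choice(a, size=len(a), replace=True)))
          return np.percentile(diffs, [2.5, 97.5])

      lo, hi = bootstrap_diff(control, treatment)
      print(f"observed difference: {max_lifespan(treatment) - max_lifespan(control):.2f}")
      print(f"95% bootstrap CI:    [{lo:.2f}, {hi:.2f}]")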

  5. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  6. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    …Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge…

  7. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  8. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students’ overall understanding and suggests better long-term knowledge retention. PMID:19750185

  9. Advance Directives

    MedlinePlus

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R.; Kaplan, A.

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm. PSD advances can be improved by using conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier’s receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
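
    As a rough, generic illustration of the ROC and gamma rejection rate (GRR) reporting recommended above (not the authors' code), this Python sketch sweeps a threshold over synthetic PSD scores and reads off the neutron acceptance at a chosen GRR.

      # Illustrative sketch: ROC-style threshold sweep for a binary PSD score and
      # the neutron acceptance at a chosen gamma rejection rate. Scores are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      gamma_scores   = rng.normal(0.0, 1.0, 10000)   # hypothetical PSD metric for gammas
      neutron_scores = rng.normal(2.0, 1.0, 2000)    # hypothetical PSD metric for neutrons

      thresholds = np.linspace(-4, 6, 500)
      # false positive rate = fraction of gammas above threshold
      fpr = [(gamma_scores > t).mean() for t in thresholds]
      # true positive rate = fraction of neutrons above threshold
      tpr = [(neutron_scores > t).mean() for t in thresholds]

      target_grr = 0.999                      # e.g. reject 99.9% of gammas
      ok = [i for i, f in enumerate(fpr) if (1 - f) >= target_grr]
      best = ok[0]                            # lowest threshold meeting the target GRR
      print(f"threshold={thresholds[best]:.2f}  gamma rejection={1 - fpr[best]:.4f}  "
            f"neutron acceptance={tpr[best]:.3f}")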

  11. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and helped match these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and the publication of papers of astrostatistical content.
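
    Two of the methods named in this abstract, the two-sample Kolmogorov-Smirnov test and chi-squared minimization, can be reproduced in a few lines of Python with SciPy; the sketch below uses synthetic data and is purely illustrative.

      # Illustrative sketch of two methods named in the abstract: a two-sample
      # Kolmogorov-Smirnov test and a chi-squared (weighted least-squares) model fit.
      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(2)

      # Kolmogorov-Smirnov: are two samples drawn from the same distribution?
      sample_a = rng.normal(0.0, 1.0, 300)
      sample_b = rng.normal(0.2, 1.0, 300)
      ks_stat, p_value = stats.ks_2samp(sample_a, sample_b)
      print(f"KS statistic={ks_stat:.3f}, p={p_value:.3f}")

      # chi-squared minimization: fit y = a * x**b with per-point errors sigma.
      x = np.linspace(1, 10, 20)
      sigma = 0.5 * np.ones_like(x)
      y = 2.0 * x**1.5 + rng.normal(0, 0.5, x.size)
      popt, pcov = optimize.curve_fit(lambda x, a, b: a * x**b, x, y,
                                      sigma=sigma, absolute_sigma=True, p0=[1.0, 1.0])
      chi2 = np.sum(((y - popt[0] * x**popt[1]) / sigma) ** 2)
      print(f"a={popt[0]:.2f}, b={popt[1]:.2f}, chi2/dof={chi2 / (x.size - 2):.2f}")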

  12. Big Data and Neuroimaging.

    PubMed

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  13. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone other than a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  14. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone other than a professional statistician to effectively utilise the tool. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  15. METABOLOMICS FOR DEVELOPING MARKERS OF CHEMICAL EXPOSURE AND DISTINGUISHING TOXICITY PATHWAYS

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset, stress, or chemical exposure. Nuclear Magnetic Resonance (NMR) spectroscopy-based meta...

  16. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  17. Estimating population diversity with CatchAll

    USDA-ARS's Scientific Manuscript database

    The massive quantity of data produced by next-generation sequencing has created a pressing need for advanced statistical tools, in particular for analysis of bacterial and phage communities. Here we address estimating the total diversity in a population – the species richness. This is an important s...
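
    CatchAll itself fits parametric mixture models to frequency-count data; as a simpler, hand-computable point of comparison, the sketch below applies the nonparametric Chao1 richness estimator to a hypothetical abundance list.

      # Illustrative sketch: the nonparametric Chao1 lower bound on species richness,
      # a simpler relative of the parametric models implemented in CatchAll.
      # hypothetical abundance list: number of reads observed for each species
      abundances = [1, 1, 1, 2, 2, 3, 5, 8, 13, 21, 40]

      s_obs = len(abundances)                       # observed species
      f1 = sum(1 for a in abundances if a == 1)     # singletons
      f2 = sum(1 for a in abundances if a == 2)     # doubletons

      # Chao1: S_obs + f1^2 / (2 * f2); bias-corrected form used when f2 == 0
      if f2 > 0:
          chao1 = s_obs + f1 * f1 / (2 * f2)
      else:
          chao1 = s_obs + f1 * (f1 - 1) / 2
      print(f"observed richness={s_obs}, Chao1 estimate={chao1:.1f}")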

  18. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shear, Trevor Allan

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  19. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

    Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data proved that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence the general understanding of the science content.

  20. Risk assessment model for development of advanced age-related macular degeneration.

    PubMed

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
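
    The two performance measures reported above, the C statistic and the Brier score, can be illustrated on synthetic data as follows; this sketch ignores the censoring handled by the study's Cox analysis and is not the authors' model.

      # Illustrative sketch of the two reported performance measures, computed on
      # synthetic data: C statistic (concordance of predicted risk with outcome)
      # and Brier score (mean squared error of predicted probabilities).
      import numpy as np

      rng = np.random.default_rng(3)
      risk = rng.uniform(0, 1, 500)                          # hypothetical predicted 5-year risk
      outcome = (rng.uniform(0, 1, 500) < risk).astype(int)  # simulated events

      def c_statistic(p, y):
          pos, neg = p[y == 1], p[y == 0]
          # fraction of (event, non-event) pairs where the event has the higher risk
          comparisons = pos[:, None] - neg[None, :]
          return ((comparisons > 0).sum() + 0.5 * (comparisons == 0).sum()) / comparisons.size

      brier = np.mean((risk - outcome) ** 2)
      print(f"C statistic={c_statistic(risk, outcome):.3f}, Brier score={brier:.3f}")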

  1. Personalizing oncology treatments by predicting drug efficacy, side-effects, and improved therapy: mathematics, statistics, and their integration.

    PubMed

    Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri

    2014-01-01

    Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the main stream of clinical oncology. © 2014 Wiley Periodicals, Inc.

  2. GAPIT: genome association and prediction integrated tool.

    PubMed

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
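
    For contrast with GAPIT's compressed mixed linear model, the sketch below runs a deliberately naive per-SNP regression scan on simulated genotypes; it omits the kinship and population-structure corrections that are the point of CMLM, and it is not GAPIT code.

      # Illustrative sketch only: a naive per-SNP linear-regression scan on simulated
      # data. GAPIT's compressed mixed linear model additionally corrects for kinship
      # and population structure, which this sketch deliberately omits.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n_ind, n_snp = 500, 1000
      genotypes = rng.binomial(2, 0.3, size=(n_ind, n_snp)).astype(float)  # 0/1/2 allele counts
      causal = 10
      phenotype = 0.5 * genotypes[:, causal] + rng.normal(0, 1, n_ind)

      p_values = np.empty(n_snp)
      for j in range(n_snp):
          slope, intercept, r, p, se = stats.linregress(genotypes[:, j], phenotype)
          p_values[j] = p

      top = int(np.argmin(p_values))
      print(f"most significant SNP: index {top}, p={p_values[top]:.2e} (true causal index {causal})")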

  3. The Emergence of Contextual Social Psychology.

    PubMed

    Pettigrew, Thomas F

    2018-07-01

    Social psychology experiences recurring so-called "crises." This article maintains that these episodes actually mark advances in the discipline; these "crises" have enhanced relevance and led to greater methodological and statistical sophistication. New statistical tools have allowed social psychologists to begin to achieve a major goal: placing psychological phenomena in their larger social contexts. This growing trend is illustrated with numerous recent studies; they demonstrate how cultures and social norms moderate basic psychological processes. Contextual social psychology is finally emerging.

  4. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida.

    DOT National Transportation Integrated Search

    2014-03-01

    Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) and SafetyAnalys...

  5. Advanced LIGO low-latency searches

    NASA Astrophysics Data System (ADS)

    Kanner, Jonah; LIGO Scientific Collaboration, Virgo Collaboration

    2016-06-01

    Advanced LIGO recently made the first detection of gravitational waves from merging binary black holes. The signal was first identified by a low-latency analysis, which identifies gravitational-wave transients within a few minutes of data collection. More generally, Advanced LIGO transients are sought with a suite of automated tools, which collectively identify events, evaluate statistical significance, estimate source position, and attempt to characterize source properties. This low-latency effort is enabling a broad multi-messenger approach to the science of compact object mergers and other transients. This talk will give an overview of the low-latency methodology and recent results.

  6. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    NASA Astrophysics Data System (ADS)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations such as complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable at a large scale: at a single wind farm, at regional or national level, and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will help increase wind integration on two levels: at an operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk for wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.

  7. Federal IPM Programs - National Site for the Regional IPM Centers

    Science.gov Websites

  8. Modelling for Prediction vs. Modelling for Understanding: Commentary on Musso et al. (2013)

    ERIC Educational Resources Information Center

    Edelsbrunner, Peter; Schneider, Michael

    2013-01-01

    Musso et al. (2013) predict students' academic achievement with high accuracy one year in advance from cognitive and demographic variables, using artificial neural networks (ANNs). They conclude that ANNs have high potential for theoretical and practical improvements in learning sciences. ANNs are powerful statistical modelling tools but they can…

  9. A Computational Study of the Energy Dissipation Through an Acrylic Target Impacted by Various Size FSP

    DTIC Science & Technology

    2009-06-01

    data, and then returns an array that describes the line. This function, when compared to the LOGEST statistical function of Microsoft Excel, which...threats continues to grow, the ability to predict materials performance using advanced modeling tools increases. The current paper has demonstrated

  10. A Biologically Informed Framework for the Analysis of the PPAR Signaling Pathway using a Bayesian Network

    EPA Science Inventory

    The US EPA’s ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...

  11. Corpora Processing and Computational Scaffolding for a Web-Based English Learning Environment: The CANDLE Project

    ERIC Educational Resources Information Center

    Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng

    2006-01-01

    This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English…

  12. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
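
    In the spirit of the statistical decomposition described above, the following toy Python sketch splits synthetic ILD-thickness measurements into a die-to-die component and a within-die residual; it is an assumption-laden illustration, not the authors' methodology.

      # Illustrative sketch: decompose synthetic ILD-thickness measurements into a
      # die-to-die (wafer-level) component and a within-die (pattern-dependent plus
      # random) residual, in the spirit of the decomposition described in the abstract.
      import numpy as np

      rng = np.random.default_rng(5)
      n_die, n_site = 20, 50
      die_offset = rng.normal(0, 30, size=(n_die, 1))         # nm, die-to-die variation
      pattern    = np.linspace(-20, 20, n_site)                # nm, systematic within-die trend
      noise      = rng.normal(0, 5, size=(n_die, n_site))      # nm, random component
      thickness  = 800 + die_offset + pattern + noise

      die_means  = thickness.mean(axis=1, keepdims=True)       # estimate of die-level component
      within_die = thickness - die_means                       # systematic + random within-die
      print(f"die-to-die std: {die_means.std():.1f} nm")
      print(f"within-die std: {within_die.std():.1f} nm")
      print(f"within-die systematic range: {np.ptp(within_die.mean(axis=0)):.1f} nm")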

  13. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.
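
    The FDR estimation and filtering mentioned above can be illustrated with a generic Benjamini-Hochberg adjustment; the sketch below is not SWATH2stats' implementation (which is in R), just the standard procedure written out in Python.

      # Generic illustration (not SWATH2stats code): Benjamini-Hochberg adjustment
      # of a vector of p-values, as used when filtering results by FDR.
      import numpy as np

      def benjamini_hochberg(p):
          p = np.asarray(p, dtype=float)
          order = np.argsort(p)
          ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
          # enforce monotonicity from the largest p-value downwards
          adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
          out = np.empty_like(p)
          out[order] = np.clip(adjusted, 0, 1)
          return out

      p_values = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
      print(np.round(benjamini_hochberg(p_values), 3))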

  14. EpiModel: An R Package for Mathematical Modeling of Infectious Disease over Networks.

    PubMed

    Jenness, Samuel M; Goodreau, Steven M; Morris, Martina

    2018-04-01

    Package EpiModel provides tools for building, simulating, and analyzing mathematical models for the population dynamics of infectious disease transmission in R. Several classes of models are included, but the unique contribution of this software package is a general stochastic framework for modeling the spread of epidemics on networks. EpiModel integrates recent advances in statistical methods for network analysis (temporal exponential random graph models) that allow the epidemic modeling to be grounded in empirical data on contacts that can spread infection. This article provides an overview of both the modeling tools built into EpiModel, designed to facilitate learning for students new to modeling, and the application programming interface for extending package EpiModel, designed to facilitate the exploration of novel research questions for advanced modelers.
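
    EpiModel is an R package built around temporal ERGMs; purely as a toy illustration of the underlying idea, the Python sketch below runs a discrete-time stochastic SIR epidemic on a fixed random contact network, which is a far simpler setting than the package supports.

      # Toy sketch of the idea behind network epidemic modeling: a discrete-time
      # stochastic SIR process on a fixed random contact network. EpiModel itself
      # is an R package and models dynamic networks with temporal ERGMs.
      import numpy as np

      rng = np.random.default_rng(6)
      n, p_edge, p_infect, p_recover, steps = 200, 0.03, 0.1, 0.05, 100

      adj = rng.random((n, n)) < p_edge
      adj = np.triu(adj, 1); adj = adj | adj.T          # symmetric, no self-loops

      state = np.zeros(n, dtype=int)                    # 0=S, 1=I, 2=R
      state[rng.choice(n, 5, replace=False)] = 1        # seed infections

      for t in range(steps):
          infected = np.where(state == 1)[0]
          # susceptible neighbours of infected nodes may become infected
          exposed = np.where(adj[infected].any(axis=0) & (state == 0))[0]
          new_inf = exposed[rng.random(exposed.size) < p_infect]
          state[infected[rng.random(infected.size) < p_recover]] = 2
          state[new_inf] = 1

      print(f"final sizes: S={np.sum(state==0)}, I={np.sum(state==1)}, R={np.sum(state==2)}")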

  15. EpiModel: An R Package for Mathematical Modeling of Infectious Disease over Networks

    PubMed Central

    Jenness, Samuel M.; Goodreau, Steven M.; Morris, Martina

    2018-01-01

    Package EpiModel provides tools for building, simulating, and analyzing mathematical models for the population dynamics of infectious disease transmission in R. Several classes of models are included, but the unique contribution of this software package is a general stochastic framework for modeling the spread of epidemics on networks. EpiModel integrates recent advances in statistical methods for network analysis (temporal exponential random graph models) that allow the epidemic modeling to be grounded in empirical data on contacts that can spread infection. This article provides an overview of both the modeling tools built into EpiModel, designed to facilitate learning for students new to modeling, and the application programming interface for extending package EpiModel, designed to facilitate the exploration of novel research questions for advanced modelers. PMID:29731699

  16. Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments

    PubMed Central

    Russo, Francesco; Righelli, Dario

    2016-01-01

    We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists to handle and analyse large data collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human readable report, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414

  17. Quantification of Operational Risk Using A Data Mining

    NASA Technical Reports Server (NTRS)

    Perera, J. Sebastian

    1999-01-01

    What is Data Mining? - Data Mining is the process of finding actionable information hidden in raw data. - Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data - Typically, automated software tools based on advanced statistical analysis and data modeling technology can be utilized to automate the data mining process

  18. Construct Validity in TOEFL iBT Speaking Tasks: Insights from Natural Language Processing

    ERIC Educational Resources Information Center

    Kyle, Kristopher; Crossley, Scott A.; McNamara, Danielle S.

    2016-01-01

    This study explores the construct validity of speaking tasks included in the TOEFL iBT (e.g., integrated and independent speaking tasks). Specifically, advanced natural language processing (NLP) tools, MANOVA difference statistics, and discriminant function analyses (DFA) are used to assess the degree to which and in what ways responses to these…

  19. Effect Size Measures for Mediation Models: Quantitative Strategies for Communicating Indirect Effects

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Kelley, Ken

    2011-01-01

    The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those…

  20. Advances in molecular labeling, high throughput imaging and machine intelligence portend powerful functional cellular biochemistry tools.

    PubMed

    Price, Jeffrey H; Goodacre, Angela; Hahn, Klaus; Hodgson, Louis; Hunter, Edward A; Krajewski, Stanislaw; Murphy, Robert F; Rabinovich, Andrew; Reed, John C; Heynen, Susanne

    2002-01-01

    Cellular behavior is complex. Successfully understanding systems at ever-increasing complexity is fundamental to advances in modern science and unraveling the functional details of cellular behavior is no exception. We present a collection of prospectives to provide a glimpse of the techniques that will aid in collecting, managing and utilizing information on complex cellular processes via molecular imaging tools. These include: 1) visualizing intracellular protein activity with fluorescent markers, 2) high throughput (and automated) imaging of multilabeled cells in statistically significant numbers, and 3) machine intelligence to analyze subcellular image localization and pattern. Although not addressed here, the importance of combining cell-image-based information with detailed molecular structure and ligand-receptor binding models cannot be overlooked. Advanced molecular imaging techniques have the potential to impact cellular diagnostics for cancer screening, clinical correlations of tissue molecular patterns for cancer biology, and cellular molecular interactions for accelerating drug discovery. The goal of finally understanding all cellular components and behaviors will be achieved by advances in both instrumentation engineering (software and hardware) and molecular biochemistry. Copyright 2002 Wiley-Liss, Inc.

  1. International experience on the use of artificial neural networks in gastroenterology.

    PubMed

    Grossi, E; Mancini, A; Buscema, M

    2007-03-01

    In this paper, we reconsider the scientific background for the use of artificial intelligence tools in medicine. A review of some recent significant papers shows that artificial neural networks, the more advanced and effective artificial intelligence technique, can improve the classification accuracy and survival prediction of a number of gastrointestinal diseases. We discuss the 'added value' the use of artificial neural networks-based tools can bring in the field of gastroenterology, both at research and clinical application level, when compared with traditional statistical or clinical-pathological methods.

  2. COLLABORATIVE RESEARCH:USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  3. Interpretation of statistical results.

    PubMed

    García Garmendia, J L; Maroto Monserrat, F

    2018-02-21

    The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos of nature into measurable parameters that are applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for the funders of research, and for professionals who require continual updating based on good evidence and support for decision making. Various aspects of study design, results and statistical analysis are reviewed, with the aim of facilitating their comprehension, from the basics to what is most common but not always well understood, and offering a constructive, non-exhaustive but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  4. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
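
    PopSc's distinguishing feature is that it accepts intermediate metadata such as allele frequencies directly; the sketch below illustrates two statistics of the kind it covers (expected heterozygosity and Wright's F_ST for a biallelic locus in two populations) and is not PopSc code.

      # Illustrative sketch (not PopSc code): statistics computed directly from
      # allele frequencies for a biallelic locus observed in two populations.
      p1, p2 = 0.70, 0.40            # frequency of allele A in population 1 and 2

      # expected heterozygosity within each population
      h1 = 2 * p1 * (1 - p1)
      h2 = 2 * p2 * (1 - p2)
      h_s = (h1 + h2) / 2            # mean within-population heterozygosity

      p_bar = (p1 + p2) / 2
      h_t = 2 * p_bar * (1 - p_bar)  # total expected heterozygosity

      fst = (h_t - h_s) / h_t        # Wright's F_ST: differentiation between populations
      print(f"H1={h1:.3f}  H2={h2:.3f}  HS={h_s:.3f}  HT={h_t:.3f}  FST={fst:.3f}")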

  5. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.

  6. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

    Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes the CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in video tapes and registered in simulation files. Results indicate no statistically significant difference between empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  7. Advanced functional network analysis in the geosciences: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

    Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn allows to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined drawing on several examples from climatology.
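
    The sketch below is not the pyunicorn API; it only illustrates the basic functional-network construction that the package automates: threshold a correlation matrix of (here synthetic) time series and compute node degrees.

      # Bare-bones sketch (not the pyunicorn API): build a functional network from
      # synthetic "climate" time series by thresholding the correlation matrix,
      # then compute the degree of each node.
      import numpy as np

      rng = np.random.default_rng(7)
      n_nodes, n_time = 50, 500
      common = rng.normal(size=n_time)                       # shared large-scale signal
      series = 0.4 * common + rng.normal(size=(n_nodes, n_time))

      corr = np.corrcoef(series)                             # node-by-node correlation matrix
      threshold = 0.2
      adjacency = (np.abs(corr) > threshold) & ~np.eye(n_nodes, dtype=bool)
      degree = adjacency.sum(axis=1)
      print(f"edge density={adjacency.sum() / (n_nodes * (n_nodes - 1)):.2f}, "
            f"mean degree={degree.mean():.1f}")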

  8. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2012-08-01

    Carin (Duke University); Douglas Oldenburg (University of British Columbia); Stephen Billings, Leonard Pasion, Laurens Beran (Sky Research)

    ...data processing for UXO discrimination is the time (or frequency) dependent dipole model (Bell and Barrow (2001), Pasion and Oldenburg (2001), Zhang...described by a bimodal distribution (i.e. two Gaussians, see Pasion (2007)). Data features are nonetheless useful when data quality is not sufficient...

  9. DRIVE: Drive-Cycle Rapid Investigation, Visualization, and Evaluation

    Science.gov Websites

    ...specialized statistical clustering methods. The duration of these representative drive cycles, which aim to... DRIVE can benefit a variety of users. For example: Fleet managers can use the tool to make educated investment decisions by determining, in advance, the payback period for a given technology. Vehicle...

  10. AP® Potential Predicted by PSAT/NMSQT® Scores Using Logistic Regression. Statistical Report 2014-1

    ERIC Educational Resources Information Center

    Zhang, Xiuyuan; Patel, Priyank; Ewing, Maureen

    2014-01-01

    AP Potential™ is an educational guidance tool that uses PSAT/NMSQT® scores to identify students who have the potential to do well on one or more Advanced Placement® (AP®) Exams. Students identified as having AP potential, perhaps students who would not have been otherwise identified, should consider enrolling in the corresponding AP course if they…
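
    The report's actual model and coefficients are not reproduced here; the sketch below simply shows the general technique, a logistic regression of AP exam success on a PSAT-style composite score, fitted to simulated data with scikit-learn.

      # Generic sketch with simulated data (not the report's model or coefficients):
      # logistic regression of AP exam success on an assumed PSAT-style composite score.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(8)
      psat = rng.normal(150, 30, 2000).clip(60, 240)           # assumed composite score scale
      logit = -8 + 0.05 * psat                                 # assumed true relationship
      success = (rng.uniform(size=psat.size) < 1 / (1 + np.exp(-logit))).astype(int)

      model = LogisticRegression(max_iter=1000).fit(psat.reshape(-1, 1), success)
      for score in (100, 150, 200):
          p = model.predict_proba([[score]])[0, 1]
          print(f"composite {score}: estimated P(AP success) = {p:.2f}")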

  11. The discriminatory capability of existing scores to predict advanced colorectal neoplasia: a prospective colonoscopy study of 5,899 screening participants.

    PubMed

    Wong, Martin C S; Ching, Jessica Y L; Ng, Simpson; Lam, Thomas Y T; Luk, Arthur K C; Wong, Sunny H; Ng, Siew C; Ng, Simon S M; Wu, Justin C Y; Chan, Francis K L; Sung, Joseph J Y

    2016-02-03

    We evaluated the performance of seven existing risk scoring systems in predicting advanced colorectal neoplasia in an asymptomatic Chinese cohort. We prospectively recruited 5,899 Chinese subjects aged 50-70 years in a colonoscopy screening programme (2008-2014). Scoring systems under evaluation included two scoring tools from the US; one each from Spain, Germany, and Poland; the Korean Colorectal Screening (KCS) scores; and the modified Asia Pacific Colorectal Screening (APCS) scores. The c-statistics, sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs) of these systems were evaluated. The resources required were estimated based on the Number Needed to Screen (NNS) and the Number Needed to Refer for colonoscopy (NNR). Advanced neoplasia was detected in 364 (6.2%) subjects. The German system referred the lowest proportion of subjects (11.2%) for colonoscopy, whilst the KCS scoring system referred the highest (27.4%). The c-statistics of all systems ranged from 0.56-0.65, with sensitivities ranging from 0.04-0.44 and specificities from 0.74-0.99. The modified APCS scoring system had the highest c-statistics (0.65, 95% C.I. 0.58-0.72). The NNS (12-19) and NNR (5-10) were similar among the scoring systems. The existing scoring systems have variable capability to predict advanced neoplasia among asymptomatic Chinese subjects, and further external validation should be performed.
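
    The screening metrics reported above follow directly from a 2x2 table; the sketch below computes them for hypothetical score-positive and score-negative counts (only the totals, 5,899 screened and 364 with advanced neoplasia, are taken from the abstract; the split by score result is made up).

      # Illustrative sketch: screening metrics from a 2x2 table. Totals match the
      # abstract (5,899 screened, 364 with advanced neoplasia); the split by score
      # result is hypothetical.
      tp, fp = 120, 980        # score-positive: with / without advanced neoplasia
      fn, tn = 244, 4555       # score-negative: with / without advanced neoplasia
      n = tp + fp + fn + tn

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)
      nns = n / (tp + fn)      # one common definition: subjects screened per case present
      nnr = (tp + fp) / tp     # colonoscopy referrals per advanced neoplasia found
      print(f"sens={sensitivity:.2f} spec={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
      print(f"NNS={nns:.1f}  NNR={nnr:.1f}")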

  12. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide mass quantities of complex data, and drawing conclusions from these forecasts is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real-time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. This tool compiles and stores data into HTML pages that allows operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.

  13. Meteorological satellite data: A tool to describe the health of the world's agriculture

    NASA Technical Reports Server (NTRS)

    Gray, T. I., Jr.; Mccrary, D. G. (Principal Investigator); Scott, L.

    1981-01-01

    Local area coverage (LAC) data acquired aboard the TIROS-N satellite family by the advanced very high resolution radiometer systems were examined to determine their agricultural information content. Albedo differences between channel 2 and channel 1 of the advanced very high resolution radiometer LAC (called EVI) are shown to be closely correlated with the Ashburn vegetative index produced from LANDSAT multispectral scanner data, which has been shown to vary in response to "greenness", soil moisture, and crop production. The statistical correlation between the EVI and the Ashburn Vegetative Index (+ or - 1 deg) is 0.86.
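
    A minimal sketch (synthetic albedo values, not the TIROS-N data) of the channel-difference index and the kind of correlation reported:

    ```python
    # Minimal sketch (synthetic values): an EVI-style index as the difference
    # between AVHRR channel-2 (near-IR) and channel-1 (visible) albedos, and its
    # Pearson correlation with a reference vegetation index.
    import numpy as np

    rng = np.random.default_rng(1)
    ch1 = rng.uniform(0.05, 0.20, size=200)          # visible-band albedo
    ch2 = ch1 + rng.uniform(0.00, 0.30, size=200)    # near-IR albedo
    evi_like = ch2 - ch1                             # channel-difference index
    reference_vi = 0.9 * evi_like + rng.normal(0, 0.02, size=200)

    r = np.corrcoef(evi_like, reference_vi)[0, 1]
    print(f"Pearson correlation: {r:.2f}")
    ```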

  14. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2010-12-01

    Performing organizations: Carin (Duke University); Oldenburg (University of British Columbia); Billings, Pasion, and Beran (Sky Research). Only fragments of the report text survive in this record: dipole polarizabilities were fit with a Pasion-Oldenburg parameterization, class means and covariances were estimated for statistical classification [5], and the work cites "...model for unexploded ordnance classification with EMI data," IEEE Geosci. Remote Sensing Letters, vol. 4, pp. 629-633, 2007.

  15. Detecting differential DNA methylation from sequencing of bisulfite converted DNA of diverse species.

    PubMed

    Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V

    2017-07-21

    DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
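
    As a minimal sketch (not one of the reviewed tools), per-locus differential methylation testing on bisulfite read counts can be illustrated with a Fisher's exact test; published DML callers add dispersion modeling, coverage filters, and multiple-testing correction on top of this idea:

    ```python
    # Minimal sketch (made-up counts): per-CpG differential methylation test on
    # bisulfite-sequencing read counts using Fisher's exact test.
    from scipy.stats import fisher_exact

    # (methylated, unmethylated) read counts at one CpG in two conditions
    cond_a = (45, 15)
    cond_b = (20, 40)

    table = [[cond_a[0], cond_a[1]],
             [cond_b[0], cond_b[1]]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
    # Real DML callers also apply a multiple-testing correction (e.g.,
    # Benjamini-Hochberg) across all tested CpGs.
    ```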

  16. The Victor C++ library for protein representation and advanced manipulation.

    PubMed

    Hirsh, Layla; Piovesan, Damiano; Giollo, Manuel; Ferrari, Carlo; Tosatto, Silvio C E

    2015-04-01

    Protein sequence and structure representation and manipulation require dedicated software libraries to support methods of increasing complexity. Here, we describe the VIrtual Construction TOol for pRoteins (Victor) C++ library, an open source platform dedicated to enabling inexperienced users to develop advanced tools and gathering contributions from the community. The provided application examples cover statistical energy potentials, profile-profile sequence alignments and ab initio loop modeling. Victor was used over the last 15 years in several publications and optimized for efficiency. It is provided as a GitHub repository with source files and unit tests, plus extensive online documentation, including a Wiki with help files and tutorials, examples and Doxygen documentation. The C++ library and online documentation, distributed under a GPL license, are available from URL: http://protein.bio.unipd.it/victor/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Concordance in the Assessment of Effectiveness of Palliative Care between Patients and Palliative Care Nurses in Malaysia: A Study with the Palliative Care Outcome Scale

    PubMed Central

    Koh, Kwee Choy; Gupta, Esha Das; Poovaneswaran, Sangeetha; Then, Siaw Ling; Teo, Michelle Jia Jui; Gan, Teik Yiap; Thing, Joanne Hwei Yean

    2017-01-01

    Context: The Palliative Care Outcome Scale (POS) is an easy-to-use assessment tool to evaluate the effectiveness of palliative care. There is no published literature on the use of POS as an assessment tool in Malaysia. Aim: To define the concordance in the assessment of quality of life between patients with advanced cancers and their palliative care nurses using a Malay version of the POS. Settings and Design: This study was conducted in the palliative care unit of the Hospital Tuanku Ja'afar Seremban, Malaysia, from February 2014 to June 2014. Subjects and Methods: We adapted and validated the English version of the 3-day recall POS into Malay and used it to define the concordance in the assessment of quality of life between patients and palliative care nurses. Forty patients with advanced stage cancers and forty palliative care nurses completed the Malay POS questionnaire. Statistical Analysis Used: The kappa statistic was used to assess agreement between patients and their palliative care nurses. Results: Slight to fair concordance was found in all items, except for one item (family anxiety) where there was no agreement. Conclusions: The Malay version of the POS was well accepted and reliable as an assessment tool for evaluation of the effectiveness of palliative care in Malaysia. Slight to fair concordance was shown between the patients and their palliative care nurses, suggesting the need for more training of the nurses. PMID:28216862
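
    A minimal sketch (illustrative ratings, not the study data) of the agreement statistic used here:

    ```python
    # Minimal sketch (made-up scores): patient-nurse agreement on one POS item
    # measured with Cohen's kappa.
    from sklearn.metrics import cohen_kappa_score

    patient_scores = [0, 1, 2, 2, 3, 1, 0, 4, 2, 1]   # one POS item, 10 dyads
    nurse_scores   = [0, 1, 1, 2, 2, 1, 1, 3, 2, 0]

    kappa = cohen_kappa_score(patient_scores, nurse_scores)
    # Landis-Koch benchmarks: 0-0.20 slight, 0.21-0.40 fair agreement.
    # For ordinal items a weighted kappa is an option: weights="quadratic".
    print(f"Cohen's kappa = {kappa:.2f}")
    ```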

  18. Tracking Changes in Cardiac Output: Statistical Considerations on the 4-Quadrant Plot and the Polar Plot Methodology.

    PubMed

    Saugel, Bernd; Grothe, Oliver; Wagner, Julia Y

    2015-08-01

    When comparing 2 technologies for measuring hemodynamic parameters with regard to their ability to track changes, 2 graphical tools are omnipresent in the literature: the 4-quadrant plot and the polar plot recently proposed by Critchley et al. The polar plot is thought to be the more advanced statistical tool, but care should be taken when it comes to its interpretation. The polar plot excludes possibly important measurements from the data. The polar plot transforms the data nonlinearly, which may obscure what the data actually show. In this article, we compare the 4-quadrant and the polar plot in detail and thoroughly describe the advantages and limitations of each. We also discuss pitfalls of each method to prepare the researcher for their sound use. Finally, we briefly revisit the Bland-Altman plot for use in this context.
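
    A minimal sketch (synthetic paired changes, not clinical data) of the 4-quadrant plot and the concordance rate typically read from it; the 0.5 L/min exclusion zone is an assumed, commonly used value:

    ```python
    # Minimal sketch (synthetic data): 4-quadrant plot of paired changes in cardiac
    # output from a reference and a test monitor, with a simple concordance rate.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    delta_ref = rng.normal(0, 1.0, 80)                 # change measured by reference
    delta_test = delta_ref + rng.normal(0, 0.5, 80)    # change measured by test device

    exclusion = 0.5  # L/min central exclusion zone (assumed convention)
    keep = (np.abs(delta_ref) > exclusion) | (np.abs(delta_test) > exclusion)
    concordance = np.mean(np.sign(delta_ref[keep]) == np.sign(delta_test[keep]))
    print(f"Concordance rate: {100 * concordance:.1f}%")

    plt.scatter(delta_ref, delta_test, s=12)
    plt.axhline(0, color="grey"); plt.axvline(0, color="grey")
    plt.xlabel("Change in CO, reference"); plt.ylabel("Change in CO, test device")
    plt.title("4-quadrant plot (illustrative)")
    plt.show()
    ```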

  19. Recent advances in mathematical criminology. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Rodríguez, Nancy

    2015-03-01

    The use of mathematical tools has long proved to be useful in gaining understanding of complex systems in physics [1]. Recently, many researchers have realized that there is an analogy between emerging phenomena in complex social systems and complex physical or biological systems [4,5,12]. This realization has particularly benefited the modeling and understanding of crime, a ubiquitous phenomenon that is far from being understood. In fact, when one is interested in the bulk behavior of patterns that emerge from small and seemingly unrelated interactions as well as decisions that occur at the individual level, the mathematical tools that have been developed in statistical physics, game theory, network theory, dynamical systems, and partial differential equations can be useful in shedding light on the dynamics of these patterns [2-4,6,12].

  20. Statistical strategy for anisotropic adventitia modelling in IVUS.

    PubMed

    Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia

    2006-06-01

    Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. The difficulty of defining vessel border descriptors, together with shadows, artifacts, and the blurred signal response due to ultrasound physical properties, hampers automated adventitia segmentation. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.
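
    A minimal sketch (generic Perona-Malik diffusion, not the paper's specific operators) of the kind of edge-preserving anisotropic filtering often applied before segmenting noisy ultrasound images:

    ```python
    # Minimal sketch: Perona-Malik anisotropic diffusion, an edge-preserving
    # smoothing step commonly used on noisy images prior to segmentation.
    import numpy as np

    def perona_malik(img, n_iter=20, kappa=20.0, gamma=0.15):
        u = img.astype(float).copy()
        for _ in range(n_iter):
            # finite differences toward the four neighbours
            dn = np.roll(u, -1, axis=0) - u
            ds = np.roll(u, 1, axis=0) - u
            de = np.roll(u, -1, axis=1) - u
            dw = np.roll(u, 1, axis=1) - u
            # conduction coefficients: small across strong edges, large in flat areas
            cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
            ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
            u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
        return u

    noisy = np.random.default_rng(3).normal(0, 10, (64, 64)) + 100
    smoothed = perona_malik(noisy)
    ```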

  1. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
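
    A minimal sketch (toy effect sizes, not Meta-Essentials itself) of the random-effects pooling the abstract names: the DerSimonian-Laird tau-squared estimate combined with a Knapp-Hartung-style standard error and t-based confidence interval:

    ```python
    # Minimal sketch (made-up studies): DerSimonian-Laird random-effects pooling
    # with a Knapp-Hartung adjusted confidence interval.
    import numpy as np
    from scipy import stats

    y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])    # study effect sizes
    v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])    # their sampling variances

    w = 1 / v
    mu_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fixed) ** 2)
    k = len(y)
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)               # DerSimonian-Laird estimate

    w_star = 1 / (v + tau2)
    mu_re = np.sum(w_star * y) / np.sum(w_star)
    se_hk = np.sqrt(np.sum(w_star * (y - mu_re) ** 2) / ((k - 1) * np.sum(w_star)))
    t_crit = stats.t.ppf(0.975, df=k - 1)            # Knapp-Hartung uses a t reference
    print(f"pooled = {mu_re:.3f}, 95% CI = ({mu_re - t_crit * se_hk:.3f}, "
          f"{mu_re + t_crit * se_hk:.3f})")
    ```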

  2. Modeling to Optimize Terminal Stem Cell Differentiation

    PubMed Central

    Gallicano, G. Ian

    2013-01-01

    Embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs), and adult stem cells (ASCs) are among the most promising potential treatments for heart failure, spinal cord injury, neurodegenerative diseases, and diabetes. However, considerable uncertainty in the production of ESC-derived terminally differentiated cell types has limited the efficiency of their development. To address this uncertainty, we and other investigators have begun to employ a comprehensive statistical model of ESC differentiation for determining the role of intracellular pathways (e.g., STAT3) in ESC differentiation and determination of germ layer fate. The approach discussed here applies Bayesian statistical modeling to cell/developmental biology, combining traditional flow cytometry methodology and specific morphological observations with advanced statistical and probabilistic modeling and experimental design. The final result of this study is a unique tool and model that enhances the understanding of how and when specific cell fates are determined during differentiation. This model provides a guideline for increasing the production efficiency of therapeutically viable ESC/iPSC/ASC-derived neurons or any other cell type and will eventually lead to advances in stem cell therapy. PMID:24278782

  3. Recent Progress in the Development of Metabolome Databases for Plant Systems Biology

    PubMed Central

    Fukushima, Atsushi; Kusano, Miyako

    2013-01-01

    Metabolomics has grown greatly as a functional genomics tool, and has become an invaluable diagnostic tool for biochemical phenotyping of biological systems. Over the past decades, a number of databases involving information related to mass spectra, compound names and structures, statistical/mathematical models and metabolic pathways, and metabolite profile data have been developed. Such databases complement each other and support efficient growth in this area, although the data resources remain scattered across the World Wide Web. Here, we review available metabolome databases and summarize the present status of development of related tools, particularly focusing on the plant metabolome. The data sharing discussed here will pave the way for the robust interpretation of metabolomic data and advances in plant systems biology. PMID:23577015

  4. A proof of concept for epidemiological research using structured reporting with pulmonary embolism as a use case.

    PubMed

    Daniel, Pinto Dos Santos; Sonja, Scheibl; Gordon, Arnhold; Aline, Maehringer-Kunz; Christoph, Düber; Peter, Mildenberger; Roman, Kloeckner

    2018-05-10

    This paper studies the possibilities of an integrated IT-based workflow for epidemiological research in pulmonary embolism using freely available tools and structured reporting. We included a total of 521 consecutive cases that had been referred to the radiology department for computed tomography pulmonary angiography (CTPA) with suspected pulmonary embolism (PE). Free-text reports were transformed into structured reports using a freely available IHE-MRRT-compliant reporting platform. D-dimer values were retrieved from the hospital's laboratory results system. All information was stored in the platform's database and visualized using freely available tools. For further analysis, we directly accessed the platform's database with an advanced analytics tool (RapidMiner). We were able to develop an integrated workflow for epidemiological statistics from reports obtained in clinical routine. The report data allowed for automated calculation of epidemiological parameters. Prevalence of pulmonary embolism was 27.6%. The mean age in patients with and without pulmonary embolism did not differ (62.8 years and 62.0 years, respectively, p=0.987). As expected, there was a significant difference in mean D-dimer values (10.13 mg/L FEU and 3.12 mg/L FEU, respectively, p<0.001). Structured reporting can make data obtained from clinical routine more accessible. Designing practical workflows is feasible using freely available tools and allows for the calculation of epidemiological statistics on a near real-time basis. Therefore, radiologists should push for the implementation of structured reporting in clinical routine. Advances in knowledge: Theoretical benefits of structured reporting have long been discussed, but practical implementation demonstrating those benefits has been lacking. Here we present a first experience providing proof that structured reporting will make data from clinical routine more accessible.
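
    A minimal sketch (synthetic records and hypothetical column names, not the authors' database) of how prevalence and simple group comparisons can be computed once report data are structured:

    ```python
    # Minimal sketch (synthetic data, hypothetical columns): epidemiological
    # statistics from a structured-report database export.
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(4)
    n = 500
    df = pd.DataFrame({
        "pe_present": rng.random(n) < 0.28,            # structured PE finding
        "age": rng.normal(62, 12, n).round(),
        "d_dimer": rng.lognormal(0.8, 0.9, n),         # mg/L FEU
    })
    df.loc[df["pe_present"], "d_dimer"] *= 3           # PE cases run higher

    prevalence = df["pe_present"].mean()
    _, p_age = stats.ttest_ind(df.loc[df.pe_present, "age"],
                               df.loc[~df.pe_present, "age"], equal_var=False)
    _, p_dd = stats.mannwhitneyu(df.loc[df.pe_present, "d_dimer"],
                                 df.loc[~df.pe_present, "d_dimer"])
    print(f"prevalence = {prevalence:.1%}, age p = {p_age:.3f}, D-dimer p = {p_dd:.2g}")
    ```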

  5. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  6. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  7. Evaluation of the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing acute and chronic wounds.

    PubMed

    Choi, Edmond P H; Chin, Weng Yee; Wan, Eric Y F; Lam, Cindy L K

    2016-05-01

    To examine the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing the healing progress in acute and chronic wounds. It is important to establish the responsiveness of instruments used in conducting wound care assessments to ensure that they are able to capture changes in wound healing accurately over time. Prospective longitudinal observational study. The key study instrument was the PUSH tool. Internal responsiveness was assessed using paired t-testing and effect size statistics. External responsiveness was assessed using multiple linear regression. All new patients with at least one eligible acute or chronic wound, enrolled in the Nurse and Allied Health Clinic-Wound Care programme between 1 December 2012 and 31 March 2013, were included for analysis (N = 541). Overall, the PUSH tool was able to detect statistically significant changes in wound healing between baseline and discharge. The effect size statistics were large. The internal responsiveness of the PUSH tool was confirmed in patients with a variety of different wound types including venous ulcers, pressure ulcers, neuropathic ulcers, burns and scalds, skin tears, surgical wounds and traumatic wounds. After controlling for age, gender and wound type, subjects in the 'wound improved but not healed' group had a smaller change in PUSH scores than those in the 'wound healed' group. Subjects in the 'wound static or worsened' group had the smallest change in PUSH scores. The external responsiveness was confirmed. The internal and external responsiveness of the PUSH tool confirmed that it can be used to track the healing progress of both acute and chronic wounds. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
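
    A minimal sketch (made-up scores, not the study data) of the internal-responsiveness calculation described, a paired t-test plus a standardized effect size on baseline versus discharge totals:

    ```python
    # Minimal sketch (simulated PUSH totals): paired t-test and standardized
    # response mean for baseline vs. discharge scores.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    baseline = rng.normal(12, 3, 60)
    discharge = baseline - rng.normal(5, 2, 60)        # healing lowers PUSH scores

    t_stat, p_value = stats.ttest_rel(baseline, discharge)
    diff = baseline - discharge
    srm = diff.mean() / diff.std(ddof=1)               # standardized response mean
    print(f"t = {t_stat:.2f}, p = {p_value:.2g}, effect size = {srm:.2f}")
    ```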

  8. Statistical Study of Soviet Nuclear Explosions: Data, Results, and Software Tools

    DTIC Science & Technology

    1993-11-05

    Report documentation fragments only: prepared for Kirtland AFB, NM, and monitored by the Advanced Research Projects Agency, Nuclear Monitoring Research Office (Attn. Dr. Alan Ryall, Jr.), Arlington, VA. The surviving text also references nuclear explosions for peaceful purposes (I. D. Morokhov, Ed., Atomizdat, Moscow; LLL Report UCRL-Trans-10517, pp. 79-109) and work by O. W. Nuttli.

  9. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of this type of study has increased exponentially, but the results are not always reproducible due to experimental design issues, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them are specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
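
    A minimal sketch (toy genotype counts, not the MetaGenyo implementation) of the Hardy-Weinberg equilibrium check that such workflows typically apply to control genotypes before pooling studies:

    ```python
    # Minimal sketch (made-up control counts): chi-square test of Hardy-Weinberg
    # equilibrium for a biallelic variant.
    from scipy.stats import chi2

    obs = {"AA": 420, "Aa": 460, "aa": 120}
    n = sum(obs.values())
    p = (2 * obs["AA"] + obs["Aa"]) / (2 * n)             # frequency of allele A
    exp = {"AA": n * p**2, "Aa": 2 * n * p * (1 - p), "aa": n * (1 - p)**2}

    chi2_stat = sum((obs[g] - exp[g]) ** 2 / exp[g] for g in obs)
    p_value = chi2.sf(chi2_stat, df=1)                    # 3 classes - 1 - 1 estimated parameter
    print(f"chi2 = {chi2_stat:.2f}, HWE p = {p_value:.3f}")
    ```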

  10. The clinical value of large neuroimaging data sets in Alzheimer's disease.

    PubMed

    Toga, Arthur W

    2012-02-01

    Rapid advances in neuroimaging and cyberinfrastructure technologies have brought explosive growth in the Web-based warehousing, availability, and accessibility of imaging data on a variety of neurodegenerative and neuropsychiatric disorders and conditions. There has been a prolific development and emergence of complex computational infrastructures that serve as repositories of databases and provide critical functionalities such as sophisticated image analysis algorithm pipelines and powerful three-dimensional visualization and statistical tools. The statistical and operational advantages of collaborative, distributed team science in the form of multisite consortia push this approach in a diverse range of population-based investigations. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology⋆

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
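
    A minimal sketch (assumed effect size and error rates) of the power and sample-size planning the article highlights, using the standard two-sample t-test power calculation in statsmodels:

    ```python
    # Minimal sketch (assumed Cohen's d = 0.5): per-group sample size for a
    # two-sided, two-sample t-test at 80% power and 5% type I error.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.5,   # assumed effect size
                                       alpha=0.05,        # type I error rate
                                       power=0.8,         # 1 - type II error rate
                                       alternative="two-sided")
    print(f"~{n_per_group:.0f} subjects per group")       # roughly 64 for d = 0.5
    ```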

  12. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    PubMed Central

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is reflected in different statistical distributions of gait variables from the left and right lower limbs; that is, small differences in similarity between the limbs are treated as differences between their probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, compared with the traditional symmetry index method for gait, the proposed technique could identify small but significant differences between the lower limbs. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672
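
    A minimal sketch (simulated features, not the authors' force-platform data) of the underlying idea: train an SVM to separate left-limb from right-limb gait variables and use its cross-validated accuracy as a crude symmetry indicator (accuracy near 50% suggests symmetric limbs):

    ```python
    # Minimal sketch (simulated kinetic features): SVM separability of left vs.
    # right limb gait variables as a symmetry indicator.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(6)
    left = rng.normal(0.0, 1.0, size=(60, 5))          # 5 features per stride
    right = rng.normal(0.3, 1.0, size=(60, 5))         # slight shift -> asymmetry
    X = np.vstack([left, right])
    y = np.array([0] * 60 + [1] * 60)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"left/right separability (CV accuracy): {acc:.2f}")
    ```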

  13. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    PubMed

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is reflected in different statistical distributions of gait variables from the left and right lower limbs; that is, small differences in similarity between the limbs are treated as differences between their probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, compared with the traditional symmetry index method for gait, the proposed technique could identify small but significant differences between the lower limbs. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis.

  14. Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index

    PubMed Central

    Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy

    2012-01-01

    Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality, and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124

  15. Tools for surveying and improving the quality of life: people with special needs in focus.

    PubMed

    Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs

    2012-01-01

    This article seeks to describe online tools for surveying and improving quality of life for people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during an Institute for Education, University of Zurich, research project. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability are not tested according to the standard statistical procedures. This will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it according to best practice standards. The tools support staff in assisted living centers and special education service organizations. These tools offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.

  16. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  17. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
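
    A minimal sketch (random feature matrix, scikit-learn rather than SECIMTools itself) of the unsupervised-overview-plus-supervised-classification pattern the suite wraps:

    ```python
    # Minimal sketch (simulated metabolite features): PCA overview and a
    # cross-validated random forest classifier for two treatment groups.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    X = rng.normal(size=(40, 200))                 # 40 samples x 200 features
    y = np.array([0] * 20 + [1] * 20)              # two treatment groups
    X[y == 1, :10] += 1.0                          # a few discriminating metabolites

    scores = PCA(n_components=2).fit_transform(X)  # 2-D overview of sample structure
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc = cross_val_score(rf, X, y, cv=5).mean()
    print(f"PC1 range: {scores[:, 0].min():.1f} to {scores[:, 0].max():.1f}; "
          f"RF CV accuracy: {acc:.2f}")
    ```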

  18. Advance directives in intensive care: Health professional competences.

    PubMed

    Velasco-Sanz, T R; Rayón-Valpuesta, E

    2016-04-01

    To identify knowledge, skills and attitudes regarding advance directives or living wills among physicians and nurses of adult intensive care units (ICUs). A cross-sectional descriptive study was carried out. Nine hospitals in the Community of Madrid (Spain). Physicians and nurses of adult intensive care units. A survey using a qualitative Likert-type scale and multiple-response questions was administered. Knowledge, skills and attitudes about advance directives were assessed. A descriptive statistical analysis based on percentages was made, with application of the chi-squared test for comparisons, accepting p < 0.05 as representing statistical significance. A total of 331 surveys were collected (51%). It was seen that 90.3% did not know all the measures envisaged by the advance directives. In turn, 50.2% claimed that living wills are not respected, and 82.8% believed advance directives to be a useful tool for health professionals in the decision making process. A total of 85.3% of the physicians stated that they would respect a living will in cases of emergency, compared to 66.2% of the nursing staff (p = 0.007). Lastly, only 19.1% of the physicians and 2.3% of the nursing staff knew whether their patients had advance directives (p < 0.001). Although health professionals displayed poor knowledge of advance directives, they had a favorable attitude toward their usefulness. However, most did not know whether their patients had a living will, and some professionals even failed to respect such instructions despite knowledge of the existence of advance directives. Improvements in health professional education in this field are needed. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.

  19. Advances in Machine Learning and Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.

    2012-03-01

    Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.

  20. Statistical tests for detecting associations with groups of genetic variants: generalization, evaluation, and implementation

    PubMed Central

    Ferguson, John; Wheeler, William; Fu, YiPing; Prokunina-Olsson, Ludmila; Zhao, Hongyu; Sampson, Joshua

    2013-01-01

    With recent advances in sequencing, genotyping arrays, and imputation, GWAS now aim to identify associations with rare and uncommon genetic variants. Here, we describe and evaluate a class of statistics, generalized score statistics (GSS), that can test for an association between a group of genetic variants and a phenotype. GSS are a simple weighted sum of single-variant statistics and their cross-products. We show that the majority of statistics currently used to detect associations with rare variants are equivalent to choosing a specific set of weights within this framework. We then evaluate the power of various weighting schemes as a function of variant characteristics, such as MAF, the proportion associated with the phenotype, and the direction of effect. Ultimately, we find that two classical tests are robust and powerful, but details are provided as to when other GSS may perform favorably. The software package CRaVe is available at our website (http://dceg.cancer.gov/bb/tools/crave). PMID:23092956
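
    A minimal sketch (simulated genotypes) of one simple member of the weighted-sum family the paper generalizes: a burden-style test that regresses the phenotype on a weighted count of rare alleles; the weighting scheme here is an illustrative assumption, not the paper's recommendation:

    ```python
    # Minimal sketch (simulated data): burden-style weighted-sum test for a group
    # of rare variants against a quantitative phenotype.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)
    n, m = 1000, 25
    maf = rng.uniform(0.005, 0.02, m)                 # rare-variant frequencies
    G = rng.binomial(2, maf, size=(n, m))             # genotype matrix (0/1/2)
    weights = 1 / np.sqrt(maf * (1 - maf))            # up-weight rarer variants (assumed)
    burden = G @ weights

    y = 0.05 * burden + rng.normal(size=n)            # quantitative phenotype
    model = sm.OLS(y, sm.add_constant(burden)).fit()
    print(f"burden beta = {model.params[1]:.3f}, p = {model.pvalues[1]:.2g}")
    ```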

  1. PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, J. Berian, E-mail: berian@berkeley.edu

    2012-05-20

    We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.
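
    A minimal sketch (standard textbook formula with an assumed amplitude, not the paper's decomposition) of the Gaussian baseline that such non-Gaussian deformations perturb:

    ```python
    # Minimal sketch: genus curve of a Gaussian random field,
    # g(nu) = A * (1 - nu**2) * exp(-nu**2 / 2), the Gaussian baseline.
    import numpy as np

    A = 1.0                                   # amplitude set by the power spectrum (assumed)
    nu = np.linspace(-3, 3, 61)               # density threshold in units of sigma
    genus = A * (1 - nu**2) * np.exp(-nu**2 / 2)

    # A Gaussian field gives a symmetric curve with a central peak and two
    # negative troughs; skewed peaks or amplitude shifts signal non-Gaussianity.
    print(genus[::10].round(3))
    ```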

  2. Combining medical informatics and bioinformatics toward tools for personalized medicine.

    PubMed

    Sarachan, B D; Simmons, M K; Subramanian, P; Temkin, J M

    2003-01-01

    Key bioinformatics and medical informatics research areas need to be identified to advance knowledge and understanding of disease risk factors and molecular disease pathology in the 21st century toward new diagnoses, prognoses, and treatments. Three high-impact informatics areas are identified: predictive medicine (to identify significant correlations within clinical data using statistical and artificial intelligence methods), along with pathway informatics and cellular simulations (that combine biological knowledge with advanced informatics to elucidate molecular disease pathology). Initial predictive models have been developed for a pilot study in Huntington's disease. An initial bioinformatics platform has been developed for the reconstruction and analysis of pathways, and work has begun on pathway simulation. A bioinformatics research program has been established at GE Global Research Center as an important technology toward next generation medical diagnostics. We anticipate that 21st century medical research will be a combination of informatics tools with traditional biology wet lab research, and that this will translate to increased use of informatics techniques in the clinic.

  3. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  4. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes

    PubMed Central

    Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.

    2015-01-01

    Carbohydrates are biological building blocks participating in diverse and crucial processes at both the cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information that is not supported by experimental evidence for the functions of the encoded proteins. In this article, we present novel instruments for statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparison of their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB web-site at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239

  5. Meeting report: applied biopharmaceutics and quality by design for dissolution/release specification setting: product quality for patient benefit.

    PubMed

    Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi

    2010-09-01

    A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing the integration of biopharmaceutics and supporting a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and fostering a knowledge-sharing environment that places the patient's needs as the focus of drug development, consistent with the science- and risk-based spirit of QbD, were identified as key components of the path forward.

  6. Recent advances in quantitative high throughput and high content data analysis.

    PubMed

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
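
    A minimal sketch (simulated plate controls) of the Z'-factor, a standard high-throughput-screening quality metric offered here purely as an illustration rather than a method taken from the article:

    ```python
    # Minimal sketch (simulated controls): Z'-factor assay-quality metric,
    # Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|.
    import numpy as np

    rng = np.random.default_rng(9)
    pos = rng.normal(100, 8, 32)     # positive-control wells
    neg = rng.normal(20, 6, 32)      # negative-control wells

    z_prime = 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())
    print(f"Z' = {z_prime:.2f}")     # > 0.5 is conventionally considered an excellent assay
    ```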

  7. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  8. Potential Impacts of Accelerated Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. R.; Vail, L. W.

    2016-05-31

    This research project is part of the U.S. Nuclear Regulatory Commission’s (NRC’s) Probabilistic Flood Hazard Assessment (PFHA) Research plan in support of developing a risk-informed licensing framework for flood hazards and design standards at proposed new facilities and significance determination tools for evaluating potential deficiencies related to flood protection at operating facilities. The PFHA plan aims to build upon recent advances in deterministic, probabilistic, and statistical modeling of extreme precipitation events to develop regulatory tools and guidance for NRC staff with regard to PFHA for nuclear facilities. The tools and guidance developed under the PFHA plan will support and enhance NRC’s capacity to perform thorough and efficient reviews of license applications and license amendment requests. They will also support risk-informed significance determination of inspection findings, unusual events, and other oversight activities.
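
    A minimal sketch (synthetic annual maxima, not NRC data or tools) of the kind of extreme-value building block used in statistical modeling of extreme precipitation: fit a generalized extreme value distribution and read off a return level:

    ```python
    # Minimal sketch (synthetic data): GEV fit to annual precipitation maxima and
    # an estimated 100-year return level.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    annual_max_precip = stats.genextreme.rvs(c=-0.1, loc=80, scale=20,
                                             size=60, random_state=rng)  # mm/day

    shape, loc, scale = stats.genextreme.fit(annual_max_precip)
    return_level_100yr = stats.genextreme.ppf(1 - 1 / 100, shape, loc, scale)
    print(f"estimated 100-year daily precipitation: {return_level_100yr:.0f} mm")
    ```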

  9. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    PubMed

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design in a U.S. and Middle East university. A statistically significant improvement in overall research skills was supported in the quasi-experimental nursing student group and RCT nutrition student group using the EBR tool. A statistically significant proportional difference was supported in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggests that the EBR tool enhanced student overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  10. Are advance directives helpful for good end of life decision making: a cross sectional survey of health professionals.

    PubMed

    Peicius, Eimantas; Blazeviciene, Aurelija; Kaminskas, Raimondas

    2017-06-05

    This paper joins the debate over changes in the role of health professionals when applying advance directives to manage the decision-making process in end-of-life care. Issues in relation to advance directives occur in clinical units in Lithuania; however, it remains one of the few countries in the European Union (EU) where the discussion on advance directives is not included in the health-care policy-making agenda. To encourage the discussion of advance directives, a study was designed to examine health professionals' understanding and preferences related to advance directives. In addition, the study sought to explore the views of health care professionals on the application of advance directives (ADs) in clinical practice in Lithuania. A cross-sectional survey was conducted by interviewing 478 health professionals based at major health care centers in Kaunas district, Lithuania. The design of the study included the use of a questionnaire developed for this study and validated by a pilot study. The collected data were analyzed using standard descriptive statistical methods. The analysis of knowledge about ADs revealed some statistically significant differences when comparing the respondents' profession and gender. The analysis also indicated key emerging themes among respondents including tranquility of mind, the longest possible life expectancy and freedom of choice. Further, the study findings revealed that more than half of the study participants preferred to express their will while alive by using advance directives. The study findings revealed a low level of knowledge on advance directives among health professionals. Most health professionals agreed that ADs improved end-of-life decision making, while the majority of physicians appreciated ADs as the best tool for sharing responsibilities in clinical practice in Lithuania. More physicians than nurses preferred the presence of advance directives to support their decision making in end-of-life situations.

  11. A Comparison of Atmospheric Quantities Determined from Advanced WVR and Weather Analysis Data

    NASA Astrophysics Data System (ADS)

    Morabito, D.; Wu, L.; Slobin, S.

    2017-05-01

    Lower frequency bands used for deep space communications (e.g., 2.3 GHz and 8.4 GHz) are oversubscribed. Thus, NASA has become interested in using higher frequency bands (e.g., 26 GHz and 32 GHz) for telemetry, making use of the available wider bandwidth. However, these bands are more susceptible to atmospheric degradation. Currently, flight projects tend to be conservative in preparing their communications links by using worst-case or conservative assumptions, which result in nonoptimum data return. We previously explored the use of weather forecasting over different weather condition scenarios to determine more optimal values of atmospheric attenuation and atmospheric noise temperature for use in telecommunications link design. In this article, we present the results of a comparison of meteorological parameters (columnar water vapor and liquid water content) estimated from multifrequency Advanced Water Vapor Radiometer (AWVR) data with those estimated from weather analysis tools (FNL). We find that for the Deep Space Network's Goldstone and Madrid tracking sites, the statistics are in reasonable agreement between the two methods. We can then use the statistics of these quantities based on FNL runs to estimate statistics of atmospheric signal degradation for tracking sites that do not have the benefit of possessing multiyear WVR data sets, such as those of the NASA Near-Earth Network (NEN). The resulting statistics of atmospheric attenuation and atmospheric noise temperature increase can then be used in link budget calculations.

  12. PREDICT: a diagnostic accuracy study of a tool for predicting mortality within one year: who should have an advance healthcare directive?

    PubMed

    Richardson, Philip; Greenslade, Jaimi; Shanmugathasan, Sulochana; Doucet, Katherine; Widdicombe, Neil; Chu, Kevin; Brown, Anthony

    2015-01-01

    CARING is a screening tool developed to identify patients who have a high likelihood of death in 1 year. This study sought to validate a modified CARING tool (termed PREDICT) using a population of patients presenting to the Emergency Department. In total, 1000 patients aged over 55 years who were admitted to hospital via the Emergency Department between January and June 2009 were eligible for inclusion in this study. Data on the six prognostic indicators comprising PREDICT were obtained retrospectively from patient records. One-year mortality data were obtained from the State Death Registry. Weights were applied to each PREDICT criterion, and its final score ranged from 0 to 44. Receiver operating characteristic analyses and diagnostic accuracy statistics were used to assess the accuracy of PREDICT in identifying 1-year mortality. The sample comprised 976 patients with a median (interquartile range) age of 71 years (62-81 years) and a 1-year mortality of 23.4%. In total, 50% had ≥1 PREDICT criterion with a 1-year mortality of 40.4%. Receiver operating characteristic analysis gave an area under the curve of 0.86 (95% confidence interval: 0.83-0.89). Using a cut-off of 13 points, PREDICT had a 95.3% (95% confidence interval: 93.6-96.6) specificity and 53.9% (95% confidence interval: 47.5-60.3) sensitivity for predicting 1-year mortality. PREDICT was simpler than the CARING criteria and identified 158 patients per 1000 admitted who could benefit from advance care planning. PREDICT was successfully applied to the Australian healthcare system with findings similar to the original CARING study conducted in the United States. This tool could improve end-of-life care by identifying who should have advance care planning or an advance healthcare directive. © The Author(s) 2014.
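
    The diagnostic accuracy statistics quoted above can be reproduced for any score/outcome table with a few lines of code; the sketch below uses simulated data (not the study data) together with the published cut-off of 13 points:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical data standing in for the study: PREDICT scores (0-44) and 1-year death.
    n = 976
    died = rng.random(n) < 0.234
    scores = np.clip(rng.normal(np.where(died, 20, 8), 7).round(), 0, 44)

    auc = roc_auc_score(died, scores)           # area under the ROC curve

    cutoff = 13                                 # cut-off reported in the abstract
    positive = scores >= cutoff
    sensitivity = (positive & died).sum() / died.sum()
    specificity = (~positive & ~died).sum() / (~died).sum()
    print(f"AUC={auc:.2f}  sens={sensitivity:.1%}  spec={specificity:.1%}")
    ```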

  13. SpecViz: Interactive Spectral Data Analysis

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas Michael; STScI

    2016-06-01

    The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is a spectral 1-D interactive visualization and analysis application built with Python in an open source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, incorporation of advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active iPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straight-forward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.
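
    The kind of multi-component model fitting described above can be sketched with astropy.modeling on a synthetic spectrum; this is an illustrative example of the workflow, not SpecViz's own interface or internals:

    ```python
    import numpy as np
    from astropy.modeling import models, fitting

    # Synthetic spectrum: flat continuum plus two blended emission lines with noise.
    rng = np.random.default_rng(1)
    wave = np.linspace(6500.0, 6650.0, 400)
    flux = (1.0
            + 3.0 * np.exp(-0.5 * ((wave - 6563.0) / 2.5) ** 2)
            + 1.2 * np.exp(-0.5 * ((wave - 6585.0) / 2.0) ** 2)
            + rng.normal(0.0, 0.05, wave.size))

    # Multi-component model: continuum + two Gaussians, fitted simultaneously.
    init = (models.Const1D(amplitude=1.0)
            + models.Gaussian1D(amplitude=2.0, mean=6562.0, stddev=3.0)
            + models.Gaussian1D(amplitude=1.0, mean=6584.0, stddev=3.0))
    fit = fitting.LevMarLSQFitter()(init, wave, flux)
    print(fit.mean_1.value, fit.mean_2.value)   # fitted line centres
    ```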

  14. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  15. Differential proteome analysis of diabetes mellitus type 2 and its pathophysiological complications.

    PubMed

    Sohail, Waleed; Majeed, Fatimah; Afroz, Amber

    2018-06-11

    The prevalence of Diabetes Mellitus Type 2 (DM 2) is increasing every year due to global changes in lifestyle. The exact underlying mechanisms of the progression of this disease are not yet known. However, recent advances in combined omics, particularly in proteomics and genomics, have opened a gateway towards understanding the predetermined genetic factors, progression, complications and treatment of this disease. Here we review recent advances in proteomics that have led to earlier and better diagnostic approaches for controlling DM 2, most importantly the comparison of structural and functional protein biomarkers that are modified in the diseased state. By applying these advanced and promising proteomic strategies together with bioinformatics applications and biostatistical tools, the prevalence of DM 2 and its associated disorders, i.e. nephropathy and retinopathy, is expected to be controlled. Copyright © 2018 Diabetes India. Published by Elsevier Ltd. All rights reserved.

  16. Recent developments in measurement and evaluation of FAC damage in power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garud, Y.S.; Besuner, P.; Cohn, M.J.

    1999-11-01

    This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in the wall thickness measurements from ultrasonic or pulsed eddy-current data. Also, the evaluation method utilizes advanced regression analysis for spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.

  17. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
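
    For contrast with the AMV method, the mean-based, second-moment baseline mentioned above can be sketched for a toy implicit performance function; the function and input statistics below are invented, and a Monte Carlo run stands in for the fuller distributional information that AMV-type methods approximate cheaply:

    ```python
    import numpy as np

    def g(x):
        """Implicit performance function; in practice this would be a finite
        element response, here a cheap nonlinear surrogate."""
        return x[0] ** 2 / (1.0 + x[1]) - 4.0

    mu = np.array([3.0, 0.5])        # means of the random inputs (assumed)
    sigma = np.array([0.3, 0.1])     # standard deviations (assumed, independent)

    # Mean-based, second-moment estimate: first-order Taylor expansion at the mean.
    eps = 1e-6
    grad = np.array([(g(mu + eps * e) - g(mu - eps * e)) / (2 * eps) for e in np.eye(2)])
    mv_mean = g(mu)
    mv_std = np.sqrt(np.sum((grad * sigma) ** 2))

    # Monte Carlo check of the full response distribution.
    rng = np.random.default_rng(2)
    samples = np.array([g(x) for x in rng.normal(mu, sigma, size=(20_000, 2))])
    print(mv_mean, mv_std, samples.mean(), samples.std())
    ```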

  18. New software for statistical analysis of Cambridge Structural Database data

    PubMed Central

    Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.

    2011-01-01

    A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
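
    A minimal sketch of the principal components analysis feature on a hypothetical table of retrieved geometric parameters (random, correlated data standing in for CSD torsion angles):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical geometric parameters (e.g., torsion angles in degrees) for a
    # set of fragment hits; here randomly generated, correlated data.
    rng = np.random.default_rng(3)
    latent = rng.normal(size=(500, 2))
    angles = latent @ rng.normal(size=(2, 6)) * 15.0 + 60.0 + rng.normal(0, 3, (500, 6))

    pca = PCA(n_components=2)
    scores = pca.fit_transform(angles)             # fragment positions in PC space
    print(pca.explained_variance_ratio_.round(2))  # variance captured by each PC
    ```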

  19. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    PubMed

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
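
    As a toy illustration of the functional-mapping idea (not any of the reviewed methods), the sketch below models a time-course phenotype with a parametric growth curve and asks whether an assumed QTL genotype changes its dynamic parameters:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, asym, rate, t_mid):
        """Simple growth law used as the functional form in this toy example."""
        return asym / (1.0 + np.exp(-rate * (t - t_mid)))

    rng = np.random.default_rng(4)
    t = np.tile(np.arange(1, 31), 20)                    # 30 daily measurements x 20 plants
    geno = np.repeat([0, 1], 10 * 30)                    # two QTL genotype classes
    true_rate = np.where(geno == 0, 0.25, 0.35)          # genotype affects growth rate
    y = logistic(t, 100.0, true_rate, 15.0) + rng.normal(0, 3, t.size)

    # Fit the curve separately per genotype class and compare the dynamic parameters.
    for cls in (0, 1):
        popt, _ = curve_fit(logistic, t[geno == cls], y[geno == cls], p0=[90, 0.2, 14])
        print(cls, popt.round(3))
    ```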

  20. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, is included to enable statistical performance evaluations.
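
    For orientation, the scalar Kalman update that such toolkits build on can be sketched in a few lines; the process- and measurement-noise variances below are assumed, and the toolkit's desensitized and sigma-point variants are not shown:

    ```python
    import numpy as np

    def kalman_1d(z, q=1e-3, r=0.25):
        """Minimal scalar Kalman filter: random-walk state, noisy measurements z.
        q is the process-noise variance, r the measurement-noise variance."""
        x, p = z[0], 1.0                 # initial state estimate and variance
        out = []
        for meas in z:
            p = p + q                    # predict (state modelled as a random walk)
            k = p / (p + r)              # Kalman gain
            x = x + k * (meas - x)       # update with the innovation
            p = (1.0 - k) * p
            out.append(x)
        return np.array(out)

    rng = np.random.default_rng(5)
    truth = np.cumsum(rng.normal(0, 0.03, 200))
    print(kalman_1d(truth + rng.normal(0, 0.5, 200))[-5:].round(2))
    ```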

  1. Global cardiac risk assessment in the Registry Of Pregnancy And Cardiac disease: results of a registry from the European Society of Cardiology.

    PubMed

    van Hagen, Iris M; Boersma, Eric; Johnson, Mark R; Thorne, Sara A; Parsonage, William A; Escribano Subías, Pilar; Leśniak-Sobelga, Agata; Irtyuga, Olga; Sorour, Khaled A; Taha, Nasser; Maggioni, Aldo P; Hall, Roger; Roos-Hesselink, Jolien W

    2016-05-01

    To validate the modified World Health Organization (mWHO) risk classification in advanced and emerging countries, and to identify additional risk factors for cardiac events during pregnancy. The ongoing prospective worldwide Registry Of Pregnancy And Cardiac disease (ROPAC) included 2742 pregnant women (mean age ± standard deviation, 29.2 ± 5.5 years) with established cardiac disease: 1827 from advanced countries and 915 from emerging countries. In patients from advanced countries, congenital heart disease was the most prevalent diagnosis (70%) while in emerging countries valvular heart disease was more common (55%). A cardiac event occurred in 566 patients (20.6%) during pregnancy: 234 (12.8%) in advanced countries and 332 (36.3%) in emerging countries. The mWHO classification had a moderate performance to discriminate between women with and without cardiac events (c-statistic 0.711 and 95% confidence interval (CI) 0.686-0.735). However, its performance in advanced countries (0.726) was better than in emerging countries (0.633). The best performance was found in patients with acquired heart disease from developed countries (0.712). Pre-pregnancy signs of heart failure and, in advanced countries, atrial fibrillation and no previous cardiac intervention added prognostic value to the mWHO classification, with a c-statistic of 0.751 (95% CI 0.715-0.786) in advanced countries and of 0.724 (95% CI 0.691-0.758) in emerging countries. The mWHO risk classification is a useful tool for predicting cardiac events during pregnancy in women with established cardiac disease in advanced countries, but seems less effective in emerging countries. Data on pre-pregnancy cardiac condition including signs of heart failure and atrial fibrillation, may help to improve preconception counselling in advanced and emerging countries. © 2016 The Authors. European Journal of Heart Failure © 2016 European Society of Cardiology.

  2. Modeling of fiber orientation in viscous fluid flow with application to self-compacting concrete

    NASA Astrophysics Data System (ADS)

    Kolařík, Filip; Patzák, Bořek

    2013-10-01

    In recent years, unconventional concrete reinforcement has grown in popularity. Fiber reinforcement in particular is widely used in high-performance concretes such as self-compacting concrete (SCC). The design of advanced tailor-made structures made of SCC can take advantage of the anisotropic orientation of fibers. Tools for predicting fiber orientation can contribute to the design of tailor-made structures and allow casting procedures to be developed that achieve the desired fiber distribution and orientation. This paper deals with the development and implementation of a suitable tool for predicting fiber orientation in a fluid based on knowledge of the velocity field. A statistical approach is employed: fiber orientation is described by a probability distribution of the fiber angle.

  3. Data Visualization in Sociology

    PubMed Central

    Healy, Kieran; Moody, James

    2014-01-01

    Visualizing data is central to social scientific work. Despite a promising early beginning, sociology has lagged in the use of visual tools. We review the history and current state of visualization in sociology. Using examples throughout, we discuss recent developments in ways of seeing raw data and presenting the results of statistical modeling. We make a general distinction between those methods and tools designed to help explore datasets, and those designed to help present results to others. We argue that recent advances should be seen as part of a broader shift towards easier sharing of the code and data both between researchers and with wider publics, and encourage practitioners and publishers to work toward a higher and more consistent standard for the graphical display of sociological insights. PMID:25342872

  4. Advancing multiscale structural mapping of the brain through fluorescence imaging and analysis across length scales

    PubMed Central

    Hogstrom, L. J.; Guo, S. M.; Murugadoss, K.; Bathe, M.

    2016-01-01

    Brain function emerges from hierarchical neuronal structure that spans orders of magnitude in length scale, from the nanometre-scale organization of synaptic proteins to the macroscopic wiring of neuronal circuits. Because the synaptic electrochemical signal transmission that drives brain function ultimately relies on the organization of neuronal circuits, understanding brain function requires an understanding of the principles that determine hierarchical neuronal structure in living or intact organisms. Recent advances in fluorescence imaging now enable quantitative characterization of neuronal structure across length scales, ranging from single-molecule localization using super-resolution imaging to whole-brain imaging using light-sheet microscopy on cleared samples. These tools, together with correlative electron microscopy and magnetic resonance imaging at the nanoscopic and macroscopic scales, respectively, now facilitate our ability to probe brain structure across its full range of length scales with cellular and molecular specificity. As these imaging datasets become increasingly accessible to researchers, novel statistical and computational frameworks will play an increasing role in efforts to relate hierarchical brain structure to its function. In this perspective, we discuss several prominent experimental advances that are ushering in a new era of quantitative fluorescence-based imaging in neuroscience along with novel computational and statistical strategies that are helping to distil our understanding of complex brain structure. PMID:26855758

  5. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  6. Implementation of novel statistical procedures and other advanced approaches to improve analysis of CASA data.

    PubMed

    Ramón, M; Martínez-Pastor, F

    2018-04-23

    Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the significance and relevance of the internal heterogeneity of sperm samples. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and more advanced supervised methods based on machine learning. The former approach has allowed exploration of subpopulation patterns in many species, whereas the latter offers further possibilities, especially for functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although the data provided by CASA systems yield valuable information on sperm samples when clustering analyses are applied, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
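
    A minimal sketch of the unsupervised clustering step on synthetic CASA-like kinematic parameters (VCL, VSL, ALH; all values invented), standardised before applying k-means:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Synthetic stand-in for CASA kinematics: two latent subpopulations.
    rng = np.random.default_rng(6)
    fast = rng.normal([150, 90, 4.0], [20, 15, 0.8], size=(300, 3))
    slow = rng.normal([80, 40, 2.0], [15, 10, 0.5], size=(200, 3))
    X = np.vstack([fast, slow])

    # Standardise, then cluster; k would normally be chosen with an internal index.
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xz)
    print(np.bincount(labels))                       # subpopulation sizes
    print(X[labels == 0].mean(axis=0).round(1))      # mean kinematics of one cluster
    ```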

  7. Conducting Simulation Studies in the R Programming Environment.

    PubMed

    Hallgren, Kevin A

    2013-10-12

    Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
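
    The paper's examples are written in R; to keep the code sketches in this compilation in a single language, the analogous power-by-simulation idea is shown below in Python (the effect size, sample sizes and α are arbitrary):

    ```python
    import numpy as np
    from scipy import stats

    def simulated_power(n_per_group, effect_size, n_sims=2000, alpha=0.05, seed=0):
        """Estimate power of a two-sample t-test by repeated simulation."""
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            a = rng.normal(0.0, 1.0, n_per_group)
            b = rng.normal(effect_size, 1.0, n_per_group)
            if stats.ttest_ind(a, b).pvalue < alpha:
                rejections += 1
        return rejections / n_sims

    for n in (20, 50, 100):
        print(n, simulated_power(n, effect_size=0.5))
    ```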

  8. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003) If...

  9. University of Washington's eScience Institute Promotes New Training and Career Pathways in Data Science

    NASA Astrophysics Data System (ADS)

    Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.

    2015-12-01

    Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.

  10. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.

  11. Big heart data: advancing health informatics through data sharing in cardiovascular imaging.

    PubMed

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A

    2015-07-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases.

  12. Uncovering Local Trends in Genetic Effects of Multiple Phenotypes via Functional Linear Models.

    PubMed

    Vsevolozhskaya, Olga A; Zaykin, Dmitri V; Barondess, David A; Tong, Xiaoren; Jadhav, Sneha; Lu, Qing

    2016-04-01

    Recent technological advances equipped researchers with capabilities that go beyond traditional genotyping of loci known to be polymorphic in a general population. Genetic sequences of study participants can now be assessed directly. This capability removed technology-driven bias toward scoring predominantly common polymorphisms and let researchers reveal a wealth of rare and sample-specific variants. Although the relative contributions of rare and common polymorphisms to trait variation are being debated, researchers are faced with the need for new statistical tools for simultaneous evaluation of all variants within a region. Several research groups demonstrated flexibility and good statistical power of the functional linear model approach. In this work we extend previous developments to allow inclusion of multiple traits and adjustment for additional covariates. Our functional approach is unique in that it provides a nuanced depiction of effects and interactions for the variables in the model by representing them as curves varying over a genetic region. We demonstrate flexibility and competitive power of our approach by contrasting its performance with commonly used statistical tools and illustrate its potential for discovery and characterization of genetic architecture of complex traits using sequencing data from the Dallas Heart Study. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  13. In silico environmental chemical science: properties and processes from statistical and computational modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.

    2017-01-01

    Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with "in silico" results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate-determining pathways and mechanisms. Emerging opportunities for "in silico environmental chemical science" are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
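
    A minimal sketch of the QSAR-calibration idea described above, regressing a measured property on descriptors that would be computed in silico (all values here are synthetic):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical training set: two computed molecular descriptors and measured
    # log degradation rate constants for 40 chemicals.
    rng = np.random.default_rng(7)
    descriptors = rng.normal(size=(40, 2))
    log_k = -1.2 * descriptors[:, 0] + 0.6 * descriptors[:, 1] + rng.normal(0, 0.2, 40)

    qsar = LinearRegression().fit(descriptors, log_k)
    print(qsar.coef_.round(2), round(qsar.score(descriptors, log_k), 2))

    # Predict the property for a new chemical whose descriptors were computed, not measured.
    print(qsar.predict([[0.3, -1.1]]).round(2))
    ```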

  14. An online tool for Operational Probabilistic Drought Forecasting System (OPDFS): a Statistical-Dynamical Framework

    NASA Astrophysics Data System (ADS)

    Zarekarizi, M.; Moradkhani, H.; Yan, H.

    2017-12-01

    The Operational Probabilistic Drought Forecasting System (OPDFS) is an online tool recently developed at Portland State University for operational agricultural drought forecasting. It is an integrated statistical-dynamical framework that issues probabilistic drought forecasts monthly for lead times of 1, 2, and 3 months. The statistical drought forecasting method utilizes copula functions to condition future soil moisture values on the antecedent states. Due to the stochastic nature of land surface properties, the antecedent soil moisture states are uncertain; therefore, a data assimilation system based on Particle Filtering (PF) is employed to quantify the uncertainties associated with the initial condition of the land state, i.e. soil moisture. The PF assimilates satellite soil moisture data into the Variable Infiltration Capacity (VIC) land surface model and ultimately updates the simulated soil moisture. The OPDFS builds on NOAA's seasonal drought outlook by offering drought probabilities instead of qualitative ordinal categories and provides the user with probability maps associated with a particular drought category. A retrospective assessment of the OPDFS showed that forecasting the 2012 Great Plains and 2014 California droughts was possible at least one month in advance. The OPDFS offers timely assistance to water managers, stakeholders and decision-makers in developing resilience against uncertain upcoming droughts.
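
    The copula-conditioning step can be sketched with a Gaussian copula on synthetic soil moisture pairs; this is not the OPDFS code, the particle-filter data assimilation is omitted, and the current observation of 55.0 is an assumed value:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)

    # Paired historical soil moisture: current month vs next month (synthetic, correlated).
    sm_now = rng.gamma(4.0, 25.0, 500)
    sm_next = 0.7 * sm_now + rng.gamma(4.0, 10.0, 500)

    # Gaussian copula: move to normal scores via empirical ranks, estimate the correlation.
    def normal_scores(x):
        ranks = stats.rankdata(x) / (len(x) + 1.0)
        return stats.norm.ppf(ranks)

    rho = np.corrcoef(normal_scores(sm_now), normal_scores(sm_next))[0, 1]

    # P(next month below the 20th percentile | current observation), i.e. drought risk.
    u_now = stats.percentileofscore(sm_now, 55.0) / 100.0   # assumed current observation
    z_now, z_thr = stats.norm.ppf(u_now), stats.norm.ppf(0.20)
    p_drought = stats.norm.cdf((z_thr - rho * z_now) / np.sqrt(1.0 - rho ** 2))
    print(round(rho, 2), round(p_drought, 2))
    ```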

  15. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the summed energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify contributions to the Esum spectra from different isotopes with similar Q-values. Recently we tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modifications to the analysis of DANCE experimental data. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
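
    A minimal sketch of the ICA separation idea on synthetic, overlapping sum-energy templates mixed in unknown proportions (illustrative only; not DANCE data or the collaboration's algorithm):

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    # Two synthetic "isotope" sum-energy templates with similar Q-values, plus a
    # smooth background, mixed in unknown proportions across measurement conditions.
    rng = np.random.default_rng(9)
    e = np.linspace(0.0, 10.0, 400)
    sources = np.vstack([
        np.exp(-0.5 * ((e - 6.2) / 0.4) ** 2),     # isotope A sum-energy peak
        np.exp(-0.5 * ((e - 6.6) / 0.5) ** 2),     # isotope B, overlapping Q-value
        np.exp(-e / 4.0),                          # background
    ])
    mixing = rng.uniform(0.2, 1.0, (8, 3))         # 8 observed spectra
    observed = mixing @ sources + rng.normal(0, 0.01, (8, e.size))

    ica = FastICA(n_components=3, random_state=0)
    recovered = ica.fit_transform(observed.T).T    # statistically independent components
    print(recovered.shape)
    ```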

  16. The heritability of the functional connectome is robust to common nonlinear registration methods

    NASA Astrophysics Data System (ADS)

    Hafzalla, George W.; Prasad, Gautam; Baboyan, Vatche G.; Faskowitz, Joshua; Jahanshad, Neda; McMahon, Katie L.; de Zubicaray, Greig I.; Wright, Margaret J.; Braskie, Meredith N.; Thompson, Paul M.

    2016-03-01

    Nonlinear registration algorithms are routinely used in brain imaging, to align data for inter-subject and group comparisons, and for voxelwise statistical analyses. To understand how the choice of registration method affects maps of functional brain connectivity in a sample of 611 twins, we evaluated three popular nonlinear registration methods: Advanced Normalization Tools (ANTs), Automatic Registration Toolbox (ART), and FMRIB's Nonlinear Image Registration Tool (FNIRT). Using both structural and functional MRI, we used each of the three methods to align the MNI152 brain template, and 80 regions of interest (ROIs), to each subject's T1-weighted (T1w) anatomical image. We then transformed each subject's ROIs onto the associated resting state functional MRI (rs-fMRI) scans and computed a connectivity network or functional connectome for each subject. Given the different degrees of genetic similarity between pairs of monozygotic (MZ) and same-sex dizygotic (DZ) twins, we used structural equation modeling to estimate the additive genetic influences on the elements of the function networks, or their heritability. The functional connectome and derived statistics were relatively robust to nonlinear registration effects.

  17. An advanced kinetic theory for morphing continuum with inner structures

    NASA Astrophysics Data System (ADS)

    Chen, James

    2017-12-01

    Advanced kinetic theory with the Boltzmann-Curtiss equation provides a promising tool for polyatomic gas flows, especially for fluid flows containing inner structures, such as turbulence, polyatomic gas flows and others. Although a Hamiltonian-based distribution function was proposed for diatomic gas flow, a general distribution function for the generalized Boltzmann-Curtiss equations and polyatomic gas flow is still out of reach. With assistance from Boltzmann's entropy principle, a generalized Boltzmann-Curtiss distribution for polyatomic gas flow is introduced. The corresponding governing equations at equilibrium state are derived and compared with Eringen's morphing (micropolar) continuum theory derived under the framework of rational continuum thermomechanics. Although rational continuum thermomechanics has the advantages of mathematical rigor and simplicity, the presented statistical kinetic theory approach provides a clear physical picture for what the governing equations represent.

  18. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to set up the wafer process correctly. The critical dimension of components decreases following Moore's law; at the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for 3D shape monitoring, implemented in the waferfab.
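
    A minimal sketch of the SPC idea: 3-sigma X-bar control limits computed from within-wafer subgroups of hypothetical CD measurements (the c4 bias correction used in a full X-bar/S chart is omitted for brevity):

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    # Hypothetical CD measurements: 30 lots, 9 sites per wafer (one subgroup per lot).
    subgroups = rng.normal(45.0, 0.4, size=(30, 9))
    subgroups[22:] += 0.6                       # a deliberate process shift to be caught

    xbar = subgroups.mean(axis=1)
    grand_mean = xbar.mean()
    sigma_within = subgroups.std(axis=1, ddof=1).mean()
    ucl = grand_mean + 3.0 * sigma_within / np.sqrt(subgroups.shape[1])
    lcl = grand_mean - 3.0 * sigma_within / np.sqrt(subgroups.shape[1])

    out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
    print(round(lcl, 2), round(ucl, 2), out_of_control)
    ```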

  19. Report on the ''ESO Python Boot Camp — Pilot Version''

    NASA Astrophysics Data System (ADS)

    Dias, B.; Milli, J.

    2017-03-01

    The Python programming language is becoming very popular within the astronomical community. Python is a high-level language with multiple applications including database management, handling FITS images and tables, statistical analysis, and more advanced topics. Python is a very powerful tool both for astronomical publications and for observatory operations. Since the best way to learn a new programming language is through practice, we therefore organised a two-day hands-on workshop to share expertise among ESO colleagues. We report here the outcome and feedback from this pilot event.

  20. Increasing rigor in NMR-based metabolomics through validated and open source tools

    PubMed Central

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2016-01-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760

  1. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    PubMed

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  2. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2014-10-01

    Award number: W81XWH-10-1-0870. Title: Advanced Prosthetic Gait Training Tool. Annual report, October 2014, covering 20 Sep 2013 to 19 Sep 2014. The project aims to produce a computer-based Advanced Prosthetic Gait Training Tool to aid in the training of clinicians at military treatment facilities providing care...

  3. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.
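
    The boxcar-plus-hemodynamic-response modelling referred to above can be sketched outside MATLAB/SPM as a single-voxel GLM; the block paradigm, TR, effect size, and noise level below are assumed:

    ```python
    import numpy as np
    from scipy.stats import gamma

    tr, n_scans = 2.0, 150
    frame_times = np.arange(n_scans) * tr

    # Block design: 20 s stimulation alternating with 20 s rest (assumed paradigm).
    boxcar = ((frame_times // 20) % 2).astype(float)

    # Canonical double-gamma HRF sampled at the TR (standard SPM-style shape).
    t = np.arange(0, 32, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    regressor = np.convolve(boxcar, hrf)[:n_scans]

    # GLM for one voxel: design = [regressor, intercept]; solve by least squares.
    rng = np.random.default_rng(11)
    y = 2.0 * regressor + 100.0 + rng.normal(0, 1.0, n_scans)
    X = np.column_stack([regressor, np.ones(n_scans)])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    dof = n_scans - X.shape[1]
    se = np.sqrt(rss[0] / dof * np.linalg.inv(X.T @ X)[0, 0])
    print(round(beta[0], 2), round(beta[0] / se, 1))    # effect size and t value
    ```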

  4. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Public-private partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support for drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    NASA Astrophysics Data System (ADS)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery, statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publically available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed. Strategies for using the IDC Tools for these purposes are discussed. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.

  6. The development and validation of the AMPREDICT model for predicting mobility outcome after dysvascular lower extremity amputation.

    PubMed

    Czerniecki, Joseph M; Turner, Aaron P; Williams, Rhonda M; Thompson, Mary Lou; Landry, Greg; Hakimi, Kevin; Speckman, Rebecca; Norvell, Daniel C

    2017-01-01

    The objective of this study was the development of AMPREDICT-Mobility, a tool to predict the probability of independence in either basic or advanced (iBASIC or iADVANCED) mobility 1 year after dysvascular major lower extremity amputation. Two prospective cohort studies during consecutive 4-year periods (2005-2009 and 2010-2014) were conducted at seven medical centers. Multiple demographic and biopsychosocial predictors were collected in the periamputation period among individuals undergoing their first major amputation because of complications of peripheral arterial disease or diabetes. The primary outcomes were iBASIC and iADVANCED mobility, as measured by the Locomotor Capabilities Index. Combined data from both studies were used for model development and internal validation. Backwards stepwise logistic regression was used to develop the final prediction models. The discrimination and calibration of each model were assessed. Internal validity of each model was assessed with bootstrap sampling. Twelve-month follow-up was reached by 157 of 200 (79%) participants. Among these, 54 (34%) did not achieve iBASIC mobility, 103 (66%) achieved at least iBASIC mobility, and 51 (32%) also achieved iADVANCED mobility. Predictive factors associated with reduced odds of achieving iBASIC mobility were increasing age, chronic obstructive pulmonary disease, dialysis, diabetes, prior history of treatment for depression or anxiety, and very poor to fair self-rated health. Those who were white, were married, and had at least a high-school degree had a higher probability of achieving iBASIC mobility. The odds of achieving iBASIC mobility increased with increasing body mass index up to 30 kg/m 2 and decreased with increasing body mass index thereafter. The prediction model of iADVANCED mobility included the same predictors with the exception of diabetes, chronic obstructive pulmonary disease, and education level. Both models showed strong discrimination with C statistics of 0.85 and 0.82, respectively. The mean difference in predicted probabilities for those who did and did not achieve iBASIC and iADVANCED mobility was 33% and 29%, respectively. Tests for calibration and observed vs predicted plots suggested good fit for both models; however, the precision of the estimates of the predicted probabilities was modest. Internal validation through bootstrapping demonstrated some overoptimism of the original model development, with the optimism-adjusted C statistic for iBASIC and iADVANCED mobility being 0.74 and 0.71, respectively, and the discrimination slope 19% and 16%, respectively. AMPREDICT-Mobility is a user-friendly prediction tool that can inform the patient undergoing a dysvascular amputation and the patient's provider about the probability of independence in either basic or advanced mobility at each major lower extremity amputation level. Copyright © 2016 Society for Vascular Surgery. All rights reserved.
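
    The optimism-corrected C statistic reported above follows the standard bootstrap recipe; the sketch below applies it to made-up data with a reduced predictor set, not the AMPREDICT variables or cohort:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(12)

    # Hypothetical cohort: a few periamputation predictors and a 1-year mobility outcome.
    n = 157
    X = np.column_stack([rng.normal(65, 10, n),              # age
                         rng.binomial(1, 0.3, n),            # dialysis
                         rng.normal(27, 5, n)])              # body mass index
    logit = -0.06 * (X[:, 0] - 65) - 1.0 * X[:, 1] + 0.04 * (X[:, 2] - 27) + 0.7
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))            # achieved basic mobility

    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

    # Bootstrap optimism correction of the C statistic.
    optimism = []
    for _ in range(200):
        idx = rng.integers(0, n, n)
        boot = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], boot.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, boot.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)
    print(round(apparent, 2), round(apparent - np.mean(optimism), 2))
    ```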

  7. Improved Broadband Liner Optimization Applied to the Advanced Noise Control Fan

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.; Ayle, Earl; Ichihashi, Fumitaka

    2014-01-01

    The broadband component of fan noise has grown in relevance with the utilization of increased bypass ratio and advanced fan designs. Thus, while the attenuation of fan tones remains paramount, the ability to simultaneously reduce broadband fan noise levels has become more desirable. This paper describes improvements to a previously established broadband acoustic liner optimization process using the Advanced Noise Control Fan rig as a demonstrator. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners aimed at producing impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increased weighting to specific frequencies and/or operating conditions. Constant-depth, double-degree of freedom and variable-depth, multi-degree of freedom designs are carried through design, fabrication, and testing to validate the efficacy of the design process. Results illustrate the value of the design process in concurrently evaluating the relative costs/benefits of these liner designs. This study also provides an application for demonstrating the integrated use of duct acoustic propagation/radiation and liner modeling tools in the design and evaluation of novel broadband liner concepts for complex engine configurations.

  8. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  9. SNiPlay: a web-based tool for detection, management and analysis of SNPs. Application to grapevine diversity projects.

    PubMed

    Dereeper, Alexis; Nicolas, Stéphane; Le Cunff, Loïc; Bacilieri, Roberto; Doligez, Agnès; Peros, Jean-Pierre; Ruiz, Manuel; This, Patrice

    2011-05-05

    High-throughput re-sequencing, new genotyping technologies and the availability of reference genomes allow the extensive characterization of Single Nucleotide Polymorphisms (SNPs) and insertion/deletion events (indels) in many plant species. The rapidly increasing amount of re-sequencing and genotyping data generated by large-scale genetic diversity projects requires the development of integrated bioinformatics tools able to efficiently manage, analyze, and combine these genetic data with genome structure and external data. In this context, we developed SNiPlay, a flexible, user-friendly and integrative web-based tool dedicated to polymorphism discovery and analysis. It integrates: (1) a pipeline, freely accessible through the internet, combining existing software with new tools to detect SNPs and to compute different types of statistical indices and graphical layouts for SNP data. From standard sequence alignments, genotyping data or Sanger sequencing traces given as input, SNiPlay detects SNP and indel events and outputs submission files for the design of Illumina's SNP chips. Subsequently, it sends sequences and genotyping data into a series of modules in charge of various processes: physical mapping to a reference genome, annotation (genomic position, intron/exon location, synonymous/non-synonymous substitutions), SNP frequency determination in user-defined groups, haplotype reconstruction and network, linkage disequilibrium evaluation, and diversity analysis (Pi, Watterson's Theta, Tajima's D). Furthermore, the pipeline allows the use of external data (such as phenotype, geographic origin, taxa, stratification) to define groups and compare statistical indices. (2) A database storing polymorphisms, genotyping data and grapevine sequences released by public and private projects. It allows the user to retrieve SNPs using various filters (such as genomic position, missing data, polymorphism type, allele frequency), to compare SNP patterns between populations, and to export genotyping data or sequences in various formats. Our experiments on grapevine genetic projects showed that SNiPlay allows geneticists to rapidly obtain advanced results in several key research areas of plant genetic diversity. Both the management and treatment of large amounts of SNP data are rendered considerably easier for end-users through automation and integration. Current developments are taking into account new advances in high-throughput technologies. SNiPlay is available at: http://sniplay.cirad.fr/.

  10. Beyond existence and aiming outside the laboratory: estimating frequency-dependent and pay-off-biased social learning strategies.

    PubMed

    McElreath, Richard; Bell, Adrian V; Efferson, Charles; Lubell, Mark; Richerson, Peter J; Waring, Timothy

    2008-11-12

    The existence of social learning has been confirmed in diverse taxa, from apes to guppies. In order to advance our understanding of the consequences of social transmission and evolution of behaviour, however, we require statistical tools that can distinguish among diverse social learning strategies. In this paper, we advance two main ideas. First, social learning is diverse, in the sense that individuals can take advantage of different kinds of information and combine them in different ways. Examining learning strategies for different information conditions illuminates the more detailed design of social learning. We construct and analyse an evolutionary model of diverse social learning heuristics, in order to generate predictions and illustrate the impact of design differences on an organism's fitness. Second, in order to eventually escape the laboratory and apply social learning models to natural behaviour, we require statistical methods that do not depend upon tight experimental control. Therefore, we examine strategic social learning in an experimental setting in which the social information itself is endogenous to the experimental group, as it is in natural settings. We develop statistical models for distinguishing among different strategic uses of social information. The experimental data strongly suggest that most participants employ a hierarchical strategy that uses both average observed pay-offs of options as well as frequency information, the same model predicted by our evolutionary analysis to dominate a wide range of conditions.
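
    The strategies contrasted above can be illustrated with a generic choice rule that mixes frequency-dependent (conformist) copying with pay-off-biased copying. The sketch below is not the authors' hierarchical model or their fitting procedure; the functional forms, parameter names (f, w) and inputs are invented purely for illustration.

    ```python
    # Generic illustration (not the authors' fitted model) of two social
    # learning rules discussed in this literature: frequency-dependent
    # (conformist) copying and pay-off-biased copying, mixed by a weight w.
    import numpy as np

    def choice_probabilities(counts, mean_payoffs, f=2.0, w=0.5):
        """counts: how often each option was observed being chosen;
        mean_payoffs: average observed pay-off of each option (positive);
        f > 1 gives conformist (disproportionate) copying of the majority;
        w mixes the frequency-based and pay-off-based probabilities."""
        counts = np.asarray(counts, dtype=float)
        payoffs = np.asarray(mean_payoffs, dtype=float)
        p_freq = counts**f / np.sum(counts**f)    # frequency dependence
        p_pay = payoffs / np.sum(payoffs)         # pay-off bias (linear)
        return w * p_freq + (1 - w) * p_pay

    print(choice_probabilities(counts=[6, 3, 1], mean_payoffs=[1.0, 2.0, 1.5]))
    ```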

  11. The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.

    PubMed

    Tukiendorf, Andrzej; Kaźmierski, Radosław; Michalak, Sławomir

    2013-01-01

    In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of both methods revealed strong agreement: direct agreement between the M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of the Marczewski-Steinhaus taxonomic approach as a tool for the detection of novel patterns of data in ischemic stroke patients and the prediction of disease outcome. In terms of identifying fairly frequent types of stroke patients using their age, National Institutes of Health Stroke Scale (NIHSS) score, and diabetes mellitus (DM) status, when dealing with rough characteristics of patients, four particular types of patients are recognized that cannot be identified by means of routine clinical methods. Based on the taxonomic outcomes obtained, a strong correlation is established between health status at the moment of admission to the emergency department (ED) and the subsequent recovery of patients. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.

  12. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale (CARS) was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future CARS development and improvement are also provided.

  13. Development of guidance for states transitioning to new safety analysis tools

    NASA Astrophysics Data System (ADS)

    Alluri, Priyanka

    With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of the SAFETEA-LU (Safe, Accountable, Flexible, Efficient Transportation Equity Act: A Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway Safety Plan (SHSP) and incorporate data-driven approaches to prioritize and evaluate program outcomes; failure to do so resulted in funding sanctions. In conjunction with the legislation, research efforts have also been progressing toward the development of new safety analysis tools such as IHSDM (Interactive Highway Safety Design Model), SafetyAnalyst, and the HSM (Highway Safety Manual). These software and analysis tools are more advanced in statistical theory and level of accuracy, and tend to be more data intensive. A review of the 2009 five-percent reports and excerpts from a nationwide survey revealed the widespread continued use of traditional methods, including crash frequencies and rates, for site selection and prioritization. The intense data requirements and statistical complexity of advanced safety tools are considered a hindrance to their adoption. In this context, this research aims at identifying the data requirements and data availability for SafetyAnalyst and the HSM by working with both tools. This research sets the stage for working with the Empirical Bayes approach by highlighting some of the biases and issues associated with traditional methods of selecting projects, such as greater emphasis on traffic volume and the regression-to-the-mean phenomenon. Further, the not-so-obvious issue with shorter segment lengths, which affect the results independently of the methods used, is also discussed. The more reliable and statistically acceptable Empirical Bayes methodology requires safety performance functions (SPFs), regression equations predicting the relation between crashes and exposure for a subset of the roadway network. These SPFs, specific to a region and analysis period, are often unavailable. Calibration of existing default national SPFs to the state's data could be a feasible solution, but how well the state's data are represented is a legitimate question. With this background, SPFs were generated for various classifications of segments in Georgia and compared against the national default SPFs used in SafetyAnalyst calibrated to Georgia data. Delving deeper into the development of SPFs, the influence of actual and estimated traffic data on the fit of the equations is also studied, questioning the accuracy and reliability of traffic estimates. In addition to SafetyAnalyst, the HSM aims at performing quantitative safety analysis. Applying the HSM methodology to rural two-lane two-way roads, the effect of using multiple CMFs (Crash Modification Factors) is studied. Lastly, data requirements, methodology, constraints, and results are compared between SafetyAnalyst and the HSM.
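
    As a concrete, purely illustrative sketch of the Empirical Bayes approach discussed above: a safety performance function (SPF) predicts crashes from exposure, and the EB estimate blends that prediction with the observed count using a weight derived from the negative-binomial overdispersion parameter. The SPF form, coefficients and parameter values below are made up; they are not the calibrated Georgia SPFs.

    ```python
    # Minimal sketch of an Empirical Bayes (EB) safety estimate, assuming a
    # generic negative-binomial SPF of the form mu = exp(b0) * AADT**b1 * length,
    # with overdispersion parameter k. All coefficient values are hypothetical.
    import math

    def spf_predicted_crashes(aadt, seg_length_mi, years, b0=-7.5, b1=0.9):
        """Hypothetical SPF: predicted crashes over the study period."""
        return math.exp(b0) * aadt**b1 * seg_length_mi * years

    def empirical_bayes_estimate(observed, predicted, k=0.4):
        """EB estimate = w * predicted + (1 - w) * observed,
        with weight w = 1 / (1 + k * predicted)."""
        w = 1.0 / (1.0 + k * predicted)
        return w * predicted + (1.0 - w) * observed

    mu = spf_predicted_crashes(aadt=8000, seg_length_mi=0.5, years=3)
    print(mu, empirical_bayes_estimate(observed=6, predicted=mu))
    ```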

  14. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    PubMed

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool to understand brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling several simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially to explore hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Finally, illustrative examples are provided, discussing issues in inverse analysis (e.g., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (e.g., strength prediction).
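
    For readers unfamiliar with the baseline these models generalize, the sketch below simulates the classic equal-load-sharing brittle fiber bundle (a single damage order parameter, uniformly distributed failure thresholds). It is illustrative only and does not implement the two-order-parameter model described above.

    ```python
    # Baseline illustration only: a classic equal-load-sharing (brittle)
    # fiber bundle model, the one-order-parameter starting point that
    # generalized FBMs extend. Thresholds are drawn uniformly on (0, 1).
    import numpy as np

    rng = np.random.default_rng(0)

    def bundle_strength(n_fibers=10000):
        thresholds = np.sort(rng.uniform(0.0, 1.0, n_fibers))
        # Under quasi-static loading, after the k weakest fibers fail the
        # surviving (n - k) fibers carry stress thresholds[k], so the bundle
        # load per fiber is sigma_k = thresholds[k] * (n - k) / n.
        k = np.arange(n_fibers)
        loads = thresholds * (n_fibers - k) / n_fibers
        return loads.max()          # peak load = bundle strength per fiber

    print(bundle_strength())        # ~0.25 for uniform(0, 1) thresholds
    ```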

  16. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
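
    A minimal sketch of the kind of entropy estimate such an approach starts from (not the paper's actual procedure): discretize the return series into symbols and compute the Shannon block entropy. The bin count, block length and synthetic data below are arbitrary illustrative choices.

    ```python
    # Sketch (not the paper's algorithm) of estimating Shannon entropy from a
    # symbolized return series: returns are binned into discrete symbols and
    # the entropy of length-m symbol blocks is computed.
    import numpy as np
    from collections import Counter

    def block_entropy(returns, n_symbols=3, m=2):
        # Symbolize returns by quantile binning into n_symbols states.
        edges = np.quantile(returns, np.linspace(0, 1, n_symbols + 1)[1:-1])
        symbols = np.digitize(returns, edges)
        # Count length-m blocks and compute Shannon entropy in bits.
        blocks = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
        counts = np.array(list(Counter(blocks).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(1)
    fake_returns = rng.normal(0, 0.01, 5000)   # placeholder for price returns
    print(block_entropy(fake_returns))         # close to m*log2(3) for i.i.d. data
    ```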

  17. Technology Tools to Support Reading in the Digital Age

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Griffiths, Gina G.

    2012-01-01

    Advances in digital technologies are dramatically altering the texts and tools available to teachers and students. These technological advances have created excitement among many for their potential to be used as instructional tools for literacy education. Yet with the promise of these advances come issues that can exacerbate the literacy…

  18. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  19. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  20. Pathogenesis-based treatments in primary Sjogren's syndrome using artificial intelligence and advanced machine learning techniques: a systematic literature review.

    PubMed

    Foulquier, Nathan; Redou, Pascal; Le Gal, Christophe; Rouvière, Bénédicte; Pers, Jacques-Olivier; Saraux, Alain

    2018-05-17

    Big data analysis has become a common way to extract information from complex and large datasets in most scientific domains. This approach is now used to study large cohorts of patients in medicine. This work is a review of publications that have used artificial intelligence and advanced machine learning techniques to study pathogenesis-based treatments in primary Sjögren's syndrome (pSS). A systematic literature review retrieved all articles reporting on the use of advanced statistical analysis applied to the study of systemic autoimmune diseases (SADs) over the last decade. An automatic bibliography screening method was developed to perform this task. The program, called BIBOT, was designed to fetch and analyze articles from the PubMed database using a list of keywords and natural language processing approaches. The evolution of trends in statistical approaches, cohort sizes and numbers of publications over this period was also computed in the process. In all, 44,077 abstracts were screened and 1,017 publications were analyzed. The mean number of selected articles was 101.0 (S.D. 19.16) per year, but it increased significantly over time (from 74 articles in 2008 to 138 in 2017). Among them, only 12 focused on pSS, and none emphasized pathogenesis-based treatments. To conclude, medicine is progressively entering the era of big data analysis and artificial intelligence, but these approaches are not yet used to describe pSS-specific pathogenesis-based treatment. Nevertheless, large multicentre studies are investigating this aspect with advanced algorithmic tools on large cohorts of SAD patients.

  1. Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven

    2016-02-06

    The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. In order to improve accessibility, we created Web-TCGA, a web-based, freely accessible online tool for integrated analysis of molecular cancer data sets provided by TCGA, which can also be run in a private instance. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analysis by providing interactive tables and views. As a supplement to other already available tools, such as cBioPortal (Sci Signal 6:pl1, 2013, Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service, which does not require any installation or configuration, for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).

  2. Development and pilot of an advance care planning website for women with ovarian cancer: a randomized controlled trial.

    PubMed

    Vogel, Rachel Isaksson; Petzel, Sue V; Cragg, Julie; McClellan, Molly; Chan, Daniel; Dickson, Elizabeth; Jacko, Julie A; Sainfort, François; Geller, Melissa A

    2013-11-01

    Few available tools facilitate discussion of quality of life and end-of-life care between cancer patients and physicians. Our objective was to develop a web-based tool to promote advance care planning for women with ovarian cancer. Women with ovarian cancer, their families, clinicians and researchers met to identify ways to improve cancer care. A prototype website was created to address advance care planning, focusing on advance healthcare directives (AHD) and palliative care consultation. Patients were recruited from a gynecologic oncology clinic for a pilot randomized controlled trial. Primary outcomes included completion of an AHD and palliative care consultation. At study completion, 53 women with ovarian cancer were enrolled and 35 completed the study. The mean age at enrollment was 57.9 ± 9.5 years; most were newly diagnosed or at first recurrence. There were no statistical differences in completion of AHD (p=0.220) or palliative care consultation (p=0.440) between intervention and control groups. However, women in the intervention group showed evidence of moving toward decision making regarding AHD and palliative care and lower decisional conflict. Women assigned to the intervention, compared to the control website, were highly satisfied with the amount (p=0.054) and quality (p=0.119) of information and, when they accessed the website, used it longer (p=0.049). Overall website use was lower than expected, resulting from several patient-related and design barriers. A website providing information and decisional support for women with ovarian cancer is feasible. Increasing frequency of website use requires future research. © 2013.

  3. Development and pilot of an advance care planning website for women with ovarian cancer: A randomized controlled trial

    PubMed Central

    Vogel, Rachel Isaksson; Petzel, Sue V.; Cragg, Julie; McClellan, Molly; Chan, Daniel; Dickson, Elizabeth; Jacko, Julie A.; Sainfort, François; Geller, Melissa A.

    2015-01-01

    Objective Few available tools facilitate discussion of quality of life and end-of-life care between cancer patients and physicians. Our objective was to develop a web-based tool to promote advance care planning for women with ovarian cancer. Methods Women with ovarian cancer, their families, clinicians and researchers met to identify ways to improve cancer care. A prototype website was created to address advance care planning, focusing on advance healthcare directives (AHD) and palliative care consultation. Patients were recruited from a gynecologic oncology clinic for a pilot randomized controlled trial. Primary outcomes included completion of an AHD and palliative care consultation. Results At study completion, 53 women with ovarian cancer were enrolled and 35 completed the study. The mean age at enrollment was 57.9 ± 9.5 years; most were newly diagnosed or at first recurrence. There were no statistical differences in completion of AHD (p = 0.220) or palliative care consultation (p = 0.440) between intervention and control groups. However, women in the intervention group showed evidence of moving toward decision making regarding AHD and palliative care and lower decisional conflict. Women assigned to the intervention, compared to the control website, were highly satisfied with the amount (p = 0.054) and quality (p = 0.119) of information and, when they accessed the website, used it longer (p = 0.049). Overall website use was lower than expected, resulting from several patient-related and design barriers. Conclusions A website providing information and decisional support for women with ovarian cancer is feasible. Increasing frequency of website use requires future research. PMID:23988413

  4. A toolbox for determining subdiffusive mechanisms

    NASA Astrophysics Data System (ADS)

    Meroz, Yasmine; Sokolov, Igor M.

    2015-04-01

    Subdiffusive processes have become a field of great interest in the last decades, due to mounting experimental evidence of subdiffusive behavior in complex systems, and especially in biological systems. Different physical scenarios leading to subdiffusion differ in the details of the dynamics. These differences are what allow the underlying physics to be reconstructed theoretically from the results of observations, and they are the topic of this review. We review the main statistical analyses available today to distinguish between these scenarios, categorizing them according to the relevant characteristics. We collect the available tools and statistical tests, presenting them within a broader perspective. We also consider possible complications such as the subordination of subdiffusive mechanisms. Due to the advances in single-particle tracking experiments in recent years, we focus on the relevant case where the available experimental data are scant, at the level of single trajectories.
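
    One of the most basic single-trajectory tools in such a toolbox is the time-averaged mean squared displacement and the anomalous exponent fitted to it (alpha < 1 indicating subdiffusion). The sketch below is purely illustrative, applied to a synthetic Brownian trajectory rather than experimental data.

    ```python
    # Illustrative single-trajectory analysis: the time-averaged mean squared
    # displacement (TA-MSD) and a log-log fit of the anomalous diffusion
    # exponent alpha (alpha < 1 => subdiffusion). Synthetic trajectory only.
    import numpy as np

    def tamsd(traj, lags):
        """Time-averaged MSD of a 1D (or N x d) trajectory for the given lags."""
        traj = np.atleast_2d(traj.T).T  # ensure shape (N, d)
        out = []
        for lag in lags:
            disp = traj[lag:] - traj[:-lag]
            out.append(np.mean(np.sum(disp**2, axis=1)))
        return np.array(out)

    rng = np.random.default_rng(2)
    x = np.cumsum(rng.normal(size=10000))      # ordinary Brownian walk (alpha ~ 1)
    lags = np.arange(1, 100)
    msd = tamsd(x, lags)
    alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]
    print(f"estimated alpha = {alpha:.2f}")
    ```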

  5. Fukunaga-Koontz feature transformation for statistical structural damage detection and hierarchical neuro-fuzzy damage localisation

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2017-07-01

    Considering jointly damage sensitive features (DSFs) of signals recorded by multiple sensors, applying advanced transformations to these DSFs and assessing systematically their contribution to damage detectability and localisation can significantly enhance the performance of structural health monitoring systems. This philosophy is explored here for partial autocorrelation coefficients (PACCs) of acceleration responses. They are interrogated with the help of the linear discriminant analysis based on the Fukunaga-Koontz transformation using datasets of the healthy and selected reference damage states. Then, a simple but efficient fast forward selection procedure is applied to rank the DSF components with respect to statistical distance measures specialised for either damage detection or localisation. For the damage detection task, the optimal feature subsets are identified based on the statistical hypothesis testing. For damage localisation, a hierarchical neuro-fuzzy tool is developed that uses the DSF ranking to establish its own optimal architecture. The proposed approaches are evaluated experimentally on data from non-destructively simulated damage in a laboratory scale wind turbine blade. The results support our claim of being able to enhance damage detectability and localisation performance by transforming and optimally selecting DSFs. It is demonstrated that the optimally selected PACCs from multiple sensors or their Fukunaga-Koontz transformed versions can not only improve the detectability of damage via statistical hypothesis testing but also increase the accuracy of damage localisation when used as inputs into a hierarchical neuro-fuzzy network. Furthermore, the computational effort of employing these advanced soft computing models for damage localisation can be significantly reduced by using transformed DSFs.
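
    As a small illustration of the feature-extraction step described above (and only that step; the Fukunaga-Koontz transformation, ranking and neuro-fuzzy localisation are not reproduced here), partial autocorrelation coefficients can be computed from an acceleration record as follows, assuming statsmodels is available. The synthetic signal is a stand-in for a measured response.

    ```python
    # Sketch of extracting partial autocorrelation coefficients (PACCs) as
    # damage-sensitive features from an acceleration response, one vector per
    # sensor/record. Synthetic signal used here for illustration only.
    import numpy as np
    from statsmodels.tsa.stattools import pacf

    def pacc_features(acceleration, n_coeffs=10):
        # The lag-0 coefficient is always 1, so it is dropped from the feature vector.
        return pacf(acceleration, nlags=n_coeffs)[1:]

    rng = np.random.default_rng(3)
    signal = np.sin(0.2 * np.arange(2000)) + 0.3 * rng.normal(size=2000)
    print(pacc_features(signal))
    ```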

  6. Further Development and Assessment of a Broadband Liner Optimization Process

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.

    2016-01-01

    The utilization of advanced fan designs (including higher bypass ratios) and shorter engine nacelles has highlighted a need for increased fan noise reduction over a broader frequency range. Thus, improved broadband liner designs must account for these constraints and, where applicable, take advantage of advanced manufacturing techniques that have opened new possibilities for novel configurations. This work focuses on the use of an established broadband acoustic liner optimization process to design a variable-depth, multi-degree of freedom liner for a high speed fan. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design a liner aimed at producing impedance spectra that most closely match the predicted optimum values. The multi-degree of freedom design is carried through design, fabrication, and testing. In-duct attenuation predictions compare well with measured data and the multi-degree of freedom liner is shown to outperform a more conventional liner over a range of flow conditions. These promising results provide further confidence in the design tool, as well as the enhancements made to the overall design process.

  7. MPD: a pathogen genome and metagenome database

    PubMed Central

    Zhang, Tingting; Miao, Jiaojiao; Han, Na; Qiang, Yujun; Zhang, Wen

    2018-01-01

    Advances in high-throughput sequencing have led to unprecedented growth in the amount of available genome sequencing data, especially for bacterial genomes, which has been accompanied by a challenge for the storage and management of such huge datasets. To facilitate bacterial research and related studies, we have developed the Mypathogen database (MPD), which provides access to users for searching, downloading, storing and sharing bacterial genomics data. The MPD represents the first pathogenic database for microbial genomes and metagenomes, and currently covers pathogenic microbial genomes (6604 genera, 11 071 species, 41 906 strains) and metagenomic data from host, air, water and other sources (28 816 samples). The MPD also functions as a management system for statistical and storage data that can be used by different organizations, thereby facilitating data sharing among different organizations and research groups. A user-friendly local client tool is provided to maintain the steady transmission of big sequencing data. The MPD is a useful tool for analysis and management in genomic research, especially for clinical Centers for Disease Control and epidemiological studies, and is expected to contribute to advancing knowledge on pathogenic bacteria genomes and metagenomes. Database URL: http://data.mypathogen.org PMID:29917040

  8. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical differences caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprising millions of lines of code, developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover the CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next, we use a tool developed to identify the CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
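
    The variable-discovery step can be thought of as a stability-selection problem: fit an L1-penalized logistic regression on many resampled subsets and keep the variables that are repeatedly assigned nonzero coefficients. The sketch below illustrates that generic idea on random data; it is not the CESM-ECT code, and the data, labels and parameter choices are invented for illustration.

    ```python
    # Hedged sketch of randomized-logistic-regression-style variable selection:
    # fit an L1-penalized logistic regression on many resampled subsets and count
    # how often each variable receives a nonzero coefficient. Random data stands
    # in for CESM output variables and consistency labels.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    n_runs, n_vars = 200, 30
    X = rng.normal(size=(n_runs, n_vars))
    # "Distinct" runs driven by two of the variables, for illustration only.
    y = (X[:, 0] + 0.8 * X[:, 5] + 0.5 * rng.normal(size=n_runs) > 0).astype(int)

    selection_counts = np.zeros(n_vars)
    n_resamples = 100
    for _ in range(n_resamples):
        idx = rng.choice(n_runs, size=n_runs // 2, replace=False)
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
        clf.fit(X[idx], y[idx])
        selection_counts += (np.abs(clf.coef_[0]) > 1e-8)

    selection_freq = selection_counts / n_resamples
    print(np.argsort(selection_freq)[::-1][:5], np.sort(selection_freq)[::-1][:5])
    ```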

  9. The potential contributions of geographic information science to the study of social determinants of health in Iran.

    PubMed

    Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A

    2018-01-01

    Recent interest in the social determinants of health (SDOH) and the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.

  10. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large...was examined by SOAP to confirm the conjunction. STK Advanced CAT STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK ...run with each tool. When attempting to perform the seven day all vs all analysis with STK Advanced CAT, the program consistently crashed during report

  11. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false-positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.

  12. Key statistical and analytical issues for evaluating treatment effects in periodontal research.

    PubMed

    Tu, Yu-Kang; Gilthorpe, Mark S

    2012-06-01

    Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.
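
    To make the t-test versus analysis-of-covariance point concrete, the short simulation below (illustrative only, with invented "pocket depth"-style numbers) compares an unadjusted two-sample t-test on follow-up values with an ANCOVA that adjusts for baseline; because follow-up depends on baseline, the baseline-adjusted analysis is typically the more precise estimate of the treatment effect and is less vulnerable to regression to the mean.

    ```python
    # Small illustrative simulation of baseline adjustment: t-test on follow-up
    # values versus ANCOVA (follow-up ~ group + baseline). Data are made up.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 60
    baseline = rng.normal(5.0, 1.0, 2 * n)
    group = np.repeat([0, 1], n)                 # 0 = control, 1 = treatment
    true_effect = -0.5
    followup = 0.6 * baseline + group * true_effect + rng.normal(0, 0.8, 2 * n)

    df = pd.DataFrame({"baseline": baseline, "group": group, "followup": followup})

    # Unadjusted two-sample t-test on follow-up values
    t, p = stats.ttest_ind(df.followup[df.group == 1], df.followup[df.group == 0])
    print("t-test p =", round(p, 4))

    # ANCOVA: follow-up regressed on group, adjusting for baseline
    ancova = smf.ols("followup ~ group + baseline", data=df).fit()
    print(ancova.params["group"], ancova.pvalues["group"])
    ```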

  13. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current-generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  14. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  15. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  17. Construct Validity: Advances in Theory and Methodology

    PubMed Central

    Strauss, Milton E.; Smith, Gregory T.

    2008-01-01

    Measures of psychological constructs are validated by testing whether they relate to measures of other constructs as specified by theory. Each test of relations between measures reflects on the validity of both the measures and the theory driving the test. Construct validation concerns the simultaneous process of measure and theory validation. In this chapter, we review the recent history of validation efforts in clinical psychological science that has led to this perspective, and we review five recent advances in validation theory and methodology of importance for clinical researchers. These are: the emergence of nonjustificationist philosophy of science; an increasing appreciation for theory and the need for informative tests of construct validity; valid construct representation in experimental psychopathology; the need to avoid representing multidimensional constructs with a single score; and the emergence of effective new statistical tools for the evaluation of convergent and discriminant validity. PMID:19086835

  18. Primary Sclerosing Cholangitis Risk Estimate Tool (PREsTo) Predicts Outcomes in PSC: A Derivation & Validation Study Using Machine Learning.

    PubMed

    Eaton, John E; Vesterhus, Mette; McCauley, Bryan M; Atkinson, Elizabeth J; Schlicht, Erik M; Juran, Brian D; Gossard, Andrea A; LaRusso, Nicholas F; Gores, Gregory J; Karlsen, Tom H; Lazaridis, Konstantinos N

    2018-05-09

    Improved methods are needed to risk stratify and predict outcomes in patients with primary sclerosing cholangitis (PSC). Therefore, we sought to derive and validate a new prediction model and compare its performance to existing surrogate markers. The model was derived using 509 subjects from a multicenter North American cohort and validated in an international multicenter cohort (n=278). Gradient boosting, a machine learning technique, was used to create the model. The endpoint was hepatic decompensation (ascites, variceal hemorrhage or encephalopathy). Subjects with advanced PSC or cholangiocarcinoma at baseline were excluded. The PSC risk estimate tool (PREsTo) consists of 9 variables: bilirubin, albumin, serum alkaline phosphatase (SAP) times the upper limit of normal (ULN), platelets, AST, hemoglobin, sodium, patient age and the number of years since PSC was diagnosed. Validation in an independent cohort confirms that PREsTo accurately predicts decompensation (C statistic 0.90, 95% confidence interval (CI) 0.84-0.95) and performs well compared to the MELD score (C statistic 0.72, 95% CI 0.57-0.84), the Mayo PSC risk score (C statistic 0.85, 95% CI 0.77-0.92) and SAP < 1.5x ULN (C statistic 0.65, 95% CI 0.55-0.73). PREsTo remained accurate among individuals with a bilirubin < 2.0 mg/dL (C statistic 0.90, 95% CI 0.82-0.96) and when the score was re-applied later in the disease course (C statistic 0.82, 95% CI 0.64-0.95). PREsTo accurately predicts hepatic decompensation in PSC and exceeds the performance of other widely available, noninvasive prognostic scoring systems. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.

  19. MutSpec: a Galaxy toolbox for streamlined analyses of somatic mutation spectra in human and mouse cancer genomes.

    PubMed

    Ardin, Maude; Cahais, Vincent; Castells, Xavier; Bouaoun, Liacine; Byrnes, Graham; Herceg, Zdenko; Zavadil, Jiri; Olivier, Magali

    2016-04-18

    The nature of somatic mutations observed in human tumors, at single-gene or genome-wide levels, can reveal information on past carcinogenic exposures and on the mutational processes contributing to tumor development. While large amounts of sequencing data are being generated, the associated analysis and interpretation of mutation patterns that may reveal clues about the natural history of cancer present complex and challenging tasks requiring advanced bioinformatics skills. To make such analyses accessible to a wider community of researchers with no programming expertise, we have developed MutSpec, a first-of-its-kind package built within the user-friendly web-based platform Galaxy. MutSpec includes a set of tools that perform variant annotation and use advanced statistics for the identification of mutation signatures present in cancer genomes and for comparing the obtained signatures with those published in the COSMIC database and other sources. MutSpec offers an accessible framework for building reproducible analysis pipelines, integrating existing methods and scripts developed in-house with publicly available R packages. MutSpec may be used to analyse data from whole-exome, whole-genome or targeted sequencing experiments performed on human or mouse genomes. Results are provided in various formats, including rich graphical outputs. An example is presented to illustrate the package functionalities, the straightforward workflow analysis and the richness of the statistics and publication-grade graphics produced by the tool. MutSpec offers an easy-to-use graphical interface embedded in the popular Galaxy platform that can be used by researchers with limited programming or bioinformatics expertise to analyse mutation signatures present in cancer genomes. MutSpec can thus effectively assist in the discovery of complex mutational processes resulting from exogenous and endogenous carcinogenic insults.
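
    Mutation-signature identification of the kind described above is commonly framed as non-negative matrix factorization of a samples-by-96-trinucleotide-context count matrix, with cosine similarity used to match extracted signatures against published references such as COSMIC. The sketch below illustrates that general framing on random counts; it is not MutSpec's code, and the catalogue, component number and reference vector are placeholders.

    ```python
    # General sketch of signature extraction (not MutSpec's own code): factor a
    # samples x 96-context mutation count matrix with non-negative matrix
    # factorization, then compare extracted signatures to a reference by
    # cosine similarity.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(6)
    counts = rng.poisson(5, size=(40, 96)).astype(float)   # placeholder catalogue

    model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
    exposures = model.fit_transform(counts)     # samples x signatures
    signatures = model.components_              # signatures x 96 contexts

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Compare the first extracted signature against a (here random) reference.
    reference_signature = rng.random(96)
    print(cosine_similarity(signatures[0], reference_signature))
    ```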

  20. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed for analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. Rich varieties of advanced and recent statistical modelling approaches are available, mostly through open source software (one of them being R). However, these advanced statistical modelling approaches are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical modelling approaches are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical modelling approaches in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including modelling using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes the statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.

  1. Heavy-Duty Vehicle Port Drayage Drive Cycle Characterization and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prohaska, Robert; Konan, Arnaud; Kelly, Kenneth

    In an effort to better understand the operational requirements of port drayage vehicles and their potential for adoption of advanced technologies, National Renewable Energy Laboratory (NREL) researchers collected over 36,000 miles of in-use duty cycle data from 30 Class 8 drayage trucks operating at the Port of Long Beach and Port of Los Angeles in Southern California. These data include 1-Hz global positioning system location and SAE J1939 high-speed controller area network information. Researchers processed the data through NREL's Drive-Cycle Rapid Investigation, Visualization, and Evaluation tool to examine vehicle kinematic and dynamic patterns across the spectrum of operations. Using the k-medoids clustering method, a repeatable and quantitative process for multi-mode drive cycle segmentation, the analysis led to the creation of multiple drive cycles representing four distinct modes of operation that can be used independently or in combination. These drive cycles are statistically representative of real-world operation of port drayage vehicles. When combined with modeling and simulation tools, these representative test cycles allow advanced vehicle or systems developers to efficiently and accurately evaluate vehicle technology performance requirements to reduce cost and development time while ultimately leading to the commercialization of advanced technologies that meet the performance requirements of the port drayage vocation. The drive cycles, which are suitable for chassis dynamometer testing, were compared to several existing test cycles. This paper presents the clustering methodology, accompanying results of the port drayage duty cycle analysis and custom drive cycle creation.
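
    For readers unfamiliar with k-medoids, the sketch below shows a minimal PAM-style implementation: alternate between assigning points to their nearest medoid and re-selecting each cluster's medoid as the member that minimizes total within-cluster distance. It is not NREL's tool, and the random feature matrix simply stands in for per-microtrip kinematic features.

    ```python
    # Minimal k-medoids sketch (not NREL's DRIVE tool): alternate between
    # assigning points to the nearest medoid and re-picking each cluster's
    # medoid as the point minimizing total within-cluster distance.
    import numpy as np

    def k_medoids(X, k=4, n_iter=50, seed=0):
        rng = np.random.default_rng(seed)
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        medoids = rng.choice(len(X), size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(dists[:, medoids], axis=1)
            new_medoids = medoids.copy()
            for j in range(k):
                members = np.where(labels == j)[0]
                if len(members):
                    within = dists[np.ix_(members, members)].sum(axis=1)
                    new_medoids[j] = members[np.argmin(within)]
            if np.array_equal(new_medoids, medoids):
                break
            medoids = new_medoids
        return medoids, labels

    X = np.random.default_rng(7).normal(size=(300, 5))   # stand-in feature vectors
    medoids, labels = k_medoids(X, k=4)
    print(medoids, np.bincount(labels))
    ```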

  2. Heavy-Duty Vehicle Port Drayage Drive Cycle Characterization and Development: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prohaska, Robert; Konan, Arnaud; Kelly, Kenneth

    In an effort to better understand the operational requirements of port drayage vehicles and their potential for adoption of advanced technologies, National Renewable Energy Laboratory (NREL) researchers collected over 36,000 miles of in-use duty cycle data from 30 Class 8 drayage trucks operating at the Port of Long Beach and Port of Los Angeles in Southern California. These data include 1-Hz global positioning system location and SAE J1939 high-speed controller area network information. Researchers processed the data through NREL's Drive-Cycle Rapid Investigation, Visualization, and Evaluation tool to examine vehicle kinematic and dynamic patterns across the spectrum of operations. Using the k-medoids clustering method, a repeatable and quantitative process for multi-mode drive cycle segmentation, the analysis led to the creation of multiple drive cycles representing four distinct modes of operation that can be used independently or in combination. These drive cycles are statistically representative of real-world operation of port drayage vehicles. When combined with modeling and simulation tools, these representative test cycles allow advanced vehicle or systems developers to efficiently and accurately evaluate vehicle technology performance requirements to reduce cost and development time while ultimately leading to the commercialization of advanced technologies that meet the performance requirements of the port drayage vocation. The drive cycles, which are suitable for chassis dynamometer testing, were compared to several existing test cycles. This paper presents the clustering methodology, accompanying results of the port drayage duty cycle analysis and custom drive cycle creation.

  3. Heavy-Duty Vehicle Port Drayage Drive Cycle Characterization and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prohaska, Robert; Konan, Arnaud; Kelly, Kenneth

    2016-05-02

    In an effort to better understand the operational requirements of port drayage vehicles and their potential for adoption of advanced technologies, National Renewable Energy Laboratory (NREL) researchers collected over 36,000 miles of in-use duty cycle data from 30 Class 8 drayage trucks operating at the Port of Long Beach and Port of Los Angeles in Southern California. These data include 1-Hz global positioning system location and SAE J1939 high-speed controller area network information. Researchers processed the data through NREL's Drive-Cycle Rapid Investigation, Visualization, and Evaluation tool to examine vehicle kinematic and dynamic patterns across the spectrum of operations. Using the k-medoids clustering method, a repeatable and quantitative process for multi-mode drive cycle segmentation, the analysis led to the creation of multiple drive cycles representing four distinct modes of operation that can be used independently or in combination. These drive cycles are statistically representative of real-world operation of port drayage vehicles. When combined with modeling and simulation tools, these representative test cycles allow advanced vehicle or systems developers to efficiently and accurately evaluate vehicle technology performance requirements to reduce cost and development time while ultimately leading to the commercialization of advanced technologies that meet the performance requirements of the port drayage vocation. The drive cycles, which are suitable for chassis dynamometer testing, were compared to several existing test cycles. This paper presents the clustering methodology, accompanying results of the port drayage duty cycle analysis and custom drive cycle creation.

  4. Evaluation of the Terminal Sequencing and Spacing System for Performance Based Navigation Arrivals

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Jung, Jaewoo; Swenson, Harry N.; Martin, Lynne; Lin, Melody; Nguyen, Jimmy

    2013-01-01

    NASA has developed the Terminal Sequencing and Spacing (TSS) system, a suite of advanced arrival management technologies combining time-based scheduling and controller precision spacing tools. TSS is a ground-based controller automation tool that facilitates sequencing and merging of arrivals that have both current standard ATC routes and terminal Performance-Based Navigation (PBN) routes, especially during highly congested demand periods. In collaboration with the FAA and MITRE's Center for Advanced Aviation System Development (CAASD), TSS system performance was evaluated in human-in-the-loop (HITL) simulations with currently active controllers as participants. Traffic scenarios had mixed Area Navigation (RNAV) and Required Navigation Performance (RNP) equipage, where the more advanced RNP-equipped aircraft had preferential treatment with a shorter approach option. Simulation results indicate the TSS system achieved benefits by enabling PBN while maintaining high throughput rates, 10% above baseline demand levels. Flight path predictability improved: path deviation was reduced by 2 NM on average, and variance in the downwind leg length was 75% less. Arrivals flew more fuel-efficient descents for longer, spending an average of 39 seconds less in step-down level altitude segments. Self-reported controller workload was reduced, with statistically significant differences at the p < 0.01 level. The RNP-equipped arrivals were also able to more frequently capitalize on the benefits of being "Best-Equipped, Best-Served" (BEBS), where less vectoring was needed and nearly all RNP approaches were conducted without interruption.

  5. Recent Advances in Algal Genetic Tool Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Dahlin, Lukas; T. Guarnieri, Michael

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  6. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  7. Performance Analysis of and Tool Support for Transactional Memory on BG/Q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schindewolf, M

    2011-12-08

    Martin Schindewolf worked during his internship at the Lawrence Livermore National Laboratory (LLNL) under the guidance of Martin Schulz in the Computer Science Group of the Center for Applied Scientific Computing. We studied the performance of the transactional memory (TM) subsystem of BG/Q and researched the possibilities for tool support for TM. To study the performance, we ran CLOMP-TM, a benchmark designed to quantify the overhead of OpenMP and to compare different synchronization primitives. To advance CLOMP-TM, we added Message Passing Interface (MPI) routines for hybrid parallelization. This makes it possible to run multiple MPI tasks, each running OpenMP, on one node. With these enhancements, a beneficial MPI task to OpenMP thread ratio is determined. Further, the synchronization primitives are ranked as a function of the application characteristics. To demonstrate the usefulness of these results, we investigated a real Monte Carlo simulation called the Monte Carlo Benchmark (MCB). Applying the lessons learned yields the best task-to-thread ratio, and we were able to tune the synchronization by transactifying the MCB. We also developed tools that capture the performance of the TM run time system and present it to the application developer. The performance of the TM run time system relies on its built-in statistics. These tools use the Blue Gene Performance Monitoring (BGPM) interface to correlate the statistics from the TM run time system with performance counter values. This combination provides detailed insight into the run time behavior of the application and enables tracking down the causes of degraded performance. In addition, one tool has been implemented that separates the performance counters into three categories: Successful Speculation, Unsuccessful Speculation and No Speculation. All of the tools are built around IBM's xlc compiler for C and C++ and have been run and tested on a Q32 early access system.

  8. Challenges of assessing critical thinking and clinical judgment in nurse practitioner students.

    PubMed

    Gorton, Karen L; Hayes, Janice

    2014-03-01

    The purpose of this study was to determine whether there was a relationship between critical thinking skills and clinical judgment in nurse practitioner students. The study used a convenience, nonprobability sampling technique, engaging participants from across the United States. Correlational analysis demonstrated no statistically significant relationship between critical thinking skills and examination-style questions, critical thinking skills and scores on the evaluation and reevaluation of consequences subscale of the Clinical Decision Making in Nursing Scale, and critical thinking skills and the preceptor evaluation tool. The study found no statistically significant relationships between critical thinking skills and clinical judgment. Educators and practitioners could consider further research in these areas to gain insight into how critical thinking is and could be measured, to gain insight into the clinical decision making skills of nurse practitioner students, and to gain insight into the development and measurement of critical thinking skills in advanced practice educational programs. Copyright 2014, SLACK Incorporated.

  9. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  10. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond the classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, global warming and questions on climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters and state-of-the-art numerical analysis tools.
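    As an illustration of the kind of Kalman-filter post-processing described above, the following minimal sketch tracks and removes a slowly varying forecast bias. It is a generic scalar filter run on synthetic data, not the authors' method, and the noise parameters are assumptions.

```python
import numpy as np

# Minimal scalar Kalman filter that tracks the systematic bias of a numerical
# forecast; process noise q and observation noise r are illustrative values.
def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
    bias_est, p = 0.0, 1.0              # initial bias estimate and its variance
    corrected = []
    for f, obs in zip(forecasts, observations):
        corrected.append(f - bias_est)  # correct forecast with the current bias estimate
        p += q                          # predict step: uncertainty grows
        innovation = (f - obs) - bias_est
        k = p / (p + r)                 # Kalman gain
        bias_est += k * innovation      # update bias estimate
        p *= (1.0 - k)
    return np.array(corrected)

# Synthetic example: forecasts carry a +2.0 systematic bias plus random noise.
rng = np.random.default_rng(0)
truth = 15 + np.sin(np.linspace(0, 6, 200))
fcst = truth + 2.0 + rng.normal(0, 0.5, truth.size)
corr = kalman_bias_correction(fcst, truth)
print("raw mean bias:", np.mean(fcst - truth), "corrected mean bias:", np.mean(corr - truth))
```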

  11. Utility of the advanced chronic kidney disease patient management tools: case studies.

    PubMed

    Patwardhan, Meenal B; Matchar, David B; Samsa, Gregory P; Haley, William E

    2008-01-01

    Appropriate management of advanced chronic kidney disease (CKD) delays or limits its progression. The Advanced CKD Patient Management Toolkit was developed using a process-improvement technique to assist patient management and address CKD-specific management issues. We pilot tested the toolkit in 2 community nephrology practices, assessed the utility of individual tools, and evaluated the impact on conformance to an advanced CKD guideline through patient chart abstraction. Tool use was distinct in the 2 sites and depended on the site champion's involvement, the extent of process reconfiguration demanded by a tool, and its perceived value. Baseline conformance varied across guideline recommendations (averaged 54%). Posttrial conformance increased in all clinical areas (averaged 59%). Valuable features of the toolkit in real-world settings were its ability to: facilitate tool selection, direct implementation efforts in response to a baseline performance audit, and allow selection of tool versions and customizing them. Our results suggest that systematically created, multifaceted, and customizable tools can promote guideline conformance.

  12. Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces.

    PubMed

    Spezia, Riccardo; Martínez-Nuñez, Emilio; Vazquez, Saulo; Hase, William L

    2017-04-28

    In this Introduction, we show the basic problems of non-statistical and non-equilibrium phenomena related to the papers collected in this themed issue. Over the past few years, significant advances in both computing power and development of theories have allowed the study of larger systems, increasing the time length of simulations and improving the quality of potential energy surfaces. In particular, the possibility of using quantum chemistry to calculate energies and forces 'on the fly' has paved the way to directly study chemical reactions. This has provided a valuable tool to explore molecular mechanisms at given temperatures and energies and to see whether these reactive trajectories follow statistical laws and/or minimum energy pathways. This themed issue collects different aspects of the problem and gives an overview of recent works and developments in different contexts, from the gas phase to the condensed phase to excited states.This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).

  13. Metabonomics and drug development.

    PubMed

    Ramana, Pranov; Adams, Erwin; Augustijns, Patrick; Van Schepdael, Ann

    2015-01-01

    Metabolites as an end product of metabolism possess a wealth of information about altered metabolic control and homeostasis that is dependent on numerous variables including age, sex, and environment. Studying significant changes in the metabolite patterns has been recognized as a tool to understand crucial aspects in drug development like drug efficacy and toxicity. The inclusion of metabonomics into the OMICS study platform brings us closer to define the phenotype and allows us to look at alternatives to improve the diagnosis of diseases. Advancements in the analytical strategies and statistical tools used to study metabonomics allow us to prevent drug failures at early stages of drug development and reduce financial losses during expensive phase II and III clinical trials. This chapter introduces metabonomics along with the instruments used in the study; in addition relevant examples of the usage of metabonomics in the drug development process are discussed along with an emphasis on future directions and the challenges it faces.

  14. Bioinformatics tools in predictive ecology: applications to fisheries

    PubMed Central

    Tucker, Allan; Duplisea, Daniel

    2012-01-01

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their ‘crossover potential’ with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse. PMID:22144390

  15. Technology Solutions Case Study: Predicting Envelope Leakage in Attached Dwellings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-11-01

    The most common method of measuring air leakage is to perform a single (or solo) blower door pressurization and/or depressurization test. In detached housing, the single blower door test measures leakage to the outside. In attached housing, however, this “solo” test method measures both air leakage to the outside and air leakage between adjacent units through common surfaces. In an attempt to create a simplified tool for predicting leakage to the outside, Building America team Consortium for Advanced Residential Buildings (CARB) performed a preliminary statistical analysis on blower door test results from 112 attached dwelling units in four apartment complexes. Although the subject data set is limited in size and variety, the preliminary analyses suggest significant predictors are present and support the development of a predictive model. Further data collection is underway to create a more robust prediction tool for use across different construction types, climate zones, and unit configurations.

  16. Bioinformatics tools in predictive ecology: applications to fisheries.

    PubMed

    Tucker, Allan; Duplisea, Daniel

    2012-01-19

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their 'crossover potential' with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse.

  17. Health-Related Quality of Life in SCALOP, a Randomized Phase 2 Trial Comparing Chemoradiation Therapy Regimens in Locally Advanced Pancreatic Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Christopher N., E-mail: hurtcn@cardiff.ac.uk; Mukherjee, Somnath; Bridgewater, John

    Purpose: Chemoradiation therapy (CRT) for patients with locally advanced pancreatic cancer (LAPC) provides survival benefits but may result in considerable toxicity. Health-related quality of life (HRQL) measurements during CRT have not been widely reported. This paper reports HRQL data from the Selective Chemoradiation in Advanced Localised Pancreatic Cancer (SCALOP) trial, including validation of the QLQ-PAN26 tool in CRT. Methods and Materials: Patients with locally advanced, inoperable, nonmetastatic carcinoma of the pancreas were eligible. Following 12 weeks of induction gemcitabine plus capecitabine (GEMCAP) chemotherapy, patients with stable and responding disease were randomized to a further cycle of GEMCAP followed by capecitabine- or gemcitabine-based CRT. HRQL was assessed with the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire (EORTC QLQ-C30) and the EORTC Pancreatic Cancer module (PAN26). Results: A total of 114 patients from 28 UK centers were registered and 74 patients randomized. There was improvement in the majority of HRQL scales during induction chemotherapy. Patients with significant deterioration in fatigue, appetite loss, and gastrointestinal symptoms during CRT recovered within 3 weeks following CRT. Differences in changes in HRQL scores between trial arms rarely reached statistical significance; however, where they did, they favored capecitabine therapy. PAN26 scales had good internal consistency and were able to distinguish between subgroups of patients experiencing toxicity. Conclusions: Although there is deterioration in HRQL following CRT, this resolves within 3 weeks. HRQL data support the use of capecitabine- over gemcitabine-based chemoradiation. The QLQ-PAN26 is a reliable and valid tool for use in patients receiving CRT.

  18. Statistical methods in personality assessment research.

    PubMed

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  19. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.

  20. Using GIS to analyze animal movements in the marine environment

    USGS Publications Warehouse

    Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.

    2001-01-01

    Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.

  1. Use of vegetation health data for estimation of aus rice yield in bangladesh.

    PubMed

    Rahman, Atiqur; Roytman, Leonid; Krakauer, Nir Y; Nizamuddin, Mohammad; Goldberg, Mitch

    2009-01-01

    Rice is a vital staple crop for Bangladesh and surrounding countries, with interannual variation in yields depending on climatic conditions. We compared Bangladesh yield of aus rice, one of the main varieties grown, from official agricultural statistics with Vegetation Health (VH) Indices [Vegetation Condition Index (VCI), Temperature Condition Index (TCI) and Vegetation Health Index (VHI)] computed from Advanced Very High Resolution Radiometer (AVHRR) data covering a period of 15 years (1991-2005). A strong correlation was found between aus rice yield and VCI and VHI during the critical period of aus rice development that occurs during March-April (weeks 8-13 of the year), several months in advance of the rice harvest. Stepwise principal component regression (PCR) was used to construct a model to predict yield as a function of critical-period VHI. The model reduced the yield prediction error variance by 62% compared with a prediction of average yield for each year. Remote sensing is a valuable tool for estimating rice yields well in advance of harvest and at a low cost.
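    A hedged sketch of the principal component regression step described above follows; the VHI predictors and yields are synthetic stand-ins, and the scikit-learn pipeline is an assumed implementation choice rather than the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Illustrative principal component regression (PCR): weekly VHI values during
# the critical period are the predictors, annual yield is the response.
# Data are synthetic; the study itself used 15 years of AVHRR-derived indices.
rng = np.random.default_rng(1)
n_years, n_weeks = 15, 6                  # e.g. weeks 8-13 of each year
vhi = rng.uniform(20, 80, size=(n_years, n_weeks))
yield_t = 1.5 + 0.01 * vhi.mean(axis=1) + rng.normal(0, 0.05, n_years)

pcr = make_pipeline(PCA(n_components=2), LinearRegression())
pcr.fit(vhi, yield_t)
print("in-sample R^2:", round(pcr.score(vhi, yield_t), 3))
```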

  2. Use of Vegetation Health Data for Estimation of Aus Rice Yield in Bangladesh

    PubMed Central

    Rahman, Atiqur; Roytman, Leonid; Krakauer, Nir Y.; Nizamuddin, Mohammad; Goldberg, Mitch

    2009-01-01

    Rice is a vital staple crop for Bangladesh and surrounding countries, with interannual variation in yields depending on climatic conditions. We compared Bangladesh yield of aus rice, one of the main varieties grown, from official agricultural statistics with Vegetation Health (VH) Indices [Vegetation Condition Index (VCI), Temperature Condition Index (TCI) and Vegetation Health Index (VHI)] computed from Advanced Very High Resolution Radiometer (AVHRR) data covering a period of 15 years (1991–2005). A strong correlation was found between aus rice yield and VCI and VHI during the critical period of aus rice development that occurs during March–April (weeks 8–13 of the year), several months in advance of the rice harvest. Stepwise principal component regression (PCR) was used to construct a model to predict yield as a function of critical-period VHI. The model reduced the yield prediction error variance by 62% compared with a prediction of average yield for each year. Remote sensing is a valuable tool for estimating rice yields well in advance of harvest and at a low cost. PMID:22574057

  3. The clinical effects of music therapy in palliative medicine.

    PubMed

    Gallagher, Lisa M; Lagman, Ruth; Walsh, Declan; Davis, Mellar P; Legrand, Susan B

    2006-08-01

    The aim of this study was to objectively assess the effect of music therapy on patients with advanced disease. Two hundred patients with chronic and/or advanced illnesses were prospectively evaluated. The effects of music therapy on these patients are reported. Visual analog scales, the Happy/Sad Faces Assessment Tool, and a behavior scale recorded pre- and post-music therapy scores on standardized data collection forms. A computerized database was used to collect and analyze the data. Using the Wilcoxon signed rank test and a paired t test, music therapy was shown to improve anxiety, body movement, facial expression, mood, pain, shortness of breath, and verbalizations. Sessions with family members were also evaluated, and music therapy improved families' facial expressions, mood, and verbalizations. All improvements were statistically significant (P<0.001). Most patients and families had a positive subjective and objective response to music therapy. Objective data were obtained for a large number of patients with advanced disease. This is a significant addition to the quantitative literature on music therapy in this unique patient population. Our results suggest that music therapy is invaluable in palliative medicine.
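    The paired tests named in this abstract can be reproduced on hypothetical pre/post scores with a few lines of SciPy; the data below are invented for illustration and do not come from the study.

```python
import numpy as np
from scipy.stats import wilcoxon, ttest_rel

# Hypothetical pre/post music-therapy scores on a 0-10 visual analog scale.
rng = np.random.default_rng(2)
pre = rng.integers(4, 10, size=30).astype(float)
post = np.clip(pre - rng.normal(1.5, 1.0, size=30), 0, 10)

# Wilcoxon signed rank test and paired t test, as in the abstract.
w_stat, w_p = wilcoxon(pre, post)
t_stat, t_p = ttest_rel(pre, post)
print(f"Wilcoxon p={w_p:.4f}, paired t p={t_p:.4f}")
```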

  4. Validation of surrogate endpoints in advanced solid tumors: systematic review of statistical methods, results, and implications for policy makers.

    PubMed

    Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S

    2014-07-01

    Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.

  5. Sparse network modeling and metscape-based visualization methods for the analysis of large-scale metabolomics data.

    PubMed

    Basu, Sumanta; Duren, William; Evans, Charles R; Burant, Charles F; Michailidis, George; Karnovsky, Alla

    2017-05-15

    Recent technological advances in mass spectrometry, development of richer mass spectral libraries and data processing tools have enabled large scale metabolic profiling. Biological interpretation of metabolomics studies heavily relies on knowledge-based tools that contain information about metabolic pathways. Incomplete coverage of different areas of metabolism and lack of information about non-canonical connections between metabolites limits the scope of applications of such tools. Furthermore, the presence of a large number of unknown features, which cannot be readily identified, but nonetheless can represent bona fide compounds, also considerably complicates biological interpretation of the data. Leveraging recent developments in the statistical analysis of high-dimensional data, we developed a new Debiased Sparse Partial Correlation algorithm (DSPC) for estimating partial correlation networks and implemented it as a Java-based CorrelationCalculator program. We also introduce a new version of our previously developed tool Metscape that enables building and visualization of correlation networks. We demonstrate the utility of these tools by constructing biologically relevant networks and in aiding identification of unknown compounds. http://metscape.med.umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
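    For readers unfamiliar with partial correlation networks, the sketch below estimates one with a generic graphical-lasso approach; it is not the DSPC algorithm or the CorrelationCalculator implementation, and the simulated features are placeholders for metabolite intensities.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Generic sparse partial-correlation sketch: estimate a sparse precision matrix
# with the graphical lasso and convert it to partial correlations; nonzero
# off-diagonal entries define network edges.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 10))            # 200 samples x 10 "metabolite" features
X[:, 1] += 0.8 * X[:, 0]                  # induce one direct association

model = GraphicalLassoCV().fit(X)
prec = model.precision_
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)     # rho_ij = -p_ij / sqrt(p_ii * p_jj)
np.fill_diagonal(partial_corr, 1.0)

edges = np.argwhere(np.triu(np.abs(partial_corr) > 0.05, k=1))
print("recovered edges:", edges.tolist())
```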

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Uterine Cancer Statistics

    MedlinePlus

    Uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations Tool makes …

  8. Automated finite element modeling of the lumbar spine: Using a statistical shape model to generate a virtual population of models.

    PubMed

    Campbell, J Q; Petrella, A J

    2016-09-06

    Population-based modeling of the lumbar spine has the potential to be a powerful clinical tool. However, developing a fully parameterized model of the lumbar spine with accurate geometry has remained a challenge. The current study used automated methods for landmark identification to create a statistical shape model of the lumbar spine. The shape model was evaluated using compactness, generalization ability, and specificity. The primary shape modes were analyzed visually, quantitatively, and biomechanically. The biomechanical analysis was performed by using the statistical shape model with an automated method for finite element model generation to create a fully parameterized finite element model of the lumbar spine. Functional finite element models of the mean shape and the extreme shapes (±3 standard deviations) of all 17 shape modes were created demonstrating the robust nature of the methods. This study represents an advancement in finite element modeling of the lumbar spine and will allow population-based modeling in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
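    A minimal statistical shape model can be built with an SVD/PCA over stacked landmark coordinates and then used to synthesize shapes at ±3 standard deviations along a mode, as sketched below. The landmark data are random placeholders, and the code is a generic illustration rather than the automated pipeline described in the paper.

```python
import numpy as np

# Statistical shape model sketch: PCA over flattened landmark coordinates,
# then synthesis of new shapes along the first shape mode.
rng = np.random.default_rng(4)
n_subjects, n_landmarks = 40, 100
shapes = rng.normal(size=(n_subjects, n_landmarks * 3))   # x, y, z per landmark

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
# SVD of the centered data gives the shape modes and their variances.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
std_per_mode = S / np.sqrt(n_subjects - 1)

mode = 0
plus3 = mean_shape + 3 * std_per_mode[mode] * Vt[mode]    # +3 SD instance
minus3 = mean_shape - 3 * std_per_mode[mode] * Vt[mode]   # -3 SD instance
print("mode 1 explains", round(100 * S[mode]**2 / np.sum(S**2), 1), "% of variance")
```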

  9. Emerging Concepts of Data Integration in Pathogen Phylodynamics.

    PubMed

    Baele, Guy; Suchard, Marc A; Rambaut, Andrew; Lemey, Philippe

    2017-01-01

    Phylodynamics has become an increasingly popular statistical framework to extract evolutionary and epidemiological information from pathogen genomes. By harnessing such information, epidemiologists aim to shed light on the spatio-temporal patterns of spread and to test hypotheses about the underlying interaction of evolutionary and ecological dynamics in pathogen populations. Although the field has witnessed a rich development of statistical inference tools with increasing levels of sophistication, these tools initially focused on sequences as their sole primary data source. Integrating various sources of information, however, promises to deliver more precise insights into infectious diseases and to increase opportunities for statistical hypothesis testing. Here, we review how the emerging concept of data integration is stimulating new advances in Bayesian evolutionary inference methodology which formalize a marriage of statistical thinking and evolutionary biology. These approaches include connecting sequence to trait evolution, such as for host, phenotypic and geographic sampling information, but also the incorporation of covariates of evolutionary and epidemic processes in the reconstruction procedures. We highlight how a full Bayesian approach to covariate modeling and testing can generate further insights into sequence evolution, trait evolution, and population dynamics in pathogen populations. Specific examples demonstrate how such approaches can be used to test the impact of host on rabies and HIV evolutionary rates, to identify the drivers of influenza dispersal as well as the determinants of rabies cross-species transmissions, and to quantify the evolutionary dynamics of influenza antigenicity. Finally, we briefly discuss how data integration is now also permeating through the inference of transmission dynamics, leading to novel insights into tree-generative processes and detailed reconstructions of transmission trees. [Bayesian inference; birth–death models; coalescent models; continuous trait evolution; covariates; data integration; discrete trait evolution; pathogen phylodynamics.]

  10. Emerging Concepts of Data Integration in Pathogen Phylodynamics

    PubMed Central

    Baele, Guy; Suchard, Marc A.; Rambaut, Andrew; Lemey, Philippe

    2017-01-01

    Phylodynamics has become an increasingly popular statistical framework to extract evolutionary and epidemiological information from pathogen genomes. By harnessing such information, epidemiologists aim to shed light on the spatio-temporal patterns of spread and to test hypotheses about the underlying interaction of evolutionary and ecological dynamics in pathogen populations. Although the field has witnessed a rich development of statistical inference tools with increasing levels of sophistication, these tools initially focused on sequences as their sole primary data source. Integrating various sources of information, however, promises to deliver more precise insights into infectious diseases and to increase opportunities for statistical hypothesis testing. Here, we review how the emerging concept of data integration is stimulating new advances in Bayesian evolutionary inference methodology which formalize a marriage of statistical thinking and evolutionary biology. These approaches include connecting sequence to trait evolution, such as for host, phenotypic and geographic sampling information, but also the incorporation of covariates of evolutionary and epidemic processes in the reconstruction procedures. We highlight how a full Bayesian approach to covariate modeling and testing can generate further insights into sequence evolution, trait evolution, and population dynamics in pathogen populations. Specific examples demonstrate how such approaches can be used to test the impact of host on rabies and HIV evolutionary rates, to identify the drivers of influenza dispersal as well as the determinants of rabies cross-species transmissions, and to quantify the evolutionary dynamics of influenza antigenicity. Finally, we briefly discuss how data integration is now also permeating through the inference of transmission dynamics, leading to novel insights into tree-generative processes and detailed reconstructions of transmission trees. [Bayesian inference; birth–death models; coalescent models; continuous trait evolution; covariates; data integration; discrete trait evolution; pathogen phylodynamics.] PMID:28173504

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open source and free platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various kinds of image modalities like CT, PET, MR, SPECT, US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image based features like 1st order statistics, gray-scale co-occurrence and zone-size matrix based texture features and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of code and features for a wide audience. Open-source software developed with other programming languages is integrated to enhance various components of this toolbox. For example: Java-based DCM4CHE for import of DICOM, R for statistical analysis. Results: The Radiomics toolbox will be distributed as an open source, GNU copyrighted software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC. The analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.

  12. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    PubMed

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  13. Integrating teaching and authentic research in the field and laboratory settings

    NASA Astrophysics Data System (ADS)

    Daryanto, S.; Wang, L.; Kaseke, K. F.; Ravi, S.

    2016-12-01

    Typically, authentic research activities are separated from rigorous classroom teaching. Here we assessed the potential of integrating teaching and research activities both in the field and in the laboratory. We worked with students from both the US and abroad without strong science backgrounds to utilize advanced environmental sensors and statistical tools to conduct innovative projects. The students included one from Namibia and two local high school students in Indianapolis (through Project SEED, Summer Experience for the Economically Disadvantaged). They conducted leaf potential measurements, isotope measurements and meta-analysis. The experience showed us the great potential of integrating teaching and research in both field and laboratory settings.

  14. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  15. Using NOAA/AVHRR based remote sensing data and PCR method for estimation of Aus rice yield in Bangladesh

    NASA Astrophysics Data System (ADS)

    Nizamuddin, Mohammad; Akhand, Kawsar; Roytman, Leonid; Kogan, Felix; Goldberg, Mitch

    2015-06-01

    Rice is a dominant food crop of Bangladesh, accounting for about 75 percent of agricultural land use, and Bangladesh is currently the world's fourth largest rice producing country. Rice provides about two-thirds of the total calorie supply, about one-half of the agricultural GDP and one-sixth of the national income in Bangladesh. Aus is one of the main rice varieties in Bangladesh. Crop production, especially of rice, the main food staple, is the most susceptible to climate change and variability. Any change in climate will thus increase uncertainty regarding rice production, as climate is a major cause of year-to-year variability in rice productivity. This paper shows the application of remote sensing data for estimating Aus rice yield in Bangladesh, combining official statistics of rice yield with real-time satellite data acquired from the Advanced Very High Resolution Radiometer (AVHRR) sensor; the Principal Component Regression (PCR) method was used to construct a model. The simulated result was compared with official agricultural statistics, showing that the error of estimation of Aus rice yield was less than 10%. Remote sensing, therefore, is a valuable tool for estimating crop yields well in advance of harvest, and at a low cost.

  16. Detecting measurement outliers: remeasure efficiently

    NASA Astrophysics Data System (ADS)

    Ullrich, Albrecht

    2010-09-01

    Shrinking structures, advanced optical proximity correction (OPC) and complex measurement strategies continually challenge critical dimension (CD) metrology tools and recipe creation processes. One important quality-ensuring task is the control of measurement outlier behavior. Outliers could trigger false positive alarms for specification violations, impacting cycle time or potentially yield. A constant high level of outliers not only deteriorates cycle time but also puts unnecessary stress on tool operators, eventually leading to human errors. At the tool level, the sources of outliers are natural variations (e.g. beam current etc.), drifts, contrast conditions, focus determination or pattern recognition issues, etc. Some of these can result from suboptimal or even wrong recipe settings, like focus position or measurement box size. Such outliers, created by an automatic recipe creation process faced with more complicated structures, would manifest themselves as systematic variation of measurements rather than the 'pure' tool variation. I analyzed several statistical methods to detect outliers. These range from classical outlier tests for extrema and robust metrics like the interquartile range (IQR) to methods evaluating the distribution of different populations of measurement sites, like the Cochran test. The latter especially suits the detection of systematic effects. The next level of outlier detection combines additional information about the mask and the manufacturing process with the measurement results. The methods were reviewed for measured variations assumed to be normally distributed with zero mean, but also for the presence of a statistically significant spatial process signature. I arrive at the conclusion that intelligent outlier detection can greatly influence the efficiency and cycle time of CD metrology. In combination with process information like target, typical platform variation and signature, one can tailor the detection to the needs of the photomask at hand. By monitoring the outlier behavior carefully, weaknesses of the automatic recipe creation process can be spotted.
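    As a concrete example of the robust metrics mentioned above, the short sketch below flags CD measurements that fall outside the Tukey fences built from the interquartile range; the 1.5x factor and the sample values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# IQR-based outlier flagging: values beyond the Tukey fences are marked.
def iqr_outliers(values, k=1.5):
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return (values < lo) | (values > hi)

# Hypothetical CD measurements (nm); one value is an obvious outlier.
cd_measurements = np.array([102.1, 101.8, 102.0, 101.9, 110.5, 102.2, 101.7])
print("outlier mask:", iqr_outliers(cd_measurements))
```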

  17. Advances in statistics

    Treesearch

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  18. The road map towards providing a robust Raman spectroscopy-based cancer diagnostic platform and integration into clinic

    NASA Astrophysics Data System (ADS)

    Lau, Katherine; Isabelle, Martin; Lloyd, Gavin R.; Old, Oliver; Shepherd, Neil; Bell, Ian M.; Dorney, Jennifer; Lewis, Aaran; Gaifulina, Riana; Rodriguez-Justo, Manuel; Kendall, Catherine; Stone, Nicolas; Thomas, Geraint; Reece, David

    2016-03-01

    Despite its demonstrated potential as an accurate cancer diagnostic tool, Raman spectroscopy (RS) is yet to be adopted by the clinic for histopathology reviews. The Stratified Medicine through Advanced Raman Technologies (SMART) consortium has begun to address some of the hurdles in its adoption for cancer diagnosis. These hurdles include awareness and acceptance of the technology, practicality of integration into the histopathology workflow, data reproducibility and availability of transferrable models. We have formed a consortium to develop, in a joint effort, optimised protocols for tissue sample preparation, data collection and analysis. These protocols will be supported by provision of suitable hardware and software tools to allow statistically sound classification models to be built and transferred for use on different systems. In addition, we are building a validated gastrointestinal (GI) cancers model, which can be trialled as part of the histopathology workflow at hospitals, and a classification tool. At the end of the project, we aim to deliver a robust Raman-based diagnostic platform to enable clinical researchers to stage cancer, define tumour margins, build cancer diagnostic models and discover novel disease biomarkers.

  19. CalFitter: a web server for analysis of protein thermal denaturation data.

    PubMed

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
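    As a simplified picture of what such fitting involves, the sketch below fits a generic two-state unfolding sigmoid to a synthetic melting curve with SciPy; it is not one of CalFitter's twelve pathway models, and all thermodynamic values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic two-state thermal unfolding model with flat native/unfolded baselines.
def two_state(T, Tm, dH, nf, nu):
    R = 8.314e-3                                 # gas constant, kJ/(mol*K)
    K = np.exp(dH / R * (1.0 / Tm - 1.0 / T))    # van't Hoff unfolding constant
    f_unfolded = K / (1.0 + K)
    return nf * (1 - f_unfolded) + nu * f_unfolded

# Synthetic melting curve: Tm = 330 K, dH = 400 kJ/mol, plus noise.
T = np.linspace(293, 363, 70)
rng = np.random.default_rng(6)
signal = two_state(T, 330.0, 400.0, 1.0, 0.2) + rng.normal(0, 0.01, T.size)

popt, pcov = curve_fit(two_state, T, signal, p0=[325.0, 300.0, 1.0, 0.2])
print("fitted Tm (K):", round(popt[0], 2), "+/-", round(np.sqrt(pcov[0, 0]), 2))
```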

  20. Geochemistry and the understanding of ground-water systems

    USGS Publications Warehouse

    Glynn, Pierre D.; Plummer, Niel

    2005-01-01

    Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.

  1. A generic flexible and robust approach for intelligent real-time video-surveillance systems

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Delaigle, Jean-Francois; Bastide, Arnaud; Macq, Benoit

    2004-05-01

    In this article we present a generic, flexible and robust approach for an intelligent real-time video-surveillance system. A previous version of the system was presented in [1]. The goal of these advanced tools is to provide help to operators by detecting events of interest in visual scenes, highlighting alarms and computing statistics. The proposed system is a multi-camera platform able to handle different standards of video inputs (composite, IP, IEEE1394) and which can basically compress (MPEG4), store and display them. This platform also integrates advanced video analysis tools, such as motion detection, segmentation, tracking and interpretation. The design of the architecture is optimised to playback, display, and process video flows in an efficient way for video-surveillance applications. The implementation is distributed on a scalable computer cluster based on Linux and IP network. It relies on POSIX threads for multitasking scheduling. Data flows are transmitted between the different modules using multicast technology and under control of a TCP-based command network (e.g. for bandwidth occupation control). We report here some results and we show the potential use of such a flexible system in third generation video surveillance systems. We illustrate the interest of the system in a real case study, which is indoor surveillance.

  2. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  3. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
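    As a plain-NumPy illustration of superposed epoch analysis, one of the statistical techniques named above, the sketch below stacks a time series around hypothetical event times and averages the segments; it does not call the SpacePy API itself.

```python
import numpy as np

# Superposed epoch analysis: average the response of a time series around a
# set of event times so that a weak, repeatable signal emerges from the noise.
rng = np.random.default_rng(7)
series = rng.normal(size=5000)
epochs = rng.integers(100, 4900, size=40)        # hypothetical event indices
for t in epochs:                                 # bury a weak decaying response
    series[t:t + 50] += np.linspace(1.0, 0.0, 50)

window = np.arange(-50, 100)                     # lags relative to each event
stack = np.array([series[t + window] for t in epochs])
mean_response = stack.mean(axis=0)
print("peak of superposed mean at lag", window[np.argmax(mean_response)])
```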

  4. Forbush Decrease Prediction Based on Remote Solar Observations

    NASA Astrophysics Data System (ADS)

    Dumbovic, Mateja; Vrsnak, Bojan; Calogovic, Jasa

    2016-04-01

    We study the relation between remote observations of coronal mass ejections (CMEs), their associated solar flares and short-term depressions in the galactic cosmic-ray flux (so-called Forbush decreases). Statistical relations between Forbush decrease magnitude and several CME/flare parameters are examined. In general we find that Forbush decrease magnitude is larger for faster CMEs with larger apparent width, which are associated with stronger flares that originate close to the center of the solar disk and are (possibly) involved in a CME-CME interaction. The statistical relations are quantified and employed to forecast the expected Forbush decrease magnitude range based on the selected remote solar observations of the CME and associated solar flare. Several verification measures are used to evaluate the forecast method. We find that the forecast is most reliable in predicting whether or not a CME will produce a Forbush decrease with a magnitude >3 %. The main advantage of the method is that it provides an early prediction, 1-4 days in advance. Based on the presented research, an online forecast tool was developed (Forbush Decrease Forecast Tool, FDFT), available at the Hvar Observatory web page: http://oh.geof.unizg.hr/FDFT/fdft.php. We acknowledge the support of the Croatian Science Foundation under the project 6212 "Solar and Stellar Variability" and of the European Social Fund under the project "PoKRet".

  5. GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.

    PubMed

    Zheng, Qi; Wang, Xiu-Jie

    2008-07-01

    Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software packages offer GO-related analysis functions, new tools are still needed to meet the requirements for data generated by newly developed technologies or for advanced analysis purposes. Here, we present a Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis for data from various sources (probe or probe set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
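    The statistical core of a GO overrepresentation test used by tools of this kind can be expressed as a hypergeometric tail probability; the sketch below is a generic version with made-up counts and is not claimed to reproduce GOEAST's exact procedure.

```python
from scipy.stats import hypergeom

# Probability of observing at least k genes annotated with a GO term in a
# study set, given the term's frequency in the reference set (numbers invented).
M = 20000   # genes in the reference set (e.g. whole array or genome)
n = 300     # genes annotated with the GO term
N = 500     # genes in the study set (e.g. differentially expressed)
k = 25      # study-set genes carrying the annotation

p_enrich = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
print(f"enrichment p-value: {p_enrich:.3e}")
```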

  6. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    PubMed

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
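    For intuition about the two perspectives contrasted above, the sketch below computes a c-statistic (ROC AUC) with scikit-learn and tabulates predicted versus observed events across risk deciles; the data are simulated, not NSQIP records, and the covariates are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Simulate a binary outcome driven by a few covariates, fit a risk model,
# then report discrimination (c-statistic) and a decile calibration table.
rng = np.random.default_rng(5)
X = rng.normal(size=(5000, 4))
true_logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] - 3.0
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

model = LogisticRegression().fit(X, y)
pred = model.predict_proba(X)[:, 1]
print("c-statistic:", round(roc_auc_score(y, pred), 3))

edges = np.quantile(pred, np.linspace(0, 1, 11))[1:-1]   # interior decile cut points
bins = np.digitize(pred, edges)
for b in range(10):
    mask = bins == b
    print(f"decile {b + 1}: predicted {pred[mask].sum():.1f}, observed {y[mask].sum()}")
```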

  7. Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

    2010-05-01

    A key element in emergency preparedness is to define, in advance, tools to assist decision makers and emergency management groups during crises. Such tools must be prepared in advance, accounting for all of the expertise and scientific knowledge accumulated through time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of the monitoring signals. This involves the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to the observed phenomena even when unexpected. Here we present a more than four-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, has been constituted and trained during periodic meetings on the statistical methods and on the BET_EF model (Marzocchi et al., 2008) that forms the statistical package used for ET definition. Model calibration has been carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric sets of data during ongoing emergencies, and provides a view of the observed variations that accounts for the averaged, weighted opinion of the scientific community. An additional positive outcome of the described ET calibration process is that it provides a picture of the expert community's degree of confidence in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful since it can be used to guide future improvements to the monitoring network, as well as research investments aimed at substantially improving the capability to forecast the short-term volcanic hazard.
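
    The calibrated BET_EF model itself is described in Marzocchi et al. (2008) and is not reproduced here; purely as a hedged sketch of the event-tree idea, the example below multiplies conditional probabilities at successive nodes (unrest, magmatic origin, eruption) and shows how a monitoring anomaly might shift a node probability through a simple weighted blend. All numbers and the blending rule are hypothetical, not the elicited Campi Flegrei values.

        # Minimal event-tree sketch (hypothetical numbers, not the calibrated BET_EF model).

        def blended_node_probability(p_background, p_anomaly, anomaly_weight):
            """Blend a long-term background probability with an anomaly-driven one.

            anomaly_weight in [0, 1] expresses how strongly the monitoring signals
            depart from their usual range (0 = no anomaly, 1 = full anomaly).
            """
            return (1.0 - anomaly_weight) * p_background + anomaly_weight * p_anomaly

        p_unrest = blended_node_probability(0.10, 0.90, anomaly_weight=0.4)    # node 1
        p_magmatic = blended_node_probability(0.30, 0.80, anomaly_weight=0.4)  # node 2, given unrest
        p_eruption_given_magmatic = 0.25                                       # node 3, given magmatic unrest

        # Short-term eruption probability = product of the conditional node probabilities
        p_eruption = p_unrest * p_magmatic * p_eruption_given_magmatic
        print(f"P(eruption in forecast window) ~ {p_eruption:.3f}")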

  8. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and In-Space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions. They support many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage, which will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool. Also, work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I decided to learn and evaluate Palisade's @Risk and Risk Optimizer software and utilize its capabilities for the Electric Power System (EPS) model. I also looked at similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers and evaluated them. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions. Also, we had to define the simulation space and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution for this framework.
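
    As a hedged sketch of the kind of Monte Carlo framework described here (not the actual GRC tool, and without the @Risk add-in), the example below samples hypothetical technology characteristics from probability distributions, combines them into an invented figure of merit for a reference power load, and reports a probabilistic assessment. All parameter names, distributions and constraints are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # Monte Carlo replications

        # Hypothetical technology characteristics sampled from distributions
        cell_efficiency   = rng.triangular(0.28, 0.32, 0.36, N)   # solar cell efficiency
        specific_mass     = rng.normal(6.0, 0.8, N)               # array kg per kW, uncertain
        battery_wh_per_kg = rng.uniform(150, 220, N)              # storage energy density

        # Invented figure of merit: delivered W/kg for a reference mission load
        load_kw, eclipse_fraction = 10.0, 0.35
        array_mass   = load_kw * specific_mass / (cell_efficiency / 0.30)
        battery_mass = load_kw * eclipse_fraction * 1000.0 / battery_wh_per_kg
        watts_per_kg = load_kw * 1000.0 / (array_mass + battery_mass)

        # Probabilistic assessment: summary statistics and a soft constraint
        print(f"median W/kg    = {np.median(watts_per_kg):.1f}")
        print(f"5th-95th pct   = {np.percentile(watts_per_kg, [5, 95])}")
        print(f"P(W/kg >= 120) = {np.mean(watts_per_kg >= 120):.2f}")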

  9. Apes produce tools for future use.

    PubMed

    Bräuer, Juliane; Call, Josep

    2015-03-01

    There is now growing evidence that some animal species are able to plan for the future. For example, great apes save and exchange tools for future use. Here we raise the question of whether chimpanzees, orangutans, and bonobos would produce tools for future use. Subjects only had access to a baited apparatus for a limited duration and therefore should use the time preceding this access to create the appropriate tools in order to get the rewards. The apes were tested in three conditions depending on the need for pre-prepared tools. Either eight tools, one tool or no tools were needed to retrieve the reward. The apes prepared tools in advance for future use, and they produced them mainly in the conditions in which they were really needed. The fact that apes were able to solve this new task indicates that their planning skills are flexible. However, in the condition in which eight tools were needed, apes produced fewer than two tools per trial in advance. Nevertheless, they used the opportunity to produce additional tools during the tool-use phase, thus often obtaining most of the reward from the apparatus. Increased pressure to prepare more tools in advance did not have an effect on their performance. © 2014 Wiley Periodicals, Inc.

  10. High satisfaction and low decisional conflict with advance care planning among chronically ill patients with advanced chronic obstructive pulmonary disease or heart failure using an online decision aid: A pilot study.

    PubMed

    Van Scoy, Lauren J; Green, Michael J; Dimmock, Anne Ef; Bascom, Rebecca; Boehmer, John P; Hensel, Jessica K; Hozella, Joshua B; Lehman, Erik B; Schubart, Jane R; Farace, Elana; Stewart, Renee R; Levi, Benjamin H

    2016-09-01

    Many patients with chronic illnesses report a desire for increased involvement in medical decision-making. This pilot study aimed to explore how patients with exacerbation-prone disease trajectories such as advanced heart failure or chronic obstructive pulmonary disease experience advance care planning using an online decision aid and to compare whether patients with different types of exacerbation-prone illnesses had varied experiences using the tool. Pre-intervention questionnaires measured advance care planning knowledge. Post-intervention questionnaires measured: (1) advance care planning knowledge; (2) satisfaction with tool; (3) decisional conflict; and (4) accuracy of the resultant advance directive. Comparisons were made between patients with heart failure and chronic obstructive pulmonary disease. Over 90% of the patients with heart failure (n = 24) or chronic obstructive pulmonary disease (n = 25) reported being "satisfied" or "highly satisfied" with the tool across all satisfaction domains; over 90% of participants rated the resultant advance directive as "very accurate." Participants reported low decisional conflict. Advance care planning knowledge scores rose by 18% (p < 0.001) post-intervention. There were no significant differences between participants with heart failure and chronic obstructive pulmonary disease. Patients with advanced heart failure and chronic obstructive pulmonary disease were highly satisfied after using an online advance care planning decision aid and had increased knowledge of advance care planning. This tool can be a useful resource for time-constrained clinicians whose patients wish to engage in advance care planning. © The Author(s) 2016.

  11. Recent advances in systems metabolic engineering tools and strategies.

    PubMed

    Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup

    2017-10-01

    Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials to achieve a sustainable chemical industry. Nowadays, many tools and strategies are available for performing systems metabolic engineering, which enables metabolic engineering at the systems level in more sophisticated and diverse ways by adopting the rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, the development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. BEAT: Bioinformatics Exon Array Tool to store, analyze and visualize Affymetrix GeneChip Human Exon Array data from disease experiments

    PubMed Central

    2012-01-01

    Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase chain reaction, microarray and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. The latest-generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile, providing information on AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows analyses to be performed at both the gene and exon levels. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon array datasets. It combines a data warehouse approach with rigorous statistical methods for assessing the AS of genes involved in diseases. Meta-statistics are proposed as a novel approach to exploring the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession IDs, Gene Ontology terms and biochemical pathway annotations are integrated with exon- and gene-level expression plots. The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968
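
    BEAT's exact statistical pipeline is not detailed in this record; as a hedged illustration of one standard exon-array statistic, the sketch below computes a splicing index (exon-level signal normalized by gene-level signal, on the log2 scale) and a t-test comparing it between two sample groups. The intensities are simulated.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Simulated log2 intensities for one exon and its parent gene, tumor vs normal
        exon_tumor,  gene_tumor  = rng.normal(8.0, 0.4, 30), rng.normal(10.0, 0.3, 30)
        exon_normal, gene_normal = rng.normal(9.0, 0.4, 30), rng.normal(10.0, 0.3, 30)

        # Splicing index: exon signal relative to overall gene expression
        si_tumor  = exon_tumor - gene_tumor
        si_normal = exon_normal - gene_normal

        # Test for differential exon inclusion between the two conditions
        t, p = stats.ttest_ind(si_tumor, si_normal)
        print(f"delta SI = {si_tumor.mean() - si_normal.mean():.2f}, t = {t:.2f}, p = {p:.2e}")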

  13. State of the art of sonic boom modeling

    NASA Astrophysics Data System (ADS)

    Plotkin, Kenneth J.

    2002-01-01

    Based on fundamental theory developed through the 1950s and 1960s, sonic boom modeling has evolved into practical tools. Over the past decade, there have been requirements for design tools for an advanced supersonic transport, and for tools for environmental assessment of various military and aerospace activities. This has resulted in a number of advances in the understanding of the physics of sonic booms, including shock wave rise times, propagation through turbulence, and blending sonic boom theory with modern computational fluid dynamics (CFD) aerodynamic design methods. This article reviews the early fundamental theory, recent advances in theory, and the application of these advances to practical models.

  14. State of the art of sonic boom modeling.

    PubMed

    Plotkin, Kenneth J

    2002-01-01

    Based on fundamental theory developed through the 1950s and 1960s, sonic boom modeling has evolved into practical tools. Over the past decade, there have been requirements for design tools for an advanced supersonic transport, and for tools for environmental assessment of various military and aerospace activities. This has resulted in a number of advances in the understanding of the physics of sonic booms, including shock wave rise times, propagation through turbulence, and blending sonic boom theory with modern computational fluid dynamics (CFD) aerodynamic design methods. This article reviews the early fundamental theory, recent advances in theory, and the application of these advances to practical models.

  15. A randomized, controlled trial of in situ pediatric advanced life support recertification ("pediatric advanced life support reconstructed") compared with standard pediatric advanced life support recertification for ICU frontline providers*.

    PubMed

    Kurosawa, Hiroshi; Ikeyama, Takanari; Achuff, Patricia; Perkel, Madeline; Watson, Christine; Monachino, Annemarie; Remy, Daphne; Deutsch, Ellen; Buchanan, Newton; Anderson, Jodee; Berg, Robert A; Nadkarni, Vinay M; Nishisaki, Akira

    2014-03-01

    Recent evidence shows poor retention of Pediatric Advanced Life Support provider skills. Frequent refresher training and in situ simulation are promising interventions. We developed a "Pediatric Advanced Life Support-reconstructed" recertification course by deconstructing the training into six 30-minute in situ simulation scenario sessions delivered over 6 months. We hypothesized that in situ Pediatric Advanced Life Support-reconstructed implementation is feasible and as effective as standard Pediatric Advanced Life Support recertification. A prospective randomized, single-blinded trial. Single-center, large, tertiary PICU in a university-affiliated children's hospital. Nurses and respiratory therapists in PICU. Simulation-based modular Pediatric Advanced Life Support recertification training. Simulation-based pre- and postassessment sessions were conducted to evaluate participants' performance. Video-recorded sessions were rated by trained raters blinded to allocation. The primary outcome was skill performance measured by a validated Clinical Performance Tool, and secondary outcome was behavioral performance measured by a Behavioral Assessment Tool. A mixed-effect model was used to account for baseline differences. Forty participants were prospectively randomized to Pediatric Advanced Life Support reconstructed versus standard Pediatric Advanced Life Support with no significant difference in demographics. Clinical Performance Tool score was similar at baseline in both groups and improved after Pediatric Advanced Life Support reconstructed (pre, 16.3 ± 4.1 vs post, 22.4 ± 3.9; p < 0.001), but not after standard Pediatric Advanced Life Support (pre, 14.3 ± 4.7 vs post, 14.9 ± 4.4; p =0.59). Improvement of Clinical Performance Tool was significantly higher in Pediatric Advanced Life Support reconstructed compared with standard Pediatric Advanced Life Support (p = 0.006). Behavioral Assessment Tool improved in both groups: Pediatric Advanced Life Support reconstructed (pre, 33.3 ± 4.5 vs post, 35.9 ± 5.0; p = 0.008) and standard Pediatric Advanced Life Support (pre, 30.5 ± 4.7 vs post, 33.6 ± 4.9; p = 0.02), with no significant difference of improvement between both groups (p = 0.49). For PICU-based nurses and respiratory therapists, simulation-based "Pediatric Advanced Life Support-reconstructed" in situ training is feasible and more effective than standard Pediatric Advanced Life Support recertification training for skill performance. Both Pediatric Advanced Life Support recertification training courses improved behavioral performance.

  16. Analysis instruments for the performance of Advanced Practice Nursing.

    PubMed

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

    Advanced Practice Nursing has been a reality in the international context for several decades and, recently, new nursing profiles that follow this model have been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  17. Attitudes toward Advanced and Multivariate Statistics When Using Computers.

    ERIC Educational Resources Information Center

    Kennedy, Robert L.; McCallister, Corliss Jean

    This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…

  18. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    ERIC Educational Resources Information Center

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  19. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis. These data include historical records, a highly correlated secondary contaminant, or expert judgment. In environmental remediation, geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus.
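
    As a hedged sketch of the "probability of exceeding a threshold" idea mentioned above (not the cited workflow or software), the example below performs ordinary kriging at one unsampled location with an assumed exponential semivariogram and converts the kriging estimate and variance into an exceedance probability under a Gaussian assumption. The sample locations, concentrations and variogram parameters are invented.

        import numpy as np
        from scipy.linalg import solve
        from scipy.stats import norm

        # Hypothetical contaminant samples: (x, y) coordinates and concentrations
        pts = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [80.0, 70.0]])
        conc = np.array([12.0, 45.0, 30.0, 8.0])
        target = np.array([40.0, 40.0])            # unsampled location to estimate
        sill, vrange, nugget = 300.0, 60.0, 10.0   # assumed exponential variogram

        def gamma(h):
            """Exponential semivariogram model (zero at lag zero)."""
            h = np.asarray(h, dtype=float)
            return np.where(h == 0.0, 0.0, nugget + sill * (1.0 - np.exp(-3.0 * h / vrange)))

        # Ordinary kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
        n = len(conc)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = gamma(d)
        A[:n, n] = A[n, :n] = 1.0
        b = np.append(gamma(np.linalg.norm(pts - target, axis=1)), 1.0)
        sol = solve(A, b)
        w, mu = sol[:n], sol[n]

        estimate = w @ conc
        krig_var = b[:n] @ w + mu                  # ordinary kriging variance
        threshold = 25.0
        p_exceed = 1.0 - norm.cdf(threshold, loc=estimate, scale=np.sqrt(max(krig_var, 1e-9)))
        print(f"estimate = {estimate:.1f}, kriging variance = {krig_var:.1f}, P(>{threshold}) = {p_exceed:.2f}")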

  20. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    Primary Topic: Track 5 – Experimentation and Analysis. Walter A. Powell [STUDENT] - GMU... Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet

  1. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. It recognizes the gap in spatially explicit accuracy-assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver Operating Characteristic curve, the impurity entropy and Gini functions, and Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. It emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with valuable directions for future research.
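
    As a hedged illustration of the accuracy-assessment ensemble named above (ROC analysis plus impurity measures), and not of the MABEL implementation itself, the sketch below computes Shannon entropy and Gini impurity for a hypothetical land-use class distribution and an ROC curve with its area for a simulated binary classifier.

        import numpy as np

        def entropy(p):
            """Shannon entropy (bits) of a discrete class distribution."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def gini(p):
            """Gini impurity of a discrete class distribution."""
            p = np.asarray(p, dtype=float)
            return 1.0 - np.sum(p ** 2)

        shares = [0.5, 0.3, 0.15, 0.05]            # hypothetical land-use class shares
        print(f"entropy = {entropy(shares):.3f} bits, Gini impurity = {gini(shares):.3f}")

        # ROC curve / AUC for a simulated classifier (e.g., parcel converts or not)
        rng = np.random.default_rng(3)
        y = rng.binomial(1, 0.3, 2000)
        score = np.where(y == 1, rng.normal(0.6, 0.2, 2000), rng.normal(0.4, 0.2, 2000))

        order = np.argsort(-score)                 # sweep the threshold from high to low
        tpr = np.cumsum(y[order]) / y.sum()
        fpr = np.cumsum(1 - y[order]) / (len(y) - y.sum())
        print(f"AUC = {np.trapz(tpr, fpr):.3f}")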

  2. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. One example is dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and, finally, communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing data and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  3. A quadratically regularized functional canonical correlation analysis for identifying the global structure of pleiotropy with NGS data

    PubMed Central

    Zhu, Yun; Fan, Ruzong; Xiong, Momiao

    2017-01-01

    Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information for achieving a deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple phenotype association analysis paradigm lacks breadth (number of phenotypes and genetic variants jointly analyzed at the same time) and depth (hierarchical structure of phenotypes and genotypes). A key issue for high-dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high-dimensional genotype and phenotype data. To explore correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers to advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis, which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has much higher power than the ten competing statistics while maintaining appropriate type 1 error rates. To further evaluate performance, QRFCCA and the ten other statistics are applied to the whole-genome sequencing dataset from the TwinsUK study. We identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits using QRFCCA. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274
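
    QRFCCA itself couples functional data analysis with a quadratically regularized matrix factorization; as a much simpler, hedged sketch of only the regularized-CCA ingredient, the example below solves a ridge-regularized CCA through a generalized eigenvalue problem on simulated genotype and phenotype matrices that share one latent factor.

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(7)
        n, p, q, lam = 200, 30, 10, 0.1          # samples, X dims, Y dims, ridge penalty

        # Simulated, centered data with one shared latent factor
        z = rng.normal(size=(n, 1))
        X = z @ rng.normal(size=(1, p)) + rng.normal(size=(n, p))
        Y = z @ rng.normal(size=(1, q)) + rng.normal(size=(n, q))
        X -= X.mean(axis=0)
        Y -= Y.mean(axis=0)

        Sxx = X.T @ X / n + lam * np.eye(p)      # ridge-regularized covariance blocks
        Syy = Y.T @ Y / n + lam * np.eye(q)
        Sxy = X.T @ Y / n

        # First canonical pair from  Sxy Syy^{-1} Syx a = rho^2 Sxx a
        M = Sxy @ np.linalg.solve(Syy, Sxy.T)
        vals, vecs = eigh(M, Sxx)
        rho2, a = vals[-1], vecs[:, -1]
        b = np.linalg.solve(Syy, Sxy.T @ a)

        rho = np.corrcoef(X @ a, Y @ b)[0, 1]
        print(f"first canonical correlation ~ {rho:.3f} (eigenvalue-based: {np.sqrt(rho2):.3f})")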

  4. Using the Nursing Culture Assessment Tool (NCAT) in Long-Term Care: An Update on Psychometrics and Scoring Standardization.

    PubMed

    Kennerly, Susan; Heggestad, Eric D; Myers, Haley; Yap, Tracey L

    2015-07-29

    An effective workforce performing within the context of a positive cultural environment is central to a healthcare organization's ability to achieve quality outcomes. The Nursing Culture Assessment Tool (NCAT) provides nurses with a valid and reliable tool that captures the general aspects of nursing culture. This study extends earlier work confirming the tool's construct validity and dimensionality by standardizing the scoring approach and establishing norm-referenced scoring. Scoring standardization provides a reliable point of comparison for NCAT users. NCAT assessments support nursing's ability to evaluate nursing culture, use results to shape the culture into one that supports change, and advance nursing's best practices and care outcomes. Registered nurses, licensed practical nurses, and certified nursing assistants from 54 long-term care facilities in Kentucky, Nevada, North Carolina, and Oregon were surveyed. Confirmatory factor analysis yielded six first-order factors forming the NCAT's subscales (Expectations, Behaviors, Teamwork, Communication, Satisfaction, Commitment) (Comparative Fit Index = 0.93), along with a second-order factor, the Total Culture Score. Aggregated facility-level comparisons of observed group variance with expected random variance using rwg(J) statistics are presented. Normative scores, cumulative rank percentages, and guidance on how the NCAT can be used in implementing planned change are provided.
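
    For readers unfamiliar with the rwg(J) agreement index used here, the sketch below computes the commonly cited James, Demaree and Wolf (1984) form for a J-item scale, comparing mean observed item variance within one facility against the variance expected under a uniform (random-response) null. It illustrates the statistic in general, not the exact scoring conventions adopted for the NCAT; the ratings are invented.

        import numpy as np

        def rwg_j(ratings, n_options):
            """rwg(J) within-group agreement for a J-item scale (James, Demaree & Wolf, 1984).

            ratings: (raters x items) array of Likert responses within one group/facility.
            n_options: number of response options A; uniform null variance = (A**2 - 1) / 12.
            """
            ratings = np.asarray(ratings, dtype=float)
            J = ratings.shape[1]
            mean_item_var = ratings.var(axis=0, ddof=1).mean()
            sigma_eu = (n_options ** 2 - 1) / 12.0
            ratio = mean_item_var / sigma_eu
            return (J * (1.0 - ratio)) / (J * (1.0 - ratio) + ratio)

        # Hypothetical facility: 8 nurses rating a 6-item subscale on a 4-point scale
        rng = np.random.default_rng(11)
        ratings = np.clip(np.round(rng.normal(3.2, 0.5, size=(8, 6))), 1, 4)
        print(f"rwg(J) = {rwg_j(ratings, n_options=4):.2f}")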

  5. Innovations for the future of pharmacovigilance.

    PubMed

    Almenoff, June S

    2007-01-01

    Post-marketing pharmacovigilance involves the review and management of safety information from many sources. Among these sources, spontaneous adverse event reporting systems are some of the most challenging and resource-intensive to manage. Traditionally, efforts to monitor spontaneous adverse event reporting systems have focused on review of individual case reports. The science of pharmacovigilance could be enhanced with the availability of systems-based tools that facilitate analysis of aggregate data for purposes of signal detection, signal evaluation and knowledge management. GlaxoSmithKline (GSK) recently implemented Online Signal Management (OSM) as a data-driven framework for managing the pharmacovigilance of marketed products. This pioneering work builds on GSK's strong history of innovation in this area. OSM is a software application co-developed by GSK and Lincoln Technologies that integrates traditional pharmacovigilance methods with modern quantitative statistical methods and data visualisation tools. OSM enables the rapid identification of trends from the individual adverse event reports received by GSK. OSM also provides knowledge-management tools to ensure the successful tracking of emerging safety issues. GSK has developed standard procedures and 'best practices' around the use of OSM to ensure the systematic evaluation of complex safety datasets. In summary, the implementation of OSM provides new tools and efficient processes to advance the science of pharmacovigilance.
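
    OSM's internal statistics are not described in this record; one widely used quantitative signal-detection measure for spontaneous report databases is the proportional reporting ratio (PRR), sketched below for a hypothetical 2x2 table of drug-event report counts (not GSK data).

        import math

        # Hypothetical spontaneous-report counts
        a = 40      # reports with the drug of interest AND the event of interest
        b = 1960    # reports with the drug, other events
        c = 200     # reports with other drugs AND the event
        d = 97800   # reports with other drugs, other events

        prr = (a / (a + b)) / (c / (c + d))

        # Approximate 95% CI on the log scale (standard textbook approximation)
        se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
        lo, hi = (prr * math.exp(k * 1.96 * se_log) for k in (-1, 1))
        print(f"PRR = {prr:.2f} (95% CI {lo:.2f}-{hi:.2f})")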

  6. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  7. From dinner table to digital tablet: technology's potential for reducing loneliness in older adults.

    PubMed

    McCausland, Lauren; Falk, Nancy L

    2012-05-01

    Statistics estimate that close to 35% of our nation's older individuals experience loneliness. Feelings of loneliness have been associated with physical and psychological illness in several research studies. As technology advances and connectivity through tablet devices becomes increasingly user friendly, the potential for tablets to reduce loneliness among older adults is substantial. This article discusses the issue of loneliness among older adults and suggests tablet technology as a tool to improve connectivity and reduce loneliness in the older adult population. As nurses, we have the opportunity to help enhance the quality of life for our clients. Tablet technology offers a new option that should be fully explored. Copyright 2012, SLACK Incorporated.

  8. Accelerating plant breeding.

    PubMed

    De La Fuente, Gerald N; Frei, Ursula K; Lübberstedt, Thomas

    2013-12-01

    The growing demand for food with limited arable land available necessitates that the yield of major food crops continues to increase over time. Advances in marker technology, predictive statistics, and breeding methodology have allowed for continued increases in crop performance through genetic improvement. However, one major bottleneck is the generation time of plants, which is biologically limited and has not been improved since the introduction of doubled haploid technology. In this opinion article, we propose to implement in vitro nurseries, which could substantially shorten generation time through rapid cycles of meiosis and mitosis. This could prove a useful tool for speeding up future breeding programs with the aim of sustainable food production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Emerging patterns of somatic mutations in cancer

    PubMed Central

    Watson, Ian R.; Takahashi, Koichi; Futreal, P. Andrew; Chin, Lynda

    2014-01-01

    Advances in technological tools for massively parallel, high-throughput sequencing of DNA have enabled the comprehensive characterization of somatic mutations in large numbers of tumor samples. Here, we review recent cancer genomic studies that have assembled emerging views of the landscapes of somatic mutations through deep sequencing analyses of the coding exomes and whole genomes in various cancer types. We discuss the comparative genomics of different cancers, including mutation rates, spectra, and the roles of environmental insults that influence these processes. We highlight the developing statistical approaches used to identify significantly mutated genes, and discuss the emerging biological and clinical insights from such analyses as well as the challenges ahead in translating these genomic data into clinical impact. PMID:24022702

  10. Enhancement of MS Signal Processing For Improved Cancer Biomarker Discovery

    NASA Astrophysics Data System (ADS)

    Si, Qian

    Technological advances in proteomics have shown great potential in detecting cancer at the earliest stages. One way is to use time-of-flight mass spectrometry to identify biomarkers, or early disease indicators related to the cancer. Pattern analysis of time-of-flight mass spectral data from blood and tissue samples gives great hope for the identification of potential biomarkers among the complex mixture of biological and chemical samples for early cancer detection. One of the key issues is the pre-processing of raw mass spectra. Many challenges need to be addressed: the unknown noise characteristics associated with the large volume of data, high variability in the mass spectrometry measurements, and a poorly understood signal background, among others. This dissertation focuses on developing statistical algorithms and creating data mining tools for computationally improved signal processing of mass spectrometry data. I introduce an advanced, accurate estimate of the noise model and a semi-supervised method of mass spectrum data processing that requires little knowledge about the data.

  11. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  13. A systematic review of orthopaedic manual therapy randomized clinical trials quality

    PubMed Central

    Riley, Sean P.; Swanson, Brian; Brismée, Jean-Michel; Sawyer, Steven F.

    2016-01-01

    Study Design: Systematic review and meta-analysis. Objectives: To conduct a systematic review and meta-analysis of randomized clinical trials (RCTs) in the orthopaedic manual therapy (OMT) literature from January 2010 to June 2014 in order to determine if the CONSORT checklist and Cochrane Risk of Bias (RoB) assessment tools: (1) are reliable; (2) have improved the reporting and decreased the risk of bias in RCTs in the OMT literature; (3) differ based on journal impact factor (JIF); and (4) scores are associated with each other. Background: The CONSORT statement is used to improve the accuracy of reporting within RCTs. The Cochrane RoB tool was designed to assess the risk of bias within RCTs. To date, no evaluation of the quality of reporting and risk of bias in OMT RCTs has been published. Methods: Relevant RCTs were identified by a literature review from January 2010 to June 2014. The identified RCTs were assessed by two individual reviewers utilizing the 2010 CONSORT checklist and the RoB tool. Agreement and a mean composite total score for each tool were attained in order to determine if the CONSORT and RoB tools were reliable and varied by year and impact factor. Results: A total of 72 RCTs in the OMT literature were identified. A number of categories within the CONSORT and RoB tools demonstrated prevalence-adjusted bias-adjusted kappa (PABAK) scores of less than 0.20 and from 0.20 to 0.40. The total CONSORT and RoB scores were correlated to each other (r = 0.73; 95% CI 0.60 to 0.82; p < 0.0001). There were no statistically significant differences in CONSORT or RoB scores by year. There was a statistically significant correlation between both CONSORT scores and JIF (r = 0.64, 95% CI 0.47 to 0.76; p < 0.0001), and between RoB scores and JIF (r = 0.42, 95% confidence interval 0.21–0.60; p < 0.001). There was not a statistically significant correlation between JIF and year of publication. Conclusion: Our findings suggest that the CONSORT and RoB have a number of items that are unclear and unreliable, and that the quality of reporting in OMT trials has not improved in recent years. Improvements in reporting are necessary to allow advances in OMT practice. Level of Evidence: 1A PMID:27956817
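
    For readers unfamiliar with PABAK, for a two-rater, two-category item it is simply a rescaling of the observed proportion of agreement: PABAK = 2*Po - 1. A minimal sketch with invented ratings follows.

        # PABAK for a binary checklist item rated by two reviewers (invented ratings).
        rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
        rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

        po = sum(r1 == r2 for r1, r2 in zip(rater1, rater2)) / len(rater1)
        pabak = 2 * po - 1
        print(f"observed agreement = {po:.2f}, PABAK = {pabak:.2f}")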

  14. Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.

    PubMed

    Rubert, Josep; Zachariasova, Milena; Hajslova, Jana

    2015-01-01

    Food authenticity has become a necessity for global food policies, since food placed on the market must, without fail, be authentic. It has always been a challenge, since in the past minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate the food. Nevertheless, nowadays, advanced analytical methods have allowed food fingerprints to be obtained. At the same time, they have also been combined with chemometrics, which uses statistical methods in order to verify food and to provide maximum information by analysing chemical data. These sophisticated methods, whether based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has experienced significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of HRMS analytical platforms combined with chemometrics.

  15. Brief cognitive assessment of Alzheimer's disease in advanced stages: Proposal for a Brazilian version of the Short Battery for Severe Impairment (SIB-8).

    PubMed

    Wajman, José Roberto; Bertolucci, Paulo Henrique Ferreira

    2013-01-01

    The measurement of cognitive abilities of patients with severe dementia can serve a wide range of methodological and clinical needs. The objective was to validate a proposed severe impairment battery (SIB-8) for a Brazilian population sample as part of the neuropsychological assessment of patients with Alzheimer's disease (AD) in advanced stages. After a systematic process of translation and back-translation, the SIB-8 was applied to 95 patients with AD at different stages: moderate, moderately severe and severe according to FAST subdivisions (5, 6 and 7), with scores on the Mini-Mental State Examination (MMSE) of between 5 and 15, who were followed by the Division of Behavioral Neurology and the Center for Aging Brain of the Federal University of São Paulo - UNIFESP. Inferential data revealed that the SIB-8 instrument behaved differently at each stage of the disease (sensitivity to stage, p<0.001), gradually reflecting the expected course of the dementia and the inherent decline of cognitive functions. Findings indicated that the SIB-8 is a useful tool for the evaluation and prospective comparison of AD patients in advanced stages, retaining its original characteristics in our population.

  16. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable or not. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
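
    The project's actual estimators and software are not reproduced here; as a hedged sketch of the kind of release-recapture survival estimation involved, the example below fits a minimal Cormack-Jolly-Seber-style model (one reach-survival probability, one detection probability, and a confounded final survival-times-detection term) to simulated PIT-tag detection histories by maximum likelihood. The counts are invented.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        # Simulated capture-history counts for one release of tagged smolts with two
        # downstream detection sites (histories: 11, 10, 01, 00).
        counts = {"11": 2100, "10": 900, "01": 700, "00": 6300}

        def neg_log_lik(theta):
            s1, p1, lam = expit(theta)        # survival to site 1, detection at site 1,
                                              # lam = S2 * p2 (confounded final product)
            pr = {
                "11": s1 * p1 * lam,
                "10": s1 * p1 * (1 - lam),
                "01": s1 * (1 - p1) * lam,
            }
            pr["00"] = 1.0 - sum(pr.values())
            return -sum(counts[h] * np.log(pr[h]) for h in counts)

        fit = minimize(neg_log_lik, x0=np.zeros(3), method="Nelder-Mead")
        s1, p1, lam = expit(fit.x)
        print(f"S1 (reach survival) = {s1:.3f}, p1 (detection) = {p1:.3f}, S2*p2 = {lam:.3f}")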

  17. Regional Sediment Management (RSM) Modeling Tools: Integration of Advanced Sediment Transport Tools into HEC-RAS

    DTIC Science & Technology

    2014-06-01

    Integration of Advanced Sediment Transport Tools into HEC-RAS by Paul M. Boyd and Stanford A. Gibson PURPOSE: This Coastal and Hydraulics Engineering...Technical Note (CHETN) summarizes the development and initial testing of new sediment transport and modeling tools developed by the U.S. Army Corps...sediment transport within the USACE HEC River Analysis System (HEC-RAS) software package and to determine its applicability to Regional Sediment

  18. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  19. Estimating population diversity with CatchAll

    PubMed Central

    Bunge, John; Woodard, Linda; Böhning, Dankmar; Foster, James A.; Connolly, Sean; Allen, Heather K.

    2012-01-01

    Motivation: The massive data produced by next-generation sequencing require advanced statistical tools. We address estimating the total diversity or species richness in a population. To date, only relatively simple methods have been implemented in available software. There is a need for software employing modern, computationally intensive statistical analyses including error, goodness-of-fit and robustness assessments. Results: We present CatchAll, a fast, easy-to-use, platform-independent program that computes maximum likelihood estimates for finite-mixture models, weighted linear regression-based analyses and coverage-based non-parametric methods, along with outlier diagnostics. Given sample ‘frequency count’ data, CatchAll computes 12 different diversity estimates and applies a model-selection algorithm. CatchAll also derives discounted diversity estimates to adjust for possibly uncertain low-frequency counts. It is accompanied by an Excel-based graphics program. Availability: Free executable downloads for Linux, Windows and Mac OS, with manual and source code, at www.northeastern.edu/catchall. Contact: jab18@cornell.edu PMID:22333246
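
    CatchAll's full suite (finite-mixture maximum likelihood, weighted regression and model selection) is beyond a short example; as a hedged sketch of one simple coverage-related, non-parametric ingredient of such analyses, the code below computes the classical Chao1 lower-bound richness estimate from frequency-count data. The observations are invented.

        from collections import Counter

        # Hypothetical sample: each letter is a species (OTU) observation
        observations = ["a", "a", "b", "c", "c", "c", "d", "e", "e", "f", "g", "g", "h"]
        counts = Counter(observations)

        s_obs = len(counts)                              # observed species richness
        f1 = sum(1 for c in counts.values() if c == 1)   # singletons
        f2 = sum(1 for c in counts.values() if c == 2)   # doubletons

        # Chao1 estimator, with the bias-corrected form when no doubletons are seen
        chao1 = s_obs + (f1 ** 2 / (2 * f2) if f2 > 0 else f1 * (f1 - 1) / 2)
        print(f"observed = {s_obs}, singletons = {f1}, doubletons = {f2}, Chao1 = {chao1:.1f}")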

  20. A unified approach to validation, reliability, and education study design for surgical technical skills training.

    PubMed

    Sweet, Robert M; Hananel, David; Lawrenz, Frances

    2010-02-01

    To present modern educational psychology theory and apply these concepts to the validity and reliability of surgical skills training and assessment. In a series of cross-disciplinary meetings, we applied a unified approach of behavioral science principles and theory to medical technical skills education, given recent advances in theory in the fields of behavioral psychology and statistics. While validation of the individual simulation tools is important, it is only one piece of a multimodal curriculum that in and of itself deserves examination and study. We propose concurrent validation throughout the design of a simulation-based curriculum rather than once it is complete. We embrace the concept that validity and curriculum development are interdependent, ongoing processes that are never truly complete. Individual predictive, construct, content, and face validity aspects should not be considered separately but as interdependent and complementary toward an end application. Such an approach could help guide our acceptance and appropriate application of these exciting new training and assessment tools for technical skills training in medicine.

  1. Protein Identification Using Top-Down Spectra*

    PubMed Central

    Liu, Xiaowen; Sirotkin, Yakov; Shen, Yufeng; Anderson, Gordon; Tsai, Yihsuan S.; Ting, Ying S.; Goodlett, David R.; Smith, Richard D.; Bafna, Vineet; Pevzner, Pavel A.

    2012-01-01

    In the last two years, because of advances in protein separation and mass spectrometry, top-down mass spectrometry moved from analyzing single proteins to analyzing complex samples and identifying hundreds and even thousands of proteins. However, computational tools for database search of top-down spectra against protein databases are still in their infancy. We describe MS-Align+, a fast algorithm for top-down protein identification based on spectral alignment that enables searches for unexpected post-translational modifications. We also propose a method for evaluating statistical significance of top-down protein identifications and further benchmark various software tools on two top-down data sets from Saccharomyces cerevisiae and Salmonella typhimurium. We demonstrate that MS-Align+ significantly increases the number of identified spectra as compared with MASCOT and OMSSA on both data sets. Although MS-Align+ and ProSightPC have similar performance on the Salmonella typhimurium data set, MS-Align+ outperforms ProSightPC on the (more complex) Saccharomyces cerevisiae data set. PMID:22027200

  2. A computational study on outliers in world music.

    PubMed

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.

  3. A computational study on outliers in world music

    PubMed Central

    Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as ‘outliers’. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the ‘uniqueness’ of the music of each country. PMID:29253027

  4. Cross-cultural examination of measurement invariance of the Beck Depression Inventory-II.

    PubMed

    Dere, Jessica; Watters, Carolyn A; Yu, Stephanie Chee-Min; Bagby, R Michael; Ryder, Andrew G; Harkness, Kate L

    2015-03-01

    Given substantial rates of major depressive disorder among college and university students, as well as the growing cultural diversity on many campuses, establishing the cross-cultural validity of relevant assessment tools is important. In the current investigation, we examined the Beck Depression Inventory-Second Edition (BDI-II; Beck, Steer, & Brown, 1996) among Chinese-heritage (n = 933) and European-heritage (n = 933) undergraduates in North America. The investigation integrated 3 distinct lines of inquiry: (a) the literature on cultural variation in depressive symptom reporting between people of Chinese and Western heritage; (b) recent developments regarding the factor structure of the BDI-II; and (c) the application of advanced statistical techniques to the issue of cross-cultural measurement invariance. A bifactor model was found to represent the optimal factor structure of the BDI-II. Multigroup confirmatory factor analysis showed that the BDI-II had strong measurement invariance across both culture and gender. In group comparisons with latent and observed variables, Chinese-heritage students scored higher than European-heritage students on cognitive symptoms of depression. This finding deviates from the commonly held view that those of Chinese heritage somatize depression. These findings hold implications for the study and use of the BDI-II, highlight the value of advanced statistical techniques such as multigroup confirmatory factor analysis, and offer methodological lessons for cross-cultural psychopathology research more broadly. 2015 APA, all rights reserved

  5. Basics of Bayesian methods.

    PubMed

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
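
    The prior-to-posterior logic described above can be illustrated with a conjugate Beta-Binomial model; the sketch below uses invented prior pseudo-counts and data, and Monte Carlo draws stand in for the more general numerical integration methods mentioned in the abstract.

        import numpy as np
        from scipy import stats

        a_prior, b_prior = 2.0, 2.0        # prior pseudo-counts (assumed for illustration)
        successes, trials = 7, 20          # invented data

        # conjugate update: Beta prior + Binomial likelihood -> Beta posterior
        posterior = stats.beta(a_prior + successes, b_prior + trials - successes)

        draws = posterior.rvs(size=100_000, random_state=1)
        print(f"posterior mean {draws.mean():.3f}, "
              f"95% credible interval [{np.percentile(draws, 2.5):.3f}, {np.percentile(draws, 97.5):.3f}]")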

  6. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major data sources such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploration and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  7. NASA/CARES dual-use ceramic technology spinoff applications

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1994-01-01

    NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics depends on proper characterization of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/LIFE, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.

  8. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.

  9. Methodological considerations, such as directed acyclic graphs, for studying "acute on chronic" disease epidemiology: chronic obstructive pulmonary disease example.

    PubMed

    Tsai, Chu-Lin; Camargo, Carlos A

    2009-09-01

    Acute exacerbations of chronic disease are ubiquitous in clinical medicine, and thus far, there has been a paucity of integrated methodological discussion on this phenomenon. We use acute exacerbations of chronic obstructive pulmonary disease as an example to emphasize key epidemiological and statistical issues for this understudied field in clinical epidemiology. Directed acyclic graphs are a useful epidemiological tool to explain the differential effects of risk factor on health outcomes in studies of acute and chronic phases of disease. To study the pathogenesis of acute exacerbations of chronic disease, case-crossover design and time-series analysis are well-suited study designs to differentiate acute and chronic effect. Modeling changes over time and setting appropriate thresholds are important steps to separate acute from chronic phases of disease in serial measurements. In statistical analysis, acute exacerbations are recurrent events, and some individuals are more prone to recurrences than others. Therefore, appropriate statistical modeling should take into account intraindividual dependence. Finally, we recommend the use of "event-based" number needed to treat (NNT) to prevent a single exacerbation instead of traditional patient-based NNT. Addressing these methodological challenges will advance research quality in acute on chronic disease epidemiology.
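
    The recommended 'event-based' NNT can be illustrated with simple arithmetic; the exacerbation rates below are invented for the example and are not taken from the article.

        # rates are invented for the example (exacerbations per patient-year)
        control_rate = 1.4
        treated_rate = 1.1

        rate_difference = control_rate - treated_rate
        nnt_event_based = 1.0 / rate_difference
        print(f"~{nnt_event_based:.1f} patient-years of treatment to prevent one exacerbation")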

  10. Tooth-size discrepancy: A comparison between manual and digital methods

    PubMed Central

    Correia, Gabriele Dória Cabral; Habib, Fernando Antonio Lima; Vogel, Carlos Jorge

    2014-01-01

    Introduction: Technological advances in Dentistry have emerged primarily in the area of diagnostic tools. One example is the 3D scanner, which can transform plaster models into three-dimensional digital models. Objective: This study aimed to assess the reliability of tooth size-arch length discrepancy analysis measurements performed on three-dimensional digital models, and compare these measurements with those obtained from plaster models. Material and Methods: To this end, plaster models of lower dental arches and their corresponding three-dimensional digital models acquired with a 3Shape R700T scanner were used. All of them had lower permanent dentition. Four different tooth size-arch length discrepancy calculations were performed on each model, two of which by manual methods using calipers and brass wire, and two by digital methods using linear measurements and parabolas. Results: Data were statistically assessed using the Friedman test and no statistically significant differences were found between the two methods (P > 0.05), except for values found by the linear digital method, which revealed a slight, non-significant statistical difference. Conclusions: Based on the results, it is reasonable to assert that any of these resources used by orthodontists to clinically assess tooth size-arch length discrepancy can be considered reliable. PMID:25279529
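
    A comparison of repeated measurements across several methods, as described above, can be run with the Friedman test in a few lines; the sketch below uses simulated discrepancy values rather than the study's data.

        import numpy as np
        from scipy.stats import friedmanchisquare

        rng = np.random.default_rng(3)
        true_discrepancy = rng.normal(0.0, 2.0, size=30)            # 30 dental arches (mm), simulated
        caliper     = true_discrepancy + rng.normal(0, 0.3, 30)     # manual method 1
        brass_wire  = true_discrepancy + rng.normal(0, 0.3, 30)     # manual method 2
        linear_3d   = true_discrepancy + rng.normal(0, 0.3, 30)     # digital method 1
        parabola_3d = true_discrepancy + rng.normal(0, 0.3, 30)     # digital method 2

        stat, p = friedmanchisquare(caliper, brass_wire, linear_3d, parabola_3d)
        print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")     # p > 0.05 suggests no method effect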

  11. ArrayVigil: a methodology for statistical comparison of gene signatures using segregated-one-tailed (SOT) Wilcoxon's signed-rank test.

    PubMed

    Khan, Haseeb Ahmad

    2005-01-28

    Due to their versatile diagnostic and prognostic fidelity, molecular signatures or fingerprints are anticipated to be the most powerful tools for cancer management in the near future. Notwithstanding the experimental advancements in microarray technology, methods for analyzing either whole arrays or gene signatures have not been firmly established. Recently, an algorithm, ArraySolver, was reported by Khan for two-group comparison of microarray gene expression data using the two-tailed Wilcoxon signed-rank test. Most molecular signatures are composed of two sets of genes (hybrid signatures) wherein up-regulation of one set and down-regulation of the other set collectively define the purpose of a gene signature. Since the direction of a selected gene's expression (positive or negative) with respect to a particular disease condition is known, application of one-tailed statistics could be a more relevant choice. A novel method, ArrayVigil, is described for comparing hybrid signatures using a segregated-one-tailed (SOT) Wilcoxon signed-rank test, and the results are compared with integrated-two-tailed (ITT) procedures (SPSS and ArraySolver). ArrayVigil resulted in lower P values than those obtained from ITT statistics while comparing real data from four signatures.
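
    The segregated-one-tailed idea can be sketched with standard library calls: run a one-tailed Wilcoxon signed-rank test separately on the up-regulated and down-regulated halves of a hybrid signature. The code below is not the published ArrayVigil implementation, and the paired expression values are synthetic.

        import numpy as np
        from scipy.stats import wilcoxon

        rng = np.random.default_rng(7)
        n_up, n_down = 15, 12                                   # sizes of the two signature halves (made up)
        up_case,   up_ref   = rng.normal(1.5, 1, n_up),    rng.normal(0, 1, n_up)
        down_case, down_ref = rng.normal(-1.2, 1, n_down), rng.normal(0, 1, n_down)

        _, p_up   = wilcoxon(up_case,   up_ref,   alternative="greater")   # expect up-regulation
        _, p_down = wilcoxon(down_case, down_ref, alternative="less")      # expect down-regulation
        print(f"up-set one-tailed p = {p_up:.4f}, down-set one-tailed p = {p_down:.4f}")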

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Use of advanced analysis tools to support freeway corridor freight planning.

    DOT National Transportation Integrated Search

    2010-07-22

    Advanced corridor freight management and pricing strategies are increasingly being chosen to address freight mobility challenges. As a result, evaluation tools are needed to assess the benefits of these strategies as compared to other alternative...

  14. SmartWay Truck Tool-Advanced Class: Getting the Most out of Your SmartWay Participation

    EPA Pesticide Factsheets

    This EPA presentation provides information on the Advanced SmartWay Truck Tool: its background, development, participation, data collection, usage, fleet categories, emission metrics, ranking system, performance data, reports, and schedule for 2017.

  15. "Dear Fresher …"--How Online Questionnaires Can Improve Learning and Teaching Statistics

    ERIC Educational Resources Information Center

    Bebermeier, Sarah; Nussbeck, Fridtjof W.; Ontrup, Greta

    2015-01-01

    Lecturers teaching statistics are faced with several challenges supporting students' learning in appropriate ways. A variety of methods and tools exist to facilitate students' learning on statistics courses. The online questionnaires presented in this report are a new, slightly different computer-based tool: the central aim was to support students…

  16. Simple prognostic model for patients with advanced cancer based on performance status.

    PubMed

    Jang, Raymond W; Caraiscos, Valerie B; Swami, Nadia; Banerjee, Subrata; Mak, Ernie; Kaya, Ebru; Rodin, Gary; Bryson, John; Ridley, Julia Z; Le, Lisa W; Zimmermann, Camilla

    2014-09-01

    Providing survival estimates is important for decision making in oncology care. The purpose of this study was to provide survival estimates for outpatients with advanced cancer, using the Eastern Cooperative Oncology Group (ECOG), Palliative Performance Scale (PPS), and Karnofsky Performance Status (KPS) scales, and to compare their ability to predict survival. ECOG, PPS, and KPS were completed by physicians for each new patient attending the Princess Margaret Cancer Centre outpatient Oncology Palliative Care Clinic (OPCC) from April 2007 to February 2010. Survival analysis was performed using the Kaplan-Meier method. The log-rank test for trend was employed to test for differences in survival curves for each level of performance status (PS), and the concordance index (C-statistic) was used to test the predictive discriminatory ability of each PS measure. Measures were completed for 1,655 patients. PS delineated survival well for all three scales according to the log-rank test for trend (P < .001). Survival was approximately halved for each worsening performance level. Median survival times, in days, for each ECOG level were: ECOG 0, 293; ECOG 1, 197; ECOG 2, 104; ECOG 3, 55; and ECOG 4, 25.5. Median survival times, in days, for PPS (and KPS) were: PPS/KPS 80 to 100, 221 (215); PPS/KPS 60 to 70, 115 (119); PPS/KPS 40 to 50, 51 (49); PPS/KPS 10 to 30, 22 (29). The C-statistic was similar for all three scales and ranged from 0.63 to 0.64. We present a simple tool that uses PS alone to prognosticate in advanced cancer, and has similar discriminatory ability to more complex models. Copyright © 2014 by American Society of Clinical Oncology.
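
    The survival workflow described above (Kaplan-Meier curves per performance-status level, a log-rank test across levels, and a concordance index) can be sketched with the lifelines package; the cohort below is simulated, not the clinic's data.

        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import multivariate_logrank_test
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(11)
        ecog = rng.integers(0, 4, size=500)                 # performance status 0-3 (simulated)
        time = rng.exponential(scale=200.0 / (ecog + 1))    # worse status -> shorter survival
        event = rng.random(500) < 0.8                       # ~80% of deaths observed

        kmf = KaplanMeierFitter()
        for level in range(4):
            mask = ecog == level
            kmf.fit(time[mask], event_observed=event[mask], label=f"ECOG {level}")
            print(f"ECOG {level}: median survival {kmf.median_survival_time_:.0f} days")

        # log-rank test across the four levels and a C-statistic for the score
        print("log-rank p =", multivariate_logrank_test(time, ecog, event).p_value)
        print("C-statistic =", concordance_index(time, -ecog, event))  # higher score = worse prognosis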

  17. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using easy custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64bit systems, and a Graphic User Interface (GUI) has been developed to manage data processing, provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and export to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
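
    The general normals-then-clustering idea can be sketched in a few lines; the published tool is a Matlab implementation, so the Python code below is only an analogue, with a synthetic two-set 'outcrop' and arbitrary neighbourhood and clustering parameters.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(2)
        # synthetic "outcrop": a sub-horizontal and a sub-vertical joint set
        set_a = np.c_[rng.uniform(0, 10, 1000), rng.uniform(0, 10, 1000), rng.normal(0, 0.05, 1000)]
        set_b = np.c_[rng.normal(5, 0.05, 1000), rng.uniform(0, 10, 1000), rng.uniform(0, 10, 1000)]
        points = np.vstack([set_a, set_b])

        # local facet normals from PCA (smallest singular vector) over k nearest neighbours
        k = 20
        _, idx = NearestNeighbors(n_neighbors=k).fit(points).kneighbors(points)
        normals = np.empty_like(points)
        for i, nb in enumerate(idx):
            patch = points[nb] - points[nb].mean(axis=0)
            n = np.linalg.svd(patch, full_matrices=False)[2][-1]
            normals[i] = n if n[np.argmax(np.abs(n))] > 0 else -n   # resolve sign ambiguity

        # cluster pole orientations to separate discontinuity sets (-1 = unassigned)
        labels = DBSCAN(eps=0.2, min_samples=50).fit_predict(normals)
        for lab in sorted(set(labels) - {-1}):
            pole = normals[labels == lab].mean(axis=0)
            print(f"set {lab}: mean pole {np.round(pole / np.linalg.norm(pole), 2)}")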

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Investigation of advanced UQ for CRUD prediction with VIPRE.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eldred, Michael Scott

    2011-09-01

    This document summarizes the results from a level 3 milestone study within the CASL VUQ effort. It demonstrates the application of 'advanced UQ,' in particular dimension-adaptive p-refinement for polynomial chaos and stochastic collocation. The study calculates statistics for several quantities of interest that are indicators for the formation of CRUD (Chalk River unidentified deposit), which can lead to CIPS (CRUD induced power shift). Stochastic expansion methods are attractive methods for uncertainty quantification due to their fast convergence properties. For smooth functions (i.e., analytic, infinitely-differentiable) in L^2 (i.e., possessing finite variance), exponential convergence rates can be obtained under order refinement for integrated statistical quantities of interest such as mean, variance, and probability. Two stochastic expansion methods are of interest: nonintrusive polynomial chaos expansion (PCE), which computes coefficients for a known basis of multivariate orthogonal polynomials, and stochastic collocation (SC), which forms multivariate interpolation polynomials for known coefficients. Within the DAKOTA project, recent research in stochastic expansion methods has focused on automated polynomial order refinement ('p-refinement') of expansions to support scalability to higher dimensional random input spaces [4, 3]. By preferentially refining only in the most important dimensions of the input space, the applicability of these methods can be extended from O(10^0)-O(10^1) random variables to O(10^2) and beyond, depending on the degree of anisotropy (i.e., the extent to which random input variables have differing degrees of influence on the statistical quantities of interest (QOIs)). Thus, the purpose of this study is to investigate the application of these adaptive stochastic expansion methods to the analysis of CRUD using the VIPRE simulation tools for two different plant models of differing random dimension, anisotropy, and smoothness.
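
    A minimal non-intrusive example of the kind of statistics these expansions target: Gauss-Hermite quadrature propagating a single standard-normal input through a smooth placeholder model. This is only an illustration of the underlying idea, not the CASL/VIPRE or DAKOTA workflow.

        import numpy as np

        def model(x):
            # placeholder "simulation" response; smooth in the random input
            return np.exp(0.3 * x) + 0.1 * x ** 2

        # probabilists' Gauss-Hermite rule: integrates against exp(-x^2/2)
        nodes, weights = np.polynomial.hermite_e.hermegauss(12)
        weights = weights / np.sqrt(2.0 * np.pi)       # normalize to the N(0, 1) density

        values = model(nodes)
        mean = np.sum(weights * values)
        variance = np.sum(weights * (values - mean) ** 2)
        print(f"QoI mean ~ {mean:.4f}, variance ~ {variance:.4f}")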

  5. Predicting short-term mortality in advanced decompensated heart failure - role of the updated acute decompensated heart failure/N-terminal pro-B-type natriuretic Peptide risk score.

    PubMed

    Scrutinio, Domenico; Ammirati, Enrico; Passantino, Andrea; Guida, Pietro; D'Angelo, Luciana; Oliva, Fabrizio; Ciccone, Marco Matteo; Iacoviello, Massimo; Dentamaro, Ilaria; Santoro, Daniela; Lagioia, Rocco; Sarzi Braga, Simona; Guzzetti, Daniela; Frigerio, Maria

    2015-01-01

    The first few months after admission are the most vulnerable period in patients with acute decompensated heart failure (ADHF). We assessed the association of the updated ADHF/N-terminal pro-B-type natriuretic peptide (NT-proBNP) risk score with 90-day and in-hospital mortality in 701 patients admitted with advanced ADHF, defined as severe symptoms of worsening HF, severely depressed left ventricular ejection fraction, and the need for i.v. diuretic and/or inotropic drugs. A total of 15.7% of the patients died within 90 days of admission and 5.2% underwent ventricular assist device (VAD) implantation or urgent heart transplantation (UHT). The C-statistic of the ADHF/NT-proBNP risk score for 90-day mortality was 0.810 (95% CI: 0.769-0.852). Predicted and observed mortality rates were in close agreement. When the composite outcome of death/VAD/UHT at 90 days was considered, the C-statistic decreased to 0.741. During hospitalization, 7.6% of the patients died. The C-statistic for in-hospital mortality was 0.815 (95% CI: 0.761-0.868) and Hosmer-Lemeshow χ(2)=3.71 (P=0.716). The updated ADHF/NT-proBNP risk score outperformed the Acute Decompensated Heart Failure National Registry, the Organized Program to Initiate Lifesaving Treatment in Patients Hospitalized for Heart Failure, and the American Heart Association Get with the Guidelines Program predictive models. Updated ADHF/NT-proBNP risk score is a valuable tool for predicting short-term mortality in severe ADHF, outperforming existing inpatient predictive models.

  6. Collaborative Research: Using ARM Observations to Evaluate GCM Cloud Statistics for Development of Stochastic Cloud-Radiation Parameterizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Samuel S. P.

    2013-09-01

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been an interdisciplinary collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen). The motivation and long-term goal underlying this work is the utilization of stochastic radiative transfer theory (Lane-Veron and Somerville, 2004; Lane et al., 2002) to develop a new class of parametric representations of cloud-radiation interactions and closely related processes for atmospheric models. The theoretical advantage of the stochastic approach is that it can accurately calculate the radiative heating rates through a broken cloud layer without requiring an exact description of the cloud geometry.

  7. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  8. AgriSense-STARS: Advancing Methods of Agricultural Monitoring for Food Security in Smallholder Regions - the Case for Tanzania

    NASA Astrophysics Data System (ADS)

    Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.

    2015-12-01

    In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and most importantly for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems, generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data is collected on paper which is resource and time intensive, as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting the usefulness for analysis and quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT technologies for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools, 1) GLAM-East Africa, an automated MODIS satellite image processing system, 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs), and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representability and resource efficiency.

  9. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    PubMed

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

    In radiation oncology recurrence analysis is an important part in the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/outfield. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm, however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce time and effort conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
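
    The geometric part of the analysis, the fraction of a recurrence volume falling inside an isodose or boost-plus-margin volume, reduces to an overlap of co-registered binary masks; the sketch below uses synthetic masks and an assumed voxel spacing.

        import numpy as np

        shape = (64, 64, 32)
        voxel_volume_cc = 0.2 * 0.2 * 0.3                    # assumed voxel spacing in cm

        isodose_80 = np.zeros(shape, dtype=bool)
        isodose_80[20:44, 20:44, 10:22] = True               # placeholder 80%-isodose region
        recurrence = np.zeros(shape, dtype=bool)
        recurrence[30:50, 30:50, 12:20] = True               # placeholder recurrence segmentation

        overlap = np.logical_and(recurrence, isodose_80)
        fraction_inside = overlap.sum() / recurrence.sum()
        print(f"recurrence volume {recurrence.sum() * voxel_volume_cc:.1f} cc, "
              f"{100 * fraction_inside:.0f}% inside the 80%-isodose volume")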

  10. DarkBit: a GAMBIT module for computing dark matter observables and likelihoods

    NASA Astrophysics Data System (ADS)

    Bringmann, Torsten; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Kahlhoefer, Felix; Kvellestad, Anders; Putze, Antje; Savage, Christopher; Scott, Pat; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-12-01

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model.

  11. The Precision-Power-Gradient Theory for Teaching Basic Research Statistical Tools to Graduate Students.

    ERIC Educational Resources Information Center

    Cassel, Russell N.

    This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs and to this end they are are ordered to a continuum scale in terms of individual…

  12. Syndromic surveillance of influenza activity in Sweden: an evaluation of three tools.

    PubMed

    Ma, T; Englund, H; Bjelkmar, P; Wallensten, A; Hulth, A

    2015-08-01

    An evaluation was conducted to determine which syndromic surveillance tools complement traditional surveillance by serving as earlier indicators of influenza activity in Sweden. Web queries, medical hotline statistics, and school absenteeism data were evaluated against two traditional surveillance tools. Cross-correlation calculations utilized aggregated weekly data for all-age, nationwide activity for four influenza seasons, from 2009/2010 to 2012/2013. The surveillance tool indicative of earlier influenza activity, by way of statistical and visual evidence, was identified. The web query algorithm and medical hotline statistics performed as well as each other and as the traditional surveillance tools. School absenteeism data were not reliable resources for influenza surveillance. Overall, the syndromic surveillance tools did not perform with enough consistency, in either season lead or earlier timing of the peak week, to be considered early indicators. They do, however, capture incident cases before they have formally entered the primary healthcare system.
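
    The cross-correlation evaluation described above can be sketched by correlating two weekly series at several leads and lags; the series below are simulated, with the syndromic source built to lead by about one week.

        import numpy as np

        rng = np.random.default_rng(9)
        weeks = np.arange(104)
        lab_confirmed = np.exp(-0.5 * ((weeks % 52 - 6) / 3.0) ** 2) + rng.normal(0, 0.05, weeks.size)
        web_queries   = np.exp(-0.5 * ((weeks % 52 - 5) / 3.0) ** 2) + rng.normal(0, 0.05, weeks.size)

        def lagged_corr(x, y, lag):
            # correlation of x shifted 'lag' weeks ahead of y
            if lag > 0:
                return np.corrcoef(x[:-lag], y[lag:])[0, 1]
            if lag < 0:
                return np.corrcoef(x[-lag:], y[:lag])[0, 1]
            return np.corrcoef(x, y)[0, 1]

        for lag in range(-3, 4):
            print(f"lag {lag:+d} weeks: r = {lagged_corr(web_queries, lab_confirmed, lag):.2f}")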

  13. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  14. The issue of multiple univariate comparisons in the context of neuroelectric brain mapping: an application in a neuromarketing experiment.

    PubMed

    Vecchiato, G; De Vico Fallani, F; Astolfi, L; Toppi, J; Cincotti, F; Mattia, D; Salinari, S; Babiloni, F

    2010-08-30

    This paper presents some considerations about the use of adequate statistical techniques in the framework of neuroelectromagnetic brain mapping. With the use of advanced EEG/MEG recording setups involving hundreds of sensors, the issue of protection against the type I errors that could occur during the execution of hundreds of univariate statistical tests has gained interest. In the present experiment, we investigated the EEG signals from a mannequin acting as an experimental subject. Data were collected while performing a neuromarketing experiment and analyzed with state-of-the-art computational tools adopted in the specialized literature. Results showed that electric data from the mannequin's head presented statistically significant differences in power spectra during the visualization of a commercial advertisement compared to the power spectra gathered during a documentary, when no adjustments were made to the alpha level of the multiple univariate tests performed. The use of the Bonferroni or Bonferroni-Holm adjustments correctly returned no differences between the signals gathered from the mannequin in the two experimental conditions. A partial sample of recently published literature in different neuroscience journals suggested that at least 30% of the papers do not use statistical protection for type I errors. While the occurrence of type I errors can be easily managed with appropriate statistical techniques, the use of such techniques is still not so widely adopted in the literature. Copyright (c) 2010 Elsevier B.V. All rights reserved.
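
    The protection discussed above amounts to adjusting a family of per-sensor p-values; a minimal sketch with Bonferroni and Bonferroni-Holm corrections (simulated null p-values, not the EEG data) follows.

        import numpy as np
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(4)
        p_values = rng.uniform(size=128)       # e.g. one test per EEG/MEG channel, simulated under the null

        for method in ("bonferroni", "holm"):
            reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method=method)
            print(f"{method}: {reject.sum()} of {p_values.size} channels significant after adjustment")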

  15. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  16. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  17. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  18. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  19. Quantitative Measures for Software Independent Verification and Validation

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    1996-01-01

    As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. These experts' project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effect relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.
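
    The core statistical step, relating per-module complexity to observed faults, can be sketched with a rank correlation; the module metrics and fault counts below are simulated and are not the HAL/S data.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(6)
        cyclomatic = rng.integers(1, 60, size=200)             # per-module complexity metric (simulated)
        faults = rng.poisson(lam=0.15 * cyclomatic)            # more complex modules accrue more faults

        rho, p = spearmanr(cyclomatic, faults)
        print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")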

  20. A Non-Destructive Method for Distinguishing Reindeer Antler (Rangifer tarandus) from Red Deer Antler (Cervus elaphus) Using X-Ray Micro-Tomography Coupled with SVM Classifiers

    PubMed Central

    Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc

    2016-01-01

    Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modifications. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) were screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification and cultural heritage or scientific value has previously made it impossible (tools, ornaments, etc.). PMID:26901355
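
    The classification step can be sketched with a standard SVM on the six biomarkers named above; the feature values below are simulated rather than the micro-CT measurements, so the reported accuracy is illustrative only.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        n = 120                                        # antler samples (placeholder count)
        species = rng.integers(0, 2, size=n)           # 0 = red deer, 1 = reindeer
        # six biomarkers (BV/TV, Tb.N, Tb.Sp, Tb.Th, TBPf, SMI), values simulated with a species shift
        X = rng.normal(size=(n, 6)) + species[:, None] * 0.8

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        scores = cross_val_score(clf, X, species, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")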

  1. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).

  2. Recent progress and challenges in population genetics of polyploid organisms: an overview of current state-of-the-art molecular and statistical tools.

    PubMed

    Dufresne, France; Stift, Marc; Vergilino, Roland; Mable, Barbara K

    2014-01-01

    Despite the importance of polyploidy and the increasing availability of new genomic data, there remain important gaps in our knowledge of polyploid population genetics. These gaps arise from the complex nature of polyploid data (e.g. multiple alleles and loci, mixed inheritance patterns, association between ploidy and mating system variation). Furthermore, many of the standard tools for population genetics that have been developed for diploids are often not feasible for polyploids. This review aims to provide an overview of the state-of-the-art in polyploid population genetics and to identify the main areas where further development of molecular techniques and statistical theory is required. We review commonly used molecular tools (amplified fragment length polymorphism, microsatellites, Sanger sequencing, next-generation sequencing and derived technologies) and their challenges associated with their use in polyploid populations: that is, allele dosage determination, null alleles, difficulty of distinguishing orthologues from paralogues and copy number variation. In addition, we review the approaches that have been used for population genetic analysis in polyploids and their specific problems. These problems are in most cases directly associated with dosage uncertainty and the problem of inferring allele frequencies and assumptions regarding inheritance. This leads us to conclude that for advancing the field of polyploid population genetics, most priority should be given to development of new molecular approaches that allow efficient dosage determination, and to further development of analytical approaches to circumvent dosage uncertainty and to accommodate 'flexible' modes of inheritance. In addition, there is a need for more simulation-based studies that test what kinds of biases could result from both existing and novel approaches. © 2013 John Wiley & Sons Ltd.

  3. A Framework for Assessing High School Students' Statistical Reasoning.

    PubMed

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  4. A Framework for Assessing High School Students' Statistical Reasoning

    PubMed Central

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091

  5. Progress in modelling agricultural impacts of and adaptations to climate change.

    PubMed

    Rötter, R P; Hoffmann, M P; Koch, M; Müller, C

    2018-06-01

    Modelling is a key tool to explore agricultural impacts of and adaptations to climate change. Here we report recent progress made especially referring to the large project initiatives MACSUR and AgMIP; in particular, in modelling potential crop impacts from field to global using multi-model ensembles. We identify two main fields where further progress is necessary: a more mechanistic understanding of climate impacts and management options for adaptation and mitigation; and focusing on cropping systems and integrative multi-scale assessments instead of single season and crops, especially in complex tropical and neglected but important cropping systems. Stronger linking of experimentation with statistical and eco-physiological crop modelling could facilitate the necessary methodological advances. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Aging and Family Life: A Decade Review

    PubMed Central

    Silverstein, Merril; Giarrusso, Roseann

    2010-01-01

    In this review, we summarize and critically evaluate the major empirical, conceptual, and theoretical directions that studies of aging families have taken during the first decade of the 21st century. The field has benefited from an expanded perspective based on four overarching themes: (a) complexity in emotional relations, (b) diversity in family structures and households, (c) interdependence of family roles and functions, and (d) patterns and outcomes of caregiving. Although research on aging families has advanced theory and applied innovative statistical techniques, the literature has fallen short in fully representing diverse populations and in applying the broadest set of methodological tools available. We discuss these and other frontier areas of scholarship in light of the aging of baby boomers and their families. PMID:22930600

  7. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mike; Cipiti, Ben; Demuth, Scott Francis

    2017-01-30

The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  8. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durkee, Joe W.; Cipiti, Ben; Demuth, Scott Francis

The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  9. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
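    As a concrete illustration of one of the basic SPC techniques named above, the sketch below computes Shewhart individuals control-chart limits from a moving range. The measurement data are synthetic; this is not drawn from the Westinghouse work itself.

    ```python
    # Sketch: Shewhart individuals control-chart limits, one of the basic SPC
    # techniques under the SPM umbrella described here. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.normal(loc=10.0, scale=0.2, size=50)     # hypothetical process measurements

    moving_range = np.abs(np.diff(x))
    center = x.mean()
    sigma_hat = moving_range.mean() / 1.128          # d2 constant for subgroups of size 2
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

    out_of_control = np.where((x > ucl) | (x < lcl))[0]
    print(f"center={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}, flagged points: {out_of_control}")
    ```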

  10. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  11. NAUSEA and the Principle of Supplementarity of Damping and Isolation in Noise Control.

    DTIC Science & Technology

    1980-02-01

New approaches and uses of the statistical energy analysis (NAUSEA) have been considered and developed in recent months. The advances were made...possible in that the requirement, in the old statistical energy analysis, that the dynamic systems be highly reverberant and the couplings between the...analytical consideration in terms of the statistical energy analysis (SEA). A brief discussion and simple examples that relate to these recent advances

  12. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes NIRS data using the general linear model (GLM) and makes inferences based on the excursion probability of random fields interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For NIRS signals recorded simultaneously with fMRI, the spatial mapping between the fMRI image and real-world coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation that is not possible with conventional NIRS analysis tools.
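    To make the GLM step concrete, here is a minimal Python sketch that fits a single NIRS channel by ordinary least squares and tests the task regressor. The design and signal are hypothetical, and the pre-coloring/pre-whitening and random-field inference of the actual NIRS-SPM toolbox are omitted.

    ```python
    # Minimal GLM sketch for a single NIRS channel (illustrative only; not the
    # NIRS-SPM toolbox). Assumes a hypothetical boxcar task regressor and white
    # noise, i.e. none of the toolbox's temporal-correlation handling.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, fs = 1200, 10.0                        # samples, sampling rate (Hz)
    t = np.arange(n) / fs

    # Hypothetical block design: alternating 20 s rest / 20 s task
    task = ((t // 20) % 2).astype(float)
    X = np.column_stack([task, np.ones(n)])   # design matrix: [task, intercept]

    # Synthetic "measured" hemoglobin signal: task effect plus noise
    y = 0.5 * task + rng.normal(scale=1.0, size=n)

    # Ordinary least-squares fit and t-statistic for the task regressor
    beta, res_ss, *_ = np.linalg.lstsq(X, y, rcond=None)
    dof = n - X.shape[1]
    sigma2 = res_ss[0] / dof
    se_task = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
    t_stat = beta[0] / se_task
    p_val = stats.t.sf(t_stat, dof)           # one-sided p-value
    print(f"beta={beta[0]:.3f}, t={t_stat:.2f}, p={p_val:.4f}")
    ```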

  13. Microbial ecology to manage processes in environmental biotechnology.

    PubMed

    Rittmann, Bruce E

    2006-06-01

    Microbial ecology and environmental biotechnology are inherently tied to each other. The concepts and tools of microbial ecology are the basis for managing processes in environmental biotechnology; and these processes provide interesting ecosystems to advance the concepts and tools of microbial ecology. Revolutionary advancements in molecular tools to understand the structure and function of microbial communities are bolstering the power of microbial ecology. A push from advances in modern materials along with a pull from a societal need to become more sustainable is enabling environmental biotechnology to create novel processes. How do these two fields work together? Five principles illuminate the way: (i) aim for big benefits; (ii) develop and apply more powerful tools to understand microbial communities; (iii) follow the electrons; (iv) retain slow-growing biomass; and (v) integrate, integrate, integrate.

  14. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  15. New tools for the tracing of ancient starbursts: Analysing globular cluster systems using Lick indices

    NASA Astrophysics Data System (ADS)

    Lilly, T.; Fritze-v. Alvensleben, U.; de Grijs, R.

    2005-05-01

We present mathematically advanced tools for the determination of age, metallicity, and mass of old Globular Clusters (GCs) using both broad-band colors and spectral indices, and we present their application to the Globular Cluster Systems (GCSs) of elliptical galaxies. Since one of the most intriguing questions of today's astronomy aims at the evolutionary connection between (young) violently interacting galaxies at high redshift and the (old) elliptical galaxies we observe nearby, it is necessary to reveal the possibly violent star-formation history of these old galaxies. By means of evolutionary synthesis models, we can show that, using the integrated light of a galaxy's (composite) stellar content alone, it is impossible to date (and, actually, to identify) even very strong starbursts if these events took place more than two or three Gyr ago. However, since large and violent starbursts are associated with the formation of GCs, GCSs are very good tracers of the most violent starburst events in the history of their host galaxies. Using our well-established Göttingen SED (Spectral Energy Distribution) analysis tool, we can reveal the age, metallicity, mass (and possibly extinction) of GCs by comparing the observations with an extensive grid of SSP model colors. This is done in a statistically advanced and reasonable way, including their 1σ uncertainties. However, since for all colors the evolution slows down considerably at ages older than about 8 Gyr, even with several passbands and a long wavelength baseline, the results are severely uncertain for old clusters. Therefore, we incorporated empirical calibrations for Lick indices in our models and developed a Lick indices analysis tool that works in the same way as the SED analysis tool described above. We compare the theoretical possibilities and limitations of both methods as well as their results for the example of the cD galaxy NGC 1399, for which both multi-color observations and, for a subsample of clusters, spectral indices are available, and address implications for the nature and origin of the observed bimodal color distribution.
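    The grid comparison described above can be illustrated with a toy chi-square fit of observed cluster colors against a model grid. The grid values, observed colors, and uncertainties below are invented for illustration and are not taken from the Göttingen tool.

    ```python
    # Sketch: chi-square comparison of observed cluster colors against a grid of
    # SSP model colors with 1-sigma observational errors. Purely illustrative;
    # the grid is a made-up toy, not output of the actual analysis tool.
    import numpy as np

    # Hypothetical model grid: (age, metallicity) -> two broad-band colors
    ages = np.linspace(1, 14, 27)                # Gyr
    zs = np.array([-1.7, -0.7, -0.4, 0.0, 0.4])  # [Fe/H]
    grid_bv = 0.55 + 0.02 * ages[:, None] + 0.15 * (zs[None, :] + 1.0)
    grid_vi = 0.85 + 0.015 * ages[:, None] + 0.10 * (zs[None, :] + 1.0)

    obs = np.array([0.95, 1.10])                 # observed (B-V, V-I)
    err = np.array([0.03, 0.04])                 # 1-sigma uncertainties

    chi2 = ((grid_bv - obs[0]) / err[0]) ** 2 + ((grid_vi - obs[1]) / err[1]) ** 2
    i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
    print(f"best fit: age ~ {ages[i]:.1f} Gyr, [Fe/H] ~ {zs[j]:+.1f}, chi2 = {chi2[i, j]:.2f}")
    ```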

  16. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

Within the last decade better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g. OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new models for the software component of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.

  17. Visualization of multiple influences on ocellar flight control in giant honeybees with the data-mining tool Viscovery SOMine.

    PubMed

    Kastberger, G; Kranner, G

    2000-02-01

    Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science and to support dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks--namely, Kohonen's Batch-SOM, which is further enhanced with a new scaling technique for speeding up the learning process. This tool provides a powerful means by which to analyze complex data sets without prior statistical knowledge. The data representation contained in the trained SOM is systematically converted to be used in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of ocelli will affect orienting reactivities in relation to flight target, level of disturbance, and position of the bee in the flight chamber; it will induce phototaxis and make orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
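    For readers unfamiliar with the underlying algorithm, the following is a minimal batch-SOM training loop in Python. It is a generic sketch of Kohonen's Batch-SOM, not the Viscovery SOMine implementation, and it omits that tool's accelerated scaling technique.

    ```python
    # Minimal batch self-organizing map (SOM) sketch in the spirit of Kohonen's
    # Batch-SOM; illustrative only, not the Viscovery SOMine implementation.
    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(size=(500, 3))            # hypothetical data set, 3 features

    rows, cols = 8, 8
    grid = np.array([(i, j) for i in range(rows) for j in range(cols)], float)
    weights = rng.normal(size=(rows * cols, data.shape[1]))

    for epoch in range(30):
        sigma = 3.0 * np.exp(-epoch / 10.0)     # shrinking neighborhood radius
        # Best-matching unit (BMU) for every sample
        d = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(-1)
        bmu = d.argmin(axis=1)
        # Gaussian neighborhood of each map unit around every sample's BMU
        grid_d2 = ((grid[:, None, :] - grid[None, :, :]) ** 2).sum(-1)
        h = np.exp(-grid_d2[bmu] / (2 * sigma ** 2))      # (n_samples, n_units)
        # Batch update: neighborhood-weighted mean of the data
        weights = (h.T @ data) / h.sum(axis=0)[:, None]

    print("trained codebook shape:", weights.shape)
    ```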

  18. Science Initiatives of the US Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.

    2012-09-01

    The United States Virtual Astronomical Observatory program is the operational facility successor to the National Virtual Observatory development project. The primary goal of the US VAO is to build on the standards, protocols, and associated infrastructure developed by NVO and the International Virtual Observatory Alliance partners and to bring to fruition a suite of applications and web-based tools that greatly enhance the research productivity of professional astronomers. To this end, and guided by the advice of our Science Council (Fabbiano et al. 2011), we have focused on five science initiatives in the first two years of VAO operations: 1) scalable cross-comparisons between astronomical source catalogs, 2) dynamic spectral energy distribution construction, visualization, and model fitting, 3) integration and periodogram analysis of time series data from the Harvard Time Series Center and NASA Star and Exoplanet Database, 4) integration of VO data discovery and access tools into the IRAF data analysis environment, and 5) a web-based portal to VO data discovery, access, and display tools. We are also developing tools for data linking and semantic discovery, and have a plan for providing data mining and advanced statistical analysis resources for VAO users. Initial versions of these applications and web-based services are being released over the course of the summer and fall of 2011, with further updates and enhancements planned for throughout 2012 and beyond.

  19. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there are also considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Advances in bioluminescence imaging: new probes from old recipes.

    PubMed

    Yao, Zi; Zhang, Brendan S; Prescher, Jennifer A

    2018-06-04

    Bioluminescent probes are powerful tools for visualizing biology in live tissues and whole animals. Recent years have seen a surge in the number of new luciferases, luciferins, and related tools available for bioluminescence imaging. Many were crafted using classic methods of optical probe design and engineering. Here we highlight recent advances in bioluminescent tool discovery and development, along with applications of the probes in cells, tissues, and organisms. Collectively, these tools are improving in vivo imaging capabilities and bolstering new research directions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Advanced Tools Webinar Series Presents: Regulatory Issues and Case Studies of Advanced Tools

    EPA Science Inventory

    U.S. EPA has released A Guide for Assessing Biodegradation and Source Identification of Organic Ground Water Contaminants using Compound Specific Isotope Analysis (CSIA) [EPA 600/R-08/148 | December 2008 | www.epa.gov/ada]. The Guide provides recommendations for sample collecti...

  8. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay and radiation belt enhancements. The work leading to this paper has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2011-1) under grant agreement no. 284520 for the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Energization and Loss) collaborative research project.
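    A simplified version of the band-limited power estimation underlying such Pc3 statistics can be sketched with a plain Fourier spectrogram. The magnetometer series below is synthetic, and the authors' actual wavelet and neural-network suite is not reproduced here.

    ```python
    # Sketch of ULF power estimation in the Pc3 band (22-100 mHz). Illustrative
    # only: uses a plain Fourier spectrogram rather than the wavelet and neural
    # network methods described in the abstract; the input series is synthetic.
    import numpy as np
    from scipy.signal import spectrogram

    fs = 1.0                                    # 1 Hz sampling of a magnetometer component
    t = np.arange(0, 6 * 3600, 1 / fs)          # six hours of data
    rng = np.random.default_rng(2)
    b = 0.2 * np.sin(2 * np.pi * 0.045 * t) + rng.normal(scale=0.1, size=t.size)

    f, seg_t, Sxx = spectrogram(b, fs=fs, nperseg=1024, noverlap=512)
    pc3 = (f >= 0.022) & (f <= 0.100)           # Pc3 band: 22-100 mHz
    pc3_power = Sxx[pc3].sum(axis=0)            # summed PSD in the band per segment

    print("segments:", seg_t.size, "peak Pc3 power:", pc3_power.max())
    ```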

  9. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
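    One of the method families cited above, mediation analysis, can be illustrated with a minimal product-of-coefficients sketch. The data are simulated and the model is deliberately simplistic; it is not tied to any particular prevention study.

    ```python
    # Minimal product-of-coefficients mediation sketch (a method family the
    # abstract cites); synthetic data, not drawn from any prevention study.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    x = rng.binomial(1, 0.5, n).astype(float)       # program vs. control
    m = 0.6 * x + rng.normal(size=n)                # mediator (e.g., norms)
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)      # outcome (e.g., substance use)

    def ols(X, y):
        """Return OLS coefficients for a design matrix X that includes an intercept column."""
        return np.linalg.lstsq(X, y, rcond=None)[0]

    ones = np.ones(n)
    a = ols(np.column_stack([ones, x]), m)[1]           # x -> m path
    b = ols(np.column_stack([ones, x, m]), y)[2]        # m -> y path, adjusting for x
    print(f"mediated (indirect) effect a*b = {a * b:.3f}")
    ```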

  10. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II)

    DTIC Science & Technology

    2014-09-30

    Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II) Len Thomas, John Harwood, Catriona Harris, and Robert S... mammals changes over time. This project will develop statistical tools to allow mathematical models of the population consequences of acoustic...disturbance to be fitted to data from marine mammal populations. We will work closely with Phase II of the ONR PCAD Working Group, and will provide

  11. Introducing SONS, a tool for operational taxonomic unit-based comparisons of microbial community memberships and structures.

    PubMed

    Schloss, Patrick D; Handelsman, Jo

    2006-10-01

    The recent advent of tools enabling statistical inferences to be drawn from comparisons of microbial communities has enabled the focus of microbial ecology to move from characterizing biodiversity to describing the distribution of that biodiversity. Although statistical tools have been developed to compare community structures across a phylogenetic tree, we lack tools to compare the memberships and structures of two communities at a particular operational taxonomic unit (OTU) definition. Furthermore, current tests of community structure do not indicate the similarity of the communities but only report the probability of a statistical hypothesis. Here we present a computer program, SONS, which implements nonparametric estimators for the fraction and richness of OTUs shared between two communities.
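    As a rough illustration of the kind of comparison SONS performs, the sketch below counts the observed fraction of OTUs shared between two communities at a given OTU definition. Unlike SONS, it applies no nonparametric correction for unsampled OTUs.

    ```python
    # Sketch: observed fraction of OTUs shared between two communities. Purely
    # illustrative; SONS itself implements nonparametric estimators that
    # correct for unsampled OTUs, which this naive count does not.
    from collections import Counter

    # Hypothetical OTU abundance tables (OTU id -> read count)
    community_a = Counter({"otu1": 30, "otu2": 12, "otu3": 5, "otu4": 1})
    community_b = Counter({"otu2": 20, "otu3": 3, "otu5": 9})

    shared = set(community_a) & set(community_b)
    frac_a = len(shared) / len(community_a)      # fraction of A's OTUs also found in B
    frac_b = len(shared) / len(community_b)      # fraction of B's OTUs also found in A

    print(f"shared OTUs: {sorted(shared)}")
    print(f"fraction shared (A): {frac_a:.2f}, (B): {frac_b:.2f}")
    ```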

  12. Designing and Operating Through Compromise: Architectural Analysis of CKMS for the Advanced Metering Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duren, Mike; Aldridge, Hal; Abercrombie, Robert K

    2013-01-01

Compromises attributable to the Advanced Persistent Threat (APT) highlight the necessity for constant vigilance. The APT provides a new perspective on security metrics (e.g., statistics-based cyber security) and quantitative risk assessments. We consider design principles and models/tools that provide high assurance for energy delivery systems (EDS) operations regardless of the state of compromise. Cryptographic keys must be securely exchanged, then held and protected on either end of a communications link. This is challenging for a utility with numerous substations that must secure the intelligent electronic devices (IEDs) that may comprise a complex control system of systems. For example, distribution and management of keys among the millions of intelligent meters within the Advanced Metering Infrastructure (AMI) is being implemented as part of the National Smart Grid initiative. Without a means for a secure cryptographic key management system (CKMS) no cryptographic solution can be widely deployed to protect the EDS infrastructure from cyber-attack. We consider 1) how security modeling is applied to key management and cyber security concerns on a continuous basis from design through operation, 2) how trusted models and key management architectures greatly impact failure scenarios, and 3) how hardware-enabled trust is a critical element to detecting, surviving, and recovering from attack.

  13. Advanced Bode Plot Techniques for Ultrasonic Transducers

    NASA Astrophysics Data System (ADS)

    DeAngelis, D. A.; Schulze, G. W.

The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease-of-use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a priori process assessment. These advanced techniques expand from the basic constant voltage versus frequency sweep to include constant current and constant velocity interrogated locally on the transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects like jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used with welding transducers for semiconductor wire bonding. Several metrics are investigated such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry and coupled-field finite element analysis.
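    The basic object of study here, an impedance Bode plot from a frequency sweep, can be sketched with a simple series-RLC stand-in for the transducer. The component values are hypothetical and no PZT8-specific behavior is modeled.

    ```python
    # Sketch: building a basic impedance Bode plot from a constant-voltage
    # frequency sweep. Illustrative only; the response below is a simple
    # series-RLC stand-in, not a model of a PZT8 welding transducer.
    import numpy as np

    f = np.linspace(50e3, 70e3, 2001)            # sweep around a nominal 60 kHz mode
    w = 2 * np.pi * f
    R, L, C = 25.0, 12e-3, 0.6e-9                # hypothetical equivalent-circuit values
    Z = R + 1j * (w * L - 1 / (w * C))           # series-RLC impedance

    mag_db = 20 * np.log10(np.abs(Z))            # magnitude trace of the Bode plot
    phase_deg = np.degrees(np.angle(Z))          # phase trace of the Bode plot
    f_res = f[np.argmin(np.abs(Z))]              # series resonance: impedance minimum
    print(f"resonance ~ {f_res/1e3:.1f} kHz, |Z|min = {np.abs(Z).min():.1f} ohm")
    ```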

  14. Identifying opportunities to advance practice at a large academic medical center using the ASHP Ambulatory Care Self-Assessment Tool.

    PubMed

    Martirosov, Amber Lanae; Michael, Angela; McCarty, Melissa; Bacon, Opal; DiLodovico, John R; Jantz, Arin; Kostoff, Diana; MacDonald, Nancy C; Mikulandric, Nancy; Neme, Klodiana; Sulejmani, Nimisha; Summers, Bryant B

    2018-05-29

    The use of the ASHP Ambulatory Care Self-Assessment Tool to advance pharmacy practice at 8 ambulatory care clinics of a large academic medical center is described. The ASHP Ambulatory Care Self-Assessment Tool was developed to help ambulatory care pharmacists assess how their current practices align with the ASHP Practice Advancement Initiative. The Henry Ford Hospital Ambulatory Care Advisory Group (ACAG) opted to use the "Practitioner Track" sections of the tool to assess pharmacy practices within each of 8 ambulatory care clinics individually. The responses to self-assessment items were then compiled and discussed by ACAG members. The group identified best practices and ways to implement action items to advance ambulatory care practice throughout the institution. Three recommended action items were common to most clinics: (1) identify and evaluate solutions to deliver financially viable services, (2) develop technology to improve patient care, and (3) optimize the role of pharmacy technicians and support personnel. The ACAG leadership met with pharmacy administrators to discuss how action items that were both feasible and deemed likely to have a medium-to-high impact aligned with departmental goals and used this information to develop an ambulatory care strategic plan. This process informed and enabled initiatives to advance ambulatory care pharmacy practice within the system. The ASHP Ambulatory Care Self-Assessment Tool was useful in identifying opportunities for practice advancement in a large academic medical center. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  15. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Cancer.gov

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  16. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency of light-duty vehicles. It is a physics-based, forward-looking, full-vehicle computer simulator, which is cap...

  17. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  18. Serpentinomics-an emerging new field of study

    Treesearch

    Jessica Wright; Eric von Wettberg

    2009-01-01

    "Serpentinomics" is an emerging field of study which has the potential to greatly advance our understanding of serpentine ecology. Several newly developing –omic fields, often using high-throughput tools developed for molecular biology, will advance the field of serpentine ecology, or, "serpentinomics." Using tools from the...

  19. Conceptual Assessment Tool for Advanced Undergraduate Electrodynamics

    ERIC Educational Resources Information Center

    Baily, Charles; Ryan, Qing X.; Astolfi, Cecilia; Pollock, Steven J.

    2017-01-01

    As part of ongoing investigations into student learning in advanced undergraduate courses, we have developed a conceptual assessment tool for upper-division electrodynamics (E&M II): the Colorado UppeR-division ElectrodyNamics Test (CURrENT). This is a free response, postinstruction diagnostic with 6 multipart questions, an optional 3-question…

  20. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose Modeling and Predictive Toxicology (WC10)

    EPA Science Inventory

Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding in chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...

  1. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    PubMed

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01-1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. PROSPERO CRD42014012913.
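    For readers unfamiliar with the effect measure reported above, the sketch below computes a risk ratio and its 95% confidence interval from 2x2 counts using the standard large-sample approximation. The counts are hypothetical, not data from the included trials.

    ```python
    # Sketch: risk ratio (RR) with a 95% confidence interval from 2x2 counts,
    # the effect measure reported in this review. Counts are hypothetical.
    import math

    events_tool, n_tool = 60, 100        # e.g., completed advance directives with the tool
    events_usual, n_usual = 30, 100      # completions under usual care

    rr = (events_tool / n_tool) / (events_usual / n_usual)
    # Standard error of log(RR) under the usual large-sample approximation
    se_log_rr = math.sqrt(1 / events_tool - 1 / n_tool + 1 / events_usual - 1 / n_usual)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```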

  2. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis

    PubMed Central

    Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J.

    2016-01-01

    Background Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. Objectives To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. Methods We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Results Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25–4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43–2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01–1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05–1.30, p = 0.004, low quality evidence, 2 RCTs). Conclusions The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. Registration PROSPERO CRD42014012913 PMID:27119571

  3. The Use of Remotely Controlled Mandibular Positioner as a Predictive Screening Tool for Mandibular Advancement Device Therapy in Patients with Obstructive Sleep Apnea through Single-Night Progressive Titration of the Mandible: A Systematic Review

    PubMed Central

    Kastoer, Chloé; Dieltjens, Marijke; Oorts, Eline; Hamans, Evert; Braem, Marc J.; Van de Heyning, Paul H.; Vanderveken, Olivier M.

    2016-01-01

Study Objectives: To perform a review of the current evidence regarding the use of a remotely controlled mandibular positioner (RCMP) and to analyze the efficacy of RCMP as a predictive selection tool in the treatment of obstructive sleep apnea (OSA) with oral appliances that protrude the mandible (OAm), exclusively relying on single-night RCMP titration. Methods: An extensive literature search is performed through PubMed.com, Thecochranelibrary.com (CENTRAL only), Embase.com, and recent conference meeting abstracts in the field. Results: A total of 254 OSA patients from four full-text articles and five conference meeting abstracts contribute data to the review. Criteria for a successful RCMP test and for success with OAm differed between studies. Study populations were not fully comparable due to differences in the range of baseline apnea-hypopnea index (AHI). However, in all studies elimination of airway obstruction events during sleep by RCMP titration predicted OAm therapy success by the determination of the most effective target protrusive position (ETPP). A statistically significant association is found between mean AHI predicted outcome with RCMP and treatment outcome with OAm on polysomnographic or portable sleep monitoring evaluation (p < 0.05). Conclusions: The existing evidence regarding the use of RCMP in patients with OSA indicates that it might be possible to protrude the mandible progressively during sleep under poly(somno)graphic observation by RCMP until respiratory events are eliminated without disturbing sleep or arousing the patient. ETPP as measured by the use of RCMP was significantly associated with success of OAm therapy in the reported studies. RCMP might be a promising instrument for predicting OAm treatment outcome and targeting the degree of mandibular advancement needed. Citation: Kastoer C, Dieltjens M, Oorts E, Hamans E, Braem MJ, Van de Heyning PH, Vanderveken OM. The use of remotely controlled mandibular positioner as a predictive screening tool for mandibular advancement device therapy in patients with obstructive sleep apnea through single-night progressive titration of the mandible: a systematic review. J Clin Sleep Med 2016;12(10):1411–1421. PMID:27568892

  4. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews of the BMDS applications and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  5. Simplified tools for measuring retention in care in antiretroviral treatment program in Ethiopia: cohort and current retention in care.

    PubMed

    Assefa, Yibeltal; Worku, Alemayehu; Wouters, Edwin; Koole, Olivier; Haile Mariam, Damen; Van Damme, Wim

    2012-01-01

Patient retention in care is a critical challenge for antiretroviral treatment programs. This is mainly because retention in care is related to adherence to treatment and patient survival. It is therefore imperative that health facilities and programs measure patient retention in care. However, the currently available tools for measuring retention in care, such as Kaplan-Meier analysis, have many practical limitations. The objective of this study was to develop simplified tools for measuring retention in care. Retrospective cohort data were collected from patient registers in nine health facilities in Ethiopia. Retention in care was the primary outcome for the study. Tools were developed to measure "current retention" in care during a specific period of time for a specific "ART-age group" and "cohort retention" in care among patients who were followed for the last "Y" number of years on ART. "Probability of retention" based on the tool for "cohort retention" in care was compared with "probability of retention" based on Kaplan-Meier analysis. We found that the new tools make it possible to measure "current retention" and "cohort retention" in care. We also found that the tools were easy to use and did not require advanced statistical skills. Both "current retention" and "cohort retention" are lower among patients in the first two "ART-age groups" and "ART-age cohorts" than in subsequent "ART-age groups" and "ART-age cohorts". The "probability of retention" based on the new tools was found to be similar to the "probability of retention" based on Kaplan-Meier analysis. The simplified tools for "current retention" and "cohort retention" will enable practitioners and program managers to measure and monitor rates of retention in care easily and appropriately. We therefore recommend that health facilities and programs start to use these tools in their efforts to improve retention in care and patient outcomes.
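    To illustrate the kind of comparison the authors describe, the sketch below contrasts a simple cohort-retention proportion with a Kaplan-Meier estimate at the same time point. The follow-up data and the retention definition used here are illustrative assumptions, not the paper's actual register-based tools.

    ```python
    # Sketch: "cohort retention" as a simple proportion versus a Kaplan-Meier
    # survival estimate at the same time point. Data and definitions are
    # hypothetical stand-ins for the paper's register-based tools.
    import numpy as np

    # Months on ART until loss to follow-up (event=1) or censoring (event=0);
    # all ten hypothetical patients started ART at least 36 months ago.
    months = np.array([3, 6, 12, 12, 18, 24, 24, 24, 30, 36])
    lost   = np.array([1, 1,  1,  0,  0,  1,  0,  0,  0,  0])

    # "Cohort retention" at 24 months: fraction of the starting cohort not yet
    # recorded as lost to follow-up by month 24.
    cohort_retention = 1 - ((lost == 1) & (months <= 24)).mean()

    # Kaplan-Meier estimate of being retained at 24 months, for comparison.
    km = 1.0
    for t in sorted(set(months[lost == 1])):
        if t > 24:
            break
        at_risk = (months >= t).sum()
        events = ((months == t) & (lost == 1)).sum()
        km *= 1 - events / at_risk

    print(f"cohort retention at 24 months: {cohort_retention:.2f}")
    print(f"Kaplan-Meier retention at 24 months: {km:.2f}")
    ```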

  6. Future ATM Concepts Evaluation Tool (FACET) Interface Control Document

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.

    2017-01-01

    This Interface Control Document (ICD) documents the airspace adaptation and air traffic inputs of NASA's Future ATM Concepts and Evaluation Tool (FACET). Its intended audience is the project manager, project team, development team, and stakeholders interested in interfacing with the system. FACET equips Air Traffic Management (ATM) researchers and service providers with a way to explore, develop and evaluate advanced air transportation concepts before they are field-tested and eventually deployed. FACET is a flexible software tool that is capable of quickly generating and analyzing thousands of aircraft trajectories. It provides researchers with a simulation environment for preliminary testing of advanced ATM concepts. Using aircraft performance profiles, airspace models, weather data, and flight schedules, the tool models trajectories for the climb, cruise, and descent phases of flight for each type of aircraft. An advanced graphical interface displays traffic patterns in two and three dimensions, under various current and projected conditions for specific airspace regions or over the entire continental United States. The system is able to simulate a full day's dynamic national airspace system (NAS) operations, model system uncertainty, measure the impact of different decision-makers in the NAS, and provide analysis of the results in graphical form, including sector, airport, fix, and airway usage statistics. NASA researchers test and analyze the system-wide impact of new traffic flow management algorithms under anticipated air traffic growth projections on the nation's air traffic system. In addition to modeling the airspace system for NASA research, FACET has also successfully transitioned into a valuable tool for operational use. Federal Aviation Administration (FAA) traffic flow managers and commercial airline dispatchers have used FACET technology for real-time operations planning. FACET integrates live air traffic data from FAA radar systems and weather data from the National Weather Service to summarize NAS performance. This information allows system operators to reroute flights around congested airspace and severe weather to maintain safety and minimize delay. FACET also supports the planning and post-operational evaluation of reroute strategies at the national level to maximize system efficiency. For the commercial airline passenger, strategic planning with FACET can result in fewer flight delays and cancellations. The performance capabilities of FACET are largely due to its architecture, which strikes a balance between flexibility and fidelity. FACET is capable of modeling the airspace operations for the continental United States, processing thousands of aircraft on a single computer. FACET was written in Java and C, enabling the portability of its software to a variety of operating systems. In addition, FACET was designed with a modular software architecture to facilitate rapid prototyping of diverse ATM concepts. Several advanced ATM concepts have already been implemented in FACET, including aircraft self-separation, prediction of aircraft demand and sector congestion, system-wide impact assessment of traffic flow management constraints, and wind-optimal routing.

  7. An Introduction to Intelligent Processing Programs Developed by the Air Force Manufacturing Technology Directorate

    NASA Technical Reports Server (NTRS)

    Sampson, Paul G.; Sny, Linda C.

    1992-01-01

    The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

  8. Statistical Tests of Reliability of NDE

    NASA Technical Reports Server (NTRS)

    Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.

    1987-01-01

    Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.

  9. Advanced Categorical Statistics: Issues and Applications in Communication Research.

    ERIC Educational Resources Information Center

    Denham, Bryan E.

    2002-01-01

Discusses not only the procedures, assumptions, and applications of advanced categorical statistics but also some common misapplications, from which a great deal can be learned. Addresses the use and limitations of cross-tabulation and chi-square analysis, as well as issues such as observation independence and artificial inflation of a…
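    As a reminder of the baseline technique discussed here, the sketch below runs a chi-square test of independence on a hypothetical cross-tabulation and notes the independence assumption the article highlights.

    ```python
    # Sketch: chi-square test of independence on a 2x3 cross-tabulation, the
    # baseline categorical technique discussed in the article. Counts are
    # hypothetical.
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: message frame (gain, loss); columns: response (agree, neutral, disagree)
    table = np.array([[40, 25, 15],
                      [22, 30, 28]])

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
    # Artificial inflation of chi-square arises when the same respondents are
    # counted in multiple cells, violating observation independence.
    ```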

  10. Computer aided manual validation of mass spectrometry-based proteomic data.

    PubMed

    Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M

    2013-06-15

    Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehler, G.C.

As dramatic as the recent political changes in Eastern Europe and the Soviet Union are, some other factors are having at least as important an impact on the intelligence community's business. For example, new and more global problems have arisen, such as the proliferation of advanced weapons, economic competitiveness, and environmental concerns. It is obvious that intelligence requirements are on the increase. For the intelligence community, whose business is information gathering and processing, advanced information management tools are needed. Fortunately, recent technical advances offer these tools. Some of the more notable advances in information documentation, storage, and retrieval are described.

  12. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  13. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    PubMed

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources.

  14. easyGWAS: A Cloud-Based Platform for Comparing the Results of Genome-Wide Association Studies.

    PubMed

    Grimm, Dominik G; Roqueiro, Damian; Salomé, Patrice A; Kleeberger, Stefan; Greshake, Bastian; Zhu, Wangsheng; Liu, Chang; Lippert, Christoph; Stegle, Oliver; Schölkopf, Bernhard; Weigel, Detlef; Borgwardt, Karsten M

    2017-01-01

    The ever-growing availability of high-quality genotypes for a multitude of species has enabled researchers to explore the underlying genetic architecture of complex phenotypes at an unprecedented level of detail using genome-wide association studies (GWAS). The systematic comparison of results obtained from GWAS of different traits opens up new possibilities, including the analysis of pleiotropic effects. Other advantages that result from the integration of multiple GWAS are the ability to replicate GWAS signals and to increase statistical power to detect such signals through meta-analyses. In order to facilitate the simple comparison of GWAS results, we present easyGWAS, a powerful, species-independent online resource for computing, storing, sharing, annotating, and comparing GWAS. The easyGWAS tool supports multiple species, the uploading of private genotype data and summary statistics of existing GWAS, as well as advanced methods for comparing GWAS results across different experiments and data sets in an interactive and user-friendly interface. easyGWAS is also a public data repository for GWAS data and summary statistics and already includes published data and results from several major GWAS. We demonstrate the potential of easyGWAS with a case study of the model organism Arabidopsis thaliana, using flowering and growth-related traits. © 2016 American Society of Plant Biologists. All rights reserved.
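
    One of the advantages the record mentions, increased power through meta-analysis of GWAS summary statistics, is often implemented as an inverse-variance fixed-effect combination of per-SNP effect estimates. The sketch below shows that calculation with placeholder betas and standard errors; it is illustrative only and not easyGWAS code.

      # Hedged sketch of inverse-variance fixed-effect meta-analysis for one SNP
      # measured in two studies. Betas and standard errors are placeholders.
      import numpy as np
      from scipy.stats import norm

      beta = np.array([0.12, 0.18])  # per-study effect estimates
      se = np.array([0.05, 0.07])    # their standard errors

      w = 1.0 / se**2                          # inverse-variance weights
      beta_meta = np.sum(w * beta) / np.sum(w)
      se_meta = np.sqrt(1.0 / np.sum(w))
      z = beta_meta / se_meta
      p = 2 * norm.sf(abs(z))                  # two-sided p-value

      print(f"combined beta = {beta_meta:.3f} +/- {se_meta:.3f}, p = {p:.2e}")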

  15. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education

    PubMed Central

    Christou, Nicolas; Dinov, Ivo D.

    2011-01-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources. PMID:21603097

  16. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copy of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces. PMID:21487489

  17. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copy of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.

  18. Validation of virtual-reality-based simulations for endoscopic sinus surgery.

    PubMed

    Dharmawardana, N; Ruthenbeck, G; Woods, C; Elmiyeh, B; Diment, L; Ooi, E H; Reynolds, K; Carney, A S

    2015-12-01

    Virtual reality (VR) simulators provide an alternative to real patients for practicing surgical skills but require validation to ensure accuracy. Here, we validate the use of a virtual reality sinus surgery simulator with haptic feedback for training in Otorhinolaryngology - Head & Neck Surgery (OHNS). Participants were recruited from final-year medical students, interns, resident medical officers (RMOs), OHNS registrars and consultants. All participants completed an online questionnaire after performing four separate simulation tasks. These were then used to assess face, content and construct validity. ANOVA with post hoc correlation was used for statistical analysis. The following groups were compared: (i) medical students/interns, (ii) RMOs, (iii) registrars and (iv) consultants. Face validity results had a statistically significant (P < 0.05) difference between the consultant group and others, while there was no significant difference between medical student/intern and RMOs. Variability within groups was not significant. Content validity results based on consultant scoring and comments indicated that the simulations need further development in several areas to be effective for registrar-level teaching. However, students, interns and RMOs indicated that the simulations provide a useful tool for learning OHNS-related anatomy and as an introduction to ENT-specific procedures. The VR simulations have been validated for teaching sinus anatomy and nasendoscopy to medical students, interns and RMOs. However, they require further development before they can be regarded as a valid tool for more advanced surgical training. © 2015 John Wiley & Sons Ltd.
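
    The group comparison reported above (ANOVA across training levels followed by post hoc testing) can be pictured with the sketch below. The four groups and their scores are invented, and Tukey's HSD is used here as a generic post hoc stand-in rather than the exact procedure of the study.

      # Hedged sketch: one-way ANOVA across experience groups plus Tukey HSD
      # post hoc comparisons. Scores are synthetic, not study data.
      import numpy as np
      from scipy.stats import f_oneway
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      students = np.array([5.1, 4.8, 5.5, 5.0, 4.7])
      rmos = np.array([5.3, 5.6, 5.2, 5.8, 5.4])
      registrars = np.array([6.2, 6.5, 6.1, 6.8, 6.4])
      consultants = np.array([7.1, 7.4, 6.9, 7.3, 7.0])

      F, p = f_oneway(students, rmos, registrars, consultants)
      print(f"one-way ANOVA: F = {F:.2f}, p = {p:.4f}")

      scores = np.concatenate([students, rmos, registrars, consultants])
      groups = ["student"] * 5 + ["RMO"] * 5 + ["registrar"] * 5 + ["consultant"] * 5
      print(pairwise_tukeyhsd(scores, groups, alpha=0.05))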

  19. Contrast enhanced dual energy spectral mammogram, an emerging addendum in breast imaging.

    PubMed

    Kariyappa, Kalpana D; Gnanaprakasam, Francis; Anand, Subhapradha; Krishnaswami, Murali; Ramachandran, Madan

    2016-11-01

    To assess the role of contrast-enhanced dual-energy spectral mammogram (CEDM) as a problem-solving tool in equivocal cases. 44 consenting females with equivocal findings on full-field digital mammogram underwent CEDM. All the images were interpreted by two radiologists independently. Confidence of presence was plotted on a three-point Likert scale and probability of cancer was assigned on Breast Imaging Reporting and Data System scoring. Histopathology was taken as the gold standard. Statistical analyses of all variables were performed. 44 breast lesions were included in the study, among which 77.3% lesions were malignant or precancerous and 22.7% lesions were benign or inconclusive. 20% of lesions were identified only on CEDM. The true extent of the lesion was demonstrated in 15.9% of cases, multifocality was established in 9.1% of cases and ductal extension was demonstrated in 6.8% of cases. CEDM findings were statistically significant (p < 0.05). The interobserver kappa value was 0.837. CEDM has a useful role in identifying occult lesions in dense breasts and in triaging lesions. In a mammographically visible lesion, CEDM characterizes the lesion, affirms the finding and better demonstrates response to treatment. Hence, we conclude that CEDM is a useful complementary tool to the standard mammogram. Advances in knowledge: CEDM can detect and demonstrate lesions even in dense breasts, with the advantage that stereotactic biopsy is feasible in the same setting. Hence, it has the potential to become a screening modality, pending further studies and validation.
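
    The interobserver agreement figure quoted above (kappa = 0.837) is the kind of statistic that can be computed directly from two readers' category assignments. The sketch below calculates Cohen's kappa on invented ratings; it is a generic illustration, not the study's analysis code.

      # Hedged sketch: Cohen's kappa for agreement between two readers assigning
      # BI-RADS-style categories. The ratings are invented.
      from sklearn.metrics import cohen_kappa_score

      reader_1 = [2, 3, 4, 4, 5, 2, 3, 5, 4, 3, 5, 4]
      reader_2 = [2, 3, 4, 5, 5, 2, 3, 5, 4, 4, 5, 4]

      kappa = cohen_kappa_score(reader_1, reader_2)
      print(f"Cohen's kappa = {kappa:.3f}")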

  20. Nuclear magnetic resonance (NMR)-based metabolomics for cancer research.

    PubMed

    Ranjan, Renuka; Sinha, Neeraj

    2018-05-07

    Nuclear magnetic resonance (NMR) has emerged as an effective tool in various spheres of biomedical research, amongst which metabolomics is an important method for the study of various types of disease. Metabolomics has proved its stronghold in cancer research by the development of different NMR methods over time for the study of metabolites, thus identifying key players in the aetiology of cancer. A plethora of one-dimensional and two-dimensional NMR experiments (in solids, semi-solids and solution phases) are utilized to obtain metabolic profiles of biofluids, cell extracts and tissue biopsy samples, which can further be subjected to statistical analysis. Any alteration in the assigned metabolite peaks gives an indication of changes in metabolic pathways. These defined changes demonstrate the utility of NMR in the early diagnosis of cancer and provide further measures to combat malignancy and its progression. This review provides a snapshot of the trending NMR techniques and the statistical analysis involved in the metabolomics of diseases, with emphasis on advances in NMR methodology developed for cancer research. Copyright © 2018 John Wiley & Sons, Ltd.

  1. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: a new method of clinker characterization; combination of the electron probe technique with cluster analysis; simultaneous assessment of phase abundance, composition and bulk chemistry; experimental validation performed on industrial clinkers.
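
    The highlights mention combining electron probe spot analyses with cluster analysis. A generic way to picture that idea is to cluster the spot compositions, read phase chemistry from cluster means, phase abundance from cluster sizes, and bulk chemistry from the overall mean. The sketch below does this on synthetic two-phase data; it illustrates the general approach and is not the authors' implementation.

      # Hedged sketch: cluster microprobe spot analyses (oxide wt%) so that each
      # cluster approximates a phase. Compositions below are synthetic.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # Two synthetic "phases" in (CaO, SiO2, Al2O3) weight-percent space
      phase_a = rng.normal([71.0, 25.0, 1.0], 0.5, size=(60, 3))  # alite-like spots
      phase_b = rng.normal([65.0, 32.0, 2.0], 0.5, size=(40, 3))  # belite-like spots
      spots = np.vstack([phase_a, phase_b])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spots)
      for k in range(2):
          members = spots[km.labels_ == k]
          abundance = len(members) / len(spots)
          print(f"phase {k}: abundance {abundance:.2f}, "
                f"mean composition {members.mean(axis=0).round(1)}")

      print("bulk composition from all spots:", spots.mean(axis=0).round(1))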

  2. Correlation of Thermally Induced Pores with Microstructural Features Using High Energy X-rays

    NASA Astrophysics Data System (ADS)

    Menasche, David B.; Shade, Paul A.; Lind, Jonathan; Li, Shiu Fai; Bernier, Joel V.; Kenesei, Peter; Schuren, Jay C.; Suter, Robert M.

    2016-11-01

    Combined application of a near-field High Energy Diffraction Microscopy measurement of crystal lattice orientation fields and a tomographic measurement of pore distributions in a sintered nickel-based superalloy sample allows pore locations to be correlated with microstructural features. Measurements were carried out at the Advanced Photon Source beamline 1-ID using an X-ray energy of 65 keV for each of the measurement modes. The nickel superalloy sample was prepared in such a way as to generate significant thermally induced porosity. A three-dimensionally resolved orientation map is directly overlaid with the tomographically determined pore map through a careful registration procedure. The data are shown to reliably reproduce the expected correlations between specific microstructural features (triple lines and quadruple nodes) and pore positions. With the statistics afforded by the 3D data set, we conclude that within statistical limits, pore formation does not depend on the relative orientations of the grains. The experimental procedures and analysis tools illustrated are being applied to a variety of materials problems in which local heterogeneities can affect materials properties.

  3. SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators

    NASA Astrophysics Data System (ADS)

    Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.

    2011-12-01

    Complementing data analysis in earth system sciences with advanced visualisation tools has become essential given the rising complexity, amount and variety of available data. With respect to sea-level indicators (SLIs), their analysis in earth-system applications, such as modelling and simulation on regional or global scales, demands the consideration of large amounts of data (thousands of SLIs) and therefore has to go beyond the analysis of single sea-level curves. On the other hand, a gross analysis by means of statistical methods is hindered by the often heterogeneous and individual character of the single SLIs, i.e., the spatio-temporal context and often heterogeneous information are difficult to handle or to represent in an objective way. Therefore a concept integrating automated analysis and visualisation is mandatory; this is provided by visual analytics. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Starting as a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.

  4. A design tool for direct and non-stochastic calculations of near-field radiative transfer in complex structures: The NF-RT-FDTD algorithm

    NASA Astrophysics Data System (ADS)

    Didari, Azadeh; Pinar Mengüç, M.

    2017-08-01

    Advances in nanotechnology and nanophotonics are inextricably linked with the need for reliable computational algorithms to be adapted as design tools for the development of new concepts in energy harvesting, radiative cooling, nanolithography and nano-scale manufacturing, among others. In this paper, we provide an outline for such a computational tool, named NF-RT-FDTD, to determine the near-field radiative transfer between structured surfaces using the Finite Difference Time Domain method. NF-RT-FDTD is a direct and non-stochastic algorithm, which accounts for the statistical nature of the thermal radiation and is easily applicable to any arbitrary geometry at thermal equilibrium. We present a review of the fundamental relations for far- and near-field radiative transfer between different geometries with nano-scale surface and volumetric features and gaps, and then we discuss the details of the NF-RT-FDTD formulation, its application to sample geometries and outline its future expansion to more complex geometries. In addition, we briefly discuss some of the recent numerical works for direct and indirect calculations of near-field thermal radiation transfer, including Scattering Matrix method, Finite Difference Time Domain method (FDTD), Wiener Chaos Expansion, Fluctuating Surface Current (FSC), Fluctuating Volume Current (FVC) and Thermal Discrete Dipole Approximations (TDDA).

  5. Phaedra, a protocol-driven system for analysis and validation of high-content imaging and flow cytometry.

    PubMed

    Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel

    2012-04-01

    High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
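
    The dose-response curves mentioned above are typically summarized with a four-parameter logistic (4PL) fit. The sketch below fits such a curve to invented concentration-response data with SciPy; it illustrates the general technique and is not Phaedra's own fitting code.

      # Hedged sketch: four-parameter logistic (4PL) dose-response fit.
      # Concentrations and responses are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(log10_conc, bottom, top, log10_ec50, hill):
          # Decreasing response with increasing concentration (inhibition curve)
          return bottom + (top - bottom) / (1.0 + 10 ** (hill * (log10_conc - log10_ec50)))

      conc = np.array([1e-9, 3e-9, 1e-8, 3e-8, 1e-7, 3e-7, 1e-6, 3e-6])
      resp = np.array([0.98, 0.95, 0.88, 0.70, 0.45, 0.22, 0.10, 0.05])

      params, _ = curve_fit(four_pl, np.log10(conc), resp, p0=[0.0, 1.0, -7.0, 1.0])
      bottom, top, log10_ec50, hill = params
      print(f"EC50 ~ {10 ** log10_ec50:.2e} M, Hill slope ~ {hill:.2f}")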

  6. Integrative Functional Genomics for Systems Genetics in GeneWeaver.org.

    PubMed

    Bubier, Jason A; Langston, Michael A; Baker, Erich J; Chesler, Elissa J

    2017-01-01

    The abundance of existing functional genomics studies permits an integrative approach to interpreting and resolving the results of diverse systems genetics studies. However, a major challenge lies in assembling and harmonizing heterogeneous data sets across species for facile comparison to the positional candidate genes and coexpression networks that come from systems genetic studies. GeneWeaver is an online database and suite of tools at www.geneweaver.org that allows for fast aggregation and analysis of gene set-centric data. GeneWeaver contains curated experimental data together with resource-level data such as GO annotations, MP annotations, and KEGG pathways, along with persistent stores of user entered data sets. These can be entered directly into GeneWeaver or transferred from widely used resources such as GeneNetwork.org. Data are analyzed using statistical tools and advanced graph algorithms to discover new relations, prioritize candidate genes, and generate function hypotheses. Here we use GeneWeaver to find genes common to multiple gene sets, prioritize candidate genes from a quantitative trait locus, and characterize a set of differentially expressed genes. Coupling a large multispecies repository of curated and empirical functional genomics data to fast computational tools allows for the rapid integrative analysis of heterogeneous data for interpreting and extrapolating systems genetics results.

  7. Screening of multiple potential control genes for use in caste and body region comparisons using RT-qPCR in Coptotermes formosanus

    USDA-ARS?s Scientific Manuscript database

    Formosan subterranean termites, Coptotermes formosanus, are an important world wide pest. Molecular gene expression is an important tool for understanding the physiology of organisms. The recent advancement of molecular tools for Coptotermes formosanus is leading to advancement of the understanding ...

  8. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  9. STATWIZ - AN ELECTRONIC STATISTICAL TOOL (ABSTRACT)

    EPA Science Inventory

    StatWiz is a web-based, interactive, and dynamic statistical tool for researchers. It will allow researchers to input information and/or data and then receive experimental design options, or outputs from data analysis. StatWiz is envisioned as an expert system that will walk rese...

  10. On the impact of a refined stochastic model for airborne LiDAR measurements

    NASA Astrophysics Data System (ADS)

    Bolkas, Dimitrios; Fotopoulos, Georgia; Glennie, Craig

    2016-09-01

    Accurate topographic information is critical for a number of applications in science and engineering. In recent years, airborne light detection and ranging (LiDAR) has become a standard tool for acquiring high quality topographic information. The assessment of airborne LiDAR derived DEMs is typically based on (i) independent ground control points and (ii) forward error propagation utilizing the LiDAR geo-referencing equation. The latter approach is dependent on the stochastic model information of the LiDAR observation components. In this paper, the well-known statistical tool of variance component estimation (VCE) is implemented for a dataset in Houston, Texas, in order to refine the initial stochastic information. Simulations demonstrate the impact of stochastic-model refinement for two practical applications, namely coastal inundation mapping and surface displacement estimation. Results highlight scenarios where erroneous stochastic information is detrimental. Furthermore, the refined stochastic information provides insights on the effect of each LiDAR measurement in the airborne LiDAR error budget. The latter is important for targeting future advancements in order to improve point cloud accuracy.

  11. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.

  12. Radiomics: Images Are More than Pictures, They Are Data

    PubMed Central

    Kinahan, Paul E.; Hricak, Hedvig

    2016-01-01

    In the past decade, the field of medical image analysis has grown exponentially, with an increased number of pattern recognition tools and an increase in data set sizes. These advances have facilitated the development of processes for high-throughput extraction of quantitative features that result in the conversion of images into mineable data and the subsequent analysis of these data for decision support; this practice is termed radiomics. This is in contrast to the traditional practice of treating medical images as pictures intended solely for visual interpretation. Radiomic data contain first-, second-, and higher-order statistics. These data are combined with other patient data and are mined with sophisticated bioinformatics tools to develop models that may potentially improve diagnostic, prognostic, and predictive accuracy. Because radiomics analyses are intended to be conducted with standard of care images, it is conceivable that conversion of digital images to mineable data will eventually become routine practice. This report describes the process of radiomics, its challenges, and its potential power to facilitate better clinical decision making, particularly in the care of patients with cancer. PMID:26579733
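
    The first-order statistics mentioned in the record are straightforward to compute once a region of interest has been segmented. The sketch below derives a few such features (mean, standard deviation, skewness, kurtosis, histogram entropy) from a synthetic voxel array; it is illustrative and does not follow any particular radiomics standard.

      # Hedged sketch: first-order radiomic features from a region of interest.
      # The "ROI" here is random noise standing in for real voxel intensities.
      import numpy as np
      from scipy.stats import skew, kurtosis

      rng = np.random.default_rng(1)
      roi = rng.normal(loc=100.0, scale=15.0, size=(32, 32, 16))
      values = roi.ravel()

      hist, _ = np.histogram(values, bins=64)
      p = hist / hist.sum()
      entropy_bits = -np.sum(p[p > 0] * np.log2(p[p > 0]))

      features = {
          "mean": values.mean(),
          "std": values.std(),
          "skewness": skew(values),
          "kurtosis": kurtosis(values),
          "entropy_bits": entropy_bits,
      }
      for name, value in features.items():
          print(f"{name:>12s}: {value:.3f}")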

  13. Earth Observations, Models and Geo-Design in Support of SDG Implementation and Monitoring

    NASA Astrophysics Data System (ADS)

    Plag, H. P.; Jules-Plag, S.

    2016-12-01

    Implementation and monitoring of the United Nations' Sustainable Development Goals (SDGs) require support from the Earth observation and scientific communities. Applying a goal-based approach to determine the data needs for the Targets and Indicators associated with the SDGs demonstrates that integration of environmental with socio-economic and statistical data is required. Large data gaps exist for the built environment. A Geo-Design platform can provide the infrastructure and conceptual model for the data integration. The development of policies and actions to foster the implementation of SDGs in many cases requires research and the development of tools to answer "what if" questions. Here, agent-based models and model webs combined with a Geo-Design platform are promising avenues. This advanced combined infrastructure can also play a crucial role in the necessary capacity building. We will use the example of SDG 5 (Gender equality) to illustrate these approaches. SDG 11 (Sustainable Cities and Communities) is used to underline the cross-goal linkages and the joint benefits of Earth observations, data integration, and modeling tools for multiple SDGs.

  14. Spliceosome Profiling Visualizes Operations of a Dynamic RNP at Nucleotide Resolution.

    PubMed

    Burke, Jordan E; Longhurst, Adam D; Merkurjev, Daria; Sales-Lee, Jade; Rao, Beiduo; Moresco, James J; Yates, John R; Li, Jingyi Jessica; Madhani, Hiten D

    2018-05-03

    Tools to understand how the spliceosome functions in vivo have lagged behind advances in the structural biology of the spliceosome. Here, methods are described to globally profile spliceosome-bound pre-mRNA, intermediates, and spliced mRNA at nucleotide resolution. These tools are applied to three yeast species that span 600 million years of evolution. The sensitivity of the approach enables the detection of canonical and non-canonical events, including interrupted, recursive, and nested splicing. This application of statistical modeling uncovers independent roles for the size and position of the intron and the number of introns per transcript in substrate progression through the two catalytic stages. These include species-specific inputs suggestive of spliceosome-transcriptome coevolution. Further investigations reveal the ATP-dependent discard of numerous endogenous substrates after spliceosome assembly in vivo and connect this discard to intron retention, a form of splicing regulation. Spliceosome profiling is a quantitative, generalizable global technology used to investigate an RNP central to eukaryotic gene expression. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. On consciousness, resting state fMRI, and neurodynamics

    PubMed Central

    2010-01-01

    Background During recent years, functional magnetic resonance imaging (fMRI) of the brain has been introduced as a new tool to measure consciousness, both in clinical settings and in basic neurocognitive research. Moreover, advanced mathematical methods and theories have arrived in the field of fMRI (e.g. computational neuroimaging), and functional and structural brain connectivity can now be assessed non-invasively. Results The present work deals with a pluralistic approach to "consciousness", where we connect theory and tools from three quite different disciplines: (1) philosophy of mind (emergentism and global workspace theory), (2) functional neuroimaging acquisitions, and (3) theory of deterministic and statistical neurodynamics – in particular the Wilson-Cowan model and stochastic resonance. Conclusions Based on recent experimental and theoretical work, we believe that the study of large-scale neuronal processes (activity fluctuations, state transitions) that go on in the living human brain while examined with functional MRI during "resting state" can deepen our understanding of graded consciousness in a clinical setting, and clarify the concept of "consciousness" in neurocognitive and neurophilosophy research. PMID:20522270

  16. Three-dimensional anthropometric techniques applied to the fabrication of burn masks and the quantification of wound healing

    NASA Astrophysics Data System (ADS)

    Whitestone, Jennifer J.; Geisen, Glen R.; McQuiston, Barbara K.

    1997-03-01

    Anthropometric surveys conducted by the military provide comprehensive human body measurement data that are human interface requirements for successful mission performance of weapon systems, including cockpits, protective equipment, and clothing. The application of human body dimensions to model humans and human-machine performance begins with engineering anthropometry. There are two critical elements to engineering anthropometry: data acquisition and data analysis. First, the human body is captured dimensionally with either traditional anthropometric tools, such as calipers and tape measures, or with advanced image acquisition systems, such as a laser scanner. Next, numerous statistical analysis tools, such as multivariate modeling and feature envelopes, are used to effectively transition these data for design and evaluation of equipment and work environments. Recently, Air Force technology transfer allowed researchers at the Computerized Anthropometric Research and Design (CARD) Laboratory at Wright-Patterson Air Force Base to work with the Dayton, Ohio area medical community in assessing the rate of wound healing and improving the fit of total contact burn masks. This paper describes the successful application of CARD Lab engineering anthropometry to two medically oriented human interface problems.

  17. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  18. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  19. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  20. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) evolved into a powerful diagnostic tool and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of CAD, and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred—including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing the advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality tool that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  1. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  2. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
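
    Two of the steps listed for DAnTE, normalization and ANOVA, can be pictured with the generic sketch below: log2 abundances are median-centered per sample and a per-peptide ANOVA compares two groups. The matrix is synthetic and the code illustrates the workflow only; it is not DAnTE itself.

      # Hedged sketch: median centering of log2 abundances followed by a
      # per-peptide ANOVA across two groups. Data are synthetic.
      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(2)
      # 20 peptides x 6 samples (3 control, 3 treated), already log2-transformed
      data = rng.normal(20.0, 1.0, size=(20, 6))
      data[:5, 3:] += 1.5  # spike a treatment effect into the first 5 peptides

      # Median centering: subtract each sample's median so medians align across runs
      data -= np.median(data, axis=0, keepdims=True)

      groups = np.array(["ctrl", "ctrl", "ctrl", "trt", "trt", "trt"])
      pvals = np.array([f_oneway(row[groups == "ctrl"], row[groups == "trt"]).pvalue
                        for row in data])
      print("peptides with p < 0.05:", np.flatnonzero(pvals < 0.05))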

  3. Using Quality Management Tools to Enhance Feedback from Student Evaluations

    ERIC Educational Resources Information Center

    Jensen, John B.; Artz, Nancy

    2005-01-01

    Statistical tools found in the service quality assessment literature--the T² statistic combined with factor analysis--can enhance the feedback instructors receive from student ratings. T² examines variability across multiple sets of ratings to isolate individual respondents with aberrant response…
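
    A simple way to picture the T² idea in the record is to score each respondent's vector of item ratings by its squared Mahalanobis distance from the class mean, the quantity Hotelling's T² is built on. The sketch below does this on invented ratings; it is a generic illustration, not the authors' procedure, and it omits the factor-analysis step.

      # Hedged sketch: flag aberrant respondents by a T^2-style score, i.e. the
      # squared Mahalanobis distance of each rating vector from the class mean.
      # Ratings are invented.
      import numpy as np

      rng = np.random.default_rng(3)
      ratings = rng.normal(4.0, 0.5, size=(40, 6))  # 40 students x 6 rating items
      ratings[0] = [1, 5, 1, 5, 1, 5]               # one deliberately aberrant response

      mean = ratings.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(ratings, rowvar=False))

      diff = ratings - mean
      t2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # per-respondent scores
      print("most aberrant respondents:", np.argsort(t2)[::-1][:3])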

  4. DIGE Analysis of Human Tissues.

    PubMed

    Gelfi, Cecilia; Capitanio, Daniele

    2018-01-01

    Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed to not interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, once a proper labeling protocol is followed. The two-dye or three-dye systems can be adopted and their choice depends on specific applications. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generates robust quantitative results making data validation by independent technologies successful.

  5. The Road to Reproducibility in Animal Research.

    PubMed

    Jilka, Robert L

    2016-07-01

    Reproducibility of research findings is the hallmark of scientific advance. However, the recently noted lack of reproducibility and transparency of published research using animal models of human biology and disease has alarmed funders, scientists, and the public. Improved reporting of methodology and better use of statistical tools are needed to enhance the quality and utility of published research. Reporting guidelines like Animal Research: Reporting In Vivo Experiments (ARRIVE) have been devised to achieve these goals, but most biomedical research journals, including the JBMR, have not been able to obtain high compliance. Cooperative efforts among authors, reviewers and editors, empowered by increased awareness of their responsibilities and enabled by user-friendly guidelines, are needed to solve this problem. © 2016 American Society for Bone and Mineral Research.

  6. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues-understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales-cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  7. Using synchrotron light to accelerate EUV resist and mask materials learning

    NASA Astrophysics Data System (ADS)

    Naulleau, Patrick; Anderson, Christopher N.; Baclea-an, Lorie-Mae; Denham, Paul; George, Simi; Goldberg, Kenneth A.; Jones, Gideon; McClinton, Brittany; Miyakawa, Ryan; Mochi, Iacopo; Montgomery, Warren; Rekawa, Seno; Wallow, Tom

    2011-03-01

    As commercialization of extreme ultraviolet lithography (EUVL) progresses, direct industry activities are being focused on near term concerns. The question of long term extendibility of EUVL, however, remains crucial given the magnitude of the investments yet required to make EUVL a reality. Extendibility questions are best addressed using advanced research tools such as the SEMATECH Berkeley microfield exposure tool (MET) and actinic inspection tool (AIT). Utilizing Lawrence Berkeley National Laboratory's Advanced Light Source facility as the light source, these tools benefit from the unique properties of synchrotron light enabling research at nodes generations ahead of what is possible with commercial tools. The MET for example uses extremely bright undulator radiation to enable a lossless fully programmable coherence illuminator. Using such a system, resolution enhancing illuminations achieving k1 factors of 0.25 can readily be attained. Given the MET numerical aperture of 0.3, this translates to an ultimate resolution capability of 12 nm. Using such methods, the SEMATECH Berkeley MET has demonstrated resolution in resist to 16-nm half pitch and below in an imageable spin-on hard mask. At a half pitch of 16 nm, this material achieves a line-edge roughness of 2 nm with a correlation length of 6 nm. These new results demonstrate that the observed stall in ultimate resolution progress in chemically amplified resists is a materials issue rather than a tool limitation. With a resolution limit of 20-22 nm, the CAR champion from 2008 remains as the highest performing CAR tested to date. To enable continued advanced learning in EUV resists, SEMATECH has initiated a plan to implement a 0.5 NA microfield tool at the Advanced Light Source synchrotron facility. This tool will be capable of printing down to 8-nm half pitch.

  8. Nutritional Risk in Emergency-2017: A New Simplified Proposal for a Nutrition Screening Tool.

    PubMed

    Marcadenti, Aline; Mendes, Larissa Loures; Rabito, Estela Iraci; Fink, Jaqueline da Silva; Silva, Flávia Moraes

    2018-03-13

    There are many nutrition screening tools currently being applied in hospitals to identify risk of malnutrition. However, multivariate statistical models are not usually employed to take into account the importance of each variable included in the instrument's development. To develop and evaluate the concurrent and predictive validities of a new screening tool of nutrition risk. A prospective cohort study was developed, in which 4 nutrition screening tools were applied to all patients. Length of stay in hospital and mortality were considered to test the predictive validity, and the concurrent validity was tested by comparing the Nutritional Risk in Emergency (NRE)-2017 to the other tools. A total of 748 patients were included. The final NRE-2017 score was composed of 6 questions (advanced age, metabolic stress of the disease, decreased appetite, changing of food consistency, unintentional weight loss, and muscle mass loss) with yes or no answers. The prevalence of nutrition risk was 50.7% and 38.8% considering the cutoff points 1.0 and 1.5, respectively. The NRE-2017 showed a satisfactory power to identify risk of malnutrition (area under the curve >0.790 for all analyses). According to the NRE-2017, patients at risk of malnutrition have a twofold higher relative risk of a very long hospital stay. The hazard ratio for mortality was 2.78 (1.03-7.49) when the cutoff adopted by the NRE-2017 was 1.5 points. NRE-2017 is a new, easy-to-apply nutrition screening tool which uses 6 bi-categoric features to detect the risk of malnutrition, and it presented good concurrent and predictive validity. © 2018 American Society for Parenteral and Enteral Nutrition.

  9. 24 CFR 266.420 - Closing and endorsement by the Commissioner.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... (a) Closing. Before disbursement of loan advances in periodic advances cases, and in all cases after... market occupancy percentages, value/replacement cost, interest rate, and similar statistical information... certification for periodic advances cases, if submitted for final endorsement, that advances were made...

  10. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Watershed-scale hydrologic models are used for a variety of applications from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed-scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models and the Soil and Water Assessment Tool (SWAT) model, in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format meaning it is comprised of three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third research study presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both a 2 year and 10 year simulation duration. Leveraging the cloud as an on demand computing resource allowed for a significantly reduced calibration time such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance especially in terms of minimizing PB compared to the single objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. 
    In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally-demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the methodological advances presented in the first two studies. Therefore, the dissertation contains three independent but interrelated studies that collectively advance the field of watershed-scale hydrologic modeling and analysis.
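
    The two objective functions named in the abstract, Nash-Sutcliffe efficiency (E) and percent bias (PB), are standard formulas and can be sketched as below on a short, invented simulated-versus-observed flow series. The sign convention used for percent bias is one common SWAT usage and is an assumption here, not a statement of the dissertation's exact implementation.

      # Hedged sketch of the two calibration objectives: Nash-Sutcliffe efficiency
      # and percent bias. The flow values are illustrative only.
      import numpy as np

      def nash_sutcliffe(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def percent_bias(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      observed = [12.0, 15.5, 30.2, 22.1, 18.7, 14.3, 11.9]
      simulated = [11.2, 16.8, 27.5, 23.9, 17.1, 13.0, 12.4]

      print(f"NSE = {nash_sutcliffe(observed, simulated):.3f}")
      print(f"PBIAS = {percent_bias(observed, simulated):+.1f}%")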

  11. The Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in Dutch advanced cancer patients.

    PubMed

    Warmenhoven, Franca; van Rijswijk, Eric; Engels, Yvonne; Kan, Cornelis; Prins, Judith; van Weel, Chris; Vissers, Kris

    2012-02-01

    Depression is highly prevalent in advanced cancer patients, but the diagnosis of depressive disorder in patients with advanced cancer is difficult. Screening instruments could facilitate diagnosing depressive disorder in patients with advanced cancer. The aim of this study was to determine the validity of the Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in advanced cancer patients. Patients with advanced metastatic disease, visiting the outpatient palliative care department, were asked to fill out a self-questionnaire containing the Beck Depression Inventory (BDI-II) and a single screening question "Are you feeling depressed?" The mood section of the PRIME-MD was used as a gold standard. Sixty-one patients with advanced metastatic disease were eligible to be included in the study. Complete data were obtained from 46 patients. The area under the curve of the receiver operating characteristics analysis of the BDI-II was 0.82. The optimal cut-off point of the BDI-II was 16 with a sensitivity of 90% and a specificity of 69%. The single screening question showed a sensitivity of 50% and a specificity of 94%. The BDI-II seems an adequate screening tool for a depressive disorder in advanced cancer patients. The sensitivity of a single screening question is poor.
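
    The cutoff analysis reported above (AUC of 0.82, optimal BDI-II cutoff of 16) follows the usual receiver operating characteristic workflow: compute the ROC curve against the diagnostic gold standard and pick a threshold that balances sensitivity and specificity. The sketch below illustrates this with synthetic scores and Youden's J as the selection rule, which is an assumption here rather than necessarily the criterion used in the study.

      # Hedged sketch: ROC curve, AUC, and cutoff selection via Youden's J
      # (sensitivity + specificity - 1). Scores and diagnoses are synthetic.
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(4)
      depressed = rng.normal(24, 7, size=15).clip(0, 63)  # BDI-II scores, cases
      controls = rng.normal(11, 6, size=31).clip(0, 63)   # BDI-II scores, non-cases

      scores = np.concatenate([depressed, controls])
      truth = np.concatenate([np.ones(15, dtype=int), np.zeros(31, dtype=int)])

      fpr, tpr, thresholds = roc_curve(truth, scores)
      auc = roc_auc_score(truth, scores)
      best = np.argmax(tpr - fpr)  # index maximizing Youden's J
      print(f"AUC = {auc:.2f}; cutoff ~ {thresholds[best]:.1f}, "
            f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")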

  12. Improvement of electrical resistivity tomography for leachate injection monitoring.

    PubMed

    Clément, R; Descloitres, M; Günther, T; Oxarango, L; Morra, C; Laurent, J-P; Gourc, J-P

    2010-03-01

    Leachate recirculation is a key process in the scope of operating municipal waste landfills as bioreactors, which aims to increase the moisture content to optimize the biodegradation in landfills. Given that liquid flows exhibit a complex behaviour in very heterogeneous porous media, in situ monitoring methods are required. Surface time-lapse electrical resistivity tomography (ERT) is usually proposed. Using numerical modelling with typical 2D and 3D injection plume patterns and 2D and 3D inversion codes, we show that wrong changes of resistivity can be calculated at depth if standard parameters are used for time-lapse ERT inversion. Major artefacts typically exhibit significant increases of resistivity (more than +30%) which can be misinterpreted as gas migration within the waste. In order to eliminate these artefacts, we tested an advanced time-lapse ERT procedure that includes (i) two advanced inversion tools and (ii) two alternative array geometries. The first advanced tool uses invariant regions in the model. The second advanced tool uses an inversion with a "minimum length" constraint. The alternative arrays focus on (i) a pole-dipole array (2D case), and (ii) a star array (3D case). The results show that these two advanced inversion tools and the two alternative arrays remove almost completely the artefacts within +/-5% both for 2D and 3D situations. As a field application, time-lapse ERT is applied using the star array during a 3D leachate injection in a non-hazardous municipal waste landfill. To evaluate the robustness of the two advanced tools, a synthetic model including both true decrease and increase of resistivity is built. The advanced time-lapse ERT procedure eliminates unwanted artefacts, while keeping a satisfactory image of true resistivity variations. This study demonstrates that significant and robust improvements can be obtained for time-lapse ERT monitoring of leachate recirculation in waste landfills. Copyright 2009 Elsevier Ltd. All rights reserved.

  13. Improvement of electrical resistivity tomography for leachate injection monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clement, R., E-mail: remi.clement@hmg.inpg.f; Descloitres, M.; Guenther, T., E-mail: Thomas.Guenther@liag-hannover.d

    2010-03-15

    Leachate recirculation is a key process in the scope of operating municipal waste landfills as bioreactors, which aims to increase the moisture content to optimize the biodegradation in landfills. Given that liquid flows exhibit a complex behaviour in very heterogeneous porous media, in situ monitoring methods are required. Surface time-lapse electrical resistivity tomography (ERT) is usually proposed. Using numerical modelling with typical 2D and 3D injection plume patterns and 2D and 3D inversion codes, we show that wrong changes of resistivity can be calculated at depth if standard parameters are used for time-lapse ERT inversion. Major artefacts typically exhibit significant increases of resistivity (more than +30%) which can be misinterpreted as gas migration within the waste. In order to eliminate these artefacts, we tested an advanced time-lapse ERT procedure that includes (i) two advanced inversion tools and (ii) two alternative array geometries. The first advanced tool uses invariant regions in the model. The second advanced tool uses an inversion with a 'minimum length' constraint. The alternative arrays focus on (i) a pole-dipole array (2D case), and (ii) a star array (3D case). The results show that these two advanced inversion tools and the two alternative arrays remove almost completely the artefacts within +/-5% both for 2D and 3D situations. As a field application, time-lapse ERT is applied using the star array during a 3D leachate injection in a non-hazardous municipal waste landfill. To evaluate the robustness of the two advanced tools, a synthetic model including both true decrease and increase of resistivity is built. The advanced time-lapse ERT procedure eliminates unwanted artefacts, while keeping a satisfactory image of true resistivity variations. This study demonstrates that significant and robust improvements can be obtained for time-lapse ERT monitoring of leachate recirculation in waste landfills.

  15. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  16. Synthetic biology advances for pharmaceutical production

    PubMed Central

    Breitling, Rainer; Takano, Eriko

    2015-01-01

    Synthetic biology enables a new generation of microbial engineering for the biotechnological production of pharmaceuticals and other high-value chemicals. This review presents an overview of recent advances in the field, describing new computational and experimental tools for the discovery, optimization and production of bioactive molecules, and outlining progress towards the application of these tools to pharmaceutical production systems. PMID:25744872

  17. STATISTICAL TECHNIQUES FOR DETERMINATION AND PREDICTION OF FUNDAMENTAL FISH ASSEMBLAGES OF THE MID-ATLANTIC HIGHLANDS

    EPA Science Inventory

    A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...

  18. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  19. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  20. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  1. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are becoming more frequent; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful and three failed projects, are reviewed, with success and failure defined by the project owner.
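
    The presentation does not give the SPCT formulas, but the general idea of statistical project control can be sketched with a Shewhart-style control chart on a project metric; the cost-variance numbers and the 3-sigma limits below are illustrative assumptions, not the SPCT method itself.

    import numpy as np

    # Hypothetical monthly cost-variance data (%) from a baseline set of projects.
    baseline = np.array([1.2, -0.8, 0.5, 2.1, -1.5, 0.9, -0.3, 1.8, -1.1, 0.4])
    center = baseline.mean()
    sigma = baseline.std(ddof=1)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma   # 3-sigma control limits

    # Track a new project month by month and flag out-of-control points.
    new_project = [0.7, 1.9, 3.2, 5.6, 7.4]
    for month, cv in enumerate(new_project, start=1):
        status = "OUT OF CONTROL" if (cv > ucl or cv < lcl) else "in control"
        print(f"month {month}: cost variance {cv:+.1f}% -> {status}")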

  2. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  3. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  4. 48 CFR 31.109 - Advance agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...

  5. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  6. Molecules to maps: tools for visualization and interaction in support of computational biology.

    PubMed

    Kraemer, E T; Ferrin, T E

    1998-01-01

    The volume of data produced by genome projects, X-ray crystallography, NMR spectroscopy, and electron and confocal microscopy presents the bioinformatics community with new challenges for analyzing, understanding, and exchanging these data. At the 1998 Pacific Symposium on Biocomputing, a track entitled 'Molecules to Maps: Tools for Visualization and Interaction in Computational Biology' provided tool developers and users with the opportunity to discuss advances in tools and techniques to assist scientists in evaluating, absorbing, navigating, and correlating this sea of information, through visualization and user interaction. In this paper we present these advances and discuss some of the challenges that remain to be solved.

  7. The Biophysics of Infection.

    PubMed

    Leake, Mark C

    2016-01-01

    Our understanding of the processes involved in infection has grown enormously in the past decade due in part to emerging methods of biophysics. This new insight has been enabled through advances in interdisciplinary experimental technologies and theoretical methods at the cutting-edge interface of the life and physical sciences. For example, this has involved several state-of-the-art biophysical tools used in conjunction with molecular and cell biology approaches, which enable investigation of infection in living cells. There are also new, emerging interfacial science tools which enable significant improvements to the resolution of quantitative measurements both in space and time. These include single-molecule biophysics methods and super-resolution microscopy approaches. These new technological tools in particular have underpinned much new understanding of dynamic processes of infection at a molecular length scale. There have also been many valuable recent advances in theoretical biophysics that enable predictive modelling to generate new understanding of infection. Here, I discuss these advances, take stock of our knowledge of the biophysics of infection, and consider where future advances may lead.

  8. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
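
    One of the listed capabilities, gene-list enrichment analysis, is commonly implemented as a hypergeometric (over-representation) test. The sketch below is a generic illustration of that statistic, not MetaCore or KPA code, and the gene counts are invented.

    from scipy.stats import hypergeom

    # Hypothetical numbers: a genome of 20,000 genes, a pathway with 150 members,
    # an experimental gene list of 300 genes, 12 of which fall in the pathway.
    genome_size = 20000
    pathway_size = 150
    list_size = 300
    overlap = 12

    # P(X >= overlap) under the hypergeometric null of random draws from the genome.
    p_value = hypergeom.sf(overlap - 1, genome_size, pathway_size, list_size)
    fold_enrichment = (overlap / list_size) / (pathway_size / genome_size)

    print(f"fold enrichment = {fold_enrichment:.1f}, p = {p_value:.2e}")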

  9. Developing Statistical Literacy with Year 9 Students: A Collaborative Research Project

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2013-01-01

    Advances in technology and communication have increased the amount of statistical information delivered through everyday media. The importance of statistics in everyday life has led to calls for increased attention to statistical literacy in the mathematics curriculum (Watson 2006). Gal (2004) sees statistical literacy as the need for students to…

  10. Teaching Statistics Online: A Decade's Review of the Literature about What Works

    ERIC Educational Resources Information Center

    Mills, Jamie D.; Raju, Dheeraj

    2011-01-01

    A statistics course can be a very challenging subject to teach. To enhance learning, today's modern course in statistics might incorporate many different aspects of technology. Due to advances in technology, teaching statistics online has also become a popular course option. Although researchers are studying how to deliver statistics courses in…

  11. The Shock and Vibration Digest. Volume 15, Number 7

    DTIC Science & Technology

    1983-07-01

    Recoverable fragments of this extraction-damaged abstract read: "systems noise -- for example, from a specific metal, chain driven, con-"; "...tant analytical tool, the statistical energy analysis method, has been the subject..."; "Experimental Determination of Vibration Parameters Required in the Statistical Energy Analysis Meth-"; 31. Dubowsky, S. and Morris, T.L., "An..."; "Coupling Loss Factors for Statistical Energy Analysis of Sound Trans-"; 55. Upton, R., "Sound Intensity - A Powerful New Measurement Tool," S/V, Sound

  12. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
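
    DeConv_Tool itself is written in IDL; as a language-neutral illustration of one of the algorithms it implements, the Python sketch below runs a few Richardson-Lucy (maximum likelihood) iterations on a one-dimensional signal. The signal, PSF and iteration count are arbitrary assumptions.

    import numpy as np

    def richardson_lucy(blurred, psf, iterations=25):
        """Maximum-likelihood (Richardson-Lucy) deconvolution of a 1D signal."""
        estimate = np.full_like(blurred, blurred.mean())  # flat starting estimate
        psf_mirror = psf[::-1]
        for _ in range(iterations):
            reblurred = np.convolve(estimate, psf, mode="same")
            ratio = blurred / np.maximum(reblurred, 1e-12)  # avoid division by zero
            estimate *= np.convolve(ratio, psf_mirror, mode="same")
        return estimate

    # Hypothetical example: a two-spike signal blurred by a Gaussian PSF.
    x = np.zeros(64); x[20] = 1.0; x[40] = 0.6
    psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2); psf /= psf.sum()
    blurred = np.convolve(x, psf, mode="same")
    restored = richardson_lucy(blurred, psf)
    print("peak positions recovered near:", np.argsort(restored)[-2:])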

  13. Web-based Data Exploration, Exploitation and Visualization Tools for Satellite Sensor VIS/IR Calibration Applications

    NASA Astrophysics Data System (ADS)

    Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.

    2016-12-01

    The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R (Language for Statistical Computing and Visualization) with a web interface for the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the Earth target, the atmospheric and cloud conditions or scene type, and the angular conditions under which the sensor radiance pairs are obtained. The SBAF will need to be customized for each inter-calibration target and sensor pair. The advantages of having a community open-source tool are: 1) only one archive of the SCIAMACHY, Hyperion, and IASI datasets, on the order of 50 TB, needs to be maintained; 2) the framework allows easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as PW; 3) improvements or suggestions to the web tools or the SBAF algorithm, once incorporated, benefit the community at large; and 4) the customization effort falls on the user rather than on the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases that highlight the value added by these datasets.
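
    As a rough illustration of what a spectral band adjustment factor represents, and not the CERES team's R code, the sketch below convolves one hypothetical hyper-spectral reflectance spectrum with two made-up sensor spectral response functions and takes the ratio of the resulting band reflectances.

    import numpy as np

    # Hypothetical wavelength grid (nm) and an Earth-scene reflectance spectrum.
    wl = np.linspace(500.0, 900.0, 401)
    dwl = wl[1] - wl[0]
    scene = 0.25 + 0.15 * np.exp(-0.5 * ((wl - 760.0) / 60.0) ** 2)

    def gaussian_srf(center, width):
        """Made-up Gaussian spectral response function, normalised to unit area."""
        srf = np.exp(-0.5 * ((wl - center) / width) ** 2)
        return srf / (srf.sum() * dwl)

    srf_reference = gaussian_srf(650.0, 40.0)   # band of the "reference" sensor
    srf_target = gaussian_srf(680.0, 55.0)      # band of the "target" sensor

    # SRF-weighted band reflectances and their ratio (the SBAF for this scene type).
    band_reference = np.sum(scene * srf_reference) * dwl
    band_target = np.sum(scene * srf_target) * dwl
    sbaf = band_reference / band_target
    print(f"SBAF (reference / target) = {sbaf:.4f}")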

  14. Spinoff 2011

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Topics include: Bioreactors Drive Advances in Tissue Engineering; Tooling Techniques Enhance Medical Imaging; Ventilator Technologies Sustain Critically Injured Patients; Protein Innovations Advance Drug Treatments, Skin Care; Mass Analyzers Facilitate Research on Addiction; Frameworks Coordinate Scientific Data Management; Cameras Improve Navigation for Pilots, Drivers; Integrated Design Tools Reduce Risk, Cost; Advisory Systems Save Time, Fuel for Airlines; Modeling Programs Increase Aircraft Design Safety; Fly-by-Wire Systems Enable Safer, More Efficient Flight; Modified Fittings Enhance Industrial Safety; Simulation Tools Model Icing for Aircraft Design; Information Systems Coordinate Emergency Management; Imaging Systems Provide Maps for U.S. Soldiers; High-Pressure Systems Suppress Fires in Seconds; Alloy-Enhanced Fans Maintain Fresh Air in Tunnels; Control Algorithms Charge Batteries Faster; Software Programs Derive Measurements from Photographs; Retrofits Convert Gas Vehicles into Hybrids; NASA Missions Inspire Online Video Games; Monitors Track Vital Signs for Fitness and Safety; Thermal Components Boost Performance of HVAC Systems; World Wind Tools Reveal Environmental Change; Analyzers Measure Greenhouse Gasses, Airborne Pollutants; Remediation Technologies Eliminate Contaminants; Receivers Gather Data for Climate, Weather Prediction; Coating Processes Boost Performance of Solar Cells; Analyzers Provide Water Security in Space and on Earth; Catalyst Substrates Remove Contaminants, Produce Fuel; Rocket Engine Innovations Advance Clean Energy; Technologies Render Views of Earth for Virtual Navigation; Content Platforms Meet Data Storage, Retrieval Needs; Tools Ensure Reliability of Critical Software; Electronic Handbooks Simplify Process Management; Software Innovations Speed Scientific Computing; Controller Chips Preserve Microprocessor Function; Nanotube Production Devices Expand Research Capabilities; Custom Machines Advance Composite Manufacturing; Polyimide Foams Offer Superior Insulation; Beam Steering Devices Reduce Payload Weight; Models Support Energy-Saving Microwave Technologies; Materials Advance Chemical Propulsion Technology; and High-Temperature Coatings Offer Energy Savings.

  15. Statistical methods used in the public health literature and implications for training of public health professionals

    PubMed Central

    Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.

    2017-01-01

    Statistical literacy and knowledge is needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals. PMID:28591190

  16. Statistical methods used in the public health literature and implications for training of public health professionals.

    PubMed

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge is needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals.

  17. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that the landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
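
    Regional triggering relationships of the kind mentioned above are often expressed as a rainfall intensity-duration power law, I = a * D^(-b). The sketch below applies such a threshold to a few hypothetical storm records; the coefficients and rainfall values are illustrative and are not those calibrated in the study.

    # Hypothetical intensity-duration threshold: I_crit (mm/h) = a * D(h) ** -b
    A_COEFF, B_EXP = 12.0, 0.6

    storms = [
        {"name": "storm 1", "duration_h": 6,  "rain_mm": 90},   # short, intense
        {"name": "storm 2", "duration_h": 48, "rain_mm": 160},  # long, heavy
        {"name": "storm 3", "duration_h": 24, "rain_mm": 30},   # long, light
    ]

    for s in storms:
        intensity = s["rain_mm"] / s["duration_h"]            # mean intensity, mm/h
        critical = A_COEFF * s["duration_h"] ** (-B_EXP)      # threshold at this duration
        exceeds = intensity > critical
        print(f'{s["name"]}: I={intensity:.1f} mm/h, threshold={critical:.1f} mm/h, '
              f'landslide potential={"HIGH" if exceeds else "low"}')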

  18. The Math Problem: Advertising Students' Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Kendrick, Alice

    2013-01-01

    This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…

  19. Synergistic Role of Newer Techniques for Forensic and Postmortem CT Examinations.

    PubMed

    Blum, Alain; Kolopp, Martin; Teixeira, Pedro Gondim; Stroud, Tyler; Noirtin, Philippe; Coudane, Henry; Martrille, Laurent

    2018-04-30

    The aim of this article is to provide an overview of newer techniques and postprocessing tools that improve the potential impact of CT in forensic situations. CT has become a standard tool in medicolegal practice. Postmortem CT is an essential aid to the pathologist during autopsies. Advances in technology and software are constantly leading to advances in its performance.

  20. Machine Tool Advanced Skills Technology Program (MAST). Overview and Methodology.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology Program (MAST) is a geographical partnership of six of the nation's best two-year colleges located in the six states that have about one-third of the density of metals-related industries in the United States. The purpose of the MAST grant is to develop and implement a national training model to overcome…

  1. A Data Warehouse Architecture for DoD Healthcare Performance Measurements.

    DTIC Science & Technology

    1999-09-01

    With the DoD healthcare...framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics.

  2. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  3. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  4. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

    With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  5. Head and neck cancer: proteomic advances and biomarker achievements.

    PubMed

    Rezende, Taia Maria Berto; de Souza Freire, Mirna; Franco, Octávio Luiz

    2010-11-01

    Tumors of the head and neck comprise an important neoplasia group, the incidence of which is increasing in many parts of the world. Recent advances in diagnostic and therapeutic techniques for these lesions have yielded novel molecular targets, uncovered signal pathway dominance, and advanced early cancer detection. Proteomics is a powerful tool for investigating the distribution of proteins and small molecules within biological systems through the analysis of different types of samples. The proteomic profiles of different types of cancer have been studied, and this has provided remarkable advances in cancer understanding. This review covers recent advances for head and neck cancer; it encompasses the risk factors, pathogenesis, proteomic tools that can help in understanding cancer, and new proteomic findings in this type of cancer. Copyright © 2010 American Cancer Society.

  6. Research Education in Undergraduate Occupational Therapy Programs.

    ERIC Educational Resources Information Center

    Petersen, Paul; And Others

    1992-01-01

    Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)

  7. An implementation evaluation of a qualitative culture assessment tool.

    PubMed

    Tappin, D C; Bentley, T A; Ashby, L E

    2015-03-01

    Safety culture has been identified as a critical element of healthy and safe workplaces and as such warrants the attention of ergonomists involved in occupational health and safety (OHS). This study sought to evaluate a tool for assessing organisational safety culture as it impacts a common OHS problem: musculoskeletal disorders (MSD). The level of advancement across nine cultural aspects was assessed in two implementation site organisations. These organisations, in residential healthcare and timber processing, enabled evaluation of the tool in contrasting settings, with reported MSD rates also high in both sectors. Interviews were conducted with 39 managers and workers across the two organisations. Interview responses and company documentation were compared by two researchers to the descriptor items for each MSD culture aspect. An assignment of the level of advancement, using a five stage framework, was made for each aspect. The tool was readily adapted to each implementation site context and provided sufficient evidence to assess their levels of advancement. Assessments for most MSD culture aspects were in the mid to upper levels of advancement, although the levels differed within each organisation, indicating that different aspects of MSD culture, as with safety culture, develop at a different pace within organisations. Areas for MSD culture improvement were identified for each organisation. Reflections are made on the use and merits of the tool by ergonomists for addressing MSD risk. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
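
    The patent describes the correction loop in prose; the sketch below is a highly simplified software analogue of that logic (the actual system operates in hardware on pulse trains). The step size, error threshold and example positions are invented for illustration.

    STEP = 0.001          # tool travel per command pulse (mm), assumed
    ERROR_LIMIT = 0.004   # preselected error magnitude (mm), assumed

    def corrected_pulses(commanded_pos, measured_pos, pending_pulses):
        """Add or delete pulses when the interferometer error exceeds the limit."""
        error = commanded_pos - measured_pos      # positive: tool lags, negative: tool leads
        if error > ERROR_LIMIT:
            return pending_pulses + round(error / STEP)          # add pulses to catch up
        if error < -ERROR_LIMIT:
            return max(0, pending_pulses + round(error / STEP))  # delete pulses
        return pending_pulses                     # within tolerance: no change

    # Example: the tool lags the commanded position by 6 micrometres.
    print(corrected_pulses(commanded_pos=10.000, measured_pos=9.994, pending_pulses=50))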

  9. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Tutorial (ST) is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean

  10. Understanding Statistics and Statistics Education: A Chinese Perspective

    ERIC Educational Resources Information Center

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  11. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. Availability and implementation: PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary information: Supplementary data are available at Bioinformatics online.
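
    To illustrate the kind of downstream statistics such a tool wraps in a graphical interface (this is not PANDA-view code), the sketch below runs per-protein t-tests on a small simulated quantitative proteomics matrix and derives the log2 fold-change and -log10 p-value pairs that a volcano plot displays.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical log2 intensities: 100 proteins, 3 control and 3 treated replicates.
    control = rng.normal(20.0, 1.0, size=(100, 3))
    treated = rng.normal(20.0, 1.0, size=(100, 3))
    treated[:10] += 2.0                      # make the first 10 proteins up-regulated

    log2_fc = treated.mean(axis=1) - control.mean(axis=1)
    t_stat, p_val = stats.ttest_ind(treated, control, axis=1)
    neg_log10_p = -np.log10(p_val)           # y-axis of a volcano plot

    # Simple significance call (no multiple-testing correction in this sketch).
    hits = np.where((np.abs(log2_fc) > 1.0) & (p_val < 0.05))[0]
    print("candidate differential proteins:", hits)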

  12. Infant Statistical-Learning Ability Is Related to Real-Time Language Processing

    ERIC Educational Resources Information Center

    Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf

    2018-01-01

    Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…

  13. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antcheva, I.; /CERN; Ballintijn, M.

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.

  14. BMPOS: a Flexible and User-Friendly Tool Sets for Microbiome Studies.

    PubMed

    Pylro, Victor S; Morais, Daniel K; de Oliveira, Francislon S; Dos Santos, Fausto G; Lemos, Leandro N; Oliveira, Guilherme; Roesch, Luiz F W

    2016-08-01

    Recent advances in science and technology are leading to a revision and re-orientation of methodologies, addressing old and current issues under a new perspective. Advances in next generation sequencing (NGS) are allowing comparative analysis of the abundance and diversity of whole microbial communities, generating a large amount of data and findings at a systems level. The current limitation for biologists has been the increasing demand for computational power and training required for processing of NGS data. Here, we describe the deployment of the Brazilian Microbiome Project Operating System (BMPOS), a flexible and user-friendly Linux distribution dedicated to microbiome studies. The Brazilian Microbiome Project (BMP) has developed data analysis pipelines for metagenomic studies (phylogenetic marker genes), conducted using the two main high-throughput sequencing platforms (Ion Torrent and Illumina MiSeq). The BMPOS is freely available and includes all the bioinformatics packages and databases required to run the pipelines suggested by the BMP team. The BMPOS may be used as a bootable live USB stick or installed on any computer with at least 1 GHz CPU and 512 MB RAM, independent of the operating system previously installed. The BMPOS has proved to be effective for sequence processing, sequence clustering, alignment, taxonomic annotation, statistical analysis, and plotting of metagenomic data. The BMPOS has been used during several metagenomic analysis courses, proving valuable as a training tool and an excellent starting point for anyone interested in performing metagenomic studies. The BMPOS and its documentation are available at http://www.brmicrobiome.org .

  15. Reliability of CBCT as an assessment tool for mandibular molars furcation defects

    NASA Astrophysics Data System (ADS)

    Marinescu, Adrian George; Boariu, Marius; Rusu, Darian; Stratul, Stefan-Ioan; Ogodescu, Alexandru

    2014-01-01

    Introduction. In numerous clinical situations it is not possible to have an exact clinical evaluation of the furcation defects. Recently the use of CBCT in periodontology has led to increased diagnostic precision. Aim. To determine the accuracy of CBCT as a diagnostic tool for furcation defects. Material and method. 19 patients with generalised advanced chronic periodontitis were included in this study, presenting a total of 25 lower molars with different degrees of furcation defects. Clinical and digital measurements (in mm) were performed on all the molars involved. The data obtained were compared and statistically analysed. Results. The analysis of primary data demonstrated that all the furcation grade II and III defects were revealed using the CBCT technique. Regarding the incipient defects (grade I Hamp < 3mm), the dimensions measured on CBCT images were slightly bigger. The results have shown that 84% of the defects detected by CBCT were confirmed by clinical measurements. These data are similar to those revealed by other studies [1]. Conclusions. The use of the CBCT technique in the evaluation and diagnosis of human mandibular furcation defects can provide much important information regarding the size and aspect of the interradicular defect, efficiently and noninvasively. The CBCT technique is more effective in detecting advanced furcation defects than incipient ones. However, the CBCT examination cannot replace, at least at this stage of development, the clinical measurements, especially the intraoperative ones, which are considered to represent the "gold standard" in this domain.

  16. Why Research Design and Methods Is So Crucial to Understanding Drug Use/Abuse: Introduction to the Special Issue.

    PubMed

    Scheier, Lawrence M

    2018-06-01

    The collection of articles in this special issue both raise the bar and inspire new thinking with regard to both design and methodology concerns that influence drug use/abuse research. Thematically speaking, the articles focus on issues related to missing data, response formats, strategies for data harmonization, propensity scoring methods as an alternative to randomized control trials, integrative data analysis, statistical corrections to reduce bias from attrition, challenges faced from conducting large-scale evaluations, and employing abductive theory of method as an alternative to the more traditional hypothetico-deductive reasoning. Collectively, these issues are of paramount importance as they provide specific means to improve our investigative tools and refine the logical framework we employ to examine the problem of drug use/abuse. Each of the authors addresses a specific challenge outlining how it affects our current research efforts and then outlines remedies that can advance the field. To their credit, they have included issues that affect both etiology and prevention, thus broadening our horizons as we learn more about developmental processes causally related to drug use/abuse and intervention strategies that can mitigate developmental vulnerability. This is the essential dialogue required to advance our intellectual tool kit and improve the research skills we bring to bear on the important questions facing the field of drug use/abuse. Ultimately, the goal is to increase our ability to identify the causes and consequences of drug use/abuse and find ways to ameliorate these problems as we engage the public health agenda.

  17. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goal of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called "residual" or "recursive" periodograms); (iv) easy calculation of parametric 2D likelihood function level contours reflecting the asymptotic confidence regions; (v) user-friendly fitting under some useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for determining statistical significance. It is planned that further functionality may be added to PlanetPack in the future. During the development of this software, a lot of effort was made to improve the computational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (standard of 1998/2003), and is expected to be compilable and usable on a wide range of platforms.
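
    Ability (i), the maximum-likelihood treatment of unknown RV jitter, amounts to adding a jitter term in quadrature to the quoted uncertainties inside a Gaussian log-likelihood. The sketch below shows that idea on a toy circular-orbit model; it is not PlanetPack's own C++ implementation, and all data and parameters are invented.

    import numpy as np

    def log_likelihood(params, t, rv, sigma):
        """Gaussian log-likelihood of a circular-orbit RV model with extra jitter."""
        amplitude, period, phase, offset, jitter = params
        model = offset + amplitude * np.sin(2 * np.pi * t / period + phase)
        var = sigma**2 + jitter**2                 # jitter added in quadrature
        return -0.5 * np.sum((rv - model) ** 2 / var + np.log(2 * np.pi * var))

    # Hypothetical RV data set (days, m/s) with underestimated error bars.
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 200, 40))
    sigma = np.full(40, 2.0)
    rv = 15 * np.sin(2 * np.pi * t / 37.0) + rng.normal(0, 4.0, 40)  # extra scatter acts as jitter

    print(log_likelihood((15.0, 37.0, 0.0, 0.0, 3.5), t, rv, sigma))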

  18. Advanced genetic tools for plant biotechnology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  19. Advanced genetic tools for plant biotechnology.

    PubMed

    Liu, Wusheng; Yuan, Joshua S; Stewart, C Neal

    2013-11-01

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  20. Ensemble Streamflow Forecast Improvements in NYC's Operations Support Tool

    NASA Astrophysics Data System (ADS)

    Wang, L.; Weiss, W. J.; Porter, J.; Schaake, J. C.; Day, G. N.; Sheer, D. P.

    2013-12-01

    Like most other water supply utilities, New York City's Department of Environmental Protection (DEP) has operational challenges associated with drought and wet weather events. During drought conditions, DEP must maintain water supply reliability to 9 million customers as well as meet environmental release requirements downstream of its reservoirs. During and after wet weather events, DEP must maintain turbidity compliance in its unfiltered Catskill and Delaware reservoir systems and minimize spills to mitigate downstream flooding. Proactive reservoir management - such as release restrictions to prepare for a drought or preventative drawdown in advance of a large storm - can alleviate negative impacts associated with extreme events. It is important for water managers to understand the risks associated with proactive operations so unintended consequences such as endangering water supply reliability with excessive drawdown prior to a storm event are minimized. Probabilistic hydrologic forecasts are a critical tool in quantifying these risks and allow water managers to make more informed operational decisions. DEP has recently completed development of an Operations Support Tool (OST) that integrates ensemble streamflow forecasts, real-time observations, and a reservoir system operations model into a user-friendly graphical interface that allows its water managers to take robust and defensible proactive measures in the face of challenging system conditions. Since initial development of OST was first presented at the 2011 AGU Fall Meeting, significant improvements have been made to the forecast system. First, the monthly AR1 forecasts ('Hirsch method') were upgraded with a generalized linear model (GLM) utilizing historical daily correlations ('Extended Hirsch method' or 'eHirsch'). The development of eHirsch forecasts improved predictive skill over the Hirsch method in the first week to a month from the forecast date and produced more realistic hydrographs on the tail end of high flow periods. These improvements allowed DEP to more effectively manage water quality control and spill mitigation operations immediately after storm events. Later on, post-processed hydrologic forecasts from the National Weather Service (NWS) including the Advanced Hydrologic Prediction Service (AHPS) and the Hydrologic Ensemble Forecast Service (HEFS) were implemented into OST. These forecasts further increased the predictive skill over the initial statistical models as current basin conditions (e.g. soil moisture, snowpack) and meteorological forecasts (with HEFS) are now explicitly represented. With the post-processed HEFS forecasts, DEP may now truly quantify impacts associated with wet weather events on the horizon, rather than relying on statistical representations of current hydrologic trends. This presentation will highlight the benefits of the improved forecasts using examples from actual system operations.
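
    The statistical forecasts referred to above are regression-based; a minimal flavour of an ensemble streamflow forecast can be sketched with a lag-1 autoregressive model on log flows, as below. The parameters and flows are invented, and this is not DEP's or NWS's forecast code.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical AR(1) model fitted to historical log-flows (parameters assumed).
    mu, phi, sigma = 5.0, 0.85, 0.30        # mean, lag-1 correlation, residual std
    current_log_flow = 5.6                   # today's observed log-flow

    n_members, horizon = 100, 14             # 100 ensemble traces, 14-day lead time
    traces = np.empty((n_members, horizon))
    state = np.full(n_members, current_log_flow)
    for day in range(horizon):
        state = mu + phi * (state - mu) + rng.normal(0.0, sigma, n_members)
        traces[:, day] = state

    flows = np.exp(traces)                   # back-transform to flow units
    p10, p50, p90 = np.percentile(flows[:, -1], [10, 50, 90])
    print(f"day-14 flow percentiles: P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")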

  1. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    NASA Astrophysics Data System (ADS)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginning of man's space flight activities there was a belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a significant influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on the future development of the GUI framework, where the potential for advancements will be shown.

  2. Advancement of Tools Supporting Improvement of Work Safety in Selected Industrial Company

    NASA Astrophysics Data System (ADS)

    Gembalska-Kwiecień, Anna

    2018-03-01

    This article considers the advancement of tools for improving work safety in the industrial company under study. Attention was paid to careful analysis of the working environment, which includes the available technologies, work organization and human capital. These factors determine the development of the best prevention activities to minimize the number of accidents.

  3. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in the hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore increase the importance of climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed. This is commonly done by means of ensemble approaches. As more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation-based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily temperature and PET observations at Uccle and a large ensemble of 160 global climate model runs (CMIP5). They cover all four representative concentration pathway-based greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g. for precipitation, by means of intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties for each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
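
    One family of the perturbation-based downscaling approaches mentioned above perturbs an observed series with quantile-dependent change factors derived from climate model runs. The sketch below applies made-up change factors to a hypothetical daily rainfall series and is only a schematic of that idea, not the Climate Perturbation Tool itself.

    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical observed daily rainfall (mm) for one season.
    observed = rng.gamma(shape=0.4, scale=8.0, size=90)

    # Hypothetical change factors from a climate model ensemble: wet extremes
    # intensify (up to +20%) while light rain decreases slightly (-5%).
    def change_factor(quantile):
        return 0.95 + 0.25 * quantile          # 0.95 at q=0 up to 1.20 at q=1

    wet = observed > 0.1
    quantiles = np.zeros_like(observed)
    quantiles[wet] = np.argsort(np.argsort(observed[wet])) / max(wet.sum() - 1, 1)

    perturbed = observed * np.where(wet, change_factor(quantiles), 1.0)
    print(f"observed 90th pct = {np.percentile(observed, 90):.1f} mm, "
          f"perturbed 90th pct = {np.percentile(perturbed, 90):.1f} mm")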

  4. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    ERIC Educational Resources Information Center

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  5. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  6. On the blind use of statistical tools in the analysis of globular cluster stars

    NASA Astrophysics Data System (ADS)

    D'Antona, Francesca; Caloi, Vittoria; Tailo, Marco

    2018-04-01

    As with most data analysis methods, the Bayesian method must be handled with care. We show that its application to determine stellar evolution parameters within globular clusters can lead to paradoxical results if used without the necessary precautions. This is a cautionary tale on the use of statistical tools for big data analysis.

  7. FUn: a framework for interactive visualizations of large, high-dimensional datasets on the web.

    PubMed

    Probst, Daniel; Reymond, Jean-Louis

    2018-04-15

    During the past decade, big data have become a major tool in scientific endeavors. Although statistical methods and algorithms are well-suited for analyzing and summarizing enormous amounts of data, the results do not allow for a visual inspection of the entire dataset. Current scientific software, including R packages and Python libraries such as ggplot2, matplotlib and plot.ly, does not support interactive visualizations of datasets exceeding 100 000 data points on the web. Other solutions enable the web-based visualization of big data only through data reduction or statistical representations. However, recent hardware developments, especially advancements in graphical processing units, allow for the rendering of millions of data points on a wide range of consumer hardware such as laptops, tablets and mobile phones. Similar to the challenges and opportunities brought to virtually every scientific field by big data, both the visualization of and interaction with copious amounts of data are demanding and hold great promise. Here we present FUn, a framework consisting of a client (Faerun) and server (Underdark) module, facilitating the creation of web-based, interactive 3D visualizations of large datasets, enabling record-level visual inspection. We also introduce a reference implementation providing access to SureChEMBL, a database containing patent information on more than 17 million chemical compounds. The source code and the most recent builds of Faerun and Underdark, Lore.js and the data preprocessing toolchain used in the reference implementation, are available on the project website (http://doc.gdb.tools/fun/). daniel.probst@dcb.unibe.ch or jean-louis.reymond@dcb.unibe.ch.

  8. Contrast enhanced dual energy spectral mammogram, an emerging addendum in breast imaging

    PubMed Central

    Gnanaprakasam, Francis; Anand, Subhapradha; Krishnaswami, Murali; Ramachandran, Madan

    2016-01-01

    Objective: To assess the role of contrast-enhanced dual-energy spectral mammogram (CEDM) as a problem-solving tool in equivocal cases. Methods: 44 consenting females with equivocal findings on full-field digital mammogram underwent CEDM. All the images were interpreted by two radiologists independently. Confidence of presence was plotted on a three-point Likert scale and probability of cancer was assigned on Breast Imaging Reporting and Data System scoring. Histopathology was taken as the gold standard. Statistical analyses of all variables were performed. Results: 44 breast lesions were included in the study, among which 77.3% of lesions were malignant or precancerous and 22.7% were benign or inconclusive. 20% of lesions were identified only on CEDM. The true extent of the lesion was made out in 15.9% of cases, multifocality was established in 9.1% of cases and ductal extension was demonstrated in 6.8% of cases. CEDM findings were statistically significant (p < 0.05), and the interobserver kappa value was 0.837. Conclusion: CEDM has a useful role in identifying occult lesions in dense breasts and in triaging lesions. In a mammographically visible lesion, CEDM characterizes the lesion, affirms the finding and better demonstrates response to treatment. Hence, we conclude that CEDM is a useful complementary tool to standard mammogram. Advances in knowledge: CEDM can detect and demonstrate lesions even in dense breasts, with the advantage of feasibility of stereotactic biopsy in the same setting. Hence, it has the potential to be a screening modality, with need for further studies and validation. PMID:27610475
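
    The interobserver agreement reported above (kappa = 0.837) is a Cohen's kappa of the kind that can be computed from two readers' categorical scores; the sketch below uses made-up reader scores and assumes scikit-learn is available.

```python
# Cohen's kappa between two readers' categorical assessments of the same lesions.
from sklearn.metrics import cohen_kappa_score

reader1 = [5, 4, 2, 5, 3, 4, 5, 2, 4, 5]   # hypothetical BI-RADS categories, reader 1
reader2 = [5, 4, 2, 4, 3, 4, 5, 2, 4, 5]   # hypothetical BI-RADS categories, reader 2

print("Cohen's kappa: %.3f" % cohen_kappa_score(reader1, reader2))
```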

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.A. Krommes

    Fusion physics poses an extremely challenging, practically complex problem that does not yield readily to simple paradigms. Nevertheless, various of the theoretical tools and conceptual advances emphasized at the KaufmanFest 2007 have motivated and/or found application to the development of fusion-related plasma turbulence theory. A brief historical commentary is given on some aspects of that specialty, with emphasis on the role (and limitations) of Hamiltonian/symplectic approaches, variational methods, oscillation-center theory, and nonlinear dynamics. It is shown how to extract a renormalized ponderomotive force from the statistical equations of plasma turbulence, and the possibility of a renormalized K-χ theorem is discussed. An unusual application of quasilinear theory to the problem of plasma equilibria in the presence of stochastic magnetic fields is described. The modern problem of zonal-flow dynamics illustrates a confluence of several techniques, including (i) the application of nonlinear-dynamics methods, especially center-manifold theory, to the problem of the transition to plasma turbulence in the face of self-generated zonal flows; and (ii) the use of Hamiltonian formalism to determine the appropriate (Casimir) invariant to be used in a novel wave-kinetic analysis of systems of interacting zonal flows and drift waves. Recent progress in the theory of intermittent chaotic statistics and the generation of coherent structures from turbulence is mentioned, and an appeal is made for some new tools to cope with these interesting and difficult problems in nonlinear plasma physics. Finally, the important influence of the intellectually stimulating research environment fostered by Prof. Allan Kaufman on the author's thinking and teaching methodology is described.

  10. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
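
    The paper's method is not reproduced here, but a textbook relationship commonly used in statistical software testing connects the number of failure-free test demands to a confidence bound on the per-demand failure probability; the sketch below illustrates that relationship only, under the assumption of independent, identically distributed demands.

```python
# After n independent, failure-free test demands, the (1 - alpha) upper confidence
# bound on the per-demand failure probability p is  p_upper = 1 - alpha**(1/n).
# Conversely, demonstrating p <= p_target at confidence (1 - alpha) requires at least
# ln(alpha) / ln(1 - p_target) failure-free demands.
import math

def upper_bound(n_tests, alpha=0.05):
    """Upper confidence bound on failure probability after n_tests failure-free demands."""
    return 1.0 - alpha ** (1.0 / n_tests)

def tests_required(p_target, alpha=0.05):
    """Number of failure-free demands needed to demonstrate p <= p_target."""
    return math.ceil(math.log(alpha) / math.log(1.0 - p_target))

print("p upper bound after 3000 failure-free demands: %.2e" % upper_bound(3000))
print("demands needed to show p <= 1e-3 at 95%% confidence: %d" % tests_required(1e-3))
```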

  11. Quantitative approaches in climate change ecology

    PubMed Central

    Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
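
    One simple way to respect temporal autocorrelation (item (v) above) when testing a climate-ecology relationship is to shrink the effective sample size using the lag-1 autocorrelations of both series before computing the p-value; the sketch below uses synthetic data and a Pyper-and-Peterman-style adjustment, which is an illustrative choice rather than the approach prescribed by the review.

```python
# Compare a naive correlation p-value with one based on an autocorrelation-adjusted
# effective sample size, n_eff = n * (1 - r1x*r1y) / (1 + r1x*r1y).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 60
climate = np.cumsum(rng.normal(size=n)) * 0.3 + rng.normal(size=n)   # autocorrelated "climate index"
ecology = 0.4 * climate + rng.normal(size=n)                         # noisy ecological response

def lag1(x):
    x = x - x.mean()
    return np.sum(x[:-1] * x[1:]) / np.sum(x * x)

r, p_naive = stats.pearsonr(climate, ecology)
n_eff = n * (1 - lag1(climate) * lag1(ecology)) / (1 + lag1(climate) * lag1(ecology))
t = r * np.sqrt((n_eff - 2) / (1 - r**2))
p_adj = 2 * stats.t.sf(abs(t), df=n_eff - 2)

print("r = %.2f, naive p = %.4f, adjusted p = %.4f (n_eff = %.1f)" % (r, p_naive, p_adj, n_eff))
```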

  12. High resolution probabilistic precipitation forecast over Spain combining the statistical downscaling tool PROMETEO and the AEMET short range EPS system (AEMET/SREPS)

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Santos, C.; Garcia-Moya, J. A.; Gutierrez, J. M.; Orfila, B.

    2009-04-01

    The Short-Range Ensemble Prediction System (SREPS) is a multi-LAM (UM, HIRLAM, MM5, LM and HRM) multi analysis/boundary conditions (ECMWF, UKMetOffice, DWD and GFS) run twice a day by AEMET (72 hours lead time) over a European domain, with a total of 5 (LAMs) x 4 (GCMs) = 20 members. One of the main goals of this project is analyzing the impact of models and boundary conditions in the short-range high-resolution forecasted precipitation. A previous validation of this method has been done considering a set of climate networks in Spain, France and Germany, by interpolating the prediction to the gauge locations (SREPS, 2008). In this work we compare these results with those obtained by using a statistical downscaling method to post-process the global predictions, obtaining an "advanced interpolation" for the local precipitation using climate network precipitation observations. In particular, we apply the PROMETEO downscaling system based on analogs and compare the SREPS ensemble of 20 members with the PROMETEO statistical ensemble of 5 (analog ensemble) x 4 (GCMs) = 20 members. Moreover, we will also compare the performance of a combined approach post-processing the SREPS outputs using the PROMETEO system. References: SREPS 2008. 2008 EWGLAM-SRNWP Meeting (http://www.aemet.es/documentos/va/divulgacion/conferencias/prediccion/Ewglam/PRED_CSantos.pdf)
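
    The sketch below illustrates the basic analog idea behind PROMETEO-type downscaling rather than the operational implementation: for each forecast field, the nearest historical predictor patterns are located and their co-located precipitation observations averaged. All data, the Euclidean distance metric and the choice of k are illustrative assumptions.

```python
# Analog downscaling sketch: find the k most similar historical large-scale patterns
# and use the local precipitation observed on those days.
import numpy as np

rng = np.random.default_rng(2)
n_hist, n_fcst, n_grid = 5000, 10, 50
hist_fields = rng.normal(size=(n_hist, n_grid))     # stand-in for historical predictor fields
hist_precip = rng.gamma(0.5, 4.0, size=n_hist)      # co-located local precipitation observations
fcst_fields = rng.normal(size=(n_fcst, n_grid))     # stand-in for one ensemble member's forecasts

def analog_downscale(forecast, library_fields, library_precip, k=5):
    """Average the local observations of the k closest historical analogs."""
    d = np.linalg.norm(library_fields - forecast, axis=1)   # Euclidean distance in predictor space
    nearest = np.argsort(d)[:k]
    return library_precip[nearest].mean()

local_forecast = [analog_downscale(f, hist_fields, hist_precip) for f in fcst_fields]
print(np.round(local_forecast, 2))
```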

  13. Statistical prediction of September Arctic Sea Ice minimum based on stable teleconnections with global climate and oceanic patterns

    NASA Astrophysics Data System (ADS)

    Ionita, M.; Grosfeld, K.; Scholz, P.; Lohmann, G.

    2016-12-01

    Sea ice in both polar regions is an important indicator of global climate change and its polar amplification, and there is consequently broad interest in information on sea ice coverage, variability and long-term change. Knowledge on sea ice requires high-quality data on ice extent, thickness and dynamics. However, its predictability depends on various climate parameters and conditions. In order to provide insights into the potential development of a monthly/seasonal signal, we developed a robust statistical model based on ocean heat content, sea surface temperature and atmospheric variables to estimate the September minimum sea ice extent for every year. Although previous statistical attempts at monthly/seasonal forecasts of the September sea ice minimum have shown relatively limited skill, here it is shown that more than 97% of the variability in September sea ice extent (r = 0.98) can be predicted three months in advance from the previous months' conditions via a multiple linear regression model based on global sea surface temperature (SST), mean sea level pressure (SLP), air temperature at 850 hPa (TT850), surface winds and sea ice extent persistence. The statistical model is based on the identification of regions with stable teleconnections between the predictors (climatological parameters) and the predictand (here sea ice extent). The results based on our statistical model contribute to the sea ice prediction network for the sea ice outlook report (https://www.arcus.org/sipn) and could provide a tool for identifying relevant regions and climate parameters that are important for sea ice development in the Arctic, and for detecting sensitive and critical regions in global coupled climate models with a focus on sea ice formation.
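
    A multiple linear regression of this kind can be sketched as follows; the predictor indices, coefficients and train/hindcast split are synthetic placeholders, not the teleconnection regions or skill of the actual model.

```python
# Regress September sea-ice extent anomalies on earlier-season predictor indices.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_years = 37
X = np.column_stack([
    rng.normal(size=n_years),    # June SST index (placeholder)
    rng.normal(size=n_years),    # June SLP index (placeholder)
    rng.normal(size=n_years),    # June TT850 index (placeholder)
    rng.normal(size=n_years),    # June sea-ice extent (persistence, placeholder)
])
y = X @ [-0.5, 0.3, -0.4, 0.8] + rng.normal(scale=0.3, size=n_years)   # synthetic September anomaly

model = LinearRegression().fit(X[:-7], y[:-7])           # train on the earlier years
r = np.corrcoef(model.predict(X[-7:]), y[-7:])[0, 1]     # correlation over the last 7 "hindcast" years
print("hindcast correlation r = %.2f" % r)
```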

  14. A New Scoring System to Predict the Risk for High-risk Adenoma and Comparison of Existing Risk Calculators.

    PubMed

    Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J

    2017-04-01

    Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk, 35% of Americans are not up to date with screening, and the incidence of CRC in younger patients is growing. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, using an expanded definition of high-risk polyps (≥3 nonadvanced adenomas) to identify patients at higher-than-average risk. We also compared the results with previously created calculators. Patients aged 40 to 59 years undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of the concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistic results, although one performed similarly. Our model compares favorably to previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. The model also reports absolute predictive probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
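
    The c-statistic quoted above is the area under the ROC curve of the risk model. The sketch below, with entirely synthetic patients and made-up coefficients, shows how such a calculator and its c-statistic are typically obtained from age, body mass index, sex and smoking history.

```python
# Fit a logistic-regression risk calculator and report its c-statistic (ROC AUC).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 5000
age   = rng.uniform(40, 59, n)
bmi   = rng.normal(28, 5, n)
male  = rng.integers(0, 2, n)
smoke = rng.integers(0, 2, n)
logit = -7.0 + 0.05 * age + 0.03 * bmi + 0.4 * male + 0.5 * smoke      # made-up coefficients
high_risk_adenoma = rng.random(n) < 1 / (1 + np.exp(-logit))           # synthetic outcome

X = np.column_stack([age, bmi, male, smoke])
X_tr, X_te, y_tr, y_te = train_test_split(X, high_risk_adenoma, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("c-statistic: %.3f" % roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```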

  15. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.

  16. Current application of chemometrics in traditional Chinese herbal medicine research.

    PubMed

    Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke

    2016-07-15

    Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric techniques are highly useful tools for the quality control of TCHMs, harnessing mathematics, statistics and other methods to extract the maximum information from data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. In this work, the application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Quantifying Human Response: Linking metrological and psychometric characterisations of Man as a Measurement Instrument

    NASA Astrophysics Data System (ADS)

    Pendrill, L. R.; Fisher, William P., Jr.

    2013-09-01

    A better understanding of how to characterise human response is essential to improved person-centred care and other situations where human factors are crucial. Challenges to introducing classical metrological concepts such as measurement uncertainty and traceability when characterising Man as a Measurement Instrument include the failure of many statistical tools when applied to ordinal measurement scales and a lack of metrological references in, for instance, healthcare. The present work attempts to link metrological and psychometric (Rasch) characterisation of Man as a Measurement Instrument in a study of elementary tasks, such as counting dots, where one knows independently the expected value because the measurement object (collection of dots) is prepared in advance. The analysis is compared and contrasted with recent approaches to this problem by others, for instance using signal error fidelity.

  18. Why Can't We Resolve Recruitment?

    NASA Astrophysics Data System (ADS)

    Ferreira, S. A.; Payne, M. R.; Hátún, H.; MacKenzie, B. R.; Butenschön, M.; Visser, A. W.

    2016-02-01

    Over the last century, Johan Hjort's work has led to significant advances in explaining anomalous year-classes within fisheries science. However, distinguishing between the competing mechanisms of year-class regulation (e.g., food conditions, predation, transport) has proved challenging. We use blue whiting (Micromesistius poutassou) in the North-east Atlantic Ocean as a case study: during the late 1990s and early 2000s this stock generated year-classes up to nearly an order of magnitude larger than those seen before or after, yet no models presently exist that can quantify past variations in recruitment for this stock. Using modern stock-statistical and observational tools, we catalog a range of environmentally driven hypotheses relevant for recruitment of blue whiting, including physical and biogeographic conditions, phenology, parental effects and predation. We have run analyses to test some of these hypotheses, and results will be presented at the session.

  19. Forecasting techno-social systems: how physics and computing help to fight off global pandemics

    NASA Astrophysics Data System (ADS)

    Vespignani, Alessandro

    2010-03-01

    The crucial issue when planning adequate public health interventions to mitigate the spread and impact of epidemics is risk evaluation and forecast, which amounts to anticipating where, when and how strongly an epidemic will strike. In the last decade, advances in computer performance, data acquisition, statistical physics and complex networks theory have made it possible to run sophisticated simulations on supercomputer infrastructures that anticipate the spreading pattern of a pandemic. For the first time we are in the position of generating real-time forecasts of epidemic spreading. I will review the history of the current H1N1 pandemic, the major road-blocks the community has faced in its containment and mitigation, and how physics and computing provide predictive tools that help us to battle epidemics.

  20. The ImageJ ecosystem: an open platform for biomedical image analysis

    PubMed Central

    Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large–but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  1. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.

  2. Tools and Techniques for Basin-Scale Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.

    2012-12-01

    The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies that explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies benefit most from the application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components. Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov chain techniques. Resampling can also be conditioned on climate change projections, e.g., downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. Resulting data are imported directly into the decision model. Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executed changes in operations or other options. The over-arching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, infrastructure and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable outputs are typically direct model outputs or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios, and set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
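
    The Hydrology Simulator's resampling step can be pictured with the following k-nearest-neighbour bootstrap sketch; the flow data, the number of neighbours and the 1/j weighting kernel are illustrative assumptions, not Reclamation's implementation.

```python
# K-nearest-neighbour resampling of annual flows: each new year is drawn as the
# historical successor of one of the k nearest neighbours of the current value.
import numpy as np

rng = np.random.default_rng(5)
hist_flows = rng.gamma(4.0, 2.5, size=100)            # stand-in for ~100 years of annual flows

def knn_trace(history, length, k=5):
    trace = [rng.choice(history)]
    weights = 1.0 / np.arange(1, k + 1)               # 1/j kernel: closer neighbours more likely
    weights /= weights.sum()
    for _ in range(length - 1):
        d = np.abs(history[:-1] - trace[-1])          # distance to every year that has a successor
        neighbours = np.argsort(d)[:k]
        pick = rng.choice(neighbours, p=weights)
        trace.append(history[pick + 1])               # successor of the chosen analog year
    return np.array(trace)

synthetic = knn_trace(hist_flows, length=50)
print("historical mean %.2f vs synthetic mean %.2f" % (hist_flows.mean(), synthetic.mean()))
```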

  3. Learning Outcomes in a Laboratory Environment vs. Classroom for Statistics Instruction: An Alternative Approach Using Statistical Software

    ERIC Educational Resources Information Center

    McCulloch, Ryan Sterling

    2017-01-01

    The role of any statistics course is to increase the understanding and comprehension of statistical concepts and those goals can be achieved via both theoretical instruction and statistical software training. However, many introductory courses either forego advanced software usage, or leave its use to the student as a peripheral activity. The…

  4. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    ERIC Educational Resources Information Center

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  5. Using the Signal Tools and Statistical Tools to Redefine the 24 Solar Terms in Peasant Calendar by Analyzing Surface Temperature and Precipitation

    NASA Astrophysics Data System (ADS)

    Huang, J. Y.; Tung, C. P.

    2017-12-01

    The "Peasant Calendar" is an important book in Chinese society. It is originally based on the orbit of the Sun, and each year is divided into 24 solar terms, each with its own special meaning. For example, "Spring Begins" marks the end of winter and the beginning of spring. In Taiwan, the 24 solar terms play an important role in agriculture because farmers use the Peasant Calendar to decide when to sow. However, each solar term is currently fixed at about 15 days. This approach neither captures the temporal variability of the climate nor truly reflects the regional climate characteristics of different areas; the number of days in each solar term should be more flexible. Since weather is associated with climate, all weather phenomena can be regarded as a multiple-fluctuation signal. In this research, 30 years of observation data of surface temperature and precipitation from 1976 to 2016 are used. The data are cut into different time series, such as a week, a month, and six months to one year. Signal analysis tools such as wavelets, change-point analysis and the Fourier transform are used to determine the length of each solar term. After determining the days of each solar term, statistical tests are used to find the relationships between the length of the solar terms and large-scale climate variability (e.g., ENSO and PDO). For example, one of the solar terms, called "Major Heat", should typically last more than 20 days in Taiwan due to global warming and the heat island effect. An improved Peasant Calendar can help farmers make better decisions, control the crop schedule and use farmland more efficiently. For instance, warmer conditions accelerate the accumulation of accumulated temperature, which is key to the crop's growth stages. The results can also be used for disaster reduction (e.g., preventing agricultural damage) and water resources projects.
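
    A minimal change-point criterion of the kind alluded to above can be written as a search for the split that minimizes the within-segment sum of squares; the daily temperatures below are synthetic and the method is a generic illustration, not the exact signal-analysis chain used in the study.

```python
# Single change-point detection: place the solar-term boundary at the split that
# minimizes the within-segment sum of squared deviations.
import numpy as np

rng = np.random.default_rng(6)
temps = np.concatenate([rng.normal(18, 1.5, 22), rng.normal(24, 1.5, 18)])   # cool spell, then warm spell

def best_split(x, min_len=5):
    def sse(seg):
        return np.sum((seg - seg.mean()) ** 2)
    candidates = range(min_len, len(x) - min_len)
    return min(candidates, key=lambda i: sse(x[:i]) + sse(x[i:]))

i = best_split(temps)
print("change point after day %d (segment means %.1f / %.1f degC)"
      % (i, temps[:i].mean(), temps[i:].mean()))
```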

  6. Spinoff 2009

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Topics covered include: Image-Capture Devices Extend Medicine's Reach; Medical Devices Assess, Treat Balance Disorders; NASA Bioreactors Advance Disease Treatments; Robotics Algorithms Provide Nutritional Guidelines; "Anti-Gravity" Treadmills Speed Rehabilitation; Crew Management Processes Revitalize Patient Care; Hubble Systems Optimize Hospital Schedules; Web-based Programs Assess Cognitive Fitness; Electrolyte Concentrates Treat Dehydration; Tools Lighten Designs, Maintain Structural Integrity; Insulating Foams Save Money, Increase Safety; Polyimide Resins Resist Extreme Temperatures; Sensors Locate Radio Interference; Surface Operations Systems Improve Airport Efficiency; Nontoxic Resins Advance Aerospace Manufacturing; Sensors Provide Early Warning of Biological Threats; Robot Saves Soldier's Lives Overseas (MarcBot); Apollo-Era Life Raft Saves Hundreds of Sailors; Circuits Enhance Scientific Instruments and Safety Devices; Tough Textiles Protect Payloads and Public Safety Officers; Forecasting Tools Point to Fishing Hotspots; Air Purifiers Eliminate Pathogens, Preserve Food; Fabrics Protect Sensitive Skin from UV Rays; Phase Change Fabrics Control Temperature; Tiny Devices Project Sharp, Colorful Images; Star-Mapping Tools Enable Tracking of Endangered Animals; Nanofiber Filters Eliminate Contaminants; Modeling Innovations Advance Wind Energy Industry; Thermal Insulation Strips Conserve Energy; Satellite Respondent Buoys Identify Ocean Debris; Mobile Instruments Measure Atmospheric Pollutants; Cloud Imagers Offer New Details on Earth's Health; Antennas Lower Cost of Satellite Access; Feature Detection Systems Enhance Satellite Imagery; Chlorophyll Meters Aid Plant Nutrient Management; Telemetry Boards Interpret Rocket, Airplane Engine Data; Programs Automate Complex Operations Monitoring; Software Tools Streamline Project Management; Modeling Languages Refine Vehicle Design; Radio Relays Improve Wireless Products; Advanced Sensors Boost Optical Communication, Imaging; Tensile Fabrics Enhance Architecture Around the World; Robust Light Filters Support Powerful Imaging Devices; Thermoelectric Devices Cool, Power Electronics; Innovative Tools Advance Revolutionary Weld Technique; Methods Reduce Cost, Enhance Quality of Nanotubes; Gauging Systems Monitor Cryogenic Liquids; Voltage Sensors Monitor Harmful Static; and Compact Instruments Measure Heat Potential.

  7. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward-calculated the initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
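
    The head-to-head plan comparisons reduce to t-tests on paired DVH indicators; the sketch below assumes a paired reading of that comparison and uses made-up parotid mean doses, so the numbers carry no clinical meaning.

```python
# Paired t-test on a DVH indicator (e.g., parotid mean dose in Gy) extracted from
# two plan types on the same verification scans.
from scipy import stats

dose_verification = [26.1, 24.8, 27.3, 25.5, 26.9, 24.2, 25.8, 26.4]   # hypothetical V-plan doses
dose_contour_prop = [26.0, 25.1, 27.0, 25.7, 26.6, 24.5, 25.9, 26.2]   # hypothetical CP-plan doses

t, p = stats.ttest_rel(dose_verification, dose_contour_prop)
print("paired t = %.2f, p = %.3f" % (t, p))
```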

  8. PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.

    PubMed

    Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier

    2017-11-20

    Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. With its increased popularity, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either due to limited functionality or the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web-service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable to analyze a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.

  9. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  10. Thermal protection system (TPS) monitoring using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hurley, D. A.; Huston, D. R.; Fletcher, D. G.; Owens, W. P.

    2011-04-01

    This project investigates acoustic emission (AE) as a tool for monitoring the degradation of thermal protection systems (TPS). The AE sensors are part of an array of instrumentation on an inductively coupled plasma (ICP) torch designed for testing advanced thermal protection aerospace materials used for hypervelocity vehicles. AE are generated by stresses within the material, propagate as elastic stress waves, and can be detected with sensitive instrumentation. Graphite (POCO DFP-2) is used to study gas-surface interaction during degradation of thermal protection materials. The plasma is produced by a RF magnetic field driven by a 30kW power supply at 3.5 MHz, which creates a noisy environment with large spikes when powered on or off. AE are waveguided from source to sensor by a liquid-cooled copper probe used to position the graphite sample in the plasma stream. Preliminary testing was used to set filters and thresholds on the AE detection system (Physical Acoustics PCI-2) to minimize the impact of considerable operating noise. Testing results show good correlation between AE data and testing environment, which dictates the physics and chemistry of the thermal breakdown of the sample. Current efforts for the project are expanding the dataset and developing statistical analysis tools. This study shows the potential of AE as a powerful tool for analysis of thermal protection material thermal degradations with the unique capability of real-time, in-situ monitoring.

  11. A Practical Framework Toward Prediction of Breaking Force and Disintegration of Tablet Formulations Using Machine Learning Tools.

    PubMed

    Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert

    2017-01-01

    Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving the efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
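
    A hedged sketch of the modelling step described above: predicting breaking force from formulation and process variables with a tree ensemble. The feature names, the synthetic data-generating formula and the choice of a random forest are assumptions for illustration, not the authors' exact model.

```python
# Predict tablet breaking force from formulation/process variables and report
# cross-validated R^2 on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 200
api_fraction     = rng.uniform(0.05, 0.40, n)     # drug loading (illustrative)
compression_kN   = rng.uniform(5, 25, n)          # main compression force (illustrative)
porosity         = rng.uniform(0.05, 0.25, n)
ultrasonic_speed = 2000 + 3000 * (0.3 - porosity) + rng.normal(0, 50, n)   # nondestructive measurement

X = np.column_stack([api_fraction, compression_kN, porosity, ultrasonic_speed])
breaking_force = 40 * compression_kN * (1 - porosity) / (1 + 2 * api_fraction) + rng.normal(0, 10, n)

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("cross-validated R^2: %.2f" % cross_val_score(model, X, breaking_force, cv=5, scoring="r2").mean())
```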

  12. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.

  13. Therapeutic Gene Editing Safety and Specificity.

    PubMed

    Lux, Christopher T; Scharenberg, Andrew M

    2017-10-01

    Therapeutic gene editing is significant for medical advancement. Safety is intricately linked to the specificity of the editing tools used to cut at precise genomic targets. Improvements can be achieved by thoughtful design of nucleases and repair templates, analysis of off-target editing, and careful utilization of viral vectors. Advancements in DNA repair mechanisms and development of new generations of tools improve targeting of specific sequences while minimizing risks. It is important to plot a safe course for future clinical trials. This article reviews safety and specificity for therapeutic gene editing to spur dialogue and advancement. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Using Statistical Natural Language Processing for Understanding Complex Responses to Free-Response Tasks

    ERIC Educational Resources Information Center

    DeMark, Sarah F.; Behrens, John T.

    2004-01-01

    Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…

  15. MASPECTRAS: a platform for management and analysis of proteomics LC-MS/MS data

    PubMed Central

    Hartler, Jürgen; Thallinger, Gerhard G; Stocker, Gernot; Sturn, Alexander; Burkard, Thomas R; Körner, Erik; Rader, Robert; Schmidt, Andreas; Mechtler, Karl; Trajanoski, Zlatko

    2007-01-01

    Background: The advancements of proteomics technologies have led to a rapid increase in the number, size and rate at which datasets are generated. Managing and extracting valuable information from such datasets requires the use of data management platforms and computational approaches. Results: We have developed the MAss SPECTRometry Analysis System (MASPECTRAS), a platform for management and analysis of proteomics LC-MS/MS data. MASPECTRAS is based on the Proteome Experimental Data Repository (PEDRo) relational database schema and follows the guidelines of the Proteomics Standards Initiative (PSI). Analysis modules include: 1) import and parsing of the results from the search engines SEQUEST, Mascot, Spectrum Mill, X! Tandem, and OMSSA; 2) peptide validation; 3) clustering of proteins based on Markov Clustering and multiple alignments; and 4) quantification using the Automated Statistical Analysis of Protein Abundance Ratios algorithm (ASAPRatio). The system provides customizable data retrieval and visualization tools, as well as export to the PRoteomics IDEntifications public repository (PRIDE). MASPECTRAS is freely available. Conclusion: Given the unique features and the flexibility due to the use of standard software technology, our platform represents a significant advance and could be of great interest to the proteomics community. PMID:17567892

  16. Precalculus teachers' perspectives on using graphing calculators: an example from one curriculum

    NASA Astrophysics Data System (ADS)

    Karadeniz, Ilyas; Thompson, Denisse R.

    2018-01-01

    Graphing calculators are hand-held technological tools currently used in mathematics classrooms. Teachers' perspectives on using graphing calculators are important in terms of exploring what teachers think about using such technology in advanced mathematics courses, particularly precalculus courses. A descriptive intrinsic case study was conducted to analyse the perspectives of 11 teachers using graphing calculators with potential Computer Algebra System (CAS) capability while teaching Functions, Statistics, and Trigonometry, a precalculus course for 11th-grade students developed by the University of Chicago School Mathematics Project. Data were collected from multiple sources as part of a curriculum evaluation study conducted during the 2007-2008 school year. Although all teachers were using the same curriculum that integrated CAS into the instructional materials, teachers had mixed views about the technology. Graphing calculator features were used much more than CAS features, with many teachers concerned about the use of CAS because of pressures from external assessments. In addition, several teachers found it overwhelming to learn a new technology at the same time they were learning a new curriculum. The results have implications for curriculum developers and others working with teachers to update curriculum and the use of advanced technologies simultaneously.

  17. Capacity Building to Support Governmental Meteorological and Agricultural Communities in East Africa

    NASA Astrophysics Data System (ADS)

    Granger, S. L.; Macharia, D.; Das, N.; Andreadis, K.; Ines, A.

    2016-12-01

    There is a recognized need for data to support decision-making and planning in East Africa, where people and national economies depend on rain-fed agriculture and are vulnerable to a changing climate and extreme weather events. However, the capacity to use existing global data stores and to transition promising tools is a gap that severely limits the use and adoption of these data and tools. Although most people think of capacity building as simply training, it is really much more than that, and has been more thoroughly described in the public health community as "the process of developing and strengthening the skills, instincts, abilities, processes and resources that organizations and communities need to survive, adapt, and thrive in the fast-changing world." Data and tools from NASA and other providers are often not used as they could be, for both technical and institutional reasons. On the technical side, there is the perception that global data stores are impenetrable, requiring special expertise to access; even if the data can be accessed, the technical expertise to understand and use the data and tools may be lacking, and there can be a mismatch between science data and existing user tools. On the institutional side, it may be perceived that remote sensing data and tools are too "expensive", support from upper management may be non-existent due to limited resources or lack of interest, and there can be a lack of appreciation of data and statistics in decision making. How do we overcome some of these barriers to advance the use of remote sensing for applications and to ease the transition of data and tools to stakeholders? Experience from recent capacity building efforts in East Africa in support of a NASA-SERVIR Applied Science Project to provide estimates of hydrologic extremes tied to crop yield will be discussed.

  18. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, the diagnosis of such an imbalance is essential so that the statistical analysis can be adjusted if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for pre-selection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
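
    The diagnostic itself can be sketched in a few lines: fit a propensity-score model for trial arm from the baseline covariates and read off its c-statistic, with values near 0.5 indicating balance. The data below are synthetic individual-level records, and the clustering structure is ignored for brevity.

```python
# Propensity-score c-statistic as a global baseline-imbalance diagnostic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 1000
arm = rng.integers(0, 2, n)                                     # randomized arm
x_balanced   = rng.normal(size=(n, 4))                          # covariates unrelated to arm
x_unbalanced = rng.normal(loc=0.3 * arm[:, None], size=(n, 2))  # covariates shifted in one arm
X = np.column_stack([x_balanced, x_unbalanced])

ps_model = LogisticRegression(max_iter=1000).fit(X, arm)
c = roc_auc_score(arm, ps_model.predict_proba(X)[:, 1])
print("propensity-score c-statistic: %.3f" % c)                 # ~0.5 would indicate balance
```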

  19. Advanced MR Imaging of the Placenta: Exploring the in utero placenta-brain connection

    PubMed Central

    Andescavage, Nickie Niforatos; DuPlessis, Adre; Limperopoulos, Catherine

    2015-01-01

    The placenta is a vital organ necessary for the healthy neurodevelopment of the fetus. Despite the known associations between placental dysfunction and neurologic impairment, there is a paucity of tools available to reliably assess in vivo placental health and function. Existing clinical tools for placental assessment remain insensitive in predicting and assessing placental well-being. Advanced MRI techniques hold significant promise for the dynamic, non-invasive, real-time assessment of placental health and identification of early placental-based disorders. In this review, we summarize the available clinical tools for placental assessment including ultrasound, Doppler, and conventional MRI. We then explore the emerging role of advanced placental MR imaging techniques for supporting the developing fetus, appraise the strengths and limitations of quantitative MRI in identifying early markers of placental dysfunction for improved pregnancy monitoring and fetal outcomes. PMID:25765905

  20. Effects of an Approach Spacing Flight Deck Tool on Pilot Eyescan

    DOT National Transportation Integrated Search

    2004-02-01

    An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcas...

  1. A review of genome-wide approaches to study the genetic basis for spermatogenic defects.

    PubMed

    Aston, Kenneth I; Conrad, Donald F

    2013-01-01

    Rapidly advancing tools for genetic analysis on a genome-wide scale have been instrumental in identifying the genetic bases for many complex diseases. About half of male infertility cases are of unknown etiology in spite of tremendous efforts to characterize the genetic basis for the disorder. Advancing our understanding of the genetic basis for male infertility will require the application of established and emerging genomic tools. This chapter introduces many of the tools available for genetic studies on a genome-wide scale along with principles of study design and data analysis.

  2. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  3. A Web-Based Treatment Decision Support Tool for Patients With Advanced Knee Arthritis: Evaluation of User Interface and Content Design

    PubMed Central

    Zheng, Hua; Rosal, Milagros C; Li, Wenjun; Borg, Amy; Yang, Wenyun; Ayers, David C

    2018-01-01

    Background Data-driven surgical decisions will ensure proper use and timing of surgical care. We developed a Web-based patient-centered treatment decision and assessment tool to guide treatment decisions among patients with advanced knee osteoarthritis who are considering total knee replacement surgery. Objective The aim of this study was to examine user experience and acceptance of the Web-based treatment decision support tool among older adults. Methods User-centered formative and summative evaluations were conducted for the tool. A sample of 28 patients who were considering total knee replacement participated in the study. Participants’ responses to the user interface design, the clarity of information, as well as usefulness, satisfaction, and acceptance of the tool were collected through qualitative (ie, individual patient interviews) and quantitative (ie, standardized Computer System Usability Questionnaire) methods. Results Participants were older adults with a mean age of 63 (SD 11) years. Three-quarters of them had no technical questions while using the tool. User interface design recommendations included larger fonts, bigger buttons, fewer colors, simpler navigation without an extra “next page” click, less mouse movement, and clearer illustrations with simple graphs. Color-coded bar charts and outcome-specific graphs with positive action made it easiest for them to understand the outcomes data. Questionnaire data revealed high satisfaction with the tool's usefulness and interface quality, and also showed ease of use of the tool, regardless of age or educational status. Conclusions We evaluated the usability of a patient-centered decision support tool designed for advanced knee arthritis patients to facilitate their knee osteoarthritis treatment decision making. The lessons learned can inform other decision support tools to improve interface and content design for older patients’ use. PMID:29712620

  4. Use of Statistical Heuristics in Everyday Inductive Reasoning.

    ERIC Educational Resources Information Center

    Nisbett, Richard E.; And Others

    1983-01-01

    In everyday reasoning, people use statistical heuristics (judgmental tools that are rough intuitive equivalents of statistical principles). Use of statistical heuristics is more likely when (1) sampling is clear, (2) the role of chance is clear, (3) statistical reasoning is normative for the event, or (4) the subject has had training in…

  5. P-Value Club: Teaching Significance Level on the Dance Floor

    ERIC Educational Resources Information Center

    Gray, Jennifer

    2010-01-01

    Courses: Beginning research methods and statistics courses, as well as advanced communication courses that require reading research articles and completing research projects involving statistics. Objective: Students will understand the difference between significant and nonsignificant statistical results based on p-value.

  6. ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)

    EPA Science Inventory

    The availability of geographically indexed health and population data, together with advances in computing, geographical information systems and statistical methodology, has opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...

  7. Knowledge-Acquisition Tool For Expert System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.

    1988-01-01

    Digital flight-control systems monitored by computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use knowledge-acquisition tool for expert-system flight-status monitor supplying interpretative data. Interpretative function especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performance of advanced aircraft systems.

  8. Status and outlook of CFD technology at Mitsubishi Heavy Industries, Nagoya

    NASA Astrophysics Data System (ADS)

    Tanioka, Tadayuki

    1990-09-01

    Computational Fluid Dynamics (CFD) technology has made tremendous progress in the last several years. It has matured to become a practical simulation tool in the aircraft industry. At MHI, CFD has become an indispensable tool for the aerodynamic design of aerospace vehicles. The present status of this advanced technology at MHI is described. Also mentioned are some future advances of this fast-growing technology as well as the associated hardware requirements.

  9. Web-based data collection: detailed methods of a questionnaire and data gathering tool

    PubMed Central

    Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R

    2006-01-01

    There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes the data instrument design, data entry and management, and the data tables needed to store the results, in a way that attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556

  10. Spatially disaggregated population estimates in the absence of national population and housing census data

    PubMed Central

    Wardrop, N. A.; Jochem, W. C.; Bird, T. J.; Chamberlain, H. R.; Clarke, D.; Kerr, D.; Bengtsson, L.; Juran, S.; Seaman, V.; Tatem, A. J.

    2018-01-01

    Population numbers at local levels are fundamental data for many applications, including the delivery and planning of services, election preparation, and response to disasters. In resource-poor settings, recent and reliable demographic data at subnational scales can often be lacking. National population and housing census data can be outdated, inaccurate, or missing key groups or areas, while registry data are generally lacking or incomplete. Moreover, at local scales accurate boundary data are often limited, and high rates of migration and urban growth make existing data quickly outdated. Here we review past and ongoing work aimed at producing spatially disaggregated local-scale population estimates, and discuss how new technologies are now enabling robust and cost-effective solutions. Recent advances in the availability of detailed satellite imagery, geopositioning tools for field surveys, statistical methods, and computational power are enabling the development and application of approaches that can estimate population distributions at fine spatial scales across entire countries in the absence of census data. We outline the potential of such approaches as well as their limitations, emphasizing the political and operational hurdles for acceptance and sustainable implementation of new approaches, and the continued importance of traditional sources of national statistical data. PMID:29555739

  11. Back to BaySICS: a user-friendly program for Bayesian Statistical Inference from Coalescent Simulations.

    PubMed

    Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love

    2014-01-01

    Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances, including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical foundation and exceptional flexibility. However, limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov-chain Monte Carlo without likelihoods.
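
    The rejection-sampling idea that such ABC software automates can be sketched in a few lines of Python. The toy model below (normally distributed data with the mean as the only parameter and the sample mean as the summary statistic) is purely illustrative and is not BaySICS's coalescent machinery; the tolerance and prior are assumptions.

        # Toy ABC rejection sampler: keep prior draws whose simulated summary
        # statistic is close to the observed one.
        import numpy as np

        rng = np.random.default_rng(1)
        observed = rng.normal(2.0, 1.0, size=100)      # stand-in for observed sequence data
        obs_summary = observed.mean()                  # summary statistic

        n_draws, tolerance = 100_000, 0.05
        theta_prior = rng.uniform(-5.0, 5.0, size=n_draws)           # flat prior on the parameter
        # Shortcut: the mean of 100 N(theta, 1) values is N(theta, 1/sqrt(100)).
        sim_summaries = rng.normal(theta_prior, 1.0 / np.sqrt(100))

        accepted = theta_prior[np.abs(sim_summaries - obs_summary) < tolerance]
        print(f"accepted {accepted.size} draws; posterior mean ~ {accepted.mean():.3f}")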

  12. A computational DFT study of structural transitions in textured solid-fluid interfaces

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Petr; Parry, Andrew O.; Kalliadasis, Serafim

    2015-11-01

    Fluids adsorbed at walls, in capillary pores and slits, and in more exotic, sculpted geometries such as grooves and wedges can exhibit many new phase transitions, including wetting, pre-wetting, capillary-condensation and filling, compared to their bulk counterparts. As well as being of fundamental interest to the modern statistical mechanical theory of inhomogeneous fluids, these are also relevant to nanofluidics, chemical- and bioengineering. In this talk we will show using a microscopic Density Functional Theory (DFT) for fluids how novel, continuous, interfacial transitions associated with the first-order prewetting line, can occur on steps, in grooves and in wedges, that are sensitive to both the range of the intermolecular forces and interfacial fluctuation effects. These transitions compete with wetting, filling and condensation producing very rich phase diagrams even for relatively simple geometries. We will also discuss practical aspects of DFT calculations, and demonstrate how this statistical-mechanical framework is capable of yielding complex fluid structure, interfacial tensions, and regions of thermodynamic stability of various fluid configurations. As a side note, this demonstrates that DFT is an excellent tool for the investigations of complex multiphase systems. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031.

  13. Econometric Assessment of "One Minute" Paper as a Pedagogic Tool

    ERIC Educational Resources Information Center

    Das, Amaresh

    2010-01-01

    This paper presents an econometric test of the one-minute paper used as a tool to manage and assess instruction in my statistics class. One of our findings is that the one-minute paper, when tested using an OLS estimate in a controlled-versus-experimental design framework, is found to be statistically significant and effective in enhancing…
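
    A hedged Python sketch of this kind of regression, on simulated rather than the author's data: regress an outcome such as an exam score on a dummy for the experimental (one-minute paper) section plus a covariate, and read the treatment effect off the dummy's coefficient. All variable names and effect sizes are invented.

        # Sketch: OLS with a treatment dummy for a controlled-vs-experimental design.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 120
        one_minute_paper = np.repeat([0, 1], n // 2)           # 0 = control section, 1 = experimental
        prior_gpa = rng.normal(3.0, 0.4, size=n)               # illustrative covariate
        exam_score = 60 + 5 * one_minute_paper + 8 * prior_gpa + rng.normal(0, 6, size=n)

        X = sm.add_constant(np.column_stack([one_minute_paper, prior_gpa]))
        fit = sm.OLS(exam_score, X).fit()
        print(fit.summary(xname=["const", "one_minute_paper", "prior_gpa"]))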

  14. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  15. ProUCL version 4.1.00 Documentation Downloads

    EPA Pesticide Factsheets

    ProUCL version 4.1.00 represents a comprehensive statistical software package equipped with statistical methods and graphical tools needed to address many environmental sampling and statistical issues as described in various guidance documents.

  16. Adaptive and Optimal Control of Stochastic Dynamical Systems

    DTIC Science & Technology

    2015-09-14

    Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. [4] T. E. Duncan and B. Pasik-Duncan, A...S. N. Cohen, T. K. Siu and H. Yang) Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. 4. T. E...games with general noise processes, Models and Methods in Economics and Management Science: Essays in Honor of Charles S. Tapiero, (eds. F. El

  17. Comparative study of coated and uncoated tool inserts with dry machining of EN47 steel using Taguchi L9 optimization technique

    NASA Astrophysics Data System (ADS)

    Vasu, M.; Shivananda, Nayaka H.

    2018-04-01

    EN47 steel samples are machined on a self-centered lathe using chemical vapor deposition (CVD) coated TiCN/Al2O3/TiN and uncoated tungsten carbide tool inserts with a nose radius of 0.8 mm. Results are compared with each other and optimized using statistical tools. The input (cutting) parameters considered in this work are feed rate (f), cutting speed (Vc), and depth of cut (ap); the optimization criteria are based on the Taguchi (L9) orthogonal array. The ANOVA method is adopted to evaluate the statistical significance and the percentage contribution for each model. Multiple response characteristics, namely cutting force (Fz), tool tip temperature (T) and surface roughness (Ra), are evaluated. The results showed that the coated tool insert (TiCN/Al2O3/TiN) performs 1.27 and 1.29 times better than the uncoated tool insert for tool tip temperature and surface roughness, respectively. A slight increase in cutting force was observed for coated tools.
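
    A minimal Python sketch of a Taguchi-style analysis of this kind: compute a smaller-is-better signal-to-noise ratio for each L9 run and average it by factor level to rank main effects. The L9 array is standard, but the response values and level codings below are invented, not the paper's measurements.

        # Sketch: smaller-is-better S/N ratios and main effects for a Taguchi L9 design.
        import numpy as np
        import pandas as pd

        l9 = pd.DataFrame({
            "speed": [1, 1, 1, 2, 2, 2, 3, 3, 3],   # coded levels of cutting speed (Vc)
            "feed":  [1, 2, 3, 1, 2, 3, 1, 2, 3],   # coded levels of feed rate (f)
            "doc":   [1, 2, 3, 2, 3, 1, 3, 1, 2],   # coded levels of depth of cut (ap)
        })
        l9["Ra"] = [1.8, 2.1, 2.6, 1.6, 2.0, 2.4, 1.5, 1.9, 2.3]   # surface roughness (um), illustrative

        # Smaller-is-better S/N ratio: -10 * log10(mean(y^2)) per run (single replicate here).
        l9["sn"] = -10 * np.log10(l9["Ra"] ** 2)

        for factor in ["speed", "feed", "doc"]:
            means = l9.groupby(factor)["sn"].mean()
            print(f"{factor}: level S/N means\n{means}\n  delta = {means.max() - means.min():.2f}")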

  18. Provider Tools for Advance Care Planning and Goals of Care Discussions: A Systematic Review.

    PubMed

    Myers, Jeff; Cosby, Roxanne; Gzik, Danusia; Harle, Ingrid; Harrold, Deb; Incardona, Nadia; Walton, Tara

    2018-01-01

    Advance care planning and goals of care discussions involve the exploration of what is most important to a person, including their values and beliefs in preparation for health-care decision-making. Advance care planning conversations focus on planning for future health care, ensuring that an incapable person's wishes are known and can guide the person's substitute decision maker for future decision-making. Goals of care discussions focus on preparing for current decision-making by ensuring the person's goals guide this process. To provide evidence regarding tools and/or practices available for use by health-care providers to effectively facilitate advance care planning conversations and/or goals of care discussions. A systematic review was conducted focusing on guidelines, randomized trials, comparative studies, and noncomparative studies. Databases searched included MEDLINE, EMBASE, and the proceedings of the International Advance Care Planning Conference and the American Society of Clinical Oncology Palliative Care Symposium. Although several studies report positive findings, there is a lack of consistent patient outcome evidence to support any one clinical tool for use in advance care planning or goals of care discussions. Effective advance care planning conversations at both the population and the individual level require provider education and communication skill development, standardized and accessible documentation, quality improvement initiatives, and system-wide coordination to impact the population level. There is a need for research focused on goals of care discussions, to clarify the purpose and expected outcomes of these discussions, and to clearly differentiate goals of care from advance care planning.

  19. Implementing clinical protocols in oncology: quality gaps and the learning curve phenomenon.

    PubMed

    Kedikoglou, Simos; Syrigos, Konstantinos; Skalkidis, Yannis; Ploiarchopoulou, Fani; Dessypris, Nick; Petridou, Eleni

    2005-08-01

    The quality improvement effort in clinical practice has focused mostly on 'performance quality', i.e. on the development of comprehensive, evidence-based guidelines. This study aimed to assess the 'conformance quality', i.e. the extent to which guidelines once developed are correctly and consistently applied. It also aimed to assess the existence of quality gaps in the treatment of certain patient segments as defined by age or gender and to investigate methods to improve overall conformance quality. A retrospective audit of clinical practice in a well-defined oncology setting was undertaken and the results compared to those obtained from prospectively applying an internally developed clinical protocol in the same setting and using specific tools to increase conformance quality. All indicators showed improvement after the implementation of the protocol that in many cases reached statistical significance, while in the entire cohort advanced age was associated (although not significantly) with sub-optimal delivery of care. A 'learning curve' phenomenon in the implementation of quality initiatives was detected, with all indicators improving substantially in the second part of the prospective study. Clinicians should pay separate attention to the implementation of chosen protocols and employ specific tools to increase conformance quality in patient care.

  20. Measuring the progress of capacity building in the Alberta Policy Coalition for Cancer Prevention.

    PubMed

    Raine, Kim D; Sosa Hernandez, Cristabel; Nykiforuk, Candace I J; Reed, Shandy; Montemurro, Genevieve; Lytvyak, Ellina; MacLellan-Wright, Mary-Frances

    2014-07-01

    The Alberta Policy Coalition for Cancer Prevention (APCCP) represents practitioners, policy makers, researchers, and community organizations working together to coordinate efforts and advocate for policy change to reduce chronic diseases. The aim of this research was to capture changes in the APCCP's capacity to advance its goals over the course of its operation. We adapted the Public Health Agency of Canada's validated Community Capacity-Building Tool to capture policy work. All members of the APCCP were invited to complete the tool in 2010 and 2011. Responses were analyzed using descriptive statistics and t tests. Qualitative comments were analyzed using thematic content analysis. A group process for reaching consensus provided context to the survey responses and contributed to a participatory analysis. Significant improvement was observed in eight out of nine capacity domains. Lessons learned highlight the importance of balancing volume and diversity of intersectoral representation to ensure effective participation, as well as aligning professional and economic resources. Defining involvement and roles within a coalition can be a challenging activity contingent on the interests of each sector represented. The participatory analysis enabled the group to reflect on progress made and future directions for policy advocacy. © 2013 Society for Public Health Education.

  1. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect, wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  2. Alzheimer's disease in the omics era.

    PubMed

    Sancesario, Giulia M; Bernardini, Sergio

    2018-06-18

    Recent progress in high-throughput technologies has led to a new scenario in investigating pathologies, named the "Omics era", which combines the opportunity to collect large amounts of data and information at the molecular and protein levels with the development of novel computational and statistical tools that are able to analyze and filter such data. Subsequently, advances in genotyping arrays, next generation sequencing, mass spectrometry technology, and bioinformatics allowed for the simultaneous large-scale study of thousands of genes (genomics), epigenetic factors (epigenomics), RNA (transcriptomics), metabolites (metabolomics) and proteins (proteomics), with the possibility of integrating multiple types of omics data ("multi-omics"). All of these technological innovations have modified the approach to the study of complex diseases, such as Alzheimer's Disease (AD), thus representing a promising tool to investigate the relationships among several molecular pathways in AD as well as other pathologies. This review focuses on the current knowledge of the pathology of AD, the recent findings from the Omics sciences, and the challenge of the use of Big Data. We then focus on future perspectives for the Omics sciences, such as the discovery of novel diagnostic biomarkers or drugs. Copyright © 2018. Published by Elsevier Inc.

  3. phenoVein—A Tool for Leaf Vein Segmentation and Analysis

    PubMed Central

    Pflugfelder, Daniel; Huber, Gregor; Scharr, Hanno; Hülskamp, Martin; Koornneef, Maarten; Jahnke, Siegfried

    2015-01-01

    Precise measurements of leaf vein traits are an important aspect of plant phenotyping for ecological and genetic research. Here, we present a powerful and user-friendly image analysis tool named phenoVein. It is dedicated to automated segmenting and analyzing of leaf veins in images acquired with different imaging modalities (microscope, macrophotography, etc.), including options for comfortable manual correction. Advanced image filtering emphasizes veins from the background and compensates for local brightness inhomogeneities. The most important traits being calculated are total vein length, vein density, piecewise vein lengths and widths, areole area, and skeleton graph statistics, like the number of branching or ending points. For the determination of vein widths, a model-based vein edge estimation approach has been implemented. Validation was performed for the measurement of vein length, vein width, and vein density of Arabidopsis (Arabidopsis thaliana), proving the reliability of phenoVein. We demonstrate the power of phenoVein on a set of previously described vein structure mutants of Arabidopsis (hemivenata, ondulata3, and asymmetric leaves2-101) compared with wild-type accessions Columbia-0 and Landsberg erecta-0. phenoVein is freely available as open-source software. PMID:26468519

  4. Impact of e-Discipline on Children's Screen Time.

    PubMed

    Hawi, Nazir S; Rupert, Maya Samaha

    2015-06-01

    With rapid technological advancement, the prevalence and undesirable effects of excess screen time on children have become a mounting issue worldwide. There are many studies investigating the phenomenon's impact on society (e.g., behavioral, academic, health), but studies that uncover the causes and factors that increase the odds of children's excess screen time are limited. To this end, this study introduces the term "e-discipline" to refer to systematic practices that use screen devices as discipline tools. As such, the aim of this study is to investigate the association between e-discipline and children's screen time by gender. Analysis was performed on 3,141 children aged 7-11 years. Bivariate logistic regression models were used to calculate the odds of exceeding the American Academy of Pediatrics guidelines of 2 hours of screen time per day by boys and girls whose parents practice e-discipline. The results showed that children whose parents used screen devices as discipline tools had significantly more screen time compared to children whose parents did not. Furthermore, no statistically significant gender differences were found in the odds of exceeding the recommended screen time under e-discipline. Recommendations stemming from all the results are discussed.
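
    A hedged Python sketch of the kind of bivariate logistic model described, on simulated rather than the study's data: the exponentiated coefficient of the e-discipline indicator gives the odds ratio of exceeding the 2-hour guideline. Prevalence and effect size below are invented.

        # Sketch: logistic regression for the odds of exceeding 2 h/day of screen time
        # given parental e-discipline.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 3141
        e_discipline = rng.binomial(1, 0.4, size=n)                     # 1 = parents use screens to discipline
        logit_p = -0.5 + 0.9 * e_discipline                             # illustrative effect
        exceeds_2h = rng.binomial(1, 1 / (1 + np.exp(-logit_p)), size=n)

        X = sm.add_constant(e_discipline.astype(float))
        fit = sm.Logit(exceeds_2h, X).fit(disp=False)
        odds_ratio = np.exp(fit.params[1])
        ci_low, ci_high = np.exp(fit.conf_int()[1])
        print(f"odds ratio = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")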

  5. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004 the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolset. Currently, most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not been calibrated against real data as of yet, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  6. Advances in genetics and genomics: use and limitations in achieving malaria elimination goals

    PubMed Central

    Gunawardena, Sharmini; Karunaweera, Nadira D.

    2015-01-01

    Success of the global research agenda towards eradication of malaria will depend on the development of new tools, including drugs, vaccines, insecticides and diagnostics. Genetic and genomic information now available for the malaria parasites, their mosquito vectors and human host, can be harnessed to both develop these tools and monitor their effectiveness. Here we review and provide specific examples of current technological advances and how these genetic and genomic tools have increased our knowledge of host, parasite and vector biology in relation to malaria elimination and in turn enhanced the potential to reach that goal. We then discuss limitations of these tools and future prospects for the successful achievement of global malaria elimination goals. PMID:25943157

  7. Evaluating statistical consistency in the ocean model component of the Community Earth System Model (pyCECT v2.0)

    NASA Astrophysics Data System (ADS)

    Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen

    2016-07-01

    The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important. Our experiments indicate that the new POP ensemble consistency test (POP-ECT) tool is capable of distinguishing cases that should be statistically consistent with the ensemble and those that should not, as well as providing a simple, subjective and systematic way to detect errors in CESM-POP due to the hardware or software stack, positively contributing to quality assurance for the CESM-POP code.
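
    The core of the ensemble-consistency idea can be sketched in a few lines of Python: score a candidate run against the ensemble, grid point by grid point, and flag it if too many points are extreme. The array sizes, the |z| > 2 cut-off, and the 5% rule below are illustrative assumptions, not the POP-ECT defaults.

        # Sketch of the ensemble-consistency idea for a gridded field.
        import numpy as np

        rng = np.random.default_rng(4)
        n_ens, n_points = 40, 10_000                               # ensemble members, grid points
        ensemble = rng.normal(15.0, 0.3, size=(n_ens, n_points))   # e.g. an SST field per member

        mu = ensemble.mean(axis=0)
        sigma = ensemble.std(axis=0, ddof=1)

        new_run = rng.normal(15.0, 0.3, size=n_points)             # candidate simulation to test
        z = np.abs(new_run - mu) / sigma                           # standard score at each grid point

        frac_extreme = np.mean(z > 2.0)
        print(f"{100 * frac_extreme:.2f}% of grid points exceed |z| > 2")
        consistent = frac_extreme < 0.05                           # illustrative pass/fail rule
        print("statistically consistent" if consistent else "statistically distinguishable")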

  8. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, these statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
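
    A hedged Python sketch of the first design described (an external calibration curve): regress Ct on log10 template amount for the serial dilutions, back-calculate template amounts for a single-copy control event and a putative event, and take their ratio. All Ct values below are invented for illustration.

        # Sketch: standard-curve regression and copy-number ratio from qPCR Ct values.
        import numpy as np
        import statsmodels.api as sm

        log10_template = np.array([4.0, 3.0, 2.0, 1.0, 0.0])       # serial 10-fold dilutions
        ct_standards   = np.array([16.9, 20.3, 23.6, 27.0, 30.4])  # measured Ct per dilution (illustrative)

        X = sm.add_constant(log10_template)
        curve = sm.OLS(ct_standards, X).fit()
        intercept, slope = curve.params                             # slope ~ -3.3 at ~100% efficiency
        efficiency = 10 ** (-1.0 / slope) - 1.0
        print(f"slope = {slope:.2f}, amplification efficiency ~ {100 * efficiency:.0f}%")

        def template_amount(ct):
            """Back-calculate relative template amount from the standard curve."""
            return 10 ** ((ct - intercept) / slope)

        ct_control, ct_putative = 24.1, 23.1        # single-copy control vs putative event
        copy_number = template_amount(ct_putative) / template_amount(ct_control)
        print(f"estimated transgene copy number ~ {copy_number:.1f}")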

  9. Effects of advanced treatment of municipal wastewater on the White River near Indianapolis, Indiana; trends in water quality, 1978-86

    USGS Publications Warehouse

    Crawford, Charles G.; Wangsness, David J.

    1993-01-01

    The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-1980) and post-advanced- (1983--86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
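
    For illustration only, a plain two-sample Wilcoxon-Mann-Whitney comparison of simulated pre- versus post-treatment ammonia concentrations in Python; the study itself applied a modified, trend-oriented form of the test to time-series data, and the values below are invented.

        # Sketch: rank-sum comparison of pre- vs post-advanced-treatment concentrations.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(5)
        ammonia_pre  = rng.lognormal(mean=1.0, sigma=0.4, size=60)   # 1978-80 samples (mg/L), simulated
        ammonia_post = rng.lognormal(mean=0.2, sigma=0.4, size=60)   # 1983-86 samples (mg/L), simulated

        stat, p_value = mannwhitneyu(ammonia_pre, ammonia_post, alternative="greater")
        print(f"U = {stat:.0f}, one-sided p = {p_value:.2e}")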

  10. Identification of the contribution of contact and aerial biomechanical parameters in acrobatic performance

    PubMed Central

    Haering, Diane; Huchez, Aurore; Barbier, Franck; Holvoët, Patrice; Begon, Mickaël

    2017-01-01

    Introduction Teaching acrobatic skills with a minimal amount of repetition is a major challenge for coaches. Biomechanical, statistical or computer simulation tools can help them identify the most determinant factors of performance. Release parameters, change in moment of inertia and segmental momentum transfers were identified in the prediction of acrobatics success. The purpose of the present study was to evaluate the relative contribution of these parameters in performance throughout expertise or optimisation based improvements. The counter movement forward in flight (CMFIF) was chosen for its intrinsic dichotomy between the accessibility of its attempt and complexity of its mastery. Methods Three repetitions of the CMFIF performed by eight novice and eight advanced female gymnasts were recorded using a motion capture system. Optimal aerial techniques that maximise rotation potential at regrasp were also computed. A 14-segment-multibody-model defined through the Rigid Body Dynamics Library was used to compute recorded and optimal kinematics, and biomechanical parameters. A stepwise multiple linear regression was used to determine the relative contribution of these parameters in novice recorded, novice optimised, advanced recorded and advanced optimised trials. Finally, fixed effects of expertise and optimisation were tested through a mixed-effects analysis. Results and discussion Variation in release state only contributed to performances in novice recorded trials. Moment of inertia contribution to performance increased from novice recorded, to novice optimised, advanced recorded, and advanced optimised trials. Contribution to performance of momentum transfer to the trunk during the flight prevailed in all recorded trials. Although optimisation decreased transfer contribution, momentum transfer to the arms appeared. Conclusion Findings suggest that novices should be coached on both contact and aerial technique. Inversely, mainly improved aerial technique helped advanced gymnasts increase their performance. For both, reduction of the moment of inertia should be focused on. The method proposed in this article could be generalized to any aerial skill learning investigation. PMID:28422954

  11. Effect of Pin Tool Shape on Metal Flow During Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    McClure, J. C.; Coronado, E.; Aloor, S.; Nowak, B.; Murr, L. M.; Nunes, Arthur C., Jr.; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    It has been shown that metal moves behind the rotating Friction Stir Pin Tool in two separate currents or streams. One current, mostly on the advancing side, enters a zone of material that rotates with the pin tool for one or more revolutions and eventually is abandoned behind the pin tool in crescent-shaped pieces. The other current, largely on the retreating side of the pin tool is moved by a wiping process to the back of the pin tool and fills in between the pieces of the rotational zone that have been shed by the rotational zone. This process was studied by using a faying surface copper trace to clarify the metal flow. Welds were made with pin tools having various thread pitches. Decreasing the thread pitch causes the large scale top-to-bottorn flow to break up into multiple vortices along the pin and an unthreaded pin tool provides insufficient vertical motion for there to be a stable rotational zone and flow of material via the rotational zone is not possible leading to porosity on the advancing side of the weld.

  12. An analysis, sensitivity and prediction of winter fog events using FASP model over Indo-Gangetic plains, India

    NASA Astrophysics Data System (ADS)

    Srivastava, S. K., Sr.; Sharma, D. A.; Sachdeva, K.

    2017-12-01

    The Indo-Gangetic plains of India experience severe fog conditions during the peak winter months of December and January every year. In this paper an attempt has been made to analyze the spatial and temporal variability of winter fog over the Indo-Gangetic plains. Further, an attempt has also been made to configure an efficient meso-scale numerical weather prediction model using different parameterization schemes and to develop a forecasting tool for the prediction of fog during the winter months over the Indo-Gangetic plains. The study revealed that an alarming increasing trend of fog frequency prevails over many locations of the IGP. Hot spot and cluster analyses were conducted to identify the high fog-prone zones using GIS and inferential statistical tools, respectively. Hot spots on average experience fog on 68.27% of days, followed by moderate and cold spots with 48.03% and 21.79%, respectively. The study proposes a new FASP (Fog Analysis, Sensitivity and Prediction) model for the overall analysis and prediction of fog at a particular location and period over the IGP. In the first phase of this model, long-term climatological fog data for a location are analyzed to determine their characteristics and prevailing trend using various advanced statistical techniques. In the second phase, a sensitivity test is conducted with different combinations of parameterization schemes to determine the most suitable combination for fog simulation over a particular location and period. In the third and final phase, an ARIMA model is first used to predict the number of fog days in the future; thereafter, the numerical model is used to predict the various meteorological parameters favourable for fog, and finally the hybrid model is used for the fog forecast over the study location. The results of the FASP model are validated against actual ground-based fog data using statistical tools. The forecast fog-gram generated using the hybrid model during January 2017 shows highly encouraging results for fog occurrence/non-occurrence at forecast lead times between 25 and 72 hours. The model predicted fog occurrence/non-occurrence with more than 85% accuracy over most of the locations across the study area. The minimum visibility departure is within 500 m on 90% of occasions over the central IGP and within 1000 m on more than 80% of occasions over most of the locations across the Indo-Gangetic plains.
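
    A hedged Python sketch of the ARIMA step in the third phase: fit a model to annual counts of winter fog days and forecast the coming seasons. The simulated counts and the (1, 0, 1) order with a linear trend are illustrative assumptions, not the study's fitted model.

        # Sketch: ARIMA forecast of annual winter fog-day counts.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(6)
        years = np.arange(1990, 2016)
        fog_days = np.clip(30 + 0.5 * (years - 1990) + rng.normal(0, 4, size=years.size), 0, None)

        model = ARIMA(fog_days, order=(1, 0, 1), trend="t").fit()
        forecast = model.forecast(steps=3)
        print("forecast fog days for the next three winters:", np.round(forecast, 1))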

  13. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  14. Recent Trends in Advance Directives at Nursing Home Admission and One Year after Admission

    ERIC Educational Resources Information Center

    McAuley, William J.; Buchanan, Robert J.; Travis, Shirley S.; Wang, Suojin; Kim, MyungSuk

    2006-01-01

    Purpose: Advance directives are important planning and decision-making tools for individuals in nursing homes. Design and Methods: By using the nursing facility Minimum Data Set, we examined the prevalence of advance directives at admission and 12 months post-admission. Results: The prevalence of having any advance directive at admission declined…

  15. PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies*

    PubMed Central

    Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.

    2010-01-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208
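
    The automated hypothesis generation described rests on testing whether an annotation is over-represented in a selected subset. A minimal Python sketch with invented counts, using Fisher's exact test as a stand-in for whatever enrichment statistic PTMScout actually applies:

        # Sketch: is a GO annotation enriched in a selected subset of phosphopeptides
        # relative to the rest of the experiment? Counts are invented.
        from scipy.stats import fisher_exact

        subset_with_term, subset_without_term = 18, 82          # selected subset (n = 100)
        rest_with_term, rest_without_term = 40, 860             # remainder of the experiment

        table = [[subset_with_term, subset_without_term],
                 [rest_with_term, rest_without_term]]
        odds_ratio, p_value = fisher_exact(table, alternative="greater")
        print(f"odds ratio = {odds_ratio:.2f}, enrichment p = {p_value:.3g}")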

  16. Automated Comparative Metabolite Profiling of Large LC-ESIMS Data Sets in an ACD/MS Workbook Suite Add-in, and Data Clustering on a New Open-Source Web Platform FreeClust.

    PubMed

    Božičević, Alen; Dobrzyński, Maciej; De Bie, Hans; Gafner, Frank; Garo, Eliane; Hamburger, Matthias

    2017-12-05

    The technological development of LC-MS instrumentation has led to significant improvements in performance and sensitivity, enabling high-throughput analysis of complex samples, such as plant extracts. Most software suites allow preprocessing of LC-MS chromatograms to obtain comprehensive information on single constituents. However, more advanced processing needs, such as the systematic and unbiased comparative metabolite profiling of large numbers of complex LC-MS chromatograms, remain a challenge. Currently, users have to rely on different tools to perform such data analyses. We developed a two-step protocol comprising a comparative metabolite profiling tool integrated in the ACD/MS Workbook Suite, and a web platform developed in the R language designed for clustering and visualization of chromatographic data. Initially, all relevant chromatographic and spectroscopic data (retention time, molecular ions with the respective ion abundance, and sample names) are automatically extracted and assembled in an Excel spreadsheet. The file is then loaded into an online web application that includes various statistical algorithms and provides the user with tools to compare and visualize the results in intuitive 2D heatmaps. We applied this workflow to LC-ESIMS profiles obtained from 69 honey samples. Within a few hours of calculation on a standard PC, the honey samples were preprocessed and organized into clusters based on their metabolite profile similarities, thereby highlighting the common metabolite patterns and distributions among samples. Implementation in the ACD/Laboratories software package enables further integration of other analytical data and of in silico prediction tools for modern drug discovery.
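
    The clustering step that such platforms provide can be sketched with standard hierarchical clustering of a sample-by-feature intensity matrix in Python. The simulated matrix and the correlation-distance/average-linkage choices below are illustrative, not the published pipeline.

        # Sketch: hierarchical clustering of samples by metabolite-intensity profiles.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(7)
        n_samples, n_features = 69, 200                     # e.g. 69 honey samples x aligned LC-MS features
        profiles = rng.lognormal(0.0, 1.0, size=(n_samples, n_features))
        profiles[:30, :50] *= 5.0                           # give the first 30 samples a shared pattern

        dist = pdist(profiles, metric="correlation")        # 1 - Pearson correlation between samples
        tree = linkage(dist, method="average")
        clusters = fcluster(tree, t=3, criterion="maxclust")
        print("cluster sizes:", np.bincount(clusters)[1:])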

  17. Tools for quantifying isotopic niche space and dietary variation at the individual and population level.

    USGS Publications Warehouse

    Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim

    2012-01-01

    Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
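
    The convex-hull IsoSpace metric can be sketched in a few lines of Python; the δ13C/δ15N values below are invented, and the uncertainty propagation described above is omitted.

        # Sketch: convex hull area of a consumer population in d13C-d15N space.
        import numpy as np
        from scipy.spatial import ConvexHull

        rng = np.random.default_rng(8)
        d13c = rng.normal(-16.0, 1.2, size=40)     # per-individual carbon isotope values (invented)
        d15n = rng.normal(12.0, 0.8, size=40)      # per-individual nitrogen isotope values (invented)

        hull = ConvexHull(np.column_stack([d13c, d15n]))
        print(f"convex hull area = {hull.volume:.2f} per-mil^2")   # in 2D, .volume is the area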

  18. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
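
    One of the simplest schemes covered by such a principle is the Langevin thermostat. The Python sketch below integrates a 1D harmonic oscillator with an Euler-Maruyama Langevin step and checks that the time-averaged kinetic temperature approaches the target bath temperature; it is a textbook scheme shown for illustration, not the specific construction proposed in the paper.

        # Sketch: Langevin thermostat on a 1D harmonic oscillator.
        import numpy as np

        rng = np.random.default_rng(9)
        k_B, T_target, gamma, m, k = 1.0, 1.0, 1.0, 1.0, 1.0
        dt, n_steps = 0.01, 200_000

        x, v = 1.0, 0.0
        v2_sum = 0.0
        noise_scale = np.sqrt(2.0 * gamma * k_B * T_target / m * dt)   # amplitude of the random kick

        for step in range(n_steps):
            force = -k * x
            v += dt * (force / m - gamma * v) + noise_scale * rng.normal()
            x += dt * v
            v2_sum += v * v

        print(f"kinetic temperature ~ {m * v2_sum / n_steps / k_B:.3f} (target {T_target})")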

  19. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
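
    Although the tools described are an S/R add-on package, the core ROS calculation can be sketched in Python for the single-detection-limit case: regress log concentrations of the detected values on normal quantiles of their plotting positions, then impute the censored observations from the fitted line. The concentrations below are invented, and the full handling of multiple detection limits via Helsel-Cohn plotting positions is not reproduced.

        # Simplified single-detection-limit sketch of regression on order statistics (ROS).
        import numpy as np
        from scipy.stats import norm

        detects = np.array([1.2, 1.8, 2.5, 3.4, 4.6, 6.1, 8.0])   # measured concentrations (ug/L), invented
        n_censored = 5                                             # values reported as "<1.0"
        n = detects.size + n_censored

        # Plotting positions: censored values occupy the lowest ranks (single DL case).
        ranks = np.arange(n_censored + 1, n + 1)
        pp = (ranks - 0.375) / (n + 0.25)                          # Blom-type plotting positions
        q = norm.ppf(pp)                                           # normal quantiles for the detects

        slope, intercept = np.polyfit(q, np.log(np.sort(detects)), 1)

        q_cens = norm.ppf((np.arange(1, n_censored + 1) - 0.375) / (n + 0.25))
        imputed = np.exp(intercept + slope * q_cens)               # modeled values below the detection limit

        all_values = np.concatenate([imputed, detects])
        print(f"ROS mean = {all_values.mean():.2f}, ROS median = {np.median(all_values):.2f}")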

  20. Advanced data management for optimising the operation of a full-scale WWTP.

    PubMed

    Beltrán, Sergio; Maiza, Mikel; de la Sota, Alejandro; Villanueva, José María; Ayesa, Eduardo

    2012-01-01

    The lack of appropriate data management tools is presently a limiting factor for a broader implementation and a more efficient use of sensors and analysers, monitoring systems and process controllers in wastewater treatment plants (WWTPs). This paper presents a technical solution for advanced data management of a full-scale WWTP. The solution is based on an efficient and intelligent use of the plant data by a standard centralisation of the heterogeneous data acquired from different sources, effective data processing to extract adequate information, and a straightforward connection to other emerging tools focused on the operational optimisation of the plant such as advanced monitoring and control or dynamic simulators. A pilot study of the advanced data manager tool was designed and implemented in the Galindo-Bilbao WWTP. The results of the pilot study showed its potential for agile and intelligent plant data management by generating new enriched information combining data from different plant sources, facilitating the connection of operational support systems, and developing automatic plots and trends of simulated results and actual data for plant performance and diagnosis.
