Science.gov

Sample records for advanced statistical tools

  1. Advanced Statistical and Data Analysis Tools for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kashyap, V.; Scargle, Jeffrey D. (Technical Monitor)

    2001-01-01

    The goal of the project is to obtain, derive, and develop statistical and data analysis tools that would be of use in the analyses of high-resolution, high-sensitivity data that are becoming available with new instruments. This is envisioned as a cross-disciplinary effort with a number of collaborators.

  2. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2010-12-01

    later in this section. 2) San Luis Obispo. Extracted features were also provided for MTADS EM61, MTADS magnetics, EM61 cart, and TEMTADS data sets from ... subsequent training of statistical classifiers using these features. Results of discrimination studies at Camp Sibert and San Luis Obispo have shown ... Comparison of classification performance: Figures 10 through 13 show receiver operating characteristics for data sets acquired at San Luis Obispo. Subplot

  3. Advances in statistics

    Treesearch

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  4. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2011-09-01

    Award Number: W81XWH-10-1-0870. Title: Advanced Prosthetic Gait Training Tool. Author: Rajankumar ... produce a computer-based Advanced Prosthetic Gait Training Tool to aid in the training of clinicians at military treatment facilities providing care for

  5. Advanced Welding Tool

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere, including the airless environment of space. The developers chose induction (magnetic) heating to avoid causing deformation, and the technique can be used with almost any type of thermoplastic material. An induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When the welder is energized, alternating current produces inductive heating in the screen, causing the adjacent plastic surfaces to melt and flow into the mesh and creating a bond over the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on operation of the Induction Toroid Welder to fix canoes. Whitewater canoeing frequently produces gashes that are difficult to repair, mainly because many canoes are made of plastics. The commercial Induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic-melting screen. The welder has a broad range of applications in the automobile, appliance, aerospace, and construction industries.

  6. Advanced Welding Tool

    NASA Astrophysics Data System (ADS)

    1982-01-01

    Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere, including the airless environment of space. The developers chose induction (magnetic) heating to avoid causing deformation, and the technique can be used with almost any type of thermoplastic material. An induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When the welder is energized, alternating current produces inductive heating in the screen, causing the adjacent plastic surfaces to melt and flow into the mesh and creating a bond over the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on operation of the Induction Toroid Welder to fix canoes. Whitewater canoeing frequently produces gashes that are difficult to repair, mainly because many canoes are made of plastics. The commercial Induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic-melting screen. The welder has a broad range of applications in the automobile, appliance, aerospace, and construction industries.

  7. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
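
    A rough Python sketch of the same kinds of calculations (descriptive statistics, a normal-distribution value for a given cumulative probability, and a linear regression with a significance test). This is illustrative only, not the NASA Excel toolset; the data values are invented.

```python
import numpy as np
from scipy import stats

x = np.array([2.1, 2.4, 1.9, 2.8, 2.5, 2.2])   # user-entered data set x(i)

# Descriptive statistics
mean, sd = x.mean(), x.std(ddof=1)

# "Normal Distribution Estimates": value corresponding to a cumulative
# probability, given a sample mean and standard deviation
p = 0.95
value_at_p = stats.norm.ppf(p, loc=mean, scale=sd)

# "Linear Regression-ANOVA": fit y = f(x) and test its significance
y = np.array([3.2, 3.5, 2.9, 4.1, 3.8, 3.3])
slope, intercept, r, p_value, stderr = stats.linregress(x, y)

print(mean, sd, value_at_p, slope, p_value)
```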

  8. Some statistical tools in hydrology

    USGS Publications Warehouse

    Riggs, H.C.

    1968-01-01

    This chapter of 'Techniques of Water-Resources Investigations' provides background material needed for understanding the statistical procedures most useful in hydrology; it furnishes detailed procedures, with examples, of regression analyses; it describes analysis of variance and covariance and discusses the characteristics of hydrologic data.

  9. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is To provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs

  10. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  11. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay
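
    The suite described above combines wavelet spectral methods with neural-network techniques; as a much simpler illustration of one quantity involved, the sketch below (Python, assuming a 1 Hz sampling rate and using placeholder data) estimates integrated wave power in the Pc3 band (22-100 mHz) from a single magnetometer component.

```python
import numpy as np
from scipy.signal import welch

fs = 1.0                       # assumed 1 Hz magnetometer sampling rate
b = np.random.randn(3600)      # placeholder for one hour of one field component

f, psd = welch(b, fs=fs, nperseg=512)           # power spectral density estimate
pc3 = (f >= 0.022) & (f <= 0.100)               # Pc3 band: 22-100 mHz
pc3_power = np.sum(psd[pc3]) * (f[1] - f[0])    # integrated band power
print(pc3_power)
```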

  12. STATWIZ - AN ELECTRONIC STATISTICAL TOOL (ABSTRACT)

    EPA Science Inventory

    StatWiz is a web-based, interactive, and dynamic statistical tool for researchers. It will allow researchers to input information and/or data and then receive experimental design options, or outputs from data analysis. StatWiz is envisioned as an expert system that will walk rese...

  14. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  15. Advanced Human Factors Engineering Tool Technologies.

    DTIC Science & Technology

    1987-03-20

    identified the types of tools they would like to see developed to fill the existing technology gaps. The advanced tools were categorized using an ... the prototype phase of development were considered candidates for inclusion. The advanced tools were next categorized using an eight-point ... role, application, status and cost. Decision criteria were then developed as the basis for the tradeoff process to aid in tool selection. To

  16. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using the scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which has a major impact on the fate of the combustion system. To optimize Atran key factors such as mineral fragmentation and coalescence, the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam with Newton Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using the incorporation of grey-scale binning acquired by the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
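
    The spreadsheet enhancements described above rely on Newton-Raphson iteration to close mass, energy, and equilibrium balances. Purely as a generic illustration of that algorithm (not the EERC spreadsheets), the sketch below solves a single nonlinear equation; the function and starting guess are invented.

```python
import math

def newton_raphson(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Find a root of f(x) = 0 by Newton-Raphson iteration."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("did not converge")

# Toy stand-in for a balance equation: solve x * exp(x) - 5 = 0
root = newton_raphson(lambda x: x * math.exp(x) - 5.0,
                      lambda x: (x + 1.0) * math.exp(x),
                      x0=1.0)
print(root)
```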

  17. Understanding the fouling of UF/MF hollow fibres of biologically treated wastewaters using advanced EfOM characterization and statistical tools.

    PubMed

    Filloux, E; Labanowski, J; Croue, J P

    2012-08-01

    Five secondary effluents and a river water source were characterized using size exclusion chromatography (LC-OCD-UVD-OND) and emission-excitation matrix (EEM) fluorescence spectroscopy in order to identify the major effluent organic matter (EfOM) fractions responsible for membrane fouling. This study showed the feasibility of coupling fluorescence EEM and LC-OCD-UVD-OND to investigate the fouling potential as well as a means to differentiate natural organic matter (NOM) from EfOM. The secondary effluents and river water showed a significant difference in organic matter characteristics and fouling potential, highlighting the importance of biological processes and the feed water source on EfOM characteristics and fouling potential. On the basis of statistical analysis, protein-like substances were found to be highly correlated to the fouling potential of secondary effluents. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Advanced Tools for Software Maintenance.

    DTIC Science & Technology

    1982-12-01

    Old Applications ... Training People to Use New Tools ... Appendix A. Ada Style Guidelines ... and application-specific programming techniques and methods. The Intelligent Editor provides facilities for manipulating programs at several ... are applicable today or in the near future. In identifying tools and techniques, this study focused on one aspect of the maintenance problem

  19. Recent Advances in Algal Genetic Tool Development

    SciTech Connect

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  1. Rapid medical advances challenge the tooling industry.

    PubMed

    Conley, B

    2008-01-01

    The requirement for greater performance in smaller spaces has increased demands for product and process innovation in tubing and other medical products. In turn, these developments have placed greater demands on the producers of the advanced tooling for these products. Tooling manufacturers must now continuously design equipment with much tighter tolerances for more sophisticated coextrusions and for newer generations of multilumen and multilayer tubing.

  2. ADVANCED TOOLS FOR ASSESSING SELECTED ...

    EPA Pesticide Factsheets

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, methylenedioxymethamphetamine (MDMA)]. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansi

  3. ADVANCED TOOLS FOR ASSESSING SELECTED ...

    EPA Pesticide Factsheets

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxymethamphetamine). The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technica

  4. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  5. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  7. Using Tree Diagrams as an Assessment Tool in Statistics Education

    ERIC Educational Resources Information Center

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed, despite their importance. This article first presents the rationale and method…

  9. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  10. Advanced genetic tools for plant biotechnology.

    PubMed

    Liu, Wusheng; Yuan, Joshua S; Stewart, C Neal

    2013-11-01

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  11. Data Torturing and the Misuse of Statistical Tools

    SciTech Connect

    Abate, Marcey L.

    1999-08-16

    Statistical concepts, methods, and tools are often used in the implementation of statistical thinking. Unfortunately, statistical tools are all too often misused by not applying them in the context of statistical thinking that focuses on processes, variation, and data. The consequences of this misuse may be "data torturing" or going beyond reasonable interpretation of the facts due to a misunderstanding of the processes creating the data or the misinterpretation of variability in the data. In the hope of averting future misuse and data torturing, examples are provided where the application of common statistical tools, in the absence of statistical thinking, provides deceptive results by not adequately representing the underlying process and variability. For each of the examples, a discussion is provided on how applying the concepts of statistical thinking may have prevented the data torturing. The lessons learned from these examples will provide an increased awareness of the potential for many statistical methods to mislead and a better understanding of how statistical thinking broadens and increases the effectiveness of statistical tools.

  12. A Hierarchical Statistic Methodology for Advanced Memory System Evaluation

    SciTech Connect

    Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.

    1999-04-12

    Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed. The performance of these advanced memory systems, however, varies with applications and problem sizes. How to reach an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it also identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and parallel computer systems is possible and should be further explored.
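
    As a minimal illustration of the factorial-analysis step described above (not the authors' methodology or data), the sketch below separates the main effects of two factors, memory system and application, from a toy 2x2 table of runtimes.

```python
import numpy as np

# runtimes[i, j]: mean runtime for memory system i running application j (invented)
runtimes = np.array([[12.0, 30.0],
                     [ 9.0, 22.0]])

grand_mean = runtimes.mean()
memory_effect = runtimes.mean(axis=1) - grand_mean        # main effect of memory system
application_effect = runtimes.mean(axis=0) - grand_mean   # main effect of application
interaction = (runtimes - grand_mean
               - memory_effect[:, None] - application_effect[None, :])

print(memory_effect, application_effect)
print(interaction)   # variation not explained by the two main effects
```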

  13. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  15. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors is included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  16. Advances in Statistical Approaches Oncology Drug Development

    PubMed Central

    Ivanova, Anastasia; Rosner, Gary L.; Marchenko, Olga; Parke, Tom; Perevozskaya, Inna; Wang, Yanping

    2014-01-01

    We describe some recent developments in statistical methodology and practice in oncology drug development from an academic and an industry perspective. Many adaptive designs were pioneered in oncology, and oncology is still at the forefront of novel methods to enable better and faster Go/No-Go decision making while controlling the cost. PMID:25949927

  17. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  18. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  19. Statistical spectroscopic tools for biomarker discovery and systems medicine.

    PubMed

    Robinette, Steven L; Lindon, John C; Nicholson, Jeremy K

    2013-06-04

    Metabolic profiling based on comparative, statistical analysis of NMR spectroscopic and mass spectrometric data from complex biological samples has contributed to increased understanding of the role of small molecules in affecting and indicating biological processes. To enable this research, the development of statistical spectroscopy has been marked by early beginnings in applying pattern recognition to nuclear magnetic resonance data and the introduction of statistical total correlation spectroscopy (STOCSY) as a tool for biomarker identification in the past decade. Extensions of statistical spectroscopy now compose a family of related tools used for compound identification, data preprocessing, and metabolic pathway analysis. In this Perspective, we review the theory and current state of research in statistical spectroscopy and discuss the growing applications of these tools to medicine and systems biology. We also provide perspectives on how recent institutional initiatives are providing new platforms for the development and application of statistical spectroscopy tools and driving the development of integrated "systems medicine" approaches in which clinical decision making is supported by statistical and computational analysis of metabolic, phenotypic, and physiological data.
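
    For context, the core STOCSY computation is a correlation of the intensity at a chosen 'driver' spectral variable with every other variable across a set of spectra. The sketch below is a toy Python illustration on random placeholder data, not the published implementation.

```python
import numpy as np

spectra = np.random.rand(40, 2000)   # 40 samples x 2000 spectral variables (placeholder)
driver = 750                         # index of the driver peak (assumed)

d = spectra[:, driver]
centered = spectra - spectra.mean(axis=0)
dc = d - d.mean()

# Pearson correlation of the driver variable with every spectral variable
corr = centered.T @ dc / (np.linalg.norm(centered, axis=0) * np.linalg.norm(dc) + 1e-12)

# High |corr| at other positions flags peaks that covary with the driver,
# e.g. resonances from the same molecule.
print(corr[driver], np.sort(np.abs(corr))[-5:])
```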

  20. Prognostic Tools in Patients With Advanced Cancer: A Systematic Review.

    PubMed

    Simmons, Claribel P L; McMillan, Donald C; McWilliams, Kerry; Sande, Tonje A; Fearon, Kenneth C; Tuck, Sharon; Fallon, Marie T; Laird, Barry J

    2017-05-01

    In 2005, the European Association for Palliative Care made recommendations for prognostic markers in advanced cancer. Since then, prognostic tools have been developed, evolved, and validated. The aim of this systematic review was to examine the progress in the development and validation of prognostic tools. Medline, Embase Classic and Embase were searched. Eligible studies met the following criteria: patients with incurable cancer, >18 years, original studies, population n ≥100, and published after 2003. Descriptive and quantitative statistical analyses were performed. Forty-nine studies were eligible; they assessed seven prognostic tools across different care settings and primary cancer types and statistically assessed survival prediction. The Palliative Performance Scale was the most studied (n = 21,082); it comprises six parameters (all subjective), was externally validated, and predicted survival. The Palliative Prognostic Score, composed of six parameters (four subjective and two objective), the Palliative Prognostic Index, composed of nine parameters (all subjective), and the Glasgow Prognostic Score, composed of two parameters (both objective), were all externally validated in more than 2000 patients with advanced cancer and predicted survival. Various prognostic tools have been validated but vary in their complexity, subjectivity, and therefore clinical utility. The Glasgow Prognostic Score would seem the most favorable as it uses only two parameters (both objective) and has prognostic value complementary to the gold standard measure, which is performance status. Further studies comparing all proven prognostic markers in a single cohort of patients with advanced cancer are needed to determine the optimal prognostic tool. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  1. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, and this has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  2. Statistical Tools for the Interpretation of Enzootic West Nile virus Transmission Dynamics.

    PubMed

    Caillouët, Kevin A; Robertson, Suzanne

    2016-01-01

    Interpretation of enzootic West Nile virus (WNV) surveillance indicators requires little advanced mathematical skill, but greatly enhances the ability of public health officials to prescribe effective WNV management tactics. Stepwise procedures for the calculation of mosquito infection rates (IR) and vector index (VI) are presented alongside statistical tools that require additional computation. A brief review of advantages and important considerations for each statistic's use is provided.
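
    A hedged illustration of the two indicators named above, using the common minimum-infection-rate approximation and invented counts (not the authors' worked examples):

```python
positive_pools = 6
mosquitoes_tested = 4200
infection_rate = 1000.0 * positive_pools / mosquitoes_tested   # IR per 1,000 tested (MIR)

mosquitoes_collected = 8400
trap_nights = 60
abundance = mosquitoes_collected / trap_nights                 # mosquitoes per trap night

# Vector index for a single species; in practice the VI is summed over vector species
vector_index = abundance * (infection_rate / 1000.0)
print(round(infection_rate, 2), round(vector_index, 2))
```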

  3. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research, and the pattern used was a matched pre-test/post-test control-group design. Data were collected using three scales: a Statistics…

  5. ANDES: Statistical tools for the ANalyses of DEep Sequencing.

    PubMed

    Li, Kelvin; Venter, Eli; Yooseph, Shibu; Stockwell, Timothy B; Eckerle, Lance D; Denison, Mark R; Spiro, David J; Methé, Barbara A

    2010-07-15

    The advancements in DNA sequencing technologies have allowed researchers to progress from the analyses of a single organism towards the deep sequencing of a sample of organisms. With sufficient sequencing depth, it is now possible to detect subtle variations between members of the same species, or between mixed species with shared biomarkers, such as the 16S rRNA gene. However, traditional sequencing analyses of samples from largely homogeneous populations are often still based on multiple sequence alignments (MSA), where each sequence is placed along a separate row and similarities between aligned bases can be followed down each column. While this visual format is intuitive for a small set of aligned sequences, the representation quickly becomes cumbersome as sequencing depths cover loci hundreds or thousands of reads deep. We have developed ANDES, a software library and a suite of applications, written in Perl and R, for the statistical ANalyses of DEep Sequencing. The fundamental data structure underlying ANDES is the position profile, which contains the nucleotide distributions for each genomic position resultant from a multiple sequence alignment (MSA). Tools include the root mean square deviation (RMSD) plot, which allows for the visual comparison of multiple samples on a position-by-position basis, and the computation of base conversion frequencies (transition/transversion rates), variation (Shannon entropy), inter-sample clustering and visualization (dendrogram and multidimensional scaling (MDS) plot), threshold-driven consensus sequence generation and polymorphism detection, and the estimation of empirically determined sequencing quality values. As new sequencing technologies evolve, deep sequencing will become increasingly cost-efficient and the inter and intra-sample comparisons of largely homogeneous sequences will become more common. We have provided a software package and demonstrated its application on various empirically-derived datasets
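
    A minimal sketch of the position-profile idea described above, assuming one row of nucleotide counts per aligned position, with per-position Shannon entropy and a threshold-driven consensus call; the counts are toy values, and the code is not part of ANDES.

```python
import numpy as np

BASES = np.array(list("ACGT"))
# counts[i, j]: number of aligned reads with base j at genomic position i (toy values)
counts = np.array([[97.0, 1.0, 1.0, 1.0],
                   [40.0, 55.0, 3.0, 2.0],
                   [0.0, 0.0, 99.0, 1.0]])

freq = counts / counts.sum(axis=1, keepdims=True)

# Per-position Shannon entropy (variation), with 0 * log(0) treated as 0
logf = np.log2(freq, where=freq > 0, out=np.zeros_like(freq))
entropy = -(freq * logf).sum(axis=1)

# Threshold-driven consensus: call the majority base, else 'N'
threshold = 0.5
major = freq.argmax(axis=1)
consensus = np.where(freq.max(axis=1) >= threshold, BASES[major], "N")
print(entropy.round(3), "".join(consensus))
```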

  6. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful and three failed projects, are reviewed, with success and failure defined by the owner.

  7. Basic statistical tools in research and data analysis.

    PubMed

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-09-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations, and reporting of the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
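
    A small illustration of the ideas mentioned above (not taken from the article): measures of central tendency plus one parametric and one non-parametric two-group comparison, on invented data.

```python
import numpy as np
from scipy import stats

a = np.array([5.1, 4.8, 5.6, 5.0, 5.3, 4.9])
b = np.array([5.8, 6.1, 5.7, 6.0, 5.9, 6.2])

print(np.mean(a), np.median(a))                  # measures of central tendency
t, p_parametric = stats.ttest_ind(a, b)          # parametric: Student's t-test
u, p_nonparametric = stats.mannwhitneyu(a, b)    # non-parametric: Mann-Whitney U
print(p_parametric, p_nonparametric)
```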

  8. Some statistical tools for change-points detection

    NASA Astrophysics Data System (ADS)

    Lebarbier, E.

    2012-04-01

    The homogenization of climatological series sometimes amounts to finding change-points in the distribution of the observations over time. This problem is referred to as 'segmentation' in the statistical literature. Segmentation raises interesting issues in terms of statistical modeling, model selection, and algorithmics. We will give a brief overview of these issues and present several solutions that have been recently proposed. We will also consider the joint segmentation of several series. Finally, we will introduce the R package 'CGHseg' (cran.r-project.org/web/packages/cghseg/index.html), which was originally developed for biological applications and contains several useful tools for the analysis of climatological series.
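
    The tools described are in the R package 'CGHseg'; purely as a sketch of the underlying idea, the toy Python code below finds the single mean-shift change-point that minimises the within-segment sum of squares for one series.

```python
import numpy as np

def best_single_changepoint(y):
    """Return the split index minimising the within-segment sum of squares."""
    best_k, best_cost = None, np.inf
    for k in range(1, len(y)):
        left, right = y[:k], y[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k, best_cost

y = np.concatenate([np.random.normal(0.0, 1.0, 100),
                    np.random.normal(2.5, 1.0, 80)])   # simulated series with one shift
print(best_single_changepoint(y))
```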

  9. Novel statistical tools for monitoring the safety of marketed drugs.

    PubMed

    Almenoff, J S; Pattishall, E N; Gibbs, T G; DuMouchel, W; Evans, S J W; Yuen, N

    2007-08-01

    Robust tools for monitoring the safety of marketed therapeutic products are of paramount importance to public health. In recent years, innovative statistical approaches have been developed to screen large post-marketing safety databases for adverse events (AEs) that occur with disproportionate frequency. These methods, known variously as quantitative signal detection, disproportionality analysis, or safety data mining, facilitate the identification of new safety issues or possible harmful effects of a product. In this article, we describe the statistical concepts behind these methods, as well as their practical application to monitoring the safety of pharmaceutical products using spontaneous AE reports. We also provide examples of how these tools can be used to identify novel drug interactions and demographic risk factors for adverse drug reactions. Challenges, controversies, and frontiers for future research are discussed.
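
    One common disproportionality statistic behind such screening is the proportional reporting ratio (PRR), computed from a 2x2 table of spontaneous reports. The sketch below uses invented counts and is only one of the family of methods the article covers.

```python
a = 40      # reports: drug of interest with the adverse event of interest
b = 960     # reports: drug of interest, all other events
c = 600     # reports: all other drugs with the event
d = 98400   # reports: all other drugs, all other events

prr = (a / (a + b)) / (c / (c + d))
print(round(prr, 2))   # values well above 1 suggest a disproportionate reporting signal
```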

  10. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine if cryogenic treatment improved the life and cost effectiveness of perishable cutting tools over other treatments or coatings. Test results showed that in five of seven of the perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as when switching manufacturers of the cutting tool. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool is improperly heat treated during its origination, and (3) cryogenic treatment was only effective on those tools made from less sophisticated high speed tool steels. As a part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  11. Advanced Human Factors Engineering Tool Technologies.

    DTIC Science & Technology

    1988-03-01

    representing the government, the military, academe, and private industry were surveyed to identify those tools that are most frequently used or viewed ... tools by HFE researchers and practitioners within the academic, industrial, and military settings. ... E. Human Factors Engineering Tools Questionnaire ... F. Listing of Industry, Government, and Academe

  12. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
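
    As one example of the kind of calculation such an SPC course covers, the sketch below builds an individuals control chart with 3-sigma limits estimated from the average moving range; the data are invented, and the code is not part of the TEAM module itself.

```python
import numpy as np

x = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.1, 10.0])

moving_range = np.abs(np.diff(x))
sigma_hat = moving_range.mean() / 1.128          # d2 constant for subgroups of size 2

center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
out_of_control = (x > ucl) | (x < lcl)
print(round(center, 2), round(ucl, 2), round(lcl, 2), out_of_control.any())
```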

  13. The EU-ADR Web Platform: delivering advanced pharmacovigilance tools.

    PubMed

    Oliveira, José Luis; Lopes, Pedro; Nunes, Tiago; Campos, David; Boyer, Scott; Ahlberg, Ernst; van Mulligen, Erik M; Kors, Jan A; Singh, Bharat; Furlong, Laura I; Sanz, Ferran; Bauer-Mehren, Anna; Carrascosa, Maria C; Mestres, Jordi; Avillach, Paul; Diallo, Gayo; Díaz Acedo, Carlos; van der Lei, Johan

    2013-05-01

    Pharmacovigilance methods have advanced greatly during the last decades, making post-market drug assessment an essential drug evaluation component. These methods mainly rely on the use of spontaneous reporting systems and health information databases to collect expertise from huge amounts of real-world reports. The EU-ADR Web Platform was built to further facilitate accessing, monitoring and exploring these data, enabling an in-depth analysis of adverse drug reactions risks. The EU-ADR Web Platform exploits the wealth of data collected within a large-scale European initiative, the EU-ADR project. Millions of electronic health records, provided by national health agencies, are mined for specific drug events, which are correlated with literature, protein and pathway data, resulting in a rich drug-event dataset. Next, advanced distributed computing methods are tailored to coordinate the execution of data-mining and statistical analysis tasks. This permits obtaining a ranked drug-event list, removing spurious entries and highlighting relationships with high risk potential. The EU-ADR Web Platform is an open workspace for the integrated analysis of pharmacovigilance datasets. Using this software, researchers can access a variety of tools provided by distinct partners in a single centralized environment. Besides performing standalone drug-event assessments, they can also control the pipeline for an improved batch analysis of custom datasets. Drug-event pairs can be substantiated and statistically analysed within the platform's innovative working environment. A pioneering workspace that helps in explaining the biological path of adverse drug reactions was developed within the EU-ADR project consortium. This tool, targeted at the pharmacovigilance community, is available online at https://bioinformatics.ua.pt/euadr/. Copyright © 2012 John Wiley & Sons, Ltd.

  14. Surface Evaluation by Estimation of Fractal Dimension and Statistical Tools

    PubMed Central

    Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the structured data complexity and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and with software modification is able to analyse data in a form of sequence (signals, surface roughness), 2D images, and dividing lines. The methodology had not been tested for a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. The paper intends to analyse data measured by a surface roughness tester. The methodology shown compares standard and nonstandard parameters, searches the optimal parameters for a complete analysis, and specifies the sensitivity to directionality of samples for these types of surfaces. The text presents application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (statistical tool). PMID:25250380

  15. Surface evaluation by estimation of fractal dimension and statistical tools.

    PubMed

    Hotar, Vlastimil; Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the structured data complexity and applied it in development and industrial practice. The methodology uses fractal dimension together with statistical tools and with software modification is able to analyse data in a form of sequence (signals, surface roughness), 2D images, and dividing lines. The methodology had not been tested for a relatively large collection of data. For this reason, samples with structured surfaces produced with different technologies and properties were measured and evaluated with many types of parameters. The paper intends to analyse data measured by a surface roughness tester. The methodology shown compares standard and nonstandard parameters, searches the optimal parameters for a complete analysis, and specifies the sensitivity to directionality of samples for these types of surfaces. The text presents application of fractal geometry (fractal dimension) for complex surface analysis in combination with standard roughness parameters (statistical tool).
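
    A toy sketch of the fractal-dimension part of such a methodology: box counting on the graph of a roughness profile, with the dimension taken as the slope of log(box count) versus log(boxes per axis). This is a simplified illustration on simulated data, not the authors' software.

```python
import numpy as np

def box_counting_dimension(y, sizes=(2, 4, 8, 16, 32)):
    """Rough fractal-dimension estimate of a profile's graph by box counting."""
    xs = np.linspace(0.0, 1.0, len(y))
    ys = (y - y.min()) / (y.max() - y.min() + 1e-12)   # normalise into the unit square
    counts = []
    for s in sizes:
        ix = np.minimum((xs * s).astype(int), s - 1)   # box column of each sample
        iy = np.minimum((ys * s).astype(int), s - 1)   # box row of each sample
        counts.append(len(set(zip(ix.tolist(), iy.tolist()))))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

rough = np.cumsum(np.random.normal(size=1024))   # placeholder roughness profile
print(box_counting_dimension(rough))
```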

  16. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
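
    A toy sketch of the single-mediator analysis mentioned above (simulated data, not prevention data): the indirect effect is estimated as the product of the X-to-M path and the M-to-Y path controlling for X.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                    # e.g. program exposure
m = 0.5 * x + rng.normal(size=n)          # mediator
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

a = np.polyfit(x, m, 1)[0]                          # X -> M path
design = np.column_stack([np.ones(n), m, x])
b = np.linalg.lstsq(design, y, rcond=None)[0][1]    # M -> Y path, controlling for X
print(a * b)                                        # estimated indirect (mediated) effect
```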

  17. New statistical tools for analyzing the structure of animal groups.

    PubMed

    Cavagna, Andrea; Cimarelli, Alessio; Giardina, Irene; Orlandi, Alberto; Parisi, Giorgio; Procaccini, Andrea; Santagati, Raffaele; Stefanini, Fabio

    2008-01-01

    The statistical characterization of the spatial structure of large animal groups has been very limited so far, mainly due to a lack of empirical data, especially in three dimensions (3D). Here we focus on the case of large flocks of starlings (Sturnus vulgaris) in the field. We reconstruct the 3D positions of individual birds within flocks of up to a few thousand elements. In this respect our data constitute a unique set. We perform a statistical analysis of flocks' structure by using two quantities that are new to the field of collective animal behaviour, namely the conditional density and the pair correlation function. These tools were originally developed in the context of condensed matter theory. We explain the meaning of these two quantities, how to measure them in a reliable way, and why they are useful in assessing the density fluctuations and the statistical correlations across the group. We show that the border-to-centre density gradient displayed by starling flocks gives rise to an anomalous behaviour of the conditional density. We also find that the pair correlation function has a structure incompatible with a crystalline arrangement of birds. In fact, our results suggest that flocks are somewhat intermediate between the liquid and the gas phase of physical systems.
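
    As a rough illustration of the pair correlation function mentioned above, the sketch below estimates g(r) for a 3D point cloud by comparing the observed pair-distance histogram with the expectation for a homogeneous configuration of the same mean density. It ignores the border corrections that the authors treat carefully, and the synthetic "flock" positions are purely hypothetical.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def pair_correlation(points, r_max, n_bins=40):
        """Rough estimate of g(r) for 3D positions, ignoring border corrections."""
        points = np.asarray(points)
        n = len(points)
        box = points.max(axis=0) - points.min(axis=0)
        density = n / np.prod(box)                       # mean number density
        edges = np.linspace(0.0, r_max, n_bins + 1)
        counts, _ = np.histogram(pdist(points), bins=edges)
        shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
        expected = 0.5 * n * density * shell_vol         # pairs expected for an ideal gas
        r = 0.5 * (edges[1:] + edges[:-1])
        return r, counts / expected

    rng = np.random.default_rng(1)
    birds = rng.uniform(0, 50, size=(1000, 3))           # hypothetical 3D flock positions
    r, g = pair_correlation(birds, r_max=10)
    print(g[:5])   # close to 1 for a structureless (Poisson) configuration
    ```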

  18. Advanced REACH Tool: A Bayesian Model for Occupational Exposure Assessment

    PubMed Central

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W.; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
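
    The general flavour of combining a prior exposure estimate with context-specific measurements can be sketched with a conjugate Bayesian update on the log scale. This is not the ART model itself; the lognormal assumption, the prior values, and the measurement data below are all illustrative.

    ```python
    import numpy as np
    from scipy import stats

    # Prior on the log geometric mean of exposure (mg/m^3), e.g. from a mechanistic model.
    prior_mu, prior_sd = np.log(0.5), 0.8          # hypothetical values
    gsd_log = 0.6                                  # assumed known log-scale spread of exposures

    measurements = np.array([0.35, 0.62, 0.48, 0.55])   # hypothetical shift measurements
    log_x = np.log(measurements)

    # Conjugate normal update for the mean of log-exposure (known-variance case).
    n = len(log_x)
    post_var = 1.0 / (1.0 / prior_sd**2 + n / gsd_log**2)
    post_mu = post_var * (prior_mu / prior_sd**2 + log_x.sum() / gsd_log**2)

    # Posterior draws for the 95th percentile of the exposure distribution.
    z95 = stats.norm.ppf(0.95)
    draws = np.random.default_rng(2).normal(post_mu, np.sqrt(post_var), 20000)
    p95 = np.exp(draws + z95 * gsd_log)
    print("95th percentile:", np.median(p95), "90% CrI:", np.percentile(p95, [5, 95]))
    ```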

  19. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  20. Statistical tools for prognostics and health management of complex systems

    SciTech Connect

    Collins, David H; Huzurbazar, Aparna V; Anderson - Cook, Christine M

    2010-01-01

    Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
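
    A minimal sketch of the kind of prediction described above, assuming Weibull component lifetimes and a series system, is shown below; the component parameters are hypothetical, and the conditional survival calculation stands in for the richer covariate- and sensor-informed models discussed in the report.

    ```python
    import numpy as np

    def weibull_survival(t, shape, scale):
        """S(t) for a Weibull component."""
        return np.exp(-(t / scale) ** shape)

    # Hypothetical component parameters (shape, scale in hours).
    components = [(1.5, 2000.0), (2.2, 3500.0), (0.9, 5000.0)]

    def system_survival(t):
        """Series system: all components must survive."""
        return np.prod([weibull_survival(t, b, a) for b, a in components])

    t_now, horizon = 1000.0, 500.0
    # Probability of being operational at a future time, given survival to now.
    p_future = system_survival(t_now + horizon) / system_survival(t_now)
    print(f"P(operational at t+{horizon:.0f} h | operational now) = {p_future:.3f}")
    ```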

  1. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  2. Enhanced bio-manufacturing through advanced multivariate statistical technologies.

    PubMed

    Martin, E B; Morris, A J

    2002-11-13

    The paper describes the interrogation of data, from a reaction vessel producing an active pharmaceutical ingredient (API), using advanced multivariate statistical techniques. Due to the limited number of batches available, data augmentation was used to increase the number of batches, thereby enabling the extraction of more subtle process behaviour from the data. A second methodology investigated was that of multi-group modelling. This allowed between-cluster variability to be removed, thus allowing attention to focus on within-process variability. The paper describes how the different approaches enabled a better understanding of the factors causing the onset of impurity formation to be obtained, as well as demonstrating the power of multivariate statistical data analysis techniques to provide an enhanced understanding of the process.
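
    One of the workhorse techniques behind such batch analyses is principal component analysis of the batch data with a multivariate control limit. The sketch below (synthetic data; two latent variables and a Hotelling's T-squared limit chosen purely for illustration) shows the general pattern rather than the authors' specific models.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    X = rng.normal(size=(30, 12))       # 30 batches x 12 unfolded process variables (hypothetical)
    Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

    # PCA via SVD; keep the first two latent variables.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :2] * S[:2]
    explained = S[:2] ** 2 / np.sum(S ** 2)

    # Hotelling's T^2 for each batch in the reduced space, with an F-based control limit.
    n, a = X.shape[0], 2
    t2 = np.sum(scores**2 / np.var(scores, axis=0, ddof=1), axis=1)
    limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)
    print("explained variance:", explained.round(2), "suspect batches:", np.where(t2 > limit)[0])
    ```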

  3. Evaluation of air quality in a megacity using statistics tools

    NASA Astrophysics Data System (ADS)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2017-03-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region. Meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the previously defined air basins were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important for air quality management.
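
    The nonparametric comparisons named in the abstract can be set up in a few lines; the sketch below applies the Kruskal-Wallis and Mann-Whitney tests to synthetic seasonal PM2.5 samples. The data and group structure are hypothetical and serve only to show how such tests are arranged.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # Hypothetical daily PM2.5 concentrations (ug/m3) grouped by season at one site.
    pm25 = {s: rng.lognormal(mean=2.4, sigma=0.5, size=60)
            for s in ["summer", "autumn", "winter", "spring"]}

    h, p = stats.kruskal(*pm25.values())
    print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")   # large p -> no seasonal effect detected

    # Pairwise follow-up with the Mann-Whitney U test (e.g. winter vs. summer).
    u, p_mw = stats.mannwhitneyu(pm25["winter"], pm25["summer"])
    print(f"Mann-Whitney U = {u:.0f}, p = {p_mw:.3f}")
    ```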

  4. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  5. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  6. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  7. SE Tools Overview & Advanced Systems Engineering Lab (ASEL)

    DTIC Science & Technology

    2011-06-02

    Unclassified/FOUO. SE Tools Overview & Advanced Systems Engineering Lab (ASEL). Pradeep Mendonza, TARDEC Systems Engineering Group. Agenda: Systems Engineering Capabilities; Interactive Reference Guide (IRG); SEG COTS Tools; What is ASEL?; Snapshots of the Decision Management Tool.

  8. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  9. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  10. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  11. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  12. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003)...

  13. Advanced machine tools, loading systems viewed

    NASA Astrophysics Data System (ADS)

    Kharkov, V. I.

    1986-03-01

    The machine-tooling complex built from a revolving lathe and a two-armed robot designed to machine short revolving bodies including parts with curvilinear and threaded surfaces from piece blanks in either small-series or series multiitem production is described. The complex consists of: (1) a model 1V340F30 revolving lathe with a vertical axis of rotation, 8-position revolving head on a cross carriage and an Elektronika NTs-31 on-line control system; (2) a gantry-style two-armed M20-Ts robot with a 20-kilogram (20 x 2) load capacity; and (3) an 8-position indexable blank table, one of whose positions is for initial unloading of finished parts. Subsequently, machined parts are set onto the position into which all of the blanks are unloaded. Complex enclosure allows adjustment and process correction during maintenance and convenient observation of the machining process.

  14. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service's (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. LCAT's main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs that could be easily incorporated into their own analysis and/or delivery systems. Presently, we have identified five existing requirements for local climate: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation. The local rate of climate change is defined as the slope of the mean trend estimated from the ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially
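
    Two of the trend estimators listed above can be illustrated with a short sketch: a least-squares linear trend and a running-mean departure in the spirit of Optimal Climate Normals. The synthetic temperature series and the 10-year averaging window are assumptions made for the example, not values used by LCAT.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    years = np.arange(1950, 2011)
    # Synthetic annual temperature anomalies (degC) with a weak warming signal.
    temp = 0.02 * (years - years[0]) + rng.normal(0, 0.3, size=years.size)

    # (1) Linear trend: least-squares slope expressed in degrees per decade.
    slope = np.polyfit(years, temp, 1)[0] * 10.0

    # (2) Running-mean trend in the spirit of Optimal Climate Normals:
    #     difference between the most recent k-year mean and the long-term mean.
    k = 10                                   # hypothetical averaging period
    ocn_trend = temp[-k:].mean() - temp.mean()

    print(f"linear trend: {slope:.2f} degC/decade, recent {k}-yr departure: {ocn_trend:.2f} degC")
    ```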

  15. Advanced tribology design tools for space mechanisms

    NASA Astrophysics Data System (ADS)

    Roberts, E. W.; Lewis, S. D.

    2001-09-01

    The purpose of this paper is to report on the current status of, and updates to, three well-established ESTL/TM design aids and tools which are frequently used in the design of spacecraft mechanisms. The design aids covered are: - Space Tribology Handbook - DOLLS: a database on space oils and greases - CABARET: a ball bearing analysis code. The Space Tribology Handbook has become established as the definitive guide to space tribology. This paper reports on updates made to the Handbook and the plans to incorporate it into ECSS Guidelines. The database known as DOLLS provides the fundamental information needed for selection of a fluid lubricant for space applications. The database is being upgraded to include details on new oils and greases and, where available, new data on the characteristics of listed fluid lubricants. The bearing analysis code, CABARET, allows the prediction of bearing performance for a range of applications from low-speed mechanisms to high-speed turbo-pumps. Its predictive capabilities include torque, contact stress, stiffness thermal effects, cage motion, and fatigue life. Each design aid and its current status are discussed further.

  16. Tool for Statistical Analysis and Display of Landing Sites

    NASA Technical Reports Server (NTRS)

    Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John

    2006-01-01

    MarsLS is a software tool for analyzing statistical dispersion of spacecraft-landing sites and displaying the results of its analyses. Originally intended for the Mars Exploration Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data (see figure) on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
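
    The landing-site ellipse described above can be constructed from Monte Carlo dispersion points under a bivariate-normal assumption. The sketch below is a generic version of that construction, not MarsLS code; the simulated landing points and the 99% containment level are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # Hypothetical Monte Carlo landing points (longitude, latitude) in degrees.
    pts = rng.multivariate_normal([175.47, -14.57], [[4e-4, 1e-4], [1e-4, 2e-4]], size=3000)

    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)

    # Scale factor so the ellipse contains a point with 99% probability (chi-square, 2 dof).
    scale = np.sqrt(stats.chi2.ppf(0.99, df=2))
    semi_axes = scale * np.sqrt(eigval)                               # in degrees
    angle = np.degrees(np.arctan2(eigvec[1, -1], eigvec[0, -1]))      # major-axis orientation

    print("centre:", mean.round(3), "semi-axes (deg):", semi_axes.round(4), "angle:", round(angle, 1))
    ```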

  17. Advanced Mathematical Tools in Metrology III

    NASA Astrophysics Data System (ADS)

    Ciarlini, P.

    The Table of Contents for the book is as follows: * Foreword * Invited Papers * The ISO Guide to the Expression of Uncertainty in Measurement: A Bridge between Statistics and Metrology * Bootstrap Algorithms and Applications * The TTRSs: 13 Oriented Constraints for Dimensioning, Tolerancing & Inspection * Graded Reference Data Sets and Performance Profiles for Testing Software Used in Metrology * Uncertainty in Chemical Measurement * Mathematical Methods for Data Analysis in Medical Applications * High-Dimensional Empirical Linear Prediction * Wavelet Methods in Signal Processing * Software Problems in Calibration Services: A Case Study * Robust Alternatives to Least Squares * Gaining Information from Biomagnetic Measurements * Full Papers * Increase of Information in the Course of Measurement * A Framework for Model Validation and Software Testing in Regression * Certification of Algorithms for Determination of Signal Extreme Values during Measurement * A Method for Evaluating Trends in Ozone-Concentration Data and Its Application to Data from the UK Rural Ozone Monitoring Network * Identification of Signal Components by Stochastic Modelling in Measurements of Evoked Magnetic Fields from Peripheral Nerves * High Precision 3D-Calibration of Cylindrical Standards * Magnetic Dipole Estimations for MCG-Data * Transfer Functions of Discrete Spline Filters * An Approximation Method for the Linearization of Tridimensional Metrology Problems * Regularization Algorithms for Image Reconstruction from Projections * Quality of Experimental Data in Hydrodynamic Research * Stochastic Drift Models for the Determination of Calibration Intervals * Short Communications * Projection Method for Lidar Measurement * Photon Flux Measurements by Regularised Solution of Integral Equations * Correct Solutions of Fit Problems in Different Experimental Situations * An Algorithm for the Nonlinear TLS Problem in Polynomial Fitting * Designing Axially Symmetric Electromechanical Systems of

  18. Recent advances in systems metabolic engineering tools and strategies.

    PubMed

    Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup

    2017-10-01

    Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials to achieve sustainable chemical industry. Nowadays, many tools and strategies are available for performing systems metabolic engineering that allows systems-level metabolic engineering in more sophisticated and diverse ways by adopting rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
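
    The following sketch shows the general form of a likelihood ratio comparison between two samples of similarity scores under a simple normal model with a common spread; it is only a schematic stand-in for the authors' statistic, and the scores are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical similarity scores from profilometry comparisons.
    field_vs_lab = np.array([0.61, 0.58, 0.66, 0.55, 0.63])      # field mark vs. lab marks
    lab_vs_lab = np.array([0.71, 0.69, 0.74, 0.68, 0.72, 0.70])  # lab-to-lab comparisons

    def loglik(x, mu, sigma):
        return np.sum(stats.norm.logpdf(x, mu, sigma))

    both = np.concatenate([field_vs_lab, lab_vs_lab])
    s = both.std(ddof=0)                                 # common spread, kept fixed for simplicity

    # H0: both samples share one mean (consistent with a match); H1: separate means.
    ll0 = loglik(both, both.mean(), s)
    ll1 = loglik(field_vs_lab, field_vs_lab.mean(), s) + loglik(lab_vs_lab, lab_vs_lab.mean(), s)
    lr = -2.0 * (ll0 - ll1)
    p = stats.chi2.sf(lr, df=1)
    print(f"-2 log LR = {lr:.2f}, p = {p:.4f}")   # small p: the field mark behaves differently
    ```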

  20. Toward Understanding the Role of Technological Tools in Statistical Learning.

    ERIC Educational Resources Information Center

    Ben-Zvi, Dani

    2000-01-01

    Begins with some context setting on new views of statistics and statistical education reflected in the introduction of exploratory data analysis (EDA) into the statistics curriculum. Introduces a detailed example of an EDA learning activity in the middle school that makes use of the power of the spreadsheet to mediate students' construction of…

  1. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2012-08-01

    this problem with a fingerprinting algorithm that inverts for target location and orientation while holding polarizations fixed at their library values... "Cross-Domain Multitask Learning with Latent Probit Models," Proc. Int. Conf. Machine Learning (ICML), 2012; L. Beran, S.D. Billings and D. Oldenburg.

  2. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  3. Machine Tool Advanced Skills Technology Program (MAST). Overview and Methodology.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology Program (MAST) is a geographical partnership of six of the nation's best two-year colleges located in the six states that have about one-third of the density of metals-related industries in the United States. The purpose of the MAST grant is to develop and implement a national training model to overcome…

  4. Microfield exposure tool enables advances in EUV lithography development

    SciTech Connect

    Naulleau, Patrick

    2009-09-07

    With demonstrated resist resolution of 20 nm half pitch, the SEMATECH Berkeley EUV microfield exposure tool continues to push crucial advances in the areas of EUV resists and masks. The ever progressing shrink in computer chip feature sizes has been fueled over the years by a continual reduction in the wavelength of light used to pattern the chips. Recently, this trend has been threatened by unavailability of lens materials suitable for wavelengths shorter than 193 nm. To circumvent this roadblock, a reflective technology utilizing a significantly shorter extreme ultraviolet (EUV) wavelength (13.5 nm) has been under development for the past decade. The dramatic wavelength shrink was required to compensate for optical design limitations intrinsic in mirror-based systems compared to refractive lens systems. With this significant reduction in wavelength comes a variety of new challenges including developing sources of adequate power, photoresists with suitable resolution, sensitivity, and line-edge roughness characteristics, as well as the fabrication of reflection masks with zero defects. While source development can proceed in the absence of available exposure tools, in order for progress to be made in the areas of resists and masks it is crucial to have access to advanced exposure tools with resolutions equal to or better than that expected from initial production tools. These advanced development tools, however, need not be full field tools. Also, implementing such tools at synchrotron facilities allows them to be developed independent of the availability of reliable stand-alone EUV sources. One such tool is the SEMATECH Berkeley microfield exposure tool (MET). The most unique attribute of the SEMATECH Berkeley MET is its use of a custom-coherence illuminator made possible by its implementation on a synchrotron beamline. With only conventional illumination and conventional binary masks, the resolution limit of the 0.3-NA optic is approximately 25 nm, however

  5. Tools for Assessing Readability of Statistics Teaching Materials

    ERIC Educational Resources Information Center

    Lesser, Lawrence; Wagler, Amy

    2016-01-01

    This article provides tools and rationale for instructors in math and science to make their assessment and curriculum materials (more) readable for students. The tools discussed (MSWord, LexTutor, Coh-Metrix TEA) are readily available linguistic analysis applications that are grounded in current linguistic theory, but present output that can…

  7. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules are guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  8. Children's Services Statistical Neighbour Benchmarking Tool. Practitioner User Guide

    ERIC Educational Resources Information Center

    National Foundation for Educational Research, 2007

    2007-01-01

    Statistical neighbour models provide one method for benchmarking progress. For each local authority (LA), these models designate a number of other LAs deemed to have similar characteristics. These designated LAs are known as statistical neighbours. Any LA may compare its performance (as measured by various indicators) against its statistical…

  9. Automated Reshelving Statistics as a Tool in Reference Collection Management.

    ERIC Educational Resources Information Center

    Welch, Jeanie M.; Cauble, Lynn A.; Little, Lara B.

    1997-01-01

    Discusses implementation of the automated recording of reshelving statistics for print reference materials and the use of these statistics in reference-collection development and management, especially in making acquisitions and weeding decisions, based on experiences at the University of North Carolina, Charlotte. (Author/LRW)

  10. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  11. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  13. Knowledge, Models and Tools in Support of Advanced Distance Learning

    DTIC Science & Technology

    2006-06-01

    authoring iRides simulations and training, Rivets, is a fast C++ program that has been compiled for three Unix-type operating systems: Linux, Silicon...School instructors to introduce core concepts of the tool in advance of teaching about expected value theory. 4.0 Rivets - Linux-based Authoring of...Simulations and Instruction. Functioning versions of Rivets, a descendant of the classic RIDES program, have been developed for Linux and for the Macintosh.

  14. Bacteriophage-based tools: recent advances and novel applications

    PubMed Central

    O'Sullivan, Lisa; Buttimer, Colin; McAuliffe, Olivia; Bolton, Declan; Coffey, Aidan

    2016-01-01

    Bacteriophages (phages) are viruses that infect bacterial hosts, and since their discovery over a century ago they have been primarily exploited to control bacterial populations and to serve as tools in molecular biology. In this commentary, we highlight recent diverse advances in the field of phage research, going beyond bacterial control using whole phage, to areas including biocontrol using phage-derived enzybiotics, diagnostics, drug discovery, novel drug delivery systems and bionanotechnology. PMID:27990274

  15. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display Systems (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 nm standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was

  16. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  17. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    ERIC Educational Resources Information Center

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…

  18. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  19. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools for application to advanced fault-tolerant aerospace systems were evaluated. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (the difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.

  20. Key stakeholders' perspectives on a Web-based advance care planning tool for advanced lung disease.

    PubMed

    Chiarchiaro, Jared; Ernecoff, Natalie C; Buddadhumaruk, Praewpannarai; Rak, Kimberly J; Arnold, Robert M; White, Douglas B

    2015-12-01

    There is a paucity of scalable advance care planning strategies that achieve the diverse goals of patients, families, and clinicians. We convened key stakeholders to gain their perspectives on developing a Web-based advance care planning tool for lung disease. We conducted semistructured interviews with 50 stakeholders: 21 patients with lung disease, 18 surrogates, and 11 clinicians. Interviews explored stakeholders' desired content and design features of a Web-based advance care planning tool. Participants also rated the tool's acceptability and potential usefulness. We analyzed the interviews with modified grounded theory and validated themes through member checking. Stakeholders highly rated the acceptability (median, 5; interquartile range, 5-5) and potential usefulness (median, 5; interquartile range, 4-5) of a Web-based tool. Interviewees offered several suggestions: (1) use videos of medical scenarios and patient narratives rather than text, (2) include interactive content, and (3) allow the user control over how much they complete in 1 sitting. Participants identified challenges and potential solutions, such as how to manage the emotional difficulty of thinking about death and accommodate low computer literacy users. There is strong stakeholder support for the development of a Web-based advance care planning tool for lung disease. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.

  2. Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics

    DTIC Science & Technology

    2008-10-01

    AFRL-RH-WP-TR-2009-0110. Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics. Contract number FA8650-07-1-6848. 1.0 SUMMARY: This final technical report describes the research findings of the project Simulation Modeling and Statistical Network

  3. Statistical Analysis of Noisy Signals Using Classification Tools

    SciTech Connect

    Thompson, Sandra E.; Heredia-Langner, Alejandro; Johnson, Timothy J.; Foster, Nancy S.; Valentine, Nancy B.; Amonette, James E.

    2005-06-04

    The potential use of chemicals, biotoxins and biological pathogens is a threat to military and police forces as well as the general public. Rapid identification of these agents is made difficult by the noisy nature of the signal that can be obtained from portable, in-field sensors. In previously published articles, we created a flowchart that illustrated a method for triaging bacterial identification by combining standard statistical techniques for discrimination and identification with mid-infrared spectroscopic data. The present work documents the process of characterizing and eliminating the sources of the noise and outlines how multidisciplinary teams are necessary to accomplish that goal.

  4. Advances in Testing the Statistical Significance of Mediation Effects

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.

    2006-01-01

    P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
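
    One alternative commonly discussed in this literature is bootstrapping the indirect effect a*b. The sketch below illustrates that idea on synthetic data; the variable names, effect sizes, and number of bootstrap resamples are assumptions made for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200
    x = rng.normal(size=n)                       # predictor (e.g. program exposure)
    m = 0.5 * x + rng.normal(size=n)             # mediator
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome

    def slope(pred, resp):
        """Last coefficient of an OLS fit of resp on [1, pred...]."""
        X = np.column_stack([np.ones(len(resp))] + list(np.atleast_2d(pred)))
        return np.linalg.lstsq(X, resp, rcond=None)[0][-1]

    def indirect(idx):
        a = slope(x[idx], m[idx])                        # x -> m path
        b = slope(np.vstack([x[idx], m[idx]]), y[idx])   # m -> y path, controlling for x
        return a * b

    boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect a*b: {indirect(np.arange(n)):.3f}, 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
    ```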

  5. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    SciTech Connect

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  6. Statistical, economic and other tools for assessing natural aggregate

    USGS Publications Warehouse

    Bliss, J.D.; Moyle, P.R.; Bolm, K.S.

    2003-01-01

    Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers and those who make decisions about land allocation, which may have long-term implications concerning cost and the availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classification and statistical distributions or, in other words, models with geological, geotechnical and economic variables or interrelationships between these variables. These models can be used with subjective estimates, if needed, to estimate how much aggregate may be present in a region or country using distributions generated by Monte Carlo computer simulations.
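
    A minimal version of the Monte Carlo simulation mentioned above propagates uncertainty in deposit area, thickness, and bulk density to a tonnage distribution. All of the input distributions in the sketch below are hypothetical placeholders, not values from an actual assessment.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n = 100_000

    # Hypothetical input distributions for one sand-and-gravel deposit.
    area_m2 = rng.lognormal(mean=np.log(5e5), sigma=0.4, size=n)    # deposit area
    thickness_m = rng.triangular(2.0, 6.0, 12.0, size=n)            # aggregate thickness
    density_t_m3 = rng.normal(1.9, 0.1, size=n)                     # in-place bulk density

    tonnage = area_m2 * thickness_m * density_t_m3
    p10, p50, p90 = np.percentile(tonnage, [10, 50, 90])
    print(f"tonnage (Mt): P10={p10/1e6:.1f}, P50={p50/1e6:.1f}, P90={p90/1e6:.1f}")
    ```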

  7. Rapid development and optimization of tablet manufacturing using statistical tools.

    PubMed

    Fernández, Eutimio Gustavo; Cordero, Silvia; Benítez, Malvina; Perdomo, Iraelio; Morón, Yohandro; Morales, Ada Esther; Arce, Milagros Gaudencia; Cuesta, Ernesto; Lugones, Juan; Fernández, Maritza; Gil, Arturo; Valdés, Rodolfo; Fernández, Mirna

    2008-01-01

    The purpose of this paper was to develop a statistical methodology to optimize tablet manufacturing considering drug chemical and physical properties applying a crossed experimental design. The assessed model drug was dried ferrous sulphate and the variables were the hardness and the relative proportions of three excipients, binder, filler and disintegrant. Granule properties were modeled as a function of excipient proportions and tablet parameters were defined by the excipient proportion and hardness. The desirability function was applied to achieve optimal values for excipient proportions and hardness. In conclusion, crossed experimental design using hardness as the only process variable is an efficient strategy to quickly determine the optimal design process for tablet manufacturing. This method can be applied for any tablet manufacturing method.
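
    The desirability-function step can be sketched as follows: each response is mapped to a 0-1 desirability and the geometric mean is maximised over the design factors. The response models, targets, and factor ranges below are invented for illustration and do not come from the paper.

    ```python
    import numpy as np

    def desirability_target(y, low, target, high):
        """Derringer 'target is best' desirability: 1 at target, 0 outside [low, high]."""
        y = np.asarray(y, dtype=float)
        d = np.where(y <= target, (y - low) / (target - low), (high - y) / (high - target))
        return np.clip(d, 0.0, 1.0)

    # Hypothetical empirical response models fitted from a crossed design:
    # responses as functions of binder fraction and compression (hardness) setting.
    def hardness(binder, press):     return 40 + 400 * binder + 0.6 * press   # N
    def disint_time(binder, press):  return 2 + 60 * binder + 0.05 * press    # minutes

    binder = np.linspace(0.0, 0.10, 41)
    press = np.linspace(50, 150, 41)
    B, P = np.meshgrid(binder, press)

    d1 = desirability_target(hardness(B, P), 60, 90, 120)    # want roughly 90 N
    d2 = desirability_target(disint_time(B, P), 0, 5, 10)    # want roughly 5 min
    D = np.sqrt(d1 * d2)                                     # overall desirability (geometric mean)

    i, j = np.unravel_index(np.argmax(D), D.shape)
    print(f"best binder fraction={B[i, j]:.3f}, press setting={P[i, j]:.0f} N, D={D[i, j]:.2f}")
    ```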

  8. Statistical Considerations of Data Processing in Giovanni Online Tool

    NASA Technical Reports Server (NTRS)

    Suhung, Shen; Leptoukh, G.; Acker, J.; Berrick, S.

    2005-01-01

    The GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni) is a web-based interface for the rapid visualization and analysis of gridded data from a number of remote sensing instruments. The GES DISC currently employs several Giovanni instances to analyze various products, such as Ocean-Giovanni for ocean products from SeaWiFS and MODIS-Aqua; TOMS & OMI Giovanni for atmospheric chemical trace gases from TOMS and OMI; and MOVAS for aerosols from MODIS, etc. (http://giovanni.gsfc.nasa.gov). Foremost among the Giovanni statistical functions is data averaging. Two aspects of this function are addressed here. The first deals with the accuracy of averaging gridded mapped products vs. averaging from the ungridded Level 2 data. Some mapped products contain mean values only; others contain additional statistics, such as number of pixels (NP) for each grid, standard deviation, etc. Since NP varies spatially and temporally, averaging with or without weighting by NP will be different. In this paper, we address differences of various weighting algorithms for some datasets utilized in Giovanni. The second aspect is related to different averaging methods affecting data quality and interpretation for data with non-normal distribution. The present study demonstrates results of different spatial averaging methods using gridded SeaWiFS Level 3 mapped monthly chlorophyll a data. Spatial averages were calculated using three different methods: arithmetic mean (AVG), geometric mean (GEO), and maximum likelihood estimator (MLE). Biogeochemical data, such as chlorophyll a, are usually considered to have a log-normal distribution. The study determined that differences between methods tend to increase with increasing size of a selected coastal area, with no significant differences in most open oceans. The GEO method consistently produces values lower than AVG and MLE. The AVG method produces values larger than MLE in some cases, but smaller in other cases. Further
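
    The difference between the three averaging methods for log-normally distributed data can be seen in a few lines. The sketch below uses synthetic chlorophyll-a values; for such data the geometric mean is systematically lower than the arithmetic mean and the lognormal MLE, consistent with the behaviour reported above.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    chl = rng.lognormal(mean=-1.0, sigma=1.0, size=500)   # synthetic chlorophyll-a (mg m^-3)

    avg = chl.mean()                                      # arithmetic mean (AVG)
    geo = np.exp(np.log(chl).mean())                      # geometric mean (GEO)
    mu, s2 = np.log(chl).mean(), np.log(chl).var(ddof=1)
    mle = np.exp(mu + 0.5 * s2)                           # MLE of the lognormal mean

    print(f"AVG={avg:.3f}  GEO={geo:.3f}  MLE={mle:.3f}")  # GEO < MLE ~ AVG for lognormal data
    ```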

  9. Statistical tools to improve assessing agreement between several observers.

    PubMed

    Ruddat, I; Scholz, B; Bergmann, S; Buehring, A-L; Fischer, S; Manton, A; Prengel, D; Rauch, E; Steiner, S; Wiedmann, S; Kreienbrock, L; Campe, A

    2014-04-01

    In the context of assessing the impact of management and environmental factors on animal health, behaviour or performance, it has become increasingly important to conduct (epidemiological) studies in the field. Hence, the number of investigated farms per study is considerably high, so that numerous observers are needed for investigation. In order to maintain the quality and validity of study results, calibration meetings, where observers are trained and the current level of agreement is assessed, have to be conducted to minimise the observer effect. When study animals are rated independently by the same observers on a categorical variable, the exclusion test can be performed to identify disagreeing observers. This statistical test compares, for each variable and each observer, the observer-specific agreement with the overall agreement among all observers, based on kappa coefficients. It accounts for two major challenges, namely the absence of a gold-standard observer and different data types comprising ordinal, nominal and binary data. The presented methods are applied to a reliability study to assess the agreement among eight observers rating welfare parameters of laying hens. The degree to which the observers agreed depended on the investigated item (global weighted kappa coefficients: 0.37 to 0.94). The proposed method and graphical description served to assess the direction and degree to which an observer deviates from the others. It is suggested to further improve studies with numerous observers by conducting calibration meetings and accounting for observer bias.
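
    The core ingredient of the exclusion test, comparing each observer's agreement with the group against the overall agreement, can be sketched with pairwise Cohen's kappa coefficients. The ratings below are simulated and the simple screening threshold is an assumption; the published test adds a formal significance comparison.

    ```python
    import numpy as np
    from itertools import combinations

    def cohen_kappa(a, b):
        """Cohen's kappa for two raters on a nominal scale."""
        cats = np.unique(np.concatenate([a, b]))
        po = np.mean(a == b)
        pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
        return (po - pe) / (1.0 - pe)

    rng = np.random.default_rng(10)
    truth = rng.integers(0, 3, size=200)                    # welfare score of each hen (0/1/2)
    ratings = {f"obs{i}": np.where(rng.random(200) < 0.8, truth, rng.integers(0, 3, 200))
               for i in range(1, 9)}                        # 8 observers, roughly 80% accurate
    ratings["obs8"] = rng.integers(0, 3, size=200)          # one deviating observer

    pairwise = {pair: cohen_kappa(ratings[pair[0]], ratings[pair[1]])
                for pair in combinations(ratings, 2)}
    overall = np.mean(list(pairwise.values()))
    for obs in ratings:
        k_obs = np.mean([k for pair, k in pairwise.items() if obs in pair])
        flag = "  <- check" if k_obs < overall - 0.1 else ""
        print(f"{obs}: mean kappa = {k_obs:.2f} (overall {overall:.2f}){flag}")
    ```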

  10. Distortion in fingerprints: a statistical investigation using shape measurement tools.

    PubMed

    Sheets, H David; Torres, Anne; Langenburg, Glenn; Bush, Peter J; Bush, Mary A

    2014-07-01

    Friction ridge impression appearance can be affected by the type of surface touched and the pressure exerted during deposition. Understanding the magnitude of alterations, the regions affected, and any systematic or detectable changes would provide useful information. Geometric morphometric techniques were used to statistically characterize these changes. One hundred and fourteen prints were obtained from a single volunteer and impressed with heavy, normal, and light pressure on computer paper, soft gloss paper, 10-print card stock, and retabs. Six hundred prints from 10 volunteers were rolled with heavy, normal, and light pressure on soft gloss paper and 10-print card stock. Results indicate that while different substrates and pressure levels produced systematic changes in fingerprints, the changes were small in magnitude: roughly the width of one ridge. There were no detectable changes in the degree of random variability of prints associated with either pressure or substrate. In conclusion, the prints transferred reliably regardless of pressure or substrate. © 2014 American Academy of Forensic Sciences.

  11. Statistical tools to analyze continuous glucose monitor data.

    PubMed

    Clarke, William; Kovatchev, Boris

    2009-06-01

    Continuous glucose monitors (CGMs) generate data streams that are both complex and voluminous. The analyses of these data require an understanding of the physical, biochemical, and mathematical properties involved in this technology. This article describes several methods that are pertinent to the analysis of CGM data, taking into account the specifics of the continuous monitoring data streams. These methods include: (1) evaluating the numerical and clinical accuracy of CGM. We distinguish two types of accuracy metrics-numerical and clinical-each having two subtypes measuring point and trend accuracy. The addition of trend accuracy, e.g., the ability of CGM to reflect the rate and direction of blood glucose (BG) change, is unique to CGM as these new devices are capable of capturing BG not only episodically, but also as a process in time. (2) Statistical approaches for interpreting CGM data. The importance of recognizing that the basic unit for most analyses is the glucose trace of an individual, i.e., a time-stamped series of glycemic data for each person, is stressed. We discuss the use of risk assessment, as well as graphical representation of the data of a person via glucose and risk traces and Poincaré plots, and at a group level via Control Variability-Grid Analysis. In summary, a review of methods specific to the analysis of CGM data series is presented, together with some new techniques. These methods should facilitate the extraction of information from, and the interpretation of, complex and voluminous CGM time series.
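
    As one concrete example of the risk-assessment approach mentioned above, the low and high blood glucose indices (LBGI/HBGI) are commonly computed from a symmetrizing risk transform of the glucose trace. The sketch below assumes glucose values in mg/dL and uses the transform constants as this author recalls them from the published literature; they should be checked against the original references before use, and the example trace is hypothetical:

        import numpy as np

        def bg_risk_indices(bg_mgdl):
            """Low/high blood glucose indices from a CGM trace (values in mg/dL)."""
            bg = np.asarray(bg_mgdl, dtype=float)
            # Symmetrizing transform of the BG scale (constants from the literature, assumed here)
            f = 1.509 * (np.log(bg) ** 1.084 - 5.381)
            risk = 10.0 * f ** 2
            lbgi = np.mean(np.where(f < 0, risk, 0.0))   # hypoglycaemia-weighted risk
            hbgi = np.mean(np.where(f > 0, risk, 0.0))   # hyperglycaemia-weighted risk
            return lbgi, hbgi

        trace = [110, 95, 80, 70, 65, 90, 140, 180, 210, 160]   # hypothetical glucose trace
        print(bg_risk_indices(trace))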

  12. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) offer higher efficiency and larger production rates, and can be operated in deeper wells, than other geothermal artificial lift systems. Enhanced Geothermal Systems (EGS) applications call for lifting 300 °C geothermal water at an 80 kg/s flow rate in a wellbore of at most 10-5/8-inch diameter to improve cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing its predictions to experimental data from an existing ESP product.
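
    For a rough sense of the scale involved, the power needed to lift the quoted 80 kg/s can be estimated from first principles; the head and efficiency below are illustrative assumptions, not values taken from the paper:

        G = 9.81           # gravitational acceleration, m/s^2
        mass_flow = 80.0   # kg/s, from the EGS requirement quoted above
        head = 1000.0      # m of dynamic head -- assumed for illustration only
        pump_eff = 0.70    # overall pump efficiency -- assumed

        hydraulic_power_w = mass_flow * G * head        # power delivered to the fluid
        shaft_power_w = hydraulic_power_w / pump_eff    # power required at the pump shaft

        print(f"hydraulic power ~ {hydraulic_power_w/1e6:.2f} MW, "
              f"shaft power ~ {shaft_power_w/1e6:.2f} MW")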

  13. Advances in Mass Spectrometric Tools for Probing Neuropeptides

    NASA Astrophysics Data System (ADS)

    Buchberger, Amanda; Yu, Qing; Li, Lingjun

    2015-07-01

    Neuropeptides are important mediators in the functionality of the brain and other neurological organs. Because neuropeptides exist in a wide range of concentrations, appropriate characterization methods are needed to provide dynamic, chemical, and spatial information. Mass spectrometry and compatible tools have been a popular choice in analyzing neuropeptides. There have been several advances and challenges, both of which are the focus of this review. Discussions range from sample collection to bioinformatic tools, although avenues such as quantitation and imaging are included. Further development of the presented methods for neuropeptidomic mass spectrometric analysis is inevitable, which will lead to a further understanding of the complex interplay of neuropeptides and other signaling molecules in the nervous system.

  14. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  15. Mentorship: a successful tool for recruitment, recognition, and advancement.

    PubMed

    Bernice, Judy; Teixeira, Roberta

    2002-01-01

    Mentoring is a valuable tool for recruitment, retention, and staff personal satisfaction in these chaotic times. Effective staffing of health-care laboratories has become increasingly difficult as decreased numbers of qualified medical technologists graduate from medical technology programs. Laboratory managers experience loss of valuable, seasoned employees as those workers pursue fields other than the laboratory. Mentorships to assist young people in making clinical laboratory science career choices can help to expand the number of medical technologists entering the job market. Mentorships to expand the abilities of experienced technologists and to afford opportunities for advancement can enhance job satisfaction and retention.

  16. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    PubMed

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial for the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, under the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs, in contrast to the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis.
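
    A minimal sketch of the underlying idea, classifying left- versus right-limb gait variables with a support vector machine; the feature construction below is hypothetical and far simpler than in the paper, and the data are synthetic:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(42)

        # Hypothetical kinetic features (e.g., peak force, loading rate) per step
        left = rng.normal(loc=[1.00, 0.50], scale=0.05, size=(100, 2))
        right = rng.normal(loc=[1.03, 0.52], scale=0.05, size=(100, 2))   # slight asymmetry

        X = np.vstack([left, right])
        y = np.array([0] * len(left) + [1] * len(right))   # 0 = left, 1 = right

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        acc = cross_val_score(clf, X, y, cv=5).mean()

        # Accuracy near 0.5 suggests symmetric gait; accuracy well above 0.5 suggests asymmetry
        print(f"cross-validated left/right separability: {acc:.2f}")

    The design choice mirrors the paper's logic: if a classifier can reliably tell left steps from right steps, their underlying distributions differ, which is taken as evidence of asymmetry.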

  17. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre, and SiC fibre. The matrices are resin, metal, and ceramic. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, corrosion resistance, heat insulation, sound insulation, shock absorption, and high- and low-temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards, tails, etc. Their hardness is up to 62-65 HRC. Hole quality is greatly affected by the fibre laminate direction of carbon fibre reinforced composite material, owing to its anisotropy, when drilling unidirectional laminates. Burrs and splits appear at the exit because of stress concentration; in addition, delamination occurs and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing, and splitting are caused by the high stress resulting from a high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes and the difficulty of hole making in composites have also increased. This requires high-performance drills that do not introduce defects and have long tool life. It has become a trend to develop superhard material tools and tools with special geometry for drilling

  18. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre, and SiC fibre. The matrices are resin, metal, and ceramic. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, corrosion resistance, heat insulation, sound insulation, shock absorption, and high- and low-temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards, tails, etc. Their hardness is up to 62-65 HRC. Hole quality is greatly affected by the fibre laminate direction of carbon fibre reinforced composite material, owing to its anisotropy, when drilling unidirectional laminates. Burrs and splits appear at the exit because of stress concentration; in addition, delamination occurs and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing, and splitting are caused by the high stress resulting from a high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. At the same time, the number of holes and the difficulty of hole making in composites have also increased. This requires high-performance drills that do not introduce defects and have long tool life. It has become a trend to develop superhard material tools and tools with special geometry for drilling

  19. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  20. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  1. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  2. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  3. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) vehicle simulation tool written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power source) configurations. It excels at quantifying the relative change that can be expected due to the implementation of a technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented, and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  4. Advanced Reach Tool (ART): development of the mechanistic model.

    PubMed

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source-receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three different exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers for each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws. In addition, empirical data obtained from the literature were used. Where this was not possible, expert elicitation was applied. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe. In addition, several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool and, with advancing knowledge on exposure determinants, will require updates and refinements on a continuous basis, for example regarding the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and extending the applicability domain to certain types of exposures (e.g. gas, fume, and fibre exposure).
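
    The source-receptor structure described above is essentially multiplicative: the principal modifying factors scale a relative exposure score up or down. A highly simplified sketch follows; the factor names follow the paper, but the numerical multipliers are invented placeholders, not ART values:

        from math import prod

        # Hypothetical multipliers for the seven principal modifying factors
        modifying_factors = {
            "substance_emission_potential": 0.3,
            "activity_emission_potential": 1.0,
            "localized_controls": 0.1,     # e.g., local exhaust ventilation
            "segregation": 1.0,
            "personal_enclosure": 1.0,
            "surface_contamination": 1.2,
            "dispersion": 0.5,             # e.g., general ventilation of the room
        }

        relative_exposure_score = prod(modifying_factors.values())
        print(f"relative exposure score: {relative_exposure_score:.3f}")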

  5. An Advanced Tool for Control System Design and Maintenance

    SciTech Connect

    Storm, Joachim; Lohmann, Heinz

    2006-07-01

    Detailed engineering for control systems is usually supported by CAD tools that create the relevant logic diagrams, including software parameters and signal cross references. However, at this stage of the design, an early verification and validation (V&V) process for checking the functional correctness of the design is not available. The article describes the scope and capabilities of an advanced control system design tool which has the embedded capability of stand-alone simulation of complex logic structures. The tool provides the following features for constructing logic diagrams for control systems: drag-and-drop construction of logic diagrams using predefined symbol sets; a cross-reference facility; a data extraction facility; stand-alone simulation of logic diagrams featuring on-the-fly changes, signal line animation, value boxes, mini trends, etc.; creation and on-line animation of compound objects (handlers); a code generation facility for simulation; and a code generation facility for several control systems. The results of the integrated simulation-based V&V process can be used further for initial control system configuration and life-cycle management, as well as for engineering test bed applications and, finally, in full-scope replica simulators for operator training. (authors)

  6. An Advanced Decision Support Tool for Electricity Infrastructure Operations

    SciTech Connect

    Chen, Yousu; Huang, Zhenyu; Wong, Pak C.; Mackey, Patrick S.; Allwardt, Craig H.; Ma, Jian; Greitzer, Frank L.

    2010-01-31

    Electricity infrastructure, as one of the most critical infrastructures in the U.S., plays an important role in modern societies. Its failure would lead to significant disruption of people's lives, industry, and commercial activities, and would result in massive economic losses. Reliable operation of electricity infrastructure is an extremely challenging task because human operators need to consider thousands of possible configurations in near real time to choose the best option and operate the network effectively. In today's practice, electricity infrastructure operation is largely based on operators' experience with very limited real-time decision support, resulting in inadequate management of complex predictions and the inability to anticipate, recognize, and respond to situations caused by human errors, natural disasters, or cyber attacks. Therefore, a systematic approach is needed to manage the complex operational paradigms and choose the best option in a near-real-time manner. This paper proposes an advanced decision support tool for electricity infrastructure operations. The tool turns large amounts of data into actionable information to help operators monitor power grid status in real time; performs trend analysis to identify system trends at the regional or system level to help operators foresee and discern emergencies; performs clustering analysis to assist operators in identifying the relationships between system configurations and affected assets; and interactively evaluates alternative remedial actions to aid operators in making effective and timely decisions. This tool can provide significant decision support for electricity infrastructure operations and lead to better reliability in power grids. This paper presents examples with actual electricity infrastructure data to demonstrate the capability of this tool.

  7. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research.

    PubMed

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. The research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Accurate results and inferences are not possible without proper validation with various statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge about the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analyses, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and also to highlight the common areas where the maximum number of statistical errors are committed, so as to promote better statistical practices.

  8. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As in the previous year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor separately evaluated the set of 1,594 articles, and the evaluation results were merged to retain 15 articles for peer review. Results The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data mining methods for large-scale datasets of past transplants. The objective is to identify chances of survival. Conclusions The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  9. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-02-24

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some that are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the higher also is the risk for the therapist to lose control of the session and lose the patient to his or her own dark side. To avoid harming the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to get access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing--returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence: love, power, and sexuality in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient and we invite a broad cooperation in scientifically testing the efficiency of the advanced holistic

  10. Sandia Advanced MEMS Design Tools v. 3.0

    SciTech Connect

    Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.; Priddy, Brian; Westlin, Belinda; Young, Andrew

    2016-08-25

    This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  11. Sandia Advanced MEMS Design Tools, Version 2.0

    SciTech Connect

    Allen, Jim; McBrayer, John; Miller, Sam; Rodgers, Steve; montague, Steve; Sniegowski, Jeff; Jakubczak, Jay; Yarberry, Vic; Barnes, Steve; Priddy, Brian; Reyes, David; Westling, Belinda

    2002-06-13

    Sandia Advanced MEMS Design Tools is based on a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at SNL; e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  12. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  13. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes

    PubMed Central

    Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.

    2015-01-01

    Carbohydrates are biological building blocks participating in diverse and crucial processes at both the cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction, and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information not supported by any experimental evidence for the functions of the proteins encoded by these genes. In this article, we present novel instruments for statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparison of their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB website at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239

  14. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes.

    PubMed

    Egorova, K S; Kondakova, A N; Toukach, Ph V

    2015-01-01

    Carbohydrates are biological building blocks participating in diverse and crucial processes at both the cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction, and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information not supported by any experimental evidence for the functions of the proteins encoded by these genes. In this article, we present novel instruments for statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparison of their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB website at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru. © The Author(s) 2015. Published by Oxford University Press.

  15. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or in steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.
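
    Central to such trade studies is the Equivalent System Mass (ESM) metric, which folds volume, power, cooling, and crew-time requirements into a single mass-like figure. A simplified sketch is shown below; the equivalency factors and inputs are illustrative placeholders, not ALSSAT values:

        def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                                   crew_time_hr_per_yr, mission_yr,
                                   v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.5):
            """Simplified ESM: mass plus mass-equivalents of volume, power, cooling, crew time.

            The equivalency factors (kg/m^3, kg/kW, kg per crew-hour) are placeholders;
            real studies take them from mission-specific infrastructure estimates.
            """
            return (mass_kg
                    + volume_m3 * v_eq
                    + power_kw * p_eq
                    + cooling_kw * c_eq
                    + crew_time_hr_per_yr * mission_yr * ct_eq)

        print(equivalent_system_mass(mass_kg=500, volume_m3=2.0, power_kw=1.5,
                                     cooling_kw=1.5, crew_time_hr_per_yr=40, mission_yr=2))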

  16. The statistical multifragmentation model: Origins and recent advances

    NASA Astrophysics Data System (ADS)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions of over 10 years before, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  17. The statistical multifragmentation model: Origins and recent advances

    SciTech Connect

    Donangelo, R.; Souza, S. R.

    2016-07-07

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions of over 10 years before, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  18. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; & Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation; added a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  19. Risk of Advanced Neoplasia Using the National Cancer Institute's Colorectal Cancer Risk Assessment Tool.

    PubMed

    Imperiale, Thomas F; Yu, Menggang; Monahan, Patrick O; Stump, Timothy E; Tabbey, Rebeka; Glowinski, Elizabeth; Ransohoff, David F

    2017-01-01

    There is no validated, discriminating, and easy-to-apply tool for estimating the risk of colorectal neoplasia. We studied whether the National Cancer Institute's (NCI's) Colorectal Cancer (CRC) Risk Assessment Tool, which estimates future CRC risk, could estimate current risk for advanced colorectal neoplasia among average-risk persons. This cross-sectional study involved individuals age 50 to 80 years undergoing first-time screening colonoscopy. We measured medical and family history, lifestyle information, and physical measures and calculated each person's future CRC risk using the NCI tool's logistic regression equation. We related quintiles of future CRC risk to the current risk of advanced neoplasia (sessile serrated polyp or tubular adenoma ≥ 1 cm, a polyp with villous histology or high-grade dysplasia, or CRC). All statistical tests were two-sided. For the 4457 participants (98.5%) with complete data (mean age = 57.2 years, SD = 6.6 years, 51.7% women), advanced neoplasia prevalence was 8.26%. Based on quintiles of five-year estimated absolute CRC risk, current risks of advanced neoplasia were 2.1% (95% confidence interval [CI] = 1.3% to 3.3%), 4.8% (95% CI = 3.5% to 6.4%), 6.4% (95% CI = 4.9% to 8.2%), 10.0% (95% CI = 8.1% to 12.1%), and 17.6% (95% CI = 15.5% to 20.6%; P < .001). For quintiles of estimated 10-year CRC risk, corresponding current risks for advanced neoplasia were 2.2% (95% CI = 1.4% to 3.5%), 4.8% (95% CI = 3.5% to 6.4%), 6.5% (95% CI = 5.0% to 8.3%), 9.3% (95% CI = 7.5% to 11.4%), and 18.4% (95% CI = 15.9% to 21.1%; P < .001). Among persons with an estimated five-year CRC risk above the median, current risk for advanced neoplasia was 12.8%, compared with 3.7% among those below the median (relative risk = 3.4, 95% CI = 2.7 to 4.4). The NCI's Risk Assessment Tool, which estimates future CRC risk, may be used to estimate current risk for advanced neoplasia, making it potentially useful for tailoring and improving CRC
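
    The core of the analysis is a stratification of current advanced-neoplasia prevalence by quintile of the tool's predicted future CRC risk. A sketch of that tabulation with pandas follows; the column names and cohort data are hypothetical, generated only to show the mechanics:

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        n = 4457

        # Hypothetical cohort: predicted 5-year CRC risk and observed advanced neoplasia
        df = pd.DataFrame({
            "predicted_risk": rng.beta(2, 60, size=n),
            "advanced_neoplasia": rng.random(n) < 0.0826,
        })

        df["risk_quintile"] = pd.qcut(df["predicted_risk"], q=5, labels=[1, 2, 3, 4, 5])
        prevalence = df.groupby("risk_quintile", observed=True)["advanced_neoplasia"].mean()
        print(prevalence)    # prevalence of advanced neoplasia within each risk quintile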

  20. Advanced statistics: applying statistical process control techniques to emergency medicine: a primer for providers.

    PubMed

    Callahan, Charles D; Griffen, David L

    2003-08-01

    Emergency medicine faces unique challenges in the effort to improve efficiency and effectiveness. Increased patient volumes, decreased emergency department (ED) supply, and an increased emphasis on the ED as a diagnostic center have contributed to poor customer satisfaction and process failures such as diversion/bypass. Statistical process control (SPC) techniques developed in industry offer an empirically based means to understand our work processes and manage by fact. Emphasizing that meaningful quality improvement can occur only when it is exercised by "front-line" providers, this primer presents robust yet accessible SPC concepts and techniques for use in today's ED.
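
    A minimal sketch of one common SPC technique, an individuals (XmR) control chart for a daily ED metric such as door-to-provider time, with limits estimated from the average moving range; the data are hypothetical:

        import numpy as np

        # Hypothetical daily mean door-to-provider times (minutes)
        x = np.array([34, 38, 31, 36, 40, 33, 37, 65, 35, 32, 39, 36, 30, 41, 38])

        moving_range = np.abs(np.diff(x))
        mr_bar = moving_range.mean()
        center = x.mean()

        # Individuals-chart limits: mean +/- 2.66 * average moving range
        ucl = center + 2.66 * mr_bar
        lcl = max(center - 2.66 * mr_bar, 0.0)

        out_of_control = np.where((x > ucl) | (x < lcl))[0]
        print(f"center {center:.1f}, limits [{lcl:.1f}, {ucl:.1f}], "
              f"special-cause points at days {out_of_control.tolist()}")

    Points outside the limits signal special-cause variation worth investigating, while points within them reflect the common-cause variation of the process itself.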

  1. Advanced Infusion Techniques with 3-D Printed Tooling

    SciTech Connect

    Nuttall, David; Elliott, Amy; Post, Brian K.; Love, Lonnie J.

    2016-05-10

    The manufacturing of tooling for large, contoured surfaces for fiber-layup applications requires significant effort to understand the geometry and then to subtractively manufacture the tool. Traditional methods in the auto industry use clay that is hand sculpted. In the marine pleasure craft industry, the exterior of the model is formed from a foam lay-up that is either hand cut or machined to create smooth lines. Engineers and researchers at Oak Ridge National Laboratory's Manufacturing Demonstration Facility (ORNL MDF) collaborated with Magnum Venus Products (MVP) in the development of a process for reproducing legacy whitewater adventure craft via digital scanning and large-scale 3-D printed layup molds. The process entailed 3-D scanning a legacy canoe form, converting that form to a CAD model, additively manufacturing (3-D printing) the mold tool, and subtractively finishing the mold's transfer surfaces. Future work will include applying a gelcoat to the mold transfer surface and infusing using vacuum-assisted resin transfer molding (VARTM) principles to create a watertight vessel. The outlined steps were performed on a specific canoe geometry found by MVP's principal participant. The intent of utilizing this geometry is to develop an energy-efficient and marketable process for replicating complex shapes, specifically focusing on this particular watercraft, and to provide a finished product for demonstration to the composites industry. The culminating part produced through this agreement has been slated for public presentation and potential demonstration at the 2016 CAMX (Composites and Advanced Materials eXpo) exposition in Anaheim, CA. Phase I of this collaborative research and development agreement (MDF-15-68) was conducted under CRADA NFE-15-05575 and was initiated on May 7, 2015, with an introduction to the MVP product line, and concluded in March of 2016 with the printing and processing of a canoe mold. The project partner Magnum Venus Products (MVP) is

  2. Querator: an advanced multi-archive data mining tool

    NASA Astrophysics Data System (ADS)

    Pierfederici, Francesco

    2001-11-01

    In recent years, the operation of large telescopes with wide-field detectors - such as the European Southern Observatory (ESO) Wide Field Imager (WFI) on the 2.2-meter telescope at La Silla, Chile - has dramatically increased the amount of astronomical data produced each year. The next survey telescopes, such as the ESO VST, will continue this trend, producing extremely large datasets. Astronomy, therefore, has become an incredibly data-rich field requiring new tools and new strategies to efficiently handle huge archives and fully exploit their scientific content. At the Space Telescope European Coordinating Facility we are working on a new project, code-named Querator (http://archive.eso.org/querator/). Querator is an advanced multi-archive search engine built to address the needs of astronomers looking for multicolor imaging data across different astronomical data-centers. Querator returns sets of images of a given astronomical object or search region. A set contains exposures in a number of different wave bands. The user constrains the number of desired wave bands by selecting from a set of instruments and filters or by specifying actual physical units. As far as present-day data-centers are concerned, Querator points out the need for: - a uniform and standard description of archival data and - a uniform and standard description of how the data was acquired (i.e. instrument and observation characteristics). Clearly, these pieces of information will constitute an intermediate layer between the data itself and the data mining tools operating on it. This layered structure is a prerequisite to real data-center inter-operability and, hence, to Virtual Observatories. A detailed description of Querator's design, of the required data structures, of the problems encountered so far, and of the proposed solutions will be given in the following pages. Throughout this paper we'll favor the term data-center over archive to stress the need to look at raw-pixels' archives

  3. The New Hampshire watershed tool: a geographic information system tool to estimate streamflow statistics and ground-water-recharge rates

    USGS Publications Warehouse

    Olson, Scott A.; Flynn, Robert H.; Johnston, Craig M.; Tasker, Gary D.

    2005-01-01

    Estimates of low-flow statistics, flow durations, and ground-water-recharge rates are needed to assist water-resource managers in assessing surface-water resources and ground-water availability. Often these estimates are required at ungaged sites where no observed streamflow data are available for analysis. Regression equations for estimating low-flow statistics and flow durations, and for estimating ground-water-recharge rates at ungaged sites have been developed for New Hampshire. However, use of these equations requires numerous input parameters, such as basin and climatic characteristics. This report describes a customized geographic information system (GIS) application, the New Hampshire Watershed Tool, that automates the measurement of the characteristics used for input to the regression equations and calculates the corresponding flow statistics and ground-water-recharge rates.
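
    Such regional regression equations typically take a log-linear form in the basin characteristics measured by the GIS tool. The sketch below shows only the general shape of that calculation; the coefficients, characteristic names, and the labelled statistic are made up for illustration, and the actual New Hampshire equations and parameters are those given in the report:

        import math

        def regional_regression_estimate(coefficients, basin_characteristics):
            """Generic log-linear regional regression: log10(Q) = b0 + sum(b_i * log10(X_i))."""
            log_q = coefficients["intercept"]
            for name, exponent in coefficients["terms"].items():
                log_q += exponent * math.log10(basin_characteristics[name])
            return 10 ** log_q

        # Illustrative coefficients and basin characteristics (not from the report)
        coeffs = {"intercept": -2.0,
                  "terms": {"drainage_area_mi2": 1.05, "mean_annual_precip_in": 0.8}}
        basin = {"drainage_area_mi2": 52.0, "mean_annual_precip_in": 44.0}

        print(f"estimated low-flow statistic ~ {regional_regression_estimate(coeffs, basin):.1f} cfs")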

  4. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  5. Functional toxicology: tools to advance the future of toxicity testing

    PubMed Central

    Gaytán, Brandon D.; Vulpe, Chris D.

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds—information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  6. European regulatory tools for advanced therapy medicinal products.

    PubMed

    Flory, Egbert; Reinhardt, Jens

    2013-12-01

    Increasing scientific knowledge and technical innovations in the areas of cell biology, biotechnology and medicine resulted in the development of promising therapeutic approaches for the prevention and treatment of human diseases. Advanced therapy medicinal products (ATMPs) reflect a complex and innovative class of biopharmaceuticals, as these products are highly research-driven, characterised by innovative manufacturing processes, and heterogeneous with regard to their origin, type and complexity. This class of ATMP integrates gene therapy medicinal products, somatic cell therapy medicinal products and tissue engineering products, which are often individualized and patient-specific products. Multiple challenges arise from the nature of ATMPs, which are often developed by micro, small and medium-sized enterprises, universities and academia, for whom regulatory experience is limited and regulatory requirements are challenging. Regulatory guidance such as the reflection paper on classification of ATMPs and guidelines highlighting product-specific issues supports academic research groups and pharmaceutical companies in fostering the development of safe and effective ATMPs. This review provides an overview of the European regulatory aspects of ATMPs and highlights specific regulatory tools such as the ATMP classification procedure, a discussion of the hospital exemption for selected ATMPs, as well as borderline issues towards transplants/transfusion products.

  7. European Regulatory Tools for Advanced Therapy Medicinal Products

    PubMed Central

    Flory, Egbert; Reinhardt, Jens

    2013-01-01

    Summary Increasing scientific knowledge and technical innovations in the areas of cell biology, biotechnology and medicine resulted in the development of promising therapeutic approaches for the prevention and treatment of human diseases. Advanced therapy medicinal products (ATMPs) reflect a complex and innovative class of biopharmaceuticals as these products are highly research-driven, characterised by innovative manufacturing processes and heterogeneous with regard to their origin, type and complexity. This class of ATMP integrates gene therapy medicinal products, somatic cell therapy medicinal products and tissue engineering products and are often individualized and patient-specific products. Multiple challenges arise from the nature of ATMPs, which are often developed by micro, small and medium sized enterprises, university and academia, for whom regulatory experiences are limited and regulatory requirements are challenging. Regulatory guidance such as the reflection paper on classification of ATMPs and guidelines highlighting product-specific issues support academic research groups and pharmaceutical companies to foster the development of safe and effective ATMPs. This review provides an overview on the European regulatory aspects of ATMPs and highlights specific regulatory tools such as the ATMP classification procedure, a discussion on the hospital exemption for selected ATMPs as well as borderline issues towards transplants/transfusion products. PMID:24474890

  8. Advances in deep-UV processing using cluster tools

    NASA Astrophysics Data System (ADS)

    Escher, Gary C.; Tepolt, Gary; Mohondro, Robert D.

    1993-09-01

    Deep-UV laser lithography has shown the capability of supporting the manufacture of multiple generations of integrated circuits (ICs) due to its wide process latitude and depth of focus (DOF) for 0.2 micrometers to 0.5 micrometers feature sizes. This capability has been attained through improvements in deep-UV wide field lens technology, excimer lasers, steppers and chemically amplified, positive deep-UV resists. Chemically amplified deep-UV resists are required for 248 nm lithography due to the poor absorption and sensitivity of conventional novolac resists. The acid catalyzation processes of the new resists require control of the thermal history and environmental conditions of the lithographic process. Work is currently underway at several resist vendors to reduce the need for these controls, but practical manufacturing solutions exist today. One of these solutions is the integration of steppers and resist tracks into a `cluster tool' or `Lithocell' to ensure a consistent thermal profile for the resist process and reduce the time the resist is exposed to atmospheric contamination. The work reported here presents processing and system integration results with a Machine Technology, Inc (MTI) post-exposure bake (PEB) track interfaced with an advanced GCA XLS 7800 deep-UV stepper [31 mm diameter, variable NA (0.35 - 0.53) and variable sigma (0.3 - 0.74)].

  9. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  10. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  11. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
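    As a concrete illustration of the frequency ratio technique named above (a minimal sketch with hypothetical raster data, not the BSM/ArcMAP implementation itself), the per-class ratio is the share of hazard pixels falling in a class divided by the share of all pixels in that class:

      import numpy as np

      def frequency_ratio(factor_classes, hazard_mask):
          """Frequency ratio per class of one conditioning factor.

          factor_classes : integer array of class labels per pixel
          hazard_mask    : boolean array, True where the hazard occurred
          """
          total_pixels = factor_classes.size
          total_hazard = hazard_mask.sum()
          fr = {}
          for cls in np.unique(factor_classes):
              in_class = factor_classes == cls
              pct_hazard = hazard_mask[in_class].sum() / total_hazard
              pct_class = in_class.sum() / total_pixels
              fr[int(cls)] = pct_hazard / pct_class if pct_class > 0 else np.nan
          return fr

      # toy example: slope classes 1-3 on a small raster
      slope = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 3])
      landslide = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0], dtype=bool)
      print(frequency_ratio(slope, landslide))

    Classes with a ratio above 1 are over-represented among hazard occurrences and contribute a higher weight to the resulting susceptibility map.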

  12. Classification of human colonic tissues using FTIR spectra and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.

    2010-04-01

    One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for early detection of cancer. If colon cancer is detected and treated early, cure rate of more than 90% can be achieved. In this study we used FTIR microscopy (MSP), which has shown a good potential in the last 20 years in the fields of medical diagnostic and early detection of abnormal tissues. Large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database, normal and cancer tissues as well as three stages of benign colonic polyps, namely, mild, moderate and severe polyps which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques including principal component analysis (PCA) and linear discriminant analysis (LDA), on human colonic FTIR spectra in order to differentiate among the mentioned subgroups' tissues. Good classification accuracy between normal, polyps and cancer groups was achieved with approximately 85% success rate. Our results showed that there is a great potential of developing FTIR-micro spectroscopy as a simple, reagent-free viable tool for early detection of colon cancer in particular the early stages of premalignancy among the benign colonic polyps.
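    The PCA-plus-LDA workflow mentioned in the abstract can be sketched in a few lines of Python; the data below are synthetic stand-ins for the FTIR spectra and the five tissue subgroups, not the authors' dataset:

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # synthetic stand-in: 230 spectra x 900 wavenumbers, 5 classes
      rng = np.random.default_rng(0)
      X = rng.normal(size=(230, 900))
      y = rng.integers(0, 5, size=230)   # normal, cancer, mild/moderate/severe polyp

      # reduce dimensionality with PCA, then separate the groups with LDA
      clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
      scores = cross_val_score(clf, X, y, cv=5)
      print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))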

  13. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  14. A Web-Based Learning Tool Improves Student Performance in Statistics: A Randomized Masked Trial

    ERIC Educational Resources Information Center

    Gonzalez, Jose A.; Jover, Lluis; Cobo, Erik; Munoz, Pilar

    2010-01-01

    Background: e-status is a web-based tool able to generate different statistical exercises and to provide immediate feedback to students' answers. Although the use of Information and Communication Technologies (ICTs) is becoming widespread in undergraduate education, there are few experimental studies evaluating its effects on learning. Method: All…

  15. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.
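    A minimal sketch of the logistic regression step (illustrative only; the predictor variables and values below are hypothetical, not those used in the study):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # hypothetical predictors per river section: bank slope, bank height (m),
      # vegetation cover fraction, peak discharge (m^3/s)
      X = np.array([
          [0.30, 1.2, 0.8,  5.0],
          [0.55, 2.1, 0.4, 12.0],
          [0.70, 2.8, 0.1, 20.0],
          [0.25, 1.0, 0.9,  4.0],
          [0.60, 2.5, 0.2, 18.0],
          [0.45, 1.8, 0.5, 10.0],
      ])
      y = np.array([0, 1, 1, 0, 1, 0])   # 1 = erosion observed, 0 = no erosion

      model = LogisticRegression().fit(X, y)
      new_section = np.array([[0.5, 2.0, 0.3, 15.0]])
      print("erosion probability:", model.predict_proba(new_section)[0, 1])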

  16. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    ERIC Educational Resources Information Center

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  17. The Design and Implementation of an Interactive Learning Tool for Statistical Reasoning with Uncertainty.

    ERIC Educational Resources Information Center

    Vastola, Deborah A.; Walker, Ellen L.

    1995-01-01

    Describes a computer application that uses graphics, color, and animation to make the learning of statistical reasoning with uncertainty easier and more fun. Notes that the initial implementation focuses on the Dempster-Shafer model, although the design is a framework that can incorporate other models. Also discusses how the tool may be used to…
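    For readers unfamiliar with the Dempster-Shafer model mentioned above, the core operation is Dempster's rule for combining two mass functions; a small, self-contained sketch with made-up evidence, unrelated to the described learning tool itself:

      from itertools import product

      def dempster_combine(m1, m2):
          """Combine two mass functions over frozenset focal elements (Dempster's rule)."""
          combined, conflict = {}, 0.0
          for (a, wa), (b, wb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb          # mass assigned to contradictory pairs
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      # two pieces of evidence about the hypothesis set {rain, sun}
      m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
      m2 = {frozenset({"sun"}): 0.3, frozenset({"rain", "sun"}): 0.7}
      print(dempster_combine(m1, m2))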

  18. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  19. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    ERIC Educational Resources Information Center

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  20. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  1. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool and its challenges in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING moving pathlets provide an intuition of velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow an advanced method for intelligent, time-dependent seeding is used building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D provides many new challenges. With the implementation of a seeding strategy for 3D one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the rendering through raytracing of the volume and regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of
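    The pathlet idea, in its simplest form, amounts to advecting seed points through the velocity field and drawing short trails; the sketch below uses plain explicit Euler integration of a toy 2D field and is not the FPM-based seeding strategy used in STRING:

      import numpy as np

      def advect_pathlets(seeds, velocity, dt=0.05, steps=40):
          """Trace pathlets by explicit Euler integration of a steady velocity field.

          seeds    : (n, 2) array of seed positions
          velocity : callable mapping (n, 2) positions to (n, 2) velocities
          Returns an array of shape (steps + 1, n, 2) with the pathlet vertices.
          """
          path = [np.asarray(seeds, dtype=float)]
          for _ in range(steps):
              p = path[-1]
              path.append(p + dt * velocity(p))
          return np.stack(path)

      # toy groundwater-like field: rotation plus a weak drift in x
      def field(p):
          x, y = p[:, 0], p[:, 1]
          return np.column_stack([-y + 0.2, x])

      seeds = np.random.default_rng(1).uniform(-1, 1, size=(10, 2))
      pathlets = advect_pathlets(seeds, field)
      print(pathlets.shape)   # (41, 10, 2)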

  2. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    NASA Astrophysics Data System (ADS)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a part of the subject matter in the higher education system. Even though there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management portal for guidance, especially in relation to the infrastructure requirements of SLT, for serving the community of users (CoU) such as educators, students and other parties interested in using this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement for such a knowledge portal to help the CoU manage statistical knowledge by acquiring, storing, disseminating and applying it for their specific purposes. Furthermore, by using this infrastructure requirement of the knowledge portal model of SLT as guidance in promoting knowledge of best practice among the CoU, it can also enhance the quality and productivity of their work towards excellence in statistical knowledge application in the education system environment.

  3. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from

  4. Enhancing interest in statistics among computer science students using computer tool entrepreneur role play

    NASA Astrophysics Data System (ADS)

    Judi, Hairulliza Mohamad; Sahari @ Ashari, Noraidah; Eksan, Zanaton Hj

    2017-04-01

    Previous research in Malaysia indicates that there is a problem regarding attitudes towards statistics among students. Students did not show positive attitudes in the affective, cognitive, capability, value, interest and effort aspects, although they did well in the difficulty aspect. This issue should be given substantial attention because students' attitudes towards statistics may affect the teaching and learning process of the subject. Teaching statistics using role play is an appropriate attempt to improve attitudes to statistics, to enhance the learning of statistical techniques and statistical thinking, and to increase generic skills. The objectives of the paper are to give an overview of role play in statistics learning and to assess the effect of these activities on students' attitudes and learning within an action research framework. The computer tool entrepreneur role play is conducted in a two-hour tutorial session for first-year students of the Faculty of Information Sciences and Technology (FTSM), Universiti Kebangsaan Malaysia, enrolled in the Probability and Statistics course. The results show that most students felt they had an enjoyable time in the role play. Furthermore, benefits and disadvantages of the role play activities were highlighted to complete the review. Role play is expected to serve as an important activity that takes into account students' experience, emotions and responses, providing useful information on how to modify students' thinking or behavior to improve learning.

  5. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web-service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get data most recently available. Data submitted by the user are stored in the database, where it can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch search mode. ii) GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest. These user defined GO annotations can be used in further analysis or exported for public distribution. iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users, to select the GO annotations for the analysis. GeneTools is the first "all in one
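    Testing whether a GO category is over-represented in a target gene list, as eGOn does, is commonly framed as a 2x2 contingency table; the sketch below uses Fisher's exact test with hypothetical counts and does not claim to reproduce eGOn's exact statistics:

      from scipy.stats import fisher_exact

      # 2x2 table for one GO category (all counts hypothetical):
      #                    in category   not in category
      # target genes            k             n - k
      # reference genes       K - k     N - K - (n - k)
      n, k = 200, 30          # target list size and hits in the category
      N, K = 15000, 600       # reference list size and category size

      table = [[k, n - k], [K - k, N - K - (n - k)]]
      odds_ratio, p_value = fisher_exact(table, alternative="greater")
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")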

  6. GeneTools – application for functional annotation and statistical hypothesis testing

    PubMed Central

    Beisvag, Vidar; Jünge, Frode KR; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Lægreid, Astrid

    2006-01-01

    Background Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. Results GeneTools is a web-service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get data most recently available. Data submitted by the user are stored in the database, where it can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch search mode. ii) GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest. These user defined GO annotations can be used in further analysis or exported for public distribution. iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users, to select the GO annotations for the analysis. Conclusion GeneTools

  7. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes…

  8. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes…

  9. Statistical tools for determining appropriate selection of regression models for analysis of environmental fate datasets.

    PubMed

    Aldworth, Jeremy; Jackson, Scott H

    2008-05-01

    In the NAFTA regulatory community, no consistent methodology is currently applied to estimate dissipation times from environmental fate data. This work demonstrates through a case study that the inappropriate use of pseudo-first-order regression models can result in inaccurate estimates of soil degradation rates, and it proposes some statistical tools that can be used to identify an appropriate statistical model to fit a particular environmental fate dataset. Diagnostic procedures have been proposed to identify the appropriate scale, and statistical testing procedures have been proposed to select the appropriate model within that scale. Results from this work demonstrate that, unless the proposed diagnostic and statistical procedures are used, inaccurate estimates of dissipation times may result. Copyright (c) 2008 Society of Chemical Industry.
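    A minimal example of fitting a single first-order (pseudo-first-order) model and deriving a DT50, the kind of fit whose uncritical use the paper cautions against (all residue data below are hypothetical):

      import numpy as np
      from scipy.optimize import curve_fit

      def sfo(t, c0, k):
          """Single first-order (SFO) decay: C(t) = C0 * exp(-k t)."""
          return c0 * np.exp(-k * t)

      # hypothetical soil residue data (days, % of applied)
      t = np.array([0, 3, 7, 14, 30, 60, 90], dtype=float)
      c = np.array([100, 84, 70, 52, 30, 12, 6], dtype=float)

      popt, pcov = curve_fit(sfo, t, c, p0=[100.0, 0.05])
      c0_hat, k_hat = popt
      dt50 = np.log(2) / k_hat
      residuals = c - sfo(t, *popt)
      print(f"k = {k_hat:.3f} per day, DT50 = {dt50:.1f} days")
      print("residual standard error:", np.sqrt((residuals**2).sum() / (len(t) - 2)))

    Inspecting the residuals of such a fit for systematic curvature is one simple diagnostic for deciding whether a bi-phasic model would describe the data better.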

  10. Informing the judgments of fingerprint analysts using quality metric and statistical assessment tools.

    PubMed

    Langenburg, Glenn; Champod, Christophe; Genessay, Thibault

    2012-06-10

    The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison. The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.

  11. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphics representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework under traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics were shown to be very useful for exploring many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-learning and continuing education.

  12. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern solar-terrestrial space, and to develop qualitative and quantitative models of the magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band, vector-sensing radio receivers, together with multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray - LOFAR - is a new, fully digital radio telescope located in Europe and designed for frequencies between 30 MHz and 240 MHz. Three new LOFAR stations will be installed in Poland by summer 2015. The LOFAR facilities in Poland will be distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan and Baldy near Olsztyn. They will all be connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). They will work most of the time as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate ionospheric plasma properties. This two-point diagnostic, combining ground-based instruments with a space plasma instrument in the topside ionosphere, can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments planned to be launched into the near-Earth environment. In order to resolve and validate the large-scale and small-scale ionospheric structures, we will use GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the

  13. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  14. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  15. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  16. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  17. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
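    To complement the frequency ratio sketch given under the earlier listing of this tool, the weights-of-evidence model for a single binary evidence layer can be sketched as follows (hypothetical data; not the BSM code itself):

      import numpy as np

      def weights_of_evidence(factor_present, hazard_present):
          """Positive/negative weights and contrast for one binary evidence layer.

          factor_present, hazard_present : boolean arrays over the study-area pixels
          """
          b_d  = np.sum(factor_present & hazard_present)       # factor & hazard
          b_nd = np.sum(factor_present & ~hazard_present)      # factor & no hazard
          nb_d  = np.sum(~factor_present & hazard_present)     # no factor & hazard
          nb_nd = np.sum(~factor_present & ~hazard_present)    # no factor & no hazard
          d, nd = b_d + nb_d, b_nd + nb_nd
          w_plus  = np.log((b_d / d) / (b_nd / nd))
          w_minus = np.log((nb_d / d) / (nb_nd / nd))
          return w_plus, w_minus, w_plus - w_minus             # contrast C

      rng = np.random.default_rng(2)
      hazard = rng.random(10000) < 0.05
      factor = rng.random(10000) < (0.6 * hazard + 0.2)   # factor more likely where hazard occurs
      print(weights_of_evidence(factor, hazard))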

  18. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    PubMed

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written in text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and a better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to understand better the influence of contamination characteristics on actinide biokinetics. Our approach can aid

  19. Exploring students’ perceived and actual ability in solving statistical problems based on Rasch measurement tools

    NASA Astrophysics Data System (ADS)

    Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati

    2017-09-01

    One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This enables them to arrive at a conclusion and to make significant contributions and decisions for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to hypothesis testing which required them to solve the problems using the confidence interval, traditional and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and is listed as one of the concepts students find difficult to grasp. The objectives of this study are to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured with instruments developed for the respective topics. Rasch measurement tools such as the Wright map and item measures with fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches, even though their actual performance showed otherwise. Item measures with fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons which include their lack of understanding of confidence intervals and probability values.
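    The dichotomous Rasch model underlying Winsteps person and item measures has a simple closed form; a small illustrative sketch (the measures below are made up, not the study's data):

      import numpy as np

      def rasch_probability(theta, b):
          """Dichotomous Rasch model: probability of a correct response
          for person ability theta and item difficulty b (both in logits)."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      abilities = np.array([-1.0, 0.0, 0.5, 1.5])       # hypothetical person measures
      difficulties = np.array([-0.5, 0.3, 1.2, 2.0])    # hypothetical item measures
      # expected-score matrix (persons x items), the quantity a Wright map juxtaposes
      print(np.round(rasch_probability(abilities[:, None], difficulties[None, :]), 2))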

  20. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2016-01-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a statistical methodology is proposed to predict the probability of the presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the logistic regression methodology. It is developed in two forms, logistic regression and locally weighted logistic regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use and accurate and can be applied to any region and river.

  1. Discrete rearranging disordered patterns, part I: robust statistical tools in two or three dimensions.

    PubMed

    Graner, F; Dollet, B; Raufaste, C; Marmottant, P

    2008-04-01

    Discrete rearranging patterns include cellular patterns, for instance liquid foams, biological tissues, grains in polycrystals; assemblies of particles such as beads, granular materials, colloids, molecules, atoms; and interconnected networks. Such a pattern can be described as a list of links between neighbouring sites. Performing statistics on the links between neighbouring sites yields average quantities (hereafter "tools") as the result of direct measurements on images. These descriptive tools are flexible and suitable for various problems where quantitative measurements are required, whether in two or in three dimensions. Here, we present a coherent set of robust tools, in three steps. First, we revisit the definitions of three existing tools based on the texture matrix. Second, thanks to their more general definition, we embed these three tools in a self-consistent formalism, which includes three additional ones. Third, we show that the six tools together provide a direct correspondence between a small scale, where they quantify the discrete pattern's local distortion and rearrangements, and a large scale, where they help describe a material as a continuous medium. This enables to formulate elastic, plastic, fluid behaviours in a common, self-consistent modelling using continuous mechanics. Experiments, simulations and models can be expressed in the same language and directly compared. As an example, a companion paper (P. Marmottant, C. Raufaste, and F. Graner, this issue, 25 (2008) DOI 10.1140/epje/i2007-10300-7) provides an application to foam plasticity.
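    The first of the tools revisited in the paper, the texture matrix, is the average outer product of the link vectors between neighbouring sites; a minimal numpy sketch with synthetic links (a plain reading of that definition, see the paper for the full formalism):

      import numpy as np

      def texture_matrix(links):
          """Texture matrix M = <l (x) l>: mean outer product of link vectors.

          links : (n, d) array of vectors joining neighbouring sites (d = 2 or 3)
          """
          links = np.asarray(links, dtype=float)
          return np.einsum("ni,nj->ij", links, links) / len(links)

      # links of a pattern slightly stretched along x (hypothetical data)
      rng = np.random.default_rng(3)
      links = rng.normal(size=(500, 2)) * np.array([1.3, 1.0])
      M = texture_matrix(links)
      eigvals, _ = np.linalg.eigh(M)
      print("texture matrix:\n", M)
      print("anisotropy (eigenvalue ratio):", eigvals[-1] / eigvals[0])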

  2. DAnTE: a statistical tool for quantitative analysis of -omics data.

    PubMed

    Polpitiya, Ashoka D; Qian, Wei-Jun; Jaitly, Navdeep; Petyuk, Vladislav A; Adkins, Joshua N; Camp, David G; Anderson, Gordon A; Smith, Richard D

    2008-07-01

    Data Analysis Tool Extension (DAnTE) is a statistical tool designed to address challenges associated with quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide-to-protein rollup methods, an extensive array of plotting functions and a comprehensive hypothesis-testing scheme that can handle unbalanced data and random effects. The graphical user interface (GUI) is designed to be very intuitive and user friendly. DAnTE may be downloaded free of charge at http://omics.pnl.gov/software/. An example dataset with instructions on how to perform a series of analysis steps is available at http://omics.pnl.gov/software/

  3. A modern reproducible method for the histologic grading of astrocytomas with statistical classification tools.

    PubMed

    Röhrl, Norbert; Iglesias-Rozas, José R; Weidl, Galia

    2008-02-01

    To investigate whether statistical classification tools can infer the correct World Health Organization (WHO) grade from standardized histologic features in astrocytomas and how these tools compare with GRADO-IGL, an earlier computer-assisted method. A total of 794 human brain astrocytomas were studied between January 1976 and June 2005. The presence of 50 histologic features was rated in 4 categories from 0 (not present) to 3 (abundant) by visual inspection of the sections under a microscope. All tumors were also classified with the corresponding WHO grade between I and IV. We tested the prediction performance of several statistical classification tools (learning vector quantization [LVQ], supervised relevance neural gas [SRNG], support vector machines [SVM], and generalized regression neural network [GRNN]) for this data set. The WHO grade was predicted correctly from histologic features in close to 80% of the cases by 2 modern classifiers (SRNG and SVM), and GRADO-IGL was predicted correctly in > 84% of the cases by a GRNN. A standardized report, based on the 50 histologic features, can be used in conjunction with modern classification tools as an objective and reproducible method for histologic grading of astrocytomas.

  4. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.

    1990-01-01

    A prototype is described that can serve as a scientific-modeling software tool to facilitate the development of useful scientific models. The prototype is developed for applications to planetary modeling, and specific examples are given that relate to the atmosphere of Titan. The scientific modeling tool employs a high-level domain-specific modeling language, several data-display facilities, and a library of experimental datasets and scientific equations. The planetary modeling prototype links uncomputed physical variables to computed variables with computational transformations based on a backchaining procedure. The system - implemented in LISP with an object-oriented knowledge-representation tool - is run on a workstation that provides interface with several models. The prototype is expected to form the basis for a sophisticated modeling tool that can permit active experimentation.

  5. A Tool Preference Choice Method for RNA Secondary Structure Prediction by SVM with Statistical Tests

    PubMed Central

    Hor, Chiou-Yi; Yang, Chang-Biau; Chang, Chia-Hung; Tseng, Chiou-Ting; Chen, Hung-Hsin

    2013-01-01

    The prediction of RNA secondary structures has drawn much attention from both biologists and computer scientists. Many useful tools have been developed for this purpose. These tools have their individual strengths and weaknesses. As a result, based on support vector machines (SVM), we propose a tool choice method which integrates three prediction tools: pknotsRG, RNAStructure, and NUPACK. Our method first extracts features from the target RNA sequence, and adopts two information-theoretic feature selection methods for feature ranking. We propose a method to combine feature selection and classifier fusion in an incremental manner. Our test data set contains 720 RNA sequences, where 225 pseudoknotted RNA sequences are obtained from PseudoBase, and 495 nested RNA sequences are obtained from RNA SSTRAND. The method serves as a preprocessing step for analyzing RNA sequences before the RNA secondary structure prediction tools are employed. In addition, the performance of various configurations is subject to statistical tests to examine their significance. The best base-pair accuracy achieved is 75.5%, which is obtained by the proposed incremental method, and is significantly higher than 68.8%, which is associated with the best predictor, pknotsRG. PMID:23641141

  6. Statistical Tools And Artificial Intelligence Approaches To Predict Fracture In Bulk Forming Processes

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, R.; Ingarao, G.; Fonti, V.

    2007-05-01

    The crucial task in the prevention of ductile fracture is the availability of a tool for predicting the occurrence of such defects. The technical literature presents a wide range of investigations on this topic, and many contributions have been made by authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function which depends on the stress and strain paths; ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback related to the use of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e. it works very well for certain processes but may give poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view but are very complex, and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is a tool with general reliability, i.e. one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the use of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence/absence based on a set of stress- and strain-path data. The proposed approach is based on the use of experimental data available, for a given material, on fracture occurrence in different processes. More in detail, the approach consists in the analysis of
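    As an example of the damage-function idea described above, the sketch below evaluates the classical Cockcroft-Latham criterion (one common choice, not necessarily the criterion used in the paper) along a hypothetical stress-strain path:

      import numpy as np

      def cockcroft_latham(max_principal_stress, eq_plastic_strain):
          """Cumulative Cockcroft-Latham damage: integral of the largest (tensile)
          principal stress over equivalent plastic strain along the load path."""
          sigma = np.clip(np.asarray(max_principal_stress, dtype=float), 0.0, None)
          eps = np.asarray(eq_plastic_strain, dtype=float)
          return np.sum(0.5 * (sigma[1:] + sigma[:-1]) * np.diff(eps))   # trapezoid rule

      # hypothetical stress-strain path from a cold-forming simulation
      strain = np.linspace(0.0, 0.8, 50)
      stress = 400.0 + 300.0 * strain           # MPa, rising principal stress
      damage = cockcroft_latham(stress, strain)
      C_CRIT = 450.0                            # MPa, assumed critical value for the material
      print(f"damage integral = {damage:.0f} MPa; fracture predicted: {damage >= C_CRIT}")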

  7. Evaluation of statistical tools used in short-term repeated dose administration toxicity studies with rodents.

    PubMed

    Kobayashi, Katsumi; Pillai, K Sadasivan; Sakuratani, Yuki; Abe, Takemaru; Kamata, Eiichi; Hayashi, Makoto

    2008-02-01

    To identify the different statistical tools used to analyze data obtained from twenty-eight-day repeated dose oral toxicity studies with rodents, and the impact of these tools on the interpretation of the data, the study reports of 122 twenty-eight-day repeated dose oral toxicity studies conducted in rats were examined. It was found that both complex and simple decision-tree routes were followed for the analysis of the quantitative data. These tools include Scheffe's test and non-parametric Dunnett- and Scheffe-type tests with very low power. Few studies used the non-parametric Dunnett-type test and Mann-Whitney's U test. Though chi-square and Fisher's tests are widely used for analysis of qualitative data, their sensitivity to detect a treatment-related effect is questionable. Mann-Whitney's U test has better sensitivity for analyzing qualitative data than the chi-square and Fisher's tests. We propose Dunnett's test for analysis of quantitative data obtained from twenty-eight-day repeated dose oral toxicity tests and, for qualitative data, Mann-Whitney's U test. For both tests, a one-sided test with p=0.05 may be applied.
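    The tests proposed in the abstract are available in SciPy; a small sketch with hypothetical study data (Dunnett's test requires SciPy 1.11 or later):

      import numpy as np
      from scipy import stats

      # hypothetical body-weight data (g) from a 28-day study: control + two dose groups
      control = np.array([310, 305, 322, 298, 315, 308])
      low     = np.array([300, 295, 310, 290, 305, 298])
      high    = np.array([280, 275, 290, 268, 285, 278])

      # quantitative endpoint: one-sided Dunnett many-to-one comparison against control
      res = stats.dunnett(low, high, control=control, alternative="less")
      print("Dunnett p-values (low, high vs control):", res.pvalue)

      # graded/qualitative endpoint scored 0-3: one-sided Mann-Whitney U test
      control_score = np.array([0, 0, 1, 0, 0, 1])
      high_score    = np.array([1, 2, 1, 2, 0, 2])
      u, p = stats.mannwhitneyu(high_score, control_score, alternative="greater")
      print(f"Mann-Whitney U = {u}, one-sided p = {p:.3f}")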

  8. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2015-06-01

    Riverbank erosion affects river morphology and local habitat and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predicting areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a combined deterministic and statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the statistical tool performance evaluation. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed methodology is easy to use, accurate and can be applied to any region and river.

  9. Advanced PANIC quick-look tool using Python

    NASA Astrophysics Data System (ADS)

    Ibáñez, José-Miguel; García Segura, Antonio J.; Storz, Clemens; Fried, Josef W.; Fernández, Matilde; Rodríguez Gómez, Julio F.; Terrón, V.; Cárdenas, M. C.

    2012-09-01

    PANIC, the Panoramic Near Infrared Camera, is an instrument for the Calar Alto Observatory currently being integrated in laboratory and whose first light is foreseen for end 2012 or early 2013. We present here how the PANIC Quick-Look tool (PQL) and pipeline (PAPI) are being implemented, using existing rapid programming Python technologies and packages, together with well-known astronomical software suites (Astromatic, IRAF) and parallel processing techniques. We will briefly describe the structure of the PQL tool, whose main characteristics are the use of the SQLite database and PyQt, a Python binding of the GUI toolkit Qt.

  10. SU-E-T-259: A Statistical and Machine Learning-Based Tool for Modeling and Visualization of Radiotherapy Treatment Outcomes.

    PubMed

    Oh, J; Wang, Y; Apte, A; Deasy, J

    2012-06-01

    Effective radiotherapy outcomes modeling could provide physicians with a better understanding of the underlying disease mechanism, enabling early prediction of outcomes and ultimately allowing treatment to be individualized for patients at high risk. This requires not only sophisticated statistical methods, but also user-friendly visualization and data analysis tools. Unfortunately, few tools are available to support these requirements in the radiotherapy community. Our group has developed Matlab-based in-house software called DREES for statistical modeling of radiotherapy treatment outcomes. We have noticed that advanced machine learning techniques can serve as useful tools for analyzing and modeling the outcomes data. To this end, we have upgraded DREES such that it takes advantage of the Statistics and Bioinformatics toolboxes in Matlab, which provide robust statistical data modeling and analysis methods as well as user-friendly visualization and a graphical interface. Newly added key features include variable selection, discriminant analysis and decision trees for classification, and k-means and hierarchical clustering functions. Also, existing graphical tools and statistical methods in DREES were replaced with a library of the Matlab toolboxes. We analyzed several radiotherapy outcomes datasets with our tools and showed that these can be effectively used for building normal tissue complication probability (NTCP) and tumor control probability (TCP) models. We have developed an integrated software tool for modeling and visualization of radiotherapy outcomes data within the Matlab programming environment. It is our expectation that this tool could help physicians and scientists better understand the complex mechanism of disease and identify clinical and biological factors related to outcomes. © 2012 American Association of Physicists in Medicine.
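    As an illustration of the kind of NTCP model such a tool fits, the sketch below implements the standard Lyman-Kutcher-Burman form (a common choice; DREES's exact implementation may differ, and all numbers below are hypothetical):

      import numpy as np
      from scipy.stats import norm

      def gEUD(doses, volumes, n):
          """Generalized equivalent uniform dose for a DVH given as (dose, fractional volume)."""
          v = np.asarray(volumes, dtype=float) / np.sum(volumes)
          return np.sum(v * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n

      def lkb_ntcp(doses, volumes, TD50, m, n):
          """Lyman-Kutcher-Burman NTCP: probit of the volume-reduced dose."""
          t = (gEUD(doses, volumes, n) - TD50) / (m * TD50)
          return norm.cdf(t)

      # hypothetical differential DVH (Gy, fractional volume) and lung-like parameters
      doses   = np.array([5, 15, 25, 35, 45, 55], dtype=float)
      volumes = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])
      print("NTCP =", round(lkb_ntcp(doses, volumes, TD50=30.0, m=0.35, n=1.0), 3))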

  11. XML based tools for assessing potential impact of advanced technology space validation

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Weisbin, Charles

    2004-01-01

    A hierarchical XML database and related analysis tools are being developed by the New Millennium Program to provide guidance on the relative impact, to future NASA missions, of advanced technologies under consideration for developmental funding.

  12. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  13. A comprehensive tool for the statistical comparison of Large Surveys to Models of the Galaxy

    NASA Astrophysics Data System (ADS)

    Ritter, Andreas

    2014-01-01

    The advent of large spectroscopic surveys of the Galaxy offers the possibility to compare Galactic models to actual measurements for the first time. I have developed a tool for the comprehensive comparison of any large data set to the predictions made by models of the Galaxy using sophisticated statistical methods, and to visualise the results for any given direction. This enables us to point out systematic differences between the model and the measurements, as well as to identify new (sub-)structures in the Galaxy. These results can then be used to improve the models, which in turn will allow us to find even more substructures like stellar streams, moving groups, or clusters. In this paper I show the potential of this tool by applying it to the RAdial Velocity Experiment (RAVE, Steinmetz 2003) and the Besançon model of the Galaxy (Robin et al. 2003).

  14. Analyzing Planck and low redshift data sets with advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004) have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool in cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly, the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi
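
    A minimal sketch of rejection-sampling Approximate Bayesian Computation, the likelihood-free approach cited above, using a toy Gaussian simulator rather than a CMB/LSS pipeline; the summary statistic, prior, and tolerance are illustrative choices:

      # Hedged sketch of rejection ABC: keep parameter draws whose simulated summary
      # statistic lies within a tolerance of the observed one (no explicit likelihood).
      import numpy as np

      rng = np.random.default_rng(1)
      observed = rng.normal(loc=0.3, scale=1.0, size=500)    # pretend "data"
      obs_stat = observed.mean()                              # summary statistic

      def simulate(mu, n=500):
          return rng.normal(loc=mu, scale=1.0, size=n).mean()

      prior_draws = rng.uniform(-1.0, 1.0, size=20000)        # flat prior on mu
      accepted = [mu for mu in prior_draws if abs(simulate(mu) - obs_stat) < 0.02]
      print("ABC posterior mean for mu:", np.mean(accepted), "from", len(accepted), "samples")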

  15. A review of three simple plant models and corresponding statistical tools for basic research in homeopathy.

    PubMed

    Betti, Lucietta; Trebbi, Grazia; Zurla, Michela; Nani, Daniele; Peruzzi, Maurizio; Brizzi, Maurizio

    2010-12-14

    In this paper, we review three simple plant models (wheat seed germination, wheat seedling growth, and infected tobacco plants) that we set up during a series of experiments carried out from 1991 to 2009 in order to study the effects of homeopathic treatments. We will also describe the set of statistical tools applied in the different models. The homeopathic treatment used in our experiments was arsenic trioxide (As₂O₃) diluted in a decimal scale and dynamized. Since the most significant results were achieved with the 45th decimal potency, both for As₂O₃ (As 45x) and water (W 45x), we here report a brief summary of these results. The statistical analysis was performed by using parametric and nonparametric tests, and Poisson distribution had an essential role when dealing with germination experiments. Finally, we will describe some results related to the changes in variability, which seems to be one of the targets of homeopathic treatment effect.
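
    For illustration of the statistical toolkit described (parametric and nonparametric tests, with Poisson statistics playing a central role in the germination experiments), a minimal sketch with invented dish counts might look as follows:

      # Hedged sketch: comparing germination counts between treated and control dishes.
      # Counts are hypothetical; the tests shown are generic, not the authors' exact analysis.
      import numpy as np
      from scipy import stats

      treated = np.array([42, 38, 45, 40, 44])   # germinated seeds per dish (invented)
      control = np.array([50, 52, 47, 49, 51])

      # Nonparametric comparison
      u_stat, p_mw = stats.mannwhitneyu(treated, control, alternative="two-sided")

      # Poisson rate comparison via a normal approximation to the rate difference
      rate_t, rate_c = treated.sum() / treated.size, control.sum() / control.size
      se = np.sqrt(rate_t / treated.size + rate_c / control.size)
      z = (rate_t - rate_c) / se
      p_pois = 2 * stats.norm.sf(abs(z))
      print(f"Mann-Whitney p = {p_mw:.3f}, Poisson z-test p = {p_pois:.3f}")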

  16. Boosting the power of schizophrenia genetics by leveraging new statistical tools.

    PubMed

    Andreassen, Ole A; Thompson, Wesley K; Dale, Anders M

    2014-01-01

    Genome-wide association studies (GWAS) have identified a large number of gene variants associated with schizophrenia, but these variants explain only a small portion of the heritability. It is becoming increasingly clear that schizophrenia is influenced by many genes, most of which have effects too small to be identified using traditional GWAS statistical methods. By applying recently developed Empirical Bayes statistical approaches, we have demonstrated that functional genic elements show differential contribution to phenotypic variance, with some elements (regulatory regions and exons) showing strong enrichment for association with schizophrenia. Applying related methods, we also showed abundant genetic overlap (pleiotropy) between schizophrenia and other phenotypes, including bipolar disorder, cardiovascular disease risk factors, and multiple sclerosis. We estimated the number of gene variants with effects in schizophrenia and bipolar disorder to be approximately 1.2%. By applying our novel statistical framework, we dramatically improved gene discovery and detected a large number of new gene loci associated with schizophrenia that have not yet been identified with standard GWAS methods. Utilizing independent schizophrenia substudies, we showed that these new loci have high replication rates in de novo samples, indicating that they likely represent true schizophrenia risk genes. The new statistical tools provide a powerful approach for uncovering more of the missing heritability of schizophrenia and other complex disorders. In conclusion, the highly polygenic architecture of schizophrenia strongly suggests the utility of research approaches that recognize schizophrenia neuropathology as a complex dynamic system, with many small gene effects integrated in functional networks.

  17. AeroStat: NASA Giovanni Tool for Statistical Intercomparison of Aerosols

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Petrenko, M.; Leptoukh, G. G.; Lynnes, C.; Da Silva, D.; Hegde, M.; Ichoku, C. M.

    2011-12-01

    Giovanni is NASA's interactive online visualization and analysis tool for exploring very large global Earth science datasets. One of the new Giovanni analytical and statistical tools, AeroStat, is designed to perform direct statistical intercomparison of global aerosol parameters. Currently, we incorporate MAPSS (Multi-sensor Aerosol Products Sampling System) data, which provide spatio-temporal statistics for multiple spaceborne Level 2 aerosol products (MODIS Terra, MODIS Aqua, MISR, POLDER, OMI and CALIOP) sampled over AERONET ground stations. The dataset period, 1997-2011 and continuously updated, is long enough to encompass a number of scientifically challenging cases of long-term, multi-sensor global aerosol validation. AeroStat allows users to easily visualize and analyze in detail the statistical properties of such cases, including data collected from multiple sensors and the quality assurance (QA) properties of these data. One of the goals of AeroStat is also to provide a collaborative research environment, where aerosol scientists can share pertinent research workflow information, including data cases of interest, algorithms, best practices, and known errors, with the broader science community and enable other users of the system to easily reproduce and independently verify their results. Furthermore, AeroStat provides easy access to data provenance (data lineage) and quality information, which allows scientific results to be traced conveniently back to their original input data, further ensuring their reliability. Case studies will be presented to show the described functionality and capabilities of AeroStat, as well as possible directions of future development.

  18. Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.

    2015-02-01

    This paper discusses a methodology to evaluate the precision and accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in only a few papers, which generally assume that errors follow a normal distribution. This hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process; finally, non-parametric statistical models are applied in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL, the VMX-450 Mobile Laser Scanning System, is tested here. The test area is the historic city centre of Trento (Italy), selected in order to assess the system performance in a challenging historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a large lack of symmetry, leading to the conclusion that the standard normal parameters are not adequate to assess this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
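
    A compact sketch of the kind of robust, non-parametric error summaries involved (median, normalized MAD, absolute-deviation percentiles, plus a normality check), using simulated check-point discrepancies rather than the Trento data:

      # Hedged sketch: classical vs. robust summaries of an error sample with gross outliers.
      import numpy as np
      from scipy import stats

      errors = np.concatenate([np.random.default_rng(2).normal(0.0, 0.02, 500),
                               [0.30, -0.25, 0.40]])          # a few gross errors (synthetic)

      # Gaussian (classical) summary
      print("mean/std:", errors.mean(), errors.std(ddof=1))

      # Robust, non-parametric summary
      median = np.median(errors)
      nmad = 1.4826 * np.median(np.abs(errors - median))       # normalized MAD
      q68 = np.percentile(np.abs(errors - median), 68.3)        # robust 1-sigma analogue
      print("median:", median, "NMAD:", nmad, "68.3% abs. deviation:", q68)

      # Normality check motivating the robust measures
      print("Shapiro-Wilk p-value:", stats.shapiro(errors).pvalue)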

  19. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  20. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  1. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  2. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR-based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation; simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve and is instead based directly on fluorescence data; two different multiple regression models were proposed to analyze the data, based on two different approaches to integrating amplification efficiency. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow the real-time PCR-based transgene copy number estimation
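
    A minimal sketch of the first design described (an external calibration curve fitted by simple linear regression, followed by a t-test on Ct values from a control and a putative event); all numbers are illustrative, not from the paper:

      # Hedged sketch: Ct vs. log10(template) calibration and a two-sample t-test on Ct values.
      import numpy as np
      from scipy import stats

      log_template = np.log10([1e1, 1e2, 1e3, 1e4, 1e5])
      ct_values = np.array([33.1, 29.8, 26.4, 23.1, 19.7])        # calibration Ct readings (invented)

      slope, intercept, r, p, se = stats.linregress(log_template, ct_values)
      efficiency = 10 ** (-1.0 / slope) - 1                        # amplification efficiency
      print(f"slope = {slope:.2f}, R^2 = {r**2:.3f}, efficiency = {efficiency:.2%}")

      ct_control = [24.1, 24.3, 24.0]      # known single-copy event (hypothetical replicates)
      ct_unknown = [23.0, 23.2, 23.1]      # putative event
      t_stat, p_val = stats.ttest_ind(ct_control, ct_unknown)
      # At ~100% efficiency, a Ct lower by 1 implies roughly a doubling of copy number.
      print(f"delta Ct = {np.mean(ct_control) - np.mean(ct_unknown):.2f}, t-test p = {p_val:.3f}")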

  3. [EpiInfo as a research and teaching tool in epidemiology and statistics: strengths and weaknesses].

    PubMed

    Mannocci, Alice; Bontempi, Claudio; Giraldi, Guglielmo; Chiaradia, Giacomina; de Waure, Chiara; Sferrazza, Antonella; Ricciardi, Walter; Boccia, Antonio; La Torre, Giuseppe

    2012-01-01

    EpiInfo is a free software developed in 1988 by the Centers for Disease Control and Prevention (CDC) in Atlanta to facilitate field epidemiological investigations and statistical analysis. The aim of this study was to assess whether the software represents, in the Italian biomedical field, an effective analytical research tool and a practical and simple epidemiology and biostatistics teaching tool. A questionnaire consisting of 20 multiple-choice and open questions was administered to 300 healthcare workers, including doctors, biologists, nurses, medical students and interns, at the end of a CME course in epidemiology and biostatistics. Sixty-four percent of participants were aged between 26 and 45 years, 52% were women and 73% were unmarried. Results show that women are more likely to utilize EpiInfo in their research activities with respect to men (p = 0.023), as are individuals aged 26-45 years with respect to the older and younger age groups (p = 0.023) and unmarried participants with respect to those married (p = 0.010). Thirty-one percent of respondents consider EpiInfo to be more than adequate for analysis of their research data and 52% consider it to be sufficiently so. The inclusion of an EpiInfo course in statistics and epidemiology modules facilitates the understanding of theoretical concepts and allows researchers to more easily perform some of the clinical/epidemiological research activities.

  4. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  5. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  6. Identifying differentially expressed proteins in two-dimensional electrophoresis experiments: inputs from transcriptomics statistical tools.

    PubMed

    Artigaud, Sébastien; Gauthier, Olivier; Pichereau, Vianney

    2013-11-01

    Two-dimensional electrophoresis is a crucial method in proteomics that allows the characterization of proteins' function and expression. This usually implies the identification of proteins that are differentially expressed between two contrasting conditions, for example, healthy versus diseased in human proteomics biomarker discovery and stressful conditions versus control in animal experimentation. The statistical procedures that lead to such identifications are critical steps in the 2-DE analysis workflow. They include a normalization step, a statistical test, and a probability correction for multiple testing. Statistical issues caused by the high dimensionality of the data and large-scale multiple testing have been a more active topic in transcriptomics than in proteomics, especially in microarray analysis. We thus propose to adapt innovative statistical tools developed for microarray analysis and incorporate them into the 2-DE analysis pipeline. In this article, we evaluate the performance of different normalization procedures, different statistical tests and false discovery rate calculation methods with both real and simulated datasets. We demonstrate that the use of statistical procedures adapted from microarrays leads to a notable increase in power as well as a minimization of the false discovery rate. More specifically, we obtained the best results in terms of reliability and sensitivity when using the 'moderated t-test' from Smyth in association with the classic false discovery rate procedure from Benjamini and Hochberg. The methods discussed are freely available in the 'prot2D' open source R-package from Bioconductor (http://www.bioconductor.org/) under the terms of the GNU General Public License (version 2 or later). sebastien.artigaud@univ-brest.fr or sebastien.artigaud@gmx.com.
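
    To illustrate only the multiple-testing step (the moderated t-test used in prot2D comes from the limma/R ecosystem; the sketch below substitutes an ordinary per-spot t-test and a hand-rolled Benjamini-Hochberg adjustment on simulated data):

      # Hedged sketch: per-spot t-tests followed by Benjamini-Hochberg FDR adjustment.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n_spots, n_rep = 500, 4
      control = rng.normal(10, 1, size=(n_spots, n_rep))
      stressed = rng.normal(10, 1, size=(n_spots, n_rep))
      stressed[:25] += 1.5                                        # 25 truly changed spots

      p = stats.ttest_ind(control, stressed, axis=1).pvalue

      # Benjamini-Hochberg adjusted p-values
      order = np.argsort(p)
      ranked = p[order] * n_spots / (np.arange(n_spots) + 1)
      q_sorted = np.minimum.accumulate(ranked[::-1])[::-1]         # enforce monotonicity
      q = np.empty_like(q_sorted)
      q[order] = q_sorted
      print("spots significant at FDR 5%:", int((q < 0.05).sum()))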

  7. Advanced clinical technology--clinical tool or expensive toy?

    PubMed

    Ganz, C H

    1995-10-01

    There should no longer be any doubt in dentists' minds when it comes to the importance of technology in the day-to-day activities of the dental office. But there is, and this doubt must be addressed if the profession is to grow and prosper to its full potential. Denying the importance of technology--or putting on blinders--only leads to complacency, and clearly does not support our profession's goal of providing the finest dental care possible. This article addresses the use of technology in a clinically related manner, and attempts to provide the practitioner with valuable practice management tools to allow for a prosperous and efficient dental practice.

  8. A new scoring system in Cystic Fibrosis: statistical tools for database analysis - a preliminary report.

    PubMed

    Hafen, G M; Hurst, C; Yearwood, J; Smith, J; Dzalilov, Z; Robinson, P J

    2008-10-05

    Cystic fibrosis is the most common fatal genetic disorder in the Caucasian population. Scoring systems for assessment of Cystic fibrosis disease severity have been used for almost 50 years, without being adapted to the milder phenotype of the disease in the 21st century. The aim of this current project is to develop a new scoring system using a database and employing various statistical tools. This study protocol reports the development of the statistical tools in order to create such a scoring system. The evaluation is based on the Cystic Fibrosis database from the cohort at the Royal Children's Hospital in Melbourne. Initially, unsupervised clustering of all data records was performed using a range of clustering algorithms; in particular, incremental clustering algorithms were used. The clusters obtained were characterised using rules from decision trees, and the results were examined by clinicians. In order to obtain a clearer definition of classes, expert opinion of each individual's clinical severity was sought. After data preparation, including expert opinion of an individual's clinical severity on a 3-point scale (mild, moderate and severe disease), two multivariate techniques were used throughout the analysis to establish a method that would have better success in feature selection and model derivation: 'Canonical Analysis of Principal Coordinates' (CAP) and 'Linear Discriminant Analysis' (DA). A 3-step procedure was performed with (1) selection of features, (2) extraction of 5 severity classes out of the 3 severity classes defined by expert opinion, and (3) establishment of calibration datasets. (1) Feature selection: CAP has a more effective "modelling" focus than DA. (2) Extraction of 5 severity classes: after variables were identified as important in discriminating contiguous CF severity groups on the 3-point scale (mild/moderate and moderate/severe), Discriminant Function (DF) analysis was used to determine the new groups mild, intermediate moderate, moderate, intermediate
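
    A minimal sketch of the discriminant-analysis step, with hypothetical feature names (fev1_percent, bmi_zscore, infection_score) standing in for the clinical variables and synthetic data in place of the Melbourne cohort:

      # Hedged sketch: Linear Discriminant Analysis classifying expert-defined severity classes.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(4)
      X = rng.normal(size=(90, 3))                       # fev1_percent, bmi_zscore, infection_score
      y = np.repeat(["mild", "moderate", "severe"], 30)  # expert-opinion severity labels
      X[y == "severe"] += 1.0                            # give the classes some separation

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print("training accuracy:", lda.score(X, y))
      print("predicted class for a new patient:", lda.predict([[0.2, -0.1, 1.3]])[0])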

  9. A new scoring system in Cystic Fibrosis: statistical tools for database analysis – a preliminary report

    PubMed Central

    Hafen, GM; Hurst, C; Yearwood, J; Smith, J; Dzalilov, Z; Robinson, PJ

    2008-01-01

    Background Cystic fibrosis is the most common fatal genetic disorder in the Caucasian population. Scoring systems for assessment of Cystic fibrosis disease severity have been used for almost 50 years, without being adapted to the milder phenotype of the disease in the 21st century. The aim of this current project is to develop a new scoring system using a database and employing various statistical tools. This study protocol reports the development of the statistical tools in order to create such a scoring system. Methods The evaluation is based on the Cystic Fibrosis database from the cohort at the Royal Children's Hospital in Melbourne. Initially, unsupervised clustering of all data records was performed using a range of clustering algorithms; in particular, incremental clustering algorithms were used. The clusters obtained were characterised using rules from decision trees, and the results were examined by clinicians. In order to obtain a clearer definition of classes, expert opinion of each individual's clinical severity was sought. After data preparation, including expert opinion of an individual's clinical severity on a 3-point scale (mild, moderate and severe disease), two multivariate techniques were used throughout the analysis to establish a method that would have better success in feature selection and model derivation: 'Canonical Analysis of Principal Coordinates' (CAP) and 'Linear Discriminant Analysis' (DA). A 3-step procedure was performed with (1) selection of features, (2) extraction of 5 severity classes out of the 3 severity classes defined by expert opinion, and (3) establishment of calibration datasets. Results (1) Feature selection: CAP has a more effective "modelling" focus than DA. (2) Extraction of 5 severity classes: after variables were identified as important in discriminating contiguous CF severity groups on the 3-point scale (mild/moderate and moderate/severe), Discriminant Function (DF) analysis was used to determine the new groups mild, intermediate moderate

  10. Advanced tools and framework for historical film restoration

    NASA Astrophysics Data System (ADS)

    Croci, Simone; Aydın, Tunç Ozan; Stefanoski, Nikolce; Gross, Markus; Smolic, Aljosa

    2017-01-01

    Digital restoration of film content that has historical value is crucial for the preservation of cultural heritage. Also, digital restoration is not only a relevant application area of various video processing technologies that have been developed in computer graphics literature but also involves a multitude of unresolved research challenges. Currently, the digital restoration workflow is highly labor intensive and often heavily relies on expert knowledge. We revisit some key steps of this workflow and propose semiautomatic methods for performing them. To do that we build upon state-of-the-art video processing techniques by adding the components necessary for enabling (i) restoration of chemically degraded colors of the film stock, (ii) removal of excessive film grain through spatiotemporal filtering, and (iii) contrast recovery by transferring contrast from the negative film stock to the positive. We show that when applied individually our tools produce compelling results and when applied in concert significantly improve the degraded input content. Building on a conceptual framework of film restoration ensures the best possible combination of tools and use of available materials.

  11. AMAS: a fast tool for alignment manipulation and computing of summary statistics

    PubMed Central

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python’s core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License. PMID:26835189

  12. A 3D Interactive Multi-object Segmentation Tool using Local Robust Statistics Driven Active Contours

    PubMed Central

    Gao, Yi; Kikinis, Ron; Bouix, Sylvain; Shenton, Martha; Tannenbaum, Allen

    2012-01-01

    Extracting anatomically and functionally significant structures is one of the important tasks both for the theoretical study of medical image analysis and for the clinical and practical community. In the past, much work has been dedicated only to algorithmic development. Nevertheless, for clinical end users, a well-designed algorithm with interactive software is necessary for an algorithm to be utilized in their daily work. Furthermore, the software should ideally be open source, so that it can be used and validated not only by the authors but also by the entire community. Therefore, the contribution of the present work is twofold. First, we propose a new robust-statistics-based conformal metric and a conformal-area-driven multiple active contour framework to simultaneously extract multiple targets from MR and CT medical imagery in 3D. Second, an open source, graphically interactive 3D segmentation tool based on the aforementioned contour evolution is implemented and is publicly available for end users on multiple platforms. In using this software for the segmentation task, the process is initiated by user-drawn strokes (seeds) in the target region of the image. Then, local robust statistics are used to describe the object features, and such features are learned adaptively from the seeds under a non-parametric estimation scheme. Subsequently, several active contours evolve simultaneously, with their interactions motivated by the principles of action and reaction. This not only guarantees mutual exclusiveness among the contours, but also no longer relies upon the assumption that the multiple objects fill the entire image domain, which was tacitly or explicitly assumed in many previous works. In doing so, the contours interact and converge to equilibrium at the desired positions of the desired multiple objects. Furthermore, with the aim of not only validating the algorithm and the software, but also demonstrating how the tool is to be used, we

  13. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in the recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License.
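
    As a plain-Python illustration of the per-alignment summaries listed above (number of taxa, alignment length, missing-data percentage, proportion of variable sites), computed directly from aligned sequences rather than with the AMAS package itself:

      # Hedged sketch: basic alignment summary statistics on a toy alignment.
      alignment = {
          "taxon_A": "ATGC-TTGACA",
          "taxon_B": "ATGCNTTGACA",
          "taxon_C": "ATGGTTTGACA",
      }

      seqs = list(alignment.values())
      n_taxa, length = len(seqs), len(seqs[0])
      cells = n_taxa * length
      missing = sum(seq.count("-") + seq.count("N") + seq.count("?") for seq in seqs)
      variable = sum(
          len({seq[i] for seq in seqs} - {"-", "N", "?"}) > 1 for i in range(length)
      )
      print(f"taxa={n_taxa}, length={length}, missing={100 * missing / cells:.1f}%, "
            f"variable sites={variable}/{length}")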

  14. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
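
    A minimal sketch of the central calculation, assuming simulated pollutant and wind data rather than the London measurements: a Pearson correlation locally weighted by a Gaussian kernel centred on one wind speed/direction cell of the polar surface:

      # Hedged sketch: kernel-weighted Pearson correlation between two pollutants.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 5000
      ws = rng.uniform(0, 10, n)                 # wind speed (m/s)
      wd = rng.uniform(0, 360, n)                # wind direction (degrees)
      nox = rng.lognormal(3, 0.4, n)
      pm25 = 0.3 * nox + rng.lognormal(2, 0.3, n)

      def weighted_corr(x, y, w):
          mx, my = np.average(x, weights=w), np.average(y, weights=w)
          cov = np.average((x - mx) * (y - my), weights=w)
          return cov / np.sqrt(np.average((x - mx) ** 2, weights=w) *
                               np.average((y - my) ** 2, weights=w))

      # Gaussian kernel centred on the surface cell ws = 4 m/s, wd = 200 degrees
      dwd = np.minimum(np.abs(wd - 200), 360 - np.abs(wd - 200))   # circular distance
      w = np.exp(-0.5 * ((ws - 4) / 1.0) ** 2) * np.exp(-0.5 * (dwd / 20.0) ** 2)
      print("locally weighted r:", weighted_corr(nox, pm25, w))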

  15. The use of machine learning and nonlinear statistical tools for ADME prediction.

    PubMed

    Sakiyama, Yojiro

    2009-02-01

    Absorption, distribution, metabolism and excretion (ADME)-related failure of drug candidates is a major issue for the pharmaceutical industry today. Prediction of ADME by in silico tools has now become an inevitable paradigm to reduce cost and enhance efficiency in pharmaceutical research. Recently, machine learning as well as nonlinear statistical tools have been widely applied to predict routine ADME end points. To achieve accurate and reliable predictions, it is a prerequisite to understand the concepts, mechanisms and limitations of these tools. Here, we have devised a small synthetic nonlinear data set to help understand the mechanism of machine learning by 2D visualisation. We applied six machine learning methods to four different data sets. The methods include the Naive Bayes classifier, classification and regression trees, random forests, Gaussian processes, support vector machines and k-nearest neighbours. The results demonstrated that ensemble learning and kernel machines displayed greater prediction accuracy than classical methods irrespective of data set size. The importance of interaction with the engineering field is also addressed. The results described here provide insights into the mechanism of machine learning, which will enable appropriate usage in the future.
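
    In the same spirit as the review's use of small synthetic data, a rough sketch comparing two of the learners it names (random forest versus k-nearest neighbour) on a toy nonlinear binary endpoint; the data set and hyperparameters are arbitrary illustrations:

      # Hedged sketch: cross-validated comparison of two classifiers on synthetic data.
      from sklearn.datasets import make_moons
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score

      X, y = make_moons(n_samples=400, noise=0.3, random_state=0)   # toy ADME-like endpoint
      for name, clf in [("random forest", RandomForestClassifier(random_state=0)),
                        ("kNN", KNeighborsClassifier(n_neighbors=5))]:
          acc = cross_val_score(clf, X, y, cv=5).mean()
          print(f"{name}: mean CV accuracy = {acc:.3f}")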

  16. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
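
    A minimal sketch of the regression step only (not the I-kaz 3D computation itself): a simple linear fit of flank wear VB against a kurtosis-style signal coefficient, with invented values:

      # Hedged sketch: linear regression of flank wear VB on a signal coefficient.
      import numpy as np
      from scipy import stats

      coeff = np.array([0.95, 0.80, 0.66, 0.55, 0.47, 0.40])   # decreases with wear (hypothetical)
      flank_wear = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])   # VB in mm (hypothetical)

      fit = stats.linregress(coeff, flank_wear)
      print(f"VB = {fit.slope:.3f} * coeff + {fit.intercept:.3f}, R^2 = {fit.rvalue**2:.3f}")
      print("estimated VB at coeff = 0.5:", fit.slope * 0.5 + fit.intercept)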

  17. Scratching the itch: new tools to advance understanding of scabies.

    PubMed

    Mounsey, Kate E; McCarthy, James S; Walton, Shelley F

    2013-01-01

    Scabies remains a significant public health problem worldwide. Research into aspects of Sarcoptes scabiei biology and host-parasite interactions has been impeded by an inability to maintain mites in vitro and by limited access to parasite material and infected subjects. The generation of comprehensive expressed sequence tag libraries has enabled the initial characterisation of molecules of interest to diagnostics, vaccines, and drug resistance. The recent development and utilisation of animal models, combined with next-generation technologies, is anticipated to lead to new strategies to prevent, diagnose, and treat scabies, ultimately improving skin health in both human and veterinary settings. This article will summarise recent molecular and immunologic advances on scabies, and will address priorities for the exciting 'next chapter' of scabies research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
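
    A minimal sketch of the diagnostic itself: the power spectrum of a mock bond-coordinate time series computed by FFT, where sharp isolated bands would suggest weak coupling and a broad diffuse spectrum strong mode coupling; the signal components and time step are invented:

      # Hedged sketch: power spectrum of a synthetic trajectory signal via FFT.
      import numpy as np

      dt = 0.5e-15                                    # time step in seconds (illustrative)
      t = np.arange(0, 2**14) * dt
      signal = (np.sin(2 * np.pi * 9.0e13 * t)         # mock C-H stretch component
                + 0.3 * np.sin(2 * np.pi * 3.0e13 * t)
                + 0.05 * np.random.default_rng(6).normal(size=t.size))

      power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
      freq = np.fft.rfftfreq(t.size, d=dt)
      print("dominant frequency (Hz):", freq[np.argmax(power)])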

  19. Advanced statistical process control: controlling sub-0.18-μm lithography and other processes

    NASA Astrophysics Data System (ADS)

    Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.

    2001-08-01

    Feed-forward, as a method to control the lithography process for critical dimensions and overlay, is well known in the semiconductor industry. However, the control provided by simple averaging feed-forward methodologies is not sufficient to support the complexity of a sub-0.18 µm lithography process. Also, simple feed-forward techniques are not applicable to logic and ASIC production because of the many different products and lithography-chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are generally called APC, Advanced Process Control, applications. Today, there are as many APC methods as there are engineers involved. To meet the stringent requirements of 0.18 µm production, we selected a method described in SPIE 3998-48 (March 2000) by Terrence Zavecz and Rene Blanquies from Yield Dynamics Inc. This method is called PPC, Predictive Process Control, and employs a methodology of collecting measurement results and the modeled bias attributes of exposure tools, reticles and the incoming process in a signatures database. With PPC, before each lot exposure, the signatures of the lithography tool, the reticle and the incoming process are used to predict the setup of the lot process and the expected lot results. Benefits derived from such an implementation are very clear; there is no limitation on the number of products or lithography-chemistry combinations, and the technique avoids the short memory of conventional APC techniques. ... and what's next? (Rob Morton, Philips assignee to International Sematech). The next part of the paper tries to answer this question. Observing that CMP and metal deposition significantly influence CD and overlay results, and that even contact etch can have a significant influence on Metal 5 overlay, we developed a more general PPC for lithography. Starting with the existing lithography PPC applications database, the authors extended the

  20. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three chamber III-Nitride epi cluster tool for low-cost, high volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world's largest wafer capacity HVPE chamber suitable for repeatable high volume III-Nitride template and device manufacturing. Applied Materials' experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding was used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable for not just the LED industry, but GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rates of 3 µm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD at a much cheaper overall cost. This is a unique development as the HVPE process has been known for decades, but never successfully commercially developed for high volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN-based LED wafers and industry

  1. Regional Sediment Management (RSM) Modeling Tools: Integration of Advanced Sediment Transport Tools into HEC-RAS

    DTIC Science & Technology

    2014-06-01

    Simulating sediment transport with unsteady flow also leverages several existing modeling tools native to the unsteady flow environment for sediment ...GIS and sediment routing of the proposed removal of Ballville Dam, Sandusky River, Ohio. Journal of the American Water Resources Association 38:1,549...ERDC/CHL CHETN-XIV-36 June 2014 Approved for public release; distribution is unlimited. Regional Sediment Management (RSM) Modeling Tools

  2. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  3. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  4. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

  5. An advanced artificial intelligence tool for menu design.

    PubMed

    Khan, Abdus Salam; Hoffmann, Achim

    2003-01-01

    Computer-assisted menu design remains a difficult task. Usually, the knowledge that aids computer-based menu design is hard-coded, and because of that a computerised menu planner cannot handle the menu design problem for an unanticipated client. To address this problem we developed a menu design tool, MIKAS (menu construction using incremental knowledge acquisition system), an artificial intelligence system that allows the incremental development of a knowledge base for menu design. We allow an incremental knowledge acquisition process in which the expert is only required to provide hints to the system in the context of actual problem instances during menu design, using menus stored in a so-called Case Base. Our system incorporates Case-Based Reasoning (CBR), an Artificial Intelligence (AI) technique developed to mimic human problem-solving behaviour. Ripple Down Rules (RDR) are a proven technique for acquiring classification knowledge directly from experts while they are using the system, and they complement CBR in a very fruitful way. This combination allows the incremental improvement of the menu design system while it is already in routine use. We believe MIKAS allows better dietary practice by leveraging a dietitian's skills and expertise. As such, MIKAS has the potential to be helpful for any institution where dietary advice is practised.

  6. Atomic force microscopy as an advanced tool in neuroscience

    PubMed Central

    Jembrek, Maja Jazvinšćak; Šimić, Goran; Hof, Patrick R.; Šegota, Suzana

    2015-01-01

    This review highlights relevant issues about applications and improvements of atomic force microscopy (AFM) toward a better understanding of neurodegenerative changes at the molecular level with the hope of contributing to the development of effective therapeutic strategies for neurodegenerative illnesses. The basic principles of AFM are briefly discussed in terms of evaluation of experimental data, including the newest PeakForce Quantitative Nanomechanical Mapping (QNM) and the evaluation of Young’s modulus as the crucial elasticity parameter. AFM topography, revealed in imaging mode, can be used to monitor changes in live neurons over time, representing a valuable tool for high-resolution detection and monitoring of neuronal morphology. The mechanical properties of living cells can be quantified by force spectroscopy as well as by new AFM. A variety of applications are described, and their relevance for specific research areas discussed. In addition, imaging as well as non-imaging modes can provide specific information, not only about the structural and mechanical properties of neuronal membranes, but also on the cytoplasm, cell nucleus, and particularly cytoskeletal components. Moreover, new AFM is able to provide detailed insight into physical structure and biochemical interactions in both physiological and pathophysiological conditions. PMID:28123795

  7. Powerful tools for genetic modification: Advances in gene editing.

    PubMed

    Roesch, Erica A; Drumm, Mitchell L

    2017-09-27

    Recent discoveries and technical advances in genetic engineering, methods called gene or genome editing, provide hope for repairing genes that cause diseases like cystic fibrosis (CF) or otherwise altering a gene for therapeutic benefit. There are both hopes and hurdles with these technologies, with new ideas emerging almost daily. Initial studies using intestinal organoid cultures carrying the common, F508del mutation have shown that gene editing by CRISPR/Cas9 can convert cells lacking CFTR function to cells with normal channel function, providing a precedent that this technology can be harnessed for CF. While this is an important precedent, the challenges that remain are not trivial. A logistical issue for this and many other genetic diseases is genetic heterogeneity. Approximately, 2000 mutations associated with CF have been found in CFTR, the gene responsible for CF, and thus a feasible strategy that would encompass all individuals affected by the disease is particularly difficult to envision. However, single strategies that would be applicable to all subjects affected by CF have been conceived and are being investigated. With all of these approaches, efficiency (the proportion of cells edited), accuracy (how often other sites in the genome are affected), and delivery of the gene editing components to the desired cells are perhaps the most significant, impending hurdles. Our understanding of each of these areas is increasing rapidly, and while it is impossible to predict when a successful strategy will reach the clinic, there is every reason to believe it is a question of "when" and not "if." © 2017 Wiley Periodicals, Inc.

  8. Electrochemical Processing Tools for Advanced Copper Interconnects: An Introduction

    NASA Astrophysics Data System (ADS)

    Datta, Madhav

    The change from vacuum-deposited aluminum to electroplated copper in 1997 brought about a paradigm shift in interconnect technology and in chip making [1]. Since then, most of the leading chip manufacturers have converted to electroplated Cu technology for chip interconnects. Cu interconnects are fabricated by the dual Damascene process, which refers to a metallization patterning process by which two insulator (dielectric) levels are patterned, filled with copper, and planarized to create a metal layer consisting of vias and lines. The process steps consist of laying a sandwich of two levels of insulator and etch stop layers that are patterned as holes for vias and troughs for lines. They are then filled with a single metallization step. Finally, the excess material is removed, and the wafer is planarized by chemical mechanical polishing (CMP). While the finer details of the exact sequence of fabrication steps vary, the end result of forming a metal layer remains the same, in which vias are formed in the lower layer and trenches are formed in the upper layer. Electroplating enables deposition of Cu in via holes and overlying trenches in a single step, thus eliminating a via/line interface and significantly reducing the cycle time. For these reasons, and because of its relatively inexpensive tooling, electroplating is a cost-effective and efficient process for Cu interconnects [2, 3]. Compared with vacuum deposition processes, electroplated Cu provides improved super-filling capabilities and abnormal grain growth phenomena. These properties contribute significantly to improved reliability of Cu interconnects. With the proper choice of additives and plating conditions, void-free, seam-free Damascene deposits are obtained, which eliminates surface-like fast diffusion paths for Cu electromigration.

  9. Advancing alternate tools: why science education needs CRP and CRT

    NASA Astrophysics Data System (ADS)

    Dodo Seriki, Vanessa

    2016-09-01

    Ridgeway and Yerrick's paper, Whose banner are we waving?: exploring STEM partnerships for marginalized urban youth, unearthed the tensions that existed between a local community "expert" and a group of students and their facilitator in an afterschool program. Those of us who work with youth who are traditionally marginalized understand the importance of teaching in culturally relevant ways, but far too often—as Ridgeway and Yerrick shared—community partners have beliefs, motives, and ideologies that are incompatible with the program's mission and goals. Nevertheless, we often enter partnerships assuming that the other party understands the needs of the students or community; understands how in U.S. society White is normative while all others are deficient; and understands how to engage with students in culturally relevant ways. This forum addresses the underlying assumption, described in the Ridgeway and Yerrick article, that educators—despite their background and experiences—are able to teach in culturally relevant ways. Additionally, based on the findings in the article, I assert that just as Ladson-Billings and Tate (Teach Coll Rec 97(1):47-68, 1995) argued that race in U.S. society was undertheorized as a scholarly pursuit, the same is true of science education: race in science education is undertheorized, and the use of culturally relevant pedagogy and critical race theory as a pedagogical model and analytical tool, respectively, is minimal. The increased use of both would impact our understanding of who does science, and how to broaden participation among people of color.

  10. Investigation of the gibberellic acid optimization with a statistical tool from Penicillium variable in batch reactor.

    PubMed

    Isa, Nur Kamilah Md; Mat Don, Mashitah

    2014-01-01

    The culture conditions for gibberellic acid (GA3) production by the fungus Penicillium variable (P. variable) were optimized using a statistical tool, response surface methodology (RSM). Interactions of culture conditions and optimization of the system were studied using a Box-Behnken design (BBD) with three levels of three variables in a batch flask reactor. Experimentation showed that the model developed based on RSM and BBD predicted GA3 production with R2 = 0.987. The predicted GA3 production was optimum (31.57 mg GA3/kg substrate) when the culture conditions were 7 days of incubation, 21% v/w inoculum size, and 2% v/w olive oil concentration as a natural precursor. The results indicated that the RSM and BBD methods were effective for optimizing the culture conditions of GA3 production by P. variable mycelia.
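    As a minimal sketch of the kind of analysis described, the following Python snippet fits a full second-order response surface to a three-factor Box-Behnken design; the coded design points follow the standard BBD layout, but the response values are illustrative, not the study's measurements.

```python
# Minimal sketch: fit a second-order response surface to Box-Behnken data
# (illustrative yields only; not the study's actual measurements).
import numpy as np
from itertools import combinations

# Coded factor levels (-1, 0, +1) for incubation time, inoculum size, olive oil.
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])
y = np.array([18.2, 22.5, 20.1, 25.3, 17.8, 21.0, 24.4, 28.9,
              19.5, 23.1, 26.2, 29.8, 31.1, 31.6, 30.9])  # hypothetical GA3 yields

def design_matrix(X):
    """Linear, interaction, and quadratic terms of a full second-order model."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    return np.column_stack(cols)

D = design_matrix(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
y_hat = D @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2 =", round(r2, 3))
```

    The fitted coefficients can then be inspected, or the quadratic surface maximized, to locate the optimum combination of the three factors.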

  11. Statistical thermodynamics concepts and mathematical tools for a multi-agent ecosystem.

    PubMed

    Segovia, Javier

    2014-01-01

    Finding the distribution of systems over their possible states is a mathematical problem. One possible solution is the method of the most probable distribution developed by Boltzmann. This method has been instrumental in developing statistical mechanics and explaining the origin of many thermodynamics concepts, like entropy or temperature, but is also applicable in many other fields like ecology or economics. Artificial ecosystems have many features in common with ecological or economic systems, but surprisingly the method does not appear to have been very successful in this field of application. The hypothesis of this article is that this failure is due to the incorrect interpretation of the method's concepts and mathematical tools. We propose to review and reinterpret the method so that it can be correctly applied and all its potential exploited in order to study and characterize the global behavior of an artificial multi-agent ecosystem.
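    A compact illustration of the method of the most probable distribution follows: maximizing entropy subject to normalization and a fixed mean "energy" yields p_i proportional to exp(-beta*E_i), with the Lagrange multiplier beta chosen so the constraint holds. The energy levels and target mean below are illustrative assumptions, not values from the article.

```python
# Minimal sketch of the method of the most probable distribution:
# maximize entropy subject to a fixed mean "energy", giving p_i ~ exp(-beta*E_i).
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # states' "energies" (arbitrary units)
target_mean = 1.2                          # constraint: <E> must equal this value

def boltzmann(beta):
    w = np.exp(-beta * E)
    return w / w.sum()

def mean_energy_gap(beta):
    return boltzmann(beta) @ E - target_mean

# Find the Lagrange multiplier beta that satisfies the constraint.
beta = brentq(mean_energy_gap, -50.0, 50.0)
p = boltzmann(beta)
entropy = -np.sum(p * np.log(p))
print("beta =", round(beta, 4))
print("most probable distribution:", np.round(p, 4))
print("entropy =", round(entropy, 4))
```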

  12. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
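    The workflow the tool automates (correlate a gridded field with a predictand, extract a predictor from a high-correlation region, fit a statistical model) can be sketched as follows. The data here are synthetic stand-ins for CHIRPS/reanalysis fields, and the selected "polygon" is a hypothetical box, not output of the actual Shiny interface.

```python
# Minimal sketch of the tool's workflow with synthetic data: grid-point
# correlation map, region-averaged predictor, simple regression model.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_lat, n_lon = 30, 20, 40
field = rng.normal(size=(n_years, n_lat, n_lon))          # e.g. SST anomalies
predictand = 0.8 * field[:, 5:10, 10:20].mean(axis=(1, 2)) + rng.normal(scale=0.3, size=n_years)

# 1) Grid-point correlation map between the field and the predictand.
fa = field - field.mean(axis=0)
pa = predictand - predictand.mean()
corr = (fa * pa[:, None, None]).sum(axis=0) / (
    np.sqrt((fa ** 2).sum(axis=0)) * np.sqrt((pa ** 2).sum())
)

# 2) "Draw a polygon": here, average over the box with the strongest correlations.
predictor = field[:, 5:10, 10:20].mean(axis=(1, 2))

# 3) Fit a simple linear model (a GLM stand-in) and report skill.
slope, intercept = np.polyfit(predictor, predictand, 1)
pred = slope * predictor + intercept
r = np.corrcoef(pred, predictand)[0, 1]
print("max grid correlation:", round(np.abs(corr).max(), 3))
print("regression r =", round(r, 3))
```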

  13. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypothesis pertaining to various aspects of geostatistical investigations has been slow in developing. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
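    One standard way to adapt the classical t test to correlated data is to replace the nominal sample size with an effective sample size derived from the lag-1 autocorrelation. The sketch below illustrates that idea; it is a generic correction, not necessarily the specific modification developed in this work.

```python
# Minimal sketch of a t test adjusted for serial correlation via an effective
# sample size, n_eff = n * (1 - r1) / (1 + r1).
import numpy as np
from scipy import stats

def lag1_autocorr(x):
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

def corrected_one_sample_t(x, mu0=0.0):
    n = len(x)
    r1 = max(lag1_autocorr(x), 0.0)          # ignore negative autocorrelation
    n_eff = n * (1 - r1) / (1 + r1)
    t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n_eff))
    p = 2 * stats.t.sf(abs(t), df=n_eff - 1)
    return t, p, n_eff

rng = np.random.default_rng(1)
# Correlated AR(1) series with true mean 0.5.
e = rng.normal(size=200)
x = np.empty_like(e)
x[0] = e[0]
for i in range(1, len(e)):
    x[i] = 0.6 * x[i - 1] + e[i]
x += 0.5

t, p, n_eff = corrected_one_sample_t(x, mu0=0.0)
print(f"t = {t:.2f}, p = {p:.4f}, effective n = {n_eff:.1f} (of {len(x)})")
```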

  14. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed tutorials and appropriate mentoring.
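    The core measurement the students practice, aperture photometry, can be sketched without any of the packages mentioned: sum the counts inside a circular aperture around the star and subtract a background estimated from a surrounding annulus. The image and star below are synthetic, not observatory data.

```python
# Minimal sketch of aperture photometry on a synthetic star image.
import numpy as np

def aperture_flux(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    aperture = r <= r_ap
    annulus = (r >= r_in) & (r <= r_out)
    sky_per_pixel = np.median(image[annulus])
    return image[aperture].sum() - sky_per_pixel * aperture.sum()

# Synthetic frame: flat sky plus one Gaussian star at the centre.
rng = np.random.default_rng(2)
yy, xx = np.indices((64, 64))
image = 100.0 + rng.normal(scale=2.0, size=(64, 64))
image += 5000.0 * np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2))

flux = aperture_flux(image, 32, 32)
mag = -2.5 * np.log10(flux) + 25.0    # instrumental magnitude, arbitrary zero point
print(f"background-subtracted flux = {flux:.0f} counts, m_inst = {mag:.2f}")
```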

  15. Statistical Dimensioning of Nutrient Loading Reduction: LLR Assessment Tool for Lake Managers.

    PubMed

    Kotamäki, Niina; Pätynen, Anita; Taskinen, Antti; Huttula, Timo; Malve, Olli

    2015-08-01

    Implementation of the EU Water Framework Directive (WFD) poses a great challenge for river basin management planning. Assessing the water quality of lakes and coastal waters as well as setting the accepted nutrient loading levels requires appropriate decision-support tools and models. Uncertainty that is inevitably related to the assessment results and arises from several sources calls for more precise quantification and consideration. In this study, we present a modeling tool, called lake load response (LLR), which can be used for statistical dimensioning of the nutrient loading reduction. LLR calculates the reduction that is needed to achieve good ecological status in a lake in terms of total nutrients and chlorophyll a (chl-a) concentration. We show that by combining an empirical nutrient retention model with a hierarchical chl-a model, the national lake monitoring data can be used more efficiently for predictions for a single lake. To estimate the uncertainties, we separate the residual variability and the parameter uncertainty of the modeling results with the probabilistic Bayesian modeling framework. LLR has been developed to answer the urgent need for fast and simple assessment methods, especially when implementing the WFD at such an extensive scale as in Finland. With a case study of a eutrophic Finnish lake, we demonstrate how the model can be utilized to set the target loadings, how the uncertainties are quantified, and how they accumulate within the modeling chain.
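    The separation of parameter uncertainty from residual variability can be illustrated with a much simpler Bayesian regression than the hierarchical model described here. The sketch below uses a flat prior and synthetic loading/chl-a numbers, not Finnish monitoring data or the LLR model itself.

```python
# Minimal sketch: Bayesian linear regression with posterior draws split into
# a credible interval (parameter uncertainty) and a prediction interval
# (parameter uncertainty + residual variability).
import numpy as np

rng = np.random.default_rng(3)
n = 60
loading = rng.uniform(5, 50, size=n)                            # nutrient loading
chla = 2.0 + 0.35 * loading + rng.normal(scale=2.5, size=n)     # chlorophyll-a

X = np.column_stack([np.ones(n), loading])
beta_hat = np.linalg.lstsq(X, chla, rcond=None)[0]
resid = chla - X @ beta_hat
sse = resid @ resid
XtX_inv = np.linalg.inv(X.T @ X)
p = X.shape[1]

# Posterior draws: sigma^2 ~ scaled inverse chi-square, beta | sigma^2 ~ normal.
n_draws = 5000
sigma2 = sse / rng.chisquare(n - p, size=n_draws)
betas = np.array([rng.multivariate_normal(beta_hat, s2 * XtX_inv) for s2 in sigma2])

x_new = np.array([1.0, 30.0])                              # predict chl-a at loading = 30
mean_pred = betas @ x_new                                  # parameter uncertainty only
full_pred = mean_pred + rng.normal(scale=np.sqrt(sigma2))  # plus residual variability

print("95% CI (parameter uncertainty):", np.round(np.percentile(mean_pred, [2.5, 97.5]), 2))
print("95% PI (incl. residual variability):", np.round(np.percentile(full_pred, [2.5, 97.5]), 2))
```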

  16. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
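    The kind of comparison made here, linear discriminant methods versus PLS-DA versus a tree-based model under cross-validation, can be sketched as below. The synthetic "spectra" stand in for the actual Raman data, and the simple PLS-DA wrapper (regress a 0/1 label on the spectra and threshold at 0.5) is one common formulation, not necessarily the authors' exact implementation.

```python
# Minimal sketch: compare LDA, PLS-DA and a decision tree with 5-fold CV.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cross_decomposition import PLSRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score, KFold
from sklearn.base import BaseEstimator, ClassifierMixin

X, y = make_classification(n_samples=200, n_features=300, n_informative=20,
                           n_classes=2, random_state=0)

class PLSDA(BaseEstimator, ClassifierMixin):
    """PLS-DA: regress a 0/1 class label on the spectra, threshold at 0.5."""
    def __init__(self, n_components=5):
        self.n_components = n_components
    def fit(self, X, y):
        self.pls_ = PLSRegression(n_components=self.n_components).fit(X, y)
        return self
    def predict(self, X):
        return (self.pls_.predict(X).ravel() > 0.5).astype(int)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("PLS-DA", PLSDA(n_components=5)),
                  ("Decision tree", DecisionTreeClassifier(random_state=0))]:
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"{name:13s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```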

  17. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario.

    PubMed

    Ghanate, A D; Kothiwale, S; Singh, S P; Bertrand, Dominique; Krishna, C Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.

  18. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  19. Utility of the advanced chronic kidney disease patient management tools: case studies.

    PubMed

    Patwardhan, Meenal B; Matchar, David B; Samsa, Gregory P; Haley, William E

    2008-01-01

    Appropriate management of advanced chronic kidney disease (CKD) delays or limits its progression. The Advanced CKD Patient Management Toolkit was developed using a process-improvement technique to assist patient management and address CKD-specific management issues. We pilot tested the toolkit in 2 community nephrology practices, assessed the utility of individual tools, and evaluated the impact on conformance to an advanced CKD guideline through patient chart abstraction. Tool use was distinct in the 2 sites and depended on the site champion's involvement, the extent of process reconfiguration demanded by a tool, and its perceived value. Baseline conformance varied across guideline recommendations (averaged 54%). Posttrial conformance increased in all clinical areas (averaged 59%). Valuable features of the toolkit in real-world settings were its ability to: facilitate tool selection, direct implementation efforts in response to a baseline performance audit, and allow selection of tool versions and customizing them. Our results suggest that systematically created, multifaceted, and customizable tools can promote guideline conformance.

  20. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II)

    DTIC Science & Technology

    2015-09-30

    Centre for Research into Ecological and Environmental Modelling (CREEM), University of St Andrews, St Andrews, KY16 9LZ, UK. The aim of the work is to build a coherent statistical framework for modeling the effects of disturbance, particularly acoustic disturbance, on different species of marine mammals.

  1. Validation of surrogate endpoints in advanced solid tumors: systematic review of statistical methods, results, and implications for policy makers.

    PubMed

    Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S

    2014-07-01

    Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.
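    One common trial-level surrogacy analysis correlates treatment effects on the surrogate with treatment effects on the true endpoint across trials, weighting by trial size. The sketch below illustrates that idea with invented log hazard ratios; the numbers are not results from the cited review, and weighted least squares is only one of the several methods the review covers.

```python
# Minimal sketch of a trial-level surrogacy analysis: weighted association
# between treatment effects on PFS (surrogate) and OS (true endpoint).
import numpy as np

log_hr_pfs = np.array([-0.45, -0.30, -0.60, -0.10, -0.25, -0.50, -0.05, -0.35])
log_hr_os  = np.array([-0.20, -0.15, -0.35, -0.05, -0.10, -0.30,  0.02, -0.18])
n_patients = np.array([420, 310, 550, 180, 260, 610, 150, 380])

w = n_patients / n_patients.sum()
slope, intercept = np.polyfit(log_hr_pfs, log_hr_os, 1, w=w)

def weighted_corr(x, y, w):
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    return cov / np.sqrt(np.average((x - mx) ** 2, weights=w) *
                         np.average((y - my) ** 2, weights=w))

r = weighted_corr(log_hr_pfs, log_hr_os, w)
print(f"surrogate->true slope = {slope:.2f}, trial-level R^2 = {r**2:.2f}")
```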

  2. Use of graphical statistical process control tools to monitor and improve outcomes in cardiac surgery.

    PubMed

    Smith, Ian R; Garlick, Bruce; Gardner, Michael A; Brighouse, Russell D; Foster, Kelley A; Rivers, John T

    2013-02-01

    Graphical Statistical Process Control (SPC) tools have been shown to promptly identify significant variations in clinical outcomes in a range of health care settings. We explored the application of these techniques to qualitatively inform the routine cardiac surgical morbidity and mortality (M&M) review process at a single site. Baseline clinical and procedural data relating to 4774 consecutive cardiac surgical procedures, performed between the 1st January 2003 and the 30th April 2011, were retrospectively evaluated. A range of appropriate performance measures and benchmarks were developed and evaluated using a combination of CUmulative SUM (CUSUM) charts, Exponentially Weighted Moving Average (EWMA) charts and Funnel Plots. Charts have been discussed at the unit's routine M&M meetings. Risk adjustment (RA) based on EuroSCORE has been incorporated into the charts to improve performance. Discrete and aggregated measures, including Blood Product/Reoperation, major acute post-procedural complications and Length of Stay/Readmission<28 days have proved to be usable measures for monitoring outcomes. Monitoring trends in minor morbidities provides a valuable warning of impending changes in significant events. Instances of variation in performance have been examined and could be related to differences in individual operator performance via individual operator curves. SPC tools facilitate near "real-time" performance monitoring allowing early detection and intervention in altered performance. Careful interpretation of charts for group and individual operators has proven helpful in detecting and differentiating systemic vs. individual variation. Copyright © 2012 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
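    Two of the chart types described can be sketched compactly: a risk-adjusted CUSUM of observed-minus-expected mortality and an EWMA of the outcome series. The outcomes and per-case predicted risks below are simulated stand-ins for EuroSCORE-adjusted unit data, and the chart parameters are illustrative.

```python
# Minimal sketch of a risk-adjusted CUSUM and an EWMA on simulated outcomes.
import numpy as np

rng = np.random.default_rng(4)
n = 300
predicted_risk = rng.uniform(0.01, 0.15, size=n)        # per-case predicted mortality
died = rng.random(n) < predicted_risk                    # simulated outcomes

# Risk-adjusted CUSUM: accumulate observed minus expected, floored at zero.
cusum = np.zeros(n)
for i in range(n):
    prev = cusum[i - 1] if i else 0.0
    cusum[i] = max(0.0, prev + died[i] - predicted_risk[i])

# EWMA of the binary outcome (lambda = 0.05), started at the mean predicted risk.
lam = 0.05
ewma = np.zeros(n)
level = predicted_risk.mean()
for i in range(n):
    level = lam * died[i] + (1 - lam) * level
    ewma[i] = level

print("final CUSUM value:", round(cusum[-1], 2))
print("final EWMA mortality estimate:", round(ewma[-1], 3))
```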

  3. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  4. msap: a tool for the statistical analysis of methylation-sensitive amplified polymorphism data.

    PubMed

    Pérez-Figueroa, A

    2013-05-01

    In this study msap, an R package which analyses methylation-sensitive amplified polymorphism (MSAP or MS-AFLP) data, is presented. The program provides a deep analysis of epigenetic variation starting from a binary data matrix indicating the banding pattern between the isoschizomeric endonucleases HpaII and MspI, with differential sensitivity to cytosine methylation. After comparing the restriction fragments, the program determines if each fragment is susceptible to methylation (representative of epigenetic variation) or if there is no evidence of methylation (representative of genetic variation). The package provides, in a user-friendly command line interface, a pipeline of different analyses of the variation (genetic and epigenetic) among user-defined groups of samples, as well as the classification of the methylation occurrences in those groups. Statistical testing provides support to the analyses. A comprehensive report of the analyses and several useful plots could help researchers to assess the epigenetic and genetic variation in their MSAP experiments. msap is downloadable from CRAN (http://cran.r-project.org/) and its own webpage (http://msap.r-forge.R-project.org/). The package is intended to be easy to use even for those people unfamiliar with the R command line environment. Advanced users may take advantage of the available source code to adapt msap to more complex analyses.

  5. Raman spectroscopy coupled with advanced statistics for differentiating menstrual and peripheral blood.

    PubMed

    Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K

    2014-01-01

    Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically (Raman mapping) acquired from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time consuming and expensive.

  6. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  7. Geospatial (s)tools: integration of advanced epidemiological sampling and novel diagnostics.

    PubMed

    Cringoli, Giuseppe; Rinaldi, Laura; Albonico, Marco; Bergquist, Robert; Utzinger, Jürg

    2013-05-01

    Large-scale control and progressive elimination of a wide variety of parasitic diseases is moving to the fore. Indeed, there is good pace and broad political commitment. Yet, there are some worrying signs ahead, particularly the anticipated declines in funding and coverage of key interventions, and the paucity of novel tools and strategies. Further and intensified research and development is thus urgently required. We discuss advances in epidemiological sampling, diagnostic tools and geospatial methodologies. We emphasise the need for integrating sound epidemiological designs (e.g. cluster-randomised sampling) with innovative diagnostic tools and strategies (e.g. Mini-FLOTAC for detection of parasitic elements and pooling of biological samples) and high-resolution geospatial tools. Recognising these challenges, standardisation of quality procedures, and innovating, validating and applying new tools and strategies will foster and sustain long-term control and eventual elimination of human and veterinary public health issues.

  8. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
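    The core Taguchi calculation, signal-to-noise ratios for an orthogonal array followed by per-level main effects, can be sketched as below. The L9(3^4) array layout is standard, but the replicate yields are invented for illustration.

```python
# Minimal sketch of a Taguchi analysis: larger-the-better S/N ratios for an
# L9 orthogonal array and main effects averaged per factor level.
import numpy as np

# L9(3^4) orthogonal array: 9 runs, 4 factors at 3 levels (coded 0, 1, 2).
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
# Two replicate responses per run (e.g., product yield), illustrative values.
y = np.array([[62, 60], [71, 74], [68, 66], [75, 78], [82, 80],
              [77, 74], [70, 72], [85, 88], [80, 79]], dtype=float)

# Larger-the-better S/N ratio: -10 log10(mean(1/y^2)).
sn = -10 * np.log10((1.0 / y ** 2).mean(axis=1))

for factor in range(L9.shape[1]):
    effects = [sn[L9[:, factor] == level].mean() for level in (0, 1, 2)]
    best = int(np.argmax(effects))
    print(f"factor {factor + 1}: mean S/N by level = {np.round(effects, 2)}, best level = {best}")
```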

  9. Temporal aspects of surface water quality variation using robust statistical tools.

    PubMed

    Mustapha, Adamu; Aris, Ahmad Zaharin; Ramli, Mohammad Firuz; Juahir, Hafizan

    2012-01-01

    Robust statistical tools were applied to the water quality datasets with the aim of determining the most significant parameters and their contribution towards temporal water quality variation. Surface water samples were collected from four different sampling points during dry and wet seasons and analyzed for their physicochemical constituents. Discriminant analysis (DA) provided better results with great discriminatory ability, using five parameters (P < 0.05) for the dry season and affording more than 96% correct assignation, and using five and six parameters for forward and backward stepwise modes on the wet season data (P < 0.05), affording 68.20% and 82% correct assignation, respectively. Partial correlation results revealed that there are strong (rp = 0.829) and moderate (rp = 0.614) relationships between five-day biochemical oxygen demand (BOD5) and chemical oxygen demand (COD), total solids (TS) and dissolved solids (DS), controlling for the linear effect of nitrogen in the form of ammonia (NH3) and conductivity, for the dry and wet seasons, respectively. Multiple linear regression identified the contribution of each variable, with significant values r = 0.988, R2 = 0.976 and r = 0.970, R2 = 0.942 (P < 0.05) for the dry and wet seasons, respectively. A repeated-measures t-test confirmed that the surface water quality varies significantly between the seasons (P < 0.05).
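    The partial-correlation step described (correlating two water quality variables while controlling for the linear effect of others) can be computed by correlating the residuals of both variables after regressing out the controls. The data in the sketch below are simulated, not the study's measurements.

```python
# Minimal sketch of partial correlation via residuals on the control variables.
import numpy as np

def partial_corr(x, y, controls):
    Z = np.column_stack([np.ones(len(x))] + list(controls))
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(5)
n = 120
nh3 = rng.normal(1.0, 0.3, n)                        # ammonia (control)
cond = rng.normal(200, 40, n)                        # conductivity (control)
bod5 = 5 + 3 * nh3 + 0.01 * cond + rng.normal(0, 1, n)
cod = 20 + 6 * nh3 + 0.02 * cond + 1.5 * (bod5 - bod5.mean()) + rng.normal(0, 2, n)

print("simple correlation :", round(np.corrcoef(bod5, cod)[0, 1], 3))
print("partial correlation:", round(partial_corr(bod5, cod, [nh3, cond]), 3))
```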

  10. Degrees of separation as a statistical tool for evaluating candidate genes.

    PubMed

    Nelson, Ronald M; Pettersson, Mats E

    2014-12-01

    Selection of candidate genes is an important step in the exploration of complex genetic architecture. The number of gene networks available is increasing and these can provide information to help with candidate gene selection. It is currently common to use the degree of connectedness in gene networks as validation in Genome Wide Association (GWA) and Quantitative Trait Locus (QTL) mapping studies. However, it can cause misleading results if not validated properly. Here we present a method and tool for validating the gene pairs from GWA studies given the context of the network they co-occur in. It ensures that proposed interactions and gene associations are not statistical artefacts inherent to the specific gene network architecture. The CandidateBacon package provides an easy and efficient method to calculate the average degree of separation (DoS) between pairs of genes in currently available gene networks. We show how these empirical estimates of average connectedness are used to validate candidate gene pairs. Validation of interacting genes by comparing their connectedness with the average connectedness in the gene network will provide support for said interactions by utilising the growing amount of gene network information available. Copyright © 2014 Elsevier Ltd. All rights reserved.
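    The degree-of-separation idea (compare how close candidate gene pairs sit in a network against the background distance between random pairs) can be sketched with a toy graph; the network and the "GWA hits" below are hypothetical, and this is not the CandidateBacon implementation.

```python
# Minimal sketch: candidate-pair degree of separation vs. network background.
import random
import networkx as nx

random.seed(0)
G = nx.erdos_renyi_graph(n=200, p=0.03, seed=0)        # stand-in gene network
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

def mean_distance(pairs):
    return sum(nx.shortest_path_length(G, u, v) for u, v in pairs) / len(pairs)

nodes = list(G.nodes)
candidate_pairs = [(nodes[0], nodes[1]), (nodes[2], nodes[3])]   # hypothetical GWA hits
random_pairs = [tuple(random.sample(nodes, 2)) for _ in range(500)]

dos_candidates = mean_distance(candidate_pairs)
dos_background = mean_distance(random_pairs)
print(f"candidate-pair DoS = {dos_candidates:.2f}, network background DoS = {dos_background:.2f}")
```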

  11. Temporal Aspects of Surface Water Quality Variation Using Robust Statistical Tools

    PubMed Central

    Mustapha, Adamu; Aris, Ahmad Zaharin; Ramli, Mohammad Firuz; Juahir, Hafizan

    2012-01-01

    Robust statistical tools were applied to the water quality datasets with the aim of determining the most significant parameters and their contribution towards temporal water quality variation. Surface water samples were collected from four different sampling points during dry and wet seasons and analyzed for their physicochemical constituents. Discriminant analysis (DA) provided better results with great discriminatory ability, using five parameters (P < 0.05) for the dry season and affording more than 96% correct assignation, and using five and six parameters for forward and backward stepwise modes on the wet season data (P < 0.05), affording 68.20% and 82% correct assignation, respectively. Partial correlation results revealed that there are strong (rp = 0.829) and moderate (rp = 0.614) relationships between five-day biochemical oxygen demand (BOD5) and chemical oxygen demand (COD), total solids (TS) and dissolved solids (DS), controlling for the linear effect of nitrogen in the form of ammonia (NH3) and conductivity, for the dry and wet seasons, respectively. Multiple linear regression identified the contribution of each variable, with significant values r = 0.988, R2 = 0.976 and r = 0.970, R2 = 0.942 (P < 0.05) for the dry and wet seasons, respectively. A repeated-measures t-test confirmed that the surface water quality varies significantly between the seasons (P < 0.05). PMID:22919302

  12. Chemical indices and methods of multivariate statistics as a tool for odor classification.

    PubMed

    Mahlke, Ingo T; Thiesen, Peter H; Niemeyer, Bernd

    2007-04-01

    Industrial and agricultural off-gas streams comprise numerous volatile compounds, many of which have substantially different odorous properties. State-of-the-art waste-gas treatment includes the characterization of these molecules and is directed at, if possible, either the avoidance of such odorants during processing or the use of existing standardized air purification techniques like bioscrubbing or afterburning, which, however, often show low efficiency from an ecological and economic standpoint. Selective odor separation from the off-gas streams could ease many of these disadvantages but is not yet widely applicable. Thus, the aim of this paper is to identify, by knowledge-based methods, possible model substances for selective odor separation research from 155 volatile molecules mainly originating from livestock facilities, fat refineries, and cocoa and coffee production. All compounds are examined with regard to their structure and information content using topological and information-theoretical indices. The resulting data are arranged in an observation matrix, and similarities between the substances are computed. Principal component analysis and k-means cluster analysis are conducted, showing that clustering of the index data can capture odor information that correlates well with molecular composition and molecular shape. Quantitative molecular description, together with such statistical methods, therefore provides a good tool for classifying malodorant structural properties without the need for thermodynamic data. The similar shapes of the odorous compounds within each cluster suggest a fair choice of possible model molecules.
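    The workflow described (descriptor matrix, principal components, k-means grouping) can be sketched in a few lines. The descriptor matrix below is random stand-in data, not the 155-compound set, and the number of clusters is an arbitrary choice for illustration.

```python
# Minimal sketch: PCA scores of a molecular-descriptor matrix, then k-means.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
# 155 compounds x 12 topological / information-theoretical indices (synthetic).
descriptors = rng.normal(size=(155, 12))
descriptors[:60] += 1.5        # pretend one structural family differs systematically

X = StandardScaler().fit_transform(descriptors)
scores = PCA(n_components=3).fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} compounds")
```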

  13. Monte Carlo Simulations in Statistical Physics -- From Basic Principles to Advanced Applications

    NASA Astrophysics Data System (ADS)

    Janke, Wolfhard

    2013-08-01

    This chapter starts with an overview of Monte Carlo computer simulation methodologies which are illustrated for the simple case of the Ising model. After reviewing importance sampling schemes based on Markov chains and standard local update rules (Metropolis, Glauber, heat-bath), nonlocal cluster-update algorithms are explained which drastically reduce the problem of critical slowing down at second-order phase transitions and thus improve the performance of simulations. How this can be quantified is explained in the section on statistical error analyses of simulation data including the effect of temporal correlations and autocorrelation times. Histogram reweighting methods are explained in the next section. Eventually, more advanced generalized ensemble methods (simulated and parallel tempering, multicanonical ensemble, Wang-Landau method) are discussed which are particularly important for simulations of first-order phase transitions and, in general, of systems with rare-event states. The setup of scaling and finite-size scaling analyses is the content of the following section. The chapter concludes with two advanced applications to complex physical systems. The first example deals with a quenched, diluted ferromagnet, and in the second application we consider the adsorption properties of macromolecules such as polymers and proteins to solid substrates. Such systems often require especially tailored algorithms for their efficient and successful simulation.
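    The simplest building block discussed, the local Metropolis update for the 2D Ising model, is sketched below; lattice size, temperature, and sweep count are illustrative choices, and none of the cluster or generalized-ensemble refinements from the chapter are included.

```python
# Minimal sketch of single-spin-flip Metropolis dynamics for the 2D Ising model
# with periodic boundary conditions.
import numpy as np

rng = np.random.default_rng(7)
L, T, n_sweeps = 24, 2.0, 300                  # lattice size, temperature, sweeps
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins, T):
    for _ in range(spins.size):
        i, j = rng.integers(L), rng.integers(L)
        # Sum of the four nearest neighbours with periodic boundaries.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb              # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

for _ in range(n_sweeps):
    sweep(spins, T)

m = abs(spins.mean())
print(f"|magnetization| per spin after {n_sweeps} sweeps at T = {T}: {m:.3f}")
```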

  14. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.
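    The least-cost blend objective stated here can be illustrated as a small linear program: choose blend fractions that minimize cost subject to simple chemistry bounds. The prices, Fe contents, silica contents, and constraint values below are invented for illustration and have no relation to the plant data analyzed in the paper.

```python
# Minimal sketch of a least-cost ore blend as a linear program.
import numpy as np
from scipy.optimize import linprog

cost = np.array([80.0, 65.0, 40.0, 10.0])      # $/t: rich ore, lean ore, by-product, waste
fe   = np.array([0.66, 0.58, 0.50, 0.35])      # Fe mass fraction of each material
sio2 = np.array([0.02, 0.06, 0.08, 0.12])      # SiO2 mass fraction

# Variables: blend fractions x_i, which must sum to 1.
A_eq, b_eq = [np.ones(4)], [1.0]
# Constraints: blend Fe >= 0.58, blend SiO2 <= 0.06, at least 10% of the by-product.
A_ub = [-fe, sio2, [0, 0, -1, 0]]
b_ub = [-0.58, 0.06, -0.10]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 4, method="highs")
print("optimal blend fractions:", np.round(res.x, 3))
print("blend cost per tonne  :", round(res.fun, 2))
```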

  15. Customized First and Second Order Statistics Based Operators to Support Advanced Texture Analysis of MRI Images

    PubMed Central

    Cinque, Luigi

    2013-01-01

    Texture analysis is the process of highlighting key characteristics thus providing an exhaustive and unambiguous mathematical description of any object represented in a digital image. Each characteristic is connected to a specific property of the object. In some cases the mentioned properties represent aspects visually perceptible which can be detected by developing operators based on Computer Vision techniques. In other cases these properties are not visually perceptible and their computation is obtained by developing operators based on Image Understanding approaches. Pixels composing high quality medical images can be considered the result of a stochastic process since they represent morphological or physiological processes. Empirical observations have shown that these images have visually perceptible and hidden significant aspects. For these reasons, the operators can be developed by means of a statistical approach. In this paper we present a set of customized first and second order statistics based operators to perform advanced texture analysis of Magnetic Resonance Imaging (MRI) images. In particular, we specify the main rules defining the role of an operator and its relationship with other operators. Extensive experiments carried out on a wide dataset of MRI images of different body regions demonstrating usefulness and accuracy of the proposed approach are also reported. PMID:23840276
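    First-order (histogram) and second-order (grey-level co-occurrence) texture statistics of the kind these operators build on can be sketched directly in NumPy. The "MRI patch" below is a synthetic 8-level image, not clinical data, and only a single co-occurrence offset is shown.

```python
# Minimal sketch of first- and second-order texture statistics on a grey-level patch.
import numpy as np

rng = np.random.default_rng(8)
patch = rng.integers(0, 8, size=(64, 64))      # quantised grey levels 0..7

# First-order statistics from the grey-level histogram.
hist = np.bincount(patch.ravel(), minlength=8) / patch.size
levels = np.arange(8)
mean = (levels * hist).sum()
variance = ((levels - mean) ** 2 * hist).sum()
entropy = -(hist[hist > 0] * np.log2(hist[hist > 0])).sum()

# Second-order statistics from a co-occurrence matrix (offset: one pixel right).
glcm = np.zeros((8, 8))
for a, b in zip(patch[:, :-1].ravel(), patch[:, 1:].ravel()):
    glcm[a, b] += 1
glcm /= glcm.sum()
ii, jj = np.meshgrid(levels, levels, indexing="ij")
contrast = ((ii - jj) ** 2 * glcm).sum()
homogeneity = (glcm / (1.0 + np.abs(ii - jj))).sum()

print(f"first order : mean={mean:.2f}, var={variance:.2f}, entropy={entropy:.2f}")
print(f"second order: contrast={contrast:.2f}, homogeneity={homogeneity:.2f}")
```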

  16. The Advanced REACH Tool (ART): incorporation of an exposure measurement database.

    PubMed

    Schinkel, Jody; Ritchie, Peter; Goede, Henk; Fransman, Wouter; van Tongeren, Martie; Cherrie, John W; Tielemans, Erik; Kromhout, Hans; Warren, Nicholas

    2013-07-01

    This article describes the structure, functionalities, and content of the Advanced REACH Tool (ART) exposure database (version 1.5). The incorporation of the exposure database into ART allows users who do not have their own measurement data for their exposure scenario, to update the exposure estimates produced by the mechanistic model using analogous measurement series selected from the ART exposure measurement database. Depending on user input for substance category and activity (sub)classes, the system selects exposure measurement series from the exposure database. The comprehensive scenario descriptions and summary statistics assist the user in deciding if the measurement series are indeed fully analogous. After selecting one or more analogous data sets, the data are used by the Bayesian module of the ART system to update the mechanistically modeled exposure estimates. The 1944 exposure measurements currently stored in the ART exposure measurement database cover 9 exposure situations for handling solid objects (n = 65), 42 situations for handling powders, granules, or pelletized material (n = 488), 5 situations for handling low-volatility liquids (n = 88), 35 situations for handling volatile liquids (n = 870), and 26 situations for handling liquids in which powders are dissolved or dispersed (resulting in exposure to mist) (n = 433). These 117 measurement series form a good basis for supporting user exposure estimates. However, by increasing the diversity of exposure situations and the number of measurement series in the database, the usefulness of the ART system will be further improved. Suggestions to stimulate the process of sharing exposure measurement data both to increase the available data in the ART and for other purposes are made.
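    The Bayesian-updating idea described (a mechanistic prior estimate refined by an analogous measurement series) can be illustrated with a much simpler model than ART's: treat the model estimate as a lognormal prior and update its log geometric mean with measurements of known log-scale variance. All concentrations and variances below are invented, not ART database values, and the real system updates the full mechanistic estimate rather than this conjugate toy.

```python
# Minimal sketch: conjugate normal-normal update of a lognormal exposure estimate.
import numpy as np

prior_gm, prior_gsd = 1.2, 2.5                 # prior geometric mean (mg/m3) and GSD
mu0, tau0 = np.log(prior_gm), np.log(prior_gsd)

measurements = np.array([0.6, 0.9, 1.5, 0.8, 1.1])   # analogous series (mg/m3)
log_x = np.log(measurements)
sigma = 0.8                                    # assumed log-scale measurement SD

# Conjugate update of the log geometric mean.
n = len(log_x)
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mu = post_var * (mu0 / tau0**2 + log_x.sum() / sigma**2)

print(f"prior GM     = {np.exp(mu0):.2f} mg/m3")
print(f"posterior GM = {np.exp(post_mu):.2f} mg/m3 "
      f"(95% CI {np.exp(post_mu - 1.96*np.sqrt(post_var)):.2f}"
      f"-{np.exp(post_mu + 1.96*np.sqrt(post_var)):.2f})")
```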

  17. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extension of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius

  18. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers' and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces. PMID:21487489

  19. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers' and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.

  20. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    NASA Astrophysics Data System (ADS)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has been applied to a variety of problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, SEA is to be extended to chemical compounds and tested for its deficits and potential in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds that may occur during the water treatment process are taken into account and their impact on the environment and human health is quantified. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be a determination of the efficiency of WWTPs. By improving and optimizing the efficiency
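    A deliberately simplified illustration of the underlying entropy idea follows: how evenly is a substance (here nitrogen) partitioned among a process's output flows? A low relative entropy means the process concentrates the substance. This is only a stand-in for the concept, not the full SEA/eSEA formulation, and the flows below are invented.

```python
# Simplified illustration: Shannon entropy of how a nitrogen load is partitioned
# among a hypothetical WWTP's output flows.
import numpy as np

flows = {"effluent": 120.0, "sludge": 540.0, "off-gas (N2 etc.)": 340.0}  # kg N/day

x = np.array(list(flows.values()))
p = x / x.sum()
H = -(p * np.log2(p)).sum()          # Shannon entropy of the partitioning
H_rel = H / np.log2(len(p))          # 0 = fully concentrated, 1 = fully dispersed

for name, share in zip(flows, p):
    print(f"{name:18s}: {share:5.1%} of the nitrogen load")
print(f"relative entropy of the partition = {H_rel:.2f}")
```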

  1. Statistical tools for the temporal analysis and classification of lung lesions.

    PubMed

    Barros Netto, Stelmo Magalhães; Corrêa Silva, Aristófanes; Lopes, Hélio; Cardoso de Paiva, Anselmo; Acatauassú Nunes, Rodolfo; Gattass, Marcelo

    2017-04-01

    Lung cancer remains one of the most common cancers globally. Temporal evaluation is an important tool for analyzing the malignant behavior of lesions during treatment, or of indeterminate lesions that may be benign. This work proposes a methodology for the analysis, quantification, and visualization of small (local) and large (global) changes in lung lesions. In addition, we extract textural features for the classification of lesions as benign or malignant. We employ the statistical concept of uncertainty to associate each voxel of a lesion to a probability that changes occur in the lesion over time. We employ the Jensen divergence and hypothesis test locally to verify voxel-to-voxel changes, and globally to capture changes in lesion volumes. For the local hypothesis test, we determine that the change in density varies by between 3.84 and 40.01% of the lesion volume in a public database of malignant lesions under treatment, and by between 5.76 and 35.43% in a private database of benign lung nodules. From the texture analysis of regions in which the density changes occur, we are able to discriminate lung lesions with an accuracy of 98.41%, which shows that these changes can indicate the true nature of the lesion. In addition to the visual aspects of the density changes occurring in the lesions over time, we quantify these changes and analyze the entire set using volumetry. In the case of malignant lesions, large b-divergence values are associated with major changes in lesion volume. In addition, this occurs when the change in volume is small but is associated with significant changes in density, as indicated by the histogram divergence. For benign lesions, the methodology shows that even in cases where the change in volume is small, a change of density occurs. This proves that even in lesions that appear stable, a change in density occurs. Copyright © 2017 Elsevier B.V. All rights reserved.
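    The divergence-based comparison of a lesion over time can be sketched by computing the Jensen-Shannon divergence between voxel-density histograms from two time points. The Hounsfield-unit samples below are synthetic, not patient data, and the global volume comparison and texture classification steps are not shown.

```python
# Minimal sketch: Jensen-Shannon divergence between two lesion density histograms.
import numpy as np

def jensen_shannon(p, q):
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(9)
baseline  = rng.normal(-20, 25, size=5000)     # lesion voxel densities, time 1
follow_up = rng.normal(  5, 30, size=5000)     # same lesion after treatment

bins = np.linspace(-150, 150, 61)
h1, _ = np.histogram(baseline, bins=bins)
h2, _ = np.histogram(follow_up, bins=bins)

jsd = jensen_shannon(h1.astype(float), h2.astype(float))
print(f"Jensen-Shannon divergence between time points: {jsd:.3f}")
```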

  2. Measuring political commitment and opportunities to advance food and nutrition security: piloting a rapid assessment tool.

    PubMed

    Fox, Ashley M; Balarajan, Yarlini; Cheng, Chloe; Reich, Michael R

    2015-06-01

    Lack of political commitment has been identified as a primary reason for the low priority that food and nutrition interventions receive from national governments relative to the high disease burden caused by malnutrition. Researchers have identified a number of factors that contribute to food and nutrition's 'low-priority cycle' on national policy agendas, but few tools exist to rapidly measure political commitment and identify opportunities to advance food and nutrition on the policy agenda. This article presents a theory-based rapid assessment approach to gauging countries' level of political commitment to food and nutrition security and identifying opportunities to advance food and nutrition on the policy agenda. The rapid assessment tool was piloted among food and nutrition policymakers and planners in 10 low- and middle-income countries in April to June 2013. Food and nutrition commitment and policy opportunity scores were calculated for each country and strategies to advance food and nutrition on policy agendas were designed for each country. The article finds that, in a majority of countries, political leaders had verbally and symbolically committed to addressing food and nutrition, but adequate financial resources were not allocated to implement specific programmes. In addition, whereas the low cohesion of the policy community has been viewed a major underlying cause of the low-priority status of food and nutrition, the analysis finds that policy community cohesion and having a well thought-out policy alternative were present in most countries. This tool may be useful to policymakers and planners providing information that can be used to benchmark and/or evaluate advocacy efforts to advance reforms in the food and nutrition sector; furthermore, the results can help identify specific strategies that can be employed to move the food and nutrition agenda forward. This tool complements others that have been recently developed to measure national commitment to

  3. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future.

  4. Advanced Engineering Tools for Structural Analysis of Advanced Power Plants Application to the GE ESBWR Design

    SciTech Connect

    Gamble, R.E.; Fanning, A.; Diaz Llanos, M.; Moreno, A.; Carrasco, A.

    2002-07-01

    Experience in the design of nuclear reactors for power generation shows that the plant structures and buildings involved are one of the major contributors to plant capital investment. Consequently, the design of these elements must be optimised if cost reductions in future reactors are to be achieved. The benefits of using the 'Best Estimate Approach' are well known in the area of core and systems design. This consists of developing accurate models of a plant's phenomenology and behaviour, minimising the margins. Different safety margins have been applied in the past when performing structural analyses. Three of these margins can be identified: - increasing the value of the load by a factor that depends on the load frequency; - decreasing the structure's resistance, and - safety margins introduced through two-step analysis. The first two types of margins are established in the applicable codes in order to provide design safety margins. The third one derives from limitations in tools which, in the past, did not allow obtaining an accurate model in which both the dynamic and static loads could be evaluated simultaneously. Nowadays, improvements in hardware and software have eliminated the need for two-step calculations in structural analysis (dynamic plus static), allowing the creation of one-through finite element models in which all loads, both dynamic and static, are combined without the determination of the equivalent static loads from the dynamic loads. This paper summarizes how these models and methods have been applied to optimize the Reactor Building structural design of the General Electric (GE) ESBWR Passive Plant. The work has focused on three areas: - the design of the Gravity Driven Cooling System (GDCS) Pools as the pressure boundary between the Drywell and the Wet-well; - the evaluation of the thickness of the Reactor Building foundation slab, and - the global structural evaluation of the Reactor Building.

  5. Statistically Optimal Approximations of Astronomical Signals: Implications to Classification and Advanced Study of Variable Stars

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.

    2016-06-01

    We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observation, which is a common case for large photometric surveys. These methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as for "running" or "sub-interval" local approximations, were previously self-reviewed in (2003ASPC..292..391A). For an approximation of the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree and initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods self-reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial (TP) fit or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows determining the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars have been studied by our group using these methods in the framework of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).
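
    As a rough illustration of one ingredient of this approach, the sketch below fits a trigonometric polynomial of a statistically chosen degree to an irregularly sampled light curve by ordinary least squares, selecting the degree with a simple BIC criterion. It is a minimal stand-in, not the authors' published code: the period value, the BIC-based degree selection, and the synthetic data are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code): least-squares trigonometric polynomial
# fit of a phase light curve, with the degree chosen by a simple BIC criterion.
# Assumption: the period P is already known; the data are magnitudes m(t_i).
import numpy as np

def tp_design_matrix(t, period, degree):
    """Columns: 1, cos(2*pi*k*t/P), sin(2*pi*k*t/P) for k = 1..degree."""
    phase = 2.0 * np.pi * t / period
    cols = [np.ones_like(t)]
    for k in range(1, degree + 1):
        cols.append(np.cos(k * phase))
        cols.append(np.sin(k * phase))
    return np.column_stack(cols)

def fit_tp(t, m, period, max_degree=10):
    """Return (best_degree, coefficients) chosen by minimum BIC."""
    best = None
    n = len(t)
    for s in range(1, max_degree + 1):
        X = tp_design_matrix(t, period, s)
        coef, rss, *_ = np.linalg.lstsq(X, m, rcond=None)
        rss = float(rss[0]) if rss.size else float(np.sum((m - X @ coef) ** 2))
        k = 2 * s + 1                      # number of fitted parameters
        bic = n * np.log(rss / n) + k * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, s, coef)
    return best[1], best[2]

# Irregularly spaced synthetic example
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 300))
m = 12.0 + 0.4 * np.sin(2 * np.pi * t / 3.7) + 0.05 * rng.normal(size=t.size)
print(fit_tp(t, m, period=3.7)[0])
```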

  6. High Frequency Circuit Simulator: An Advanced Electromagnetic Simulation Tool for Microwave Sources

    NASA Astrophysics Data System (ADS)

    Zhu, Xiao Fang; Yang, Zhong Hai; Li, Bin; Li, Jian Qing; Xu, Li

    2009-08-01

    High Frequency Circuit Simulator (HFCS) is developed as an advanced electromagnetic simulation tool for microwave sources, based on the Finite Integration Technique (FIT). In this paper, the details of the design and realization of HFCS are provided and, for validation, an actual Helical Slow-Wave Structure (HSWS) is fully analyzed. The convergence process is studied, and the cold-test characteristics (including dispersion, coupling impedance and attenuation constant) are calculated and compared with those from MAFIA. The consistency of the results from these two simulation tools demonstrates the reliability and validity of HFCS.

  7. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and builds on the need for more investigation into online teaching and learning in specific content (Ferdig et al., 2009; DiPietro,…

  9. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings from exploring how the underlying models of the Advanced Risk Reduction Tool (ARRT) relate to the way it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  10. Randomized, Controlled Trial of an Advance Care Planning Video Decision Support Tool for Patients With Advanced Heart Failure.

    PubMed

    El-Jawahri, Areej; Paasche-Orlow, Michael K; Matlock, Dan; Stevenson, Lynne Warner; Lewis, Eldrin F; Stewart, Garrick; Semigran, Marc; Chang, Yuchiao; Parks, Kimberly; Walker-Corkery, Elizabeth S; Temel, Jennifer S; Bohossian, Hacho; Ooi, Henry; Mann, Eileen; Volandes, Angelo E

    2016-07-05

    Conversations about goals of care and cardiopulmonary resuscitation (CPR)/intubation for patients with advanced heart failure can be difficult. This study examined the impact of a video decision support tool and patient checklist on advance care planning for patients with heart failure. This was a multisite, randomized, controlled trial of a video-assisted intervention and advance care planning checklist versus a verbal description in 246 patients ≥64 years of age with heart failure and an estimated likelihood of death of >50% within 2 years. Intervention participants received a verbal description for goals of care (life-prolonging care, limited care, and comfort care) and CPR/intubation plus a 6-minute video depicting the 3 levels of care, CPR/intubation, and an advance care planning checklist. Control subjects received only the verbal description. The primary analysis compared the proportion of patients preferring comfort care between study arms immediately after the intervention. Secondary outcomes were CPR/intubation preferences and knowledge (6-item test; range, 0-6) after intervention. In the intervention group, 27 (22%) chose life-prolonging care, 31 (25%) chose limited care, 63 (51%) selected comfort care, and 2 (2%) were uncertain. In the control group, 50 (41%) chose life-prolonging care, 27 (22%) selected limited care, 37 (30%) chose comfort care, and 8 (7%) were uncertain (P<0.001). Intervention participants (compared with control subjects) were more likely to forgo CPR (68% versus 35%; P<0.001) and intubation (77% versus 48%; P<0.001) and had higher mean knowledge scores (4.1 versus 3.0; P<0.001). Patients with heart failure who viewed a video were more informed, more likely to select a focus on comfort, and less likely to desire CPR/intubation compared with patients receiving verbal information only. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01589120. © 2016 American Heart Association, Inc.

  11. Mnemonic Aids during Tests: Worthless Frivolity or Effective Tool in Statistics Education?

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.; Gorman, Jennifer

    2012-01-01

    Researchers have explored many pedagogical approaches in an effort to assist students in finding understanding and comfort in required statistics courses. This study investigates the impact of mnemonic aids used during tests on students' statistics course performance in particular. In addition, the present study explores several hypotheses that…

  12. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth-grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers and seven of the students' parents were interviewed. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint© presentations to younger high school students on why they should take advanced science courses highlighted these

  13. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  14. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  16. Statistical tools in published articles of a public health journal in 2013 and 2014: bibliometric cross-sectional study.

    PubMed

    Arcila Quiceno, Víctor; García Restrepo, Elizabeth; Gómez Rúa, Natalia; Montenegro Martínez, Gino; Silva Ayçaguer, Luis Carlos

    2015-08-31

    Research projects use statistical resources to express, in numerical or graphical terms, different magnitudes such as frequencies, differences, or associations. The purpose of this paper is to describe the use of statistical tools, with special emphasis on conventional statistical tests and confidence intervals, to communicate results in a renowned peer-reviewed public health journal in Colombia. We included the 84 articles published in the journal between 2013 and 2014. The most frequently used resource is frequency analysis (89.3%), followed by p values (65.5%) and confidence intervals (53.6%); 48.9% of the papers used confidence intervals together with p values; 29.8% used neither of them; 16.7% of the articles used only p values and 4.8% only confidence intervals. Descriptive statistics are widely used in the presentation of research results; however, the criticisms and caveats that advise against relying exclusively on statistical significance tests are not heeded in the analysis and presentation of research results.
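
    For readers unfamiliar with the reporting practice at issue, the snippet below is a minimal, hypothetical illustration of presenting an effect estimate with its confidence interval alongside a p value for a two-proportion comparison; the counts are invented and are not data from the reviewed journal.

```python
# Minimal example of the reporting practice the paper examines: an effect
# estimate with its confidence interval alongside the p value, here for a
# hypothetical two-proportion comparison (not data from the reviewed journal).
import numpy as np
from scipy import stats

x1, n1 = 45, 120     # events / sample size, group 1 (hypothetical)
x2, n2 = 28, 115     # events / sample size, group 2 (hypothetical)

p1, p2 = x1 / n1, x2 / n2
diff = p1 - p2
se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
ci = (diff - 1.96 * se, diff + 1.96 * se)

# p value from a chi-square test of the 2x2 table.
_, p_value, _, _ = stats.chi2_contingency([[x1, n1 - x1], [x2, n2 - x2]])
print(f"risk difference = {diff:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f}), p = {p_value:.3f}")
```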

  17. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries or even rotationally-asymmetric shapes which result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools which can effectively evaluate the optical wave propagation through 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack advanced multi-objective global optimization capability which allows the user to explicitly view the trade-offs between all design objectives such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools and examples which minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.

  18. Development of the Music Therapy Assessment Tool for Advanced Huntington's Disease: A Pilot Validation Study.

    PubMed

    O'Kelly, Julian; Bodak, Rebeka

    2016-01-01

    Case studies of people with Huntington's disease (HD) report that music therapy provides a range of benefits that may improve quality of life; however, no robust music therapy assessment tools exist for this population. The objective of this study was to develop and conduct preliminary psychometric testing of a music therapy assessment tool for patients with advanced HD. First, we established content and face validity of the Music Therapy Assessment Tool for Advanced HD (MATA-HD) through focus groups and field testing. Second, we examined psychometric properties of the resulting MATA-HD in terms of its construct validity, internal consistency, and inter-rater and intra-rater reliability over 10 group music therapy sessions with 19 patients. The resulting MATA-HD included a total of 15 items across six subscales (Arousal/Attention, Physical Presentation, Communication, Musical, Cognition, and Psychological/Behavioral). We found good construct validity (r ≥ 0.7) for the Mood, Communication Level, Communication Effectiveness, Choice, Social Behavior, Arousal, and Attention items. A Cronbach's α of 0.825 indicated good internal consistency across 11 items with a common focus on engagement in therapy. The inter-rater reliability (IRR) intra-class correlation coefficient (ICC) scores averaged 0.65, and a mean intra-rater ICC reliability of 0.68 was obtained. Further training and retesting yielded a mean IRR ICC of 0.7. Preliminary data indicate that the MATA-HD is a promising tool for measuring patient responses to music therapy interventions across psychological, physical, social, and communication domains of functioning in patients with advanced HD. © the American Music Therapy Association 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. CONFERENCE NOTE: International Workshop on Advanced Mathematical Tools in Metrology, Villa Gualino, Torino, Italy, 20-22 October 1993

    NASA Astrophysics Data System (ADS)

    1993-01-01

    Preliminary Programme. The three-day programme features approximately twenty-five invited contributions. Participants may present a poster on the topic "Applications for Industrial Measurements", concerning applied mathematics, software development and computer-based measurements.
    20 October: two plenary talks on mathematical methods and metrological applications; "Numerical Methods and Modelling" (partial differential equations and integral equations; methods of identification and validation; algorithms for approximation; geometrical shape determination of industrial solids); Round Table.
    21 October: "Data Analysis" (spectral analysis and wavelets; calibration of precision instrumentation; comparison measurement of standards; statistical methods in metrology; robust estimation and outliers; applications of the bootstrap method); Round Table.
    22 October (in cooperation with SIMAI and ASP): "Applications for Industrial Measurements" (data acquisition; measurement software, standard computational modules and their validation); Round Table; industrial presentations; discussion of poster presentations; conclusions.
    Lecturers: mathematicians from the international metrological community; mathematicians from Italian universities (Politecnico of Torino, Milano, Università di Genova, Milano, Padova, Roma, Trento); scientists and mathematicians from national standards institutes and the Italian National Research Council.
    The workshop will be of interest to people in universities, research centres and industry who are involved in measurement and need advanced mathematical tools to solve their problems, and to those who work in the development of these mathematical tools. Metrology is concerned with measurement at the highest level of precision. Advances in metrology depend on many factors: improvements in scientific and technical knowledge, instrumentation quality, better use of advanced mathematical tools and the development of new tools. In some countries, metrological institutions have a tradition of

  20. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  1. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  2. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education.
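
    The following is a small, self-contained sketch of the kind of biomedical Bayes' theorem application the course uses; the sensitivity, specificity, and prevalence values are hypothetical teaching numbers, not material from the course itself.

```python
# Illustrative sketch only: applying Bayes' theorem to a diagnostic test,
# the kind of biomedical example the course describes. The sensitivity,
# specificity, and prevalence values below are hypothetical.
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test) via Bayes' theorem."""
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1.0 - specificity
    p_pos = p_pos_given_disease * prevalence + p_pos_given_healthy * (1.0 - prevalence)
    return p_pos_given_disease * prevalence / p_pos

# Even a fairly accurate test yields a modest PPV when the disease is rare.
print(round(positive_predictive_value(0.90, 0.95, 0.01), 3))  # ~0.154
```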

  3. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236

  4. Choosing a new CD4 technology: Can statistical method comparison tools influence the decision?

    PubMed

    Scott, Lesley E; Kestens, Luc; Pattanapanyasat, Kovit; Sukapirom, Kasma; Stevens, Wendy S

    2017-03-10

    Method comparison tools are used to determine the accuracy, precision, agreement, and clinical relevance of a new or improved technology versus a reference technology. Guidelines for the most appropriate method comparison tools as well as their acceptable limits are lacking and not standardized for CD4 counting technologies. Different method comparison tools were applied to a previously published CD4 dataset (n = 150 data pairs) evaluating five different CD4 counting technologies (TruCOUNT, Dual Platform, FACSCount, Easy CD4, CyFlow) on a single specimen. Bland-Altman, percentage similarity, percent difference, concordance correlation, sensitivity, specificity and misclassification method comparison tools were applied, as well as visualization of agreement with Passing-Bablok and Bland-Altman scatter plots. The FACSCount (median CD4 = 245 cells/µl) was considered the reference for method comparison. An algorithm was developed using best practices of the most applicable method comparison tools, and, together with a modified heat map, was found useful for method comparison of CD4 qualitative and quantitative results. The algorithm applied the concordance correlation for overall accuracy and precision, then standard deviation of the absolute bias and percentage similarity coefficient of variation to identify agreement, and lastly sensitivity and misclassification rates for clinical relevance. Combining method comparison tools is more useful when evaluating CD4 technologies against a reference CD4 method. This algorithm should be further validated using CD4 external quality assessment data and studies with larger sample sizes. © 2017 Clinical Cytometry Society. © 2017 International Clinical Cytometry Society.
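
    To make two of the named tools concrete, the sketch below computes a Bland-Altman bias with 95% limits of agreement and Lin's concordance correlation coefficient for a pair of hypothetical CD4 series; it is an illustration of the general techniques, not the authors' algorithm or dataset.

```python
# Sketch of two of the method-comparison tools named above (not the authors'
# algorithm): Bland-Altman bias with 95% limits of agreement, and Lin's
# concordance correlation coefficient. `new` and `ref` are hypothetical
# paired CD4 counts (cells/uL) from a new and a reference technology.
import numpy as np

def bland_altman(new, ref):
    diff = np.asarray(new, float) - np.asarray(ref, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def concordance_correlation(new, ref):
    x, y = np.asarray(new, float), np.asarray(ref, float)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    return 2 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

new = [250, 310, 180, 420, 530, 275, 340]
ref = [245, 300, 190, 410, 515, 280, 330]
bias, loa = bland_altman(new, ref)
print(f"bias={bias:.1f}, LoA=({loa[0]:.1f}, {loa[1]:.1f}), "
      f"CCC={concordance_correlation(new, ref):.3f}")
```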

  5. Statistical tools for analysing the data obtained from repeated dose toxicity studies with rodents: a comparison of the statistical tools used in Japan with that of used in other countries.

    PubMed

    Kobayashi, Katsumi; Pillai, K Sadasivan; Guhatakurta, Soma; Cherian, K M; Ohnishi, Mariko

    2011-01-01

    In the present study, an attempt was made to compare the statistical tools used for analysing the data of repeated dose toxicity studies with rodents conducted in 45 countries with those used in Japan. The study revealed that there was no congruence among the countries in the use of statistical tools for analysing the data obtained from the above studies. For example, to analyse the data obtained from repeated dose toxicity studies with rodents, Scheffé's multiple range and Dunnett-type (joint-type Dunnett) tests are commonly used in Japan, but in other countries the use of these statistical tools is not so common. However, the statistical techniques used for testing the above data for homogeneity of variance and for inter-group comparisons do not differ much between Japan and other countries. In Japan, the data are generally not tested for normality, and the same is true for most of the countries investigated. In the present investigation, out of 127 studies examined, the data of only 6 studies were analysed for both homogeneity of variance and normal distribution. For examining homogeneity of variance, we propose Levene's test, since the commonly used Bartlett's test may indicate heterogeneity of variance across all the groups even if only one group shows slight heterogeneity of variance. We suggest that the data be examined for both homogeneity of variance and normal distribution. For the groups that do not show heterogeneity of variance, we recommend Dunnett's test to identify significant differences among the groups, and for those that show heterogeneity of variance, we recommend Steel's test.
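
    A minimal sketch of the recommended workflow, using SciPy and invented body-weight data, might look as follows; note that scipy.stats.dunnett is only available in SciPy 1.11 and later, and Steel's test is not shown because it has no standard SciPy implementation.

```python
# Hedged sketch of the tests discussed above using SciPy; the body-weight
# values are hypothetical. scipy.stats.dunnett requires SciPy >= 1.11.
from scipy import stats

control = [31.2, 29.8, 30.5, 32.1, 30.9]
low     = [30.1, 29.5, 31.0, 30.2, 29.9]
high    = [27.4, 28.1, 27.9, 26.8, 28.5]

# Homogeneity of variance: Levene (recommended here) vs. Bartlett.
print("Levene  :", stats.levene(control, low, high))
print("Bartlett:", stats.bartlett(control, low, high))

# Dunnett's test: each treated group against the control group.
print("Dunnett :", stats.dunnett(low, high, control=control))
```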

  6. Statistical description of the macrostructure of diamond-containing powder tool materials

    NASA Astrophysics Data System (ADS)

    Vinokurov, G. G.; Sharin, P. P.; Popov, O. N.

    2015-12-01

    The macrostructure of a diamond-containing tool material has been investigated. The potential of applying cluster theory to the processing of a digital metallographic image of a diamond-containing powder material is substantiated. It is proposed to consider agglomerates of diamond grains in order to estimate the heterogeneity of the two-phase macrostructure.

  7. WORD STATISTICS IN THE GENERATION OF SEMANTIC TOOLS FOR INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    STONE, DON C.

    One of the problems in information storage and retrieval systems of technical documents is the interpretation of words used to index documents. Semantic tools, defined as channels for the communication of word meanings between technical experts, document indexers, and searchers, provide one method of dealing with the problem of multiple…

  8. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  9. A Manually Operated, Advance Off-Stylet Insertion Tool for Minimally Invasive Cochlear Implantation Surgery

    PubMed Central

    Kratchman, Louis B.; Schurzig, Daniel; McRackan, Theodore R.; Balachandran, Ramya; Noble, Jack H.; Webster, Robert J.; Labadie, Robert F.

    2014-01-01

    The current technique for cochlear implantation (CI) surgery requires a mastoidectomy to gain access to the cochlea for electrode array insertion. It has been shown that microstereotactic frames can enable an image-guided, minimally invasive approach to CI surgery called percutaneous cochlear implantation (PCI) that uses a single drill hole for electrode array insertion, avoiding a more invasive mastoidectomy. Current clinical methods for electrode array insertion are not compatible with PCI surgery because they require a mastoidectomy to access the cochlea; thus, we have developed a manually operated electrode array insertion tool that can be deployed through a PCI drill hole. The tool can be adjusted using a preoperative CT scan for accurate execution of the advance off-stylet (AOS) insertion technique and requires less skill to operate than is currently required to implant electrode arrays. We performed three cadaver insertion experiments using the AOS technique and determined that all insertions were successful using CT and microdissection. PMID:22851233

  10. MATISSE: Multi-purpose Advanced Tool for Instruments for the Solar System Exploration

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Antonelli, L. A.

    2013-09-01

    In planetary sciences, designing, assembling, and launching onboard instruments are only preliminary steps toward the final aim of converting data into scientific knowledge, as the real challenge lies in data analysis and interpretation. Up to now, data have generally been stored in "old style" archives, i.e., common FTP servers where the user manually searches for data by browsing directories organized chronologically. However, as the datasets to be stored and searched become particularly large, this latter task absorbs a great part of the time, taking time away from the real scientific work. In order to reduce the time spent searching for and analyzing data, MATISSE (Multi-purpose Advanced Tool for Instruments for the Solar System Exploration), a new set of software tools developed together with the scientific teams of the instruments involved, is under development at ASDC (ASI Science Data Center), whose experience in space mission data management is well known (e.g., [1], [2], [3], [4]).

  11. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model-building and model-sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high-level, domain-specific modeling language. As a testbed for this research, we propose the development of a software prototype in the domain of planetary atmospheric modeling.

  12. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  13. A clinical assessment tool for advanced theory of mind performance in 5 to 12 year olds.

    PubMed

    O'Hare, Anne E; Bremner, Lynne; Nash, Marysia; Happé, Francesca; Pettigrew, Luisa M

    2009-06-01

    One hundred forty typically developing 5- to 12-year-old children were assessed with a test of advanced theory of mind employing Happé's strange stories. There was no significant difference in performance between boys and girls. The stories discriminated performance across the different ages with the lowest performance being in the younger children who nevertheless managed to achieve a third of their potential total. However, some of the individual mentalising concepts such as persuasion were too difficult for these younger children. This normative data provides a useful clinical tool to measure mentalising ability in more able children with autism spectrum disorder.

  14. SmartWay Truck Tool-Advanced Class: Getting the Most out of Your SmartWay Participation

    EPA Pesticide Factsheets

    This EPA presentation provides information on the Advanced SmartWay Truck Tool: its background, development, participation, data collection, usage, fleet categories, emission metrics, ranking system, performance data, reports, and schedule for 2017.

  15. Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996. Statistics in Brief.

    ERIC Educational Resources Information Center

    Heaviside, Sheila; And Others

    The "Survey of Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996" collected information from 911 regular United States public elementary and secondary schools regarding the availability and use of advanced telecommunications, and in particular, access to the Internet, plans to obtain Internet access, use of…

  16. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  17. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  18. A Randomized Controlled Trial of an Advance Care Planning Video Decision Support Tool for Patients with Advanced Heart Failure

    PubMed Central

    El-Jawahri, Areej; Paasche-Orlow, Michael K.; Matlock, Dan; Stevenson, Lynne; Lewis, Eldrin F.; Stewart, Garrick; Semigran, Marc; Chang, Yuchiao; Parks, Kimberly; Walker-Corkery, Elizabeth S.; Temel, Jennifer S.; Bohossian, Hacho; Ooi, Henry; Mann, Eileen; Volandes, Angelo E.

    2016-01-01

    Background Conversations about goals of care and CPR/intubation for patients with advanced heart failure (HF) can be difficult. This study examined the impact of a video decision support tool and a patient checklist on advance care planning (ACP) for patients with HF. Methods Multi-site randomized controlled trial of a video-assisted intervention and ACP checklist versus a verbal description in 246 patients ≥ 64 years with HF and an estimated likelihood of death of > 50% within two years. Intervention participants received a verbal description for goals of care (life-prolonging care, limited care, and comfort care) and CPR/intubation plus a six-minute video depicting the three levels of care and CPR/intubation as well as an ACP checklist. Controls received only the verbal description. The primary analysis compared the proportion of patients preferring comfort care between study arms immediately after the intervention. Secondary outcomes were CPR/intubation preferences and knowledge (6-item test, range 0-6) after intervention. Results In the intervention group, 27 (22%) chose life-prolonging, 31 (25%) limited, 63 (51%) comfort, with two (2%) uncertain. In the control group, 50 (41%) chose life-prolonging, 27 (22%) limited, 37 (30%) comfort, with eight (7%) uncertain (P<0.001). Intervention participants (vs. controls) were more likely to forgo CPR (68% vs. 35%, P <0.001) and intubation (77% vs. 48%, P <0.001), and had higher mean knowledge scores (4.1 vs. 3.0; P < 0.001). Conclusions Patients with HF who viewed a video were more informed, more likely to select a focus on comfort, and less likely to desire CPR/intubation compared to patients receiving verbal information only. PMID:27358437

  19. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequential carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology, which were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component are then the product of total fuel consumption, the percentage reduction of the component, and the energy audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other
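
    A toy worked example of the savings formula described above (total consumption × percentage reduction × energy-audit proportion) is shown below; every number in it is a hypothetical placeholder rather than a figure from the report.

```python
# Worked illustration of the savings formula described above: savings are the
# product of total fuel consumption, the component's percentage reduction, and
# the energy-audit proportion attributable to that component. All numbers here
# are hypothetical placeholders, not figures from the report.
total_fuel_gal = 25_000_000_000          # annual fleet fuel consumption (gallons)
audit_proportion = {"drag": 0.20, "rolling_resistance": 0.13, "braking": 0.06}
pct_reduction   = {"drag": 0.05, "rolling_resistance": 0.10, "braking": 0.08}

savings = {c: total_fuel_gal * pct_reduction[c] * audit_proportion[c]
           for c in audit_proportion}
for component, gal in savings.items():
    print(f"{component}: {gal / 1e9:.2f} billion gallons saved")
print(f"total: {sum(savings.values()) / 1e9:.2f} billion gallons")
```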

  20. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  1. Design and contents of an advanced distance-based statistics course for a PhD in nursing program.

    PubMed

    Azuero, Andres; Wilbanks, Bryan; Pryor, Erica

    2013-01-01

    Doctoral nursing students and researchers are expected to understand, critique, and conduct research that uses advanced quantitative methodology. The authors describe the design and contents of a distance-based course in multivariate statistics for PhD students in nursing and health administration, compare the design to recommendations found in the literature for distance-based statistics education, and compare the course contents to a tabulation of the methodologies used in a sample of recently published quantitative dissertations in nursing. The authors conclude with a discussion based on these comparisons as well as with experiences in course implementation and directions for future course development.

  2. Genetic susceptibility and gastric cancer risk: the importance of meta-analyses as a statistical tool.

    PubMed

    García-González, María Asunción; Lanas, Angel

    2014-01-01

    Gastric cancer (GC) is a complex disease and a worldwide health burden due to its high prevalence and poor prognosis. A deeper knowledge of the factors involved in the development and progression of GC could help to identify subpopulations at risk that therefore require surveillance or early treatment strategies. Current research is based on the study of genetic variants that confer a higher risk of GC and their interactions with environmental exposure. Recently, meta-analysis has emerged as an important statistical method involving pooling of data from individual association studies to increase statistical power and obtain more conclusive results. Given the importance of chronic inflammation in the process of gastric carcinogenesis, the present article reviews the most recent meta-analyses of the contribution of cytokine gene polymorphisms to GC risk.
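
    As a generic illustration of the pooling step such meta-analyses rely on, the sketch below combines hypothetical per-study odds ratios by inverse-variance weighting and computes the DerSimonian-Laird between-study variance; it is not drawn from any of the reviewed meta-analyses.

```python
# Minimal sketch of inverse-variance pooling of log odds ratios, the kind of
# meta-analytic pooling discussed above. The per-study odds ratios and 95% CIs
# below are hypothetical, not results for any real polymorphism.
import numpy as np

studies = [  # (OR, lower 95% CI, upper 95% CI)
    (1.30, 0.95, 1.78),
    (1.10, 0.80, 1.51),
    (1.45, 1.05, 2.00),
]

log_or = np.log([s[0] for s in studies])
se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
w = 1.0 / se**2                                   # fixed-effect weights

pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")

# DerSimonian-Laird between-study variance for a random-effects version.
q = np.sum(w * (log_or - pooled) ** 2)
tau2 = max(0.0, (q - (len(studies) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
print(f"tau^2 = {tau2:.4f}")
```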

  3. Determination of reference limits: statistical concepts and tools for sample size calculation.

    PubMed

    Wellek, Stefan; Lackner, Karl J; Jennen-Steinmetz, Christine; Reinhard, Iris; Hoffmann, Isabell; Blettner, Maria

    2014-12-01

    Reference limits are estimators for 'extreme' percentiles of the distribution of a quantitative diagnostic marker in the healthy population. In most cases, interest will be in the 90% or 95% reference intervals. The standard parametric method of determining reference limits consists of computing quantities of the form X̅±c·S. The proportion of covered values in the underlying population coincides with the specificity obtained when a measurement value falling outside the corresponding reference region is classified as diagnostically suspect. Nonparametrically, reference limits are estimated by means of so-called order statistics. In both approaches, the precision of the estimate depends on the sample size. We present computational procedures for calculating minimally required numbers of subjects to be enrolled in a reference study. The much more sophisticated concept of reference bands replacing statistical reference intervals in case of age-dependent diagnostic markers is also discussed.
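
    A minimal sketch of the two estimation approaches described above, using simulated marker values rather than real reference-study data, might look like this:

```python
# Sketch of the two approaches described above for a 95% reference interval:
# parametric (mean +/- 1.96*SD, assuming approximate normality) and
# nonparametric (empirical 2.5th and 97.5th percentiles). The marker values
# are simulated, not data from a real reference study.
import numpy as np

rng = np.random.default_rng(1)
marker = rng.normal(loc=5.0, scale=1.2, size=400)   # healthy-population sample

mean, sd = marker.mean(), marker.std(ddof=1)
parametric = (mean - 1.96 * sd, mean + 1.96 * sd)

nonparametric = tuple(np.percentile(marker, [2.5, 97.5]))

print(f"parametric 95% reference interval   : {parametric[0]:.2f} - {parametric[1]:.2f}")
print(f"nonparametric 95% reference interval: {nonparametric[0]:.2f} - {nonparametric[1]:.2f}")
```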

  4. Statistical Methods and Tools for Uxo Characterization (SERDP Final Technical Report)

    SciTech Connect

    Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.; Hassig, Nancy L.; Carlson, Deborah K.; O'Brien, Robert F.; Bates, Derrick J.; Sandness, Gerald A.; Anderson, Kevin K.

    2004-11-15

    The Strategic Environmental Research and Development Program (SERDP) issued a statement of need for FY01 titled Statistical Sampling for Unexploded Ordnance (UXO) Site Characterization that solicited proposals to develop statistically valid sampling protocols for cost-effective, practical, and reliable investigation of sites contaminated with UXO; protocols that could be validated through subsequent field demonstrations. The SERDP goal was the development of a sampling strategy for which a fraction of the site is initially surveyed by geophysical detectors to confidently identify clean areas and subsections (target areas, TAs) that had elevated densities of anomalous geophysical detector readings that could indicate the presence of UXO. More detailed surveys could then be conducted to search the identified TAs for UXO. SERDP funded three projects: those proposed by the Pacific Northwest National Laboratory (PNNL) (SERDP Project No. UXO 1199), Sandia National Laboratory (SNL), and Oak Ridge National Laboratory (ORNL). The projects were closely coordinated to minimize duplication of effort and facilitate use of shared algorithms where feasible. This final report for PNNL Project 1199 describes the methods developed by PNNL to address SERDP's statement-of-need for the development of statistically-based geophysical survey methods for sites where 100% surveys are unattainable or cost prohibitive.

  5. Advanced Tools for River Science: EAARL and MD_SWMS: Chapter 3

    USGS Publications Warehouse

    Kinzel, Paul J.

    2009-01-01

    Disruption of flow regimes and sediment supplies, induced by anthropogenic or climatic factors, can produce dramatic alterations in river form, vegetation patterns, and associated habitat conditions. To improve habitat in these fluvial systems, resource managers may choose from a variety of treatments including flow and/or sediment prescriptions, vegetation management, or engineered approaches. Monitoring protocols developed to assess the morphologic response of these treatments require techniques that can measure topographic changes above and below the water surface efficiently, accurately, and in a standardized, cost-effective manner. Similarly, modeling of flow, sediment transport, habitat, and channel evolution requires characterization of river morphology for model input and verification. Recent developments by the U.S. Geological Survey with regard to both remotely sensed methods (the Experimental Advanced Airborne Research LiDAR; EAARL) and computational modeling software (the Multi-Dimensional Surface-Water Modeling System; MD_SWMS) have produced advanced tools for spatially explicit monitoring and modeling in aquatic environments. In this paper, we present a pilot study conducted along the Platte River, Nebraska, that demonstrates the combined use of these river science tools.

  6. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to process their statistical bias corrections relative to reference data from observations. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
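
    The abstract does not spell out SiBiaS's exact algorithm, so the sketch below shows a generic empirical quantile-mapping correction of the kind commonly used for statistical bias correction of GCM output; the rainfall-like series are synthetic and the implementation is illustrative only.

```python
# Generic empirical quantile-mapping sketch of the kind of statistical bias
# correction described above; this is an illustration, not SiBiaS's actual
# algorithm. `obs` and `gcm_hist` are hypothetical monthly rainfall series for
# the same historical period; `gcm_future` is the projection to be corrected.
import numpy as np

def quantile_map(gcm_future, gcm_hist, obs, n_quantiles=100):
    """Map each future GCM value through the hist-GCM -> obs quantile relation."""
    q = np.linspace(0, 100, n_quantiles)
    gcm_q = np.percentile(gcm_hist, q)
    obs_q = np.percentile(obs, q)
    # Find each future value's quantile in the historical GCM distribution,
    # then read off the observed value at that same quantile.
    ranks = np.interp(gcm_future, gcm_q, q)
    return np.interp(ranks, q, obs_q)

rng = np.random.default_rng(42)
obs        = rng.gamma(shape=2.0, scale=80.0, size=360)   # observed, wetter
gcm_hist   = rng.gamma(shape=2.0, scale=60.0, size=360)   # model, too dry
gcm_future = rng.gamma(shape=2.0, scale=65.0, size=360)

corrected = quantile_map(gcm_future, gcm_hist, obs)
print(round(gcm_future.mean(), 1), "->", round(corrected.mean(), 1))
```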

  7. In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.

    PubMed

    Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl

    2017-01-01

    The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in many other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among them in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, from homology gene discovery and molecular diagnosis to DNA fingerprinting and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, multiple primer or probe searches in large or small databases, and advanced searching. These tools are suitable for processing the batch files that are essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html .
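
    The toy sketch below illustrates the basic idea of mismatch-tolerant primer matching against a template and its reverse complement; it is a didactic stand-in, not FastPCR's search algorithm, and the sequences are invented.

```python
# Toy sketch (not FastPCR's algorithm) of the basic in silico primer search
# idea: slide a primer along a template and its reverse complement and report
# positions with at most `max_mismatches` mismatches. Sequences are made up.
def revcomp(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_sites(template, primer, max_mismatches=1):
    hits = []
    for strand, seq in (("+", template), ("-", revcomp(template))):
        for i in range(len(seq) - len(primer) + 1):
            mismatches = sum(a != b for a, b in zip(seq[i:i + len(primer)], primer))
            if mismatches <= max_mismatches:
                hits.append((strand, i, mismatches))
    return hits

template = "ATGCGTACGTTAGCATGCGTACGTAAGCTT"
print(find_sites(template, "ATGCGTACGT", max_mismatches=1))
```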

  8. COLLABORATIVE RESEARCH: USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    SciTech Connect

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  9. genipe: an automated genome-wide imputation pipeline with automatic reporting and statistical tools.

    PubMed

    Lemieux Perreault, Louis-Philippe; Legault, Marc-André; Asselin, Géraldine; Dubé, Marie-Pierre

    2016-12-01

    Genotype imputation is now commonly performed following genome-wide genotyping experiments. Imputation increases the density of analyzed genotypes in the dataset, enabling fine-mapping across the genome. However, the process of imputation using the most recent publicly available reference datasets can require considerable computation power and the management of hundreds of large intermediate files. We have developed genipe, a complete genome-wide imputation pipeline which includes automatic reporting, imputed data indexing and management, and a suite of statistical tests for imputed data commonly used in genetic epidemiology (Sequence Kernel Association Test, Cox proportional hazards for survival analysis, and linear mixed models for repeated measurements in longitudinal studies).
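
    genipe wraps more elaborate tests (SKAT, Cox proportional hazards, linear mixed models); the sketch below only illustrates the simplest form of association testing on imputed allelic dosages, with synthetic data and a plain linear regression standing in for those tools.

```python
# Sketch of a single-variant association test on imputed allelic dosages
# (values in [0, 2]); this is not genipe itself, just the basic idea with
# hypothetical data and a plain linear model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
dosage = rng.uniform(0.0, 2.0, size=n)     # imputed dosage for one SNP
age = rng.normal(55.0, 8.0, size=n)        # covariate
phenotype = 0.3 * dosage + 0.05 * age + rng.normal(size=n)

X = sm.add_constant(np.column_stack([dosage, age]))
fit = sm.OLS(phenotype, X).fit()
print(fit.params)      # intercept, dosage effect, age effect
print(fit.pvalues[1])  # p-value for the dosage term
```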

  10. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-01-01

    Observation and quantification of the Earth surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of Critical Zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and ecosphere shape and maintain the "zone of life", extending from the groundwater to the vegetation canopy. Lidar holds promise as a transdisciplinary CZ research tool by simultaneously allowing for quantification of topographic, vegetative, and hydrological data. Researchers are just beginning to utilize lidar datasets to answer synergistic questions in CZ science, such as how landforms and soils develop in space and time as a function of the local climate, biota, hydrologic properties, and lithology. This review's objective is to demonstrate the transformative potential of lidar by critically assessing both challenges and opportunities for transdisciplinary lidar applications. A review of 147 peer-reviewed studies utilizing lidar showed that 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % had an interdisciplinary focus. We find that using lidar to its full potential will require numerous advances across CZ applications, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically-based models and complementary in situ and remote-sensing observations. We provide a five-year vision to utilize and advocate for the expanded use of lidar datasets to benefit CZ science applications.

  11. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    PubMed

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    A deep well that penetrates Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study them. One such borehole, the Chinese Continental Scientific Drilling Main Hole, is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ gamma-ray spectroscopy measurements of major and trace elements in the borehole. The dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are sufficiently accurate to be highly useful in the analysis of UHP metamorphic rocks. Copyright © 2017 Elsevier Ltd. All rights reserved.
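
    A minimal sketch of the PCA step described above, run on synthetic stand-ins for a few of the oxide logs (SiO2, Al2O3, K2O, Fe2O3); the real log values and the cross-plot analysis are not reproduced here.

```python
# PCA on synthetic oxide-log data, standing in for the in situ geochemical
# logs analysed in the study; loadings indicate which oxides drive each
# principal component.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
depths = 200
logs = np.column_stack([
    rng.normal(60.0, 5.0, depths),   # SiO2 (wt%)
    rng.normal(15.0, 2.0, depths),   # Al2O3
    rng.normal(3.0, 1.0, depths),    # K2O
    rng.normal(5.0, 1.5, depths),    # Fe2O3
])

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(logs))
print(pca.explained_variance_ratio_)
print(pca.components_)   # loadings linking each component to the oxides
```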

  12. Computational AstroStatistics: fast and efficient tools for analysing huge astronomical data sources

    NASA Astrophysics Data System (ADS)

    Nichol, Robert C.; Chong, S.; Connolly, A. J.; Davies, S.; Genovese, C.; Hopkins, A. M.; Miller, C. J.; Moore, A. W.; Pelleg, D.; Richards, G. T.; Schneider, J.; Szapudi, I.; Wasserman, L.

    I present here a review of past and present multi-disciplinary research of the Pittsburgh Computational AstroStatistics (PiCA) group. This group is dedicated to developing fast and efficient statistical algorithms for analysing huge astronomical data sources. I begin with a short review of multi-resolutional kd-trees, which are the building blocks for many of our algorithms, for example quick range queries and fast N-point correlation functions. I will present new results from the use of Mixture Models (Connolly et al. 2000) in density estimation of multi-color data from the Sloan Digital Sky Survey (SDSS), specifically the selection of quasars and the automated identification of X-ray sources. I will also present a brief overview of the False Discovery Rate (FDR) procedure (Miller et al. 2001) and show how it has been used in the detection of "Baryon Wiggles" in the local galaxy power spectrum and in source identification in radio data. Finally, I will look forward to new research on an automated Bayes Network anomaly detector and the possible use of the Locally Linear Embedding algorithm (LLE; Roweis & Saul 2000) for spectral classification of SDSS spectra.
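
    As a small illustration of the kd-tree pair counting mentioned above, the sketch below computes a crude two-point correlation estimate (DD/RR - 1) with SciPy's cKDTree on random points; it is not the PiCA group's implementation, and the estimator and catalogue sizes are deliberately simplistic.

```python
# kd-tree-accelerated pair counting for a rough two-point correlation
# estimate; the points are uniform random, so the signal should be near zero.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
data = rng.uniform(0, 100, size=(2000, 2))   # hypothetical sky positions
rand = rng.uniform(0, 100, size=(2000, 2))   # random comparison catalogue

radii = np.array([1.0, 2.0, 5.0, 10.0])
dd = cKDTree(data).count_neighbors(cKDTree(data), radii)   # data-data pairs
rr = cKDTree(rand).count_neighbors(cKDTree(rand), radii)   # random-random pairs
xi = dd / rr - 1.0                                         # crude DD/RR - 1 estimate
print(dict(zip(radii, xi)))
```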

  13. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  14. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.

    PubMed

    Rashid, Mamoon; Stingl, Ulrich

    2015-12-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Molecular tools for functional genomics in filamentous fungi: recent advances and new strategies.

    PubMed

    Jiang, Dewei; Zhu, Wei; Wang, Yunchuan; Sun, Chang; Zhang, Ke-Qin; Yang, Jinkui

    2013-12-01

    Advances in genetic transformation techniques have made important contributions to molecular genetics. Various molecular tools and strategies have been developed for functional genomic analysis of filamentous fungi since the first DNA transformation was successfully achieved in Neurospora crassa in 1973. Increasing amounts of genomic data regarding filamentous fungi are continuously reported and large-scale functional studies have become common in a wide range of fungal species. In this review, various molecular tools used in filamentous fungi are compared and discussed, including methods for genetic transformation (e.g., protoplast transformation, electroporation, and microinjection), the construction of random mutant libraries (e.g., restriction enzyme mediated integration, transposon arrayed gene knockout, and Agrobacterium tumefaciens mediated transformation), and the analysis of gene function (e.g., RNA interference and transcription activator-like effector nucleases). We also focused on practical strategies that could enhance the efficiency of genetic manipulation in filamentous fungi, such as choosing a proper screening system and marker genes, assembling target-cassettes or vectors effectively, and transforming into strains that are deficient in the nonhomologous end joining pathway. In summary, we present an up-to-date review on the different molecular tools and latest strategies that have been successfully used in functional genomics in filamentous fungi.

  16. Completion of the Edward Air Force Base Statistical Guidance Wind Tool

    NASA Technical Reports Server (NTRS)

    Dreher, Joseph G.

    2008-01-01

    The goal of this task was to develop a GUI using EAFB wind tower data, similar to the KSC SLF peak wind tool that is already in operations at SMG. In 2004, MSFC personnel began work to replicate the KSC SLF tool using several wind towers at EAFB. They completed the analysis and QC of the data, but due to higher-priority work did not start development of the GUI. MSFC personnel calculated wind climatologies and probabilities of 10-minute peak wind occurrence based on the 2-minute average wind speed for several EAFB wind towers. Once the data were QC'ed and analyzed, the climatologies were calculated following the methodology outlined in Lambert (2003). The climatologies were calculated for each tower and month, and then were stratified by hour, direction (10° sectors), and direction (45° sectors)/hour. For all climatologies, MSFC calculated the mean, standard deviation and observation counts of the 2-minute average and 10-minute peak wind speeds. MSFC personnel also calculated empirical and modeled probabilities of meeting or exceeding specific 10-minute peak wind speeds using PDFs. The empirical PDFs were asymmetrical and bounded on the left by the 2-minute average wind speed. They calculated the parametric PDFs by fitting the GEV distribution to the empirical distributions. Parametric PDFs were calculated in order to smooth and interpolate over variations in the observed values due to possible under-sampling of certain peak winds, and to estimate probabilities associated with average winds outside the observed range. MSFC calculated the individual probabilities of meeting or exceeding specific 10-minute peak wind speeds by integrating the area under each curve. The probabilities assist SMG forecasters in assessing the shuttle FR for various 2-minute average wind speeds. The AMU obtained the processed EAFB data from Dr. Lee Burns of MSFC and reformatted them for input to Excel PivotTables, which allow users to display different values with point
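
    As an illustration of the GEV-based exceedance probabilities described above, the sketch below fits a GEV distribution to hypothetical 10-minute peak wind speeds and integrates the upper tail; the wind values, units and threshold are made up and do not come from the EAFB climatology.

```python
# Fit a GEV distribution to hypothetical 10-minute peak wind speeds observed
# for one 2-minute average speed bin, then report the probability of meeting
# or exceeding a chosen peak-wind threshold (survival function of the fit).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
peaks_kt = 12.0 + rng.gumbel(3.0, 2.0, size=400)   # hypothetical peak winds (kt)

c, loc, scale = genextreme.fit(peaks_kt)
threshold_kt = 20.0
p_exceed = genextreme.sf(threshold_kt, c, loc=loc, scale=scale)
print(f"P(10-min peak >= {threshold_kt} kt) = {p_exceed:.3f}")
```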

  17. Exon array data analysis using Affymetrix power tools and R statistical software

    PubMed Central

    2011-01-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform. PMID:21498550

  18. Exon array data analysis using Affymetrix power tools and R statistical software.

    PubMed

    Lockstone, Helen E

    2011-11-01

    The use of microarray technology to measure gene expression on a genome-wide scale has been well established for more than a decade. Methods to process and analyse the vast quantity of expression data generated by a typical microarray experiment are similarly well-established. The Affymetrix Exon 1.0 ST array is a relatively new type of array, which has the capability to assess expression at the individual exon level. This allows a more comprehensive analysis of the transcriptome, and in particular enables the study of alternative splicing, a gene regulation mechanism important in both normal conditions and in diseases. Some aspects of exon array data analysis are shared with those for standard gene expression data but others present new challenges that have required development of novel tools. Here, I will introduce the exon array and present a detailed example tutorial for analysis of data generated using this platform.

  19. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    PubMed

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring over the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we generated 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease in values with sufficient power (>0.80). Corals generally provided higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators to detect anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
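
    The paper's exact power-analysis procedure is not detailed in the abstract; the sketch below shows one common simulation-based approach to the same question: how often a survey detects a simulated 10% annual decline at a given level of natural variability. Survey length, noise model and thresholds are illustrative assumptions.

```python
# Simulation-based power estimate: fraction of simulated monitoring series in
# which a 10% annual decline is detected (negative log-linear trend with
# p < alpha), as a function of between-survey variability.
import numpy as np
from scipy import stats

def power_to_detect(decline=0.10, years=7, cv=0.4, n_sims=2000, alpha=0.05):
    rng = np.random.default_rng(5)
    t = np.arange(years)
    detected = 0
    for _ in range(n_sims):
        mean = 100.0 * (1.0 - decline) ** t               # true declining trend
        obs = rng.normal(mean, cv * mean)                 # survey noise
        obs = np.clip(obs, 1e-6, None)                    # keep log() valid
        slope, _, _, p, _ = stats.linregress(t, np.log(obs))
        if p < alpha and slope < 0:
            detected += 1
    return detected / n_sims

print(power_to_detect(cv=0.2))   # low natural variability: higher power
print(power_to_detect(cv=0.6))   # high natural variability: lower power
```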

  20. PECA: a novel statistical tool for deconvoluting time-dependent gene expression regulation.

    PubMed

    Teo, Guoshou; Vogel, Christine; Ghosh, Debashis; Kim, Sinae; Choi, Hyungwon

    2014-01-03

    Protein expression varies as a result of intricate regulation of synthesis and degradation of messenger RNAs (mRNA) and proteins. Studies of dynamic regulation typically rely on time-course data sets of mRNA and protein expression, yet there are no statistical methods that integrate these multiomics data and deconvolute individual regulatory processes of gene expression control underlying the observed concentration changes. To address this challenge, we developed Protein Expression Control Analysis (PECA), a method to quantitatively dissect protein expression variation into the contributions of mRNA synthesis/degradation and protein synthesis/degradation, termed RNA-level and protein-level regulation respectively. PECA computes the rate ratios of synthesis versus degradation as the statistical summary of expression control during a given time interval at each molecular level and computes the probability that the rate ratio changed between adjacent time intervals, indicating regulation change at the time point. Along with the associated false-discovery rates, PECA gives the complete description of dynamic expression control, that is, which proteins were up- or down-regulated at each molecular level and each time point. Using PECA, we analyzed two yeast data sets monitoring the cellular response to hyperosmotic and oxidative stress. The rate ratio profiles reported by PECA highlighted a large magnitude of RNA-level up-regulation of stress response genes in the early response and concordant protein-level regulation with time delay. However, the contributions of RNA- and protein-level regulation and their temporal patterns were different between the two data sets. We also observed several cases where protein-level regulation counterbalanced transcriptomic changes in the early stress response to maintain the stability of protein concentrations, suggesting that proteostasis is a proteome-wide phenomenon mediated by post-transcriptional regulation.

  1. Advances in physical activity and nutrition environment assessment tools and applications: recommendations.

    PubMed

    Glanz, Karen; Sallis, James F; Saelens, Brian E

    2015-05-01

    In the past 15 years, researchers, practitioners, and community residents and leaders have become increasingly interested in associations among built environments and physical activity, diet, and obesity. Numerous tools to measure activity and food environments have been developed but vary in quality and usability. Future progress depends on aligning these tools with new communication technology and increasing their utility for planning and policy. The Built Environment Assessment Training Institute Think Tank was held in July 2013. Expert participants discussed priorities, gaps, and promising opportunities to advance the science and practice of measuring obesity-related built environments. Participants proposed and voted on recommended future directions in two categories: "big ideas" and additional recommendations. Recommendations for the first "big idea" involve developing new, simplified built environment assessment tools and deploying them through online trainings and easily accessible web-based apps. Future iterations of the tools would link to databases of key locations (e.g., parks, food stores); have built-in scoring and analysis; and provide clear, simple feedback to users. A second "big idea" addresses dissemination of results from built environment assessments and translation into policies including land use and food access planning. Additional recommendations include (1) improving multidisciplinary collaborations; (2) engaging stakeholders across sectors; (3) creating centralized data resource centers; (4) increasing the use of emerging technologies to communicate findings; and (5) advocating for expanded funding for measurement development, training, and dissemination. Implementing these recommendations is likely to improve the quality of built environment measures and expand their use in research and practice. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
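
    As a concrete illustration of the kernel requirement the review describes, the sketch below builds one common genomic-similarity kernel (a linear kernel on standardized genotypes) and checks that it is positive semidefinite; the genotype matrix is simulated, and this kernel choice is just one example, not the paper's prescription.

```python
# Linear kernel on standardized genotypes as a genomic-similarity measure,
# with a positive-semidefiniteness check via the eigenvalue spectrum.
import numpy as np

rng = np.random.default_rng(6)
genotypes = rng.integers(0, 3, size=(50, 1000)).astype(float)  # 50 subjects x 1000 SNPs

# Standardize each SNP, then form the kernel K = Z Z' / m
z = (genotypes - genotypes.mean(axis=0)) / (genotypes.std(axis=0) + 1e-12)
kernel = z @ z.T / z.shape[1]

eigvals = np.linalg.eigvalsh(kernel)
print(kernel.shape, "min eigenvalue:", eigvals.min())   # ~>= 0 up to rounding error
```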

  3. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    PubMed

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. The aim was to investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies, for all patients and across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health, and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
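
    The study compares several clustering approaches; the sketch below illustrates just one of them, hierarchical clustering on a correlation-based distance between symptom items, using synthetic stand-ins for EORTC QLQ-C30 scores. The item names and the two-cluster cut are illustrative assumptions.

```python
# Hierarchical clustering of symptom items using 1 - correlation as the
# dissimilarity; synthetic scores are built so that emotional and somatic
# items should separate into two clusters.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
n_patients = 300
emotional = rng.normal(size=n_patients)
somatic = rng.normal(size=n_patients)
symptoms = {
    "tense": emotional + rng.normal(scale=0.5, size=n_patients),
    "worry": emotional + rng.normal(scale=0.5, size=n_patients),
    "depressed": emotional + rng.normal(scale=0.5, size=n_patients),
    "fatigue": somatic + rng.normal(scale=0.5, size=n_patients),
    "pain": somatic + rng.normal(scale=0.5, size=n_patients),
}
names = list(symptoms)
data = np.column_stack([symptoms[k] for k in names])

corr = np.corrcoef(data, rowvar=False)
dist = 1.0 - corr                                    # correlation-based dissimilarity
condensed = dist[np.triu_indices_from(dist, k=1)]    # condensed form for linkage()
labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print(dict(zip(names, labels)))
```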

  4. Using Microsoft Excel to teach statistics in a graduate advanced practice nursing program.

    PubMed

    DiMaria-Ghalili, Rose Ann; Ostrow, C Lynne

    2009-02-01

    This article describes the authors' experiences during 3 years of using Microsoft Excel to teach graduate-level statistics, as part of the research core required by the American Association of Colleges of Nursing for all professional graduate nursing programs. The advantages to using this program instead of specialized statistical programs are ease of accessibility, increased transferability of skills, and reduced cost for students. The authors share their insight about realistic goals for teaching statistics to master's-level students and the resources that are available to faculty to help them to learn and use Excel in their courses. Several online sites that are excellent resources for both faculty and students are discussed. Detailed attention is given to an online course (Carnegie-Mellon University Open Learning Initiative, n.d.), which the authors have incorporated into their graduate-level research methods course.

  5. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    SciTech Connect

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-05-01

    The statistics of the measured magnetic data of 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive designs for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section and the quadrupole and sextupole cross sections have 180° and 120° symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.

  6. Tools based on multivariate statistical analysis for classification of soil and groundwater in Apulian agricultural sites.

    PubMed

    Ielpo, Pierina; Leardi, Riccardo; Pappagallo, Giuseppe; Uricchio, Vito Felice

    2017-06-01

    In this paper, the results obtained from multivariate statistical techniques such as PCA (principal component analysis) and LDA (linear discriminant analysis) applied to a wide soil data set are presented. The results have been compared with those obtained on a groundwater data set, whose samples were collected together with the soil ones, within the project "Improvement of the Regional Agro-meteorological Monitoring Network (2004-2007)". LDA, applied to the soil data, made it possible to distinguish the geographical origin of a sample between two macroareas, the Bari and Foggia provinces versus the Brindisi, Lecce and Taranto provinces, with 87% of correct predictions in cross validation. In the case of the groundwater data set, the best classification was obtained when the samples were grouped into three macroareas (Foggia province, Bari province, and Brindisi, Lecce and Taranto provinces), reaching 84% of correct predictions in cross validation. The information obtained can be very useful in supporting soil and water resource management, such as the reduction of water consumption and of energy and chemical (nutrients and pesticides) inputs in agriculture.
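
    A minimal sketch of the LDA-with-cross-validation workflow described above, applied to synthetic stand-ins for soil-chemistry variables from two hypothetical macroareas; the variables, class means and fold count are assumptions, not the Apulian data set.

```python
# Linear discriminant analysis with cross-validated accuracy on synthetic
# two-class "soil chemistry" data, mirroring the macroarea classification task.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n_per_area = 100
# Two hypothetical macroareas with slightly shifted soil-chemistry profiles
area_a = rng.normal(loc=[7.5, 2.0, 30.0], scale=[0.5, 0.6, 8.0], size=(n_per_area, 3))
area_b = rng.normal(loc=[7.9, 1.4, 24.0], scale=[0.5, 0.6, 8.0], size=(n_per_area, 3))
X = np.vstack([area_a, area_b])
y = np.array([0] * n_per_area + [1] * n_per_area)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```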

  7. Laser flow cytometry as a tool for the advancement of clinical medicine.

    PubMed

    Aebisher, David; Bartusik, Dorota; Tabarkiewicz, Jacek

    2017-01-01

    Flow cytometry is a classic laser technology. With the discovery of the cytometer, flow cytometry has become a primary tool in biodiagnostic research. This review focuses on current applications of flow cytometry to the diagnosis of disease and treatment monitoring at the single-cell level. A description of the principles of flow cytometry and a brief overview of the major applications are presented. Our criteria for selecting research papers for this review are those that show advances in biomedicine and pharmacotherapy achieved by using non-invasive flow cytometry. New concepts for diagnosis and classification based on quantitative measurements of cellular parameters and the expression of specific differentiation antigens on the surface of cells will be discussed herein. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  8. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are the cost drivers and whether technology investments can dramatically affect life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is put forward, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  9. Community-based participatory research as a tool to advance environmental health sciences.

    PubMed Central

    O'Fallon, Liam R; Dearry, Allen

    2002-01-01

    The past two decades have witnessed a rapid proliferation of community-based participatory research (CBPR) projects. CBPR methodology presents an alternative to traditional population-based biomedical research practices by encouraging active and equal partnerships between community members and academic investigators. The National Institute of Environmental Health Sciences (NIEHS), the premier biomedical research facility for environmental health, is a leader in promoting the use of CBPR in instances where community-university partnerships serve to advance our understanding of environmentally related disease. In this article, the authors highlight six key principles of CBPR and describe how these principles are met within specific NIEHS-supported research investigations. These projects demonstrate that community-based participatory research can be an effective tool to enhance our knowledge of the causes and mechanisms of disorders having an environmental etiology, reduce adverse health outcomes through innovative intervention strategies and policy change, and address the environmental health concerns of community residents. PMID:11929724

  10. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a decision-support data-derived tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for a manned Mars mission, but can be adapted to support other high-technology long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives that make the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  11. Development of Advanced Space Weather Tools at Goddard Space Flight Center

    NASA Astrophysics Data System (ADS)

    Hesse, Michael; Kuznetsova, Masha; Pulkkinen, Antti; Zheng, Yihua; Maddox, Marlo; Rastaetter, Lutz

    2013-04-01

    Space research and, consequently, space weather forecasting are immature disciplines. Scientific knowledge is accumulated frequently, which changes our understanding of how solar eruptions occur and of how they impact targets near or on the Earth, or targets throughout the heliosphere. Along with continuous progress in understanding, space research and forecasting models are advancing rapidly in capability, often providing substantial increases in space weather value over time scales of less than a year. Furthermore, the majority of space environment information available today, particularly in the solar and heliospheric domains, is derived from research missions. An optimal forecasting environment needs to be flexible enough to benefit from this rapid development and to adapt to evolving data sources, many of which may also stem from non-US entities. This presentation will analyze the experience gained by developing and operating a forecasting service for NASA, and it will provide an overview of tool development at Goddard Space Flight Center.

  13. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  14. REGSTATTOOLS: freeware statistical tools for the analysis of disease population databases used in health and social studies.

    PubMed

    Esteban, Laura; Clèries, Ramon; Gálvez, Jordi; Pareja, Laura; Escribà, Josep Maria; Sanz, Xavier; Izquierdo, Angel; Galcerán, Jaume; Ribes, Josepa

    2013-03-07

    The repertoire of statistical methods dealing with the descriptive analysis of the burden of a disease has been expanded and implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS (http://regstattools.net), intended to provide analysis of the burden of cancer, or of other disease groups, from registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in the rates due to demographic factors and risk of developing or dying from a disease) and WAERS (relative survival analysis). We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with rising trends in incidence among women. We compared 2000-2004 data with that of 1995-1999 to assess percent changes in the number of cases as well as relative survival using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it was attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas it was similar among women when these time periods were compared. Unlike other similar applications, REGSTATTOOLS does not require local software installation and it is simple to use, fast and easy to interpret. It is a set of web-based statistical tools intended for automated calculation of population indicators that any professional in health or social sciences may require.

  15. Applicability of some statistical tools to predict optimum adsorption isotherm after linear and non-linear regression analysis.

    PubMed

    Ncibi, Mohamed Chaker

    2008-05-01

    In any single-component isotherm study, determining the best-fitting model is a key analysis for mathematically describing the sorption system involved and, therefore, for exploring the related theoretical assumptions. Hence, several error calculation functions have been widely used to estimate the deviations between experimental and theoretically predicted equilibrium adsorption values (Q(e,exp) vs. Q(e,theo) as X- and Y-axes, respectively), including the average relative error deviation, Marquardt's percent standard error deviation, the hybrid fractional error function, the sum of the squares of the errors, the correlation coefficient and the residuals. In this study, five other statistical functions are analysed to investigate their applicability as tools to evaluate isotherm model fitness, namely the Pearson correlation coefficient, the coefficient of determination, the chi-square test, the F-test and Student's t-test, using the commonly used functions as references. The adsorption of textile dye onto Posidonia oceanica seagrass fibres was carried out, as a case study, in batch mode at 20 degrees C. In addition, in order to gain an overall view of the possible use of these statistical functions, the examination was carried out for both linear and non-linear regression analysis. The results showed that, among the five statistical tools studied, the chi-square and Student's t-tests were suitable for determining the best-fitting isotherm model in the linear modelling approach. On the other hand, for the non-linear analysis, all the functions except Student's t-test gave satisfactory results, in agreement with the commonly used error functions.
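
    As an illustration of the non-linear fitting and chi-square comparison discussed above, the sketch below fits a Langmuir isotherm to synthetic equilibrium data and evaluates the chi-square error function; the data, parameter values and isotherm choice are illustrative and do not reproduce the dye/Posidonia oceanica measurements.

```python
# Non-linear fit of a Langmuir isotherm to synthetic adsorption data, followed
# by the chi-square error function between experimental and predicted values.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    return qmax * kl * ce / (1.0 + kl * ce)

rng = np.random.default_rng(9)
ce = np.linspace(5, 300, 12)                               # equilibrium concentration (mg/L)
qe_exp = langmuir(ce, 60.0, 0.02) + rng.normal(scale=1.5, size=ce.size)

popt, _ = curve_fit(langmuir, ce, qe_exp, p0=[50.0, 0.01])
qe_theo = langmuir(ce, *popt)
chi2 = np.sum((qe_exp - qe_theo) ** 2 / qe_theo)           # chi-square error function
print("qmax, KL:", popt, "chi2:", chi2)
```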

  16. Complex Spine Pathology Simulator: An Innovative Tool for Advanced Spine Surgery Training.

    PubMed

    Gragnaniello, Cristian; Abou-Hamden, Amal; Mortini, Pietro; Colombo, Elena V; Bailo, Michele; Seex, Kevin A; Litvack, Zachary; Caputy, Anthony J; Gagliardi, Filippo

    2016-11-01

    Background Technical advancements in spine surgery have made possible the treatment of increasingly complex pathologies with less morbidity. Time constraints in surgeons' training have made it necessary to develop new training models for spine pathology. Objective To describe the application of a novel compound, Stratathane resin ST-504 derived polymer (SRSDP), that can be injected at different spinal target locations to mimic spinal epidural, subdural extra-axial, and intra-axial pathologies for the use in advanced surgical training. Material and Methods Fresh-frozen thoracolumbar and cervical spine segments of human and sheep cadavers were used to study the model. SRSDP is initially liquid after mixing, allowing it to be injected into target areas where it expands and solidifies, mimicking the entire spectrum of spinal pathologies. Results Different polymer concentrations have been codified to vary adhesiveness, texture, spread capability, deformability, and radiologic visibility. Polymer injection was performed under fluoroscopic guidance through pathology-specific injection sites that avoided compromising the surgical approach for subsequent excision of the artificial lesion. Inflation of a balloon catheter of the desired size was used to displace stiff cadaveric neurovascular structures to mimic pathology-related mass effect. Conclusion The traditional cadaveric training models principally only allow surgeons to practice the surgical approach. The complex spine pathology simulator is a novel educational tool that in a user-friendly, low-cost fashion allows trainees to practice advanced technical skills in the removal of complex spine pathology, potentially shortening some of the aspects of the learning curve of operative skills that may otherwise take many years to acquire.

  17. Research Registries: A Tool to Advance Understanding of Rare Neuro-Ophthalmic Diseases

    PubMed Central

    Blankshain, Kimberly D; Moss, Heather E

    2016-01-01

    Background Medical research registries (MRR) are organized systems used to collect, store and analyze patient information. They are important tools for medical research with particular application to the study of rare diseases, including those seen in neuro-ophthalmic practice. Evidence Acquisition Evidence for this review was gathered from the writers’ experiences creating a comprehensive neuro-ophthalmology registry and review of the literature. Results MRR are typically observational and prospective databases of de-identified patient information. The structure is flexible and can accommodate a focus on specific diseases or treatments, surveillance of patient populations, physician quality improvement, or recruitment for future studies. They are particularly useful for the study of rare diseases. They can be integrated into the hierarchy of medical research at many levels provided their construction is well organized and they have several key characteristics, including an easily manipulated database, comprehensive information on carefully selected patients, and compliance with human subjects regulations. MRR pertinent to neuro-ophthalmology include the UIC neuro-ophthalmology registry, Susac Syndrome Registry, Intracranial Hypertension Registry as well as larger-scale patient outcome registries being developed by professional societies. Conclusion Medical research registries have a variety of forms and applications. With careful planning and clear goals, they are flexible and powerful research tools that can support multiple different study designs, and through this have the potential to advance understanding and care of neuro-ophthalmic diseases. PMID:27389624

  18. MATISSE: Multi-purpose Advanced Tool for Instruments for the Solar System Exploration .

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Antonelli, L. A.

    In planetary sciences, designing, assembling and launching onboard instruments are only preliminary steps toward the final aim of converting data into scientific knowledge; the real challenge is the data analysis and interpretation. Up to now, data have generally been stored in "old style" archives, i.e. common ftp servers where the user searches for data manually by browsing directories organized in time order. However, as the datasets to be stored and searched become particularly large, this task absorbs a great part of the time available, subtracting time from the real scientific work. In order to reduce the time spent searching and analyzing data, MATISSE (Multi-purpose Advanced Tool for Instruments for the Solar System Exploration), a new set of software tools developed together with the scientific teams of the instruments involved, is under development at ASDC (ASI Science Data Center), whose experience in space mission data management is well known (e.g., Verrecchia et al. 2007; Pittori et al. 2009; Giommi et al. 2009; Massaro et al. 2011); its features and aims are presented here.

  19. Research Registries: A Tool to Advance Understanding of Rare Neuro-Ophthalmic Diseases.

    PubMed

    Blankshain, Kimberly D; Moss, Heather E

    2016-09-01

    Medical research registries (MRR) are organized systems used to collect, store, and analyze patient information. They are important tools for medical research with particular application to the study of rare diseases, including those seen in neuro-ophthalmic practice. Evidence for this review was gathered from the writers' experiences creating a comprehensive neuro-ophthalmology registry and review of the literature. MRR are typically observational and prospective databases of de-identified patient information. The structure is flexible and can accommodate a focus on specific diseases or treatments, surveillance of patient populations, physician quality improvement, or recruitment for future studies. They are particularly useful for the study of rare diseases. They can be integrated into the hierarchy of medical research at many levels provided their construction is well organized and they have several key characteristics, including an easily manipulated database, comprehensive information on carefully selected patients, and compliance with human subjects regulations. MRR pertinent to neuro-ophthalmology include the University of Illinois at Chicago neuro-ophthalmology registry, Susac Syndrome Registry, Intracranial Hypertension Registry, and larger-scale patient outcome registries being developed by professional societies. MRR have a variety of forms and applications. With careful planning and clear goals, they are flexible and powerful research tools that can support multiple different study designs, giving them the potential to advance understanding and care of neuro-ophthalmic diseases.

  20. DNA technological progress toward advanced diagnostic tools to support human hookworm control.

    PubMed

    Gasser, R B; Cantacessi, C; Loukas, A

    2008-01-01

    Blood-feeding hookworms are parasitic nematodes of major human health importance. Currently, it is estimated that 740 million people are infected worldwide, and more than 80 million of them are severely affected clinically by hookworm disease. In spite of the health problems caused and the advances toward the development of vaccines against some hookworms, limited attention has been paid to the need for improved, practical methods of diagnosis. Accurate diagnosis and genetic characterization of hookworms is central to their effective control. While traditional diagnostic methods have considerable limitations, there has been some progress toward the development of molecular-diagnostic tools. The present article provides a brief background on hookworm disease of humans, reviews the main methods that have been used for diagnosis and describes progress in establishing polymerase chain reaction (PCR)-based methods for the specific diagnosis of hookworm infection and the genetic characterisation of the causative agents. This progress provides a foundation for the rapid development of practical, highly sensitive and specific diagnostic and analytical tools to be used in improved hookworm prevention and control programmes.

  1. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B. A.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-06-01

    Observation and quantification of the Earth's surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of critical zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and biosphere shape and maintain the "zone of life", which extends from the top of unweathered bedrock to the top of the vegetation canopy. Fundamental to CZ science is the development of transdisciplinary theories and tools that transcend disciplines and inform one another's work, capture new levels of complexity, and create new intellectual outcomes and spaces. Researchers are just beginning to use lidar data sets to answer synergistic, transdisciplinary questions in CZ science, such as how CZ processes co-evolve over long timescales and interact over shorter timescales to create thresholds, shifts in states and fluxes of water, energy, and carbon. The objective of this review is to elucidate the transformative potential of lidar for CZ science to simultaneously allow for quantification of topographic, vegetative, and hydrological processes. A review of 147 peer-reviewed lidar studies highlights a lack of lidar applications for CZ studies as 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % had an interdisciplinary focus. A handful of exemplar transdisciplinary studies demonstrate that lidar data sets well integrated with other observations can lead to fundamental advances in CZ science, such as identification of feedbacks between hydrological and ecological processes over hillslope scales and the synergistic co-evolution of landscape-scale CZ structure due to interactions amongst carbon, energy, and water cycles

  2. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    Only fragments of this report are indexed: figure captions describing TEMTADS multi-static response matrix eigenvalues versus time for selected anomalies, and time-decay profiles for MetalMapper anomaly #2504 comparing a library sample against inversion results, together with text noting that joint diagonalization has become an important tool for characterization and classification of targets without the need for a forward model.

  3. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  4. A Kernel of Truth: Statistical Advances in Polygenic Variance Component Models for Complex Human Pedigrees

    PubMed Central

    Blangero, John; Diego, Vincent P.; Dyer, Thomas D.; Almeida, Marcio; Peralta, Juan; Kent, Jack W.; Williams, Jeff T.; Almasy, Laura; Göring, Harald H. H.

    2014-01-01

    Statistical genetic analysis of quantitative traits in large pedigrees is a formidable computational task due to the necessity of taking the non-independence among relatives into account. With the growing awareness that rare sequence variants may be important in human quantitative variation, heritability and association study designs involving large pedigrees will increase in frequency due to the greater chance of observing multiple copies of rare variants amongst related individuals. Therefore, it is important to have statistical genetic test procedures that utilize all available information for extracting evidence regarding genetic association. Optimal testing for marker/phenotype association involves the exact calculation of the likelihood ratio statistic, which requires the repeated inversion of potentially large matrices. In a whole genome sequence association context, such computation may be prohibitive. Toward this end, we have developed a rapid and efficient eigensimplification of the likelihood that makes analysis of family data commensurate with the analysis of a comparable sample of unrelated individuals. Our theoretical results, which are based on a spectral representation of the likelihood, yield simple exact expressions for the expected likelihood ratio test statistic (ELRT) for pedigrees of arbitrary size and complexity. For heritability, the ELRT is $-\sum_i \ln\left[1+\hat{h}^2(\lambda_{g_i}-1)\right]$, where $\hat{h}^2$ and $\lambda_{g_i}$ are, respectively, the heritability and the eigenvalues of the pedigree-derived genetic relationship kernel (GRK). For association analysis of sequence variants, the ELRT is given by $\mathrm{ELRT}[h_q^2>0:\mathrm{unrelateds}] - (\mathrm{ELRT}[h_t^2>0:\mathrm{pedigrees}] - \mathrm{ELRT}[h_r^2>0:\mathrm{pedigrees}])$, where $h_t^2$, $h_q^2$, and $h_r^2$ are the total, quantitative trait nucleotide, and residual heritabilities, respectively. Using these results, fast and accurate analytical power analyses are possible, eliminating the need for computer simulation. Additional benefits of eigensimplification include a simple method for
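
    A small numerical illustration of the quoted heritability expression, using eigenvalues of a synthetic genetic relationship kernel in place of a real pedigree-derived GRK; the matrix and heritability values are made up.

```python
# Evaluate ELRT = -sum_i ln[1 + h^2 (lambda_gi - 1)] for a few heritabilities,
# using the eigenvalues of a small synthetic genetic relationship kernel.
import numpy as np

rng = np.random.default_rng(10)
z = rng.normal(size=(40, 500))
grk = z @ z.T / z.shape[1]            # synthetic stand-in for a pedigree GRK
lam = np.linalg.eigvalsh(grk)         # eigenvalues lambda_gi

def expected_lrt(h2, eigenvalues):
    return -np.sum(np.log(1.0 + h2 * (eigenvalues - 1.0)))

for h2 in (0.1, 0.3, 0.5):
    print(h2, expected_lrt(h2, lam))
```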

  5. Live-site UXO classification studies using advanced EMI and statistical models

    NASA Astrophysics Data System (ADS)

    Shamatava, I.; Shubitidze, F.; Fernandez, J. P.; Bijamov, A.; Barrowes, B. E.; O'Neill, K.

    2011-06-01

    In this paper we present the inversion and classification performance of the advanced EMI inversion, processing and discrimination schemes developed by our group when applied to the ESTCP Live-Site UXO Discrimination Study carried out at the former Camp Butner in North Carolina. The advanced models combine: 1) the joint diagonalization (JD) algorithm to estimate the number of potential anomalies from the measured data without inversion, 2) the ortho-normalized volume magnetic source (ONVMS) to represent targets' EMI responses and extract their intrinsic "feature vectors," and 3) the Gaussian mixture algorithm to classify buried objects as targets of interest or not starting from the extracted discrimination features. The studies are conducted using cued datasets collected with the next-generation TEMTADS and MetalMapper (MM) sensor systems. For the cued TEMTADS datasets we first estimate the data quality and the number of targets contributing to each signal using the JD technique. Once we know the number of targets we proceed to invert the data using a standard non-linear optimization technique in order to determine intrinsic parameters such as the total ONVMS for each potential target. Finally we classify the targets using a library-matching technique. The MetalMapper data are all inverted as multi-target scenarios, and the resulting intrinsic parameters are grouped using an unsupervised Gaussian mixture approach. The potential targets of interest are a 37-mm projectile, an M48 fuze, and a 105-mm projectile. During the analysis we requested the ground truth for a few selected anomalies to assist in the classification task. Our results were scored independently by the Institute for Defense Analyses, who revealed that our advanced models produce superb classification when starting from either TEMTADS or MM cued datasets.
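    As a rough illustration of the unsupervised Gaussian-mixture step described above, the sketch below clusters synthetic two-dimensional feature vectors standing in for inverted ONVMS parameters; the feature values, the cluster count, and the "tight cluster = candidate target of interest" heuristic are assumptions made for this example, not the study's actual processing chain.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)

      # Synthetic "intrinsic feature vectors": two tight clusters standing in
      # for targets of interest plus diffuse clutter responses.
      toi_a = rng.normal(loc=[5.0, 1.0], scale=0.2, size=(30, 2))
      toi_b = rng.normal(loc=[2.0, 3.0], scale=0.2, size=(30, 2))
      clutter = rng.normal(loc=[0.0, 0.0], scale=1.5, size=(60, 2))
      features = np.vstack([toi_a, toi_b, clutter])

      # Unsupervised Gaussian mixture grouping of the extracted features
      gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
      labels = gmm.fit_predict(features)

      # Tight, low-spread components are candidate targets of interest
      for k in range(gmm.n_components):
          spread = np.sqrt(np.linalg.det(gmm.covariances_[k]))
          print(f"component {k}: {np.sum(labels == k)} anomalies, "
                f"weight={gmm.weights_[k]:.2f}, spread={spread:.3f}")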

  6. MEMHDX: an interactive tool to expedite the statistical validation and visualization of large HDX-MS datasets.

    PubMed

    Hourdel, Véronique; Volant, Stevenn; O'Brien, Darragh P; Chenal, Alexandre; Chamot-Rooke, Julia; Dillies, Marie-Agnès; Brier, Sébastien

    2016-11-15

    With the continued improvement of requisite mass spectrometers and UHPLC systems, Hydrogen/Deuterium eXchange Mass Spectrometry (HDX-MS) workflows are rapidly evolving towards the investigation of more challenging biological systems, including large protein complexes and membrane proteins. The analysis of such extensive systems results in very large HDX-MS datasets for which specific analysis tools are required to speed up data validation and interpretation. We introduce a web application and a new R-package named 'MEMHDX' to help users analyze, validate and visualize large HDX-MS datasets. MEMHDX is composed of two elements. A statistical tool aids in the validation of the results by applying a mixed-effects model for each peptide, in each experimental condition, and at each time point, taking into account the time dependency of the HDX reaction and number of independent replicates. Two adjusted P-values are generated per peptide, one for the 'Change in dynamics' and one for the 'Magnitude of ΔD', and are used to classify the data by means of a 'Logit' representation. A user-friendly interface developed with Shiny by RStudio facilitates the use of the package. This interactive tool allows the user to easily and rapidly validate, visualize and compare the relative deuterium incorporation on the amino acid sequence and 3D structure, providing both spatial and temporal information. MEMHDX is freely available as a web tool at the project home page http://memhdx.c3bi.pasteur.fr. Contact: marie-agnes.dillies@pasteur.fr or sebastien.brier@pasteur.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  7. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited accuracy but relatively costless auxiliary simulator we can effectively fill-in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression, on the fly; this has been demonstrated in previous work [1]. Based on the previous work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerate the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in
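    A toy sketch of the gap-filling idea follows (not the authors' multi-level implementation): a Gaussian process is trained on the discrepancy between the surviving fine-grid samples and a coarse, biased auxiliary solution, then used to reconstruct values lost to a simulated failure. All functions and numbers here are synthetic stand-ins.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      x = np.linspace(0.0, 1.0, 50)[:, None]
      fine = np.sin(2 * np.pi * x[:, 0])                # "expensive" solution
      coarse = 0.8 * np.sin(2 * np.pi * x[:, 0]) + 0.1  # cheap, biased auxiliary

      # Simulated failure: a block of fine-grid values is lost
      lost = (x[:, 0] > 0.4) & (x[:, 0] < 0.7)
      kept = ~lost

      # Learn the fine-minus-coarse discrepancy from the surviving samples
      gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-4),
                                    normalize_y=True)
      gp.fit(x[kept], fine[kept] - coarse[kept])

      # Fill the gap: coarse field plus predicted discrepancy
      filled = coarse[lost] + gp.predict(x[lost])
      print("max fill-in error:", np.abs(filled - fine[lost]).max())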

  8. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    PubMed

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data is transferred into statistic software, such as SAS, R or IBM SPSS Statistics, for analyses afterwards. Spreadsheet-based solutions suffer from several drawbacks: It is generally not possible to ensure a sufficient right and role management; it is not traced who has changed data when and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables a reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet based data collection to EDC systems is labor-intensive and time-consuming at present. Hence, the objectives of this research work are to develop a mapping model and implement a converter between the IBM SPSS and CDISC ODM standard and to evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS are not corresponding to any ODM elements; study related ODM elements are not available in SPSS. The S2O converting tool was implemented as command-line-tool using the SPSS internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and reverse transformation from ODM into SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format IBM SPSS and the ODM standard for definition and exchange of trial data is feasible

  9. Beyond repeated-measures analysis of variance: advanced statistical methods for the analysis of longitudinal data in anesthesia research.

    PubMed

    Ma, Yan; Mazumdar, Madhu; Memtsoudis, Stavros G

    2012-01-01

    Research in the field of anesthesiology relies heavily on longitudinal designs for answering questions about long-term efficacy and safety of various anesthetic and pain regimens. Yet, anesthesiology research is lagging in the use of advanced statistical methods for analyzing longitudinal data. The goal of this article was to increase awareness of the advantages of modern statistical methods and promote their use in anesthesia research. Here we introduce 2 modern and advanced statistical methods for analyzing longitudinal data: the generalized estimating equations (GEE) and mixed-effects models (MEM). These methods were compared with the conventional repeated-measures analysis of variance (RM-ANOVA) through a clinical example with 2 types of end points (continuous and binary). In addition, we compared GEE and MEM to RM-ANOVA through a simulation study with varying sample sizes, varying number of repeated measures, and scenarios with and without missing data. In the clinical study, the 3 methods are found to be similar in terms of statistical estimation, whereas the parameter interpretations are somewhat different. The simulation study shows that the methods of GEE and MEM are more efficient in that they are able to achieve higher power with smaller sample size or lower number of repeated measurements in both complete and missing data scenarios. Based on their advantages over RM-ANOVA, GEE and MEM should be strongly considered for the analysis of longitudinal data. In particular, GEE should be used to explore overall average effects, and MEM should be used when subject-specific effects (in addition to overall average effects) are of primary interest.
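    A minimal sketch of the two recommended approaches using the statsmodels package is given below, on a small synthetic longitudinal data set; the variable names (subject, time, treat, pain) and the data-generating model are invented for illustration and do not reproduce the article's clinical example.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n_subj, n_times = 30, 4

      # Synthetic longitudinal outcome: treatment-by-time effect plus
      # subject-level random variation
      df = pd.DataFrame({
          "subject": np.repeat(np.arange(n_subj), n_times),
          "time": np.tile(np.arange(n_times), n_subj),
          "treat": np.repeat(rng.integers(0, 2, n_subj), n_times),
      })
      subj_effect = np.repeat(rng.normal(0, 1, n_subj), n_times)
      df["pain"] = 5 - 0.5 * df["treat"] * df["time"] + subj_effect \
                   + rng.normal(0, 1, len(df))

      # GEE: population-averaged effects with an exchangeable working correlation
      gee = smf.gee("pain ~ time * treat", groups="subject", data=df,
                    cov_struct=sm.cov_struct.Exchangeable(),
                    family=sm.families.Gaussian()).fit()
      print(gee.summary())

      # Mixed-effects model: subject-specific random intercepts
      mem = smf.mixedlm("pain ~ time * treat", data=df,
                        groups=df["subject"]).fit()
      print(mem.summary())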

  10. Advancing the science of spatial neglect rehabilitation: an improved statistical approach with mixed linear modeling.

    PubMed

    Goedert, Kelly M; Boston, Raymond C; Barrett, A M

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients, may suffer from missing, unbalanced data, and small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not well-address these issues. Here we review an alternative, mixed linear modeling (MLM), that is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field, and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect.

  11. Advancing the Science of Spatial Neglect Rehabilitation: An Improved Statistical Approach with Mixed Linear Modeling

    PubMed Central

    Goedert, Kelly M.; Boston, Raymond C.; Barrett, A. M.

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients, may suffer from missing, unbalanced data, and small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not well-address these issues. Here we review an alternative, mixed linear modeling (MLM), that is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field, and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect. PMID

  12. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
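    A compact sketch of the analysis chain for a distributed test is shown below with simulated two-factor data: one measurement at each of many conditions, a quadratic response-surface fit, and an ANOVA table. The factors, the response model, and the use of a simple random plan in place of a D-optimal design are illustrative assumptions, not the study's configuration.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)

      # Distributed plan: one measurement at each of many (x1, x2) conditions
      x1 = rng.uniform(-1, 1, 40)   # e.g., a normalized flow condition
      x2 = rng.uniform(-1, 1, 40)   # e.g., a normalized geometry setting
      y = (2 + 1.5 * x1 - 0.8 * x2 + 0.6 * x1 * x2 + 0.9 * x1**2
           + rng.normal(0, 0.2, 40))
      df = pd.DataFrame({"x1": x1, "x2": x2, "y": y})

      # Response-surface (quadratic) model fitted by ordinary least squares
      rsm = smf.ols("y ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
      print(rsm.params)

      # ANOVA identifies which terms the distributed data actually support
      print(sm.stats.anova_lm(rsm, typ=2))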

  13. Statistical models for fever forecasting based on advanced body temperature monitoring.

    PubMed

    Jordan, Jorge; Miro-Martinez, Pau; Vargas, Borja; Varela-Entrecanales, Manuel; Cuesta-Frau, David

    2017-02-01

    Body temperature monitoring provides health carers with key clinical information about the physiological status of patients. Temperature readings are taken periodically to detect febrile episodes and consequently implement the appropriate medical countermeasures. However, fever is often difficult to assess at early stages, or remains undetected until the next reading, probably a few hours later. The objective of this article is to develop a statistical model to forecast fever before a temperature threshold is exceeded to improve the therapeutic approach to the subjects involved. To this end, temperature series of 9 patients admitted to a general internal medicine ward were obtained with a continuous monitoring Holter device, collecting measurements of peripheral and core temperature once per minute. These series were used to develop different statistical models that could quantify the probability of having a fever spike in the following 60 minutes. A validation series was collected to assess the accuracy of the models. Finally, the results were compared with the analysis of some series by experienced clinicians. Two different models were developed: a logistic regression model and a linear discrimination analysis model. Both of them exhibited a fever peak forecasting accuracy greater than 84%. When compared with experts' assessment, both models identified 35 (97.2%) of 36 fever spikes. The models proposed are highly accurate in forecasting the appearance of fever spikes within a short period in patients with suspected or confirmed febrile-related illnesses. Copyright © 2016 Elsevier Inc. All rights reserved.
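    A schematic version of the logistic-regression variant is sketched below on synthetic data; the two summary features (current temperature and recent slope), the 38 degC threshold, and the label-generating model are assumptions for illustration, not the authors' fitted model.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(4)
      n = 600

      # Features summarizing the recent minute-by-minute temperature series
      current_temp = rng.normal(37.0, 0.6, n)   # deg C
      slope_30min = rng.normal(0.0, 0.03, n)    # deg C per minute

      # Synthetic label: fever spike (>38 deg C) within the next 60 minutes
      logit = 6.0 * (current_temp - 37.5) + 80.0 * slope_30min
      spike = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = np.column_stack([current_temp, slope_30min])
      X_tr, X_te, y_tr, y_te = train_test_split(X, spike, test_size=0.3,
                                                random_state=0)
      clf = LogisticRegression().fit(X_tr, y_tr)
      print("forecast accuracy:", accuracy_score(y_te, clf.predict(X_te)))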

  14. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
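    The "regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent" mentioned above can be sketched in a few lines (a Haseman-Elston-type analysis); the pair data below are simulated, not drawn from the review.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n_pairs = 200

      # Proportion of alleles shared identical-by-descent at the test locus
      pi_ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])

      # Squared phenotypic difference of each relative pair: under a linked QTL
      # its expectation decreases with increasing IBD sharing
      sq_diff = 4.0 - 2.5 * pi_ibd + rng.normal(0, 1.0, n_pairs) ** 2

      fit = sm.OLS(sq_diff, sm.add_constant(pi_ibd)).fit()

      # A significantly negative slope is evidence for a QTL linked to the marker
      print(fit.params, fit.pvalues)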

  15. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  16. Recent advances in statistical methods for the estimation of sediment and nutrient transport in rivers

    NASA Astrophysics Data System (ADS)

    Colin, T. A.

    1995-07-01

    This paper reviews advances in methods for estimating fluvial transport of suspended sediment and nutrients. Research from the past four years, mostly dealing with estimating monthly and annual loads, is emphasized. However, because this topic has not appeared in previous IUGG reports, some research prior to 1990 is included. The motivation for studying sediment transport has shifted during the past few decades. In addition to its role in filling reservoirs and channels, sediment is increasingly recognized as an important part of fluvial ecosystems and estuarine wetlands. Many groups want information about sediment transport [Bollman, 1992]: Scientists trying to understand benthic biology and catchment hydrology; citizens and policy-makers concerned about environmental impacts (e.g. impacts of logging [Beschta, 1978] or snow-fences [Sturges, 1992]); government regulators considering the effectiveness of programs to protect in-stream habitat and downstream waterbodies; and resource managers seeking to restore wetlands.

  17. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling

  18. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.

  19. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
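    At its core, statistical optimization combines a background profile and an observed profile weighted by their error covariance matrices, x_a = x_b + B(B + R)^-1 (y - x_b). The sketch below applies that combination to synthetic profiles with assumed covariances; it is not the OPSv5.6 or the new dynamic algorithm itself, and the bias, noise level, and correlation length are invented.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 60                                     # impact-height levels

      truth = np.exp(-np.linspace(0, 3, n))      # idealized bending-angle profile
      background = 1.05 * truth                  # biased climatological background
      observed = truth + rng.normal(0, 0.02, n)  # noisy observed profile

      # Assumed error covariances: correlated background errors, white obs noise
      z = np.linspace(0, 1, n)
      B = 0.05**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / 0.2)
      R = 0.02**2 * np.eye(n)

      # Statistically optimized profile: x_a = x_b + B (B + R)^-1 (y - x_b)
      gain = B @ np.linalg.solve(B + R, np.eye(n))
      optimized = background + gain @ (observed - background)

      print("rms error, background:", np.sqrt(np.mean((background - truth) ** 2)))
      print("rms error, optimized :", np.sqrt(np.mean((optimized - truth) ** 2)))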

  20. Dynamic statistical optimization of GNSS radio occultation bending angles: an advanced algorithm and its performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-01-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS) based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically-varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAMP and COSMIC measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction in random errors (standard deviations) of optimized bending angles down to about two-thirds of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; (4) produces realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well characterized and high-quality atmospheric profiles over the entire stratosphere.

  1. MicrobiomeAnalyst: a web-based tool for comprehensive statistical, visual and meta-analysis of microbiome data.

    PubMed

    Dhariwal, Achal; Chong, Jasmine; Habib, Salam; King, Irah L; Agellon, Luis B; Xia, Jianguo

    2017-04-26

    The widespread application of next-generation sequencing technologies has revolutionized microbiome research by enabling high-throughput profiling of the genetic contents of microbial communities. How to analyze the resulting large complex datasets remains a key challenge in current microbiome studies. Over the past decade, powerful computational pipelines and robust protocols have been established to enable efficient raw data processing and annotation. The focus has shifted toward downstream statistical analysis and functional interpretation. Here, we introduce MicrobiomeAnalyst, a user-friendly tool that integrates recent progress in statistics and visualization techniques, coupled with novel knowledge bases, to enable comprehensive analysis of common data outputs produced from microbiome studies. MicrobiomeAnalyst contains four modules - the Marker Data Profiling module offers various options for community profiling, comparative analysis and functional prediction based on 16S rRNA marker gene data; the Shotgun Data Profiling module supports exploratory data analysis, functional profiling and metabolic network visualization of shotgun metagenomics or metatranscriptomics data; the Taxon Set Enrichment Analysis module helps interpret taxonomic signatures via enrichment analysis against >300 taxon sets manually curated from literature and public databases; finally, the Projection with Public Data module allows users to visually explore their data with a public reference data for pattern discovery and biological insights. MicrobiomeAnalyst is freely available at http://www.microbiomeanalyst.ca. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. MicrobiomeAnalyst: a web-based tool for comprehensive statistical, visual and meta-analysis of microbiome data

    PubMed Central

    Dhariwal, Achal; Chong, Jasmine; Habib, Salam; King, Irah L.; Agellon, Luis B.

    2017-01-01

    Abstract The widespread application of next-generation sequencing technologies has revolutionized microbiome research by enabling high-throughput profiling of the genetic contents of microbial communities. How to analyze the resulting large complex datasets remains a key challenge in current microbiome studies. Over the past decade, powerful computational pipelines and robust protocols have been established to enable efficient raw data processing and annotation. The focus has shifted toward downstream statistical analysis and functional interpretation. Here, we introduce MicrobiomeAnalyst, a user-friendly tool that integrates recent progress in statistics and visualization techniques, coupled with novel knowledge bases, to enable comprehensive analysis of common data outputs produced from microbiome studies. MicrobiomeAnalyst contains four modules - the Marker Data Profiling module offers various options for community profiling, comparative analysis and functional prediction based on 16S rRNA marker gene data; the Shotgun Data Profiling module supports exploratory data analysis, functional profiling and metabolic network visualization of shotgun metagenomics or metatranscriptomics data; the Taxon Set Enrichment Analysis module helps interpret taxonomic signatures via enrichment analysis against >300 taxon sets manually curated from literature and public databases; finally, the Projection with Public Data module allows users to visually explore their data with a public reference data for pattern discovery and biological insights. MicrobiomeAnalyst is freely available at http://www.microbiomeanalyst.ca. PMID:28449106

  3. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    PubMed

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
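    A simplified sketch of the convolution model follows, with a synthetic arterial input and residue; nonnegative least squares stands in for the piecewise-linear, quadratic-programming estimator described above, and all numbers are illustrative rather than real tracer data.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(7)
      dt = 2.0                                   # seconds per frame
      t = np.arange(0.0, 300.0, dt)
      n = len(t)

      # Synthetic arterial input function and a monotone tissue residue (R(0)=1)
      arterial = (t / 20.0) * np.exp(-t / 20.0)
      flow = 0.5
      residue = np.exp(-t / 80.0)

      # Discretized convolution: tissue ~= dt * Toeplitz(arterial) @ (flow * residue)
      A = np.zeros((n, n))
      for i in range(n):
          A[i, : i + 1] = arterial[i::-1]
      A *= dt
      tissue = A @ (flow * residue) + rng.normal(0.0, 0.01, n)

      # Nonnegative estimate of the flow-scaled residue from the measured curve;
      # its time integral approximates the volume of distribution
      est, _ = nnls(A, tissue)
      print("volume of distribution, true:", dt * np.sum(flow * residue))
      print("volume of distribution, est :", dt * np.sum(est))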

  4. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  5. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  6. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose ...

    EPA Pesticide Factsheets

    Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical landscape encompassing thousands of chemicals with limited data, safety assessment strategies that reliably predict in vivo systemic exposures and subsequent in vivo effects efficiently are a priority. Quantitative in vitro-in vivo extrapolation (QIVIVE) is a methodology that facilitates the explicit and quantitative application of in vitro experimental data and in silico modeling to predict in vivo system behaviors and can be applied to predict chemical toxicokinetics, toxicodynamics and also population variability. Tiered strategies that incorporate sufficient information to reliably inform the relevant decision context will facilitate acceptance of these alternative data streams for safety assessments. This abstract does not necessarily reflect U.S. EPA policy. This talk will provide an update to an international audience on the state of science being conducted within the EPA’s Office of Research and Development to develop and refine approaches that estimate internal chemical concentrations following a given exposure, known as toxicokinetics. Toxicokinetic approaches hold great potential in their ability to link in vitro activities or toxicities identified during high-throughput screen

  7. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  8. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post 2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second or upper stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  9. Improved equilibrium reconstructions by advanced statistical weighting of the internal magnetic measurements.

    PubMed

    Murari, A; Gelfusa, M; Peluso, E; Gaudio, P; Mazon, D; Hawkes, N; Point, G; Alper, B; Eich, T

    2014-12-01

    In a Tokamak the configuration of the magnetic fields remains the key element to improve performance and to maximise the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally in the least-squares minimisation phase of the algorithms, used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method to choose the weights, to be given to the internal measurements of the magnetic fields for improved equilibrium reconstructions, is presented in this paper. The approach is based on various statistical indicators applied to the residuals, the difference between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since the inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe.
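    The residual-based weighting idea can be pictured with a toy linear problem (invented numbers): per-diagnostic weights are derived from the statistics of each diagnostic's residuals after a first fit and then fed back into a weighted least-squares fit, so that a few highly accurate polarimeter-like measurements are no longer swamped by many noisier ones. This is only a schematic, not the paper's equilibrium-reconstruction code.

      import numpy as np

      rng = np.random.default_rng(8)

      # Toy "equilibrium": fit two parameters from two diagnostics of unequal quality
      theta_true = np.array([1.0, -0.5])
      A_mag = rng.normal(size=(40, 2))   # many magnetic-probe measurements
      A_pol = rng.normal(size=(5, 2))    # few, but highly accurate, polarimeter data
      y_mag = A_mag @ theta_true + rng.normal(0, 0.20, 40)
      y_pol = A_pol @ theta_true + rng.normal(0, 0.02, 5)

      A = np.vstack([A_mag, A_pol])
      y = np.concatenate([y_mag, y_pol])

      # Pass 1: unweighted fit, then per-diagnostic weights from residual statistics
      theta0, *_ = np.linalg.lstsq(A, y, rcond=None)
      resid = y - A @ theta0
      w = np.concatenate([np.full(40, 1.0 / np.var(resid[:40])),
                          np.full(5, 1.0 / np.var(resid[40:]))])

      # Pass 2: weighted least squares using the statistically derived weights
      sw = np.sqrt(w)
      theta1, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
      print("unweighted:", theta0, " weighted:", theta1)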

  10. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  11. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose Modeling and Predictive Toxicology (WC10)

    EPA Science Inventory

    Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...

  12. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  13. StatXFinder: a web-based self-directed tool that provides appropriate statistical test selection for biomedical researchers in their scientific studies.

    PubMed

    Suner, Aslı; Karakülah, Gökhan; Koşaner, Özgün; Dicle, Oğuz

    2015-01-01

    The improper use of statistical methods is common in analyzing and interpreting research data in biological and medical sciences. The objective of this study was to develop a decision support tool encompassing the commonly used statistical tests in biomedical research by combining and updating the present decision trees for appropriate statistical test selection. First, the decision trees in textbooks, published articles, and online resources were scrutinized, and a more comprehensive unified one was devised via the integration of 10 distinct decision trees. The questions in the decision steps were also revised by simplifying them and enriching them with examples. Then, our decision tree was implemented into the web environment and the tool titled StatXFinder was developed. Finally, usability and satisfaction questionnaires were applied to the users of the tool, and StatXFinder was reorganized in line with the feedback obtained from these questionnaires. StatXFinder provides users with decision support in the selection of 85 distinct parametric and non-parametric statistical tests by directing 44 different yes-no questions. The accuracy rates of the statistical test recommendations obtained by 36 participants, with the cases applied, were 83.3 % for "difficult" tests and 88.9 % for "easy" tests. The mean system usability score of the tool was found to be 87.43 ± 10.01 (minimum: 70-maximum: 100). A statistically significant difference could not be seen between total system usability score and participants' attributes (p value >0.05). The User Satisfaction Questionnaire showed that 97.2 % of the participants appreciated the tool, and almost all of the participants (35 of 36) would recommend the tool to others. In conclusion, StatXFinder can be utilized as an instructional and guiding tool for biomedical researchers with limited statistics knowledge. StatXFinder is freely available at http://webb.deu.edu.tr/tb/statxfinder.
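    The decision logic of such a selector can be pictured as a chain of yes/no questions; the fragment below is a deliberately tiny, hypothetical subset of such a tree (StatXFinder itself spans 85 tests and 44 questions), intended only to show the mechanism rather than reproduce the tool's rules.

      def suggest_test(outcome_continuous: bool, two_groups: bool,
                       paired: bool, normal: bool) -> str:
          """Toy decision tree for choosing a significance test.
          A drastically simplified, hypothetical subset of a full selector."""
          if not outcome_continuous:
              return "chi-squared test (or Fisher's exact test for small counts)"
          if two_groups:
              if paired:
                  return "paired t-test" if normal else "Wilcoxon signed-rank test"
              return "independent-samples t-test" if normal else "Mann-Whitney U test"
          # three or more groups
          return "one-way ANOVA" if normal else "Kruskal-Wallis test"

      print(suggest_test(outcome_continuous=True, two_groups=True,
                         paired=False, normal=False))   # -> Mann-Whitney U test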

  14. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances.

    PubMed

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, a lot of studies have been conducted in the last years to predict VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview about the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance.
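    A minimal sketch comparing the two families of models named above on synthetic data is given below, reporting the same two metrics used in the survey, the multiple correlation coefficient (R) and the standard error of estimate (SEE); the predictors, coefficients, and hyperparameters are invented for illustration and do not correspond to any published VO2max equation.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(9)
      n = 300

      age = rng.uniform(18, 60, n)
      bmi = rng.normal(24, 3, n)
      run_time = rng.normal(12, 2, n)     # e.g., minutes on a submaximal test
      X = np.column_stack([age, bmi, run_time])
      vo2max = 60 - 0.25 * age - 0.6 * bmi - 1.2 * run_time + rng.normal(0, 3, n)

      for name, model in [
          ("SVR", make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))),
          ("MLR", LinearRegression()),
      ]:
          pred = cross_val_predict(model, X, vo2max, cv=5)
          r = np.corrcoef(vo2max, pred)[0, 1]
          see = np.sqrt(np.mean((vo2max - pred) ** 2))
          print(f"{name}: R = {r:.2f}, SEE = {see:.2f} ml/kg/min")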

  15. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    PubMed Central

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, a lot of studies have been conducted in the last years to predict VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview about the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance. PMID:26346869

  16. Recent progress and challenges in population genetics of polyploid organisms: an overview of current state-of-the-art molecular and statistical tools.

    PubMed

    Dufresne, France; Stift, Marc; Vergilino, Roland; Mable, Barbara K

    2014-01-01

    Despite the importance of polyploidy and the increasing availability of new genomic data, there remain important gaps in our knowledge of polyploid population genetics. These gaps arise from the complex nature of polyploid data (e.g. multiple alleles and loci, mixed inheritance patterns, association between ploidy and mating system variation). Furthermore, many of the standard tools for population genetics that have been developed for diploids are often not feasible for polyploids. This review aims to provide an overview of the state-of-the-art in polyploid population genetics and to identify the main areas where further development of molecular techniques and statistical theory is required. We review commonly used molecular tools (amplified fragment length polymorphism, microsatellites, Sanger sequencing, next-generation sequencing and derived technologies) and their challenges associated with their use in polyploid populations: that is, allele dosage determination, null alleles, difficulty of distinguishing orthologues from paralogues and copy number variation. In addition, we review the approaches that have been used for population genetic analysis in polyploids and their specific problems. These problems are in most cases directly associated with dosage uncertainty and the problem of inferring allele frequencies and assumptions regarding inheritance. This leads us to conclude that for advancing the field of polyploid population genetics, most priority should be given to development of new molecular approaches that allow efficient dosage determination, and to further development of analytical approaches to circumvent dosage uncertainty and to accommodate 'flexible' modes of inheritance. In addition, there is a need for more simulation-based studies that test what kinds of biases could result from both existing and novel approaches. © 2013 John Wiley & Sons Ltd.

  17. The Advanced Dementia Prognostic Tool (ADEPT): A Risk Score to Estimate Survival in Nursing Home Residents with Advanced Dementia

    PubMed Central

    Mitchell, Susan L.; Miller, Susan C.; Teno, Joan M.; Davis, Roger B.; Shaffer, Michele L.

    2010-01-01

    Context Estimating life expectancy is challenging in advanced dementia. Objectives To create a risk score to estimate survival in nursing home (NH) residents with advanced dementia. Methods This was a retrospective cohort study performed in the setting of all licensed US NHs. Residents with advanced dementia living in US NHs in 2002 were identified using Minimum Data Set (MDS) assessments. Mortality data from Medicare files were used to determine 12-month survival. Independent variables were selected from the MDS. Cox proportional hazards regression was used to model survival. The accuracy of the final model was assessed using the area under the receiver operating characteristic curve (AUROC). To develop a risk score, points were assigned to variables in the final model based on parameter estimates. Residents meeting hospice eligibility guidelines for dementia, based on MDS data, were identified. The AUROC assessed the accuracy of hospice guidelines to predict six-month survival. Results Over 12 months, 40.6% of residents with advanced dementia (n=22,405) died. Twelve variables best predicted survival: length of stay, age, male, dyspnea, pressure ulcers, total functional dependence, bedfast, insufficient intake, bowel incontinence, body mass index, weight loss, and congestive heart failure. The AUROC for the final model was 0.68. The risk score ranged from 0–32 points (higher scores indicate worse survival). Only 15.9% of residents met hospice eligibility guidelines for which the AUROC predicting six-month survival was 0.53. Conclusion A mortality risk score derived from MDS data predicted six-month survival in advanced dementia with moderate accuracy. The predictive ability of hospice guidelines, simulated with MDS data, was poor. PMID:20621437
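
    A minimal sketch of the score-building pattern described above (Cox regression, points proportional to the parameter estimates, AUROC for discrimination), using entirely synthetic data and assumed variable names; it also assumes the third-party lifelines package and is not the ADEPT model itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter          # assumed third-party dependency
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age": rng.uniform(65, 100, n),
    "male": rng.integers(0, 2, n),
    "dyspnea": rng.integers(0, 2, n),
    "weight_loss": rng.integers(0, 2, n),
})
# Synthetic survival times (months), censored administratively at 12 months.
hazard = np.exp(0.03 * (df["age"] - 80) + 0.4 * df["male"]
                + 0.5 * df["dyspnea"] + 0.6 * df["weight_loss"])
time = rng.exponential(24.0 / hazard)
df["duration"] = np.minimum(time, 12.0)
df["died"] = (time <= 12.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="died")

# Assign integer points proportional to the Cox coefficients, then check
# discrimination of the resulting score against 12-month mortality.
points = (cph.params_ / cph.params_.abs().min()).round()
score = df[points.index].mul(points, axis=1).sum(axis=1)
print(points)
print("AUROC:", round(roc_auc_score(df["died"], score), 2))
```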

  18. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    PubMed

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.
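
    A small illustrative sketch of the ANOVA step named above, applied to a synthetic factorial layout with hypothetical factor names (electron donor, bioaugmentation, nutrient addition) and a made-up dechlorination response; it is not the study's dataset or design.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
donors = ["EVO", "lactate", "methanol", "whey"]
rows = []
for donor in donors:
    for bioaug in (0, 1):
        for nutrients in (0, 1):
            for rep in range(4):
                effect = {"EVO": 2.0, "lactate": 1.6, "methanol": 1.5, "whey": 0.8}[donor]
                ethene = effect + 0.4 * bioaug + 0.2 * nutrients + rng.normal(0, 0.3)
                rows.append({"donor": donor, "bioaug": bioaug,
                             "nutrients": nutrients, "ethene_mM": ethene})
df = pd.DataFrame(rows)

# Main-effects ANOVA on the synthetic ethene yield.
model = smf.ols("ethene_mM ~ C(donor) + C(bioaug) + C(nutrients)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```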

  19. Advancing Our Understanding of the Link between Statistical Learning and Language Acquisition: The Need for Longitudinal Data.

    PubMed

    Arciuli, Joanne; Torkildsen, Janne von Koss

    2012-01-01

    Mastery of language can be a struggle for some children. Amongst those that succeed in achieving this feat there is variability in proficiency. Cognitive scientists remain intrigued by this variation. A now substantial body of research suggests that language acquisition is underpinned by a child's capacity for statistical learning (SL). Moreover, a growing body of research has demonstrated that variability in SL is associated with variability in language proficiency. Yet, there is a striking lack of longitudinal data. To date, there has been no comprehensive investigation of whether a capacity for SL in young children is, in fact, associated with language proficiency in subsequent years. Here we review key studies that have led to the need for this longitudinal research. Advancing the language acquisition debate via longitudinal research has the potential to transform our understanding of typical development as well as disorders such as autism, specific language impairment, and dyslexia.

  20. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Using computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which were available to us after the models were developed. The cross-validation results of training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity of the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each two compounds. The nearest neighbors for each compound within the set were then identified and its ERα binding potential was predicted by its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR
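
    A hedged sketch of the read-across idea sketched above: predict binding from the most biosimilar neighbors in a bioassay-response profile matrix and report the correct classification rate (CCR). The profiles and class sizes here are synthetic, and a generic k-nearest-neighbors classifier stands in for the paper's hybrid model.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import recall_score

rng = np.random.default_rng(3)
n_compounds, n_assays = 518, 200          # roughly the balanced training-set size
y = np.array([1] * 259 + [0] * 259)       # 259 binders, 259 non-binders
# Synthetic bioassay profiles: binders tend to respond in a shared assay subset.
profiles = rng.random((n_compounds, n_assays)) < 0.05
profiles[y == 1, :20] |= rng.random((259, 20)) < 0.4

knn = KNeighborsClassifier(n_neighbors=5, metric="jaccard", algorithm="brute")
pred = cross_val_predict(knn, profiles, y, cv=5)
ccr = 0.5 * (recall_score(y, pred) + recall_score(y, pred, pos_label=0))
print(f"cross-validated CCR = {ccr:.2f}")
```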

  1. PREDICT: a diagnostic accuracy study of a tool for predicting mortality within one year: who should have an advance healthcare directive?

    PubMed

    Richardson, Philip; Greenslade, Jaimi; Shanmugathasan, Sulochana; Doucet, Katherine; Widdicombe, Neil; Chu, Kevin; Brown, Anthony

    2015-01-01

    CARING is a screening tool developed to identify patients who have a high likelihood of death in 1 year. This study sought to validate a modified CARING tool (termed PREDICT) using a population of patients presenting to the Emergency Department. In total, 1000 patients aged over 55 years who were admitted to hospital via the Emergency Department between January and June 2009 were eligible for inclusion in this study. Data on the six prognostic indicators comprising PREDICT were obtained retrospectively from patient records. One-year mortality data were obtained from the State Death Registry. Weights were applied to each PREDICT criterion, and its final score ranged from 0 to 44. Receiver operator characteristic analyses and diagnostic accuracy statistics were used to assess the accuracy of PREDICT in identifying 1-year mortality. The sample comprised 976 patients with a median (interquartile range) age of 71 years (62-81 years) and a 1-year mortality of 23.4%. In total, 50% had ≥1 PREDICT criteria with a 1-year mortality of 40.4%. Receiver operator characteristic analysis gave an area under the curve of 0.86 (95% confidence interval: 0.83-0.89). Using a cut-off of 13 points, PREDICT had a 95.3% (95% confidence interval: 93.6-96.6) specificity and 53.9% (95% confidence interval: 47.5-60.3) sensitivity for predicting 1-year mortality. PREDICT was simpler than the CARING criteria and identified 158 patients per 1000 admitted who could benefit from advance care planning. PREDICT was successfully applied to the Australian healthcare system with findings similar to the original CARING study conducted in the United States. This tool could improve end-of-life care by identifying who should have advance care planning or an advance healthcare directive. © The Author(s) 2014.
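
    As a small illustration of the diagnostic-accuracy summary used above, this sketch computes an AUROC over a score range and sensitivity/specificity at a cutoff of 13 points, using synthetic scores and outcomes rather than the PREDICT data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
died = rng.random(976) < 0.234                  # ~23.4% one-year mortality
score = np.clip(rng.normal(8, 6, 976) + 10 * died, 0, 44)  # hypothetical scores

cutoff = 13
high_risk = score >= cutoff
sens = (high_risk & died).sum() / died.sum()
spec = (~high_risk & ~died).sum() / (~died).sum()
print(f"AUROC = {roc_auc_score(died, score):.2f}, "
      f"sensitivity = {sens:.1%}, specificity = {spec:.1%} at cutoff {cutoff}")
```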

  2. Advanced Differential Radar Interferometry (A-DInSAR) as integrative tool for a structural geological analysis

    NASA Astrophysics Data System (ADS)

    Crippa, B.; Calcagni, L.; Rossi, G.; Sternai, P.

    2009-04-01

    Advanced Differential SAR interferometry (A-DInSAR) is a technique for monitoring surface deformations over large areas using a stack of interferograms generated from several complex SLC SAR images acquired over the same target area at different times. This work describes the results of a procedure to calculate terrain motion velocity on highly correlated pixels (E. Biescas, M. Crosetto, M. Agudo, O. Monserrat and B. Crippa: Two Radar Interferometric Approaches to Monitor Slow and Fast Land Deformation, 2007) in two areas, Gemona (Friuli, Northern Italy) and Pollino (Calabria, Southern Italy), and presents some considerations based on successful examples of the analysis. The pixels whose displacement velocity is calculated are selected according to the dispersion index value (DA) or to coherence values along the stack of interferograms. The A-DInSAR technique yields highly reliable values of vertical displacement velocity. These values concern the movement of surfaces of at least about 80 m² at the maximum resolution, and the minimum velocity that can be resolved is of the order of mm/yr. Because of the high versatility of the technique, the large size of the area that can be analyzed (about 10,000 km²), and the precision and reliability of the results, we consider that radar interferometry can provide important information about the structural context of the studied area that would otherwise be very difficult to recognize. We therefore propose radar interferometry as a valid investigation tool whose results should be considered an important complement to the data collected in fieldwork.
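
    A minimal sketch of the pixel-selection criterion mentioned above, computing an amplitude dispersion index DA = sigma_amp / mean_amp over a synthetic amplitude stack and keeping pixels below an illustrative threshold; the stack, sizes, and threshold are all assumed.

```python
import numpy as np

rng = np.random.default_rng(5)
n_images, ny, nx = 30, 100, 100
amplitude = rng.gamma(shape=4.0, scale=1.0, size=(n_images, ny, nx))
# Make a small patch of pixels artificially stable (low amplitude dispersion).
amplitude[:, 10:15, 10:15] = 8.0 + rng.normal(0, 0.2, (n_images, 5, 5))

da = amplitude.std(axis=0) / amplitude.mean(axis=0)   # dispersion index per pixel
candidate_pixels = np.argwhere(da < 0.25)             # illustrative DA threshold
print(f"{len(candidate_pixels)} candidate pixels selected out of {ny * nx}")
```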

  3. Bayesian statistics as a new tool for spectral analysis - I. Application for the determination of basic parameters of massive stars

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2015-11-01

    Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
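
    A hedged, toy illustration of the grid-based Bayesian idea described above: the likelihoods of all available lines are combined into one posterior over (Teff, log g). The line "model" here is a simple synthetic stand-in, not a real stellar atmosphere grid, and flat priors are assumed.

```python
import numpy as np

rng = np.random.default_rng(6)
teff_grid = np.linspace(15000, 30000, 151)   # K
logg_grid = np.linspace(3.0, 4.5, 76)        # dex (cgs)
TT, GG = np.meshgrid(teff_grid, logg_grid, indexing="ij")

true_teff, true_logg, sigma = 22000.0, 3.9, 0.05
n_lines = 40
# Toy "observed" line measurements with Gaussian noise.
line_sens = rng.uniform(0.5, 1.5, n_lines)
obs = line_sens * (true_teff / 2e4 + true_logg / 4) + rng.normal(0, sigma, n_lines)

log_post = np.zeros_like(TT)
for s, w in zip(line_sens, obs):
    model = s * (TT / 2e4 + GG / 4)
    log_post += -0.5 * ((w - model) / sigma) ** 2   # flat priors assumed
post = np.exp(log_post - log_post.max())
i, j = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: Teff ~ {teff_grid[i]:.0f} K, log g ~ {logg_grid[j]:.2f}")
```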

  4. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
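
    A small sketch of the final statistics step described above (per-protein t-tests with multiple-testing correction), using synthetic log2 abundance ratios and assumed column layout rather than Condenser's actual output format.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
n_proteins, n_reps = 500, 4
log2_ratios = rng.normal(0, 0.5, (n_proteins, n_reps))
log2_ratios[:25] += 1.5                      # 25 truly regulated proteins

# H0: mean log2 ratio is 0 (no change between conditions).
t, p = stats.ttest_1samp(log2_ratios, popmean=0.0, axis=1)
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} proteins significant after Benjamini-Hochberg correction")
```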

  5. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote the interest of the public in these areas. These educational tools can get benefit from new advanced computer animation software and 3D technologies, as these allow making these documentaries even more attractive. However, special care must be taken in order to guarantee that the information contained in them is serious and objective. In this sense, an additional value is given when the footage is produced by the own researchers. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been entirely developed by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are exposed here.

  6. Capability index--a statistical process control tool to aid in udder health control in dairy herds.

    PubMed

    Niza-Ribeiro, J; Noordhuizen, J P T M; Menezes, J C

    2004-08-01

    Bulk milk somatic cell count (BMSCC) averages have been used to evaluate udder health at both the individual and the herd level, as well as milk quality and hygiene. The authors show that the BMSCC average is not the best tool for udder health control programs and that it can be replaced with advantage by the capability index (Cpk). The Cpk is a statistical process control tool traditionally used by engineers to validate, monitor, and predict the expected behavior of processes or machines. BMSCC data from 13 consecutive months of production from 414 dairy herds, as well as SCC from all cows in the DHI program from 264 herds in the same period, were collected. The Cpk and the annual BMSCC average (AAVG) of all the herds were calculated. Comparing the herd performance described by the Cpk and the AAVG against the European Union (EU) official limit for BMSCC of 400,000 cells/mL, it was found that the Cpk accurately classified the compliance of the 414 farms, whereas the AAVG misclassified 166 (40%) of the 414 selected farms. The annual prevalence of subclinical mastitis (SMP) of each herd was calculated with individual SCC data from the same 13-month period. Cows with more than 200,000 SCC/mL were considered to have subclinical mastitis. A logistic regression model relating the Cpk to the herd's subclinical mastitis prevalence was calculated: SMPe = 0.475 × e^(−0.5286 × Cpk). The model was validated by evaluating the relationship between the observed SMP and the predicted SMPe in terms of the linear correlation coefficient (R²) and the mean difference between SMP and SMPe (i.e., the mean square error of prediction). The validation suggests that the model can be used to estimate a herd's SMP from its Cpk. Because the Cpk equation relates the herd's BMSCC to the EU official SCC limit, the model enables the adoption of critical limits for subclinical mastitis that take the legal standard for SCC into consideration.
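
    A short sketch of the calculation implied above: a one-sided capability index against the EU upper specification limit of 400,000 cells/mL, followed by the prevalence model SMPe = 0.475 × e^(−0.5286 × Cpk) quoted in the abstract. The monthly BMSCC values below are invented for illustration.

```python
import numpy as np

bmscc = np.array([180, 220, 260, 310, 240, 200, 280, 330, 250, 210,
                  230, 270, 290], dtype=float) * 1000   # 13 monthly BMSCC values (cells/mL)

usl = 400_000                                 # EU upper limit (cells/mL)
mean, sd = bmscc.mean(), bmscc.std(ddof=1)
cpk = (usl - mean) / (3 * sd)                 # one-sided Cpk against the upper limit
smp_estimate = 0.475 * np.exp(-0.5286 * cpk)  # prevalence model from the abstract
print(f"Cpk = {cpk:.2f}, estimated subclinical mastitis prevalence = {smp_estimate:.1%}")
```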

  7. Tracking the route to sustainability: a service evaluation tool for an advance care planning model developed for community palliative care services.

    PubMed

    Blackford, Jeanine; Street, Annette

    2012-08-01

    The study aim was to develop a service evaluation tool for an advance care planning model implemented in community palliative care. Internationally, advance care planning programmes usually measure success by completion rate of advance directives or plans. This outcome measure provides little information to assist nurse managers to embed advance care planning into usual care and measure their performance and quality over time. An evaluation tool was developed to address this need in Australian community palliative care services. Multisite action research approach. Three community palliative care services located in Victoria, Australia, participated. Qualitative and quantitative data collection strategies were used to develop the Advance Care Planning-Service Evaluation Tool. The Advance Care Planning-Service Evaluation Tool identified advance care planning progress over time across three stages of Establishment, Consolidation and Sustainability within previously established Model domains of governance, documentation, practice, education, quality improvement and community engagement. The tool was used by nurses either as a peer-assessment or self-assessment tool that assisted services to track their implementation progress as well as plan further change strategies. The Advance Care Planning-Service Evaluation Tool was useful to nurse managers in community palliative care. It provided a clear outline of service progress, level of achievement and provided clear direction for planning future changes. The Advance Care Planning-Service Evaluation Tool enables nurses in community palliative care to monitor, evaluate and plan quality improvement of their advance care planning model to improve end-of-life care. As the tool describes generic healthcare processes, there is potential transferability of the tool to other types of services. © 2012 Blackwell Publishing Ltd.

  8. Next generation human skin constructs as advanced tools for drug development.

    PubMed

    Abaci, H E; Guo, Zongyou; Doucet, Yanne; Jacków, Joanna; Christiano, Angela

    2017-01-01

    Many diseases, as well as side effects of drugs, manifest themselves through skin symptoms. Skin is a complex tissue that hosts various specialized cell types and performs many roles including physical barrier, immune and sensory functions. Therefore, modeling skin in vitro presents technical challenges for tissue engineering. Since the first attempts at engineering human epidermis in 1970s, there has been a growing interest in generating full-thickness skin constructs mimicking physiological functions by incorporating various skin components, such as vasculature and melanocytes for pigmentation. Development of biomimetic in vitro human skin models with these physiological functions provides a new tool for drug discovery, disease modeling, regenerative medicine and basic research for skin biology. This goal, however, has long been delayed by the limited availability of different cell types, the challenges in establishing co-culture conditions, and the ability to recapitulate the 3D anatomy of the skin. Recent breakthroughs in induced pluripotent stem cell (iPSC) technology and microfabrication techniques such as 3D-printing have allowed for building more reliable and complex in vitro skin models for pharmaceutical screening. In this review, we focus on the current developments and prevailing challenges in generating skin constructs with vasculature, skin appendages such as hair follicles, pigmentation, immune response, innervation, and hypodermis. Furthermore, we discuss the promising advances that iPSC technology offers in order to generate in vitro models of genetic skin diseases, such as epidermolysis bullosa and psoriasis. We also discuss how future integration of the next generation human skin constructs onto microfluidic platforms along with other tissues could revolutionize the early stages of drug development by creating reliable evaluation of patient-specific effects of pharmaceutical agents. Impact statement Skin is a complex tissue that hosts various

  9. Design and optimization of disintegrating pellets of MCC by non-aqueous extrusion process using statistical tools.

    PubMed

    Gurram, Rajesh Kumar; Gandra, Suchithra; Shastri, Nalini R

    2016-03-10

    The objective of the study was to design and optimize a disintegrating pellet formulation of microcrystalline cellulose by non-aqueous extrusion process for a water sensitive drug using various statistical tools. Aspirin was used as a model drug. Disintegrating matrix pellets of aspirin using propylene glycol as a non-aqueous granulation liquid and croscarmellose as a disintegrant was developed. Plackett-Burman design was initially conducted to screen and identify the significant factors. Final optimization of formula was performed by response surface methodology using a central composite design. The critical attributes of the pellet dosage forms (dependent variables); disintegration time, sphericity and yield were predicted with adequate accuracy based on the regression model. Pareto charts and contour charts were studied to understand the influence of factors and predict the responses. A design space was constructed to meet the desirable targets of the responses in terms of disintegration time <5min, maximum yield, sphericity >0.95 and friability <1.7%. The optimized matrix pellets were enteric coated using Eudragit L 100. The drug release from the enteric coated pellets after 30min in the basic media was ~93% when compared to ~77% from the marketed pellets. The delayed release pellets stored at 25°C/60% RH were stable for a period of 10mo. In conclusion, it can be stated that the developed process for disintegrating pellets using non-aqueous granulating agents can be used as an alternative technique for various water sensitive drugs, circumventing the application of volatile organic solvents in conventional drug layering on inert cores. The scope of this study can be further extended to hydrophobic drugs, which may benefit from the rapid disintegration property and the use of various hydrophilic excipients used in the optimized pellet formulation to enhance dissolution and in turn improve bioavailability. Copyright © 2016 Elsevier B.V. All rights reserved.
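
    A hedged sketch of the response-surface step described above: a quadratic model of disintegration time as a function of two factors, fitted to synthetic data with hypothetical factor names; it is not the study's design matrix or fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 30
x1 = rng.uniform(2, 10, n)      # % disintegrant level (hypothetical factor)
x2 = rng.uniform(20, 40, n)     # % granulation liquid level (hypothetical factor)
dt = (8 - 0.6 * x1 + 0.05 * x1**2 - 0.1 * x2 + 0.002 * x2**2
      + rng.normal(0, 0.3, n))  # synthetic disintegration time (min)

df = pd.DataFrame({"dt": dt, "x1": x1, "x2": x2})
model = smf.ols("dt ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
pred = model.predict(pd.DataFrame({"x1": [6.0], "x2": [30.0]}))
print(model.params.round(3))
print("predicted disintegration time at x1=6%, x2=30%:",
      round(float(np.asarray(pred)[0]), 2), "min")
```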

  10. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  11. Incorporation of a community pharmacy-based heart failure assessment tool by advanced pharmacy practice experience students.

    PubMed

    Kelling, Sarah E; Walker, Paul C; Mason, James G; Zara, Nadir; Bleske, Barry E

    The objective was to incorporate a published clinical tool related to heart failure (HF) assessment into advanced pharmacy practice experiences in the community pharmacy setting, to provide a meaningful and innovative learning experience for students. Sixteen community pharmacies (8 chain and 8 independent) that served as advanced pharmacy practice experience locations participated as rotation sites and in data collection. This was the first study in which pharmacy students used The One-Minute Clinic for Heart Failure (TOM-C HF) tool to assess HF within the community pharmacy setting. Trained student pharmacists identified patients who might have heart failure by evaluating medication dispensing records, interviewed the patients using the TOM-C HF tool, and made interventions as clinically appropriate. Outcome measures included the number of students using the TOM-C HF tool, the number and types of interventions made, and student perceptions of the educational and professional value of the patient interaction. Thirty-three of 83 (40%) students completed 63 patient assessments. Thirty-five percent of patients (22/63) were candidates for an intervention. Interventions were performed in 9 of 22 patients (41%). More than 65% of students found the patient interaction to have educational and professional value. Students were able to assess HF patients and make interventions in a community pharmacy setting, and the majority of students perceived some value in these assessments. The incorporation of a clinical tool in the community setting, driven by fourth-year pharmacy students, was shown to be feasible and to provide a novel advanced practice experience. In addition, it may be expandable to the services offered at community pharmacies. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. Severe Impairment Rating Scale: A Useful and Brief Cognitive Assessment Tool for Advanced Dementia for Nursing Home Residents.

    PubMed

    Yeo, Cindy; Lim, Wee Shiong; Chan, Mark; Ho, Xin Qin; Anthony, Philomena Vasantha; Han, Huey Charn; Chong, Mei Sian

    2016-02-01

    To investigate the utility of the Severe Impairment Rating Scale (SIRS) as a cognitive assessment tool among nursing home residents with advanced dementia, we conducted a cross-sectional study of 96 residents in 3 nursing homes with Functional Assessment Staging Test (FAST) stage 6a and above. We compared the discriminatory ability of SIRS with the Chinese version of Mini-Mental State Examination, Abbreviated Mental Test, and Clock Drawing Test. Among the cognitive tests, SIRS showed the least "floor" effect and had the best capacity to distinguish very severe (FAST stages 7d-f) dementia (area under the curve 0.80 vs 0.46-0.76 for the other tools). The SIRS had the best correlation with FAST staging (r = -.59, P < .01) and, unlike the other 3 tools, exhibited only minimal change in correlation when adjusted for education and ethnicity. Our results support the utility of SIRS as a brief cognitive assessment tool for advanced dementia in the nursing home setting.

  13. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently treated empirically for the Space Shuttle External Tank (ET) and is handled by simulated service testing of pre-cracked panels.

  14. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  15. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about (1) Framework for multi-body dynamics - Geometry Manipulation Protocol (GMP), (2) Simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2 (3) Further recent developments in Chimera Grid Tools OVERGRID, Grid modules, Script library and (4) Future work.

  18. Characterization and detection of Vero cells infected with Herpes Simplex Virus type 1 using Raman spectroscopy and advanced statistical methods.

    PubMed

    Salman, A; Shufan, E; Zeiri, L; Huleihel, M

    2014-07-01

    Herpes viruses are involved in a variety of human disorders. Herpes Simplex Virus type 1 (HSV-1) is the most common among the herpes viruses and is primarily involved in human cutaneous disorders. Although the symptoms of infection by this virus are usually minimal, in some cases HSV-1 might cause serious infections in the eyes and the brain leading to blindness and even death. A drug, acyclovir, is available to counter this virus. The drug is most effective when used during the early stages of the infection, which makes early detection and identification of these viral infections highly important for successful treatment. In the present study we evaluated the potential of Raman spectroscopy as a sensitive, rapid, and reliable method for the detection and identification of HSV-1 viral infections in cell cultures. Using Raman spectroscopy followed by advanced statistical methods enabled us, with sensitivity approaching 100%, to differentiate between a control group of Vero cells and another group of Vero cells that had been infected with HSV-1. Cell sites that were "rich in membrane" gave the best results in the differentiation between the two categories. The major changes were observed in the 1195-1726 cm(-1) range of the Raman spectrum. The features in this range are attributed mainly to proteins, lipids, and nucleic acids.
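
    A minimal sketch of the analysis pipeline named above, PCA for dimensionality reduction followed by linear discriminant analysis to separate control from infected spectra, applied here to synthetic "spectra" rather than the study's Raman measurements.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
n_per_class, n_channels = 60, 500            # 500 synthetic "wavenumber" channels
control = rng.normal(0, 1, (n_per_class, n_channels))
infected = rng.normal(0, 1, (n_per_class, n_channels))
infected[:, 150:300] += 0.5                  # band changes in an assumed spectral region
X = np.vstack([control, infected])
y = np.array([0] * n_per_class + [1] * n_per_class)

clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {acc.mean():.2f}")
```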

  19. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of a great importance; it could save large economical losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides and thus prevent considerable environmental pollution. In this study, 18 isolates of three different fungi genera were investigated; six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungi samples on the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods: principal component analysis (PCA), linear discriminant analysis (LDA), and k-means were applied to the spectra after manipulation. Our results showed significant spectral differences between the various fungi genera examined. The use of k-means enabled classification between the genera with a 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA has achieved a 99.7% success rate. However, on the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
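
    A sketch of the unsupervised step mentioned above: k-means with k = 3 on synthetic spectra standing in for the FTIR-ATR data, followed by a simple best-match accuracy against the assumed genus labels; the spectra and class structure here are invented.

```python
import numpy as np
from itertools import permutations
from sklearn.cluster import KMeans

rng = np.random.default_rng(10)
n_per_genus, n_channels = 30, 400
spectra, labels = [], []
for g, shift in enumerate([0.0, 0.6, 1.2]):          # three synthetic genera
    block = rng.normal(0, 1, (n_per_genus, n_channels))
    block[:, 50:120] += shift                         # genus-specific band region
    spectra.append(block)
    labels += [g] * n_per_genus
X, y = np.vstack(spectra), np.array(labels)

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
# Match cluster labels to genus labels by trying all permutations.
best = max(np.mean(np.array([p[c] for c in clusters]) == y)
           for p in permutations(range(3)))
print(f"best label-matched k-means accuracy: {best:.2f}")
```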

  20. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  2. Statistical tools for dose individualization of mycophenolic acid and tacrolimus co-administered during the first month after renal transplantation.

    PubMed

    Musuamba, Flora T; Mourad, Michel; Haufroid, Vincent; De Meyer, Martine; Capron, Arnaud; Delattre, Isabelle K; Verbeeck, Roger K; Wallemacq, Pierre

    2013-05-01

    To predict simultaneously the area under the concentration-time curve during one dosing interval [AUC(0,12 h)] for mycophenolic acid (MPA) and tacrolimus (TAC), when concomitantly used during the first month after transplantation, based on common blood samples. Data were from two different sources: real patient pharmacokinetic (PK) profiles from 65 renal transplant recipients, and 9000 PK profiles simulated from previously published models of MPA or TAC in the first month after transplantation. Multiple linear regression (MLR) and Bayesian estimation using optimal samples were performed to predict MPA and TAC AUC(0,12 h) based on two concentrations. The following models were retained: AUC(0,12 h) = 16.5 + 4.9 × C1.5 + 6.7 × C3.5 (r² = 0.82, rRMSE = 9%, with simulations and r² = 0.66, rRMSE = 24%, with observed data) and AUC(0,12 h) = 24.3 + 5.9 × C1.5 + 12.2 × C3.5 (r² = 0.94, rRMSE = 12.3%, with simulations and r² = 0.74, rRMSE = 15%, with observed data) for MPA and TAC, respectively. In addition, Bayesian estimators were developed, incorporating parameter values from the final models and the concentrations at 1.5 and 3.5 h after dose. Good agreement was found between predicted and reference AUC(0,12 h) values: r² = 0.90, rRMSE = 13% and r² = 0.97, rRMSE = 5% with simulations for MPA and TAC, respectively, and r² = 0.75, rRMSE = 11% and r² = 0.83, rRMSE = 7% with observed data for MPA and TAC, respectively. Statistical tools were developed for simultaneous MPA and TAC therapeutic drug monitoring. They can be incorporated into computer programs for patient dose individualization. © 2012 The Authors. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.
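
    The two limited-sampling equations quoted above translate directly into code; the sketch below is a plain transcription of those equations, with input concentration values that are purely illustrative and not from the study.

```python
def mpa_auc_0_12h(c1_5: float, c3_5: float) -> float:
    """Predicted mycophenolic acid AUC(0,12 h) from concentrations at 1.5 h and 3.5 h."""
    return 16.5 + 4.9 * c1_5 + 6.7 * c3_5

def tac_auc_0_12h(c1_5: float, c3_5: float) -> float:
    """Predicted tacrolimus AUC(0,12 h) from concentrations at 1.5 h and 3.5 h."""
    return 24.3 + 5.9 * c1_5 + 12.2 * c3_5

# Illustrative input values only (not study data):
print(mpa_auc_0_12h(3.2, 2.1), tac_auc_0_12h(12.0, 9.5))
```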

  3. Small Payloads and Advanced Concepts for Exploration (SPACE) Early Mission Design Tool

    NASA Astrophysics Data System (ADS)

    Leiter, R. M.; Himwich, Z.; Natarajan, A.; Rosenthall, J.; Clark, P. E.

    2014-10-01

    This tool allows scientists and engineers in early phases of mission development to select a compact instrument for a science application and obtain initial estimates of mass, power, volume, and bandwidth for a specific nanosatellite mission.

  4. CRISPR/Cas9: an advanced tool for editing plant genomes.

    PubMed

    Samanta, Milan Kumar; Dey, Avishek; Gayen, Srimonta

    2016-10-01

    To meet current challenges in agriculture, genome editing using sequence-specific nucleases (SSNs) is a powerful tool for basic and applied plant biology research. Here, we describe the principle and application of available genome editing tools, including zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs) and the clustered regularly interspaced short palindromic repeat associated CRISPR/Cas9 system. Among these SSNs, CRISPR/Cas9 is the most recently characterized and rapidly developing genome editing technology, and has been successfully utilized in a wide variety of organisms. This review specifically illustrates the power of CRISPR/Cas9 as a tool for plant genome engineering, and describes the strengths and weaknesses of the CRISPR/Cas9 technology compared to two well-established genome editing tools, ZFNs and TALENs.

  5. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  6. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the sub-20 nm technology node. To meet tight defect specifications, many mask shops select repair tools according to defect type. Typically, pattern defects are repaired with an e-beam repair tool, and soft defects such as particles are repaired with a nanomachining tool. An e-beam repair tool has difficulty removing particle defects because it relies on a chemical reaction between a gas and the electron beam, while a nanomachining tool, which relies on physical interaction between a nano-tip and the defect, cannot be applied to repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film adheres strongly to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film becomes progressively weaker because of the energy accumulated when masks are exposed in a scanner, due to the different coefficients of thermal expansion of the materials. Therefore, whenever a mask needs a re-pellicle process, every deposited repair point has to be inspected to confirm whether the deposited film is damaged, and if a deposition point is damaged, the repair process must be performed again. This makes the overall process longer and more complex. In this paper, the basic theory and principle of recovering clear defects with a nanomachining tool are introduced, and the evaluated results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. The results obtained with nanomachining are also compared with those obtained with an e-beam repair tool, including the cleaning durability evaluated by an accumulated cleaning process. In addition, we discuss the phase shift issue and a solution for the image placement error caused by phase error.

  7. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence that is typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thusly estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent
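
    A hedged sketch of the modelling approach described above: samples below a noise threshold are treated as pure air, the remaining fraction gives an intermittency estimate, and a four-parameter beta distribution is fitted to the turbulent part. The samples and threshold below are synthetic stand-ins for the Rayleigh-scattering data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
noise_threshold = 0.02                    # assumed measurement-noise threshold
# Synthetic mass-fraction samples at one radial location: a noise spike near zero
# plus a turbulent-mixing distribution.
pure_air = np.abs(rng.normal(0, 0.008, 3000))
turbulent = rng.beta(2.0, 5.0, 7000) * 0.6
samples = np.concatenate([pure_air, turbulent])

is_turbulent = samples > noise_threshold
intermittency = is_turbulent.mean()
a, b, loc, scale = stats.beta.fit(samples[is_turbulent])   # four-parameter beta fit
mean_model = stats.beta.mean(a, b, loc=loc, scale=scale)
print(f"intermittency = {intermittency:.2f}, "
      f"beta fit: a = {a:.2f}, b = {b:.2f}, modelled mean = {mean_model:.3f}")
```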

  8. Exposure to Alcoholism in the Family: United States, 1988. Advance Data from Vital and Health Statistics of the National Center for Health Statistics. Number 205.

    ERIC Educational Resources Information Center

    Schoenborn, Charlotte A.

    This report is based on data from the 1988 National Health Interview Survey on Alcohol (NHIS-Alcohol), part of the ongoing National Health Interview Survey conducted by the National Center for Health Statistics. Interviews for the NHIS are conducted in person by staff of the United States Bureau of the Census. Information is collected on each…

  9. Potential for MERLIN-Expo, an advanced tool for higher tier exposure assessment, within the EU chemical legislative frameworks.

    PubMed

    Suciu, Nicoleta; Tediosi, Alice; Ciffroy, Philippe; Altenpohl, Annette; Brochot, Céline; Verdonck, Frederik; Ferrari, Federico; Giubilato, Elisa; Capri, Ettore; Fait, Gabriella

    2016-08-15

    MERLIN-Expo merges and integrates advanced exposure assessment methodologies, allowing the building of complex scenarios involving several pollution sources and targets. The assessment of exposure and risks to human health from chemicals is of major concern for policy and ultimately benefits all citizens. The development and operational fusion of the advanced exposure assessment methodologies envisaged in the MERLIN-Expo tool will have a significant long-term impact on several policies dealing with chemical safety management. More than 30 agencies in Europe are involved in the exposure and risk evaluation of chemicals; they play an important role in implementing EU policies, with tasks of a mainly technical, scientific, operational and/or regulatory nature. The main purpose of the present paper is to introduce MERLIN-Expo and to highlight its potential for being effectively integrated within the group of tools available to assess the risk and exposure of chemicals for EU policy. The main results show that the tool is highly suitable for use in site-specific or local impact assessment; with minor modifications it can also be used for Plant Protection Products (PPPs), biocides and REACH, while major additions would be required for comprehensive application in the field of consumer and worker exposure assessment. Copyright © 2016. Published by Elsevier B.V.

  10. Implementation of a professional portfolio: a tool to demonstrate professional development for advanced practice.

    PubMed

    Chamblee, Tracy B; Dale, Juanita Conkin; Drews, Barbie; Spahis, Joanna; Hardin, Teri

    2015-01-01

    The literature has a gap related to professional development for APRNs. In the United States, many health care organizations use clinical advancement programs for registered nurses, but APRNs are not often included in these programs. If APRNs are included, advancement opportunities are very limited. At CMC, implementation of a professional portfolio resulted in increased satisfaction among APPs regarding their ability to showcase professional growth and expertise, as well as the uniqueness of their advanced practice. Use of the professional portfolio led to improved recognition by APS and organizational leaders of APP performance excellence during the annual performance evaluation, as well as improved recognition among APP colleagues in terms of nominations for honors and awards.

  11. New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools

    NASA Astrophysics Data System (ADS)

    Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo

    1999-09-01

    As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. At present, the BMC traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be fully evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility. It has also resulted in reduced cycle time for new equipment introduction.

  12. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  13. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  14. Genetic tools for advancement of Synechococcus sp. PCC 7002 as a cyanobacterial chassis

    DOE PAGES

    Ruffing, Anne M.; Jensen, Travis J.; Strickland, Lucas M.

    2016-11-10

    Successful implementation of modified cyanobacteria as hosts for industrial applications requires the development of a cyanobacterial chassis. The cyanobacterium Synechococcus sp. PCC 7002 embodies key attributes for an industrial host, including a fast growth rate and high salt, light, and temperature tolerances. Here, this study addresses key limitations in the advancement of Synechococcus sp. PCC 7002 as an industrial chassis.

  15. Prototype Tool and Focus Group Evaluation for an Advanced Trajectory-Based Operations Concept

    NASA Technical Reports Server (NTRS)

    Guerreiro, Nelson M.; Jones, Denise R.; Barmore, Bryan E.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Ahmad, Nash'at N.

    2017-01-01

    Trajectory-based operations (TBO) is a key concept in the Next Generation Air Transportation System transformation of the National Airspace System (NAS) that will increase the predictability and stability of traffic flows, support a common operational picture through the use of digital data sharing, facilitate more effective collaborative decision making between airspace users and air navigation service providers, and enable increased levels of integrated automation across the NAS. NASA has been developing trajectory-based systems to improve the efficiency of the NAS during specific phases of flight and is now also exploring Advanced 4-Dimensional Trajectory (4DT) operational concepts that will integrate these technologies and incorporate new technology where needed to create both automation and procedures to support gate-to-gate TBO. A TBO Prototype simulation toolkit has been developed that demonstrates initial functionality of an Advanced 4DT TBO concept. Pilot and controller subject matter experts (SMEs) were brought to the Air Traffic Operations Laboratory at NASA Langley Research Center for discussions on an Advanced 4DT operational concept and were provided an interactive demonstration of the TBO Prototype using four example scenarios. The SMEs provided feedback on potential operational, technological, and procedural opportunities and concerns. This paper describes an Advanced 4DT operational concept, the TBO Prototype, the demonstration scenarios and methods used, and the feedback obtained from the pilot and controller SMEs in this focus group activity.

  17. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  18. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  20. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process.

    PubMed

    Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I

    2014-04-30

    Recognition of the friction stir welding process is growing in the aeronautical and aero-space industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.
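
    A hedged sketch of the optimization step described above: a quadratic response surface for peak weld temperature as a function of advance and rotational speed, fitted to synthetic Monte-Carlo-style samples, then an exterior-penalty search for the fastest advance speed that keeps the temperature within a target window. The coefficients, temperature window, and parameter ranges are invented for illustration and do not reproduce the study's model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(12)
adv = rng.uniform(0.5, 5.0, 400)       # advance speed samples (mm/s)
rot = rng.uniform(500, 1500, 400)      # rotational speed samples (rpm)
# Synthetic peak temperature (deg C) standing in for the Monte-Carlo output.
temp = 300 + 0.18 * rot - 25 * adv + 0.004 * rot * adv + rng.normal(0, 5, 400)

# Quadratic response surface T(adv, rot) fitted by least squares.
A = np.column_stack([np.ones_like(adv), adv, rot, adv**2, rot**2, adv * rot])
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
T = lambda a, r: coef @ np.array([1, a, r, a**2, r**2, a * r])

T_lo, T_hi = 420.0, 480.0              # assumed acceptable temperature window

def penalized(x, mu=1e3):
    a, r = x
    t = T(a, r)
    violation = max(0.0, T_lo - t) ** 2 + max(0.0, t - T_hi) ** 2
    return -a + mu * violation          # maximize advance speed via exterior penalty

res = minimize(penalized, x0=[1.0, 1000.0],
               bounds=[(0.5, 5.0), (500.0, 1500.0)], method="L-BFGS-B")
a_opt, r_opt = res.x
print(f"optimal advance ~ {a_opt:.2f} mm/s at {r_opt:.0f} rpm, "
      f"weld pitch ~ {60 * a_opt / r_opt:.2f} mm/rev")
```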

  1. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process

    PubMed Central

    Fraser, Kirk A.; St-Georges, Lyne; Kiss, Laszlo I.

    2014-01-01

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), it is desirable to be able to model the process and determine the highest speed of advance that will not cause unwanted welding defects. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 workpieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach, an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time. PMID:28788627
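
    The optimisation chain described above (Monte-Carlo sampling of the thermal model, fitting a polynomial response surface for the peak temperature, then a penalised search for the fastest acceptable advance speed) can be sketched in a few lines. Everything below is illustrative: the thermal model is a simple surrogate, the temperature window and parameter ranges are hypothetical, and a grid evaluation of the penalised objective stands in for the exterior penalty optimisation loop.

        import numpy as np

        def peak_temperature(advance_mm_s, rpm):
            # Stand-in for the finite-difference thermal model; any smooth
            # surrogate is enough to demonstrate the response-surface step.
            return 250.0 + 0.35 * rpm - 30.0 * advance_mm_s

        # Monte-Carlo sampling of the welding parameters.
        rng = np.random.default_rng(0)
        advance = rng.uniform(1.0, 10.0, 500)      # mm/s
        rpm = rng.uniform(500.0, 1500.0, 500)      # rev/min
        temps = peak_temperature(advance, rpm)

        # Quadratic response surface T(advance, rpm) fitted to the MCS results.
        basis = lambda a, r: np.column_stack([np.ones_like(a), a, r, a**2, r**2, a * r])
        coef, *_ = np.linalg.lstsq(basis(advance, rpm), temps, rcond=None)

        # Penalised search over the surface: maximise the advance speed while the
        # predicted peak temperature stays inside an assumed weldable window.
        T_MIN, T_MAX, MU = 500.0, 560.0, 1.0e3
        aa, rr = np.meshgrid(np.linspace(1.0, 10.0, 200), np.linspace(500.0, 1500.0, 200))
        tt = (basis(aa.ravel(), rr.ravel()) @ coef).reshape(aa.shape)
        cost = -aa + MU * (np.maximum(0.0, T_MIN - tt)**2 + np.maximum(0.0, tt - T_MAX)**2)
        i = np.unravel_index(np.argmin(cost), cost.shape)
        a_opt, rpm_opt = aa[i], rr[i]
        print("advance %.2f mm/s at %.0f rev/min, weld pitch %.2f mm/rev"
              % (a_opt, rpm_opt, a_opt * 60.0 / rpm_opt))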

  2. Towards the characterization of noise sources in a supersonic three-stream jet using advanced analysis tools

    NASA Astrophysics Data System (ADS)

    Ruscher, Christopher; Gogineni, Sivaram

    2016-11-01

    Strict noise regulations set by governing bodies currently make supersonic commercial aviation impractical. One of the many challenges in developing practical supersonic commercial aircraft is the noise produced by the engine's exhaust jet. A promising method of jet noise reduction for supersonic applications is the addition of extra exhaust streams. Data for an axisymmetric three-stream nozzle were generated using the Naval Research Laboratory's JENRE code. These data will be compared to experimental results obtained by NASA for validation purposes. Once the simulation results show satisfactory agreement with the experiments, advanced analysis tools will be applied to the simulation data to characterize potential noise sources. The tools to be applied include methods based on proper orthogonal decomposition, wavelet decomposition, and stochastic estimation. Additionally, techniques such as empirical mode decomposition and the momentum potential theorem will be applied to the data as well.
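
    Of the analysis tools listed above, proper orthogonal decomposition (POD) is the most compact to illustrate: it is commonly computed from the singular value decomposition of a mean-subtracted snapshot matrix. The sketch below uses random numbers as a stand-in for the JENRE flow-field snapshots.

        import numpy as np

        # Synthetic stand-in for flow-field snapshots: n_points spatial samples at n_t times.
        rng = np.random.default_rng(1)
        n_points, n_t = 2000, 200
        snapshots = rng.standard_normal((n_points, n_t))

        # Snapshot POD: remove the temporal mean, then take the thin SVD.
        fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(fluct, full_matrices=False)

        energy = s**2 / np.sum(s**2)              # relative modal energy
        modes = U[:, :10]                         # leading spatial POD modes
        coeffs = np.diag(s[:10]) @ Vt[:10, :]     # their temporal coefficients
        print("energy captured by the first 10 modes: %.3f" % energy[:10].sum())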

  3. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made in overproducing natural chemicals and producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals belonging to fuels, building block chemicals, and specialty chemicals, in particular focusing on those reported in the last three years. It was aimed at providing the current landscape of systems metabolic engineering and suggesting directions to address future challenges towards successfully establishing processes for the bio-based production of fuels and chemicals from renewable resources. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Statistical tools for managing the Ambikapur aquifer in central India for sustainable hydrological development of the region

    NASA Astrophysics Data System (ADS)

    Sharma, S. K.

    2009-04-01

    Despite India's tremendous progress on all fronts after independence in 1947, it remains one of the poorest nations in the world in terms of per capita income and energy consumption, which is considered a gauge of the economic situation of any country; in India's case these are nearly one tenth of those of the developed nations. If the economic condition of its people is to be raised, the country has to boost its agricultural production, which is largely monsoon dependent, and to exploit its conventional and unconventional energy sources at a very rapid growth rate. Worldwide, 70% of the water withdrawn for human use goes to agriculture, 22% to industry and 8% to domestic services; in India, a low-income country, the shares are 82% for agriculture, 10% for industry and 8% for domestic services. India therefore needs new sources of water to reduce its dependency on the monsoon and support the sustainable development of the country. It is in this connection that the Ambikapur Basin in central India has been studied for sustainable water withdrawal. At present, the crops in the Ambikapur region are totally monsoon dependent. However, with the initiatives of the State Government, 25 boreholes in an area of about 25 square kilometers have been drilled to a depth of 500 m and completed in the Gondwana sandstone. The water quality and the discharge rates have been established to sustain the crops of the area, which are the only livelihood of the local people, in case the monsoon fails. The hydraulic properties of the aquifer, such as the transmissivity (T) and the coefficient of storage (S), were determined following the graphic methods of Jacob and Theis. The rate of discharge (Q) of the pumped well was estimated at 4.05 × 10³ cubic meters per second and the values of other parameters like T at
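
    For context, the Jacob (Cooper-Jacob) straight-line analysis referred to above reduces to two closed-form expressions once the drawdown-versus-log-time plot has been fitted. The sketch below shows those formulas with purely hypothetical pumping-test numbers, since the abstract's own parameter values are truncated.

        import math

        def cooper_jacob_transmissivity(Q, delta_s):
            """T = 2.3 Q / (4 pi delta_s), with Q the pumping rate (m3/s) and
            delta_s the drawdown change per log cycle of time (m)."""
            return 2.3 * Q / (4.0 * math.pi * delta_s)

        def cooper_jacob_storativity(T, t0, r):
            """S = 2.25 T t0 / r^2, with t0 the zero-drawdown time intercept (s)
            and r the distance to the observation well (m)."""
            return 2.25 * T * t0 / r**2

        # Hypothetical values for illustration only.
        Q = 4.05e-3        # m3/s
        delta_s = 0.8      # m per log cycle
        t0 = 120.0         # s
        r = 50.0           # m

        T = cooper_jacob_transmissivity(Q, delta_s)
        S = cooper_jacob_storativity(T, t0, r)
        print("T = %.2e m2/s, S = %.2e" % (T, S))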

  5. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses major accomplishments of a 3-year project under the DOE's NEER Program. The project has developed innovative and automated algorithms, codes, and tools for solving the discrete ordinates particle transport method efficiently in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  6. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  8. Bioinformatics Methods and Tools to Advance Clinical Care. Findings from the Yearbook 2015 Section on Bioinformatics and Translational Informatics.

    PubMed

    Soualmia, L F; Lecroq, T

    2015-08-13

    To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor evaluated the set of 1,594 articles separately, and the evaluation results were merged to retain 15 articles for peer review. The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles on data management and genome medicine, mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need to use mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support; the authors present data mining methods for large-scale datasets of past transplants, with the objective of identifying chances of survival. The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for clinicians in their daily practice. All the recent research and

  9. Predicting advanced neoplasia at colonoscopy in a diverse population with the National Cancer Institute colorectal cancer risk-assessment tool.

    PubMed

    Ladabaum, Uri; Patel, Ashley; Mannalithara, Ajitha; Sundaram, Vandana; Mitani, Aya; Desai, Manisha

    2016-09-01

    Tailoring screening to colorectal cancer (CRC) risk could improve screening effectiveness. Most CRCs arise from advanced neoplasia (AN) that dwells for years. To date, no available colorectal neoplasia risk score has been validated externally in a diverse population. The authors explored whether the National Cancer Institute (NCI) CRC risk-assessment tool, which was developed to predict future CRC risk, could predict current AN prevalence in a diverse population, thereby allowing its use in risk stratification for screening. This was a prospective examination of the relation between predicted 10-year CRC risk and the prevalence of AN, defined as advanced or multiple (≥3 adenomatous, ≥5 serrated) adenomatous or sessile serrated polyps, in individuals undergoing screening colonoscopy. Among 509 screenees (50% women; median age, 58 years; 61% white, 5% black, 10% Hispanic, and 24% Asian), 58 (11%) had AN. The prevalence of AN increased progressively from 6% in the lowest risk-score quintile to 17% in the highest risk-score quintile (P = .002). Risk-score distributions in individuals with versus without AN differed significantly (median, 1.38 [0.90-1.87] vs 1.02 [0.62-1.57], respectively; P = .003), with substantial overlap. The discriminatory accuracy of the tool was modest, with areas under the curve of 0.61 (95% confidence interval [CI], 0.54-0.69) overall, 0.59 (95% CI, 0.49-0.70) for women, and 0.63 (95% CI, 0.53-0.73) for men. The results did not change substantively when the analysis was restricted to adenomatous lesions or to screening procedures without any additional incidental indication. The NCI CRC risk-assessment tool displays modest discriminatory accuracy in predicting AN at screening colonoscopy in a diverse population. This tool may aid shared decision-making in clinical practice. Cancer 2016;122:2663-2670. © 2016 American Cancer Society.
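
    The discriminatory accuracy quoted above is the area under the ROC curve of the risk score against the observed outcome. A minimal sketch of that calculation, using synthetic scores and outcomes in place of the study cohort, is shown below.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # Hypothetical 10-year risk scores and advanced-neoplasia findings
        # (1 = AN at colonoscopy) standing in for the screening cohort.
        rng = np.random.default_rng(4)
        risk_score = rng.gamma(shape=2.0, scale=0.6, size=500)
        has_an = rng.binomial(1, p=np.clip(0.04 + 0.05 * risk_score, 0.0, 1.0))

        # Values near 0.6 indicate modest discrimination, as reported in the study.
        print("AUC: %.2f" % roc_auc_score(has_an, risk_score))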

  10. Nanoscale chemical imaging of Bacillus subtilis spores by combining tip-enhanced Raman scattering and advanced statistical tools.

    PubMed

    Rusciano, Giulia; Zito, Gianluigi; Isticato, Rachele; Sirec, Teja; Ricca, Ezio; Bailo, Elena; Sasso, Antonio

    2014-12-23

    Tip-enhanced Raman Scattering (TERS) has recently emerged as a powerful spectroscopic technique capable of providing subdiffraction morphological and chemical information on samples. In this work, we apply TERS spectroscopy for surface analysis of the Bacillus subtilis spore, a very attractive biosystem for a wide range of applications regulated by the spore surface properties. The observed spectra reflect the complex and heterogeneous environment explored by the plasmonic tip, therefore exhibiting significant point-to-point variations at the nanoscale. Herein, we demonstrate that TERS data processing via principal component analysis allows handling such spectral changes, thus enabling an unbiased correlative imaging based on TERS. Our experimental outcomes suggest a denser arrangement of both proteins and carbohydrates on specific spore surface regions simultaneously revealed by AFM phase imaging. Successful TERS analysis of spores' surface is useful for bacterial surface-display systems and drug delivery applications.
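
    The principal component analysis step mentioned above is straightforward to reproduce on any hyperspectral map: each point spectrum becomes a row, and the leading components and their per-point scores provide the basis for correlative imaging. The data below are random placeholders for real (baseline-corrected, normalised) TERS spectra.

        import numpy as np
        from sklearn.decomposition import PCA

        # Placeholder TERS map: n_spectra tip positions, n_wavenumbers channels each.
        rng = np.random.default_rng(2)
        n_spectra, n_wavenumbers = 400, 1024
        spectra = rng.standard_normal((n_spectra, n_wavenumbers))

        pca = PCA(n_components=5)
        scores = pca.fit_transform(spectra)      # (n_spectra, 5): map these back to tip positions
        loadings = pca.components_               # (5, n_wavenumbers): spectral signature of each component
        print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))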

  11. Advances in Omics and Bioinformatics Tools for Systems Analyses of Plant Functions

    PubMed Central

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-01-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances. PMID:22156726

  12. Analysis of Simulation Tools for the Study of Advanced Marine Power Systems.

    DTIC Science & Technology

    1992-09-01

  13. Genetic tools for advancement of Synechococcus sp. PCC 7002 as a cyanobacterial chassis

    SciTech Connect

    Ruffing, Anne M.; Jensen, Travis J.; Strickland, Lucas M.

    2016-11-10

    Successful implementation of modified cyanobacteria as hosts for industrial applications requires the development of a cyanobacterial chassis. The cyanobacterium Synechococcus sp. PCC 7002 embodies key attributes for an industrial host, including a fast growth rate and high salt, light, and temperature tolerances. Here, this study addresses key limitations in the advancement of Synechococcus sp. PCC 7002 as an industrial chassis.

  14. Portfolio use as a tool to demonstrate professional development in advanced nursing practice.

    PubMed

    Hespenheide, Molly; Cottingham, Talisha; Mueller, Gail

    2011-01-01

    A concrete way of recognizing and rewarding clinical leadership, excellence in practice, and personal and professional development of the advanced practice registered nurse (APRN) is lacking in the literature and healthcare institutions in the United States. This article presents the process of developing and evaluating a professional development program designed to address this gap. The program uses APRN Professional Performance Standards, Relationship-Based Care, and the Magnet Forces as a guide and theoretical base. A key tenet of the program is the creation of a professional portfolio. Narrative reflections are included that illustrate the convergence of theories. A crosswalk supports this structure, guides portfolio development, and operationalizes the convergence of theories as they specifically relate to professional development in advanced practice. Implementation of the program has proven to be challenging and rewarding. Feedback from APRNs involved in the program supports program participation as a meaningful method to recognize excellence in advanced practice and a clear means to foster ongoing professional growth and development.

  15. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and

  16. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    PubMed

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will
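
    The recommendation above (classical ANOVA for continuous responses, generalised linear models for counts) is easy to make concrete; the sketch below fits a Poisson GLM with a block term to hypothetical nontarget-organism counts using statsmodels.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical field-trial counts of a nontarget arthropod in a block design.
        rng = np.random.default_rng(3)
        df = pd.DataFrame({
            "counts": rng.poisson(lam=6, size=40),
            "treatment": np.repeat(["GM", "nonGM"], 20),
            "block": np.tile(["B1", "B2", "B3", "B4"], 10),
        })

        # Count data: Poisson GLM with the block as a covariate, rather than ANOVA on raw counts.
        glm = smf.glm("counts ~ treatment + block", data=df,
                      family=sm.families.Poisson()).fit()
        print(glm.summary())

        # A continuous response (e.g., pH) could instead use a classical linear model:
        # smf.ols("ph ~ treatment + block", data=df_with_ph).fit()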

  17. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Light Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display Systems (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.

  18. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K.; Edwards, Elizabeth; Loffler, Frank; Looney, Brian

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that provide indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools to enable direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities in a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on relevant enzymes for functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  19. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (dwill = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
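
    The two goodness-of-fit measures quoted above have simple closed forms; a sketch of both, applied to hypothetical observed versus model-predicted values, is shown below.

        import numpy as np

        def willmott_d(observed, predicted):
            """Willmott's index of agreement:
            d = 1 - sum((P - O)^2) / sum((|P - Obar| + |O - Obar|)^2)."""
            o, p = np.asarray(observed, float), np.asarray(predicted, float)
            obar = o.mean()
            return 1.0 - np.sum((p - o) ** 2) / np.sum((np.abs(p - obar) + np.abs(o - obar)) ** 2)

        def relative_error(observed, predicted):
            """Mean relative error |P - O| / O."""
            o, p = np.asarray(observed, float), np.asarray(predicted, float)
            return np.mean(np.abs(p - o) / np.abs(o))

        # Hypothetical observed vs model-predicted permeate concentrations.
        obs = np.array([12.1, 10.4, 9.8, 8.9, 7.5])
        pred = np.array([11.8, 10.9, 9.5, 9.2, 7.7])
        print("d = %.3f, RE = %.3f" % (willmott_d(obs, pred), relative_error(obs, pred)))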

  20. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    SciTech Connect

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.; Guo, Xinxin; Hohimer, Ryan E.; Pomiak, Yekaterina G.

    2012-12-31

    This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest priority actions for the human decision maker. M-DART represents a significant advancement over today’s grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of the M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  1. Biodosimetric tools for a fast triage of people accidentally exposed to ionising radiation. Statistical and computational aspects.

    PubMed

    Ainsbury, Elizabeth A; Barquinero, J Francesc

    2009-01-01

    Consideration of statistical methodology is essential for the application of cytogenetic and other biodosimetry techniques to triage for mass casualty situations. This is because the requirement for speed and accuracy in biodosimetric triage necessarily introduces greater uncertainties than would be acceptable in day-to-day biodosimetry. Additionally, in a large scale accident type situation, it is expected that a large number of laboratories from around the world will assist and it is likely that each laboratory will use one or more different dosimetry techniques. Thus issues arise regarding combination of results and the associated errors. In this article we discuss the statistical and computational aspects of radiation biodosimetry for triage in a large scale accident-type situation. The current status of statistical analysis techniques is reviewed and suggestions are made for improvements to these methods which will allow first responders to estimate doses quickly and reliably for suspected exposed persons.
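
    As an illustration of the kind of calculation behind cytogenetic triage, the dicentric assay typically inverts a laboratory-specific linear-quadratic calibration curve to turn an observed aberration yield into a dose estimate. The curve coefficients and counts below are hypothetical, not values from the article.

        import math

        def dose_from_dicentrics(dicentrics, cells, C, alpha, beta):
            """Invert a linear-quadratic calibration curve Y = C + alpha*D + beta*D^2,
            where Y is the dicentric yield per cell, to obtain an acute-dose estimate (Gy)."""
            Y = dicentrics / cells
            disc = alpha**2 + 4.0 * beta * (Y - C)
            if disc < 0:
                return 0.0
            return (-alpha + math.sqrt(disc)) / (2.0 * beta)

        # Illustrative calibration coefficients (laboratory-specific in practice).
        C, alpha, beta = 0.001, 0.03, 0.06   # yields per cell, per Gy, per Gy^2
        print("estimated dose: %.2f Gy"
              % dose_from_dicentrics(dicentrics=25, cells=50, C=C, alpha=alpha, beta=beta))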

  2. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    NASA Astrophysics Data System (ADS)

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-09-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective.

  3. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia).

    PubMed

    Caneva, G; Bartoli, F; Savo, V; Futagami, Y; Strona, G

    2016-09-06

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective.

  4. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    PubMed Central

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  5. The Advanced Light Source: A new tool for research in atomic and molecular physics

    NASA Astrophysics Data System (ADS)

    Schlachter, F.; Robinson, A.

    1991-04-01

    The Advanced Light Source at the Lawrence Berkeley Laboratory will be the world's brightest synchrotron radiation source in the extreme ultraviolet and soft x-ray regions of the spectrum when it begins operation in 1993. It will be available as a national user facility to researchers in a broad range of disciplines, including materials science, atomic and molecular physics, chemistry, biology, imaging, and technology. The high brightness of the ALS will be particularly well suited to high-resolution studies of tenuous targets, such as excited atoms, ions, and clusters.

  6. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the Pest Management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper will review the applications that can be utilized by the pest management professional and discuss the advanced techniques that may be required in conjunction with thermal imaging to locate insect and other pest infestations, moisture within structures, the verification of data and the special challenges associated with the inspection process.

  7. Pantograph catenary dynamic optimisation based on advanced multibody and finite element co-simulation tools

    NASA Astrophysics Data System (ADS)

    Massat, Jean-Pierre; Laurent, Christophe; Bianchi, Jean-Philippe; Balmès, Etienne

    2014-05-01

    This paper presents recent developments undertaken by the SNCF Innovation & Research Department on numerical modelling of pantograph-catenary interaction. It aims at describing an efficient co-simulation process between finite element (FE) and multibody (MB) modelling methods. FE catenary models are coupled with a fully flexible MB representation of the pantograph with pneumatic actuation. These advanced functionalities allow new kinds of numerical analyses, such as dynamic improvements based on innovative pneumatic suspensions or assessment of crash risks in crossing areas, which demonstrate the powerful capabilities of this computing approach.

  8. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    SciTech Connect

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha; Parisi, Carlo; Prescott, Steven R.; Gupta, Abhinav

    2016-09-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  9. The Patient Dignity Inventory: Just another evaluation tool? Experiences with advanced cancer patients.

    PubMed

    Rullán, María; Arantzamendi, María; Carvajal, Ana; Martínez, Marina; Saenz de Ormijana, Amaia; Centeno, Carlos

    2017-06-21

    The Patient Dignity Inventory (PDI) evaluates sources of distress related to the feeling of loss of dignity and was designed for patients at the end of life. The aim of the present work was to generate a better understanding of the experiences of healthcare staff when using the PDI. An exploratory qualitative study is presented about the experience of 4 professionals who applied the PDI to 124 advanced-cancer patients. Our study consisted of an analysis of their experiences, taken from information generated in a focus group. A thematic analysis was performed on the information generated at that meeting by two researchers working independently. The initial experiences with the PDI on the part of the professionals led them to systematically administer the questionnaire as part of an interview instead of having patients fill it out themselves in written form. What started out as an evaluation very often led to a profound conversation on the meaning of life, dignity, and other sensitive, key issues related to the process of the illness. The PDI has intrinsic therapeutic value and is useful in clinical practice, and it is also a way of examining issues related to dignity and the meaning of life within the context of advanced-stage illness. There is a need for studies that examine patient experiences through a PDI-based interview.

  10. Analytical tools employed to determine pharmaceutical compounds in wastewaters after application of advanced oxidation processes.

    PubMed

    Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan

    2016-12-01

    Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.

  11. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group to go to for pre-phase A and phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole due, in part, to their rapid turnaround time in analyzing concepts and their ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of each vehicle to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on its rapid turnaround time and often needs to fulfill customer requests on a limited schedule or with little advance notice. Due to working in this fast-paced environment, the ETO team has developed some finely honed skills and methods to maximize the delivery capability to meet customer needs. This paper will describe the interfaces between the three primary disciplines used in the design process - weights and sizing, trajectory, and structural analysis - as well as the approach each discipline employs to streamline their particular piece of the design process.

  12. Advancing of Russian ChemBioGrid by bringing Data Management tools into collaborative environment.

    PubMed

    Zhuchkov, Alexey; Tverdokhlebov, Nikolay; Kravchenko, Alexander

    2006-01-01

    Virtual organizations of researchers need effective tools to work collaboratively with huge sets of heterogeneous data distributed over a HealthGrid. This paper describes a mechanism for supporting Digital Libraries in a High-Performance Computing environment based on Grid technology. The proposed approach provides the ability to assemble heterogeneous data from distributed sources into integrated virtual collections by using OGSA-DAI. The core of the concept is a Repository of Meta-Descriptions, that is, sets of metadata which define personal and collaborative virtual collections on the basis of virtualized information resources. The Repository is kept in the native XML database Sedna and is maintained by Grid Data Services.

  13. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms

    PubMed Central

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-01-01

    Abstract To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. We offer generic advice to risk assessors and

  14. Reporting guidance considerations from a statistical perspective: overview of tools to enhance the rigour of reporting of randomised trials and systematic reviews.

    PubMed

    Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa

    2017-05-01

    Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach to summarise features regarding statistical methods and findings from reporting guidelines for trials and reviews was taken. We aim to enhance familiarity with the statistical details that should be reported in biomedical research among statisticians and their collaborators. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews and PRISMA for Protocols (PRISMA-P). Considerations regarding sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute to improving the quality of biomedical publications. Authors should employ these tools for planning and reporting of their research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  15. Implementation of statistical tools to support identification and management of persistent Listeria monocytogenes contamination in smoked fish processing plants.

    PubMed

    Malley, Thomas J V; Stasiewicz, Matthew J; Gröhn, Yrjö T; Roof, Sherry; Warchocki, Steven; Nightingale, Kendra; Wiedmann, Martin

    2013-05-01

    Listeria monocytogenes persistence in food processing plants is a key source of postprocessing contamination of ready-to-eat foods. Thus, identification and elimination of sites where L. monocytogenes persists (niches) is critical. Two smoked fish processing plants were used as models to develop and implement environmental sampling plans (i) to identify persistent L. monocytogenes subtypes (EcoRI ribotypes) using two statistical approaches and (ii) to identify and eliminate likely L. monocytogenes niches. The first statistic, a binomial test based on ribotype frequencies, was used to evaluate L. monocytogenes ribotype recurrences relative to reference distributions extracted from a public database; the second statistic, a binomial test based on previous positives, was used to measure ribotype occurrences as a risk factor for subsequent isolation of the same ribotype. Both statistics revealed persistent ribotypes in both plants based on data from the initial 4 months of sampling. The statistic based on ribotype frequencies revealed persistence of particular ribotypes at specific sampling sites. Two adaptive sampling strategies guided plant interventions during the study: sampling multiple times before and during processing and vector swabbing (i.e., sampling of additional sites in different directions [vectors] relative to a given site). Among sites sampled for 12 months, a Poisson model regression revealed borderline significant monthly decreases in L. monocytogenes isolates at both plants (P = 0.026 and 0.076). Our data indicate elimination of an L. monocytogenes niche on a food contact surface; niches on nonfood contact surfaces were not eliminated. Although our data illustrate the challenge of identifying and eliminating L. monocytogenes niches, particularly at nonfood contact sites in small and medium plants, the methods for identification of persistence we describe here should broadly facilitate science-based identification of microbial persistence.
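
    The first of the two statistics described above can be illustrated directly: given a ribotype's frequency in a reference collection, a one-sided binomial test asks whether it was recovered from the plant more often than that background frequency would predict. The counts and reference frequency below are hypothetical.

        from scipy.stats import binomtest

        def persistence_pvalue(observed, total_isolates, reference_freq):
            """One-sided binomial test: is a ribotype recovered from the plant more
            often than its frequency in a reference collection would predict?"""
            return binomtest(observed, total_isolates, reference_freq,
                             alternative="greater").pvalue

        # Hypothetical numbers: ribotype seen 7 times among 40 plant isolates, with
        # a background frequency of 3% in the reference database.
        print(persistence_pvalue(7, 40, 0.03))
        # The second statistic (trend over months) could similarly be checked with a
        # Poisson regression of monthly positive counts against time.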

  16. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  17. Recent advances in developing molecular tools for targeted genome engineering of mammalian cells.

    PubMed

    Lim, Kwang-il

    2015-01-01

    Various biological molecules naturally existing in diversified species including fungi, bacteria, and bacteriophage have functionalities for DNA binding and processing. The biological molecules have been recently actively engineered for use in customized genome editing of mammalian cells as the molecule-encoding DNA sequence information and the underlying mechanisms how the molecules work are unveiled. Excitingly, multiple novel methods based on the newly constructed artificial molecular tools have enabled modifications of specific endogenous genetic elements in the genome context at efficiencies that are much higher than that of the conventional homologous recombination based methods. This minireview introduces the most recently spotlighted molecular genome engineering tools with their key features and ongoing modifications for better performance. Such ongoing efforts have mainly focused on the removal of the inherent DNA sequence recognition rigidity from the original molecular platforms, the addition of newly tailored targeting functions into the engineered molecules, and the enhancement of their targeting specificity. Effective targeted genome engineering of mammalian cells will enable not only sophisticated genetic studies in the context of the genome, but also widely-applicable universal therapeutics based on the pinpointing and correction of the disease-causing genetic elements within the genome in the near future.

  18. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  19. Advances in the development of molecular tools for the control of bovine babesiosis in Mexico.

    PubMed

    Mosqueda, J; Figueroa, J V; Alvarez, A; Bautista, R; Falcon, A; Ramos, A; Canto, G; Vega, C A

    2007-05-01

    The severe negative impact that bovine babesiosis has in the Mexican cattle industry has not been ameliorated basically due to the lack of safe and effective commercially available vaccines and sensitive and reliable diagnostic tests. In recent years, the Bovine Babesiosis Laboratory at the National Center for Disciplinary Research in Veterinary Parasitology-INIFAP in Morelos State, Mexico has been directing efforts towards three main research areas: (1) The development of in vitro culture-derived, improved and safer live vaccines. This has been done in two ways: using gamma-irradiated bovine serum and erythrocytes for the in vitro culture of vaccine strains, which reduces the risk of contaminating pathogens, and improving the immune response, by the addition of L. casei, a strong stimulant of the innate immune system. (2) The study of antigens considered as vaccine candidates with the goal of developing a recombinant vaccine that suits the country's needs. Knowing their degree of conservation or variation in Mexican isolates, their phylogenetic relationship and their protective, immuno-stimulatory properties, are first steps towards that goal. (3) The development of new tools for diagnosis, detection and discrimination of bovine babesiosis is the third area. Developing variants of ELISA, which are more reliable than the currently used IFAT, are a priority, and finally, taking advantage of the genomes of Babesia bigemina, and B. bovis, we are identifying genes than allow us to discriminate isolates using molecular tools.

  20. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  1. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands a reliable, systematic, and computationally effective retrieval algorithm. In this paper, we present Neuron-Miner, a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built upon a hashing (search and retrieval) technique that employs multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and to preserve the inherent neuron neighborhoods while encoding them with compact binary codewords. We further introduce an inverse-coding formulation within HF to effectively mitigate pairwise neuron similarity comparisons, thus allowing scalability to massive databases with little additional time overhead. The proposed hashing tool gives a superior approximation of the true neuromorphological neighborhood, with better retrieval and ranking performance than existing generalized hashing methods. This is exhaustively validated by quantifying the results over 31,266 neuron reconstructions from the Neuromorpho.org dataset, curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner will assist neuroscientists in objectively understanding the relationship between neuronal structure and function, with applications in comparative anatomy and diagnosis.
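
    The Hashing Forests described above are the authors' own method; as a loose, generic analogue (not their implementation), scikit-learn's RandomTreesEmbedding can turn morphological feature vectors into sparse binary codes whose Hamming distances support approximate neighbor retrieval. The feature matrix below is synthetic and purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomTreesEmbedding

# Hypothetical morphometric feature matrix: one row per neuron reconstruction
# (e.g. total dendritic length, branch count, soma surface, ...); synthetic here.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 16))

# Unsupervised forest of random trees; each leaf becomes one bit of the code.
embedder = RandomTreesEmbedding(n_estimators=32, max_depth=6, random_state=0)
codes = embedder.fit_transform(X)            # sparse binary matrix (n x n_leaves)
codes = codes.toarray().astype(np.uint8)

def retrieve(query_idx, k=10):
    """Return the k nearest neurons to a query by Hamming distance on codes."""
    dists = (codes != codes[query_idx]).sum(axis=1)
    order = np.argsort(dists)
    return [i for i in order if i != query_idx][:k]

print(retrieve(0))
```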

  2. Virtual charge state separator as an advanced tool coupling measurements and simulations

    NASA Astrophysics Data System (ADS)

    Yaramyshev, S.; Vormann, H.; Adonin, A.; Barth, W.; Dahl, L.; Gerhard, P.; Groening, L.; Hollinger, R.; Maier, M.; Mickat, S.; Orzhekhovskaya, A.

    2015-05-01

    A new low energy beam transport for a multicharge uranium beam will be built at the GSI High Current Injector (HSI). All uranium charge states coming from the new ion source will be injected into the GSI heavy-ion high-current HSI Radio Frequency Quadrupole (RFQ), but only the design ions (U4+) will be accelerated to the final RFQ energy. Detailed knowledge of the injected beam current and emittance for the pure design U4+ ions is necessary for proper beam line design, commissioning and operation, while measurements are possible only for the full beam including all charge states. Detailed measurements of the beam current and emittance are performed behind the first quadrupole triplet of the beam line. A dedicated algorithm, based on a combination of measurements and the results of advanced beam dynamics simulations, provides for the extraction of beam current and emittance values for only the U4+ component of the beam. The proposed methods and obtained results are presented.

  3. The advanced light source — a new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, A. S.

    1991-03-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory in Berkeley, California, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Undulators will generate high-brightness, partially coherent, plane polarized, soft x-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV. Wigglers and bend magnets will generate high fluxes of x-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms.

  4. The advanced light source: A new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, A. S.

    1990-09-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory in Berkeley, California, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Undulators will generate high-brightness, partially coherent, plane polarized, soft-x-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV. Wigglers and bend magnets will generate high fluxes of x-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms.

  5. Using explanatory crop models to develop simple tools for Advanced Life Support system studies

    NASA Technical Reports Server (NTRS)

    Cavazzoni, J.

    2004-01-01

    System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  6. Using explanatory crop models to develop simple tools for Advanced Life Support system studies

    NASA Technical Reports Server (NTRS)

    Cavazzoni, J.

    2004-01-01

    System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  7. Advances in de novo strain design using integrated systems and synthetic biology tools.

    PubMed

    Ng, Chiam Yu; Khodayari, Ali; Chowdhury, Anupam; Maranas, Costas D

    2015-10-01

    Recent efforts in expanding the range of biofuel and biorenewable molecules using microbial production hosts have focused on the introduction of non-native pathways in model organisms and the bio-prospecting of non-model organisms with desirable features. Current challenges lie in the assembly and coordinated expression of the (non-)native pathways and the elimination of competing pathways and undesirable regulation. Several systems and synthetic biology approaches providing contrasting top-down and bottom-up strategies, respectively, have been developed. In this review, we discuss recent advances in both in silico and experimental approaches for metabolic pathway design and engineering, with a critical assessment of their merits and remaining challenges. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in image evaluation. The method, applied to 54 CT images from a sample of outpatients affected by cognitive impairment, generated a model that overlaps the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed.

  9. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    PubMed Central

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido

    2015-01-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in image evaluation. The method, applied to 54 CT images from a sample of outpatients affected by cognitive impairment, generated a model that overlaps the original image with quite good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. PMID:26427894

  10. Suite of tools for statistical N-gram language modeling for pattern mining in whole genome sequences.

    PubMed

    Ganapathiraju, Madhavi K; Mitchell, Asia D; Thahir, Mohamed; Motwani, Kamiya; Ananthasubramanian, Seshan

    2012-12-01

    Genome sequences contain a number of patterns that have biomedical significance. Repetitive sequences of various kinds are a primary component of most of the genomic sequence patterns. We extended the suffix-array based Biological Language Modeling Toolkit to compute n-gram frequencies as well as n-gram language-model based perplexity in windows over the whole genome sequence to find biologically relevant patterns. We present the suite of tools and their application for analysis on whole human genome sequence.
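
    A minimal sketch of the windowed n-gram analysis described above (not the suffix-array implementation in the Biological Language Modeling Toolkit): it counts overlapping k-mers and scores each window by perplexity under a genome-wide background model, so that compositionally unusual windows stand out. The toy sequence and parameter values are assumptions for illustration.

```python
from collections import Counter
import math

def kmer_counts(seq, k):
    """Count overlapping k-mers (n-grams) in a DNA string."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def window_perplexity(seq, background, k=3, window=1000, step=500):
    """Score each window by perplexity under a background k-mer model.

    High perplexity flags windows whose k-mer usage deviates from the
    genome-wide composition (candidate 'unusual' regions).
    """
    total = sum(background.values())
    # Laplace-smoothed background probabilities
    prob = lambda g: (background[g] + 1) / (total + 4 ** k)
    scores = []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        grams = [win[i:i + k] for i in range(len(win) - k + 1)]
        log_p = sum(math.log(prob(g)) for g in grams) / len(grams)
        scores.append((start, math.exp(-log_p)))   # per-k-mer perplexity
    return scores

genome = "ACGT" * 5000 + "AAAAAAAATTTTTTTT" * 200    # toy sequence
bg = kmer_counts(genome, 3)
for start, ppl in window_perplexity(genome, bg)[:5]:
    print(start, round(ppl, 2))
```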

  11. Novel statistical tools for management of public databases facilitate community-wide replicability and control of false discovery.

    PubMed

    Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani

    2014-07-01

    Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.

  12. GAPIT version 2: an enhanced integrated tool for genomic association and prediction

    USDA-ARS?s Scientific Manuscript database

    Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...

  13. Developing advanced tools for modelling extreme sea level climate change in European Seas

    NASA Astrophysics Data System (ADS)

    She, Jun; Murawski, Jens; Hintz, Kasper S.

    2017-04-01

    With the increasing pace of global warming, sea level rise along the European coasts has become an increasing threat to our society, economy and safety. "Hundred-year storm surge events" have been reported at different locations in recent years. Ocean hydrodynamic modelling is one of the major tools for reconstructing and predicting sea level changes on climate time scales. Although storm surge modelling is one of the most classic applications of ocean models, there still exist challenges in reproducing accurate sea level variability along all European coasts, especially for extreme events. This presentation addresses major challenges in pan-European storm surge modelling, presenting sea level simulation results from a two-way nested pan-European (with 10 sub-domains) three-dimensional hydrodynamic model, HIROMB-BOOS (HBM). The difference between using two-dimensional and three-dimensional models for storm surge prediction is also analyzed based on past years' operational experience.

  14. Microfluidic chips with multi-junctions: an advanced tool in recovering proteins from inclusion bodies

    PubMed Central

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2015-01-01

    Active recombinant proteins are used for studying the biological functions of genes and for the development of therapeutic drugs. Overexpression of recombinant proteins in bacteria often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. Protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, the conventional refolding method of dialysis or dilution is time-consuming and recovered active protein yields are often low, and a cumbersome trial-and-error process is required to achieve success. To circumvent these difficulties, we used controllable diffusion through laminar flow in microchannels to regulate the denaturant concentration. This method largely aims at reducing protein aggregation during the refolding procedure. This Commentary introduces the principles of the protein refolding method using microfluidic chips and the advantage of our results as a tool for rapid and efficient recovery of active recombinant proteins from inclusion bodies. PMID:25531187

  15. Microfluidic chips with multi-junctions: an advanced tool in recovering proteins from inclusion bodies.

    PubMed

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2015-01-01

    Active recombinant proteins are used for studying the biological functions of genes and for the development of therapeutic drugs. Overexpression of recombinant proteins in bacteria often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. Protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, the conventional refolding method of dialysis or dilution is time-consuming and recovered active protein yields are often low, and a cumbersome trial-and-error process is required to achieve success. To circumvent these difficulties, we used controllable diffusion through laminar flow in microchannels to regulate the denaturant concentration. This method largely aims at reducing protein aggregation during the refolding procedure. This Commentary introduces the principles of the protein refolding method using microfluidic chips and the advantage of our results as a tool for rapid and efficient recovery of active recombinant proteins from inclusion bodies.

  16. Regions of Unusual Statistical Properties as Tools in the Search for Horizontally Transferred Genes in Escherichia coli

    NASA Astrophysics Data System (ADS)

    Putonti, Catherine; Chumakov, Sergei; Chavez, Arturo; Luo, Yi; Graur, Dan; Fox, George E.; Fofanov, Yuriy

    2006-09-01

    The observed diversity of statistical characteristics along genomic sequences is the result of the influences of a variety of ongoing processes, including horizontal gene transfer, gene loss, genome rearrangements, and evolution. The rate at which various processes affect the genome typically varies between different genomic regions. Thus, variations in statistical properties seen in different regions of a genome are often associated with its evolution and functional organization. Analysis of such properties is therefore relevant to many ongoing biomedical research efforts. Similarity Plot, or S-plot, is a Windows-based application for large-scale comparisons and 2D visualization of similarities between genomic sequences. This application combines two approaches widely used in genomics: window analysis of statistical characteristics along genomes and dot-plot visual representation. S-plot is effective in detecting highly similar regions between two genomes. Within a single genome, S-plot can identify highly dissimilar regions displaying unusual compositional properties. The application was used to perform a comparative analysis of 50+ microbial genomes as well as many eukaryote genomes, including human, rat, mouse, and Drosophila. We illustrate the uses of S-plot in a comparison involving Escherichia coli K12 and E. coli O157:H7.
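
    The sketch below illustrates the underlying window-plus-dot-plot idea in a generic way (it is not the S-plot code, which is a Windows application): each genome is cut into fixed-size windows, each window is summarized by its k-mer frequency profile, and a cosine-similarity matrix between the two sets of windows plays the role of the dot plot. The window size, k, and toy sequences are arbitrary choices here.

```python
import numpy as np
from collections import Counter
from itertools import product

def window_profiles(seq, k=4, window=5000):
    """Return an (n_windows x 4**k) matrix of k-mer frequency profiles."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    rows = []
    for start in range(0, len(seq) - window + 1, window):
        win = seq[start:start + window]
        counts = Counter(win[i:i + k] for i in range(len(win) - k + 1))
        vec = np.zeros(len(kmers))
        for km, c in counts.items():
            if km in index:                      # skip k-mers containing N etc.
                vec[index[km]] = c
        rows.append(vec / max(vec.sum(), 1))
    return np.array(rows)

def similarity_matrix(seq_a, seq_b, k=4, window=5000):
    """Cosine similarity between all window pairs of two genomes (dot-plot style)."""
    a = window_profiles(seq_a, k, window)
    b = window_profiles(seq_b, k, window)
    a /= np.maximum(np.linalg.norm(a, axis=1, keepdims=True), 1e-12)
    b /= np.maximum(np.linalg.norm(b, axis=1, keepdims=True), 1e-12)
    return a @ b.T   # rows: windows of seq_a, columns: windows of seq_b

# toy usage with two short synthetic "genomes" sharing their first half
rng = np.random.default_rng(1)
seq_a = "".join(rng.choice(list("ACGT"), size=50000))
seq_b = seq_a[:25000] + "".join(rng.choice(list("ACGT"), size=25000))
sim = similarity_matrix(seq_a, seq_b)
print(sim.shape)
print(sim.round(2))
```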

  17. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), automatically registers and visualizes all images (T1w, T2w, DWI etc.) and ADC or perfusion maps obtained from the diagnostic MRI session. This wealth of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the subsequent MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype has been completed and will be clinically evaluated.

  18. Competence in advanced older people nursing: development of 'nursing older people--competence evaluation tool'.

    PubMed

    Bing-Jonsson, Pia Cecilie; Bjørk, Ida Torunn; Hofoss, Dag; Kirkevold, Marit; Foss, Christina

    2015-03-01

    Community care is characterised by a move from institutionalised to home-based care, a large patient population with comorbidities including cognitive failure, and nurses who struggle to keep up with their many competence demands. No study has examined the competence of nurses based on present demands, and an instrument for this purpose is lacking. We conducted a Delphi study based in Norway to develop the substantial content of a new competence measurement instrument. We sought to reach consensus regarding which nursing staff competence is most relevant to meet the current needs of older patients. A total of 42 experts participated in three consecutive panel investigations. Snowball sampling was used. The experts were clinicians, leaders, teachers, researchers and relatives of older people who required nursing. In Round 1, all experts were interviewed individually. These data were analysed using meaning coding and categorisation. In Rounds 2 and 3, the data were collected using electronic questionnaires and analysed quantitatively with SPSS. The experts agreed that health promotion as well as disease prevention, treatment, palliative care, ethics and regulation, assessment and taking action, covering basic needs, communication and documentation, responsibility and activeness, cooperation, and attitudes towards older people were the most relevant categories of competence. The experts showed clear consensus regarding the most relevant and current competence for nurses of older people. Assuming that older people in need of health care have the same requirements across cultures, this study's findings could be used as a basis for international studies. Those who nurse older people require competence that is complex and comprehensive. One way to evaluate nursing competence is through evaluation tools such as the Nursing Older People--Competence Evaluation tool. © 2014 John Wiley & Sons Ltd.

  19. Advanced Vibration Analysis Tools and New Strategies for Robust Design of Turbine Engine Rotors

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2002-01-01

    Small, random structural irregularities among the blades, called mistuning, can result in blade forced-response amplitudes and stresses that are much larger than those predicted for a perfectly tuned rotor. Manufacturing tolerances, deviations in material properties, or nonuniform operational wear cause mistuning; therefore, mistuning is unavoidable. Furthermore, even small mistuning can have a dramatic effect on the vibratory behavior of a rotor because it can lead to spatial localization of the vibration energy. As a result, certain blades may experience forced-response amplitudes and stresses that are substantially larger than those predicted by an analysis of the nominal (tuned) design. Unfortunately, these random uncertainties in blade properties, and the immense computational effort involved in obtaining statistically reliable design data, combine to make this aspect of rotor design cumbersome.

  20. Advancing spaceborne tools for the characterization of planetary ionospheres and circumstellar environments

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan Streets

    This work explores remote sensing of planetary atmospheres and their circumstellar surroundings. The terrestrial ionosphere is a highly variable space plasma embedded in the thermosphere. Generated by solar radiation and predominantly composed of oxygen ions at high altitudes, the ionosphere is dynamically and chemically coupled to the neutral atmosphere. Variations in ionospheric plasma density impact radio astronomy and communications. Inverting observations of 83.4 nm photons resonantly scattered by singly ionized oxygen holds promise for remotely sensing the ionospheric plasma density. This hypothesis was tested by comparing 83.4 nm limb profiles recorded by the Remote Atmospheric and Ionospheric Detection System aboard the International Space Station to a forward model driven by coincident plasma densities measured independently via ground-based incoherent scatter radar. A comparison study of two separate radar overflights with different limb profile morphologies found agreement between the forward model and measured limb profiles. A new implementation of Chapman parameter retrieval via Markov chain Monte Carlo techniques quantifies the precision of the plasma densities inferred from 83.4 nm emission profiles. This first study demonstrates the utility of 83.4 nm emission for ionospheric remote sensing. Future visible and ultraviolet spectroscopy will characterize the composition of exoplanet atmospheres; therefore, the second study advances technologies for the direct imaging and spectroscopy of exoplanets. Such spectroscopy requires the development of new technologies to separate relatively dim exoplanet light from parent star light. High-contrast observations at short wavelengths require spaceborne telescopes to circumvent atmospheric aberrations. The Planet Imaging Concept Testbed Using a Rocket Experiment (PICTURE) team designed a suborbital sounding rocket payload to demonstrate visible light high-contrast imaging with a visible nulling coronagraph
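
    As a rough illustration of the Chapman-parameter retrieval mentioned above (not the RAIDS processing chain), the following sketch fits the three parameters of a standard Chapman-alpha profile to a synthetic altitude profile with a simple random-walk Metropolis sampler; all data, priors and proposal scales are assumptions for illustration only.

```python
import numpy as np

def chapman(z, n_max, z_max, H):
    """Chapman-alpha electron density profile (standard textbook form)."""
    u = (z - z_max) / H
    return n_max * np.exp(0.5 * (1.0 - u - np.exp(-u)))

def log_posterior(theta, z, obs, sigma):
    n_max, z_max, H = theta
    if n_max <= 0 or H <= 0:                     # flat priors with physical bounds
        return -np.inf
    resid = obs - chapman(z, n_max, z_max, H)
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(z, obs, sigma, theta0, steps=20000, prop=(2e4, 5.0, 2.0)):
    """Random-walk Metropolis sampler over (n_max, z_max, H)."""
    rng = np.random.default_rng(0)
    theta = np.array(theta0, float)
    lp = log_posterior(theta, z, obs, sigma)
    chain = []
    for _ in range(steps):
        cand = theta + rng.normal(0, prop)
        lp_cand = log_posterior(cand, z, obs, sigma)
        if np.log(rng.uniform()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        chain.append(theta.copy())
    return np.array(chain)

# synthetic altitude profile (km) with noise
z = np.linspace(200, 600, 60)
truth = (1.0e6, 350.0, 60.0)                     # n_max (cm^-3), z_max (km), H (km)
sigma = 2.0e4
obs = chapman(z, *truth) + np.random.default_rng(1).normal(0, sigma, z.size)
chain = metropolis(z, obs, sigma, theta0=(8e5, 330.0, 50.0))
print(chain[len(chain) // 2:].mean(axis=0))      # posterior means after burn-in
```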

  1. Advances in developing molecular-diagnostic tools for strongyloid nematodes of equids: fundamental and applied implications.

    PubMed

    Gasser, Robin B; Hung, Guo-Chiuan; Chilton, Neil B; Beveridge, Ian

    2004-02-01

    Infections of equids with parasitic nematodes of the order Strongylida (subfamilies Strongylinae and Cyathostominae) are of major veterinary importance. In recent decades, the widespread use of drugs against these parasites has led to problems of resistance within the Cyathostominae, and to an increase in their prevalence and intensity of infection. Novel control strategies, based on improved knowledge of parasite biology and epidemiology, have thus become important. However, there are substantial limitations in the understanding of fundamental biological and systematic aspects of these parasites, due largely to limitations in their specific identification and diagnosis using traditional, morphological approaches. Recently, there has been progress in the development of DNA-based approaches for the specific identification of strongyloids of equids for systematic studies and disease diagnosis. The present article briefly reviews information on the classification, biology, pathogenesis and epidemiology of equine strongyloids and the diagnosis of infections, highlights knowledge gaps in these areas, and describes recent advances in the use of molecular techniques for the genetic characterisation, specific identification and differentiation of strongyloids of equids as a basis for fundamental investigations of their systematics, population biology and ecology.

  2. The advanced light source at Lawrence Berkeley laboratory: a new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, Alfred S.; Robinson, Arthur L.

    1991-04-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Based on a low-emittance electron storage ring optimized to operate at 1.5 GeV, the ALS will have 10 long straight sections available for insertion devices (undulators and wigglers) and 24 high-quality bend-magnet ports. The short pulse width (30-50 ps) will be ideal for time-resolved measurements. Undulators will generate high-brightness partially coherent soft X-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV; this radiation is plane polarized. Wigglers and bend magnets will extend the spectrum by generating high fluxes of X-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms. The high brightness will open new areas of research in the materials sciences, such as spatially resolved spectroscopy (spectromicroscopy), and in biology, such as X-ray microscopy with element-specific sensitivity; the high flux will allow measurements in atomic physics and chemistry to be made with tenuous gas-phase targets. Technological applications could include lithography and nano-fabrication.

  3. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with a solid sting operated at different Reynolds numbers and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely the Menter Shear Stress Transport (SST), the basic k-epsilon, and the Spalart-Allmaras (SA) models, are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  4. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online.

    PubMed

    Posada, David

    2006-07-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at http://darwin.uvigo.es/software/modeltest_server.html.
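
    The information-criterion side of such a selection is simple to reproduce: given each candidate model's maximized log-likelihood and number of free parameters, AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L. The sketch below scores a few hypothetical substitution models this way; the likelihood values and parameter counts are invented for illustration and are not ModelTest output.

```python
import math

def aic(log_l, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_l

def bic(log_l, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_l

# hypothetical candidate substitution models: (name, log-likelihood, free parameters)
candidates = [("JC", -5210.4, 0), ("HKY", -5104.7, 4), ("GTR+G", -5090.2, 9)]
n_sites = 1200   # alignment length used as sample size (a common convention)

scored = [(name, aic(ll, k), bic(ll, k, n_sites)) for name, ll, k in candidates]
for name, a, b in sorted(scored, key=lambda t: t[1]):   # best (lowest AIC) first
    print(f"{name:8s}  AIC={a:10.1f}  BIC={b:10.1f}")
```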

  5. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online

    PubMed Central

    Posada, David

    2006-01-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at http://darwin.uvigo.es/software/modeltest_server.html. PMID:16845102

  6. From interventionist imaging to intraoperative guidance: New perspectives by combining advanced tools and navigation with radio-guided surgery.

    PubMed

    Vidal-Sicart, S; Valdés Olmos, R; Nieweg, O E; Faccini, R; Grootendorst, M R; Wester, H J; Navab, N; Vojnovic, B; van der Poel, H; Martínez-Román, S; Klode, J; Wawroschek, F; van Leeuwen, F W B

    2017-08-03

    The integration of medical imaging technologies into diagnostic and therapeutic approaches can provide a preoperative insight into both anatomical (e.g. using computed tomography (CT), magnetic resonance (MR) imaging, or ultrasound (US)), as well as functional aspects (e.g. using single photon emission computed tomography (SPECT), positron emission tomography (PET), lymphoscintigraphy, or optical imaging). Moreover, some imaging modalities are also used in an interventional setting (e.g. CT, US, gamma or optical imaging) where they provide the surgeon with real-time information during the procedure. Various tools and approaches for image-guided navigation in cancer surgery are becoming feasible today. With the development of new tracers and portable imaging devices, these advances will reinforce the role of interventional molecular imaging. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  7. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
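
    For the "standard survival analysis" side of such a tool, the core quantity is the Kaplan-Meier survival curve. The sketch below is a minimal, self-contained estimator on toy right-censored data; it is not OSA code and makes no claim about that application's internals.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. death) was observed, 0 if censored
    Returns the distinct event times and the estimated survival curve.
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        mask = times == t
        d = events[mask].sum()          # events observed at time t
        if d > 0:
            surv *= 1.0 - d / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= mask.sum()           # remove events and censored subjects
    return np.array(out_t), np.array(out_s)

# toy data: 10 patients, some censored (event = 0)
t = [5, 8, 12, 12, 15, 20, 22, 30, 33, 40]
e = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
times, survival = kaplan_meier(t, e)
print(np.c_[times, survival.round(3)])
```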

  8. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.

  9. A new tool to give hospitalists feedback to improve interprofessional teamwork and advance patient care.

    PubMed

    Chesluk, Benjamin J; Bernabeo, Elizabeth; Hess, Brian; Lynn, Lorna A; Reddy, Siddharta; Holmboe, Eric S

    2012-11-01

    Teamwork is a vital skill for health care professionals, but the fragmented systems within which they work frequently do not recognize or support good teamwork. The American Board of Internal Medicine has developed and is testing the Teamwork Effectiveness Assessment Module (TEAM), a tool for physicians to evaluate how they perform as part of an interprofessional patient care team. The assessment provides hospitalist physicians with feedback data drawn from their own work of caring for patients, in a way that is intended to support immediate, concrete change efforts to improve the quality of patient care. Our approach demonstrates the value of looking at teamwork in the real world of health care-that is, as it occurs in the actual contexts in which providers work together to care for patients. The assessment of individual physicians' teamwork competencies may play a role in the larger effort to bring disparate health professions together in a system that supports and rewards a team approach in hope of improving patient care.

  10. Genetic tool development underpins recent advances in thermophilic whole‐cell biocatalysts

    PubMed Central

    Taylor, M. P.; van Zyl, L.; Tuffin, I. M.; Leak, D. J.; Cowan, D. A.

    2011-01-01

    Summary The environmental value of sustainably producing bioproducts from biomass is now widely appreciated, with a primary target being the economic production of fuels such as bioethanol from lignocellulose. The application of thermophilic prokaryotes is a rapidly developing niche in this field, driven by their known catabolic versatility with lignocellulose‐derived carbohydrates. Fundamental to the success of this work has been the development of reliable genetic and molecular systems. These technical tools are now available to assist in the development of other (hyper)thermophilic strains with diverse phenotypes such as hemicellulolytic and cellulolytic properties, branched chain alcohol production and other ‘valuable bioproduct’ synthetic capabilities. Here we present an insight into the historical limitations, recent developments and current status of a number of genetic systems for thermophiles. We also highlight the value of reliable genetic methods for increasing our knowledge of thermophile physiology. We argue that the development of robust genetic systems is paramount in the evolution of future thermophilic based bioprocesses and make suggestions for future approaches and genetic targets that will facilitate this process. PMID:21310009

  11. New advances and validation of knowledge management tools for critical care using classifier techniques.

    PubMed Central

    Frize, M.; Wang, L.; Ennett, C. M.; Nickerson, B. G.; Solven, F. G.; Stevenson, M.

    1998-01-01

    An earlier version (2.0) of the case-based reasoning (CBR) tool, called IDEAS for ICUs, allowed users to compare the ten closest matching cases to the newest patient admission, using a large database of intensive care patient records and physician-selected matching weights [1,2]. The new version incorporates matching weights that have been determined quantitatively. A faster CBR matching engine has also been incorporated into the new CBR. In a second approach, a back-propagation, feed-forward artificial neural network estimated two classes of the outcome "duration of artificial ventilation" for a subset of the database used for the CBR work. Weight elimination was successfully applied to reduce the number of input variables and speed up the estimation of outcomes. New experiments examined the impact of using different numbers of input variables on the performance of the ANN, as measured by the correct classification rate (CCR) and the average squared error (ASE). PMID:9929280
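
    The weight-elimination regularizer referred to above is commonly attributed to Weigend and colleagues: each weight contributes a penalty (w^2/w0^2)/(1 + w^2/w0^2), which pushes unimportant weights toward zero and thereby prunes inputs. A minimal sketch with a small feed-forward classifier follows; the data, network size and hyperparameters are placeholders, not the values used in the paper.

```python
import torch
import torch.nn as nn

# A minimal feed-forward classifier with a weight-elimination penalty added to
# the loss; the tensors X (features) and y (binary outcome class) are synthetic.
torch.manual_seed(0)
X = torch.randn(200, 12)                  # 200 cases, 12 input variables
y = (X[:, 0] + 0.5 * X[:, 3] > 0).long()  # synthetic two-class outcome

model = nn.Sequential(nn.Linear(12, 8), nn.Tanh(), nn.Linear(8, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
w0, lam = 1.0, 1e-3                       # scale and strength of the penalty

def weight_elimination(model, w0):
    """sum_i (w_i^2/w0^2) / (1 + w_i^2/w0^2): drives small weights toward zero."""
    total = 0.0
    for p in model.parameters():
        r = (p ** 2) / (w0 ** 2)
        total = total + (r / (1 + r)).sum()
    return total

for epoch in range(200):
    optimizer.zero_grad()
    loss = criterion(model(X), y) + lam * weight_elimination(model, w0)
    loss.backward()
    optimizer.step()

correct = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"correct classification rate: {correct:.2f}")
```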

  12. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.

  13. Measuring the Benefits of Public Chargers and Improving Infrastructure Deployments Using Advanced Simulation Tools: Preprint

    SciTech Connect

    Wood, Eric; Neubauer, Jeremy; Burton, Evan

    2015-02-01

    With support from the U.S. Department of Energy's Vehicle Technologies Office, the National Renewable Energy Laboratory developed BLAST-V -- the Battery Lifetime Analysis and Simulation Tool for Vehicles. The addition of high-resolution spatial-temporal travel histories enables BLAST-V to investigate user-defined rollouts of publicly accessible charging infrastructure, as well as to quantify impacts on vehicle and station owners in terms of improved vehicle utility and station throughput. This paper presents simulation outputs from BLAST-V that quantify the utility improvements of multiple distinct rollouts of publicly available Level 2 electric vehicle supply equipment (EVSE) in the Seattle, Washington, metropolitan area. Publicly available data on existing Level 2 EVSE are also used as an input to BLAST-V. The resulting vehicle utility is compared with that of a number of mock rollout scenarios. Discussion focuses on the estimated number of Level 2 stations necessary to substantially increase vehicle utility and on how stations can be strategically sited to maximize their potential benefit to prospective electric vehicle owners.

  14. SERS as an advanced tool for investigating chloroethyl nitrosourea derivatives complexation with DNA.

    PubMed

    Agarwal, Shweta; Ray, Bhumika; Mehrotra, Ranjana

    2015-11-01

    We report surface-enhanced Raman spectroscopic (SERS) studies on free calf thymus DNA and its complexes with the anti-tumor chloroethyl nitrosourea derivatives semustine and nimustine. Since the first observation of SERS in 1974, the technique has rapidly developed into an analytical tool that can be used for the trace detection and characterization of analytes. Here, we depict yet another application of SERS, in the field of drug-DNA interaction, and thereby its promising role in the rational design of new chemotherapeutic agents. Vibrational spectral analysis has been performed in an attempt to delineate the anti-cancer action mechanism of the above-mentioned nitrosourea derivatives. Strong SERS bands associated with the complexation of DNA with semustine and nimustine have been observed, which reveal binding of the nitrosourea derivatives with the heterocyclic nitrogenous base pairs of the DNA duplex. Formation of dG-dC interstrand cross-links in DNA double helices is also suggested by the SERS spectral results for the CENU-DNA adducts. The results demonstrated here reflect recent progress in the newly developing field of drug-DNA interaction analysis via SERS. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. AULA-Advanced Virtual Reality Tool for the Assessment of Attention: Normative Study in Spain.

    PubMed

    Iriarte, Yahaira; Diaz-Orueta, Unai; Cueto, Eduardo; Irazustabarrena, Paula; Banterla, Flavio; Climent, Gema

    2016-06-01

    The present study describes the collection of normative data for the AULA test, a virtual reality tool designed to evaluate attention problems, especially in children and adolescents. The normative sample comprised 1,272 participants (48.2% female) with an age range from 6 to 16 years (M = 10.25, SD = 2.83). The AULA test presents both visual and auditory stimuli, while randomized distractors of an ecological nature appear progressively. Variables provided by AULA were clustered in different categories for subsequent analysis. Differences by age and gender were analyzed, resulting in 14 groups, 7 per sex. Differences between visual and auditory attention were also obtained. The normative data obtained are relevant for the use of AULA to evaluate attention in Spanish children and adolescents in a more ecological way. Further studies will be needed to determine the sensitivity and specificity of AULA for measuring attention in different clinical populations. (J. of Att. Dis. 2016; 20(6) 542-568). © The Author(s) 2012.

  16. ASTRID© - Advanced Solar Tubular ReceIver Design: A powerful tool for receiver design and optimization

    NASA Astrophysics Data System (ADS)

    Frantz, Cathy; Fritsch, Andreas; Uhlig, Ralf

    2017-06-01

    In solar tower power plants the receiver is one of the critical components. It converts the solar radiation into heat and must withstand high heat flux densities and high daily or even hourly gradients (due to the passage of clouds). For this reason, the challenge during receiver design is to find a reasonable compromise between receiver efficiency, reliability, lifetime and cost. There is a strong interaction between the heliostat field, the receiver and the heat transfer fluid. Therefore, a proper receiver design needs to consider these components within the receiver optimization. There are several design and optimization tools for receivers, but most of them focus only on the receiver, ignoring the heliostat field and other parts of the plant. During the last years DLR developed the ASTRID code for tubular receiver concept simulation. The code comprises both a high- and a low-detail model. The low-detail model utilizes a number of simplifications which allow the user to screen a high number of receiver concepts for optimization purposes. The high-detail model uses an FE model and is able to compute local absorber and salt temperatures with high accuracy. One key strength of the ASTRID code is its interface to a ray tracing software which simulates realistic heat flux distributions on the receiver surface. The results generated by the ASTRID code have been validated against CFD simulations and measurement data.

  17. qDNAmod: a statistical model-based tool to reveal intercellular heterogeneity of DNA modification from SMRT sequencing data

    PubMed Central

    Feng, Zhixing; Li, Jing; Zhang, Jing-Ren; Zhang, Xuegong

    2014-01-01

    In an isogenic cell population, phenotypic heterogeneity among individual cells is common and critical for survival of the population under different environmental conditions. DNA modification is an important epigenetic factor that can regulate phenotypic heterogeneity. The single molecule real-time (SMRT) sequencing technology provides a unique platform for detecting a wide range of DNA modifications, including N6-methyladenine (6-mA), N4-methylcytosine (4-mC) and 5-methylcytosine (5-mC). Here we present qDNAmod, a novel bioinformatic tool for genome-wide quantitative profiling of intercellular heterogeneity of DNA modification from SMRT sequencing data. It is capable of estimating the proportion of isogenic haploid cells in which the same loci of the genome are differentially modified. We tested the reliability of qDNAmod with the SMRT sequencing data of Streptococcus pneumoniae strain ST556. qDNAmod detected extensive intercellular heterogeneity of DNA methylation (6-mA) in a clonal population of ST556. Subsequent biochemical analyses revealed that the recognition sequences of two type I restriction–modification (R-M) systems are responsible for the intercellular heterogeneity of DNA methylation initially identified by qDNAmod. qDNAmod thus represents a valuable tool for studying intercellular phenotypic heterogeneity from genome-wide DNA modification. PMID:25404133

  18. Femtosecond Microbunched Electron Beam — A New Tool for Advanced Accelerator Research

    NASA Astrophysics Data System (ADS)

    Pogorelsky, I. V.; Babzien, M.; Ben-Zvi, I.; Kusche, K. P.; Pavlishin, I. V.; Yakimenko, V.; Dilley, C. E.; Gottschalk, S. C.; Kimura, W. D.; Steinhauer, L. C.; Kallos, E.; Katsouleas, T.; Muggli, P.; Zigler, A.; Banna, S.; Schächter, L.; Cline, D. B.; Zhou, F.; Kamiya, Y.; Kumita, T.

    2006-04-01

    We employed periodic trains of femtosecond electron bunches for testing several novel concepts of acceleration. A microwave-driven linac sends a 45-MeV electron beam (e-beam) through a magnetic wiggler wherein the e-beam energy is modulated via the inverse free electron laser (IFEL) technique by interaction with a 30-GW CO2 laser beam, thereby creating 3-fs-long microbunches separated by the 30-fs laser period. We show several examples of utilizing such a femtosecond bunch train in advanced accelerator and radiation source research. We demonstrated that microbunching improves the performance of the laser acceleration process compared to the previously investigated single-bunch technique. Specifically, microbunches were phased to the electromagnetic wave of the CO2 laser beam inside a matched tapered wiggler, where ~80% of electrons gained energy as an ensemble while maintaining a narrow energy spread (i.e., monoenergetic). Another experiment on plasma wakefield acceleration (PWFA) explored resonant wakefield excitation in an electric discharge plasma with the plasma frequency matched to that of the CO2 laser. Simulations predict orders-of-magnitude enhancement in the wakefield's amplitude compared with that attained with single bunches. In the Particle Acceleration by Stimulated Emission of Radiation (PASER) experiment, we tested the prediction that an active laser medium can accelerate particles by stimulating the emission of radiation. The process benefits from the action of a periodic train of microbunches resonating with the laser transition. Finally, we analyze prospects for using partially coherent x-ray sources based on Thomson backscattering from the electron microbunch train.

  19. Multivariate Statistical Analysis: a tool for groundwater quality assessment in the hidrogeologic region of the Ring of Cenotes, Yucatan, Mexico.

    NASA Astrophysics Data System (ADS)

    Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.

    2014-12-01

    The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted; because it is the only source of potable water for the entire state, protection of this resource is important. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry, as well as spatial patterns that are important for establishing protection zones. In this work, multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data of the study area. Results of principal component analysis show that the main sources of variation in the data are due to seawater intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters appears random but is consistent with seawater intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.
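
    A generic sketch of the workflow described above — standardize the hydrochemical variables, run principal component analysis, then cluster the wells — using scikit-learn; the ion list, concentrations and number of wells below are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical hydrochemical table: one row per supply well, columns are
# major-ion concentrations (mg/L); the values here are synthetic.
ions = ["Cl", "Na", "Ca", "Mg", "HCO3", "SO4", "NO3"]
rng = np.random.default_rng(42)
X = np.abs(rng.normal(loc=[250, 150, 90, 40, 300, 60, 15],
                      scale=[120, 80, 30, 15, 60, 25, 10],
                      size=(40, len(ions))))

Xs = StandardScaler().fit_transform(X)        # z-score each variable

pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("PC1 loadings:", dict(zip(ions, pca.components_[0].round(2))))

# group wells into four clusters, matching the number reported in the study
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xs)
for c in range(4):
    print(f"cluster {c}: wells {np.where(labels == c)[0].tolist()}")
```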

  20. Multistate Statistical Modeling: A Tool to Build a Lung Cancer Microsimulation Model That Includes Parameter Uncertainty and Patient Heterogeneity.

    PubMed

    Bongers, Mathilda L; de Ruysscher, Dirk; Oberije, Cary; Lambin, Philippe; Uyl-de Groot, Carin A; Coupé, V M H

    2016-01-01

    With the shift toward individualized treatment, cost-effectiveness models need to incorporate patient and tumor characteristics that may be relevant to treatment planning. In this study, we used multistate statistical modeling to inform a microsimulation model for cost-effectiveness analysis of individualized radiotherapy in lung cancer. The model tracks clinical events over time and takes patient and tumor features into account. Four clinical states were included in the model: alive without progression, local recurrence, metastasis, and death. Individual patients were simulated by repeatedly sampling a patient profile, consisting of patient and tumor characteristics. The transitioning of patients between the health states is governed by personalized time-dependent hazard rates, which were obtained from multistate statistical modeling (MSSM). The model simulations for both the individualized and conventional radiotherapy strategies demonstrated internal and external validity. Therefore, MSSM is a useful technique for obtaining the correlated individualized transition rates that are required for the quantification of a microsimulation model. Moreover, we have used the hazard ratios, their 95% confidence intervals, and their covariance to quantify the parameter uncertainty of the model in a correlated way. The obtained model will be used to evaluate the cost-effectiveness of individualized radiotherapy treatment planning, including the uncertainty of input parameters. We discuss the model-building process and the strengths and weaknesses of using MSSM in a microsimulation model for individualized radiotherapy in lung cancer.
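
    A minimal sketch of the microsimulation idea described above: individual patients are advanced through the four health states by sampling competing transition times from state-specific hazards. For brevity the hazards below are constant and identical across patients, whereas the study uses individualized, time-dependent rates from the multistate model; all numbers are invented.

```python
import numpy as np

# States in the simplified model described above
ALIVE, LOCAL, METASTASIS, DEAD = range(4)

# Hypothetical constant transition hazards (events per month) between states
HAZARDS = {
    ALIVE:      {LOCAL: 0.010, METASTASIS: 0.015, DEAD: 0.005},
    LOCAL:      {METASTASIS: 0.030, DEAD: 0.020},
    METASTASIS: {DEAD: 0.060},
}

def simulate_patient(rng, horizon=120.0):
    """Simulate one trajectory by competing exponential transition times."""
    t, state, path = 0.0, ALIVE, [(0.0, ALIVE)]
    while state != DEAD and t < horizon:
        targets = HAZARDS[state]
        # draw a candidate time to each possible next state; the earliest wins
        draws = {s: rng.exponential(1.0 / h) for s, h in targets.items()}
        next_state = min(draws, key=draws.get)
        t += draws[next_state]
        if t >= horizon:
            break
        state = next_state
        path.append((t, state))
    return path

rng = np.random.default_rng(7)
cohort = [simulate_patient(rng) for _ in range(10000)]
dead_by_5y = np.mean([any(s == DEAD and t <= 60 for t, s in p) for p in cohort])
print(f"simulated 5-year mortality: {dead_by_5y:.2%}")
```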